Your team shipped the AI copilot three months ago. The press release called it revolutionary. The engineering team shipped on time. You rolled it out to 50% of your user base. And then something unexpected happened: nobody used it. Daily active users for the feature flat-lined at 8%. Your executives asked uncomfortable questions. Your engineers asked uncomfortable questions. And you, the product leader, were left holding a beautiful feature that solved a problem nobody knew they had.

This is the story of most AI product launches in 2025 and 2026. And there's a framework that explains why.

The Struggling Moments Framework

The Struggling Moments framework, developed by Bob Moesta and Greg Engle, flips traditional product discovery on its head. Instead of asking "What features do customers want?", it asks "When are customers frustrated?" The insight is elegantly simple: people don't buy products, they hire products to make their lives easier when they're stuck.

A struggling moment is the moment before someone buys a solution. It's the manual spreadsheet that takes two hours on Friday. It's the data scattered across five tools. It's the email chain where you're asking colleagues the same question for the fifth time this month.

The framework works in three steps:

First, identify the struggle. Where are your best customers most frustrated? Not theoretically frustrated. Actually frustrated. Frustrated enough that they've built workarounds, hired contractors, or resigned themselves to inefficiency. Moesta calls these "pushes" and "pulls". The push is the pain of the current situation. The pull is the vision of a better way.

Second, validate the struggle is real. Talk to customers in context. Watch them experience the problem. How much time does it waste? How often does it happen? What's the emotional weight? A struggling moment that happens once a quarter carries less weight than one that happens daily.

Third, design for the moment. Most products launch features that solve the problem in isolation. Great products solve the problem in the context where it happens. They reduce friction to adoption by meeting customers exactly where they're struggling, not requiring them to adopt a new workflow.

How This Applies to Your AI Feature

Let's go back to that AI copilot. Your team interviewed customers. Every interview mentioned the same pain: "We spend so much time searching through Slack to find decisions." That was the push. The pull was clear: an AI assistant that could search, synthesize, and summarize decisions instantly.

So your team built exactly that. A beautiful AI copilot that does real-time synthesis. Ship day comes. You roll it out. And then adoption stalls.

Here's what happened: You solved the pain in isolation. You created a new tool, added it to the sidebar, and asked users to remember to use it when they had the problem. You didn't change when or how they searched for information.

But the struggling moment wasn't "I wish I had a better search tool." The struggling moment was "It's 3 PM on Friday, I'm in a meeting about quarterly results, and I need to know what we decided about metrics last week, right now." That moment happens inside Slack, in the meeting context, under time pressure.

Your AI copilot was the right solution. But it was in the wrong place. Users were struggling inside their decision-making meetings, in Slack DMs, in real-time. Your feature required them to exit their context, open a sidebar, and remember that the problem they were having was solvable by your feature.

Teams that understand Struggling Moments think differently. They ask: "When is the exact moment the user is in pain? And how do we meet them there?"

How Smart Teams Applied This

A fintech product team noticed their struggling moment was different from what they expected. Research showed customers were frustrated during financial reviews with their accountants. The moment was: the accountant asks "what's your Q3 spending looking like?" and the founder panics because all their data is scattered.

Instead of building a beautiful analytics dashboard (which would be used maybe quarterly), they built a Slack integration that proactively prompted users the day before their accountant calls. "Hey, your accountant meeting is tomorrow. Want a summary of your Q3 spending?" Users could click once and get an email with charts ready for the call.

Adoption went from 3% to 67% in four weeks. Same feature. Different context.
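A proactive nudge like this is simple to sketch. The function below builds a reminder payload timed one day before a meeting, shaped for Slack's chat.scheduleMessage API; the channel name, message wording, and helper name are illustrative assumptions, not details from the fintech team's actual implementation.

```python
from datetime import datetime, timedelta, timezone

def schedule_nudge(meeting_time: datetime, summary_label: str) -> dict:
    """Build a proactive nudge to post one day before a scheduled meeting.

    Returns a payload shaped for Slack's chat.scheduleMessage API.
    Channel name and wording are illustrative, not prescriptive.
    """
    post_at = meeting_time - timedelta(days=1)
    return {
        "channel": "#finance-nudges",         # hypothetical channel
        "post_at": int(post_at.timestamp()),  # Slack expects a Unix timestamp
        "text": (
            f"Your accountant meeting is tomorrow. "
            f"Want a summary of your {summary_label}?"
        ),
    }

meeting = datetime(2025, 10, 17, 15, 0, tzinfo=timezone.utc)
payload = schedule_nudge(meeting, "Q3 spending")
# The payload could then be sent with, e.g.,
# slack_sdk.WebClient(token=...).chat_scheduleMessage(**payload)
```

The point isn't the plumbing; it's that the trigger is the calendar event (the moment of struggle), not a feature the user has to remember exists.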

The insight here transcends the product. It forced the team to think about:

  • How users actually work (in Slack, before meetings, under pressure)

  • When to intervene (right before the moment, not after)

  • How to reduce friction to adoption (one click in the flow, not five steps in a new interface)

This is the Team/Platform/Process lens in action. It's not just about the product feature. It's about how your feature fits into the customer's entire system. Your payment tool doesn't exist in isolation. It exists inside their accounting workflow, their month-end close, their ERP system, their mental model of cash flow.

What You Can Do in Five Working Days

If your AI feature is underperforming (or you're shipping one next), start here:

1. Conduct 5-7 user interviews focused on the struggling moment, not the feature. Ask: "Tell me about the last time you wanted to [solve the core problem]. What were you doing? Who was in the room? What was the time pressure? How did you actually solve it?" Record the exact context. Most teams skip this and go straight to feature feedback. Don't. The context is everything.

2. Map your feature to the actual moment of struggle. Draw a timeline of the customer's workflow. Mark where the pain happens. Mark where your feature currently sits. If there's a gap between the two, you've found the adoption problem. Your job is to shrink or eliminate that gap. That might mean integrations, proactive nudges, in-context access, or changing when the feature is offered.

3. Run a 2-week pilot where you meet users in their struggling moment, not in your product. If the struggle happens in Slack, build there. If it happens in email, meet them there. If it happens during a specific workflow, integrate into that workflow. Measure adoption as "did the user take action during their moment of struggle?" not "did they open your feature?"

The struggling moment framework isn't a magic wand. But it reframes the problem from "why aren't users adopting our feature?" to "are we meeting users where they're actually struggling?" Usually, the answer reveals something simple you missed.

Your AI copilot might be brilliant. But brilliance in the wrong moment is just friction.

Author's Note: This framework was developed by Bob Moesta and Greg Engle in "Demand-Side Sales 101" and is foundational to the Practitioner's Guide approach. For product leaders, the insight transfers directly: solve the problem where it happens, not where it's convenient to ship.
