Opinion | The Feel-Good Decade: How Not Solving Problems Built Fortunes Until AI Arrived

Engagement Is No Longer the Holy Grail - But Are Users Ready for What's Next?

Last summer, I found myself sitting in a candlelit room in Chelsea at Peoplehood, surrounded by eleven strangers and a guide who would facilitate our hour-long “wellness experience.” The space was immaculate: carefully curated lighting, the kind of bergamot-juniper ambient scent that screams upscale wellness. We all sat in a circle. The session began with breathwork and uplifting music choreographed into every step. Then we took turns sharing how we “really feel,” each person getting roughly five minutes of speaking time spread across the hour, with people they’d never see again. The ask was simple: drop your guard and lay down your baggage, but make it super quick. An express confessional, if you will.

I watched person after person offer sanitized versions of their struggles (hints at deeper issues, carefully modulated vulnerability), and I realized I was witnessing something profound about our current moment in consumer experience design. Here was a service, created by the founders of SoulCycle, that promised community and connection, even “spirituality” (just without the bikes, as the founders put it), but delivered something fundamentally different: therapy for people not ready to open up, and the illusion of therapeutic progress without the uncomfortable work of actual therapy.

Peoplehood has since folded, though it may resurface as a product within Weight Watchers after its acquisition. But the experience crystallized something I’ve been observing throughout my career spanning visual, verbal, and interactive experience design. We’ve all lived through what I’d call the feel-good decade: the past ten years, during which we were perpetually in a transformative process but never actually got there. Products prioritized making users feel like they were solving problems rather than actually solving them.

This isn’t accidental; it’s the logical endpoint of an engagement-obsessed product development culture that has confused metrics with meaning, interactions with impact. The result is a consumer landscape filled with beautiful, addictive products that offer the dopamine hit of progress without the friction of real change.

Consider Duolingo, a product I genuinely admire for its design sophistication and gamification mastery. Its owl mascot has achieved cultural icon status, its streak mechanics are legendary among product designers, and its user engagement numbers are enviable. But ask any polyglot (or, honestly, anyone who’s tried to have a conversation in a language they’ve been “learning” on Duolingo for months) and you’ll encounter an uncomfortable truth. The app excels at making you feel like you’re learning a language while keeping you comfortably distant from the messy, ego-bruising reality of actual language acquisition.

True fluency requires uncomfortable conversations with native speakers, embarrassing mistakes, and the kind of sustained, focused practice that doesn’t easily gamify into bite-sized daily streaks.

Or take Finch, a mental health app that couldn’t convince users to engage in the challenging work of focused journaling (that crucial practice of sitting with your thoughts, examining your patterns, and confronting uncomfortable truths about your behavior and emotions). Instead, it found success in sending push notifications reminding users to drink water or tidy their room, small acts that make their self-care Tamagotchi happy. These are nice habits, certainly, but calling them mental health interventions is like calling a band-aid surgery.

This pattern repeats across verticals. Sleep tracking apps tell you precisely how long you slept and break down your REM cycles in beautiful data visualizations, but rarely translate that information into actionable insights that would help you get more sleep. Meditation apps offer thousands of guided sessions but struggle to help users develop the independent practice that defines genuine mindfulness. Fitness apps count steps and celebrate streaks but often fail to build the kind of sustainable movement practices that create lasting health changes.

The common thread? These products excel at engagement (that holy grail of venture-backed growth) while quietly abandoning their stated mission of meaningful behavior change. They’ve discovered something powerful about human psychology: we’re willing to pay for and repeatedly use products that make us feel productive without demanding we actually be productive.

From a business perspective, this makes perfect sense. Engagement is measurable, scalable, and fundable. VCs understand DAU (daily active users) and retention curves. They can model the growth potential of habit-forming products. What’s harder to quantify (and therefore harder to fund) is actual impact on users’ lives. Real behavior change is messy, nonlinear, and often involves periods where users need to step away from your product entirely to practice what they’ve learned.
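For readers who haven’t lived inside these dashboards, it’s worth seeing just how little these metrics ask of a product. Below is a minimal sketch (using synthetic events and hypothetical user IDs, not any real product’s data) of how DAU and day-N retention are typically computed from an event log:

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical event log: (user_id, activity_date) pairs.
events = [
    ("u1", date(2024, 1, 1)), ("u1", date(2024, 1, 2)),
    ("u2", date(2024, 1, 1)),
    ("u3", date(2024, 1, 1)), ("u3", date(2024, 1, 8)),
]

# DAU: number of distinct users active on each day.
active_by_day = defaultdict(set)
for user, day in events:
    active_by_day[day].add(user)
dau = {day: len(users) for day, users in active_by_day.items()}

def day_n_retention(events, cohort_day, n):
    """Fraction of users first seen on cohort_day who return exactly n days later."""
    first_seen = {}
    for user, day in sorted(events, key=lambda e: e[1]):
        first_seen.setdefault(user, day)
    cohort = {u for u, d in first_seen.items() if d == cohort_day}
    if not cohort:
        return 0.0
    target = cohort_day + timedelta(days=n)
    returned = {u for u, d in events if u in cohort and d == target}
    return len(returned) / len(cohort)

print(dau[date(2024, 1, 1)])                         # 3 active users
print(day_n_retention(events, date(2024, 1, 1), 1))  # D1 retention: ~0.33
print(day_n_retention(events, date(2024, 1, 1), 7))  # D7 retention: ~0.33
```

Notice what’s absent: nothing in either number distinguishes a user who is making real progress from one who is merely opening the app.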

This creates a perverse incentive structure where products are rewarded for keeping users dependent rather than helping them graduate to independence. It’s the difference between a language learning app that celebrates when users no longer need it because they’re having conversations in their target language, and one that celebrates 1,000-day streaks regardless of actual proficiency.

But I believe we’re approaching an inflection point, and it has everything to do with the rise of agentic AI.

Unlike the engagement-driven products dominating today’s market, agentic AI systems are fundamentally solution-oriented. They’re designed to complete tasks, solve problems, and deliver outcomes rather than maximize time-on-platform. An AI agent that helps you plan a trip isn’t trying to keep you planning forever. It wants to get you the best possible itinerary efficiently, so you can go enjoy your vacation.

This shift in technological capability could force a reckoning in consumer product design. When AI can actually solve problems rather than just simulate the feeling of progress, products that offer only the latter will become increasingly obsolete.

The challenge, of course, is that real solutions often require us to confront uncomfortable truths about ourselves and do difficult work. The Peoplehood experience was pleasant precisely because it offered the social validation of “opening up” without the risk of genuine vulnerability or the hard work of behavioral change that actual therapy demands. Similarly, Duolingo is popular partly because it never forces users to confront their lack of progress or pushes them into the uncomfortable zone of real conversation.

But here’s what’s fascinating about our current AI moment: we’re experiencing a kind of cognitive whiplash that perfectly illustrates how deeply we’ve been conditioned by engagement-trap products. Watch how people interact with AI today. They ask it to help plan their budgets, design workout routines, organize major life interventions, map out career transitions. Then they leave those tabs hanging, recommendations unimplemented, actionable advice gathering digital dust.

This is the psychological residue of years spent with products that trained us to find satisfaction in the process rather than the outcome. We’ve become so accustomed to apps that make us feel productive without demanding actual follow-through that when AI delivers genuinely implementable solutions (specific budget line items, detailed workout schedules, concrete next steps), we experience a kind of paralysis.

You can see this tension everywhere AI is being adopted. Business professionals ask ChatGPT to draft comprehensive strategic plans, then struggle to act on the recommendations because they’re used to planning software that celebrates the creation of plans, not their execution. Parents request detailed educational curricula for their children, then hesitate because they’re accustomed to learning apps that reward showing up, not mastering material.

I find this transition period revealing. Agentic AI doesn’t need us to be addicted. It needs us to achieve our goals so we’ll trust it with bigger, more complex problems. This fundamental alignment between user success and business success could herald a new era of products that prioritize genuine impact over engagement metrics.

But first, we have to unlearn years of conditioning that taught us to prefer the comfort of productive procrastination over the discomfort of actual change. The question isn’t just whether AI will disrupt engagement-driven products; it’s whether we’re psychologically ready to engage with tools that actually expect us to do the hard work.

The comfort trap is seductive, but the cost of remaining trapped is becoming increasingly clear. Real change has always required discomfort. Perhaps it’s time our products started being honest about that.

Photo credits: rodrigocartoon.com
