Obsession, Autonomy, and the Quiet Power of Saying No
Customer obsession is everywhere in tech. But as privacy-first tools and open stacks gain momentum, a deeper question is emerging: are we building for growth — or for user sovereignty?
Last week, I watched two very different product conversations unfold in parallel.
One was a familiar rallying cry: customer obsession. Quotes from Bezos. Screenshots of dashboards. A reminder that the most successful companies build from the outside in.
The other was quieter but just as pointed. A thread about GrapheneOS committing to remain usable without requiring personal information. A celebration that .NET MAUI is coming to Linux. Builders choosing Laravel because it gives them control over their SaaS stack. Engineers recreating 3dfx Voodoo hardware on FPGAs simply because they can.
On the surface, these conversations live in different corners of our industry. One is about growth and winning. The other is about autonomy and control. But the more I sit with them, the more I see the same tension running through all of it:
Are we obsessed with customers — or are we building products that let people stay sovereign?
As a product designer, I find that question matters more than it might sound.
The Seduction of "Customer Obsession"
Let’s start with the phrase itself.
“Customer obsession” has become a kind of moral shorthand in tech. If you question a feature, someone invokes it. If you push back on a decision, someone asks, “Is this really customer-obsessed?”
And to be clear — the principle is powerful. Companies that consistently outperform their peers are deeply attuned to user needs. According to PwC, 73% of consumers say experience is a key factor in their purchasing decisions, yet only 49% feel companies provide a good experience. There’s still an enormous gap.
Listening better absolutely builds better products.
But here’s the nuance we don’t talk about enough: obsession can turn into extraction.
To obsess over customers at scale, you need data. Lots of it. Behavioral telemetry. Session recordings. AI models trained on thousands of support tickets. Tools that promise PMs they can “understand 10x more customers” because they can summarize 500 feedback responses in seconds.
I use these tools. They’re powerful. They help us see patterns we’d otherwise miss.
But every time we say “we need more insight,” we are also saying, implicitly, “we need more of you.”
More clicks. More context. More personal information.
And that’s where the second set of conversations — about Linux support, privacy-first operating systems, open stacks — becomes interesting.
The Other Kind of Obsession: User Autonomy
GrapheneOS committing to remain usable without requiring personal information isn’t flashy. It won’t trend on LinkedIn. But it represents a different value system.
Not “How do we know more about users?”
But “How little can we require from them and still deliver value?”
As designers, we rarely frame our decisions that way. We’re trained to reduce friction, increase activation, improve retention. But sometimes the highest expression of respect is restraint.
I once worked on a SaaS product where the growth team wanted to require phone numbers for account creation. The data suggested that SMS-based onboarding would increase activation by a few percentage points.
On paper, it was a clear win.
But in user interviews, something subtle emerged. Freelancers using the product didn’t want clients to have their personal numbers. International users worried about spam and cross-border data use. A few mentioned previous breaches at other companies.
None of this showed up in the activation funnel.
We ultimately made phone numbers optional.
Activation grew more slowly than projected. But churn six months later was lower among users who opted in voluntarily. More importantly, support tickets about privacy concerns dropped significantly.
We didn’t obsess over extracting more information. We obsessed over preserving trust.
There’s a difference.
The Stack Is a Values Statement
The conversations about MAUI coming to Linux and choosing Laravel for SaaS might seem purely technical. But stack decisions are never just technical.
When a team celebrates broader Linux support, what they’re really celebrating is portability and independence. When a founder chooses a framework they fully understand rather than a trendy abstraction, they’re choosing clarity over hype.
In my experience, the technology stack quietly shapes the product’s moral posture.
A few patterns I’ve noticed:
- Closed ecosystems tend to centralize control. They make certain experiences seamless — but often at the cost of flexibility.
- Open ecosystems demand more effort. But they distribute power more evenly between builders and users.
- Instrumentation-heavy stacks encourage measurement-first thinking. This can be healthy — or it can distort priorities.
None of these are inherently good or bad. But they’re not neutral.
When we design a system that requires login for basic functionality, that’s a values choice. When we allow anonymous usage with limited features, that’s a values choice. When we store less data by default, that’s a values choice.
The craft of interaction design lives in these details.
A checkbox that defaults to “share usage data.” A modal that blocks access until permissions are granted. A dashboard that shows metrics about users without ever showing the humans behind them.
Design systems codify these decisions at scale. Once a pattern is set — required fields, default toggles, consent flows — it replicates everywhere. That’s why the details matter so much.
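To make that concrete, here’s a minimal sketch of how protective defaults might be codified. The `ConsentDefaults` class and its fields are hypothetical, not taken from any real design system:

```python
from dataclasses import dataclass

@dataclass
class ConsentDefaults:
    """Hypothetical consent pattern for a design system.

    Every field defaults to the protective option; data is shared
    only when the user explicitly opts in.
    """
    share_usage_data: bool = False      # analytics off until opted in
    session_recording: bool = False     # no recordings by default
    require_phone_number: bool = False  # optional field, never a gate

# Once the pattern ships, every surface inherits the same defaults.
signup_flow = ConsentDefaults()
settings_page = ConsentDefaults()
assert signup_flow == settings_page  # one decision, replicated everywhere
```

The value of the pattern is exactly that it replicates: a protective default set once becomes the invisible norm everywhere.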
We aren’t just shaping interfaces. We’re shaping relationships.
AI Summaries and the Distance Between Us
The rise of AI tools that promise to help PMs “understand 10x more customers” is a perfect example of this tension.
On one hand, the efficiency gains are real. I’ve used AI to cluster hundreds of survey responses in minutes. What used to take days now takes hours. According to McKinsey, generative AI could add between $2.6 trillion and $4.4 trillion annually to the global economy — much of it from productivity improvements in knowledge work.
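To ground that, here’s a minimal sketch of what this kind of feedback clustering can look like, using plain scikit-learn rather than any specific AI product. The responses and theme count are illustrative, not real data:

```python
# Minimal sketch: clustering open-ended feedback into themes.
# Assumes scikit-learn is installed; responses are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

responses = [
    "I can't find the export button anywhere",
    "Export to CSV is buried three menus deep",
    "Love the new dashboard, very clean",
    "The dashboard finally makes sense to me",
    "Pricing page is confusing about seat limits",
    "Not sure how many seats my plan includes",
]

# Turn free text into TF-IDF vectors, then group them into k themes.
vectors = TfidfVectorizer(stop_words="english").fit_transform(responses)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(vectors)

for theme in range(3):
    members = [r for r, label in zip(responses, kmeans.labels_) if label == theme]
    print(f"Theme {theme}: {members}")
```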
But there’s a subtle risk.
When we summarize 500 pieces of feedback into five bullet points, we gain clarity — and we lose texture.
We lose the tone of the frustrated user who wrote three paragraphs at midnight. We lose the hedging language of someone unsure if their problem “counts.” We lose the emotional spikes that don’t fit neatly into clusters.
In a recent project, we ran AI clustering on open-ended feedback about a new feature. The model identified three dominant themes. Clean. Actionable.
But when I went back to read 20 raw responses myself, I noticed something the summary didn’t emphasize: several users weren’t confused by the feature — they were embarrassed by it. They felt it made them look incompetent in front of teammates.
That emotional layer barely registered in the clustered summary.
If customer obsession becomes purely computational, we risk optimizing for patterns instead of people.
The goal isn’t to reject AI. It’s to stay close enough to the raw material that we remember what it feels like to be on the other side.
Designing for Power — Not Just Satisfaction
Here’s the deeper shift I’m seeing across these conversations:
We’re moving from designing for satisfaction to designing for power.
Satisfaction asks:
- Did the user complete the task?
- Did they rate us highly?
- Did they stay?
Power asks:
- Can they leave easily?
- Do they understand what we’re collecting and why?
- Can they use this tool without surrendering more than they’re comfortable with?
The first lens optimizes for growth. The second optimizes for dignity.
And here’s the important part: these aren’t mutually exclusive.
Products that respect autonomy often build deeper loyalty. Edelman’s Trust Barometer consistently shows that trust is a primary driver of brand loyalty, especially in technology sectors where data misuse is a growing concern.
In my own work, the products that endure aren’t the ones that squeezed the most out of users. They’re the ones that felt safe to invest in.
Safe to put your workflow into. Safe to connect your data to. Safe to recommend to a colleague without caveats.
That safety isn’t accidental. It’s designed.
It shows up in:
- Clear, human-readable privacy settings.
- Reversible actions and transparent logs.
- Defaults that protect rather than expose.
- Business models that don’t depend on hidden tradeoffs.
These are small interface choices with enormous relational consequences.
The Question Beneath the Trends
When I step back from the week’s conversations — about obsession, AI, open stacks, privacy-first systems — I don’t see fragmentation. I see a single, shared anxiety.
We want to build winning products.
But we also want to live in a world where technology doesn’t quietly erode our agency.
As designers and product leaders, we sit at that intersection every day. We choose what to measure. We choose what to require. We choose which defaults become invisible norms.
Customer obsession is a powerful principle. But perhaps the more mature version of it sounds like this:
Be obsessed with your customer’s success — not their data.
That shift changes how we write requirements. It changes how we structure onboarding. It changes how we think about AI.
And it changes how we define “winning.”
The teams celebrating Linux support, privacy-first systems, and thoughtful stack choices aren’t rejecting customer focus. They’re expanding it. They’re reminding us that sometimes the most customer-centered decision is the one that gives power back.
As someone who cares deeply about craft — about the alignment between what we say and what we ship — I find that grounding.
Because in the end, the interface isn’t just what users see.
It’s the boundary of what we ask from them.
And that boundary is one of the most important design decisions we make.
Alex leads product design with a focus on creating experiences that feel intuitive and human. He's passionate about the craft of design and the details that make products feel right.