The Discipline of Seeing What’s Actually There
Across walkthroughs, onboarding debates, and AI headlines, a pattern is emerging: we can build faster than ever, but we’re not always disciplined enough to see what users actually do. Real impact starts there.
Last week, I sat in on a live walkthrough with a customer using a reporting feature our team had shipped three months earlier. On paper, it was a success. Adoption was climbing. Support tickets were low. The dashboard looked clean and confident.
Ten minutes into the session, the customer said something quietly devastating: “I mean, it works. I just export it to Excel so I can trust it.”
Nothing was technically broken. The numbers were accurate. The feature met every requirement in the spec. And yet, the product we built wasn’t the product they felt safe using.
Over the past few days, I’ve been watching conversations across our field—about guided UX walkthroughs, ruthless prioritization while building SaaS, digital product engineering, onboarding drop-off, financial unit testing, and even CEOs reporting that AI hasn’t meaningfully changed productivity. On the surface, they seem like separate threads.
But they’re orbiting the same question:
Are we disciplined enough to see what’s actually happening — not what our metrics, frameworks, or roadmaps say should be happening?
As designers, that question sits uncomfortably close to our craft. Because design isn’t just about what we ship. It’s about what people actually do when we’re not in the room.
Walkthroughs Are About Behavior, Not Validation
There’s a renewed interest in live, guided walkthroughs — and for good reason. Watching someone use your product in real time has a way of stripping away comforting narratives.
In a recent project, we invited five mid-market customers to walk us through how they generated monthly performance reports. We didn’t ask them to “test” anything. We simply asked them to narrate their normal process.
Here’s what we observed:
- 4 out of 5 users opened a secondary spreadsheet before touching our analytics tool.
- 3 recreated calculations we already displayed.
- 2 explicitly said they needed a “sanity check” outside our interface.
Our quantitative dashboard showed strong engagement. But the walkthroughs revealed something more important: engagement isn’t the same as trust.
The Nielsen Norman Group has long reported that usability testing with just five users can uncover around 85% of usability problems. I’d add a nuance: five users won’t just show you usability gaps — they’ll show you where your product doesn’t fit into reality the way you thought it did.
Live sessions surface things no dashboard can:
- The second monitor habit.
- The sticky note with backup instructions.
- The hesitation before clicking “Publish.”
Design lives in those details.
Ruthless Prioritization Is a Design Skill
Another thread I’ve seen this week: founders building SaaS products while working full-time jobs, emphasizing ruthless prioritization over hustle.
That framing matters.
We often treat prioritization as a product management discipline — roadmaps, scoring models, impact-effort matrices. But at its core, prioritization is design. It’s deciding what cognitive load your user carries and what your product carries instead.
When you’re building nights and weekends, you don’t have the luxury of ornamental features. You focus on what makes the product viable. Ironically, larger teams often lose this clarity.
I’ve seen this firsthand. On a previous team, we spent six weeks refining advanced filtering options for power users. During research, we discovered that 60% of new users never made it past onboarding because they couldn’t configure the first data source correctly.
We were polishing the fifth step. They were stuck on step one.
Ruthless prioritization isn’t about speed. It’s about asking:
- Where is the first real moment of friction?
- What would make this feel safe enough to continue?
- If we removed three things, would the core experience become clearer?
That’s not hustle culture. That’s design maturity.
When Engineering Accelerates, Judgment Becomes the Constraint
There’s also a lot of discussion about digital product engineering and accelerating launch cycles — cloud infrastructure, DevOps pipelines, AI-assisted development. We can ship faster than ever.
And yet, according to a 2023 McKinsey report, 75% of digital transformations still fail to meet their stated objectives.
Speed isn’t the bottleneck.
Judgment is.
I’ve worked on teams where we could spin up a new feature flag in hours. The question wasn’t “Can we build it?” It was “Should this exist?”
Acceleration without clarity creates a different kind of drag:
- More surface area to maintain
- More onboarding complexity
- More edge cases to test
- More cognitive load for users
As a design lead, I’ve learned that my job isn’t just to shape interfaces. It’s to protect coherence. Every new feature is a promise. Every toggle is a tiny contract with a user’s attention.
When we move fast without discipline, we don’t just add functionality — we dilute meaning.
Onboarding Isn’t a Funnel. It’s a Trust Exchange.
One of the most persistent conversations in SaaS is about onboarding drop-off. And the numbers are sobering. Industry benchmarks often show that 40–60% of users who sign up for a SaaS product will use it once and never return.
We respond with tooltips, checklists, progress bars.
Sometimes those help. Often, they decorate confusion.
In one onboarding redesign I led, we initially focused on reducing time-to-first-action. We shaved the setup process from eight steps to four. Completion rates improved by 18%.
But retention barely moved.
When we ran live sessions, we noticed something subtle: users weren’t asking “How do I finish onboarding?” They were asking, implicitly, “Is this going to make me look competent in front of my team?”
So we reframed the experience around reassurance:
- We added contextual examples using realistic data.
- We surfaced a short explanation of how calculations worked.
- We made error states specific and instructional instead of generic.
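The last change is easiest to see side by side. As a toy illustration (the names and messages here are hypothetical, not from the actual product), the shift from a generic error state to an instructional one might look like:

```typescript
// Hypothetical sketch of an error-state model. The shape and wording
// are illustrative assumptions, not the real product's code.

type ErrorState = {
  title: string;    // what happened
  body: string;     // why it happened
  nextStep: string; // exactly what the user should do now
};

// Generic version: technically accurate, but it leaves the user
// guessing whether they broke something.
function genericError(): ErrorState {
  return {
    title: "Something went wrong",
    body: "We couldn't complete your request.",
    nextStep: "Please try again later.",
  };
}

// Instructional version: names the source, the cause, and the fix,
// and reassures the user that their work isn't lost.
function dataSourceAuthError(sourceName: string): ErrorState {
  return {
    title: `Can't connect to ${sourceName}`,
    body: `The API key for ${sourceName} was rejected, so no data was imported.`,
    nextStep: `Paste a new key from ${sourceName}'s admin console, then click Retry. Nothing you've configured so far is lost.`,
  };
}
```

The design choice is the same one described above: the second version answers the implicit question "Did I do something wrong, and can I fix it?" instead of merely reporting failure.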
We didn’t just reduce friction. We reduced anxiety.
Retention improved meaningfully in the following quarter.
Onboarding isn’t about guiding someone through your UI. It’s about earning enough trust that they’re willing to change a habit.
AI, Productivity, and the Illusion of Impact
Another headline circulating this week: thousands of CEOs reporting that AI hasn’t significantly impacted employment or productivity.
That finding doesn’t surprise me.
In design critiques, I’ve started asking a simple question when AI features are proposed: “What human hesitation does this remove?”
If the answer is vague — “It makes things faster” — we’re probably designing for novelty.
If the answer is concrete — “It reduces the blank-page moment,” or “It helps you double-check before sending something irreversible” — now we’re in the territory of actual impact.
Productivity doesn’t improve by adding intelligence in the abstract. It improves when we reduce the right kind of friction:
- Decision paralysis
- Repetitive formatting work
- Cross-referencing between tools
- Fear of making irreversible mistakes
If AI doesn’t address those lived moments, it won’t move the needle — no matter how impressive the demo.
And that brings us back to discipline.
The Craft Is in the Small, Observable Truths
Across all these conversations — walkthroughs, prioritization, engineering velocity, onboarding, AI — I see a quiet throughline:
We are surrounded by sophisticated systems, but the work still hinges on noticing small, human truths.
The exported spreadsheet. The abandoned onboarding flow. The unused advanced filter. The AI feature that no one quite trusts.
Design is not the art of making things beautiful. It’s the discipline of aligning intention with behavior.
Practically, that has changed how I approach my work:
- I push for at least one live session before finalizing major flows.
- I treat every added feature as a cost to coherence.
- I ask what emotional state we’re designing for — not just what task.
- I look for workarounds as signals of misalignment, not user stubbornness.
These aren’t grand strategies. They’re habits of attention.
And in a moment when we can build almost anything, attention is the scarce resource.
Closing the Gap Between Signal and Story
It’s tempting to believe that better dashboards, better frameworks, or better AI tools will close the gap between what we think is happening and what’s actually happening.
But the gap isn’t technical.
It’s perceptual.
We have to be willing to watch someone export the data we carefully designed — and resist the urge to defend it.
We have to be willing to see that acceleration doesn’t equal impact.
We have to notice when a feature works perfectly and still doesn’t feel usable.
That kind of discipline isn’t flashy. It doesn’t show up in release notes. But it’s the difference between a product that functions and a product that fits.
And in the end, fit is what people remember.
Not how fast we shipped. Not how advanced the system was.
But whether it quietly supported the work they were already trying to do.
That’s the standard I keep coming back to — in walkthroughs, in roadmaps, in every design critique.
See what’s actually there.
And design from that truth forward.
Alex leads product design with a focus on creating experiences that feel intuitive and human. He's passionate about the craft of design and the details that make products feel right.