The real output of research is "Inception"
Insight is just the start. Your job isn’t done until something (or someone) changes.
The to-do that broke my brain
"TODO: Communicate the value of UX Research."
That one-liner, buried in a retro doc for a project I wasn’t part of, nearly made me spit out my coffee. It was written like a task, a small checkbox in the long tail of a completed project. But it perfectly captured the root problem with how most of us treat research and analysis: as deliverables.
Let me be clear: the value of research is the communication of research. If no one hears it, remembers it, or changes because of it, it didn’t happen.
I saw this firsthand while building a new data function. One of our first debates was about process: what’s the actual “Definition of Done” for quantitative analysis? The initial suggestions were what you’d expect: "report shared", "dashboard built", "ticket closed." But those are delivery milestones, not impact.
Eventually, we aligned on something sharper: the work isn’t done until the insight has been planted in the right stakeholder’s mind and influenced what they do next. In other words, the real output of research is inception.
Before we go deeper: a disclaimer
Let’s get one thing straight: generating insight is pretty damn hard. Qualitative or quantitative, finding the right question, getting clean data, applying the right lens—it takes skill, judgment, and often a bit of pain. This piece doesn’t discount that. That work is the backbone of everything.
But this article isn’t about how to do great research. It’s about what happens after. Because even the sharpest, most well-crafted insight is useless if it never lands.
The I³ model: insight → inception → impact
Here’s a simple mental model:
Insight = what you discovered
Inception = someone else believes it
Impact = they act on it
Here’s how that plays out in practice:
Insight: "Churn spikes at Day 2."
Inception: A PM starts asking, "What are we doing about Day 2?"
Impact: The team ships a new onboarding flow targeting Day 2.
Most folks stop at the first step and wonder why nothing changes. A few make it to step two. But step three? That’s the rarest, and it’s the only one that matters.
Where good analysis goes to die
You know the graveyard. Maybe you’ve added to it.
Dashboards dropped in Slack with "FYI"
12-slide decks shared after the meeting ends
Insights surfaced two days after the roadmap was locked
Charts with no story, context, or narrative
I’ve done it too. We all have. But let’s be real: if a decision’s already been made, your insight is trivia.
We don’t fail at research because the analysis is wrong. We fail because no one cared at the right time.
Redefining "done"
Let’s draw a clear line.
The bad definition of done:
Dashboard has 2 new filters (that no one will use)
Ticket marked complete
Folder created in Google Drive
Stakeholder says "Cool, thanks"
These are checkboxes of delivery. But delivery is not impact.
The REAL definition of done:
The right person saw the insight
They understood it
They believed it
They changed something because of it
To make this real, you can add a checklist to your team’s workflow (I’m going to try this soon and share the results):
A stakeholder can summarize the takeaway in their own words
The insight is referenced in a live conversation
A PM brings it up unprompted in a meeting
Something actually ships because of it
If none of that happens? Congrats. You made content, not change.
Communication is the work
The hardest part of research isn’t the stats. It’s making someone care. And that’s not a soft skill — it’s the actual job.
To get inception-level impact, try this:
1. Know the decision you want to shape
If there’s no decision on the table, your insight will float. I once did a deep dive into how Premium accounts were actually using Premium features. The analysis was solid, with some interesting insights about usage patterns. But we had just changed our pricing, and no one was paying attention to Premium usage at that moment; all eyes were on top-level metrics and other projects. The timing made the whole thing irrelevant. No decision = no landing zone.
2. Use their language, not yours
Instead of saying "users are confused by this step", say "people are dropping off because it feels like a dead end". Instead of "this color contrast violates accessibility guidelines", try "5% of users physically can’t use this feature." Translate the insight into the language of pain, opportunity, or risk. "The p-value is 0.002" becomes "there’s roughly a 1-in-500 chance we’d see a result this strong by pure luck" when you’re talking to executives.
3. Tell one story, not three charts
If they can’t repeat your insight in a meeting, it didn’t land.
Instead of sending a dashboard, send this:
"The thing to know: Day 2 retention is broken. 60% of churn happens here. This is where we win or lose."
When you get it right
The shift is subtle, but massive:
People ask for your input before decisions, not just after
They quote your work in meetings you’re not in
Your work stops being a reaction to the roadmap, and starts shaping it
That TODO from the retro? Not wrong. Just underestimated. Communicating the value of research isn’t a task.
It’s the actual job.