How We Gather Product Insights Faster By Automating These 5 Questions

Objective
As Product Designers, we know that user interviews and feedback can yield the most valuable insights. Yet Product Owners often have little appetite for User or Customer Research.
To keep up with the fast-moving pace of digital products, we needed to answer:
"How might we continuously get valuable User Research without spending a lot of time and effort?"
Specifically, we were looking for:
- A process for "pre-recruiting" people for user interviews. If we need quick research, this would cut the time it takes to spin up a recruiting process.
- A source of "Always On" qualitative feedback, so that at a moment's notice a quick review of our feedback could inform product decision-making.
Solution
Step 1: Automate 5 Questions To Measure Product/Market Fit
We created a simple Typeform survey and showed it inside the actual product, but only to people who had engaged with the product at least twice (a minimal gating sketch follows the question list). The survey asked the following questions:
- What type of people do you think would most benefit from the app?
- What is the main benefit you receive from the app?
- How can we improve the app for you?
- How would you feel if you could no longer use the app? A) Very disappointed, B) Somewhat disappointed, or C) Not disappointed
- Would you be open to providing feedback on the designs of future versions of the app? (Y/N)
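If the product already tracks per-user engagement, the "show it after the second engagement" rule can be a one-line check. The sketch below is only an illustration in Python: the engagement_count and has_seen_survey fields and the placeholder form URL are assumptions for the example, not our actual implementation or Typeform's API.

```python
# Hypothetical sketch: decide whether to show the in-product Typeform survey.
# The engagement counter, the has_seen_survey flag, and the form URL are
# illustrative assumptions.

SURVEY_URL = "https://example.typeform.com/to/placeholder"

def should_show_survey(user: dict) -> bool:
    """Show the survey only to users who have engaged at least twice
    and have not already seen it."""
    return user.get("engagement_count", 0) >= 2 and not user.get("has_seen_survey", False)

def survey_prompt(user: dict) -> str | None:
    """Return the survey link for eligible users, otherwise None."""
    return SURVEY_URL if should_show_survey(user) else None

# A user on their second engagement gets the survey link.
print(survey_prompt({"engagement_count": 2, "has_seen_survey": False}))
```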
Step 2: Analyze the Results
We review submissions every 8 weeks and present findings during our Product Strategy Workshops. The responses can help inform our understanding of two basic questions:
- Why do people love the product?
- What holds people back from loving the product?
We were inspired by this product's case study approach to gathering insights (not by their product decision making; a reminder that user feedback does not replace good judgement).
The TL;DR version of the Case Study is this (a minimal scoring sketch follows the quoted points):
- "Ask your users how they would feel if they could no longer use your product. The group that answers 'very disappointed' is experiencing product/market fit."
- "Our next step was somewhat counterintuitive: we decided to politely pass over the feedback from users who would not be disappointed if they could no longer use the product."
- "To increase your product/market fit score, spend half your time doubling down on what users already love and the other half on addressing what's holding others back."
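The "product/market fit score" in that last quote is simply the share of respondents who answer "very disappointed". A minimal sketch of how it could be computed from an exported batch of survey responses, assuming each response is a dict with a "disappointment" field (the field name is illustrative):

```python
# Hypothetical sketch: score exported survey responses.
# Each response is assumed to be a dict whose "disappointment" field holds
# the answer to "How would you feel if you could no longer use the app?"

from collections import Counter

def pmf_score(responses: list[dict]) -> float:
    """Share of respondents answering 'Very disappointed'."""
    if not responses:
        return 0.0
    counts = Counter(r["disappointment"] for r in responses)
    return counts["Very disappointed"] / len(responses)

sample = [
    {"disappointment": "Very disappointed"},
    {"disappointment": "Somewhat disappointed"},
    {"disappointment": "Not disappointed"},
]
print(f"{pmf_score(sample):.0%}")  # 33%
```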
Step 3: Perform Ad-Hoc User Interviews When Needed
When we want to do user interviews, we'll recruit people who have already opted in to giving feedback in the future. This gives us a quick and easy way to tap a pool of people willing to provide feedback. We focus on speaking with the people who responded "very disappointed" or "somewhat disappointed" in the survey.
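A minimal sketch of how that recruiting pool could be pulled from a survey export, assuming the final (Y/N) question is stored in an "opt_in" field and the disappointment answer in a "disappointment" field (both names are illustrative):

```python
# Hypothetical sketch: build the interview recruiting pool from survey exports.
# The "opt_in", "disappointment", and "email" field names are illustrative.

def interview_pool(responses: list[dict]) -> list[dict]:
    """Respondents who opted in to future feedback and answered
    'Very disappointed' or 'Somewhat disappointed'."""
    wanted = {"Very disappointed", "Somewhat disappointed"}
    return [
        r for r in responses
        if r.get("opt_in") == "Y" and r.get("disappointment") in wanted
    ]

pool = interview_pool([
    {"opt_in": "Y", "disappointment": "Very disappointed", "email": "a@example.com"},
    {"opt_in": "N", "disappointment": "Very disappointed", "email": "b@example.com"},
    {"opt_in": "Y", "disappointment": "Not disappointed", "email": "c@example.com"},
])
print([r["email"] for r in pool])  # ['a@example.com']
```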
What does it mean to prioritize feedback from the very/somewhat disappointed crowd?
Surprisingly, the case study would suggest ignoring the unhappy ("not disappointed") users completely...
"This batch of not disappointed users should not impact your product strategy in any way. They'll request distracting features, present ill-fitting use cases and probably be very vocal, all before they churn out and leave you with a mangled, muddled roadmap."
Instead, it's the "somewhat disappointed" users who provide the clearest insight into what is actually holding users back.
To segment even further, focus on the "somewhat disappointed" users who share the same "main benefit" as the "very disappointed" users (sketched in code after the quote below).
"From analyzing our third survey question, we knew that happy users enjoyed speed as their main benefit, so we used this as a filter for the somewhat disappointed group... and looked more closely at their answers to the fourth question on our survey: 'How can we improve [the product] for you?'"
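A hedged sketch of that segmentation, assuming the export stores the "main benefit" and "how can we improve" answers in main_benefit and improvement fields (field names are illustrative):

```python
# Hypothetical sketch of the segmentation above: find the benefit the
# "very disappointed" group names most often, then read the improvement
# requests of "somewhat disappointed" users who named the same benefit.
# Field names ("disappointment", "main_benefit", "improvement") are illustrative.

from collections import Counter

def improvement_requests(responses: list[dict]) -> list[str]:
    very = [r for r in responses if r["disappointment"] == "Very disappointed"]
    somewhat = [r for r in responses if r["disappointment"] == "Somewhat disappointed"]
    if not very:
        return []
    # The benefit the happiest users mention most often (e.g. "speed").
    top_benefit, _ = Counter(r["main_benefit"] for r in very).most_common(1)[0]
    # Improvement answers from somewhat-disappointed users who share that benefit.
    return [
        r["improvement"] for r in somewhat
        if r["main_benefit"] == top_benefit
    ]
```

In the case study's terms, the top benefit would come back as "speed" and the returned requests would surface things like the missing mobile app.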
Results
By the end of the case study, the Product Owner knew to devote half of their roadmap to doubling down on the main benefit (speed), while using the other half to address things that are holding "somewhat disappointed" users back (lack of a mobile app).
One more thing
Does this apply to Design Sprint User Interviews?
These principles can be useful in knowing which users' feedback to bookmark and carry forward past the point of validation.
Imagine a Design Sprint's "5 User Interviews" scenario. Let's say 3 of your 5 interviewees had only great things to say about the prototype. The 4th interviewee had positive and negative things to say. The 5th was not a fan.
You could then ask the following:
- Is there an identifiable "main benefit" among the positive responses?
- Was this "main benefit" acknowledged by the 4th interviewee?
If both answers are yes, you might consider the 4th interviewee's responses to be especially valuable for product strategy because they've identified both a) the common thing that everyone loves and b) at least one thing holding them back.