Customer interview playbook
How to recruit, run, and synthesize customer interviews — without falling into the traps that kill the signal. 30 tested questions organized by what you're trying to learn, plus the 5 questions that always lie.
What makes a customer interview different
The phrase "talk to your customers" gets used for everything. A customer interview is a specific format with specific rules — and it's different from the things it gets confused with.
Customer interview
Goal: learn. You ask 20%, listen 80%. No pitching.
Sales call
Goal: convert. You pitch, they evaluate. Outcome is yes/no.
Survey
Goal: scale. Many people, shallow answers. No follow-ups.
Mixing them destroys all three. If you pitch in a customer interview, you contaminate the data. If you ask 50 open-ended questions in a survey, you get a 2% completion rate. Pick a format and respect its constraints.
Before the interview
Recruit deliberately
Three channels in order of quality: warm intros (best signal, slowest), targeted LinkedIn (5–10% reply, scalable), paid platforms like User Interviews or Respondent ($50–150/interview, fastest but quality varies). For early validation, prioritize warm intros — your time is more valuable than the interview cost.
Screen aggressively
A single screener question filters out roughly 70% of unqualified applicants: "In the last 30 days, have you experienced [specific problem]?" If they haven't, they'll guess in your interview, and you'll spend an hour on noise. Better to lose the interview than fill your data with hypotheticals.
Prepare a structured guide, not a script
Write down your top 5 questions per category (problem severity, past behavior, willingness to pay, alternatives, workflow). Order them logically but don't read them. The best interviews follow the interviewee's lead: a script forces you back to your agenda when you should be following their tangent.
The structure of a great interview
A 60-minute slot split into 5 phases: 55 minutes of structure plus buffer. The middle 30 minutes (past behavior + workflow) are where the real signal lives. The first 5 and last 5 minutes are protocol.
Warm-up (5 min)
Build rapport. Ask about their role, how long they've been doing it, what their day looks like. Don't mention your product. Goal: they relax, you confirm they actually fit your ICP.
Context & current state (15 min)
Get them talking about their world — what tools they use, how the team is structured, what their goals are. You're building a mental model of their reality before you can identify pain.
Past behavior — the meat (20 min)
Ask about specific recent events: the last time they faced this problem, what they did, what worked, what frustrated them. Past behavior is the most reliable predictor of future behavior. Avoid hypotheticals.
Workflow deep-dive (10 min)
Walk through their actual workflow step by step. "Show me how you'd normally do X." "What happens right before this step? What happens right after?" This surfaces unmet needs the customer can't articulate.
Wrap & commitment ask (5 min)
Summarize what you heard. Ask if you missed anything. Then close with the commitment ask: can they take a follow-up call? Can they intro you to a colleague? Skin-in-the-game signals are the closest thing to a buying signal you'll get.
30 questions that work
Organized by what you're trying to learn. Don't use all 30 in one interview — pick 5–7 from the categories most relevant to your hypothesis. Save the rest for follow-up rounds.
Problem severity & frequency
1. Tell me about the last time you ran into [problem]. Walk me through what happened.
2. How often does that happen: once a week? Once a month? Once a year?
3. What did you do about it? Did anything you tried actually help?
4. On a scale of 1 to 10, how painful was it the last time?
5. If you could wave a magic wand and make this go away, what would change in your life or work?
6. What's the impact when this problem isn't solved? Lost time? Lost revenue? Lost sleep?
Past behavior — what they actually did
1. Walk me through the last time you tried to solve this. Take your time, even small details.
2. What tools or methods did you try? In what order?
3. Where did each of those fall short?
4. How long did the whole process take?
5. Who else was involved? Did you have to convince anyone?
6. When you finally settled on something, why that choice and not the others?
Willingness to pay
1. How do you currently spend money or time on this? (Tools, contractors, your own hours?)
2. What's the budget process at your company for tools like this?
3. If a solution existed that completely solved this, what would be a reasonable price?
4. What would you have to give up to pay for it? (Other tools? Headcount? Other budget?)
5. Who would need to approve a purchase decision of $X?
6. Have you ever paid for something to solve this before? How much, and was it worth it?
Alternatives & competition
1. What's your current go-to solution for this, even if it's a workaround?
2. What do you like about it? What annoys you about it?
3. If you had to switch tomorrow, what would you switch to?
4. Have you ever tried [closest competitor]? Why did/didn't you stick with it?
5. What would a competitor have to do to win you over?
Workflow & integration
1. Show me how you'd normally do this (literally, screen-share if remote).
2. What happens right before this step? What happens right after?
3. Where do you switch between tools? What gets dropped or duplicated?
4. If you could remove one step from this workflow, which would it be?
5. What does the perfect version of this look like to you?
Wrap & commitment
1. Anything I should have asked but didn't?
2. Who else faces this problem that I should talk to?
3. Would you be willing to try an early version when we have one?
4. Would you put down a deposit / book a follow-up / sign a letter of intent?
5. How would I know if I built the right thing for you?
5 questions that always lie
These questions feel productive; they're actually traps. The answers are hypothetical, polite, or both. Replace them with the alternatives.
❌ "Would you use this?"
Hypothetical. Polite people answer yes. You learn nothing.
✓ Instead: "When was the last time you faced this, and what did you actually do?"
❌ "How often do you have this problem?"
Self-reported frequency is exaggerated when the topic is on someone's mind. They overestimate.
✓ Instead: "When did this last happen?" Then count the gaps in their stories yourself.
❌ "What features would you want?"
Customers are bad at solution design. They'll list features they'll never use, and miss the obvious ones they need.
✓ Instead: "Walk me through your workflow." Then identify the gap yourself.
❌ "Would you pay $X for this?"
Free hypothetical money. Everyone says yes — until the credit card comes out.
✓ Instead: "Would you put down a $50 deposit to be first in line?" Real money is the only honest answer.
❌ "Don't you think this is a great idea?"
Leading question + compliment-bait. They'll agree to spare your feelings.
✓ Instead: Don't pitch. Don't ask for opinions. Ask about facts and behavior.
After the interview: synthesis
The interview is 30% of the work. Synthesis is the other 70% and the part most founders skip — which is why their "research" never produces decisions.
Within 24 hours
Re-read the transcript. Highlight quotes that surprised you. Tag each quote: PROBLEM, BEHAVIOR, WTP, ALTERNATIVE, or WORKFLOW. Memory decays fast: every hour you wait, specifics fade.
After 5 interviews
Don't draw conclusions yet. Look for emerging themes: phrases or stories that appear in 2+ interviews. These are weak patterns. Keep going.
After 12–15 interviews
Strong patterns appear. The same problem framed in similar language. The same workaround tried. The same alternative chosen. This is when you start drawing conclusions and updating your hypothesis.
After 20–25 interviews
Decision time. Either commit to building (the patterns are strong, the buying signals are real), pivot (the segment or solution was wrong but the problem is real), or kill (the problem isn't real or isn't severe enough).
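The tag-and-count loop above is mechanical enough to sketch. A minimal illustration of counting how many distinct interviews each theme appears in (the quotes, tags, and theme labels here are hypothetical examples, not prescribed categories):

```python
from collections import defaultdict

# Each synthesized quote carries an interview id, a tag, and a short theme label.
# These records are invented examples for illustration only.
quotes = [
    {"interview": 1, "tag": "PROBLEM",  "theme": "manual data entry"},
    {"interview": 2, "tag": "PROBLEM",  "theme": "manual data entry"},
    {"interview": 2, "tag": "WTP",      "theme": "already pays contractor"},
    {"interview": 3, "tag": "BEHAVIOR", "theme": "built spreadsheet workaround"},
    {"interview": 3, "tag": "PROBLEM",  "theme": "manual data entry"},
]

def theme_strength(quotes):
    """Count how many distinct interviews each theme appears in."""
    seen = defaultdict(set)
    for q in quotes:
        seen[q["theme"]].add(q["interview"])
    return {theme: len(ids) for theme, ids in seen.items()}

strength = theme_strength(quotes)

# A theme in 2+ interviews is a weak pattern; keep interviewing before concluding.
weak_patterns = [t for t, n in strength.items() if n >= 2]
print(weak_patterns)  # → ['manual data entry']
```

Counting distinct interviews (a set per theme) rather than raw quote mentions matters: one talkative interviewee repeating a complaint five times is still one data point.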
Customer interview tools
Five categories of tools, each useful at a different stage. Don't buy them all: pick the one that addresses your actual bottleneck (recruiting, running, or synthesizing).
User Interviews / Respondent.io
Recruiting. Quality varies; vet aggressively in the screener.
Calendly + Zoom + Otter
Run + transcribe. Combine with a Notion template for synthesis.
Dovetail / EnjoyHQ / Condens
Synthesis. Overkill for solo founders; powerful for ongoing UX research orgs.
UserTesting / dscout
Async + recruiting. Premium platforms: faster than manual recruiting, more expensive.
GoNoGo (this site)
AI prep + practice. A synthetic interviewer that asks Mom Test–style questions. A practice tool, not a replacement for real customer conversations.
Practice the question rhythm
The hardest part of customer interviews isn't finding people; it's catching yourself in real time when you start drifting into hypotheticals or pitching. That muscle takes practice.
We built GoNoGo as a way to drill the question rhythm before your real interviews. A 30-minute voice session where an AI strategist asks Mom Test–style questions about your idea — and gives you a transcript showing where you fell into the traps. Free, runs in your browser, no credit card.
Not a replacement for real conversations. A way to warm up.