Guide · Updated May 2026 · 13 min read

User interview: a practical methodology guide

How to plan, recruit, run, and synthesize user interviews — for product teams, designers, and solo researchers. The 5 types, the 60-minute structure, the recruiting channels, and the tools.

Looking for the recruiting platform User Interviews? They're at userinterviews.com. This guide is about the methodology — how to actually run interviews well, regardless of where you found the participants.

What is a user interview?

A user interview is a 1-on-1 research conversation where you learn how a real person uses (or would use) a product, service, or process. The goal is depth — understanding why people do what they do, not just what they say they want.

User interviews are the workhorse format of UX research. They're cheap, fast, and produce richer data than any survey or analytics dashboard. They're also the format most teams run badly — leading questions, biased recruitment, no synthesis. Done well, they're the single best research tool you have.

User interview vs related formats

These overlap but aren't interchangeable. Picking the wrong format wastes everyone's time.

User interview

1-on-1, 30–60 min, qualitative. Goal: depth on motivation and workflow.

Customer interview

Subset of user interview focused on buying decisions and willingness to pay.

Usability test

Subset of user interview where the user completes specific tasks in your product.

Focus group

5–10 people at once. Group dynamics produce different (often worse) signal than 1-on-1s.

5 types of user interviews

Different research questions call for different formats. Picking the wrong type is the most common reason research produces ambiguous results.

Discovery interview

When: You're exploring an unfamiliar problem space

Goal: Understand the user's world, workflow, and pain points before you have a solution.

Example: "Walk me through how you currently manage your team's sprints."

Generative interview

When: You have a problem hypothesis but no solution yet

Goal: Surface unmet needs, jobs-to-be-done, and patterns across users that point toward solutions.

Example: "Tell me about the last time this process broke down. What happened?"

Evaluative interview

When: You have a concept or prototype to test

Goal: Get reactions to a specific direction. Distinguish polite interest from real pull.

Example: "I'm going to show you a mockup. Tell me what you think is happening here."

Usability test

When: You have a working product or feature

Goal: Watch users complete tasks in your product. Identify friction, confusion, and abandonment points.

Example: "Imagine you want to invite a teammate. Show me what you'd do."

Contextual inquiry

When: You need to see the user's actual environment

Goal: Observe users in their real setting (office, factory, kitchen). Catch context that interviews miss.

Example: "You sit alongside a nurse for an hour during a hospital shift, asking questions only when natural."

The user interview process

Six phases. Skipping any of them is the difference between research that drives decisions and research that fills folders.

1. Define the question

Before you recruit anyone, write down: what specific decision will this research inform? "Learning about users" isn't a goal — "deciding whether the new pricing tier resonates with mid-market PMs" is. Vague goals produce vague output.

2. Write the screener

A 5–8 question screener filters unqualified candidates. Always include: a behavior question (have they done X recently?), a role question (are they actually who they say?), and a knockout question for competitor employees. Keep it under 2 minutes.

3. Recruit

Through a paid platform, your customer list, LinkedIn outreach, or community channels. Recruit 30% more participants than you need — no-shows are 10–25%, late cancellations another 10%.
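The over-recruiting buffer can be worked out directly from those attrition figures. A minimal sketch; the mid-range rates below are assumptions drawn from the ranges in this guide, not fixed constants:

```python
import math

def recruits_needed(target_sessions, no_show_rate=0.20, cancel_rate=0.10):
    """Participants to book so that roughly `target_sessions`
    interviews actually happen. Rates are assumed mid-range values:
    no-shows 10-25%, late cancellations ~10%."""
    completion = (1 - no_show_rate) * (1 - cancel_rate)
    return math.ceil(target_sessions / completion)

# For a 12-interview cohort, book 17 people (a ~40% buffer,
# in the same ballpark as the "recruit 30% more" rule of thumb):
print(recruits_needed(12))  # 17
```

With milder attrition (10% no-shows, 10% cancellations) the buffer drops toward the 30% figure, which is why the guide's rule of thumb works as a floor rather than a ceiling.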

4. Schedule + send pre-read

Calendar invite with timezone, joining link, and (optionally) a 1-paragraph pre-read explaining what you'll cover. No NDAs unless legally required — they signal distrust and lose participants.

5. Conduct the interview

60 minutes is standard. Record with consent. Take light notes (full transcripts come from recordings). Stay curious — every "tell me more about that" is more valuable than your next prepared question.

6. Synthesize

Within 24 hours, tag transcripts by theme. After 5 interviews look for emerging patterns. After 12, draw conclusions. Output should be decisions and recommendations, not a deck of quotes.
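The tag-then-count loop behind that synthesis step can be as simple as a tally. A minimal sketch, with hypothetical participant IDs and theme labels standing in for your own tags:

```python
from collections import Counter

# Hypothetical (participant, theme) pairs logged within 24 hours
# of each session. IDs and theme names are illustrative only.
tags = [
    ("P1", "manual-exports"), ("P1", "pricing-confusion"),
    ("P2", "manual-exports"), ("P3", "onboarding-friction"),
    ("P4", "manual-exports"), ("P4", "pricing-confusion"),
    ("P5", "manual-exports"),
]

theme_counts = Counter(theme for _, theme in tags)
# A theme raised by 3+ of 5 participants is an emerging pattern
# worth carrying into the remaining interviews.
patterns = [t for t, n in theme_counts.items() if n >= 3]
print(patterns)  # ['manual-exports']
```

The point is not the tooling (a spreadsheet does the same job) but the discipline: tag every session the day it happens, so the counts accumulate instead of your memory decaying.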

The 60-minute interview structure

60 minutes is the standard. Less than 30 feels rushed; more than 90 burns out the participant. The middle stretch (past behavior plus the task or stimulus) carries most of the signal.

Time · Phase · What you do

0–5 min · Warm-up · Introduce yourself, explain the goal, get consent to record. Ask about their role for the first 2 minutes — builds rapport, confirms screener.
5–20 min · Context · Ask about their world, workflow, tools, and goals. Don't mention your product. You're building a model of their reality.
20–45 min · Past behavior · The meat of the interview. Ask about specific recent events: last time they faced X, what they did, what worked, what frustrated them. Past behavior > hypotheticals.
45–55 min · Task or stimulus · For evaluative/usability: show prototype or assign task. Stay silent while they work — interrupt only when they ask. For discovery: deeper workflow walkthrough.
55–60 min · Wrap · Summarize what you heard. Ask if you missed anything. Get permission to follow up. Ask for intros to colleagues with the same problem.

Recruiting users

Five channels in order from highest-quality-but-slowest to fastest-but-needs-vetting. Most teams should mix at least two channels per research project.

Existing customers

Cost: Free (or thank-you gift)
Quality: Highest — they know your product
Speed: 1–3 days

Best for evaluative and usability testing. They're biased toward you, so weigh feedback accordingly.

Personal network

Cost: Free
Quality: High but biased
Speed: 1–7 days

Useful for early discovery. Be aware that friends-of-friends will be polite — push past compliments.

LinkedIn outreach

Cost: $0 + your time
Quality: Medium-high
Speed: 1–2 weeks

5–10% reply rate. Personalize the first line. Offer $50–$100 if they're busy professionals — increases reply rate to 15–25%.
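Those reply rates translate directly into how many messages to send. A rough funnel sketch; the 50% reply-to-booking rate is an assumption for illustration, not a figure from this guide:

```python
import math

def messages_to_send(target_interviews, reply_rate, booking_rate=0.5):
    """Cold messages needed to book `target_interviews` sessions.
    booking_rate = assumed share of repliers who actually book."""
    return math.ceil(target_interviews / (reply_rate * booking_rate))

# 8 interviews at a 7% reply rate: roughly 229 messages.
print(messages_to_send(8, 0.07))  # 229
# An incentive lifting replies to 20% cuts that to 80.
print(messages_to_send(8, 0.20))  # 80
```

Run the numbers before committing to this channel: at uninspired reply rates, a modest cohort can mean hundreds of personalized messages, which is why the incentive usually pays for itself in saved outreach time.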

Reddit / community channels

Cost: Free
Quality: Variable
Speed: 3–10 days

Niche subreddits, Indie Hackers, specialized Discord servers. Always disclose your purpose. Some communities ban DM solicitation.

Paid platforms (User Interviews, Respondent, Prolific)

Cost: $50–$200 per session
Quality: Medium
Speed: 1–5 days

Fastest for volume. Quality depends on screener rigor — vet aggressively. Best for evaluative work where you need many sessions fast.

5 common user interview mistakes

Leading the witness

"Don't you think this would be useful?" is a yes-trap. Even with no leading wording, your tone and follow-ups can leak preference.

✓ Instead: Default to neutral phrasings: "walk me through", "tell me about a time when", "what happened next?" Stay curious, not evaluative.

Talking too much

Founders explain their idea, then realize they spent 45 minutes pitching. The interview gave you no signal.

✓ Instead: Aim for 80% them, 20% you. After every question, count to 5 silently before asking the next. The most useful answers come in the gap.

Skipping the screener

Everyone wants a quick interview. You skip vetting, and discover 10 minutes in that the participant isn't actually in your ICP.

✓ Instead: Always screen. Even friends-of-friends. A 90-second screener prevents 60 wasted minutes.

Stopping at 5 interviews

Five sessions surface 80% of usability issues — but 80% of pattern-level insight needs 12+. Stopping early gives you anecdotes you mistake for signal.

✓ Instead: Plan in cohorts: 8–12 interviews per research question. If after 12 the patterns aren't clear, your hypothesis or your screener is wrong.

Synthesizing alone, late

You finish all 12 interviews, then sit down to make sense of them a week later. Memory has decayed; transcripts feel flat without the body language context.

✓ Instead: Synthesize within 24 hours of each session. Tag themes as you go. By the time you finish all interviews, the synthesis is mostly done.

User interview tools

The full landscape — from $0 setups to enterprise stacks. Most teams need 1 recruiting tool + 1 conduct/transcribe tool + 1 synthesis tool. The rest is overkill until you scale past 30 interviews per quarter.

User Interviews (userinterviews.com)

Recruiting · $50–$200 per session

The dominant US platform for recruiting research participants. Their database has 4M+ vetted users across roles and industries.

Respondent.io

Recruiting · $60–$200 per session

Strong for B2B research — recruits professionals by job title, company size, software used. Faster than User Interviews for niche roles.

Prolific

Recruiting (academic) · $10–$30 per session

Originally academic; now used by product teams for quick consumer research. Cheaper, less B2B-focused.

Zoom + Otter / Fireflies

Conduct + transcribe · Free–$30/mo

The default stack. Zoom for the call, Otter for searchable transcripts. Combine with a Notion template for synthesis.

Lookback / Userlytics / Maze

Usability + remote testing · $100–$500/mo

Purpose-built for usability testing. Screen recording, click tracking, async test runs.

Dovetail / Condens

Synthesis · $30–$200/seat/mo

Tag transcripts, search across all qualitative data, generate themes. Worth it once you're past 30 interviews.

GoNoGo (this site)

Practice + prep · Free–$20

AI-led 30-min session that drills the question rhythm and surfaces hypotheses to validate first. A practice tool, not a replacement for real users.

Practice before the real interviews

The hardest part of user interviews isn't the methodology — it's catching yourself in real time when you start leading the witness or talking too much. That muscle takes practice.

We built GoNoGo as a way to drill the question rhythm before you sit across from a real participant. A 30-minute voice session with an AI strategist who asks past-behavior questions about your idea — and gives you a transcript showing where you slipped into hypotheticals or pitch mode.

Not a replacement for real users. A way to warm up.

Practice for free →

30 min · No credit card · Then go interview real users

4 deep dives

Each guide below answers one specific question this pillar surfaces. Built for product, design, and research teams running their next round of interviews.

Frequently asked questions

How is a user interview different from a customer interview?
They overlap heavily but have different framing. A "customer interview" usually focuses on buying behavior, willingness to pay, and the path to purchase — useful for validation and positioning. A "user interview" usually focuses on how people use products, their workflows, and their pain points — useful for UX, product, and design decisions. The same conversation can serve both, but the questions you prioritize differ.
Should I pay users for interviews?
In most cases, yes. Standard rates: $25–$50 for 30 minutes from consumers, $75–$150 for 60 minutes from professionals, $200+ for executives or specialists. Paying users improves quality (people show up prepared and engaged) and respects their time. The exception: existing customers who genuinely care about your product often decline payment — in that case, send a thank-you gift or product credit instead.
How many user interviews are enough?
For most research goals, 5 interviews surface 80% of usability issues, 12 interviews reveal strong patterns, and 20+ are needed for confident segmentation work. The diminishing returns are real — your 25th interview rarely tells you something new. Plan in cohorts of 8–12 per research question, not endless rolling interviews.
What's the best way to find users to interview?
Three main channels. Warm: existing customers, people in your network, recent signups — best signal quality but smallest pool. Targeted outreach: LinkedIn DMs to people in your ICP (5–10% reply rate), industry communities, Reddit niches. Paid: platforms like User Interviews, Respondent.io, or Prolific charge $50–$200 per qualified user. Use paid for volume, warm for depth.
Should I record user interviews?
Yes, with explicit consent. Recordings let you re-watch body language, pull verbatim quotes for research artifacts, and let teammates synthesize without sitting through every session. Use Zoom's built-in recording, or tools like Otter or Fireflies for auto-transcription. Always disclose recording at the start and store files securely — interviewees consented to research, not public distribution.
