Most founders don’t fail at customer interviews because they ask the wrong questions. They fail because they accidentally run a polite demo — and mistake encouragement for evidence.
That is why interviews often feel “useful” yet change nothing. You collect a handful of opinions, translate them into feature requests, and keep building without ever proving that the pain is urgent enough to trigger a real decision.
A good customer interview is a lightweight diagnosis. It should tell you what hurts, how often it hurts, what it costs, what the current workaround is, and what would make someone switch in the next 30–90 days.
The script below is designed for early-stage teams who need strong signal fast without building a heavyweight research process.
Why this matters at the early stage
Early-stage teams rarely fail because they “didn’t work hard enough.” They fail because they spend weeks building on weak assumptions. A customer interview is your fastest way to replace opinions with evidence — if you run it to learn behavior, not to collect compliments.
The job of a good interview is simple: find out whether there is a real, costly problem, how people handle it today, and what would make them switch. If you cannot answer those three things, your product roadmap is just guesswork.
The interview outcome you want
You want to leave the call with:
- A clear pain statement in the customer’s words
- Proof that the pain shows up frequently and creates real cost
- A view of current alternatives and workarounds
- A credible trigger for switching
- A next step you can test within 7–14 days
The core framework
Think of interviews as a mini-diagnosis. The structure below is designed to prevent the most common failure mode: founders pitching too early and collecting polite feedback.
Rule 1: Focus on past behavior
People are good at imagining the future and bad at predicting it. Anchor your questions in what they did last week — not what they might do someday.
Rule 2: Treat the current workaround as the competitor
At the early stage, your real competitor is usually a spreadsheet, an agency, a manual process, or “we live with it.” Your goal is to understand why that workaround exists and what it costs.
Rule 3: End every interview with a decision-oriented next step
A strong interview ends with a clear learning action: a follow-up with data, a short test, or a paid pilot conversation. If you end with “thanks, this was helpful,” you are collecting stories, not building momentum.
The founder’s customer interview script
Use this as a copy-paste structure. Keep it conversational, but keep the order. It protects you from drifting into a demo.
Opening (2 minutes)
Set context and make it safe for the person to be honest.
- “Thanks for taking the time. I’m not here to pitch you something. I’m trying to understand how you handle X today and what’s painful about it.”
- “If this isn’t a problem for you, that’s still a great outcome — please tell me.”
Problem exploration (10–15 minutes)
Your goal is to get specific examples and costs.
Questions that reliably produce signal:
- What triggered you to think about this recently?
- Walk me through the last time this happened.
- What broke, and what did you do next?
- Who else is affected downstream?
- What does it cost you in time, money, risk, or missed growth?
Current solution and workarounds (10 minutes)
This is where you learn what you must beat.
- What do you use today (tools, people, processes)?
- What do you like about it?
- What do you hate about it?
- What have you tried before, and why didn’t it stick?
Switching triggers and willingness-to-pay cues (5–10 minutes)
This is not a pricing debate. It is about commitment and urgency.
- What would need to be true for you to change this in the next 30–90 days?
- Who would need to approve a change like this?
- If you solved this, what would improve first?
Wrap-up (2–3 minutes)
Close cleanly and tee up the next learning step.
- Who else should I talk to who feels this problem acutely?
- Can I follow up with a one-page summary of what I heard and a proposed next test?
Example wrap-up
“Let me summarize what I heard: the biggest pain is that interviews are happening, but notes are inconsistent and decisions are subjective. The workaround is a spreadsheet plus memory, which breaks when more than one person is involved. If you had a repeatable script and a simple scoring method, you’d run 10 interviews in a week and decide faster. If I send a one-page template today, would you be open to a 15-minute follow-up to pressure-test it?”
The 5 mistakes that kill signal
Most interviews fail for predictable reasons. Fix these and your hit rate goes up immediately.
Mistake 1: You pitch during the first 10 minutes
If you explain the product too early, the person starts reacting to your idea instead of describing their reality.
Fix:
- Delay any mention of your solution until after you understand their last real example
Mistake 2: You ask hypotheticals
Questions like “Would you use this?” produce encouragement, not evidence.
Fix:
- Ask “What did you do the last time?” and “What have you paid for to solve this before?”
Mistake 3: You recruit the wrong people
Founders often interview friendly contacts who are curious but not urgent.
Fix:
- Recruit people with a clear trigger event and an active workaround
Mistake 4: You let the conversation drift into feature requests
Feature requests are downstream. If you jump there, you miss the real job and constraints.
Fix:
- Capture feature ideas, but keep steering back to pain, workflow, and switching
Mistake 5: You do not synthesize into decisions
Notes without a synthesis method turn into a folder full of anecdotes.
Fix:
- Use a simple scoring rubric and write validated problem statements after every 5 interviews
How to capture notes and synthesize signal
You do not need a research tool. You need consistency. Use one shared doc and a repeatable tagging approach.
A lightweight note template
- Context (role, company type, stage)
- Trigger (why now)
- Pain (their exact words)
- Workaround (tools + steps)
- Cost (time, money, risk)
- Switch trigger (what would force change)
- Evidence score (1–5)
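If your team captures notes in a shared doc via a script or form, the template above maps naturally onto a structured record, which keeps every interview identical and makes synthesis trivial. A minimal sketch in Python; the field names mirror the bullets, and all example values are invented for illustration:

```python
# One interview note, keyed exactly like the template above.
# Every value here is hypothetical.
note = {
    "context": {"role": "Head of Ops", "company_type": "B2B SaaS", "stage": "Series A"},
    "trigger": "Missed a client deadline last month",
    "pain": "We never know which tasks are actually blocked",  # their exact words
    "workaround": "Shared spreadsheet plus a weekly status call",
    "cost": {"time_hours_per_week": 6, "money": None, "risk": "client churn"},
    "switch_trigger": "Another missed deadline this quarter",
    "evidence_score": 4,  # 1-5, filled in right after the call
}
```

Keeping the keys fixed is the point: when every note has the same shape, clustering themes after ten interviews is a scan, not an archaeology project.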
A simple evidence score
Score each interview quickly on five dimensions (1–5):
- Urgency
- Frequency
- Budget ownership
- Authority
- Switching trigger
If your average score is below 3 after 10 conversations, your problem hypothesis is likely too weak or too broad.
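Once scores live in one place, the decision rule above is a one-liner to check. A small sketch, assuming each interview is stored as a dict of the five dimensions scored 1–5; the dimension names and scores below are illustrative:

```python
DIMENSIONS = ["urgency", "frequency", "budget_ownership", "authority", "switching_trigger"]

def average_evidence(interviews):
    """Mean score across all dimensions of all interviews (each scored 1-5)."""
    scores = [interview[d] for interview in interviews for d in DIMENSIONS]
    return sum(scores) / len(scores)

# Hypothetical scores from 3 of your 10 conversations.
interviews = [
    {"urgency": 4, "frequency": 3, "budget_ownership": 2, "authority": 3, "switching_trigger": 2},
    {"urgency": 2, "frequency": 2, "budget_ownership": 1, "authority": 2, "switching_trigger": 1},
    {"urgency": 3, "frequency": 4, "budget_ownership": 2, "authority": 3, "switching_trigger": 2},
]

avg = average_evidence(interviews)
if avg < 3:
    print(f"Average {avg:.1f}: hypothesis is likely too weak or too broad")
    # → Average 2.4: hypothesis is likely too weak or too broad
```

Averaging is the bluntest possible rule, and that is deliberate: at this stage you want a forcing function for a decision, not a statistically rigorous instrument.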
Turning notes into 3–5 validated problem statements
After 10 interviews, write 3–5 statements that look like this:
- “[Segment] struggles with [problem] when [context], because [constraint]. They currently use [workaround], which fails because [reason].”
Then choose one statement as your next focus and design the smallest test that reduces the biggest remaining risk. If you want a structured way to turn the strongest statement into messaging and an early funnel, see /services/positioning-first-funnel-sprint/.
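The statement template can also be filled mechanically, which forces every synthesis to name a segment, a constraint, and a failing workaround instead of drifting into vague summaries. A tiny sketch with invented values:

```python
# The bracketed template from above, as a format string.
TEMPLATE = (
    "{segment} struggles with {problem} when {context}, because {constraint}. "
    "They currently use {workaround}, which fails because {reason}."
)

# Hypothetical values; each slot must come from your interview notes.
statement = TEMPLATE.format(
    segment="An early-stage B2B founder",
    problem="inconsistent interview notes",
    context="more than one teammate runs calls",
    constraint="there is no shared template",
    workaround="a spreadsheet plus memory",
    reason="scoring is subjective",
)
print(statement)
```

If you cannot fill a slot from your notes, that gap itself tells you what to ask in the next round of interviews.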
Implementation plan (7 days)
Treat this as a sprint. The goal is learning speed, not perfect research.
Days 1–2: Define your interview target
- Pick one narrow segment
- Write one problem hypothesis
- Define what counts as a “strong signal”
Days 3–5: Run 6–10 interviews
- Use the script
- Capture notes in the same template
- Score evidence immediately after each call
Days 6–7: Synthesize and decide
- Cluster themes
- Write 3–5 validated problem statements
- Choose the next experiment (landing page, prototype, paid pilot, or workflow test)
Weekly rhythm
A sustainable discovery cadence prevents random building.
The 60-minute weekly discovery review
- Review interview scores and themes
- Pick one bottleneck to learn about next
- Decide one experiment for the week
- Update your assumptions in one page
Ready to turn conversations into clear product decisions?
If your interviews feel “interesting” but you still do not know what to build next, you likely need a sharper script, a stronger synthesis method, and a decision rule that forces clarity. In a short strategy call, we can review your current interview notes, tighten the questions, and define a 7–14 day validation plan you can actually execute with a small team.
- Erman Aydin





