The last time you tried to run ten customer interviews, how long did it take? Three people rescheduled. One no-showed. Two couldn't find a time that worked for three weeks. By the time you finished the last conversation, the first ones felt stale and the sprint that was supposed to take two weeks had stretched to six.
Ten interviews at ninety minutes each — scheduling, conducting, and writing up notes — is fifteen hours before you've learned anything.
Most teams don't abandon customer interviews because they don't believe in them. They abandon them because the logistics eat the time they were supposed to spend on the insights.
There's a different way to do this.
What "Without Scheduling" Actually Means
The core idea is simple: instead of booking a live call, you send a link. The participant clicks it whenever they have a few minutes — morning, evening, between meetings. An AI interviewer greets them, asks your questions one at a time, listens to their answers, follows up on interesting threads, and wraps up when the topics are covered.
The participant talks to the AI the same way they'd talk to a human interviewer. Voice or text, their choice. The conversation adapts based on what they say — it's not a form, and it's not a script. If they give a short answer, the AI probes deeper. If they go on a tangent, it acknowledges what they said and gently redirects. If they seem uncomfortable with a topic, it moves on.
You get the transcript, a summary, and the key points — without having been in the room.
This is sometimes called async interviewing, and it changes the math on qualitative research in a meaningful way.
This isn't a replacement for every type of interview. Deep ethnographic research, emotionally sensitive topics, and highly exploratory conversations where you don't yet know what to ask — those still need a human in the room. Async AI interviews handle the 70-80% of research conversations that follow a knowable structure, freeing you to spend your limited live-interview time on the 20-30% that genuinely require it.
How It Works in Practice
The workflow has three steps:
1. Write your brief.
A brief is the set of instructions for the AI interviewer. You define the topics you want covered, the tone of the conversation, a greeting message, and any context the interviewer needs to understand your goals.
Here's an example — a customer discovery brief for a SaaS product team exploring how people manage projects:
Customer Discovery Brief
Topics:
- How do you currently track tasks and deadlines across your team?
- What frustrates you most about the tools you use today?
- What have you tried before, and what worked or didn't?
- If you could change one thing about your project workflow, what would it be?
Guidelines: We're a small SaaS team exploring whether to build a project management feature. We want to understand real workflows, not opinions about hypothetical features. Listen for specific stories and examples.
Tone: Conversational
Greeting: "Hi — thanks for taking a few minutes to share your experience. I'm going to ask you a few questions about how you manage projects and tasks. There are no right or wrong answers — I'm just interested in how things actually work for you."
You can clone a brief like this from a template and have it ready in about five minutes.
2. Share the link.
Once published, you get a URL and a QR code. Send the link to your participants however you normally reach them — email, Slack, SMS, social media, printed on a flyer at an event. Each person who clicks the link gets their own conversation.
They click the link, choose voice or text, and start talking. No app to download, no account to create, no friction. Most conversations take 8-12 minutes.
3. Review the results.
Each completed conversation produces a full transcript with speaker labels, an AI-generated summary, and a list of key points. You can read through the transcripts individually or scan the summaries to spot patterns across multiple conversations.
If you send the link to fifty people, you get up to fifty individual interviews — each with its own transcript and summary — without having sat in a single one.
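At that volume, even scanning summaries benefits from a little tooling. As a rough illustration of the pattern-spotting step, here is a small sketch that aggregates key points across a batch of completed interviews. It assumes each conversation has been exported as a JSON file containing a `key_points` list; that file layout, and the `recurring_themes` helper, are hypothetical illustrations, not the product's actual export format.

```python
# Sketch: spotting recurring themes across exported interview results.
# Assumes each completed conversation was exported as a JSON file with a
# "key_points" field (a list of strings) -- this format is an assumption.
import json
from collections import Counter
from pathlib import Path


def recurring_themes(export_dir: str, top_n: int = 10) -> list[tuple[str, int]]:
    """Count the most frequent substantive words across all key points."""
    counts: Counter[str] = Counter()
    for path in Path(export_dir).glob("*.json"):
        interview = json.loads(path.read_text())
        for point in interview.get("key_points", []):
            # Naive word counting; swap in a real keyword extractor if needed.
            counts.update(
                word.lower().strip(".,!?")
                for word in point.split()
                if len(word) > 4
            )
    return counts.most_common(top_n)
```

Even this crude count surfaces which complaints or workflows come up again and again, so you know which transcripts to read in full first.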
What a Conversation Actually Looks Like
Here's a real exchange from a customer discovery interview. The participant was asked about their project management workflow:
AI: How do you currently track tasks and deadlines across your team?
Participant: We use Asana, mostly.
AI: What does your typical workflow in Asana look like — how do tasks get created and assigned?
Participant: Honestly, it's kind of a mess. Our PM creates the tasks but half the team just checks Slack instead of Asana, so things fall through the cracks. We had a launch last month where two people didn't realize they had dependencies on each other until the day before.
AI: That sounds like it caused some real problems. What happened with that launch?
Participant: We pushed it back a week. Which wasn't the end of the world, but it was the third time in six months. Our CEO started asking why we can't ship on time, and the honest answer is we don't have a single source of truth for who's doing what.
AI: You mentioned that half the team checks Slack instead of Asana. What do you think drives that behavior?
Participant: Asana feels like homework. Slack is where the actual conversations happen. People would rather ask "hey, is this done?" in a channel than go check a board. I get it — I do the same thing sometimes.
That's the kind of specificity you get from a conversation — not "3 out of 5 stars for project management tools," but a concrete story about a missed launch, a frustrated CEO, and a team that treats their tracking tool like homework.
What You Actually Get Back
A traditional interview gives you rich data, but it's locked inside a recording. Someone has to transcribe it, tag it, and synthesize it. That's hours of work per interview — work that compounds quickly when you're running ten or twenty conversations.
With an AI-conducted interview, the transcript exists the moment the conversation ends. The summary and key points are generated automatically. You can start reviewing insights the same day you send the link.
The data quality is different from a survey, too. Because the AI asks follow-up questions and adapts to what the participant says, you get the kind of nuanced, specific answers that only come from conversation. Not "4 out of 5 stars" — but "I switched to a spreadsheet because the timeline view doesn't show dependencies, and my team kept missing handoffs."
That's the difference between quantitative signal and qualitative understanding. You need both, but the qualitative side has always been the harder one to scale.
When This Works Well
Async AI interviews are strongest when:
You need volume. Ten interviews is manageable to schedule. Fifty is a logistics nightmare. A hundred is impossible without a team. With a link, the number is limited only by how many people you can reach.
Your participants are busy. Executives, founders, working parents, people in different timezones — anyone who's hard to pin down for a 30-minute call. They can do a 10-minute voice interview at 11pm if that's when they have time.
You want consistency. A human interviewer's energy and approach shift throughout the day. The AI asks the same questions with the same tone every time, which makes it easier to compare responses across participants.
You're running continuous research. If customer feedback interviews are part of your ongoing practice rather than a one-time study, the logistics savings compound. What used to take a week of scheduling becomes a link you can send anytime.
You're juggling multiple projects. If you're running research across several clients or product lines, you can have interviews running in parallel — one brief per project, one link per project, results arriving simultaneously without any of them blocking the others.
Honesty matters more than rapport. Some topics — employee satisfaction, product frustrations, reasons for leaving — produce more candid responses when the participant isn't face-to-face with a human who might judge them or relay their comments to a colleague. An AI has no social agenda, and participants seem to feel that.
Getting Started
If you want to try this, you can set up your first interview in about five minutes:
- Write a brief with 3-5 topics and a short greeting
- Publish it and copy the share link
- Send the link to your participants
- Review transcripts and summaries as conversations complete
Start with a small batch — five or ten conversations — and see whether the insights match what you'd expect from a live interview. That's the fastest way to know if this approach works for your research.
You can try it free at Guided Surveys. Set up your first interview, send the link to a handful of people, and see what comes back.