Convo 6: Advanced Modules & Essential Considerations
Advanced Interviewing Techniques
Once a practitioner has mastered the fundamentals of user interviews, they can expand their toolkit with more specialized, context-rich research methods. These advanced techniques are designed to uncover deeper layers of user behavior, motivation, and environmental influence that traditional interviews might miss. They require more planning and skill but can yield profoundly valuable insights, especially for complex problems or new product domains.
Contextual Inquiry: Research in the Wild
Contextual inquiry is a research method that combines observation and interviewing by studying users in their natural environment as they perform tasks. Instead of asking a user to recall how they work, the researcher watches them work and asks questions in the moment. This "master-apprentice" model, where the user is the master of their work and the researcher is the apprentice, is incredibly effective for uncovering hidden details, workarounds, and environmental factors that users would never think to mention in a standard interview.
Key Principles
- Context: The study must happen where the work happens (e.g., an accountant's office, a doctor's examination room, a factory floor).
- Partnership: The researcher and participant collaborate to understand the work. The researcher observes and asks clarifying questions, and the participant explains their actions and thought processes.
- Interpretation: The researcher shares their interpretations with the participant in real-time ("So, it looks like you're opening that spreadsheet because you don't trust the dashboard's summary. Is that right?") to validate or correct their understanding.
- Focus: The observation is focused on understanding the participant's work practices to achieve a specific research goal.
Benefits and Challenges
- Benefits: This method provides rich, detailed data about actual behavior, not just self-reported behavior. It reveals the complexities of a user's workflow, including interruptions, collaborations, and physical artifacts (like sticky notes on a monitor) that are invisible in a lab setting.
- Challenges: Contextual inquiry is time- and resource-intensive. It requires travel, significant preparation, and skilled analysis of qualitative data. The presence of a researcher can also influence the participant's behavior (the Hawthorne effect).
Example Case
A team designing a new project management tool conducts a contextual inquiry with a project manager. They observe her juggling spreadsheets, emails, and a legacy software system. They see her manually copy-pasting data between applications, a tedious task she would likely forget to mention in an interview. By asking questions in context ("I noticed you just copied that status update. Can you tell me what you're doing there?"), they uncover that her primary pain point is not a lack of features, but a lack of integration between her tools. This insight, discovered only through observation in her natural environment, fundamentally shifts the product's focus towards integration and automation.
The "Jobs-to-be-Done" (JTBD) Interview: Focusing on Progress
The Jobs-to-be-Done (JTBD) framework is a powerful lens for innovation that shifts the focus of research from the user themselves to the "job" the user is trying to accomplish. The core idea is that customers "hire" products to make progress in their lives. A JTBD interview, therefore, is not about demographics or personas, but about deconstructing the story of a user's struggle and their quest for a better way.
How it Changes the Interview
JTBD interviews are often "switch" interviews, focusing on the moment a user switched from one solution to another (or from no solution to a solution). The goal is to understand the forces that caused this change. The questions are designed to uncover a timeline of events and the underlying motivations.
The Four Forces of Progress
Pro-Tip: Beyond Features
JTBD helps you understand the why behind a purchase, not just the what. This allows you to innovate beyond current features and build solutions that truly help users make progress in their lives, often in ways competitors haven't considered.
A JTBD interview seeks to identify four key forces acting on the user:
- Push of the Situation: What was the pain or frustration with the old way that pushed them to seek a change? (e.g., "My spreadsheet system for invoicing became too messy and I missed a payment.")
- Pull of the New Solution: What was the appeal of the new solution that pulled them toward it? (e.g., "The new app promised automated invoicing and looked much more professional.")
- Anxiety of the New Solution: What fears or uncertainties did they have about the new solution? (e.g., "I was worried it would be hard to set up and that I might lose my existing data.")
- Habit of the Present: What attachments to their old way of doing things held them back? (e.g., "I was just so used to my spreadsheet, and I knew where everything was.")
Example Questions
Instead of asking about features, a JTBD interviewer asks narrative questions:
- "Take me back to the day you first realized you needed a better way to handle your invoicing. What was happening?" (Identifies the "push").
- "What other solutions did you look at or try before you chose this one?" (Uncovers the competitive landscape and the "pull").
- "What was your biggest concern before you signed up?" (Reveals "anxieties").
A famous JTBD case study involves McDonald's milkshakes. When researchers asked customers why they "hired" a milkshake during their morning commute, they discovered the job was not "to satisfy hunger" but "to have something interesting and easy to consume with one hand during a long, boring drive." This insight led McDonald's to make the milkshakes thicker (so they would last longer) and to move the dispensers for faster access, dramatically increasing sales. This demonstrates how focusing on the "job" can lead to non-obvious but highly effective product improvements.
Ethnographic Studies: Deep Immersion
Ethnography is a qualitative research method that involves the researcher immersing themselves in a user's culture and environment over an extended period. It sits on a spectrum with contextual inquiry but typically involves longer-term observation and less direct intervention from the researcher. The goal is to gain a holistic understanding of a group's social dynamics, rituals, language, and unarticulated needs.
When to Use It
Ethnography is most valuable when designing for cultures or subcultures that are very different from the researcher's own, or when the problems are so deeply embedded in social context that users cannot verbalize them. It is about observing what people do, not just what they say.
The Role of the Interview
In ethnographic research, interviews are often informal and conversational, emerging naturally from the observational context. The primary data comes from the researcher's detailed field notes, with interviews serving to add color and clarification to observed behaviors.
The Line Between Interview and Observation
The key distinction is the level of researcher intervention. In a pure interview, the researcher directs the conversation with questions. In a pure observation, the researcher is a "fly on the wall." Ethnography blends these, with the researcher participating in the environment to some degree and engaging in conversations as they arise, but always prioritizing the observation of natural behavior. The decision of when to ask and when to simply watch is a key skill of the ethnographer.
The Ethical Researcher
Conducting user research is a profound responsibility. Researchers are granted access to people's time, thoughts, and personal information. Upholding the highest ethical standards is not just a matter of compliance; it is the foundation of trust upon which all valid research is built. Key ethical considerations include securing informed consent, protecting data privacy, handling sensitive topics with care, and actively managing one's own biases.
Informed Consent and Data Privacy
Before any research session begins, the participant must give their informed consent. This is a formal process that ensures the participant fully understands what they are agreeing to, protecting both the user and the researcher.
Elements of Informed Consent
A proper consent process, often documented in a consent form, must clearly explain:
- Purpose: The goal of the research study.
- Procedure: What the participant will be asked to do and how long it will take.
- Data Collection: What data will be collected (e.g., notes, audio recording, video recording).
- Data Use and Sharing: How the data will be used, who will have access to it, and how findings will be shared (e.g., in an internal report).
- Risks and Benefits: Any potential risks or benefits to participating.
- Voluntary Participation: The participant must understand that their involvement is completely voluntary and that they can withdraw at any time without penalty.
Data Privacy (GDPR/CCPA)
Researchers must comply with data privacy regulations like the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA). Key principles include:
- Data Minimization: Only collect personal data that is strictly necessary for the research.
- Secure Storage: Store all identifiable data securely and control access.
- Anonymization: Anonymize data as soon as possible by removing names and other personally identifiable information (PII).
- Right to Deletion: Users have the right to request that their data be deleted.
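The anonymization principle above can be sketched in code. This is a minimal, illustrative example only: the field names, regex, and pseudonym scheme are assumptions, and a real project should rely on a vetted de-identification library and a proper data-governance review rather than a hand-rolled script.

```python
import re

# Crude pattern for emails that slip into free-text notes (illustrative only)
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def anonymize(record, participant_number):
    """Return a copy of an interview record with direct identifiers removed.

    Applies two of the principles above: data minimization (drop fields
    we don't need) and anonymization (replace the name with a stable
    pseudonym like "P01").
    """
    cleaned = dict(record)
    cleaned.pop("name", None)    # drop PII fields entirely
    cleaned.pop("email", None)
    cleaned["participant"] = f"P{participant_number:02d}"
    # Scrub any email addresses embedded in the researcher's notes
    cleaned["notes"] = EMAIL_RE.sub("[email removed]", cleaned.get("notes", ""))
    return cleaned

raw = {
    "name": "Anisa",
    "email": "anisa@example.com",
    "notes": "Follow up at anisa@example.com about the filter issue.",
}
print(anonymize(raw, 1))
```

Note that true anonymization is harder than this sketch suggests: combinations of indirect identifiers (job title, company size, location) can still re-identify a participant, which is why pseudonymized data should still be stored securely.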
Handling Sensitive Topics and Emotional Participants
Pro-Tip: Prioritize Well-being
When dealing with sensitive topics, your primary responsibility is the participant's well-being, not completing your interview guide. Be prepared to pause, offer breaks, or even end the session if a participant becomes distressed.
When research touches on sensitive topics such as health, finance, or personal trauma, the researcher's duty of care is heightened. These interviews require additional planning, skill, and empathy to ensure the participant's well-being.
Transparency in Recruitment
Be clear and transparent in the recruitment materials about the sensitive nature of the topic so participants can make an informed decision about whether to opt-in.
Building a Safe Environment
Create a calm, private, and supportive environment. Have tissues and water available. Spend extra time at the beginning building rapport and explicitly stating that the participant is in control and can pause or stop the interview at any time.
During the Interview
- Pacing: Start with less sensitive warm-up questions and gradually move to more difficult topics.
- Acknowledge and Pause: If a participant becomes emotional or distressed, the immediate response is to pause the interview. Acknowledge their feelings with empathy ("Thank you for sharing that. I can see this is difficult. Please take all the time you need."). Do not push through the script.
- Offer an Out: Remind them that they do not have to answer any question they are not comfortable with and can end the session if they wish.
- Plan a Diversion: Have a neutral, unrelated question ready (e.g., "What did you have for breakfast?") to gently change the subject and de-escalate the emotional intensity if needed.
After the Interview
Do not leave a participant in a state of distress. If they are upset at the end of the session, offer to stay with them for a few minutes until they feel more composed or help them contact someone from their support network.
The Interviewer's Mindset: Managing Bias
Every researcher has cognitive biases—subconscious mental shortcuts that can influence how we interpret information. Being an ethical and effective researcher requires developing the self-awareness to recognize and actively mitigate these biases to prevent them from distorting research findings.
Confirmation Bias
The tendency to seek out and favor information that confirms our existing beliefs. This is one of the most dangerous biases in research.
- Mitigation: Actively seek disconfirming evidence. Frame hypotheses not to be proven, but to be challenged. Ask questions like, "What would have to be true for this idea to fail?"
Social Desirability Bias
The tendency for participants to answer questions in a way they believe will be viewed favorably by the interviewer.
- Mitigation: Emphasize that there are no right or wrong answers. Use neutral, non-judgmental language. Focus on past behavior ("Tell me about the last time...") rather than asking for opinions on sensitive topics.
Framing Effect
The way a question is framed influences the answer.
- Mitigation: Scrutinize every question for leading or biased language. Have a colleague who is not close to the project review the interview guide for neutrality.
General Mitigation Strategies
- Team-Based Analysis: Involve multiple researchers in the data analysis process. Different perspectives can help challenge individual biases.
- Self-Reflection: Before and after interviews, take time to reflect on your own assumptions and emotional reactions. Acknowledging your biases is the first step to controlling them.
- Focus on Facts: During note-taking and synthesis, prioritize documenting what was actually said and done, separating direct observations from your own interpretations.
Pro-Tip: The "Devil's Advocate" Mindset
Actively seek out information that disproves your hypothesis, rather than just confirming it. This "devil's advocate" approach is a powerful way to combat confirmation bias and ensure you're getting the full picture.
Driving Action: Communicating Findings for Impact
The value of user research is not in the findings themselves, but in the action they inspire. An insightful research project that fails to influence product decisions is a wasted effort. Therefore, the ability to communicate findings effectively to different stakeholders is a core competency for any researcher. This requires translating raw data into a compelling story and tailoring that story to the specific needs and priorities of the audience.
Tailoring Your Story for Different Stakeholders
A one-size-fits-all research report is rarely effective. Executives, engineers, and designers care about different aspects of the research and consume information in different ways. A skilled researcher tailors their communication to resonate with each audience.
Presenting to Executives
- Their Priority: Business impact, strategic alignment, risk, and return on investment. They have limited time and need the "so what" immediately.
- How to Tailor: Get straight to the point. Start with a concise executive summary (a "TL;DR") that highlights the top 3-5 key insights and their business implications. Use quantitative data to frame the qualitative story.
- Bad Example: "In our interviews, five out of eight users said the checkout process was confusing."
- Good Example: "Our research indicates that friction in the checkout flow, stemming from user confusion around shipping options, is a direct contributor to cart abandonment. Based on our analytics, this could be costing us an estimated $50,000 in lost revenue per month. We recommend simplifying the shipping selection to a single-step process."
Presenting to Engineers
- Their Priority: Clarity, feasibility, and actionable problem statements. They need to understand what is broken and why it needs to be fixed so they can build a solution.
- How to Tailor: Avoid UX jargon. Focus on clear, logical explanations of user behavior and the technical implications. Provide concrete examples and well-defined problems.
- Bad Example: "Users experienced cognitive dissonance due to a lack of signifiers in the interface."
- Good Example: "Users consistently fail to see the 'Save' button because it's located at the bottom of the page, while their mental model, based on other applications, expects it to be in the top-right header. This mismatch violates their expectation and causes them to believe their work is not being saved. The problem to solve is: how can we align our save functionality with established user patterns?"
Presenting to Designers
- Their Priority: The user's emotional journey, motivations, and mental models. They need deep empathy to create intuitive and delightful experiences.
- How to Tailor: Tell the user's story. Use powerful quotes, video clips of users struggling, and journey maps to build empathy. Focus on the "how it feels" to be the user.
- Bad Example: "The task success rate for feature X was only 40%."
- Good Example: "Here is a video clip of Anisa. Notice her sigh of frustration as she tries to find the filter option. She says, 'I feel so stupid, I just can't find it.' This moment of frustration and self-doubt is what we need to design for. Her goal is to feel empowered, but our current design makes her feel incompetent."
From Research to Roadmap: Prioritization Frameworks
User research often generates more opportunities and problems than a team can possibly address at once. The final step in driving action is to help the product team translate these insights into a prioritized product roadmap. Using a structured prioritization framework helps make these decisions objective, transparent, and defensible.
Value vs. Complexity Quadrant
This is a simple but powerful framework for quick decision-making. The team plots potential features on a 2x2 matrix with "User Value" on one axis and "Implementation Complexity/Effort" on the other.
- High Value, Low Complexity: Quick Wins / Low-Hanging Fruit (Do these now).
- High Value, High Complexity: Major Projects / Big Bets (Plan strategically).
- Low Value, Low Complexity: Fill-ins (Do if there's time).
- Low Value, High Complexity: Time Sinks / Money Pits (Avoid).
This framework is excellent for new products or when fast, objective decisions are needed.
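The quadrant logic above can be expressed as a small classifier. This is a sketch under assumed conventions: it presumes the team has already scored each feature from 1 to 10 on both axes, and the threshold and feature names are illustrative, not part of the framework itself.

```python
def quadrant(value, complexity, threshold=5):
    """Map a (value, complexity) score pair to its quadrant label.

    Scores above `threshold` count as "high" on that axis.
    """
    high_value = value > threshold
    high_complexity = complexity > threshold
    if high_value and not high_complexity:
        return "Quick Win"
    if high_value and high_complexity:
        return "Big Bet"
    if not high_value and not high_complexity:
        return "Fill-in"
    return "Time Sink"

# Hypothetical backlog items scored by the team: (user value, complexity)
backlog = {
    "simplify shipping step": (9, 3),
    "rebuild legacy sync":    (8, 9),
    "tweak button color":     (2, 1),
    "custom report builder":  (3, 8),
}
for feature, (v, c) in backlog.items():
    print(f"{feature}: {quadrant(v, c)}")
```

The value of writing the rule down this explicitly is less about automation and more about transparency: the team can see, and debate, exactly where the high/low cutoff sits.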
The MoSCoW Method
This framework helps align stakeholder expectations by categorizing features into four buckets:
- Must-have: Critical features without which the product is not viable. These are non-negotiable.
- Should-have: Important features that are not critical for launch but are high-value.
- Could-have: Desirable "nice-to-have" features that will be included if time and resources permit.
- Won't-have (this time): Features that are explicitly out of scope for the current release. This is crucial for managing expectations.
RICE Scoring
A more quantitative framework that scores each feature based on four factors to generate a single priority score:
- Reach: How many users will this feature impact in a given timeframe? (e.g., 5,000 users/month).
- Impact: How much will this feature impact individual users? (Often scored on a scale: 3=massive, 2=high, 1=medium, 0.5=low).
- Confidence: How confident are we in our estimates for reach and impact? (Expressed as a percentage, e.g., 90%). This helps to de-prioritize high-risk guesses.
- Effort: How many person-months will this take to build? (e.g., 4 person-months).
- Formula: RICE Score = (Reach × Impact × Confidence) / Effort
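The formula above is simple enough to compute directly. Here is a minimal sketch that scores and ranks a few features; the feature names and numbers are illustrative, not from a real backlog.

```python
def rice_score(reach, impact, confidence, effort):
    """RICE Score = (Reach * Impact * Confidence) / Effort."""
    return (reach * impact * confidence) / effort

# Hypothetical backlog: reach (users/month), impact (0.5-3 scale),
# confidence (0-1), effort (person-months)
features = {
    "one-click checkout": rice_score(reach=5000, impact=2,   confidence=0.9, effort=4),
    "dark mode":          rice_score(reach=2000, impact=0.5, confidence=0.8, effort=2),
}

# Rank highest-scoring features first
for name, score in sorted(features.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.0f}")
```

Note how the confidence term tempers the score: a feature with impressive reach and impact estimates but only 50% confidence would be halved, which is exactly how the framework de-prioritizes high-risk guesses.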
By using these frameworks, the research function moves from simply delivering a report to actively participating in the strategic decision-making process, ensuring that the voice of the user is not just heard, but is translated directly into a prioritized plan of action.
Key Takeaways
- Go to the User's World with Contextual Inquiry: To understand how a product is really used, observe users in their natural environment. You'll uncover workarounds and contextual factors they would never mention in a standard interview.
- Uncover Motivation with Jobs-to-be-Done (JTBD): Focus on the "job" a user is "hiring" your product to do. This shifts the focus from features to the user's underlying goal, leading to more innovative solutions.
- Ethics are Non-Negotiable: Always get informed consent, be transparent about your research, protect user data, and be prepared to handle sensitive topics with care. Your primary responsibility is to the participant's well-being.
- Manage Your Own Biases: Actively fight against confirmation bias by seeking evidence that disproves your assumptions. Acknowledge your biases to prevent them from distorting your findings.
- Tailor Your Story to Drive Action: A research report is useless if it doesn't inspire change. Translate your findings into the language of your audience—business impact for executives, technical problems for engineers, and emotional journeys for designers.
Remember This Even If You Forget Everything Else
The value of your research is not measured by the quality of your report, but by the quality of the decisions it enables. Your job is to be a translator: translate a user's messy reality into a clear, compelling story, and then translate that story into the specific language that motivates each stakeholder to act. Master this, and you'll move from simply gathering insights to driving real product impact.