Ever wonder how to recover from hosting a research findings meeting where no one shows up? Or have you ever been drinking your morning coffee while a homeless man emerges from under the table where your test devices are set up?
On this panel edition of SDXD, four UX researchers shared their most outrageous research failures, their strategies for moving on, and how they learned to expect the unexpected. The goal was to learn from their failures in hopes of making research challenges more approachable. The panelists did a great job humanizing the process with their personal stories and highly actionable tips and tricks for making research easier for everyone involved.
- Hoa Loranger, VP @ NN/g
- Paul Hong, Director of Product Design & Experience @ Epicor Software
- Teena Singh, UX Strategy & Customer Insights @ ServiceNow
- Deborah Gill-Hesselgrave, Sr Human Factors Engineer & Executive Manager @ SPAWAR
Because we all have them, the panelists recounted how they felt when they realized they had failed before, during, or after conducting their research. Hoa recalled how, on her very first job, she organized a meeting to present her findings to a large group of military members, only to have no one show up! She realized very early in her career that it's not only about the researcher and the data; it's also about the needs of those around you. My lesson: make sure your insights are as much about the problem as they are about the impact on stakeholders, so those involved take more interest in your results.
Another good lesson came from Paul, who remembered facing very angry people when he took pictures without permission, compromising the whole session. My lesson: always be clear and upfront about your intentions!
Deborah’s memory of demoing a laser disk to elementary school students while ignoring the fact that her audience was easily distracted children taught her to always keep her audience in mind when planning her sessions, and always expect the unexpected by remaining flexible.
Teena made the crowd erupt in laughter as she told the story of when she had set up a user research lab in a conference room in Las Vegas and saw a homeless man emerge from under the table minutes away from starting her morning session with participants. Hey, what happens in Vegas…
To each their own
The panelists also observed the importance of figuring out the type of research required to uncover the desired data. Enterprise products, for example, have very specific users performing repetitive tasks with varying frequency of use. Being aware of your impact on the work environment and of the users' motivations (based on tasks, not desires) should be part of your planning process.
Similarly, working in a military environment poses recruitment problems because of its inherently hierarchical nature and rigid structure. Participants are sometimes commanded to participate, or told to perform tasks during the session that might clash with the work dynamic or with their normal behaviors and activities. There can be a fear of participating, which affects their performance on the research tasks. Understanding, rather than imposing on, the system in which users operate allows researchers to turn participants into converts.
Navigating the waters of human behavior
Even in controlled lab settings, humans will remain humans. As the panelists mentioned, some participants can be unhelpful, uncooperative, or downright rude. When they have strong opinions about the product or the company sponsoring the study, they will not hesitate to express them.
Other very human challenges: a well-intended team member might comment too much or guide the participant, effectively ruining the session. Ensuring stakeholders and other collaborators are aware of the testing plan and taught best practices can help avoid major headaches for researchers. A helpful tip from Teena is to send out separate meeting invitations with specific instructions to participants and stakeholders so each side knows what to expect. She also tells her clients, “if you can make it to one session, you can make it to two.” Hoa mentioned that she schedules 10 minutes at the end of each session for clients to ask the participant questions. She then has a separate debriefing with observers once participants have left.
Another point raised: you can have all the bases covered and still have something go wrong. Hoa proposes that instead of going over a heavily detailed test plan with specific tasks, keep the plan in mind to guide the conversation during the study, because "sticking too strictly to your discussion guides will not give you the deep, rich data you want. … This is how research becomes fun." And Teena suggests having themes to touch on during the session in lieu of a rigid plan.
Dealing with You-Know-Who
I'm talking about stakeholders. As a standard practice, five is the magic number of participants for qualitative research. But Hoa shared that most new clients will immediately request a large sample size, so she has a nifty formula:
Budget + Time = Feasibility
And a small sample size is always feasible. By prioritizing budget and breaking the project into smaller studies, it is easier to be flexible with the test plan and to pivot when needed.
Another important point Hoa mentioned was to make clients feel a part of the process, which helps build trust over time.
Teena summed it up pretty well when she said surveys tell you the what, but not the why. She followed with: "Use data well, and find the right situation to present your findings." If you're researching for a B2B product, it is vital to build a network within client companies in order to find the right users. Specialized studies require target users, so go the extra mile.
Finally, the panelists gave their impressions on research trends. Teena mentioned a more widespread adoption of design systems and hopes that discovery-based research and product strategy will merge in the future. In a similar spirit, Hoa advised the crowd to be proactive and not wait to be asked to research, in order to become part of the strategy. Paul foresees that the traditional researcher role will go away. Deborah reminisced about how HCI research used to be somewhat obscure, whereas now scientists from other disciplines are reaching out for guidance on how to focus their research through a human lens.
And what did I learn? No matter how comprehensive and detailed, a plan will always be flawed in the face of human interaction. But testing with the right audience nevertheless delivers valuable insights, which should ultimately become part of the product development strategy.
Be proactive. And yes, learn to expect the unexpected.
Thank you to our guest blogger, Azalia Medina Mexia, for writing this article.