The Key to Excellent Remote User Testing

Benefits and Goals

Excellent user testing uncovers gaps between user expectations and reality. It reveals how users perceive the specifics of your software, app, or prototype, and it shines a spotlight on how users actually use your product (or prototype) versus how they're supposed to. These gaps are what define friction for users, which in turn results in frustration and low user retention; in short, these gaps between user expectations and software reality are what define user (un)happiness.

Who to User Test

Your current users are a great resource for user-testing new ideas and new features, especially at an early, clickable-prototype stage. However, current users are not recommended for testing your existing live product because of their level of familiarity. Revealing gaps and opportunities in your software's existing implementation is best done with individuals who fit your key user personas but who haven't seen your product before. For B2B or enterprise SaaS applications, the key user persona is typically a combination of job title and level of technical expertise. For consumer applications, it is more lifestyle-related demographic information. Data to identify the key user personas can come out of User Discovery Interviews.

How User Testing is Done Remotely

I have written more extensively about how to conduct remote user testing. My view is that it's more effectively conducted by a UX professional, or even any team member within your organization, than outsourced to third-party user testing platforms such as usertesting.com. The gold nuggets are revealed by probing questions and digging in deeper, not by watching videos of users walking themselves through tasks in a linear fashion (which is the output of user testing platforms). User testing is not a linear exercise; it's a free-flowing conversation based on guidelines and scripts that test existing hypotheses, with many tangents that lead to insights. It's like therapy: you never know where you'll end up, and sometimes you won't remember how you got there.

Deliverables and Outcomes of User Testing

The deliverables include a document outlining takeaways and findings from each round of user tests, which typically consists of 5-7 individual tests. A full transcript of the user tests is included as well, since various stakeholders within your organization might catch different issues based on their perspectives and expertise. A UX Watch Party might be part of the output too, consisting of recordings of users' struggles along with quotes capturing the key takeaways.

If you want to find out if Remote User Testing is appropriate for your organization’s needs, feel free to reach out to us! Subscribe to our newsletter to keep up with our series on UX Research Roadmaps.

Logistics of Remote User Testing During Covid

Covid-19 has changed the landscape of work, forcing almost everyone to either work remotely or do remote “parties” with friends. Even my parents have been doing Zoom calls as a means to socialize. This makes it a great time to do remote usability testing, since “Zoom” has become a household name. In this post, I’ll cover the logistics of how to set up remote user tests during Covid-19.

Determine Who to Test With

The pool of less tech-savvy candidates you can user-test with has grown immensely, in the sense that many more people are now used to getting on a Zoom call. This matters especially if the target market you need to test with is not very tech savvy. Now you can share a calendar invite with a Zoom link, or just the Zoom link itself, and they'll know how to hop on without much trouble. Chances are it won't be their first time using Zoom.

Recruit Users

This is the hardest step for any user test, and it gets harder the more specific your users are, which is usually the case when you are testing a B2B enterprise product. For enterprise apps, a user's job title, years in the industry, and indicators of their technology skills, such as age, matter greatly. LinkedIn can be used for outreach, as can specialized websites dedicated to finding and recruiting these people for research purposes. If it's an existing product, current users can test new features, and their referrals can put fresh eyes on existing features.

For B2C consumer products, demographic information such as age, sex, and hobbies is typically what matters. These candidates are easier to recruit: subreddits for specific hobbies, and even Craigslist, can be used. And if it's an existing product, your current users can test new features, and they can be incentivized to invite friends who will bring fresh eyes to existing features.

Schedule Events with a Zoom Link

Send a calendar invite with a Zoom link in it, including any instructions for what they need to prepare. For example, if you expect them to go through a certain prototype, send them the link so they have it open beforehand, but instruct them to not go any further.

Compensate

A $20-$50 Amazon gift card is on the lower end of fair compensation for an hour of someone's time in a user test. Lotteries and prize draws also work, as does free credit toward your service. Compensation is the key to making the recruitment piece easier.

Case study: User testing script for an iPhone app

Identifying the Usability Sticking Points

This client came to me with an app that was already in the App Store but had poor retention. My first step was to understand the goal of the application by asking the appropriate questions, such as: What problem is your app solving?

The next steps were to identify the key use cases and features that a user needs to go through before becoming a sticky user. In this step I also used MixPanel, their go-to analytics tool, to help identify some of these sticky users and their behaviors.

This was followed by a heuristic evaluation to create hypotheses about what the sticking points might be, and to cross-correlate them with any existing data points available through an application like MixPanel.
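As an illustration of that kind of analytics cross-check (a rough sketch, not the client's actual analysis), exported event data can be scanned for users who repeatedly perform a key action. The file name, column names, the "song_created" event, and the three-day threshold below are all assumptions for illustration only:

```python
# Rough, hypothetical sketch: flag "sticky" users from exported analytics events.
import pandas as pd

events = pd.read_csv("mixpanel_export.csv", parse_dates=["timestamp"])

# Keep only the key action we care about (assumed event name).
key_action = events[events["event"] == "song_created"]

# Count the distinct days on which each user performed that action.
active_days = (
    key_action.assign(day=key_action["timestamp"].dt.date)
    .groupby("user_id")["day"]
    .nunique()
)

# Call a user "sticky" if they performed the key action on 3 or more separate days.
sticky_users = active_days[active_days >= 3].index.tolist()
print(f"{len(sticky_users)} sticky users out of {events['user_id'].nunique()} total")
```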

Usability Test Methodology

The final step before running the actual user tests was to create the user test methodology and script based on the information found in the previous steps. This involved creating tasks for users to perform (e.g., "What would you do if you wanted to replay your song?"), without giving any hints or clues as to how to do it. Some of these tasks were only there to throw the user off before the next task. Some tasks asked for one thing while I was observing for something else. Here is a subsection of the script, tasks, and questions:

[Briefing: The purpose of this user test is to find out how to make Whispa more user friendly. I’ll be giving you several tasks, and questions. Please think out loud, the more you talk and express your thought process the better. There is no right or wrong.]

[Question: In your own words, how would you describe what Whispa does, to a friend?]

[Task: Create a song with drums and piano]

[Task: Add a guitar to the stage and make its volume 40%]

[Task: Listen to a song someone else created on Whispa]

...

The goal of the methodology was to cover the major use cases, as well as the major UX sticking points that were discovered or hypothesized in the previously described steps.

It is also important to gather relevant demographic information. For this particular app, one hypothesis was that users who DJ understand the app better than those who do not. Other demographic information gathered covered which apps the users currently use, to determine whether there is a correlation between tech savviness, or social media savviness, and the Net Promoter Score.
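For reference, this is roughly how a Net Promoter Score can be tallied per demographic group from the 0-10 "how likely are you to recommend" responses. The sample responses and group labels below are made up purely for illustration:

```python
# Minimal sketch of computing Net Promoter Score per group of testers.
from collections import defaultdict

# (group, 0-10 likelihood-to-recommend score) -- invented sample data
responses = [
    ("DJ", 9), ("DJ", 10), ("DJ", 7),
    ("non-DJ", 4), ("non-DJ", 8), ("non-DJ", 6),
]

by_group = defaultdict(list)
for group, score in responses:
    by_group[group].append(score)

for group, scores in by_group.items():
    promoters = sum(s >= 9 for s in scores)   # scores of 9-10
    detractors = sum(s <= 6 for s in scores)  # scores of 0-6
    nps = 100 * (promoters - detractors) / len(scores)  # ranges from -100 to +100
    print(f"{group}: NPS {nps:+.0f} (n={len(scores)})")
```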

Post-test screening questions

“Are you able to play instruments or DJ?”

“Have you ever used any DJ or music production software in the past? If yes, when and for how long?”

“Which social media tool do you use?”

“How old are you?”

“What smartphone do you own?”

“What are your favorite apps?”

 

The following is an example of the key discoveries from one user test.

[Image: whispa2.jpeg — key discoveries from one user test]

 

Usability Test Outcome

Based on the first round of user testing, I listed out all the problems, ranked them in terms of impact, and brainstormed solutions.
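To give a flavor of that ranking step (a hypothetical sketch; the issues and scores below are invented, not the client's actual findings), each problem can be scored by severity and by how many of the testers ran into it:

```python
# Hypothetical usability issues from a round of tests; severity is 1-5,
# frequency is how many testers hit the problem. All values are invented.
issues = [
    {"issue": "Couldn't figure out how to replay a song", "severity": 4, "frequency": 5},
    {"issue": "Volume control hard to discover",          "severity": 3, "frequency": 4},
    {"issue": "Purpose of the app unclear on first open", "severity": 5, "frequency": 3},
]

# Simple impact score: severity weighted by how many testers ran into the issue.
for item in issues:
    item["impact"] = item["severity"] * item["frequency"]

for item in sorted(issues, key=lambda i: i["impact"], reverse=True):
    print(f"impact {item['impact']:>2}  {item['issue']}")
```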

The chosen solutions were implemented by the client's development team. This was followed by another round of user testing, which found that the average score on the Net Promoter question increased from 2/10 to 8.5/10 between the two iterations.

[Images: whispa3.jpeg, whispa4.jpeg]