Friday, December 20, 2019

User Testing: Seeing From Your Audience's Perspective

By Ramzi Badwi, Program and Outreach Coordinator, Michigan Legal Help Program

MichiganLegalHelp.org is a website that depends on its ability to express information clearly and efficiently.  Because we provide legal information for people who cannot afford a lawyer, it is easy to imagine how desperate a visitor to the website may be to find what they need and understand it thoroughly.  We take several steps to ensure that our information is accessible and user-friendly, including writing our content at a sixth-grade reading level.  Another very important step we take toward accessibility is testing our materials directly with users and analyzing their feedback.

If you have ever been involved in user design, you know how difficult it can be to predict how others will feel about a choice you make.  User testing allows us to ask the public directly about their preferences, and the results can be surprising.  Oftentimes, the people making decisions about user design know the product so well that it is difficult for them to imagine how a first-time visitor would view it.  Placing a feature in a particular location on a website might seem intuitive only because you’ve spent an extensive amount of time with it.  User testing helps illuminate what feels intuitive to the visitor so that you can design a seamless and easy-to-use website, tool, or process.  User testing can be done by anyone; all you need is an audience!

Planning Your Session
A successful venture into user testing requires a bit of preparation.  One basic step you can take is to keep a running list of all of the aspects of your website, tool, or process that you’d like to user test.  This list can help you decide other parameters, like how long the test will be, how you will compensate your participants, and how often you should user test.  You may want to keep the time required to complete your test to about 30 minutes or less.

Once you finalize the list of questions you’d like to answer, you can begin thinking about how you want to test them.  A/B testing is a popular type of test in which visitors compare two versions of the same content.  For example, if you’d like to know whether your audience prefers a brief description of a legal term over a more in-depth one, you can give them both definitions separately and have them compare the two.  Another type of test we often use is observational testing: give the participant a task and observe them while they complete it.  In the case of MichiganLegalHelp.org, this could be asking the participant to find a specific tool on the website.  We can observe things such as where the participant looked first and how long it took them to find the tool, which tells us whether its current location is best.  During these tests, it is helpful to ask the participant to think out loud while they work so you can hear why they are looking where they are or clicking on a particular link.  Once you know how you’d like to test each inquiry, you can draft the related set of questions and exercises.  These substantive tests will make up the core of the user testing experience.  However, we believe it is also important to gather supplemental information to help provide context.

The supplemental information we collect is made up of the participants’ demographic factors and other useful information that might qualify results from the substantive test.  This helps ensure that we are representing our audience well, and may help explain any anomalies we find when analyzing the test results.  The demographic questions we ask our testers touch on topics such as gender, race/ethnicity, and highest level of education.  We also ask our participants about their comfort with computers, familiarity with our website, and familiarity with the judicial system in Michigan.  This information can be crucial to understanding the data we gathered and any changes we need to make to the website as a result.  For example, if half of the participants took much longer than the others to find a tool on our website, but they all also self-identified as being uncomfortable with computers, that information is now qualified and makes more sense as we start to analyze the data as a whole.
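For teams that keep their results in a spreadsheet or a simple script, here is a rough, hypothetical sketch in Python of how completion times might be grouped by a supplemental answer like comfort with computers.  The participants, field names, and numbers are made up for illustration; this is not our actual data or tooling, just one way the idea could look in code.

    # Rough, hypothetical sketch: group how long each tester took to find a tool
    # by their self-reported comfort with computers from the pre-test survey.
    from collections import defaultdict

    results = [
        {"participant": "P1", "comfortable_with_computers": True, "seconds_to_find": 40},
        {"participant": "P2", "comfortable_with_computers": True, "seconds_to_find": 55},
        {"participant": "P3", "comfortable_with_computers": False, "seconds_to_find": 160},
        {"participant": "P4", "comfortable_with_computers": False, "seconds_to_find": 140},
    ]

    times_by_group = defaultdict(list)
    for r in results:
        label = "comfortable with computers" if r["comfortable_with_computers"] else "not comfortable"
        times_by_group[label].append(r["seconds_to_find"])

    for label, times in times_by_group.items():
        average = sum(times) / len(times)
        print(f"{label}: average {average:.0f} seconds across {len(times)} participants")

A summary like this makes it easy to see when a slow average time tracks a supplemental answer rather than a problem with the website itself.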

We also like to make sure each participant has plenty of opportunities to give us feedback.  They are given the chance to mention any issues they experienced with a particular exercise and with the user testing experience as a whole.  This helpful feedback gives us the opportunity to improve our tests.  We choose to place the supplemental questions in a pre-test survey, and the questions about the overall experience in a post-test survey.

Time to Test
Choosing your location for user testing is an important piece of the puzzle.  Ideally, you will find participants there who represent a fairly accurate sample of your usual audience.  It is also important to form a professional and courteous relationship with your user testing host.  You will want to inform them of all of the necessary details and any accommodations you might need, with ample time for them to consider your proposal.  For Michigan Legal Help, we know that our website is often used in public buildings like libraries.  For this reason, we often choose a local library as our user testing grounds.  Another important factor to consider when preparing for user testing is your visitors’ motivation to participate.  We provide compensation for user testing with the Michigan Legal Help Program.  Usually our tests take a visitor 20 to 30 minutes to complete, for which we provide a $20 grocery store gift card.

An average day of user testing for us consists of a team of at least two staff members visiting the user-testing location for about five or six hours.  We usually see between 10 and 15 testers.  The time required can vary greatly depending upon the location’s foot traffic.  You may have to go around and ask folks if they’d like to participate in order to reach your required number of testers.  We recommend having at least one or two signs that invite people to participate.

We Conducted the Test, Now What?
You did it!  You successfully conducted user testing on all of the topics you were wondering about.  Now you have to analyze your results.  These results will be used to inform your design choices, so the way they are analyzed and presented is important.  You’ll want to communicate the goal of each exercise and briefly explain the method you used to get your answer.  It is helpful to create accurate and easy-to-read graphics to visually display the results to your team of decision makers and designers.  If there are any outliers or caveats to the results, make sure to note them.  Then you and your team can discuss the results and whether they were conclusive.
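If someone on your team is comfortable with a little scripting, a simple graphic can be produced in a few lines.  The sketch below uses Python with the matplotlib plotting library and placeholder counts from an imagined A/B comparison; the numbers are not real test results, and a spreadsheet chart works just as well.

    # Hypothetical sketch: a simple bar chart of an A/B comparison for the team.
    # The counts below are placeholders, not real test data.
    import matplotlib.pyplot as plt

    labels = ["Brief definition", "In-depth definition", "No preference"]
    counts = [7, 4, 2]  # how many testers preferred each option (made-up numbers)

    plt.bar(labels, counts)
    plt.ylabel("Number of participants")
    plt.title("Which definition did testers prefer?")
    plt.tight_layout()
    plt.savefig("ab_test_results.png")  # share this image with decision makers and designers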

If the results were all conclusive, you may want to make decisions based on them.  If some of the tests were inconclusive, the team may want to make the decision they believe is best for their visitors.  Another option is to test again, perhaps with a different question or exercise.  And it does not have to end with one round; you and your team can conduct additional rounds of user testing regularly.

User testing is an invaluable tool when making design choices.  Participants can bring a fresh perspective to your team for a low cost and a couple of days of work.  Michigan Legal Help views user testing as a way to keep making our website more accessible and to stay connected to our target audience.  We encourage other organizations to give it a try as well.  If you would like to talk more about user testing, please contact us!