Hound Download & Onboarding
August 2019
Hound is a personal assistant app that uses voice AI to help you complete any task. Stakeholders wanted a better understanding of a user’s onboarding experience with Hound. To accomplish this, I followed users’ actions on the Google Play store & in the app to determine what content they interacted with and whether it adequately prepared them to use Hound for the first time. The test led to actionable insights and changes that significantly increased app downloads.
Background
Hound’s onboarding experience was pretty bare-bones, so most app education actually came from the Google Play store page that users interacted with before they even downloaded the app. Although Hound’s store page was full of great info, it was unclear whether users were interacting with it in the way that the product and marketing teams expected. Not only does the content on that page serve as marketing material, it is also the user’s first glimpse at how to use the app. In partnership with another UX researcher, I was tasked with determining exactly what users did on the store page and then how they interacted with Hound after their first download.
Process
This research request was prompted by the Director of Product Management, but other key stakeholders included a UX designer and members of the marketing team. First, we met with the product manager to figure out exactly what she wanted to learn from the research, outline the project, and determine its timeline.
Given time and budget constraints, we chose to run an unmoderated usability test designed and implemented through UserTesting.com. We picked this method because it allowed us to reach a diverse sample of people with no prior experience with Hound, and because I worked remotely, it let me contribute to the project without being located in the office. Finally, this method also allowed me to review recordings multiple times and share them with stakeholders to drive home certain points.
This project was completed in two rounds. In the first round, the senior UX researcher created the test script and entered it into UserTesting’s platform. I then analyzed the testers’ videos, paying specific attention to what content they interacted with, what they ignored, and how easily they completed tasks once within the app. I presented the results in a deck to the relevant stakeholders. This meeting led to further questions and resulted in a second round of testing.
I took the lead on this round, editing the test plan to focus more on the user experience after download. I put a larger focus on analyzing which kinds of tasks testers performed with Hound. I also quantified user behavior to make the data more actionable. For example, I tracked how many testers interacted with a feature to approximate how many real users would behave the same way. This round of the project ended with a presentation to the stakeholders and the recommendations I drew from the data.
Impact
Marketing then took my recommendations and redesigned some of their materials. An A/B test in the Google Play store compared the performance of the app’s listing with different images. This test came directly from the insight that users skimmed the information available to them, even within the screenshots on the store page. The version pictured below, with the updated images, showed a 6% to 31% increase in installs from the page.