
5 tips for SaaS user onboarding testing

Published May 24, 2023 ⚡ Updated on October 2, 2023 by Paul Doerwald

The SaaS dream is a no-touch self-service tool. Users sign up for a 14-day trial, onboard themselves, and convert to paid at the end. What could be sweeter?

Then reality hits. Your sign-up stats show that only a fraction of your new sign-ups complete onboarding. Even fewer reach the coveted “aha” moment. Onboarding is hard.

Worse, some tools require the user to take steps outside the app, such as installing a JavaScript snippet or a desktop or mobile app. Onboarding is really hard.

At Clockk we’ve been leveraging Userbrain’s testers to make our onboarding seamless.

This is our guide to successful user onboarding testing.

Context

Clockk is an AI-powered time tracker. It is more than a mere timer that users have to remember to start and stop. With Clockk, users work the way they want to, jumping from project to project. They never have to remember to click a timer.

New users not only need to orient themselves to Clockk’s unique user interface, but they also need to:

  • Install a desktop app
  • Install a browser extension
  • Link their email
  • Link their calendar

It’s hard to imagine a more challenging onboarding process.

The challenge of SaaS onboarding

SaaS onboarding is something that every user experiences exactly once. Never more, never less.

The best time to ask about what is confusing the user is while they’re confused, not after they’ve figured it out or given up.

New users are not good at giving onboarding feedback. They are not invested in your product enough to spend the time giving critical comments. There is no incentive to make the effort to give feedback.

You can’t test the onboarding experience with internal team members. No one on your team will ever have a true onboarding experience:

  • They’re a developer, so they’re not onboarding. They’re setting up a development environment. Onboarding bugs they see could be due to a misconfigured environment.
  • They’re in a product management, sales, or other internal role. They are not only onboarding to a new product but to a whole new job. Everything, right down to the chair they’re sitting on and the computer in front of them, is new.
  • They are onboarding to an administrative view of the product.
  • They have colleagues nearby, telling them where to click and what to type. They get answers to any questions.
  • Their goal is to impress their new colleagues and managers.
  • They are not in the product’s Ideal Customer Persona (ICP).

Needless to say, the internal user’s experience does not resemble a real user’s experience.

Internal users have almost no chance of catching any bugs. Their test happens in isolation; no one else is exploring the onboarding problem at that moment.

Because onboarding happens only once, it’s easy to let it slide and treat it as unimportant. After all, 100% of your active users have successfully completed onboarding! What could possibly be needed?

At Clockk, we’ve rethought and rewritten our onboarding process a dozen times. We’ve tried many different approaches as we searched for the optimal way to onboard users. Through all our rewrites, we’ve had one constant companion…

Enter: Userbrain

Userbrain is our Rosetta stone to understanding the obstacles users face when onboarding.

Userbrain testers are awesome:

  • They’re thoughtful and hard-working.
  • They won’t give up easily.
  • They read and follow the instructions.

But there are three important downsides:

  • They’re thoughtful and hard-working.
  • They won’t give up easily.
  • They read and follow instructions.

The big plus about Userbrain testers is also the big negative. Userbrain testers will stop at nothing to complete a task. Normal users, on the other hand, will abandon a task at the slightest obstacle.

The goals of Userbrain testers and normal users are not at all aligned.

But how do we overcome this misalignment?

We don’t test to see if a tester can complete the tasks; it’s a foregone conclusion that they will. Instead, we test for obstructions:

  • Where are the bugs that prevent or slow a user from moving forward?
  • Where are the “friction points”?

I’ll dig into both of these below.

We run Userbrain onboarding tests at two different points:

  1. While we’re working on improving the onboarding workflow.
  2. The rest of the time? Yes, we try to run a Userbrain onboarding test every few product deploys to make sure:
    1. We haven’t introduced any new bugs in onboarding
    2. What used to be true about onboarding remains true

Bugs

Developers test to pass.

When they test their work, developers choose inputs that they know will succeed. They want to know that their code works so they can check the box “done.” It’s a rare developer who tries to find ways their code doesn’t work.

It’s even more challenging for us at Clockk. Part of our onboarding requires installing a locally running tracker. Once the user leaves our web app, we lose all control:

  • Are our installation instructions clear?
  • Is the user familiar with installing software?
  • Are they able to successfully install and run the tracker?
  • Do they successfully authenticate?
  • Are there any system restrictions that prevent them from installing the tracker (e.g., administrative permissions, virus checkers, or malware)?
  • Is the tracker correctly signed and notarized?
  • Does the tracker successfully inform our server that it is installed and running? (See the sketch after this list.)
  • Does the tracker restart automatically on reboot?
  • Can the user find where the software is installed so they can open the window successfully?

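To make the “informs our server” check concrete, here is a minimal sketch of the kind of “installed and running” heartbeat a desktop tracker could send, so the web onboarding UI knows the desktop step succeeded. It’s written in TypeScript for a Node-based app, and the endpoint, payload shape, and function names are assumptions for illustration, not Clockk’s actual implementation.

```typescript
// Illustrative only: the endpoint, payload shape, and interval are hypothetical,
// not Clockk's actual implementation.
import { hostname } from "node:os";

const HEARTBEAT_URL = "https://api.example.com/tracker/heartbeat"; // hypothetical endpoint
const HEARTBEAT_INTERVAL_MS = 60_000;

interface HeartbeatPayload {
  deviceId: string;        // stable identifier created at install time
  appVersion: string;      // lets the server flag outdated or broken builds
  platform: NodeJS.Platform;
  startedAt: string;       // ISO timestamp of when the tracker process started
}

async function sendHeartbeat(payload: HeartbeatPayload, authToken: string): Promise<void> {
  try {
    const res = await fetch(HEARTBEAT_URL, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${authToken}`,
      },
      body: JSON.stringify(payload),
    });
    if (!res.ok) {
      // The server saw the ping but rejected it (e.g. expired auth),
      // exactly the kind of silent onboarding failure worth surfacing.
      console.warn(`Heartbeat rejected with status ${res.status}`);
    }
  } catch (err) {
    // Network failures here are invisible to the web app, which is why the
    // server-side "have we heard from this tracker yet?" check matters.
    console.warn("Heartbeat failed:", err);
  }
}

// Ping once right after a successful install and sign-in, then keep pinging,
// so the web onboarding checklist can mark the "desktop app running" step done.
export function startHeartbeat(authToken: string): void {
  const payload: HeartbeatPayload = {
    deviceId: hostname(),            // placeholder; a real app would persist a UUID
    appVersion: "0.0.0-example",
    platform: process.platform,
    startedAt: new Date().toISOString(),
  };
  void sendHeartbeat(payload, authToken);
  setInterval(() => void sendHeartbeat(payload, authToken), HEARTBEAT_INTERVAL_MS);
}
```

The interesting parts are the failure paths: a rejected or lost ping is exactly the kind of silent breakage that only surfaces when a fresh tester installs the app on a messy, real-world machine.
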
There are over 100,000 Userbrain testers who have never installed Clockk before. Each one can show us new and interesting ways that our software will break in their environment.

Even better, because they’re software testers, they install all sorts of software. Who knows what state their computer is in? It’s a frightening, messy world out there!

Bugs are easy to recognize. Whether the bug causes the software to fail outright, or it’s a subtle design bug caused by a small, low-resolution screen, it is immediately obvious to any developer watching a Userbrain test session. Bugs are binary: some are worse, some are better, but the software either works or it doesn’t. Bugs are easy to fix.

Friction points, on the other hand, are harder to recognize and much more interesting.

Friction points

Friction points are the spots in user onboarding where the user slows down.

Everything runs smoothly until, at a particular spot, the user hesitates or stops.

The slow-down won’t be obvious until you’ve run a few tests and you’ve seen several testers slow down at the same point. Here are some examples we’ve seen:

Some steps are cognitively expensive

An early version of Clockk’s onboarding had these steps:

  1. Name, email, password, agree to license agreement
  2. Company name, number of team members
  3. Enter your first clients & projects
  4. Install the browser extension
  5. Install the desktop app
  6. Done!

We found that step 3 — enter your first clients & projects — was surprisingly difficult!

Our screening questions ensured that our testers were in our Ideal Customer Persona (ICP). Everyone who onboarded should have been able to easily list several clients and projects.

It took about 5 tests for us to realize that testers struggled with this step. Once they completed it, the pace of onboarding resumed.

On reflection, we realized that this step was cognitively expensive.

The user was onboarding to a new time-tracking tool. They were answering biographical questions and setting up software. They wanted to get through the process to the other side.

With this third step, we asked users’ brains to switch gears. They needed to context shift to their professional lives. This was a significant cognitive burden.

The solution was simple. We moved clients and projects out of the onboarding and into the application. In the application, setting up clients and projects wasn’t a cognitive burden. In that context, it was the logical next step.

Userbrain helped us see friction we could never have anticipated.

A big checklist is overwhelming

In a more recent iteration of our onboarding, we had a single screen with 5 tasks the user needed to complete:

  1. Install the browser extension
  2. Install & run the desktop app
  3. Set up Active Window tracking
  4. Link email
  5. Link calendar

Once again, it was around the fifth test that we noticed it: when testers saw five empty checkboxes, they sighed.

It was the briefest of sighs, and the pause lasted only a moment, but it was there every time.

We learned that 5 steps are enough to feel overwhelming.

Our solution was simple:

  • We removed steps 3-5, leaving steps 1 and 2.
  • When the user completed steps 1 and 2, we said, “Congratulations! You’re done! Here are 3 more things you can do that will make your Clockk experience better!”

There were no more sighs or pauses. Users jumped into and completed steps 1 and 2. When steps 3-5 appeared, they felt like natural extensions to what the user had already been doing. The extra steps felt like they were always there. Users kept their momentum and completed all 5 steps with ease and without complaint.
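
The mechanics of that change are small. Here is a minimal sketch, again in TypeScript, of a checklist that exposes only the two required steps up front and reveals the remaining three once those are done. The step labels mirror our list above, but the data structures and function names are hypothetical, not Clockk’s code.

```typescript
// Hypothetical sketch of progressive disclosure for an onboarding checklist.
// The step labels mirror the post; the types and logic are illustrative only.
interface OnboardingStep {
  id: string;
  label: string;
  required: boolean; // required steps are shown up front; the rest appear later
  done: boolean;
}

const steps: OnboardingStep[] = [
  { id: "extension", label: "Install the browser extension", required: true,  done: false },
  { id: "desktop",   label: "Install & run the desktop app", required: true,  done: false },
  { id: "window",    label: "Set up Active Window tracking", required: false, done: false },
  { id: "email",     label: "Link email",                    required: false, done: false },
  { id: "calendar",  label: "Link calendar",                 required: false, done: false },
];

// Show only the two required steps at first. Once both are done, reveal the
// three optional follow-ups, so the user never faces five empty checkboxes.
function visibleSteps(all: OnboardingStep[]): OnboardingStep[] {
  const required = all.filter((step) => step.required);
  const requiredDone = required.every((step) => step.done);
  return requiredDone ? all : required;
}

function headline(all: OnboardingStep[]): string {
  const requiredDone = all.filter((step) => step.required).every((step) => step.done);
  return requiredDone
    ? "Congratulations! You're done! Here are 3 more things that will make your experience better."
    : "Two quick steps to get started";
}

// With nothing completed yet, only the required steps are visible:
console.log(visibleSteps(steps).map((step) => step.label));
// -> ["Install the browser extension", "Install & run the desktop app"]
console.log(headline(steps)); // -> "Two quick steps to get started"
```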

Thanks to Userbrain testing, our onboarding rate (real-life users who install at least one tracker) is 75%. Given how challenging these steps are, it is remarkable that we got such a high rate. We owe it all to Userbrain.

5 tips for SaaS user onboarding testing

We’ve summarized the five tactics we use to make our Userbrain user testing successful.

1. Use screener questions to narrow testers down to your ICP

Userbrain has over 100,000 testers. They have a wide variety of experience in all kinds of industries. Use screener questions to find testers that have the experience you’re looking for.

Userbrain only allows 2 screener questions. We cheated and added a third – the first step of our test. This is what we say:


To check that the tester told the truth, we follow up with our next question:

If the tester was honest in question 1, they should have no trouble answering question 2. In practice, we’ve seen testers pause the recording, do a quick Google search, and then answer the question. (They leave a Google search or a competitor’s window open.) We can reject the test (Userbrain’s 100% satisfaction guarantee is excellent!) or take the results with a grain of salt.

Use screener questions to get the right tester. Don’t be afraid to test them by asking them a question that someone in your ICP should know the answer to.

2. Don’t test to pass — test to find bugs and friction

I’ve discussed this at length above, so I’ll keep this section short. Assume that Userbrain testers will pass the test and will complete the goal. If they actually fail to achieve the goal, then you have a much bigger problem.

As the saying goes, it’s not the destination; it’s the journey. It is more likely that the tester will run into bugs or friction along their journey.

3. Run at most 3 tests at a time

Userbrain gives you tooling to run tens or hundreds of tests at once. It’s in their interests to have you do so because you’ll use more credits. It’s also convenient for you because you can get feedback in the form of quantifiable reports. “We scored 8.4/10 on ‘How easy was it to use our software’!” sounds great! Plus, objective reports save you from the messy, tedious work of watching long videos.

Never do this.

You will learn very little from quantified reports.

Instead, run between 1 and 3 tests at a time, and watch every recording, looking for subjective details.

This is our process:

  1. Run a single test
  2. Watch the test.
  3. If the test fails (i.e., we identified a bug or a friction point):
    1. Fix the issue
    2. Go to step 1
  4. If the test passes:
    1. Go to step 1
    2. Repeat until we’ve had 3 consecutive bug-free and frictionless tests

Only when 3 consecutive testers move through our test without hitting a bug or friction point are we confident that our onboarding process is good.

4. Choose subjective questions over objective ones

Userbrain offers several question types: tasks, rating scales, single and multiple choice questions, and written responses.

We recommend only using “Task.”

Ratings tend to skew positive. We’ve observed testers struggle through a task and then rate the software 10/10. They rate based on whether they succeeded: the score reflects how pleased they are with themselves for finishing, not the struggles along the way.

Single and multiple choice questions produce objective data that looks great in reports. However, the real value of Userbrain tests is in their subjective data.

Written responses are good, but it’s tedious to watch someone type out their answer as they speak it.

That’s why we use “Task.” We ask the user to speak the answer to our questions out loud. Unlike a written response, a spoken answer lets you hear emotion. For example, we had one tester who became increasingly agitated as he went through our test. A written response would have lost this emotional nuance. The audio was important to understanding and contextualizing his feedback.

5. Ask users to explain your software in their own words

In addition to testing if our onboarding works, we want to know if our users understand what Clockk is and what it does.

Since we screened testers down to our ICP, we have the opportunity to find out if our messaging worked.

These are the questions we ask at the end of a test:

If the tester can describe what Clockk does and for whom, then it is likely that an in-the-wild user can do the same. That means our messaging is accurate.

Your testers should be able to answer questions like these. If they can’t, you should review your messaging and make your value proposition clearer during onboarding.

Final thoughts

Userbrain has been a lifesaver for Clockk, especially given our onboarding complexity. Userbrain gives us insight into both our users’ computers and their heads as they onboard.

The killer feature of Userbrain is watching the test recording with your colleagues. There are few experiences as humbling as watching users fail — because of your bug or bad decision — and not being able to help them.

This killer feature is the only reason why Clockk’s onboarding works as well as it does. Userbrain is an essential part of our onboarding workflow.

If you would like to know more about how we create perfect user tests for Clockk, check out my presentation here:

About the author

Paul Doerwald is Founder & CEO of Clockk.com Inc. Years of forgetting to start (or stop) his timer made Paul think “there must be a better way” to do time tracking. Clockk’s AI timesheets are the better way. When he’s not thinking about Clockk, Paul likes to relax by playing with his son, running, and skiing. One day, he will show up at the Userbrain office in Austria with skis and poles in hand, looking for people to go skiing with. ⛷️
