A Guide To Simple And Painless Mobile User Testing

The incredible growth of mobile and the proliferation of mobile devices have made the UX designer’s job more challenging and interesting. This growth also means that user-testing mobile apps and websites is now an essential part of the UX toolkit.

But unlike on the desktop, where out-of-the-box packages such as Silverback or Camtasia are available, no software is specifically designed to record mobile usability tests.

Even if you’re not developing a mobile app, chances are that a large proportion of your website traffic is coming from mobile. Running regular mobile usability tests is the only way to gauge how well this channel is working for your customers.

A bit of hacking is required. And, after years of experimentation, we think we’ve found the best hack yet. If you want to test iPhone or Android experiences, this solution is simple, cost-effective and high quality.

The Old Hack: Wires And Duct Tape

In days gone by, we used a “sled” to mount a smartphone and camera into a position where we could record what users were doing on screen. (To create the sled, we bought some acrylic in a hardware store and bent it into shape over a toaster. Fun.)

We attached a webcam to the sled with duct tape and mounted the phone with tape and some velcro strips. Looking back, the device was pretty crude. It wasn’t a natural experience for users, who would often cradle the phone in two hands to keep the sled steady.

A user with a smartphone and camera mounted on a sled.

Technically, it wasn’t reliable. Because we were using two cameras on one laptop (the camera on the sled and the laptop’s built-in camera), we had to have two camera apps open at the same time. This led to flaky performance: either setup would be stressful, or we’d get a blackout in the middle of a test — or, often, both.

And there were other issues, such as screen glare and camera focus. Overall, it was time-consuming to set up, with unreliable performance and a suboptimal testing environment. It was particularly stressful if clients were around, but it was the best solution we knew at the time.

A Better Way: Wireless

Ideally, testing equipment and software should be invisible to the users. We want as natural an environment as possible, just the users and their smartphones — no wires, sleds, cameras or duct tape.

For the UX team, the focus should be on learning and insight. We don’t want to be sweating over the setup or worrying about blackouts.

I’d like to introduce you to a simple setup that achieves all of these goals. It allows the UX team to focus on what really matters, and lets users focus on their phones. And it’s so reliable that we regularly use it in front of clients and during our training classes.

We’ll focus here on testing usability on smartphones, using a MacBook as the recording device. But the approach works with Windows PCs, too.

Wireless testing is more natural.

Step 1: Install Software

The magic ingredient in this setup is Apple’s AirPlay, the wireless technology that lets you stream music or video to an Apple TV.

So, the first piece of software you’ll need to buy (for $15) and install is Reflector 2, which turns your laptop into an AirPlay receiver, just like an Apple TV. This lets us mirror the user’s smartphone screen onto the laptop: whatever is on the user’s smartphone will also be shown on the laptop.
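
Under the hood, AirPlay receivers announce themselves over Bonjour (mDNS) as `_airplay._tcp` services, which is how the phone finds the laptop. If the MacBook isn’t showing up on the phone later, browsing for that service type is a quick sanity check. The sketch below shows the idea; it parses a captured sample of `dns-sd` output (the receiver name `Colmans-MacBook-Pro` and the sample lines are made up) so it runs anywhere:

```shell
# On macOS, this would list AirPlay receivers live (press Ctrl-C to stop):
#   dns-sd -B _airplay._tcp local.
# The demo parses a captured sample of that output instead, so it can
# run on any machine; the receiver name here is hypothetical.
sample='Timestamp  A/R Flags if Domain  Service Type   Instance Name
12:00:01   Add     2  4 local.  _airplay._tcp. Colmans-MacBook-Pro'

# Print the instance name of every advertised AirPlay receiver.
echo "$sample" | awk '/_airplay\._tcp/ {print $NF}'
```

If the MacBook’s name doesn’t appear in the live listing, Reflector isn’t being advertised; check the firewall settings and the Wi-Fi network.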

Now we don’t need an external camera to record the user’s smartphone screen. We just need screen-capturing software to record the mirrored smartphone on the laptop. My personal favorite is ScreenFlow ($99), for two reasons: it’s reliable, and it uses the laptop’s camera to record the user’s face during the session, an essential component of any usability test.

Step 2: Set Up Monitor

This step is optional, but I like to use an external display so that the facilitator and notetaker don’t have to peer over the user’s shoulder to see the action. It also minimizes distraction for the user; they won’t see a giant version of their smartphone on the laptop in front of them — it will be on the monitor instead.

So, connect a display cable from your MacBook to the monitor. If the monitor and the laptop are showing exactly the same thing, the displays are being mirrored, which we don’t want. Open “System Preferences,” select “Displays,” and make sure the “Mirror Displays” box is unchecked.
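
For the command-line inclined, you can also double-check the arrangement without opening System Preferences. This is only a sketch: on macOS the display report would come from `system_profiler SPDisplaysDataType`, whose exact field names vary by version, so the function below checks a captured (assumed) sample instead:

```shell
# Succeeds (exit 0) when no display in the report is mirrored.
mirroring_disabled() {
  ! grep -q "Mirror: On" <<< "$1"
}

# Captured sample of what `system_profiler SPDisplaysDataType` might
# show with an extended (non-mirrored) desktop; details are assumed.
report="Color LCD:
  Main Display: Yes
  Mirror: Off
2270W:
  Mirror: Off"

mirroring_disabled "$report" && echo "extended desktop - good to go"
```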

The correct display preferences for your Mac.

Step 3: Set Up Reflector

To start beaming the smartphone to the laptop, open Reflector on your Mac. You’ll see the Reflector icon in the toolbar in the top left of your screen.

You’ll see this icon when Reflector is open.

Step 4: Mirror Your Smartphone

Now we come to the magic part. If you’re using an iPhone, swipe up from the bottom of the screen, and enable AirPlay. Then select your MacBook from the list (in this example, it’s “Colman’s MacBook Pro”). Finally, flick the “Mirroring” switch to active (green).

Turning on mirroring on your iPhone.

Your iPhone should now appear in the middle of your external monitor. Magic! (If the iPhone appears on your MacBook’s screen, just drag it onto the external monitor.)

For devices with Android 4.4.2 or higher, swipe down from the top of your screen to access the settings. Select the “Cast screen” option, and then select your MacBook.
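
When casting won’t connect, a hedged fallback (not part of the article’s setup) is Android’s built-in `screenrecord` command over a USB cable with `adb`. It saves a video on the phone rather than mirroring live, so you lose real-time observation, but it is free. The snippet exits cleanly on machines without `adb` installed:

```shell
# Requires USB debugging enabled on the phone and adb on the laptop.
record_android_screen() {
  if ! command -v adb >/dev/null 2>&1; then
    echo "adb not installed - skipping"
    return 0
  fi
  # screenrecord caps each clip at 3 minutes (180 s); record, then copy
  # the file to the laptop for review alongside your notes.
  adb shell screenrecord --time-limit 180 /sdcard/session.mp4
  adb pull /sdcard/session.mp4 ./session.mp4
}

record_android_screen
```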

Note: Your smartphone and MacBook need to be on the same Wi-Fi network for any of this to work. It’s the first thing to do when troubleshooting if you can’t get it to work right away.
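
The shared-network check can be made mechanical. On macOS the laptop’s SSID could be read with `networksetup -getairportnetwork en0` (the interface name, often `en0`, varies by machine), and the phone’s SSID is read off its Wi-Fi settings screen. The sketch below compares two SSIDs; the network names are hypothetical:

```shell
# AirPlay discovery only works when both devices share one Wi-Fi
# network. Succeeds (exit 0) only for two identical, non-empty SSIDs.
same_network() {
  [ -n "$1" ] && [ "$1" = "$2" ]
}

if same_network "StudioWiFi" "StudioWiFi"; then
  echo "same network - the phone should find the MacBook"
else
  echo "different networks - AirPlay discovery will fail"
fi
```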

Step 5: Set Up ScreenFlow

To start recording, open ScreenFlow; the new recording configuration box will appear. You’ll need to adjust the following three settings:

Setting up ScreenFlow.
  • “Record desktop from”
    Check this and make sure to select the external monitor from the dropdown menu (“2270W” in the screenshot above).
  • “Record video from”
    Check this and select “FaceTime HD Camera (Built-in),” which is the default option.
  • “Record audio from”
    Check this and select “Built-in microphone.”

Step 6: Start Recording the Test

Position the user directly in front of the MacBook. You should see their face in the ScreenFlow preview. Then, press the large red record button. That’s it — you are now recording.

As you and the notetaker are watching the action on the monitor, the user will be sitting in front of a blank laptop, using their smartphone as they normally would — no wires, duct tape, cameras or intrusive mounts.

In the screenshot below, I’m playing around with Spotify on my iPhone. You can see that, as well as capturing the smartphone’s screen, ScreenFlow also provides a picture-in-picture display of the user, perfect for usability testing.

A screenshot of the recording output of ScreenFlow.

Granted, the recording won’t show the user’s fingers interacting with the device. But the overall benefits of this technique are so numerous (see the list below) that the trade-off is justifiable.

Overview of Setup

To be clear, let’s review what your setup should look like. The user should be sitting in front of the MacBook, with the smartphone in their hand. And the facilitator and notetaker (if you have one) should be sitting nearby, looking at the external monitor.

How the room and screens should be set up for the test.

Keep the monitor pointed away from the user. Seeing their own smartphone flashing up on the big screen can be distracting.

Conclusion

This approach has so many advantages that it’s worth listing them:

  • Simple
    The first time takes longer, but once you’ve gathered everything, setup takes about five minutes.
  • Reliable
    It’s not perfect, but crashes and setup issues are rare. With the sled-and-camera approach, however, problems were par for the course.
  • Cost-effective
    You can have this solution in place for less than $200 if you’re using a MacBook. (By comparison, Morae, the high-end usability testing software, sells for $2,000.)
  • Professional
    The output is high-quality and professional. It doesn’t look like a hack. We’ve shared our recordings with clients, executives, everybody.
  • Flexible
    The solution works with the major platforms: PC, Mac, Android and iOS.
  • Convenient
    Finally, because you don’t need any duct tape or velcro, test participants can use their own phones. This makes your tests even more natural and effective.


Colman Walsh is the owner and founder of a specialist provider of user experience design training.

  1.

    Is there any iOS software that visualizes taps (like the dots you see when you pinch in the simulator)? Because what you don’t see in this video is where the user tapped, and whether she maybe tapped on some area or element you didn’t expect…

  2.

    Daniel Fisher

    December 1, 2015 1:53 pm

    Reflector on Windows can record and save to MP4 on its own :-)

    •

      Yes, but it doesn’t record the user’s face or comments at the same time. So it’s not a usability testing tool.

  3.

    You don’t need Reflector if you don’t need the facial recordings. You can record into QuickTime over USB for free.

    •

      Yes, I agree. But recording the user’s face and emotional reactions is critical, in my opinion.

  4.

    Infinite Digital

    December 1, 2015 10:29 pm

    Isn’t the whole point of the sled to point it at the hand of the user to see what they’re doing on the device?

    By removing what the sled is meant to do from the equation, all this does is record the screen and the user’s face + voice. In which case it’s not hard to set up a webcam at an angle to record video + audio and use a screen recorder on the mobile device to capture what they’re doing on the phone, which is free (i.e., QuickTime for Mac).

    Sorry to be negative but I was hoping for a replacement to the sled…

    •

      I don’t think you’re being negative. It’s a good point.

      It depends on your priorities. In my experience, not seeing the user’s fingers hasn’t had an impact on the quality of insights we gained from testing. It’s not too difficult to follow what users are doing.

      But if seeing fingers is absolutely critical to you, then obviously using Reflector won’t work.

      Also, in my experience the setup you suggested isn’t that easy. And doesn’t produce reliable, high-quality outputs.

  5.

    There is a cheaper option. Chris has already pointed out that if you have Mavericks or El Capitan installed on your Mac, you can screen record your iPhone over USB with QuickTime. Simply put, QuickTime recognises your iPhone as another web camera.
    If you don’t have Silverback, you can download v2 for free from the Silverback app website.
    Step 1. Create a second desktop space.
    Step 2. In the 2nd desktop space, open QuickTime and start a new video recording with your iPhone set as the web camera.
    Step 3. Open Silverback on the 1st desktop and start your session.
    Step 4. Once Silverback is running, swipe to the 2nd desktop space and Silverback will capture the whole session.
    Notes: With this setup you can capture both the desktop and iPhone interactions in the one session. But because you’ve already told QuickTime that your iPhone is your default camera, your built-in camera will become inactive (you can only capture one video input at a time unless you have an A/V splitter box to capture 2 or more video/audio inputs).
    The plus side is you can still add notes to your iPhone session through Silverback. You can also swipe between iPhone, browser and desktop sessions in the one session, and you’ll capture all the audio as well.
    The downside is you can’t see the participants’ faces and their reactions, and you also can’t see the taps through QuickTime. But there’s enough feedback in the OS to indicate what the user is doing. If you are building your own app, you can include these tap states as added visual feedback and they will come up in the QuickTime video.
    I have found the USB iPhone-to-QuickTime connection to be a bit flaky, as the OS can sometimes get confused and stop registering the iPhone as an input source, requiring a reboot, but nothing’s perfect I guess.
    This setup also removes the nauseating shakes that can happen when people are holding their device, and you can also set up another iPhone or camera just in front of your user to capture the forward-facing user responses and expressions.
    This doesn’t work on Windows, but it is the cheapest setup I’ve found thus far.

    •

      Hi Jared, thanks for the detailed instructions!

      If you’re using the iPhone as a camera, why can’t you see the user’s face?

      And if you’re not seeing the user’s face, that’s a deal breaker for me, I’m afraid.

  6.

    Hassan Mohamed

    December 2, 2015 6:42 pm

    – Mount a GoPro on a helmet & get the user to wear it
    – Make her sit in front of a camera / MacBook
    – Record on the GoPro & front camera / MacBook
    – Also capture the phone’s screen as a video
    – Use a clap to sync the 3 videos

    You get the user’s hand actions, facial expressions and the screen activity.

  7.

    Nick from Mr. Tappy.

    December 2, 2015 8:53 pm

    Stefan and Zed’s comments are spot on here.
    This is a sweet setup for mirroring the user’s screen, but you’re missing out on 50% of the UX equation here.
    If you’re wanting to study how the app behaves, that’s fine, but if the human interaction and behaviour is what you need to learn from (and in my experience that’s putting the U in UX)… you’ll want to see those fingers, taps, attempted taps etc.
    This is what Mr. Tappy was designed for.
    Mr. Tappy is a filming rig for capturing both the on and off screen interactions of mobile UX.
    So this way you can capture both sides of the story.
    you can find him at

    •

      Hi Nick, thanks for the input. But I disagree with your statement that we’re missing out on 50% of the equation. Not sure how you arrived at that percentage, but I think it’s exaggerated.

      As well as the mobile screen, we’re also capturing our user’s face, their expressions and their comments as well. Plus, we see a lot of behaviour too. You don’t need to see finger taps to know when somebody is scrolling, pressing buttons, moving to a new screen, etc. There’s quite a lot of “U” in our approach too :)

      Using a mount like Mr Tappy can be intrusive. And the behaviour observed is a little less natural because of this.

      My opinion is that we’re missing out on some of the equation, definitely. But only a small amount (maybe 10%). And the simplicity of this approach, the low cost, and the natural way that users can hold the phone, makes up for it.

      •

        Nick from Mr. Tappy.

        March 23, 2016 10:25 am

        Hi Colman. You’ve generated (and nicely kept floating) some great debate here, and there are some handy suggestions of alternatives too.

        When you can see the user’s hand or face you simply get more information – in some instances that’s more valuable than others.

        To put hard-working UX people out of their collective misery of fiddling about with multiple windows, softwares etc. to capture faces, hands and screens, I’ve made a browser-based viewer:

        It lets you view the image of two webcams (internal or external), has the option of picture in picture (resizeable) and is free for anyone to use – Tappy or no Tappy.

        It’s built and tested in Chrome for desktop and is a bit of a ‘beta’.

        If you’d like to record, try using a screen recorder.

        Keep up the good work.

  8.

    Not sure why my last comment was deleted, but one of the best things that’s happened to UX in the last 12 months is It’s the answer to all of the issues you mention in your post, minus all the complicated setup.

    I don’t work for them, I use it, and it’s been a revelation.

    Mr Tappy is a ridiculous idea. Have you ever done user testing with a rig strapped to the top of a phone? It skews the user behaviour, because it’s so intrusive.

    The best way to test anything is in the same environment as people would use the actual product, on the train, in Starbucks, on their own phone. Not in a sterile lab with a camera strapped to your back.

    •

      Hi Ash, thanks for the comment.

      The only drawback with something like Lookback is that you need to implement the SDK and/or use a jailbroken iPhone. These factors can add complexity.

  9.

    The title is misleading. It should say “iOS”.

    We used Reflector extensively, but have stopped since we can do all this natively with Quicktime.

    Set up a web meeting and anyone/anywhere can see the interaction.

    We have a somewhat clunky workaround for Android. But I’m yet to find a method that is slick and simple. Android was the reason I clicked on this article, and there is nothing about Android here.

    •

      Sorry Graham, maybe I should have been more clear. But in several places during the article I tell readers that this solution works for Android too. Reflector mirrors both Android and iOS.

  10.

    For Android there is a tool called Android Tool for Mac to record the screen. The problem is that it is not live; you have to save the recorded video after you are done. For iPhone, I think QuickTime is good.

  11.

    What about using Quicktime to mirror and record the screen on the Mac or solutions like Lookback?
