Remote usability testing of mobile device interaction using motion capture and replay

Brown HCI
teaser image for Remotion software
Remotely mirror the phone interactions of your users

Remotion enables richer context sharing from remote users, especially during usability studies. With this application, you can replay the behavior of people visiting your website or application, alongside a capture of their screen. With careful observation, you can learn more about participating users' intent, posture and grip, attention, emotional state, and habits. This website provides instructions for setting up Remotion and hosts add-ons that extend its functionality.

How does it work?
screenshot of Remotion analysis mode
Analyze and annotate motion behavior

A remote user's hand motions can reflect contextual information about their surroundings. Imagine a user who visits a recipe website, scrolls to the ingredient list, and props the phone up at an angle in portrait mode. That may signal that they are collecting the ingredients and plan to follow the recipe. The same scrolling action followed by setting the phone down sideways, face-down, may instead indicate a lack of interest in the recipe.
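The postures in this scenario can be distinguished from a single accelerometer reading. The sketch below is an illustration, not part of Remotion: the function name and thresholds are hypothetical, and it assumes Android's device coordinate convention, where gravity along +z means the screen faces up.

```python
import math

def classify_posture(ax, ay, az):
    """Roughly classify phone posture from one accelerometer reading
    (m/s^2). Assumes Android device coordinates: +z points out of the
    screen, so gravity along +z means the phone lies face-up."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        return "unknown"
    # Angle between the screen normal and the gravity vector.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
    if tilt < 30:
        return "face-up"            # lying flat, screen visible
    if tilt > 150:
        return "face-down"          # flipped over, e.g. set aside
    return "propped / in hand"      # angled or held in between

print(classify_posture(0.0, 0.0, 9.81))   # face-up
print(classify_posture(0.0, 0.0, -9.81))  # face-down
```

In practice a single sample is noisy; a real classifier would smooth over a window of readings before deciding.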

None of this contextual information is captured on the screen; the movements that result from how a user handles their device provide clues about the situation beyond what the display shows.

How does Remotion work?

Remotion extracts more information about a user's performance during remote usability testing, without the privacy or data-size issues of video recordings. Through software replay, a remote study can feel as though the user were invisibly present in the room, manipulating the device. The mobile device is shown as a 3D model to reproduce exactly how it was manipulated and what was shown on its screen. You can then label parts of the interaction for attention, frustration, or confusion.

The client software library for motion sensing and screen capture on mobile devices (currently, Android phones) lets developers collect sensing and screen data once it is deployed on a participant's phone. The desktop application, which runs on Mac or Windows, acts as a server that receives the data sent by the remote phone. It then organizes the collected data and provides an interface to control the replay visualization. The replay can occur in real time or later; it shows on the experimenter's screen a replica of what the user sees on their screen, together with the phone's motion. The user interface allows the experimenter to annotate attention and emotion as part of the observation process.
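The record-then-replay flow described above can be sketched as a small time-indexed store. Everything here is hypothetical illustration, not the actual Remotion data format or protocol: frames pair a timestamp with a payload such as a screen-frame reference or an orientation reading, and replay walks a time window in order.

```python
from dataclasses import dataclass, field

@dataclass
class ReplaySession:
    """Toy sketch of the desktop side's record-and-replay store.
    The real Remotion wire format is not documented here."""
    frames: list = field(default_factory=list)

    def record(self, timestamp_ms, payload):
        # Packets from the phone may arrive out of order over the network,
        # so ordering is deferred to replay time.
        self.frames.append((timestamp_ms, payload))

    def replay(self, start_ms, end_ms):
        """Yield frames within a time window, in timestamp order, so a UI
        could drive the 3D phone model and the mirrored screen together."""
        for t, payload in sorted(self.frames, key=lambda f: f[0]):
            if start_ms <= t <= end_ms:
                yield t, payload

session = ReplaySession()
session.record(120, {"kind": "motion", "quat": (1, 0, 0, 0)})
session.record(100, {"kind": "screen", "frame": "frame_000.png"})
print([t for t, _ in session.replay(0, 200)])  # [100, 120]
```

Replaying "in real time" would amount to consuming this window as frames arrive; replaying "later" is the same walk over a saved session.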



Detailed instructions for setting up Remotion, including frequently asked questions. Read this to get started. The instructions are in two sections: one for recording interactions and one for replaying them.

Mobile Client Capture Software

Screen and motion capture for Android phones. Ask the participant to install this file on their phone. An iPhone version is not yet implemented.

Replay and Annotation Software

Desktop software for showing and saving replays, along with annotation features. Install this on the experimenter's computer.

Download for Windows
Download for Mac

Add-ons (optional)

Pressure Pad

A pressure pad mounted behind the phone can capture the user's grip intensity and placement. A circuit diagram and instructions for building it with an Arduino are available.

Download Blueprint
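On the desktop side, readings from such a pad could be reduced to the grip measures mentioned above. The sketch below is purely illustrative and assumes a hypothetical wire format (one serial line of comma-separated 10-bit ADC values, row-major); the actual add-on's format may differ.

```python
def parse_pad_line(line, rows=4, cols=4, adc_max=1023):
    """Parse one hypothetical serial line from the Arduino pressure pad
    into (grip intensity, grip placement). Assumed format: rows*cols
    comma-separated 10-bit ADC readings, row-major order."""
    vals = [int(v) for v in line.strip().split(",")]
    if len(vals) != rows * cols:
        raise ValueError("unexpected frame size")
    grid = [vals[r * cols:(r + 1) * cols] for r in range(rows)]
    total = sum(vals)
    # Grip intensity: mean pressure as a fraction of full ADC scale.
    intensity = total / (len(vals) * adc_max)
    # Grip placement: pressure-weighted centroid (col, row) on the pad.
    if total:
        cx = sum(c * grid[r][c] for r in range(rows) for c in range(cols)) / total
        cy = sum(r * grid[r][c] for r in range(rows) for c in range(cols)) / total
        placement = (cx, cy)
    else:
        placement = None
    return intensity, placement

# One fully pressed cell in the top-left corner of a 4x4 pad:
line = ",".join(["1023"] + ["0"] * 15)
print(parse_pad_line(line))  # (0.0625, (0.0, 0.0))
```

A centroid plus an overall intensity is a compact way to log grip without storing every raw frame.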
Remotion hardware replay photo
Hardware Replay

Replay can also be performed with a robotic arm holding a placeholder phone, rather than with an on-screen 3D visualization. This is prior published research and is offered separately as a standalone project.

Attention and Emotion Label Suggestions

This add-on makes predictions about the user's attention and emotion based on their motion behavior, so that automatically generated annotations can be reviewed and applied by the experimenter. Currently under development.
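As a flavor of what such suggestions could look like, here is a toy heuristic; it is NOT the under-development Remotion model, and the thresholds and label names are invented for illustration. It maps a window of gyroscope magnitudes to a suggested annotation: jittery, high-variance motion is flagged as possible frustration, while a very still window suggests steady attention.

```python
from statistics import pstdev

def suggest_labels(angular_speeds, shake_threshold=2.0):
    """Toy label-suggestion heuristic over a window of gyroscope
    magnitudes (rad/s). Returns a suggested annotation string, or
    None when the motion is unremarkable."""
    spread = pstdev(angular_speeds)
    if spread > shake_threshold:
        return "possible frustration"   # erratic shaking of the device
    if max(angular_speeds) < 0.1:
        return "steady attention"       # device held nearly still
    return None

print(suggest_labels([0.01, 0.02, 0.01]))         # steady attention
print(suggest_labels([0.1, 5.0, 0.2, 6.3, 0.1]))  # possible frustration
```

Keeping suggestions reviewable, as the add-on intends, matters precisely because heuristics like this one are easy to fool.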

Research Team

Past Members

Arielle Chapin
Alexandra Papoutsaki
Fumeng Yang
Klaas Nelissen

Funded in part by grants from the National Science Foundation (IIS-1552663) and the Army Research Office (71881-NS-YIP).

Want to be notified about updates?