http://www.uxpa2016.org/sessionsurvey?sessionid=81

Tips and Tools for Testing Mobile Interactions Remotely (and On a Budget!)

Editor's Notes

  1. User Experience specialists at UserWorks, a UX consulting firm in the DC area. A little background on how we got here. This all started for us about a year ago, when we began working on an in-person usability study of a mobile app related to diabetes management. We wanted to capture video of our participants’ mobile screens as they used them, both for archival purposes and so that notetakers/observers could view the interactions live without being in the same room. Unfortunately, the traditional camera-on-a-sled method of mobile capture wasn’t ideal for us; we wanted users to be able to hold and rotate their devices naturally. So we started looking into our options, and eventually came across an Android app called Mobizen. It allowed us to mirror a mobile device’s screen wirelessly onto a computer in real time. That seemed like a perfect solution for our needs, so we gave it a shot. For that project, we were just using Mobizen to mirror mobile devices locally, and it worked well for us. But as we continued to work with it, we found that the tool also had some internet streaming functionality built in. At first we were thinking, “hm, maybe we could use this to allow clients to view our sessions remotely.” That was true, but it was really just the tip of the iceberg. It turns out we had just stumbled onto something that might let us run mobile usability studies with remote participants as well.
  2. TW So that set us on a journey which eventually led to us being here. We wanted to figure out whether we could use new technologies to conduct better remote mobile research: to get a better understanding of the tools that were out there, and to compare them to each other to find out which was best. Our final goal was to get to a point at which we were comfortable enough with our new tools that we could put them to use in a real live study.
  3. TW Before we get into it, we just wanted to clarify a couple of terms that we’ll be using throughout the presentation. As our title suggests, this talk is all about “Tips and Tools for Remote Mobile Testing”. When we say “Remote Testing”, what we’re talking about is moderated user testing in which the participant and researcher are located in two different places. When we say “Mobile”, we’re basically just talking about modern portable smart devices – phones and tablets.
  4. DD Most of you are probably very familiar with Remote Usability Testing, so you know there are many benefits, but also some drawbacks.
  5. Enhanced Research Validity
+ Participants can use their own device and their own setup
+ More naturalistic environment and real-world use case
Lower Cost & Increased Efficiency
+ Fewer travel-related expenses for researchers and participants
+ Decreased need for fancy labs and/or equipment
Greater Convenience
+ Ability to conduct global research from one location
+ No participant travel to and from the lab
Expanded Recruitment Capability
+ Increased access to diverse participant sample
+ The money saved from reduced travel costs may allow for more participants to be included in the study
  6. Potential Reduction in Quality of Data
- Difficult to control the testing environment, particularly distractions in the participant’s environment
Expanded Spectrum of Technical Issues
- More reliance on quality of internet connection
- Dealing with a variety of hardware
Reduced Scope of Research
- Typically limited to software testing
- Shorter recommended session duration
Diminished Participant-Researcher Interaction
- Restricted view of participant body language
- Difficult to establish rapport
  7. When it comes to remotely testing mobile interactions, there are a few additional drawbacks to consider. First, you’re often dealing with a wider range of operating systems among the various mobile devices on the market. Also, while on desktop you can pretty easily track the participant’s mouse movements, that is not really possible with mobile devices, so it can be more difficult (or impossible) to see where participants are tapping or hovering when interacting with a mobile device. With current offerings (which we’ll get into more later), participants must share their own screen rather than control the researcher’s screen, which has the potential to feel invasive to the participant. Finally, when relying on Wi-Fi, as mobile devices do, you are at the mercy of the bandwidth of the wireless network to transfer the data between participant and researcher. As many of us know, Wi-Fi can often be slower and less reliable than hard-wired networks.
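The Wi-Fi bandwidth concern can be made concrete with a rough back-of-envelope estimate. This sketch is ours, not from the talk, and the bits-per-pixel figure is an assumed ballpark for compressed screen content; treat the result as an order-of-magnitude guide only.

```python
# Rough bandwidth estimate for streaming a mirrored mobile screen.
# bits_per_pixel is an assumption: compressed screen content often
# lands somewhere around 0.05-0.10 bpp, but real numbers depend
# heavily on motion and encoder settings.
def stream_mbps(width_px, height_px, fps, bits_per_pixel=0.07):
    """Estimated megabits per second for a mirrored screen stream."""
    return width_px * height_px * fps * bits_per_pixel / 1_000_000

# Example: a 750x1334 phone screen mirrored at 15 frames per second
estimate = stream_mbps(750, 1334, 15)
print(f"~{estimate:.1f} Mbps")
```

At these assumptions the stream needs on the order of 1 Mbps, which a decent Wi-Fi link can carry, but a congested or flaky one may not sustain.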
  8. TW One thing that we think is important to note, is that there ARE already a few ways of conducting remote mobile tests. They aren’t the most elegant solutions in the world, but you can make it work.
  9. One of these methods is what we call “The Browser Resize.” The setup for this is essentially the same as a traditional desktop remote session: the researcher opens a web page on their computer and shares their screen with a remotely located participant. The difference here is that while the participant is still interacting with the web page via their PC, the researcher has resized the browser window to mimic the dimensions of a mobile device. The browser resize is great for situations in which you’re just trying to test layout or IA; it’s not so great if you’re concerned about people’s interactions with your product on a real mobile device.
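The browser-resize method comes down to simple viewport arithmetic. As a sketch (ours, not from the talk): the device sizes below are commonly published logical CSS-pixel resolutions, and the chrome allowance is a placeholder you would measure for your own browser.

```python
# Logical (CSS-pixel) viewport sizes for a few common devices.
# These are widely published values; add your own targets as needed.
VIEWPORTS = {
    "iPhone SE": (320, 568),
    "iPhone 6/7/8": (375, 667),
    "iPad": (768, 1024),
}

def outer_window_size(device, chrome_height=110):
    """Outer browser-window size so the viewport matches the device.

    chrome_height is an assumed allowance for the browser's own
    toolbars; measure it for the browser you actually use.
    """
    width, height = VIEWPORTS[device]
    return (width, height + chrome_height)

# Example: size the researcher's browser to mimic an iPhone 6/7/8
print(outer_window_size("iPhone 6/7/8"))  # (375, 777)
```

In practice, browser device-emulation modes do this resizing (plus touch emulation) for you, but the arithmetic is the same.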
  10. The laptop can be used concurrently for communication with researchers and/or study material display
  11. TW While all of those other methods work, there are new methods emerging that are higher quality and higher fidelity, and that are easier for both the participant and the moderator.
  12. Two major pieces of the remote mobile puzzle: mirroring and streaming. Mirroring is relatively commonplace in itself, but tricky when it comes to live streaming that feed over the internet. A majority of the remote mobile setups we found can be differentiated by the ways in which these two ingredients are combined.
  13. DD There are different combinations of these two ingredients (mirroring and streaming) that will accomplish what we need for remote mobile testing. These combinations or configurations vary in complexity. The first, and most complicated, configuration includes two apps, or tools. First, the participant installs one tool on both their mobile device and computer. This enables them to mirror their mobile screen onto their PC. Then, both the participant and the researcher install one web conferencing tool on each of their PCs. This web conference tool then shares the participant’s PC screen which is mirroring their mobile device (equivalent of sharing your screen while playing a YouTube video). An example of a tool we found that would require this configuration is called Mirroring 360.
  14. TW
  15. DD This third configuration is simpler in that both the participant and the researcher install one web conferencing tool on each of their PCs. That tool also works with the native screen mirroring technology on the participant’s mobile device (e.g., AirPlay), meaning the participant does not need to install an app on their phone. In addition, the web conferencing capability of this tool enables the researcher to see the participant’s mirrored mobile screen shared from the participant’s PC. An example of a tool that would require this configuration is Zoom.
  16. TW Potentially the least complicated configuration that we found. Notably, only on Android, as far as we’ve found.
  17. TW
  18. DD As we uncovered tool after tool that we thought had the potential to work in the remote mobile testing context, we quickly realized that we needed to come up with a way to assess the different tool options. So, we came up with a set of criteria that we considered the ideal characteristics of a tool that could serve the purpose of testing a mobile device remotely.
  19. First, the tool should be easy to use, both for the participant and for us, the researchers. This means easy to install and set up, as well as easy to use during the session. It also means that ideally the participant doesn’t have to worry about setting up multiple devices, such as a PC in addition to their mobile device, as you saw in a few of the configurations. Second, of course, our ideal tool would be inexpensive. Knowing that technology changes rapidly, we knew we wouldn’t want to invest a lot of money in a tool that would become outdated quickly.
  20. Third, we needed the tool to perform well. This includes minimal lag time between the participant’s interactions and what we saw on our end. We also wanted the tool to play nice with other applications that needed to be running on the device at the same time. And, if possible, we wanted to be able to see a clear representation of the participant’s interactions with the device (like taps and gestures) – similar to mouse tracking on a desktop. Fourth, we wanted the tool to be a one-stop shop in that it included things like web conferencing, chat, and recording directly within the tool. Also, we wanted it to at least work on both iOS and Android, but also other mobile platforms as well. And to avoid participants needing a landline or separate device or computer to communicate with us, we wanted the app to allow them to talk to us on the phone while also mirroring their screen.
  21. Finally, we wanted to protect our participants’ privacy as much as possible and put them at ease. Therefore, the ideal tool would allow the participant to remotely control our mobile device from theirs. They would have to be able to control it from their mobile device (not their desktop) in order for the interaction to be as natural as possible. But we knew that may not be an option, so we figured that if the participant had to share their own screen, we wanted to see a few features that would protect their privacy while they were sharing their screen. First, the tool would clearly indicate to them when mirroring has begun. They should also have complete control over when the sharing would start and could stop it at any time. We thought it would be nice if the tool would snooze the participant’s notifications while the screen was being shared. (Technically, this is possible, but many participants do not know how to do it. Also, Do Not Disturb will silence the notifications, but they still appear on the screen and would therefore be mirrored to the researcher.) Finally, as you can do with many desktop screen sharing tools, we thought it would be nice if the mobile tool would have the option to share only one application. Therefore, if the participant accidentally pulled up their messaging or photo app, it would not be shared with the researcher.
  22. TW So, over the last year, these are the options we found. We just want to note that we are not affiliated with any of these tools, nor are we endorsing any of them. We evaluated them for our own purposes and simply wanted to share our findings with you.
  23. An overview of the available tools across platforms. Choices for Android are much more diverse. iOS is more locked down so fewer developers are coming out with mirroring tools.
  24. Here’s a current list of tools. This is changing quickly though, so tomorrow may be different. Tools with ** are advertising that they are developing iOS versions.
  25. Only found three that claim to work on both Android and iOS. But spoiler alert, they don’t work that well.
  26. DD Now that we had found a number of tools with potential to meet our needs, we wanted to evaluate and compare them in order to determine which tool or tools came closest to “ideal.”
  27. Although there appear to be a number of tools available, we found that many of them had certain characteristics that made them either impractical or unsuitable for remote mobile research. For example, at least one required you to access the code of the app for it to work. Others required the participant’s phone to be jailbroken or rooted. And others had such severe privacy concerns that we didn’t even consider asking a participant to use them. After eliminating the obvious dealbreakers, we chose six that we considered “best in class.”
  28. For those six tools, we conducted more in-depth in-house trial runs and then subjectively rated each tool across five categories so we could more easily compare and contrast their strengths and weaknesses. The five categories encompassed our ideal characteristics. The rating scale was from 1 to 10, where 1 was the least favorable and 10 was the most favorable. Again, these were all our subjective ratings, based on what we believed our needs would be for a remote mobile test.
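The comparison step can be sketched as a small script. The tool names are from the talk, but every score and category weight below is an illustrative placeholder, not the authors’ actual ratings.

```python
# Subjective 1-10 ratings across five comparison categories.
# Scores here are invented placeholders for illustration only.
RATINGS = {
    "Mobizen":    {"ease of use": 6, "affordability": 10, "performance": 6,
                   "features": 4, "privacy": 5},
    "TeamViewer": {"ease of use": 9, "affordability": 2, "performance": 7,
                   "features": 6, "privacy": 5},
    "Zoom":       {"ease of use": 7, "affordability": 7, "performance": 8,
                   "features": 8, "privacy": 6},
}

def overall(scores):
    """Unweighted mean; a real comparison might weight categories."""
    return sum(scores.values()) / len(scores)

ranked = sorted(RATINGS, key=lambda tool: overall(RATINGS[tool]), reverse=True)
for tool in ranked:
    print(f"{tool}: {overall(RATINGS[tool]):.1f}")
```

An unweighted mean keeps the comparison transparent; if one criterion (say, privacy) is a hard requirement, a weighted sum or a minimum-threshold filter would be a better fit.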
  29. TW Requires an app on the phone, and two apps on the computer. Explain the graphs.
  30. DD Price is $14.99 for a license. It works on both iOS and Android, but doesn’t have web conferencing included, so participants have to install a separate web conferencing app. It does offer an option to stream directly to YouTube, but that was not exactly useful to us for remote research.
  31. TW Free, so clearly affordability is the highest rating. Only works on Android. Only relevant feature for remote testing is screen sharing. Works over the internet, doesn’t require participant to have computer. But it does require the participant to create an account. It’s the tool that got us started.
  32. DD The strengths of TeamViewer were definitely in the ease of use for participants and moderators. The participant simply needs to open the app on their phone and read a 9 digit number to the researcher who can enter that into a desktop app and be automatically connected to the participant’s phone. The screen sharing currently only works on Android though. The main downfall of TeamViewer is the price: $809 for a one year business license (or free for personal use).
  33. TW No need for participant computer. Included web conferencing, chat, desktop sharing, and recording. Only works on Android. $20-$30/month on a Pro plan.
  34. DD Generally well rated overall. It is $14.99/month for the Pro license which was fairly middle of the road compared with the other tools. Includes web conferencing, chat, recording, and desktop and web cam sharing in addition to the mobile screen sharing. However, the mobile screen sharing currently only works on iOS.
  35. DD As you can see, Join.Me and Zoom were rated the best overall for our needs, but they weren’t necessarily significantly better than all of the others. All of the tools had their own strengths and may work better in different research scenarios. TeamViewer had notable strengths in ease of use, and Mobizen, which is free, clearly had a strength in affordability.
  36. TW After all of this in-house testing, it was finally time to put them to the test in the real world.
  37. TW Why we picked Zoom and Join.me: they allowed us to do remote mobile screen sharing and also to share a desktop.
  38. TW We wanted to make sure everything would go smoothly since this was our first time using these tools in a real study. In a tightly packed schedule, it reduced the possibility of us running over time. DD So we wanted to show you what this setup looked like from the participant’s perspective and how easy it can be when everything works properly. First step: Participant clicks on join.me link in email.
  39. DD Join.me setup:
1. Participant clicks on the join.me link in their email.
2. The join.me app opens or downloads on their phone and joins the meeting.
3. Moderator passes the presenter role to the participant.
4. Participant accepts the presenter role.
5. Participant taps on the icon to share their screen, then reads and accepts the privacy warnings.
6. Screen is shared.
  40. TW Zoom setup:
1. Participant confirms their computer and phone are on the same network.
2. Participant selects the option in the Zoom desktop app to share their iPhone screen.
3. Participant swipes up on their iPhone to reveal the AirPlay option.
4. Participant selects AirPlay, then selects the computer (“Zoom-Tristan”).
5. Participant turns the mirroring toggle on.
6. Screen is mirrored to the participant’s computer and streamed to the moderator’s computer.
  41. DD This project wasn’t without some bumps along the way. But thanks to the upfront work we did, we were able to address these issues before data collection even started.
  42. DD Since participants would be sharing their own screen instead of controlling ours, we had to adjust some of our tried and true methods to work within this new context.
  43. TW
  44. TW Here we have a brief clip from one of our Android sessions showing how smooth the transition was when we asked participants to begin sharing their screen. What you’ll see in the video is the moderator’s perspective. You will hear Dana passing the presenter role to the participant’s phone, which they then accept; their screen, with the wireframe we were testing, is then shared via the Join.me meeting.
  45. DD I know that might not be the most exciting video clip, but let me tell you how exciting it actually was that it worked that smoothly during data collection, when it really mattered. Here we have a brief clip from one of our iOS sessions, again showing the smooth transition when we asked the participant to begin sharing her screen. What you’ll see/hear is me asking the participant to select the Share Screen option in the Zoom desktop app, then swipe up on her phone, look for AirPlay, and select her computer from the list. Once she has turned the mirroring toggle on, her phone screen will be mirrored on her computer and streamed to mine. Miraculously, it worked that smoothly in all seven of our sessions.
  46. DD Based on our experience in this first real-world implementation of some of the tools we uncovered, we learned a few lessons about how to make a remote mobile usability test run as smoothly and successfully as possible. Some of these things we did, and we think they contributed to our success; others we didn’t do, but realized after the fact that they would have been helpful.
Plan ahead. Spend extra time up front making sure the setup works for participants, and clearly think through the logistics of sending participants the information they need (tasks, links, phone numbers) when they need it.
Practice makes perfect. Becoming intimately familiar with the tools we were going to use during the session allowed us to more easily troubleshoot any issues that arose. In particular, we wanted to become familiar with what the participant sees on their end so we could give them accurate instructions.
Always have a backup. When technical issues arise, it’s always good to have a backup. We knew that if the phone screen sharing didn’t work during the session, we could quickly fall back to one of the less optimal, but still valid, mobile testing methods, such as resizing the browser to a mobile-device-sized screen. If Zoom or Join.me didn’t work at all, we knew we could revert to our more reliable and commonly used tool for sharing desktops remotely, GoToMeeting. Fortunately, we didn’t need either of these options in our study.
Put participants at ease. We gave participants a verbal overview of the process and walked them through it on the phone, which allowed us to establish a rapport with them and gain their trust. As a result, they were much more willing to stick with us through the bumps in the road.
Tailor recruitment. By limiting recruiting to either iOS or Android (not both), you will only need to support one screen sharing tool. Obviously, this was something we did not do, but we feel it would have been easier if we had. One thing we did do, and we feel it made things go more smoothly, was to recruit participants who already possessed basic mobile device interaction skills, such as being able to switch from one app to another. This made giving them instructions much easier.
  47. DD Thinking about the future of remote mobile testing, as a reminder, we were looking for tools with these certain characteristics, and the tools that we found generally fulfilled most of them. However, there are a couple areas with room for improvement.
  48. DD
  49. DD As far as features go, we have yet to find a tool that works reliably with both Android and iOS. But the area where the most improvement is necessary is participant privacy. We would love to see a tool that allows participants to share just one app rather than their whole screen (without having to change the app’s code), or, even better, a tool that allows participants to control the researcher’s mobile device from their own. We believe the desire to test mobile interactions remotely will only continue to grow. So, as technology improves and the need for more robust tools is recognized, we anticipate more options and capabilities will emerge in the near future. And by near future, we mean the next few months, because these things change so rapidly.