To Automate or not to Automate your Mobile Testing.
In mobile testing, just poking at the GUI will leave bugs hiding, so different tests and a variety of testers are needed. Context is also important: no single test set or test approach will work all the time.
JeanAnn Harrison has years of experience with mobile testing and is a well-known figure in the QA and software testing community. She regularly speaks at conferences and publishes in software testing magazines.
In these slides JeanAnn discusses mobile testing strategies that deliver the right results.
You will learn:
- Types of Mobile Testing
- When and when not to automate your mobile testing
- Mobile exploratory testing strategies and guidelines
- Lessons learned
3. XBOSoft info
• Founded in 2006
• Dedicated to software quality
• Software QA consulting
• Software testing services
• Offices in San Francisco and Amsterdam
4. Housekeeping
• Everyone except the speakers is muted
• Questions via the GoToWebinar control on the right side of your screen
• Questions can be asked throughout the webinar; we'll try to fit them in when appropriate
• General Q&A at the end of the webinar
• You will receive info on the recording after the webinar
5. WEBINAR: TO AUTOMATE OR NOT TO AUTOMATE & EXPLORATORY TESTING FOR MOBILE AND EMBEDDED DEVICES
JEAN ANN HARRISON
Software Test Consultant for Project Realms
Jean Ann Harrison Copyright 2013
6. AGENDA
• Introduction
• Note taking & use of slide information
• Types of devices & apps
• Quick overview of definitions for Mobile Testing
• Planning out Testing
• Automation Concepts
• Exploratory Testing Definition
• Good Exploratory Testing Skills / Mobile ET Skills
• Keys to Mobile ET
• Designing your Heuristics
• Organizing Tests/Stories
• Control System Considerations
• Hardware/Firmware test ideas
• Common tools for ET
• Summation & Conclusion
Not covered: User Experience, Usability, Trainability, Web Application & Security Tests
7. MOBILE/EMBEDDED SOFTWARE & DEVICES
• Mobile phones, tablets
• Proprietary devices: mobile heart monitors, law enforcement ticket generators, restaurant order-taking devices, GPS devices, PDAs
• Embedded software examples: automobile computer systems, air traffic control systems, airplane navigation systems, drug infusion pumps in hospital rooms, elevators, cameras, robots
8. TYPES OF MOBILE APPS
• Native applications: local to the device
• Hybrid applications: local to the device but interact with the internet
• Web applications: not local to the device; all interactions over the internet
9. DEFINING THE SKILL SET FOR THE MOBILE/EMBEDDED TESTER
• Some exposure to or knowledge of products from the domain in which you are testing: aerospace, medical, automobile manufacturing, airplanes, factory systems, robotics, regulated environments, etc.
• Some knowledge of the hard sciences: math, physics, electronics, engineering, etc., for logical thought processes
• Soft sciences: psychology, philosophy, sociology, human factors (human-machine interface) for creative & conceptual thought processes
• These testers use skills from many knowledge domains, patterns of errors, and basic testing skills to create tests. They use their intuition, critical thinking, and mental models.
Chapter 1 – Software Test Attacks to Break Mobile & Embedded Devices
10. MOBILE TESTING DEFINITIONS
• Mobile Application Testing: testing the application on a mobile device.
• Mobile Device Testing: testing the hardware and operating system. Does the operating system install? Does the device power on? Do the LED lights work as expected? Does the battery charge when the AC adapter is plugged into the phone?
• Mobile Phone Testing: any testing done on a mobile phone.
• Mobile System Testing: incorporates testing more than one application and can combine hardware, software, and firmware, along with other applications.
• Mobile Testing: all of the above.
Be clear when using this terminology. If you are only testing apps on mobile phones, say "mobile app testing" rather than "mobile testing," which technically incorporates mobile website testing, mobile hybrid apps, and mobile hardware.
11. TO AUTOMATE OR NOT TO AUTOMATE?
CONCEPT: Planning – What do you want to Test?
Plan out your types of tests: what do you want to test, so you know what you can even consider automating?
Types of tests include functional, regression, usability, performance, stress, load, system integration, trainability, and configuration tests.
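The planning question above, which tests are even candidates for automation, can be made concrete. The sketch below is hypothetical (the factor names and threshold are invented for illustration, not from the slides): score each planned test on a few automation-friendly factors and flag likely candidates.

```python
# Hypothetical scoring sketch: rate each planned test type on factors that
# favor automation, then suggest automate vs. manual for each one.
FACTORS = ("repeatable", "stable_ui", "runs_every_build", "result_machine_checkable")

def automation_score(test):
    """Count how many automation-friendly factors a test satisfies."""
    return sum(1 for f in FACTORS if test.get(f, False))

def suggest(tests, threshold=3):
    """Return (name, 'automate' | 'manual') suggestions for each test."""
    return [(t["name"], "automate" if automation_score(t) >= threshold else "manual")
            for t in tests]

plan = [
    {"name": "regression", "repeatable": True, "stable_ui": True,
     "runs_every_build": True, "result_machine_checkable": True},
    {"name": "usability", "repeatable": False, "stable_ui": False,
     "runs_every_build": False, "result_machine_checkable": False},
]
```

The point is not the scoring itself but forcing the decision to be made per test type during planning rather than after tooling is bought.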
12. DEFINE YOUR TESTS
• Performance testing for mobile software is not the same as for web or client/server apps
• How does the app behave on the device vs. talking to a server?
• How does the app perform while interacting with other applications on the device? Does it play nice? <note iPhone 4 error shutdown>
• Stress & load: use environmental conditions to stress and put load on the device
14. CONCEPT: USABILITY TESTING & CONFIGURATION TEST COMPARISONS
• Usability and configuration-comparison testing is critical for mobile applications, and difficult to automate. You can test font sizes, but do those sizes compare across configurations? Does one automated script fit all: iPhone 5 vs. Galaxy S4, or an iPad vs. a Kindle?
• Ease of use on devices is difficult to automate. The question is: do you want to spend the time writing scripts when, manually, you can not only test faster but also combine your tests and test once within a project?
15. CONCEPT: TEST COVERAGE
• When applying one script on an iPhone vs. a Kindle Fire, your script may not recognize all of the objects on one configuration vs. the other. You may need more than one script to cover the various configurations. Is it worth writing those scripts? Planning is key.
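The coverage question above can be checked before any scripting effort is spent. A minimal sketch, with the object and device names invented for illustration: list the UI objects a script relies on, list what each configuration actually exposes, and report the gaps.

```python
# Hypothetical sketch: find which of a script's required UI objects are
# missing on each device configuration you must cover.
SCRIPT_OBJECTS = {"login_button", "menu_drawer", "search_bar"}

DEVICE_OBJECTS = {
    "iphone":      {"login_button", "menu_drawer", "search_bar", "back_swipe"},
    "kindle_fire": {"login_button", "search_bar"},  # no menu_drawer on this config
}

def coverage_gaps(script_objects, device_objects):
    """Map each device to the script objects it cannot resolve."""
    return {dev: sorted(script_objects - objs)
            for dev, objs in device_objects.items()
            if script_objects - objs}
```

A non-empty result means one script will not fit all configurations, which is exactly the planning signal the slide asks for.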
16. CONCEPT: SPEED OF TEST
• Think about how fast you could test, for example, the displayable area for the application on a phone and then on a tablet. How quickly could the eye scan the displayable area? How would you automate this?
17. CONCEPT: MAINTAINABILITY
• Consider how easily you can maintain your scripts. Will you be forced to rewrite them after each new release? What can you easily maintain? Mobile applications often change dramatically from release to release. Work with the design team each release to see what changes are coming.
18. CONCEPT: REGRESSION AND FUNCTIONAL TESTING
• Scripts can record how someone would use the application, but if there are different paths and many options, do you need to record all of them? Sometimes a "once through" is enough testing to make sure the functionality works.
19. CONCEPT: TESTING BEYOND THE GUI
• Charging the device, notification testing, and network communication are all tests that can be combined with your performance and functional testing. But you need to OBSERVE what is happening while it is happening. Is it necessary to have the test run automatically?
20. CONCEPT: ONE SIZE DOES NOT FIT ALL
• Do not expect one tool to do it all. Operating system testing will not be covered by a touch-and-record tool. You may have to create small tools yourself to get the job done, but again: is it worth your time?
21. EXPLORATORY TESTING - DEFINITION
• Quoting James Bach: "The plainest definition of exploratory testing is test design and test execution at the same time. This is the opposite of scripted testing (predefined test procedures, whether manual or automated). Exploratory tests, unlike scripted tests, are not defined in advance and carried out precisely according to plan."
http://www.satisfice.com/articles/what_is_et.shtml
22. WHAT YOU NEED TO BE A GOOD EXPLORATORY TESTER
• Awareness
• Discipline
• Focus
23. KEYS TO MOBILE EXPLORATORY TESTING
• Look beyond the GUI
• Timing
• Domain knowledge
• Knowledge of the system (hardware/firmware/software)
• Characterize bug behavior; formulate patterns
• Use life experiences for inspiration
24. TESTING BEYOND THE GUI = SYSTEM INTEGRATION TESTING
SIT for mobile/embedded contains these elements: Software + Hardware + Software Variants + Hardware Variants + Timing + Operating System Drivers + Network Protocols
26. LEARNING THE ARCHITECTURE
Work closely with development to learn more about how the mobile application under test works.
27. WHERE DO BUGS LURK?
Desktop, Web, Mobile/Embedded considerations:
• Requirements & Design
• Logic & Math
• Control Flow
• Data
• Initialization & Mode changes
• Interfaces
• Security
• Gaming functions
• etc.
Mobile/Embedded considerations:
• Software and hardware development cycles run in parallel, where aspects of the hardware may be unknown to the software development effort
• Hardware problems are often fixed with software late in the project
• Small amounts of dense, complex functions, often in the control-theory or safety/hazard domains
• Very tight real-time performance requirements (often in the millisecond or microsecond range)
Originally written by Jon D Hagar
28. "WE BECOME MORE EXPLORATORY WHEN WE CAN'T TELL WHAT TESTS SHOULD BE RUN, IN ADVANCE OF THE TEST CYCLE, OR WHEN WE HAVEN'T YET HAD THE OPPORTUNITY TO CREATE THOSE TESTS." JAMES BACH
• Create your description or theory: "What happens when…?"
• … the battery runs low?
• … the battery is charging from dead?
• … the device gets too hot?
• … the battery is replaced because it was defective?
• … device memory is full?
• … I receive a notification while using the phone or another application?
• … there is a time/date change?
• … device searches are too slow?
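The "what happens when…?" conditions above can be turned directly into exploratory test charters. A small sketch (the function and wording are invented for illustration; the conditions come from the slide):

```python
# Illustrative sketch: turn "what happens when...?" conditions into
# exploratory test charters you can log against during a session.
CONDITIONS = [
    "the battery runs low",
    "the device gets too hot",
    "device memory is full",
    "there is a time/date change",
]

def charters(conditions, app="app under test"):
    """One charter string per condition, ready for a session log."""
    return [f"Explore the {app} when {c}; note timing, notifications, recovery."
            for c in conditions]
```

Generating charters this way keeps the session exploratory (no predefined steps) while guaranteeing every theorized condition gets visited.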
29. ORGANIZE YOUR THOUGHTS/TESTS
| Battery level | Time/Temp | Application installing | Application installed / not in use / started | App in use (note: various conditions apply here) |
| 0%-5% | Time each level increment of charge | Timing of download? Would notifications factor? | Notifications? Memory usage, CPU speed | |
| 6%-15% | Check the temp at each increment | Completed installation? | Note CPU speed, memory usage | Searches with application DB, online DB? |
| 16%-30% | | | | Interaction with data transfer / network communication |
| 31%-50% | | | | |
30. CONTROL SYSTEM CONSIDERATIONS - DEVELOPING TESTS: HARDWARE TO SOFTWARE SIGNAL INTERFACE
• Know the input and output connections to the software
• Consider variations between devices: calibration, physical noise, electrical interference, light, cold, water, dust, response time, wear & tear on the hardware
• Identify input & output devices with ranges and resolutions of values (how is the software installed onto the device?)
• Define the full range of environmental input disturbances (unexpected inputs)
• Define possible environmental output disturbances (unexpected outputs)
• Determine what is or is not possible in the test lab
• Conduct a risk analysis of what is acceptable and what is not
• Understand device limitations: CPU processing capabilities, memory, time
• The aerospace industry uses the phrase "Test like you fly, fly like you test" to combine all facets of how a device's software should be tested
Originally written by Jon D Hagar
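"Identify input & output devices with ranges and resolutions of values" can be mechanized into boundary-value test data. A sketch under stated assumptions (the signal names and ranges are invented examples, not real device specs):

```python
# Illustrative sketch: given each input signal's declared range and
# resolution, derive just-inside and just-outside boundary test values.
SIGNALS = {
    "battery_voltage": {"lo": 3.0, "hi": 4.2, "step": 0.1},
    "temperature_c":   {"lo": -10, "hi": 45,  "step": 1},
}

def boundary_values(sig):
    """Values around each end of the range: below, at, and above the limit."""
    lo, hi, step = sig["lo"], sig["hi"], sig["step"]
    return [lo - step, lo, lo + step, hi - step, hi, hi + step]
```

The out-of-range values (first and last in each list) exercise exactly the "unexpected input disturbances" the slide asks you to define.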
31. • Hardware to Software signal type test
• Network/cellular communication signal strength tests
• Installation of software on a mobile device
• Device limitation tests: CPU processing capabilities, storage, memory
32. HARDWARE / SOFTWARE
• Evaluate software with hardware variants and system operations
• Find bugs in hardware-software communications (including firmware)
• Stress software error recovery
33. SPECIALIZED MOBILE / EMBEDDED SOFTWARE ATTACKS
• Network Protocol Communications
• Time-Based
• Security & Fraud
34. COMMON TOOLS FOR ORGANIZATION
• Sticky notes per condition, per variable, per test case idea
• Keep a log of what you are doing; record conditions
• Utilize the device's log or find an open source logging tool
• Use tables to record all variables/conditions per test
• Document the situation of the test; list out your test conditions
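The "keep a log of what you are doing, record conditions" advice above can be as small as a few lines of code. A minimal sketch (the class and entry format are invented for illustration, not a real tool):

```python
# Minimal session-log sketch: timestamped entries recording each action
# plus the device/test conditions that were in effect when it ran.
import json
import time

class SessionLog:
    def __init__(self):
        self.entries = []

    def note(self, action, **conditions):
        """Record an action and the conditions it ran under."""
        self.entries.append({"t": time.time(), "action": action,
                             "conditions": conditions})

    def dump(self):
        """Serialize the whole session for attachment to a bug report."""
        return json.dumps(self.entries, indent=2)
```

Recording conditions alongside actions is what makes an exploratory bug reproducible later, which scripted automation gets for free but exploratory sessions must do deliberately.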
35. SUMMARIZATION OF TECHNIQUES
• Combining tests can help shorten project schedules, and it also lets you witness results as they happen. Automation results do not give you everything you need to witness in the moment.
• Automate positive end-to-end use for regression only, between builds.
• Use tools that measure memory usage to gather statistics while conducting regression tests.
• Documenting includes use of video and writing tests in story formats or user-action formats.
36. CONCLUSION
• Remember: in mobile and embedded, just poking at the GUI will leave bugs "hiding"
• Different tests and attacks by a variety of testers (developers and titled testers) are commonly needed
• Context matters: there is no one set of tests, approaches, testers, or attacks that will work all the time
• Testing is hard and takes the whole team
37. JEAN ANN HARRISON'S CONTACT INFORMATION
• Email: jaharrison@projectrealms.com
• Twitter: https://twitter.com/JA_Harrison or @JA_Harrison
• Project Realms Inc website: www.projectrealms.com
38. Q & A
Want to keep updated on upcoming webinars?
Follow us @xbosoft
Need any help with mobile testing?
Contact us: services@xbosoft.com
40. REFERENCES
• Software Test Attacks to Break Mobile & Embedded Devices by Jon D Hagar, to be published fall 2013
• 2013 STPCon presentation slides from "Testing Beyond the GUI" by Jon D Hagar & Jean Ann Harrison
• http://www.ministryoftesting.com/2012/06/getting-started-with-mobile-testing-a-mindmap/ - Getting Started with Mobile Testing mindmap
• http://karennicolejohnson.com/ - Karen Johnson's website
• http://www.satisfice.com/articles/what_is_et.shtml - James Bach, Satisfice, Inc.
• http://www.softwaretestpro.com/Item/5567/ - A Different Take on Mobile Testing: Test Beyond the GUI by Jean Ann Harrison
41. IMAGES REFERENCES
• Slide 3 - Embedded systems examples
• http://www.jijesoft.com/en/?option=com_content&view=article&id=18&Itemid=45
• http://www.mseedsystems.com/products/view/268/aldelo-for-restaurants-pos-wireless-edition
• http://blog.laptopmag.com/wpress/wp-content/uploads/2013/02/Pandora-Limit-Free-Listening.jpg
• Slide 4 - Types of apps
• http://myeventapps.com/whats-the-difference-native-vs-web-apps/2012/794/
• Slide 5 - Defining skills
• http://cdn.cutestpaw.com/wp-content/uploads/2012/11/l-The-Thinker.jpg
42. IMAGES REFERENCES
• Slide 7 - To automate or not to automate
• http://www.eoi.es/blogs/veronicarecanati/2013/02/10/project-management-why-do-projects-fail/
• Slide 9 - Mindmap used with permission from originators Karen Johnson & James Bach
• www.ministryoftesting.com
• Slide 18 - What you need to be a good exploratory tester
• http://www.wallcoo.net/paint/kagaya_celestial_exploring/Kagaya_art_Celestial_Exploring_INSPIRATION1.html
• Slide 19 - Puzzling together pieces
• http://www.aspire2develop.co.uk/images/People%20development%20page%20&%20HOME%20page%204%20piece%20jigsaw%20iStock_000002077956Small.jpg
• Slide 20 - Testing beyond the GUI
• http://courses.cs.vt.edu/csonline/OS/Lessons/Introduction/onion-skin-diagram.gif