With the exploding popularity of mobile devices, mobile application performance has become increasingly critical to the modern Enterprise.
This session will discuss some of the performance pitfalls common to tablets, iPhones and Android devices, and outline the tools available to allow you to effectively test your company’s mobile-based applications.
This presentation was originally given at SoftEd Fusion 2012 (Sydney) on September 13th.
2. Presentation Roadmap
Background → Mobile device performance → Network performance → Server performance
2 JDS
3. Background
4. Mobile is growing fast
• There were more smartphones and tablets sold in 2011 than desktop PCs, laptops and netbooks combined.
• 20% of web traffic in the US comes from smartphones and tablets.
5. Smartphone Sales by Operating System
• Android is the dominant mobile phone operating system
• BUT!!
– The most popular handset is the iPhone (especially in .au)
– Google Play has less than ¼ of the revenue of Apple’s App Store
– iPad has more market share than all Android tablets combined
– App sales are negligible for other stores (Nokia Ovi Store, BlackBerry App World, Windows Phone Marketplace)
6. Mobile in the Enterprise
• Growth of “bring your own device” (BYOD) for internal apps – mostly web apps and email
• Devices for field service/delivery workers (probably not running iOS or Android)
• Mobile versions of public-facing websites (http://m.yoursite.com.au)
• Mobile app projects
– Free apps to reach more customers, e.g. banking apps, supermarket apps (not monetised through app sales)
– Projects initiated by the Business, instead of the IT department
– Very small budgets, almost always outsourced (to small, agile companies)
– Less maturity than typical Enterprise projects (minimal non-functional requirements, security testing, performance testing)
• Large IT projects sometimes have a small mobile component
7. Mobile performance is important
• Bounce rates increase dramatically as response times increase
• Slow apps earn one-star App Store reviews: “this app sucks”
8. Factors in Mobile Performance
• Mobile device • Network • Data Centre
– Varying hardware ‒ The “last mile” ‒ Load
‒ The Internet balancers, se
– Mobile app code rvers
(from your telco
to your data centre) ‒ Server
code, capacit
y, configurati
on
9. Mobile Device Performance
10. Apps behaving badly
• Bandwidth hog
– “Your app gave me a huge phone bill!”
• Storage hog
– “Your app is using up all my storage!”
• Drains battery
– Location-based apps, data use patterns (polling keeps activating the transmitter)
• Leaks memory
• Doesn’t handle network connectivity problems gracefully
• Slow startup or unresponsive user interface
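Handling connectivity problems gracefully usually means retrying with backoff rather than hammering the radio. A minimal sketch, with illustrative function names and delay values (not from the talk), of exponential backoff with jitter:

```javascript
// Compute retry delays: double the wait each attempt, plus up to 50% random
// jitter so many clients don't retry in lock-step.
function backoffDelays(maxRetries, baseMs) {
  const delays = [];
  for (let i = 0; i < maxRetries; i++) {
    const backoff = baseMs * Math.pow(2, i);
    delays.push(backoff + Math.random() * backoff * 0.5);
  }
  return delays;
}

// Wrap any async network call: retry on failure, rethrow after the last attempt.
async function fetchWithRetry(fetchFn, maxRetries = 4, baseMs = 500) {
  const delays = backoffDelays(maxRetries, baseMs);
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fetchFn();
    } catch (err) {
      if (attempt === maxRetries) throw err; // give up: surface the error to the UI
      await new Promise(resolve => setTimeout(resolve, delays[attempt]));
    }
  }
}
```

A real app would also distinguish retryable failures (timeouts) from permanent ones (HTTP 4xx) before retrying.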
11. Be careful with WebView
• Tempting shortcuts:
– Write parts of your app in JS + HTML, and display them in a WebView, instead of using native code.
– HTML5 vs cross-platform vs native
– Make sure you pick the technology with the right trade-offs for your application
• Mobile WebKit has a 300ms delay on firing click events after a tap.
– Work-arounds exist, but impose trade-offs
• Cross-platform mobile development frameworks (PhoneGap, Trigger.io, Appcelerator, etc.)
– Can be hard to make non-native apps feel “snappy”
– Overhead imposed by the JavaScript-to-native bridge
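The usual work-around for the 300ms delay is a “fast click” shim: fire the handler on touchend instead of waiting for the synthesised click. A minimal sketch (element and handler names are illustrative; real libraries such as FastClick also handle form elements, scrolling edge cases and ghost clicks):

```javascript
// Attach a tap handler that fires immediately on touchend, skipping the
// ~300ms synthesised click. Falls back to 'click' for non-touch input.
function fastTap(el, handler) {
  let startX = 0, startY = 0;
  el.addEventListener('touchstart', e => {
    startX = e.touches[0].clientX;
    startY = e.touches[0].clientY;
  });
  el.addEventListener('touchend', e => {
    const t = e.changedTouches[0];
    // Ignore touches that moved more than ~10px: the user was scrolling.
    if (Math.abs(t.clientX - startX) < 10 && Math.abs(t.clientY - startY) < 10) {
      e.preventDefault(); // suppress the delayed click that would follow
      handler(e);
    }
  });
  el.addEventListener('click', handler); // mouse/keyboard fallback
}
```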
12. Profiling tools for developers
• DDMS (Android) or Instruments (iOS)
– View memory usage and allocation (find leaks), threads, process resource usage, slow methods, network usage, radio state (test outage conditions).
• JavaScript
– Profile it in a desktop browser (Firebug for Firefox, the Developer Tools in Chrome, Web Inspector for Safari)
– Use JSLint to ensure strict code “correctness”
13. Case Study: SnappyCam+
• Takes lots of photos very fast (great for action sequences)
• Camera performance was the central feature of the app
• Bottleneck: JPEG compression
– Existing compression libraries were too slow
– Researched academic papers on JPEG compression algorithms
– Implemented a compression algorithm highly optimised for the CPU architecture
– Hand-crafted assembly code for the ARM NEON SIMD co-processor
14. Network Performance
15. Sloooow network
• Mobile data speeds are highly variable and can be very slow (like a modem)
• Websites have become bloated with the widespread adoption of broadband Internet
• You can’t improve the network, but you can optimise your mobile app/website for slow networks
16. Shunra NetworkCatcher Express Mobile
• Free iPhone/Android app to check network performance from your current location.
• Benefits:
– Measure real-world mobile network conditions (bandwidth, latency, packet loss)
– Generates an NTX file to use with Shunra PerformanceSuite or Shunra vCat
17. WAN Emulation
• Most testing is done while connected to
– Your office Wi-Fi
– Fast, metropolitan 3G internet
• Use a WAN emulator to measure response times under different network conditions
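As a rough guide to what the emulator will show, response time can be modelled as round trips plus transfer time. A back-of-envelope sketch (the formula and the example numbers are illustrative assumptions, not from the talk):

```javascript
// Estimate page load time under given network conditions:
// each sequential request pays one round trip, and the payload
// pays for the link bandwidth. Ignores TCP slow start, DNS, etc.
function estimateLoadMs(sequentialRequests, totalKB, rttMs, kbps) {
  const latencyMs = sequentialRequests * rttMs;     // one RTT per sequential request
  const transferMs = (totalKB * 8 / kbps) * 1000;   // KB -> kilobits, divided by link speed
  return latencyMs + transferMs;
}

// e.g. 10 sequential requests, 500 KB total, on a 300ms / 1000 kbps link:
// 10 * 300 + (500 * 8 / 1000) * 1000 = 3000 + 4000 = 7000 ms
```

Note how latency dominates once requests are sequential, which is why the “minimise chattiness” advice on the next slide matters so much on mobile links.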
18. Optimising for slow networks
• Minimise data transfer
– Create a mobile version of your website, and serve images scaled for small mobile screens
– Use the HTML5 Application Cache or local storage
• Minimise “chattiness”
– Multiple sequential requests multiply the effect of network latency.
• Pre-fetch data to make it seem fast
• Use YSlow for your mobile website
– Ensure content is cacheable (correct headers, etc.)
– Make files smaller (compress with gzip, minify JS/CSS files)
– Minimise HTTP requests
– Be aware of slow, third-party sites (e.g. ad networks, analytics)
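The “local storage” point above can be sketched as a small TTL cache over localStorage, so repeat visits skip the network entirely. Key names and the TTL are illustrative assumptions:

```javascript
// Build a tiny cache on top of a Web Storage-like object. Entries carry a
// timestamp; stale entries are evicted on read, forcing a refetch.
function makeCache(storage, ttlMs, now = Date.now) {
  return {
    get(key) {
      const raw = storage.getItem(key);
      if (!raw) return null;
      const entry = JSON.parse(raw);
      if (now() - entry.savedAt > ttlMs) {  // stale: evict and miss
        storage.removeItem(key);
        return null;
      }
      return entry.value;
    },
    set(key, value) {
      storage.setItem(key, JSON.stringify({ savedAt: now(), value }));
    }
  };
}
```

In the browser you would pass `window.localStorage` as `storage`; a real version would also catch quota-exceeded errors on `set`.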
19. Case Study: Smart Electricity Meters
• DPI mandated that all manually read electricity meters be replaced with smart meters by the end of 2013.
• All meters must report usage (in 30-minute intervals) to AEMO by 6am the following day.
• Problems at one electricity distributor:
– WiMAX (4G) teething problems
• Patchy data coverage; meters not “phoning home” reliably
• Unresponsive meters (requiring site visits)
– “Thundering herds”: trying to avoid all the meters sending their data at the same time
• Limited bandwidth on the base station
• Servers cannot handle 700,000 concurrent connections
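A common fix for the thundering-herd problem is to give each meter a stable offset into the reporting window, so 700,000 uploads spread out instead of arriving together. A minimal sketch (the hash choice and window size are illustrative assumptions, not what the distributor did):

```javascript
// Derive a deterministic upload offset from a meter's ID: the same meter
// always reports at the same point in the window, and IDs spread roughly
// uniformly across it. Uses the simple djb2 string hash.
function uploadOffsetMs(meterId, windowMs) {
  let h = 5381;
  for (const ch of meterId) {
    h = ((h * 33) ^ ch.charCodeAt(0)) >>> 0; // keep as unsigned 32-bit
  }
  return h % windowMs;
}

// e.g. schedule each meter at (windowStart + uploadOffsetMs(id, sixHoursMs))
```

Deterministic offsets (rather than pure random jitter) also make each meter's report time predictable for troubleshooting.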
20. Server Performance
21. Performance Testing 101
• Define Workload Model
– Identify key business processes
– Define Peak Hour transaction volumes for each business process
– Define number of concurrent users, network properties
• Organise Production-like test environment (with monitoring)
• Develop Scripts
– Note: scripts send the same traffic as the device, so no phones are needed to run the load test. But response times will not include client-side time.
– Record business processes
• Execute tests
– Peak load, Stress/Break, Soak
– Failover, interface outages, etc.
• Tune, Test, Tune, Test, etc.
– Problems due to code, capacity, or configuration
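The concurrent-user figure in the workload model can be derived from the transaction volumes via Little’s Law: users = throughput × (response time + think time). A minimal sketch (the example volumes are illustrative, not from the talk):

```javascript
// Little's Law for load test sizing: concurrent users equals arrival rate
// times the time each user spends per iteration (response + think time).
function concurrentUsers(txPerHour, avgResponseSec, avgThinkSec) {
  const txPerSec = txPerHour / 3600;
  return Math.ceil(txPerSec * (avgResponseSec + avgThinkSec));
}

// e.g. 18,000 tx/hour at 2s response + 28s think time:
// 5 tx/s * 30s = 150 concurrent virtual users
```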
22. Difficulties recording mobile apps
• Record a device or simulator/emulator on the office network
• You probably can’t install a recording agent on the mobile device
• Most apps are just making web service calls
• Recording method by application type:
– Mobile website: record from a desktop PC with a standard browser, using a User Agent Switcher add-on
– Mobile app (HTTP): use a recording proxy
– Mobile app (HTTPS): security features prevent use of a recording proxy (i.e. a MITM attack); some workarounds exist. Use packet capture, and decode with the SSL certificate
– Mobile app (WebSockets or other new protocols): beware! Poor tool support; be careful with new technologies. Scoping question: what protocols does the app use?
23. Example: Recording a mobile app (using HTTPS) with LoadRunner
1. Server-side packet capture while performing the business process (e.g. using Wireshark)
2. Export the private certificate (from the test environment)
3. New script > Mobile Application - HTTP/HTML
– Recording type: Analyze traffic
4. Select the PCAP file, set up filters, import the certificate
5. Generate the script
6. Add transaction names, correlate values, etc.
24. WAN Emulation under load
• Users on slow network connections have different load characteristics on front-end servers
– Hold sockets open longer
– Cause more active TCP sessions on the load balancer (licensing limits, memory limits for the session table)
– Cause more open connections to web servers (thread/process limits, O/S TCP limits, requests queued waiting for a connection)
• During load testing
– Emulate network conditions for each virtual user individually (bandwidth is not shared by virtual users)
– Emulate the worst case to find connection limits on servers
25. Performance Testing Overview
• Device Performance test
– Envt: Development
– Users: One
– Who: Developer
– Notes: Manual, using profiling tools. Testers can help with requirements, oversight, etc.
• Network Behaviour test
– Envt: Development
– Users: One
– Who: Developer or Tester
– Notes: Manual, using network profiling, with WAN emulation
• Server Load/Performance test
– Envt: Production-like
– Users: Many, emulated/virtual
– Who: Tester
– Notes: Automated, using load testing tools, with/without WAN emulation