This presentation gives a brief justification for performance testing, followed by some basic terminology and a short demo.
Links to recommended trial/freemium solutions are provided in the comments, along with the two demo sessions:
TruClient Lite scripting demo
StormRunner Load simple performance test
3. Today’s Agenda
Why do we need performance testing?
Basic performance terminology (101)
Scripting + performance demo
How is it relevant to you?
4. Application performance
Bar chart: top metrics companies believe define their user experience (percent of respondents), including:
Ease of use
Application uptime
Ease of navigation
End-user productivity
Application security
Application crashes
UI design appeal
User behavior and flows
Insights into application
Missing features
Dropped transactions
Social media feedback
Device battery consumption
Cellular data usage
Other
From “Dimensional Research, April 2016”
5. The cost of poor app performance
Your business can perform no better than its applications.
Poor performance drives customers away: lost revenue, lost brand reputation, lost customers, lost competitive advantage.
61% are unlikely to return to a site if they had trouble viewing it on a mobile device.
70% of mobile transaction response time stems from the network.
37% shop elsewhere if a mobile site or app fails to load in 3 seconds.
1 second: the time devices have to respond to user input in order to keep the user engaged.
6. Black Friday is coming…
Today, every week we have a different launch!
https://www.thebitbag.com/apple-ios-app-store-crashes-due-heavy-super-mario-run-traffic/212913
http://www.nbcnews.com/business/consumer/black-friday-online-sales-hit-new-high-after-shoppers-snag-n688656
8. Performance Testing Overview
Performance testing is a technical investigation done
to determine or validate a system’s
• Speed
• Scalability
• Stability
The main goal of a performance test is to identify
how well the application performs
• In relation to the performance requirements and objectives
Estimate the hardware configuration
• Required to support the application
• Less of an issue today with elasticity and cloud services
https://www.forbes.com/sites/cdw/2015/08/07/what-to-move-to-the-cloud-and-what-not-to-move-to-the-cloud/#6b296a05f0e4
9. Key Types of Performance Testing
• Load test: verify application behavior under normal and peak load conditions, using the load volumes expected in production.
• Stress test: determine or validate application behavior when it is pushed beyond normal or peak load conditions; discovers issues that surface only under high load, such as limited memory, insufficient disk space, or server failure.
• Capacity test: determine how many users and/or transactions a given system will support while still meeting performance goals; helps you identify a scaling strategy, i.e. whether to scale up (more CPU) or scale out (more nodes).
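In practice, all three test types share the same skeleton: run some number of simulated users against the system and record response times; only the load level and goal differ. A minimal sketch in Python (here `fake_request` is a hypothetical stand-in for a real server call, not something from the slides):

```python
# Minimal load-test skeleton: N "Vusers" (threads) each run the
# transaction repeatedly and record its response time.
import time
import random
from concurrent.futures import ThreadPoolExecutor

def fake_request():
    """Hypothetical stand-in for a real server call; sleeps 10-30 ms."""
    time.sleep(random.uniform(0.01, 0.03))

def vuser(iterations):
    """One simulated user: time each transaction it performs."""
    times = []
    for _ in range(iterations):
        start = time.perf_counter()
        fake_request()
        times.append(time.perf_counter() - start)
    return times

VUSERS, ITERATIONS = 5, 4
with ThreadPoolExecutor(max_workers=VUSERS) as pool:
    results = list(pool.map(vuser, [ITERATIONS] * VUSERS))

samples = [t for per_user in results for t in per_user]
print(f"{len(samples)} transactions, avg {sum(samples) / len(samples) * 1000:.1f} ms")
```

Raising `VUSERS` toward production volumes gives a load test; pushing it well beyond them gives a stress test; increasing it until response times breach the goals gives a capacity test.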
10. Examples of Performance Test Objectives
• Application response time: How long does it take to complete a task?
• Acceptance*: Is the system stable enough to go into production?
• Regression: Does the new version of the app affect response time?
• Reliability: How stable is the system under a heavy workload?
• Bottlenecks: What is the root cause of degradation in performance?
* More on this later
16. World of testing: Shift left!
*Shift left: testing is performed earlier,
i.e. moved left on the project timeline
18. IoT performance testing
• The number of IoT devices is growing exponentially
• The world is gearing up for MQTT performance testing
https://en.wikipedia.org/wiki/MQTT
23. Performance test terminology
• Vuser:
Simulation(automation) of a real user during load test
• Transaction:
Measures the time it takes for the server to respond to specified (Vuser) requests
• Transaction response time(TRT):
Round trip: the time between initiating the request and receiving the last part of the response
• Transaction SLA:
Customer expectations as was defined by the product owner
• Hits per second:
Requests sent to the server per second
• Throughput:
The amount of data received from the server per second
• 90th percentile:
Sort the transaction response times and take the first 90% of them. The response time with the maximum value in
this set is the 90th percentile value of the studied transaction.
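The 90th-percentile definition above translates directly into code. A minimal sketch (function name and sample data are illustrative, not from the deck):

```python
# 90th percentile per the definition above: sort the response times,
# keep the fastest 90%; the slowest value in that set is the answer.
def percentile_90(response_times):
    ordered = sorted(response_times)
    cutoff = int(len(ordered) * 0.9)  # size of the "first 90%" set
    return ordered[cutoff - 1]       # slowest transaction in that set

# Ten response times in seconds: one 5 s outlier is excluded,
# so the 90th percentile is the 9th-fastest time.
times = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 5.0]
print(percentile_90(times))  # → 1.0
```

This is why the 90th percentile is a more useful SLA metric than the average: the single 5-second outlier barely moves the percentile but would drag the mean up noticeably.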
24. Docker
• A “new” way to build and deploy (micro)services
• Docker Compose