Join us Oct 29th at 1pm when Gomez presents Application Performance Testing: From Conception to Gravestone, with Scott Barber and Imad Mouline.
Join Scott Barber, performance testing guru, and Imad Mouline, Web performance expert, for this new look at performance testing and load testing as integral parts of the application life cycle.
Performance testing and load testing are primarily regarded as episodic activities – done once in the life cycle for a given application. Attend this webinar and learn the benefits that accrue by looking at testing as part of a longer cycle.
You will learn:
• How performance testing throughout the application lifecycle adds value and mitigates risk
• How to integrate performance testing into all phases of the application lifecycle
• Why Web load testing early and often provides additional benefits
Our Speakers:
Scott Barber is the Chief Technologist of PerfTestPlus, Executive Director for the Association for Software Testing, Co-Founder of the Workshop on Performance and Reliability, and co-author of both Performance Testing Guidance for Web Applications and Beautiful Testing. Scott thinks of himself as a tester and trainer of testers who has a passion for software system performance.
Imad Mouline is the CTO of Gomez/Compuware and has served as Chief Technology Officer for Gomez since 2006. He is a veteran of software architecture, research, and development, and an expert in web application development, testing, and performance management. His breadth of expertise spans cloud computing, next-generation web browsers, load testing, and software-as-a-service (SaaS), with a focus on best practices to ensure the delivery of great web experiences.
3. Conception to Gravestone Page 3
Scott Barber
CTO, PerfTestPlus, Inc.
sbarber@perftestplus.com
www.perftestplus.com
Co-Founder: Workshop On Performance and Reliability (www.performance-workshop.org)
Co-Author: Performance Testing Guidance for Web Applications (www.codeplex.com/PerfTestingGuide, www.amazon.com/gp/product/0735625700) and Beautiful Testing (oreilly.com/catalog/9780596159825)
5. Conception
Some elements of performance are embedded in an application's DNA:
• For best performance, plan in advance
DNA testing is possible:
• Ask performance questions as the concept evolves
• Test every step of design and architecture with questions, peer reviews, and common sense
Performance/Testing
6. In the Cradle
Nurturing is critical in the early stages:
• Performance doesn't just happen; it needs attention
Regular medical check-ups aren't considered optional:
• Just because you designed for it doesn't make it reality
• Much like inoculations against diseases, we can take preventative measures against poor performance
• Test assumptions, models, components, and implementation as early as possible
7. Pre-School to High School
Significant personality and physical development occurs and stabilizes prior to the teen years:
• What we know as "in development" is the best opportunity you will ever have to achieve maximal performance
Prime time for testing:
• People are continually tested during this period: in school, at home, on sports teams, etc.
• For applications, this equates to profiling, performance unit testing, environment testing, and early load/stress testing
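The "performance unit testing" mentioned above can be as simple as asserting a latency budget inside an ordinary unit test. A minimal sketch in Python (the `percentile_latency` helper and the dictionary-lookup target are illustrative stand-ins, not from the talk):

```python
import time

def percentile_latency(fn, runs=50, pct=95):
    """Time `fn` over several runs and return the pct-th percentile latency in ms."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    samples.sort()
    # Nearest-rank percentile: pick the sample at the pct-th rank
    idx = min(len(samples) - 1, int(round(pct / 100.0 * len(samples))) - 1)
    return samples[max(idx, 0)]

def test_lookup_meets_budget():
    # Hypothetical unit under test: a 10k-entry index lookup with a 1 ms p95 budget.
    index = {i: str(i) for i in range(10_000)}
    p95 = percentile_latency(lambda: index.get(4242), runs=200)
    assert p95 < 1.0, f"p95 latency {p95:.3f} ms exceeds 1 ms budget"
```

Run as part of the regular test suite so performance assumptions are checked on every build, not just before release.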
8. High School & College
Rebellion, experimentation, learning, experience gathering, & personality tuning reign:
• Testing & evaluation is how we track progress as we approach independence (i.e., production)
Monitoring and testing change is key:
• We can't tune what we don't know
• The best performers focus on "final" preparations for a positive and successful production experience
• Redesign at this point is *painful*
9. Adulthood
What we've been preparing for:
• This is no time to stop paying attention... just 'cause we made it doesn't mean we're done
Observing and adapting is the cornerstone of success:
• Volume, traffic patterns, and environments all change – and impact performance
• Patches, updates, and upgrades need testing too
• Without testing, adaptations can make things worse
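The "patches, updates, and upgrades need testing too" point can be automated as a regression gate: compare latency samples taken before and after a change and fail the deployment if the new median drifts too far. A minimal sketch (the 1.2x threshold and the sample data are illustrative assumptions):

```python
import statistics

def regression_check(baseline_ms, current_ms, max_ratio=1.2):
    """Return (ok, ratio): ok is False when the current median latency
    exceeds the baseline median by more than max_ratio."""
    base = statistics.median(baseline_ms)
    cur = statistics.median(current_ms)
    ratio = cur / base
    return ratio <= max_ratio, ratio

# Hypothetical samples: latencies (ms) gathered before and after a patch.
ok, ratio = regression_check([100, 105, 98, 110], [150, 160, 145, 155])
```

Here the patched build is roughly 1.5x slower than baseline, so the check fails; without such a gate the adaptation would have quietly made things worse.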
10. Retirement
Not Employed ≠ Not Working:
• Like people, most applications remain quite active well into retirement
Slowing down may be OK, but stopping => panic and expensive trips to the ER:
• Unless every instance of the app is deleted, someone is still using it
• Continued monitoring, testing, and adapting can prevent ugly failures that require significant effort to recover from
11. In the Afterlife
Legacies often begin, not end, at the funeral:
• Legacies can last a *very* long time
Absent an application ghost whisperer, the time of monitoring and testing is over, but:
• The time of judging has barely started
• If you didn't pay attention to performance before now, those who used the application will probably speak more harshly about poor performance now than when they were using it
12. Conclusions
The lifecycle of application performance mirrors the lifecycle of a person.
Successful people and applications are tested, monitored, and tuned throughout their lives.
If we're unwilling to skip check-ups for our loved ones, we should be unwilling to skip testing our applications from conception to gravestone.
14. Contact Info
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
E-mail: sbarber@perftestplus.com
Web Site: www.PerfTestPlus.com
16. Cross-Vertical Findings: Poor Web Experiences During Peak Traffic Times Directly Impact Business Results
Poor web experiences impact revenue, brand, and customer loyalty:
• 78% have gone to a competitor's site due to poor performance at peak times
After a poor experience:
• 88% are less likely to return
• 47% left with a less positive perception of the company
• 42% have discussed it with family, friends, peers, or online
17. Applications are no longer what you build inside your firewall
Number of hosts accessed directly by the browser, per user transaction, averaged across 3,000 companies: 9.82
18. The Challenge of Delivering Web Applications
[Diagram: The Web Application Delivery Chain. The traditional zone of control (network, storage, web servers, app servers, DB servers, mainframe, load balancers, mobile components) connects through the Internet (major ISPs, local ISPs, mobile carriers), content delivery networks, and 3rd-party/cloud services to users' browsers and devices. Systems management tools say "OK" … but the user is NOT happy.]
19. The Challenge of Ensuring Quality Web Experiences
[Diagram: the same delivery chain, annotated with failure points. The zone of customer expectation spans the entire chain, while the traditional zone of control covers only the data center.]
• Data center: configuration errors, application design issues, code defects, insufficient infrastructure
• Internet: network peering problems, outages, inconsistent geo performance, bad performance under load
• Content delivery networks: blocking content delivery, incorrect geo-targeted content, configuration issues, oversubscribed POPs, poor routing optimization, low cache hit rate
• Local ISP / mobile carrier: network peering problems, bandwidth throttling, inconsistent connectivity, network resource shortage, faulty content transcoding, SMS routing/latency issues
• Browsers and devices: poorly performing JavaScript, browser/device incompatibility, page size too big, too many objects, low cache hit rate
Systems management tools say "OK" … but the user is NOT happy.
20. Reasons to Load Test
• Launching new apps/features
• Deploying new infrastructure
• Major marketing campaigns
• Penetrating new markets
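Any of these events can be rehearsed with even a very simple load generator. A minimal sketch in Python that drives concurrent requests and reports throughput and latency percentiles (the 5 ms sleep stands in for a real transaction; this illustrates the concept only, not Gomez's product):

```python
import concurrent.futures
import time

def run_load_test(request_fn, concurrency=20, total_requests=200):
    """Drive `request_fn` from `concurrency` workers and report latency stats.
    `request_fn` is a stand-in for issuing one transaction against the app."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        def timed_call(_):
            t0 = time.perf_counter()
            request_fn()
            return time.perf_counter() - t0
        latencies = list(pool.map(timed_call, range(total_requests)))
    elapsed = time.perf_counter() - start
    latencies.sort()
    return {
        "requests": len(latencies),
        "throughput_rps": len(latencies) / elapsed,
        "p50_ms": latencies[len(latencies) // 2] * 1000,
        "p95_ms": latencies[int(len(latencies) * 0.95)] * 1000,
    }

# Hypothetical target: simulate a 5 ms request handler.
stats = run_load_test(lambda: time.sleep(0.005))
```

Running this before a launch, infrastructure change, or campaign gives a baseline to compare against, even before bringing in a full-scale load testing service.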
22. Traditional Behind-the-Firewall Testing (Load Testing 1.0)
Pros:
• Optimal for stress testing internal systems
• Good for testing individual components inside the firewall
Cons:
• Offered as a product: you need to install software, provision hardware, apply patches, etc.
• Requires significant hardware expenditure for high-volume load testing
• Typically harder to script
• No visibility into end-user experience
• No visibility into third-party external components
• No visibility into geographical response-time discrepancies that may surface under load
[Diagram: the delivery chain, with load generation behind the firewall.]
23. Cloud Testing (Load Testing 1.5)
Pros:
• Generates large volumes of load
• Visibility into some third-party components
• Offered as SaaS (no software to maintain or hardware to provision)
Cons:
• No visibility into end-user experience
• No visibility into the real performance of all third-party external components
• No visibility into geographical response-time discrepancies that can surface under load
[Diagram: the delivery chain, with load generation from the cloud.]
24. Gomez's Reality Load Testing (Load Testing 2.0)
Pros:
• Find and resolve problems across the entire Web Application Delivery Chain (WADC)
• Only accurate method to predict end-user response times
• Able to find user-experience breaking points
• Generates high-volume load
• Offers full visibility across all external components
• SaaS
Cons:
• Cannot target individual components inside the data center
[Diagram: the delivery chain, with high-volume cloud load plus real-world load from 100,000+ Last Mile computers.]
25. Gomez Load Testing: Most Accurate Load Test for User Experience

| | Traditional Client/Server Test (HTTP: Behind the Firewall) | Datacenter Testing (HTTP: Data Centers) | Datacenter Testing (Browser: Data Centers) | Real World Desktops (Last Mile) |
|---|---|---|---|---|
| Accuracy of end-user response time | Incomplete | Incomplete | Indicative | Most accurate |
| Accuracy of application availability | Invalid | Indicative | Indicative | Most accurate |
| Ability to drive large load volume | Yes (requires substantial hardware) | Best | Better | Good |
| Understand CDN impact | No | Misleading | Misleading | Most accurate |
| Understand 3rd party (ads, feeds, etc.) | No | Minimal | Some | Most accurate |
| Realistic object download | No | No (static only) | Yes | Yes |
| Visibility behind the firewall | Best | Good | Good | Good |

The first column corresponds to Load Test 1.0, the HTTP data-center column to Load Test 1.5, and the browser and Last Mile columns to Load Test 2.0; only Gomez spans all of them.