This document discusses automated browser performance testing. It describes measuring the perceived performance of specific interactions for a single user through tools like Chrome DevTools. The author discusses challenges of browser performance testing, such as network timing, JavaScript execution, reflows, and the added constraints of mobile browsers. The author also describes designing a JIRA browser performance test suite called Soke that defines timed sections and interactions to test JIRA performance and catch issues. The document emphasizes the importance of quantitative performance testing, being holistic in approach, and maintaining vigilance to address performance issues.
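The "timed sections" idea described above can be sketched as a small helper that records the wall-clock duration of each named section of a test run. This is a minimal illustration, not Soke's actual API; the names `timed_section` and `issue-load` are hypothetical.

```python
import time
from contextlib import contextmanager

# Collected durations, keyed by section name.
timings = {}

@contextmanager
def timed_section(name):
    """Record how long the enclosed block takes, under the given name."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[name] = time.perf_counter() - start

# Usage: wrap the interaction being measured.
with timed_section("issue-load"):
    time.sleep(0.01)  # stand-in for driving the browser through an interaction
```

A real suite would wrap browser interactions (page loads, clicks) rather than a sleep, and report the collected timings per run so regressions show up as quantitative deltas.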
18. The Mobile Web
EVERYTHING IS THE SAME BUT WORSE
• network / radio power mode steps, sleep, etc.
• cpu / battery impact
• memory
• standards compliance
• tooling
38. Psycho Latency
• Ready for Action
• When does the user believe the app is waiting for her?
• Manual “psycho” event placement
• and maintenance!
• Yardstick tuning against a real system
• Optimization Example: Execution Reordering
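The manually placed "psycho" events above mark the moments a user would perceive the app as ready; the perceived latency is then the timestamp of the last such marker, ignoring background work the user never waits for. A minimal sketch (the event names and `psycho:` prefix are illustrative assumptions, not the talk's actual convention):

```python
def perceived_ready_time(events):
    """Return the timestamp of the latest 'psycho' marker in a timeline.

    events: list of (name, timestamp_ms) pairs from an instrumented page.
    """
    psycho = [t for name, t in events if name.startswith("psycho:")]
    if not psycho:
        raise ValueError("no psycho events recorded")
    return max(psycho)

timeline = [
    ("navigationStart", 0.0),
    ("psycho:header-rendered", 120.0),
    ("psycho:issue-table-interactive", 480.0),
    ("background-prefetch-done", 900.0),  # later work the user does not wait for
]
print(perceived_ready_time(timeline))  # → 480.0
```

This also shows why reordering execution helps: moving work that fires no psycho event (like the prefetch) after the last marker shortens perceived latency without shortening total work.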
41. Selenium / Webdriver
• Chromedriver implemented by Chromium team
• mouse click on geometric centre of target
• W3C Webdriver wire protocol draft
• OK Browsers, follow the standard!
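The "geometric centre" bullet refers to where a WebDriver click lands: Chromedriver computes a point from the element's bounding rectangle rather than clicking coordinate (0, 0). A sketch of that computation, assuming a simple rect of CSS pixels (the W3C draft additionally clips to the in-view portion, which is omitted here):

```python
def click_point(rect):
    """Geometric centre of an element's bounding rect.

    rect: dict with 'x', 'y', 'width', 'height' in CSS pixels,
    as returned by a getBoundingClientRect-style call.
    """
    return (rect["x"] + rect["width"] / 2,
            rect["y"] + rect["height"] / 2)

print(click_point({"x": 10, "y": 20, "width": 100, "height": 40}))  # → (60.0, 40.0)
```

Browsers that compute this point differently (or clip differently when the element is partly off-screen) click different pixels for the same test, which is one reason the slide asks browsers to follow the standard.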
42. Selenium / Webdriver
• Windows and IE process control
• Browser auto-upgrades
• Changing what you measure
• Prefetch cache benefit in JIRA 6.0
43. Takeaway tweets:
• Performance has many enemies
• Be holistic, be quantitative, beware the micro-optimization trap
• Perception is Reality
• The price of performance is eternal vigilance
#atlassiansummit
45. Rate this Talk
Show Me the Numbers: Automated Browser Performance Testing
Text code below to 22333
or visit http://bit.ly/197mpCa
MEH = 45
NOT BAD = 46
PRETTY GOOD = 47
AWESOME = 48