Implementing a
Performance Test
“Centre of excellence”

Richard Bishop
Senior Performance Test Analyst
Introduction


• HBOS Formed by Merger – Sep 2001
       • Halifax Building Society
       • Bank of Scotland
• 70,000 employees
       • UK, Ireland, Spain, Australia
• UK’s largest mortgage provider
• UK’s largest savings provider
• £440bn assets
Why am I here?


• Have worked alongside Mercury PS
      • Experienced performance tester – 6 years
      • Performance Center™ 8.0 (beta)
      • LR 8.1 WebGUI (beta)
      • J2EE diagnostics (beta)
      • .NET diagnostics (beta)
      • Scripting standards
      • Team structure / mentoring
      • Results publication and analysis
Team evolution and growth
Team website


• Team knowledge base
• Central repository for results
• Visible throughout HBOS
Performance Test Services


• EPT – Early testing
   • Informal
   • Iterative testing
   • Developer involvement
   • Aimed at improving performance

• PAT – Acceptance testing
   • Formal validation of application
   • Final test before deployment
Challenges

• Delivering testing to meet growing business demands
• Keeping pace with developments
  (e.g. .NET 2 / J2EE / Web Services / Citrix)
• Resource constraints
• Demonstrating ROI
• Fixed deadlines
Test Experience


 • Web                      80%
 • COM / DCOM               10%
 • Web Services             5%
 • RTE                      2%
 • Citrix                   2%
 • Other                    1%

 • Citrix use likely to increase post LR 8.1
Test Stages


  • Planning
         • Alongside Developers
            • Early access to code
            • Discussion of key features
            • Standard page IDs
            • Recommendations
         • Key Business Processes
         • Knowledge pooling
Test Stages (Continued)


  • Preparation
      • Technical documentation
      • Volumes calculations
         • Test plan
            • Normal load
            • Peak load
            • Duration test
      • Application familiarisation
      • Scenario design
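The "volumes calculations" step above can be sketched with Little's law: the number of concurrent virtual users needed equals the session arrival rate times the mean session length. The figures below are purely illustrative, not HBOS data; the speaker notes mention an Excel spreadsheet for this, so this is a minimal equivalent sketch.

```python
# Sketch of a peak-load volume calculation (hypothetical figures).
# Little's law: concurrent users = arrival rate (sessions/sec) * mean session length (sec).

def concurrent_users(sessions_per_hour: float, session_secs: float) -> float:
    """Estimate concurrent virtual users needed to reproduce a given session volume."""
    arrival_rate = sessions_per_hour / 3600.0  # sessions per second
    return arrival_rate * session_secs

# Busiest hour of the busiest day vs. a normal hour; 4-minute average session assumed.
peak = concurrent_users(9000, 4 * 60)
normal = concurrent_users(3000, 4 * 60)
print(f"peak ≈ {peak:.0f} vusers, normal ≈ {normal:.0f} vusers")  # peak ≈ 600, normal ≈ 200
```

The same arithmetic gives the normal-load, peak-load, and duration-test user counts in the test plan; only the input volumes change.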
Test Stages (Continued)


• Scripting

   •   Script recording
   •   Script standards
   •   Test data
   •   Error checking
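The error-checking point above is about diagnostics, not pass/fail: the speaker notes argue a check should say *which* page failed and *why*. A minimal sketch of that idea, in plain Python rather than LoadRunner's C API — the function and page names here are illustrative only:

```python
# Sketch of a detailed per-page error check: return a specific diagnostic
# rather than a bare failure. Names are illustrative, not a LoadRunner API.

def check_page(page_id, status, body, expected_text):
    """Return a diagnostic message on failure, or None if the page looks healthy."""
    if status != 200:
        return f"HTTP {status} on page {page_id}"
    if expected_text not in body:
        return f"Page {page_id} returned 200 but is missing marker '{expected_text}'"
    return None

print(check_page("AccountSummary", 404, "", "Your accounts"))
print(check_page("AccountSummary", 200, "<h1>Your accounts</h1>", "Your accounts"))
```

In a VuGen script the same pattern would be a shared C function checking a standard page ID on every response, which is what makes reports like "HTTP 404 on page X for users with a zero balance" possible.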
Test Stages (Continued)


• Test Execution
    • Prove scripts in test environment
    • Prove data
    • Re-state objectives




   • Don’t test for the sake of it…
Don’t test for the sake of it




                  “A test a day keeps the boss away”
Test Stages (Continued)


 • Results Analysis
        • Web-based reporting
        • LoadRunner Analysis templates
        • PERFMON charts
    • Involve “panel of experts”
    • Publish results daily
    • Appropriate for audience

      (See example site)
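The daily web-published results above boil down to a few headline figures per transaction. A sketch of such a summary using simple nearest-rank percentiles — the transaction name and response times are invented for illustration:

```python
# Sketch of a daily results summary: nearest-rank percentiles per transaction.
# The response times below are made up.

def pct(samples, q):
    """Nearest-rank percentile: the smallest value covering at least q% of samples."""
    s = sorted(samples)
    k = -(-len(s) * q // 100) - 1  # ceil(n * q / 100) - 1
    return s[max(0, k)]

def summarise(samples):
    return {"min": min(samples), "median": pct(samples, 50),
            "p90": pct(samples, 90), "max": max(samples)}

login_times = [0.8, 0.9, 1.1, 1.2, 1.3, 1.5, 1.6, 1.9, 2.4, 3.1]  # seconds
print(summarise(login_times))
```

A management summary would show just these numbers per business process; the full LoadRunner Analysis and PERFMON charts stay available for the developers.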
Feedback


• No longer a “hurdle”
• Seen as desirable
• Actively requested by the business

 “Overall the experience was one of helpful experts who gave us useful
 guidance in our testing. We fully intend to make further use of SI as we
 enter more early performance testing phases as part of the PTP project” –
 Darren Blackett, Senior Systems Developer - RBIT.
HBOS futures

• .NET2 diagnostics
• J2EE diagnostics
• BAC / Tivoli integration
• MAM
• RUM
• Hyperformix / MCP
• End-to-end performance projects
Q&A
E-Mail
richardbishop@hbosplc.com
DDI: +44 (0) 1422 338084
Mobile: +44 (0) 7909 610098


Editor's notes

  1. HBOS is the fifth-largest bank in the UK. HBOS was formed by a merger of the Halifax Building Society and the Bank of Scotland, and is growing rapidly, both by acquisition and organic growth.
  2. Worked with LoadRunner for a little over 5 years. Consultancy background before taking a permanent position with a former customer! Have worked with Mercury PS on a number of product evaluations. Early adopter of Mercury Performance Centre and J2EE / .NET diagnostics Mercury PS staff have commented on the quality of our documentation and working practices and encouraged us to share them.
  3. The performance testing team has more than tripled in size over the last 6 years Over time we have developed a number of procedures and “best practice” documents. All new starters (including contractors) are encouraged to follow HBOS procedures to allow simple handover of work from one tester to another.
  4. To help new starters and contractors, all our procedures and documentation are held on an Intranet: links to other company Intranets as well as documentation, useful C code etc.; batch files for authentication; resource booking schedules for injectors; and a knowledge base with links to external support sites, e.g. Mercury & Microsoft.
  5. The test team performs two types of testing EPT – Developer / Project team led. Iterative, aimed at performance tuning, early bug fixes etc. PAT – Final formal tests before application is deployed into production environment. De-risked by more extensive EPT. Better engagement with customers.
  6. Project teams often don’t like testers! Seen as an obstacle to code deployment! Developers can delay the production of final code but the “go-live” date is fixed, so all the pressure at the end of a project falls on the testers. Application resource requirements, MIPS etc. are under constraint. Weekend working, out-of-hours testing etc. If we test and don’t find problems, our purpose is questioned — and we aren’t always thanked for finding faults either!
  7. In common with most Mercury users, majority of work is web related. Some COM/DCOM work Use same principles wherever possible. (Error checking, common C functions etc.)
  8. Meeting the project team and understanding the application: application demonstrations; page IDs (for error checking); discussions with developers — what are their concerns? What are the most important business processes (which 20% of transactions are used 80% of the time)? Can the performance tests be run without “burning” data?
  9. Documentation – screenshots (we use SnagIT) – demo if required. Volume calculations – demo of the Excel spreadsheet which we use. Test plan – we run a normal-load test to give performance metrics to production monitoring teams; a peak-load test to simulate the busiest hour of the busiest day; a duration test to check for memory leaks; and other tests as required. If this is the second release of an application, perform a “before and after” test. Early scripting – be prepared to script again and again: you’re always learning about the application, and better understanding of the application leads to better tests. Application familiarisation – sit with users if possible, or sit with application architects and developers. Scenario planning – what does an average user do? Can we use log file analysis (e.g. Analog) to find out? Just because you have 70,000 employees you don’t need to size the application for 70,000 people. Educate your customer; they may not be familiar with the testing cycle.
  10. Record scripts twice (for ease of correlation). Use comparison tools (e.g. Beyond Compare). Use standards within the team for easier handover. Test data is as important as scripts; test it all if possible before testing. Re-record as a matter of course: the developer will swear the code hasn’t changed. Don’t believe them! Write detailed error-checking functions. Simple text checks are OK, but it’s better to tell a developer that they had an “HTTP 404 error on page X when it’s accessed by a user with a zero account balance” than to say “50% of the scripts failed, it’s something to do with the application ‘cos the script works!”
  11. On the day of the test: check all scripts, using the Controller as well as VuGen; run them as multiple users (this helps check for parameterisation problems); check all lines of data if possible. Resist pressure to “test for the sake of it”. Projects often feel that the pressure is off the developers when testing is taking place. Don’t let them wash their hands of the project once it’s under your control; remain in contact. (See next slide)
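The "check all lines of data if possible" advice above lends itself to a small pre-test validator: walk the parameter file and report incomplete or duplicate rows before the run, so failures aren't wrongly blamed on the scripts. The column names below are illustrative only:

```python
# Sketch of a pre-test parameter-file check: flag missing fields and
# duplicate usernames before the test run. Column names are hypothetical.
import csv
import io

def validate_rows(csv_text, required=("username", "account_no")):
    """Yield (line_no, problem) for rows with missing fields or duplicate usernames."""
    seen = set()
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        for col in required:
            if not (row.get(col) or "").strip():
                yield i, f"missing {col}"
        user = row.get("username")
        if user in seen:
            yield i, f"duplicate username {user}"
        seen.add(user)

data = "username,account_no\nalice,1001\nbob,\nalice,1003\n"
print(list(validate_rows(data)))  # → [(3, 'missing account_no'), (4, 'duplicate username alice')]
```

Running a check like this against every data file, before the Controller ever starts, is cheap insurance for the fixed-deadline test windows mentioned earlier.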
  12. “A test a day keeps the boss away.” Source: Professional Tester magazine, March 2005; reproduced by permission of Professional Tester magazine, www.professionaltester.com. Tests should have a purpose and should be performed to prove an application’s capabilities or to investigate a problem. If a script isn’t working and the project team says “just leave it out of the scenario”, we should push back and ask “why was it there in the first place?” Running a test which does not in some way simulate real life or prove anything of consequence shouldn’t be done simply to keep management happy. Your time as a tester could be spent on more productive work, such as re-checking test data, documenting scripts or processes, or examining previous test results in more detail.
  13. Templates increase the speed of report writing and give a consistent look and feel. Publish a combination of perfmon and LR graphs. Give developers access to result sets and the Analysis tool, or you’ll spend your days constantly redrawing graphs! Find out who the application owners are and create a “panel of experts”: capacity planners, production support teams etc. If you experience problems, people are always on hand for support. Publish results as soon as possible. Consider management summaries as well as more detailed reports for developers (especially if using detailed web-page breakdown or diagnostics graphs).
  14. By implementing the practices outlined in the previous slides the performance test team has gone from being a team that people tried to avoid to one that people actively look out for. Constant increase in team size is still barely keeping up with demand for our services. <Suitable quote> to be agreed.
  15. What next? Examples of the technologies which we are investigating.