SCRIMPS-STD: Test Automation Design Principles - and asking the right questions!
1. Download, install, record… we’re automating!??
Asking good questions and building good design
to reach reasonable Automation outcomes
Richard Robinson, May 2014
2. Automation frameworks…. arggghhh!
• So many platforms!
• So many programming languages!
• So many tool vendors!
• So many interfaces!
• So much data!
• So many stakeholders!
• So little time!
3. Automation framework design principles
• Parameterise your input data
• Modularise your automation blocks
• Setup and tear down maintenance
• Use self-descriptive data (SDD)
• Use a database for scalability
• Manage states
• Expand input stakeholders
• Simple trigger points
• Simple reporting methods
4. Design-principles-in-a-diagram
[Diagram: stakeholders feed input values, from flexible entry sources, into input value lists and the automation database; an execution trigger from the web UI starts the automation engine, which executes tests against the system under test (applying modularisation, setup/teardown, parameterisation, self-descriptive data, and state management); input values and results are collected in the database and reported back to the web UI as output reports.]
5. Parameterise - explanation
• Parameterised data is changed each time the test is run
• Values can be randomised
• Known stored valid values can be used (or invalid,
depending on your test)
• Functions can be used to create data depending on
other variables
• Input fields reference the variable, not actual concrete
data
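The variable-not-literal idea can be sketched in a few lines of Python — the function names and the lookup list here are illustrative assumptions, not a prescribed implementation:

```python
import random

# Hypothetical lookup list of valid first names, chosen to cover
# length extremes and punctuation (values are illustrative only).
FIRST_NAMES = [
    "Chandrarajan",   # long name
    "Mata'afa",       # contains an apostrophe
    "Jo",             # short name
]

def pick_first_name(rng=random):
    # The test draws a value from the list each run, so a different
    # test is run each time instead of reusing one hard-coded name.
    return rng.choice(FIRST_NAMES)

def registration_payload(rng=random):
    # Input fields reference the variables, never concrete literals.
    return {"firstname": pick_first_name(rng), "lastname": pick_first_name(rng)}
```

Swapping the list for invalid values (or a generating function) changes the test without touching the test script itself.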
6. Parameterise - example 1
• For a registration form with Firstname/Lastname fields,
you can use a lookup list: a long static list of first
names covering all lengths and punctuation.
• Chandrarajan Sivasaravanamuttu (Sri Lankan Cricket
President)
• Mata'afa Mulinu'u II (Samoan President)
• Haven’T (an actual boys’ name from 2012)
7. Parameterise - example 2
• Suburbs in Australia
http://burbs.com.au/australian-postcodes/
8. Parameterise - example 3
• Credit card numbers used around the world
http://en.wikipedia.org/wiki/Bank_card_number
• Remember to also refer to your institutional whitelist to further
reduce the set of valid credit card numbers
9. Parameterise - examples
• Mobile numbers in use in Australia:
http://en.wikipedia.org/wiki/Telephone_numbers_in_Australia
10. Parameterise - benefits
• A different test is run each time
• Framework is more extensible to other users
• Can scale the input values
• Reduces maintenance
• Get around constraints of the data or function
• Data is realistic, which supports any bugs found
• Can build in safety so that live records, accounts, or data are not
used
11. Modularise - explanation
• Modules are building blocks of script or code
• They are like a family tree and any layer can be
executed
• Built once and reused often across scenario tests
12. Modularise - examples
• Field definitions and reuse - eg. password field is used
on registration and also password reset
• Library use of common functions
• Parts of a scenario
eg. Log in, register, find random item, checkout,
payment, deregister, change password, update profile
• Full scenarios that can be executed in serial to show
real end-to-end usage of the system
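The building-block idea above can be sketched as plain functions — here a toy "session" log stands in for a real driver, and all names are illustrative:

```python
# Hypothetical low-level building blocks; each is built once and
# reused across many scenario tests.
def log_in(session, user):
    session.append(f"log_in:{user}")

def find_random_item(session):
    session.append("find_item")

def checkout(session):
    session.append("checkout")

# A full scenario composes the blocks in serial, showing real
# end-to-end usage of the system; any layer can be executed alone.
def purchase_scenario(user):
    session = []
    log_in(session, user)
    find_random_item(session)
    checkout(session)
    return session
```

Because `log_in` exists once, a change to the login flow is fixed in one place rather than in every scenario that logs in.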
14. Modularise - benefits
• Fewer code blocks to maintain
• Build once, use often, saves time when building
• Non-expert coders can build scenarios using blocks
(premise of BDD)
15. Setup & teardown - explanation
• Test your framework
• Build quality checks into the framework
• Ensure the system is ready for execution before finding
out from a failing functional test
• Check and maintain data
16. Setup & teardown - examples
• Execution setup script - checking and maintaining
environment variables, 3rd party connectivity, server
availability. eg Ping servers, send test request
• Data setup script - gather record to use, and apply any
initial checks on it
• Reset your data once it’s been used
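A minimal sketch of the execution-setup idea, using a plain TCP connection attempt as the "ping"; the port and timeout are assumptions, and a real setup script would check far more:

```python
import socket

def environment_setup_check(hosts, port=443, timeout=2.0):
    """Run before any functional test: try to reach each server and
    return the ones that could not be contacted, so the run can
    abort early instead of failing inside a functional test."""
    unreachable = []
    for host in hosts:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                pass  # connection opened and closed; host is reachable
        except OSError:
            unreachable.append(host)
    return unreachable
```

If the returned list is non-empty, the framework can skip the run and report an environment issue rather than a spurious test failure.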
17. Setup & teardown - benefits
• Informed of environmental issues before any test is run
• Can be used as an environment monitoring tool
• Allows for better re-use of data and records
18. Self-descriptive data - explanation
• Self-descriptive data is labelled to identify itself and the
test
• It is constrained by what is acceptable for the field
• Used for quick checks of test results by scanning
sizeable output
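One way to sketch self-descriptive data, assuming a simple `<test id>-<field>` labelling convention (the convention itself is illustrative, not prescribed):

```python
def self_descriptive(test_id, field, max_len=30):
    """Generate a value that identifies both itself and the test that
    produced it, constrained to what the field accepts."""
    value = f"{test_id}-{field}"
    if len(value) > max_len:
        raise ValueError("label exceeds the field's length constraint")
    return value
```

When this value later appears in a report, SMS, or web table, a quick scan tells you which test and field produced it, with no external oracle needed.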
19. Self-descriptive data - examples
• PDF report on screen showing transactions and
balances
• SMS texts being received for all notifications
- matching the action to the message
• Web table data showing groups of results
20. Self-descriptive data - benefits
• Good for manual output checking from time to time
• Good for simplified results checking, as all tests should fit in same
results class
• Does not require external oracles
• Can generate and verify large amounts of complex data
• Reduces time to do any manual verification
• Reduces diagnosis when things go wrong
• Good record keeping practice
21. Database-driven - explanation
• If your framework is starting to grow and expand, then
consider using a database for:
- input data storage
- parameter storage (eg. environment variables)
- output result storage
• Excel is great for small operations, but at some point,
having Excel output files created for all tests becomes
unmanageable
22. Database-driven - examples
• Environmental variables - server names, IPs,
usernames, passwords, interface values
• Data-set up - time-based variables
• Input data - customer registration, orders, credit cards
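The database-driven idea can be sketched with SQLite in memory; the table and column names are illustrative assumptions, not a prescribed schema:

```python
import sqlite3

# One central collection point for input values and results, shared
# by multiple users and automation engines.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE input_values (field TEXT, value TEXT)")
conn.execute("CREATE TABLE results (test_id TEXT, outcome TEXT)")

# Environmental variables and input data go in as rows, not
# hard-coded values scattered across spreadsheets.
conn.executemany(
    "INSERT INTO input_values VALUES (?, ?)",
    [("server", "app01.example.test"), ("username", "autouser")],
)
conn.execute("INSERT INTO results VALUES (?, ?)", ("REG-001", "pass"))
conn.commit()

server = conn.execute(
    "SELECT value FROM input_values WHERE field = 'server'"
).fetchone()[0]
```

The same queries work unchanged when the in-memory connection is swapped for a shared server database, which is where the scalability comes from.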
23. Database-driven - benefits
• Scalable
• Efficient processing of data, results, and reports
• Centralised - one collection point for all data and results
• Allows for multiple users
24. State management - explanation
• Know your record states
• Maintain record states as in-use
• Return records for re-use
25. State management - examples
• Capture main records in a DB, or file
• Capture defining field values eg. in-use state, customer
cohort, type of payment method, address type, comms
type
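A toy state manager, assuming just two record states ("available" and "in-use"); a real pool would also track cohort, payment method, and other defining fields:

```python
class RecordPool:
    """Minimal state management sketch: records are checked out as
    'in-use' while a test runs and released afterwards, so parallel
    executions and multiple users don't collide on a record."""

    def __init__(self, records):
        self.states = {r: "available" for r in records}

    def checkout(self):
        for record, state in self.states.items():
            if state == "available":
                self.states[record] = "in-use"
                return record
        raise LookupError("no available records")

    def release(self, record):
        # Return the record for re-use by the next test.
        self.states[record] = "available"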
26. State management - benefits
• Efficient for role-based, and state-based systems
• Minimises the occurrence of record conflicts
• Allows for multiple instances of execution
• Allows for multiple users to execute
27. Stakeholder usage - explanation
• Framework used by any stakeholder
• Additional framework components built to support
others
• Access to data inputting, execution, results generation
• Input to methodologies such as BDD, ATDD, or any
keyword-driven framework implementation
28. Stakeholder usage - examples
• Adding values to data input lists
• UI to add values to input data directly to database
• UI to trigger tests on demand
• Running a demo at a stakeholder meeting
• Obtaining reports
29. Stakeholder usage - benefits
• Buy in from stakeholders, increase in investment
• Credibility and quality for all to see
• More feedback on quality of tests
30. Simple triggers - explanation
• Automation can be used for many purposes and for many
stakeholders
• If the trigger points are easy to use, then it will get used
more often
31. Simple triggers - examples
• Web front end page with scenarios and suites available
as a menu to visitors
• Mobile app to trigger tests
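The trigger itself can be as small as a name-to-suite lookup that any front end (web menu, mobile app) reuses; the suite names below are assumptions:

```python
# Hypothetical menu of suites a web front end could expose.
SUITES = {
    "smoke": ["login", "search"],
    "regression": ["login", "search", "checkout", "payment"],
}

def trigger(suite_name):
    """Map a menu choice to the suite it runs. In a real framework
    this would start the automation engine; here it just returns
    the tests that would run."""
    if suite_name not in SUITES:
        raise KeyError(f"unknown suite: {suite_name}")
    return list(SUITES[suite_name])
```

Keeping the trigger this thin is what makes it easy to bolt onto any UI, which in turn is what gets the framework used.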
32. Simple triggers - benefits
• Framework more widely used, improving value (“ROI”)
• More usage means more testing (and checking), means
greater chance of finding bugs
33. Simple reporting - explanation
• Reporting is what stakeholders see and use to
communicate to others
• Publish automation runs to a web GUI
• Allows layers of access to different reports (role-based
login)
34. Simple reporting - examples
• Report to levels of management, or stakeholder groups
eg. Product owner, development team, managers
• Report to levels of test execution
eg Acceptance test runs, function test runs,
environmental monitoring, data-driven tests, scenarios
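The role-based-login idea reduces to filtering published runs by audience; the runs and stakeholder groups below are illustrative only:

```python
# Each published run records which stakeholder groups may see it;
# the web GUI shows a user only the runs their role grants.
RUNS = [
    {"run": "acceptance", "audience": {"product-owner", "managers"}},
    {"run": "environment-monitoring", "audience": {"dev-team"}},
]

def reports_for(role):
    """Return the run names visible to a given stakeholder role."""
    return [r["run"] for r in RUNS if role in r["audience"]]
```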
35. Simple reporting - benefits
• Exposes the value of automation to the stakeholders
• Allows for feedback into the quality of the framework
• Allows for higher usage of framework by other
stakeholder groups
36. Focussing questions… (we are not done yet!)
• Do you agree?
• Learned something new?
• Applicable? How much do I implement the design
principles?
• Are there any more?
• What’s missing?
37. 10th design principle… CONTEXT IS KING!
• What are the outcomes we are trying to realise and
why?
• What problem are we trying to solve?
• What matters and what questions do we ask?
[Diagram: problem → automation → outcome, all within context & environment]
38. What outcome are we trying to achieve?
• To automate 100% of the functions
• To gain 100% coverage of a specification
• To reduce the number of manual testers to zero
• To shorten regression cycles to 1 minute
• To deepen exploratory testing to be exhaustive
• To reduce administration overhead to zero
• Monitor all user pathways
• To deliver working code overnight, ready for use
39. What problems are we trying to solve?
• Regression testing takes too long and is too costly
• We cannot afford all these testers, let’s get rid of 40%
• We are losing quality from our product, let’s automate
• Let’s make use of dead time by testing overnight, automatically
• Let’s save time by checking the quality of the release prior to
manually testing it
• There are too many bugs found in production, let’s increase our
testing coverage by automating
41. Strategy - Cost considerations
• Prototype to get further funding?
• Automation is the saviour of our project, spend, spend,
spend
• Whose vision is it to automate? Do they have a budget?
• Are the automation team in budget?
42. Strategy- People considerations
• Full time dedicated resources or stealing developer time
• Proven skills or developing them
• Full domain knowledge or getting it from business
resources
• Expert automation consultant (selling of implementation)
• Expert automation specialist (coding, implementing)
43. Strategy - Time considerations
• Big bang approach, or easy onset
• Quick wins, or consistent development
• Milestone based, or open ended outcomes
44. Strategy - System considerations
• Platform tools, language
• Interfaces - 3rd party vendors, access
• Data sensitivity, record availability
• Environment management and coverage
• Deployment infrastructure and processes
45. Strategy - Quality considerations
• How to measure?
• Are we actually speeding things up?
• Are we actually reducing any cost?
• What are our maintenance costs?
• When do we know it’s not working?
• What does ‘good’ look like?
46.
47. Want a trick to remembering the principles?
The single most important reason for management buy-in
to Automation is… to reduce cost, or save.
And another word for ‘save’ is…
49. Let’s rearrange the principles…
• Setup and tear-down
• Context is king
• Reporting interface
• Input stakeholders
• Modularise
• Parameterise
• Self-descriptive
• State management
• Trigger interface
• Database-driven
And take the first letters together…
and you get…
50. SCRIMPS STD
(Scrimps Standard)
“A management-supported
automation framework design
standard,
with the sole objective to gain
economical outcomes of the
quality assurance function”
51. What is your context? Experience reports…
• Setup and tear-down
• Context is king
• Reporting interface
• Input stakeholders
• Modularise
• Parameterise
• Self-descriptive
• State management
• Trigger interface
• Database-driven
52. Resources
Excellent overview of testing and automation, best next steps to read.
Kaner, 2010, http://kaner.com/pdfs/VISTACONexploratoryTestAutmation.pdf
Models of automation
McCowatt, http://www.ministryoftesting.com/2014/05/automation-time-change-models-iain-mccowatt/
http://exploringuncertainty.com/blog/archives/1010
Problem, solutions, maintainability,
http://www.ministryoftesting.com/2010/07/an-introduction-to-test-automation-design/
Self-verifying data
Doug Hoffman, 2012, http://www.ssqa-sv.org/presentations/Doug%20Hoffman%20SSQA%2011-13-2012.pdf
Noel Nyman, Microsoft, 2001, http://www.stickyminds.com/sites/default/files/presentation/file/2013/01TAU_T6.pdf
Maintainability
http://dhemery.com/pdf/writing_maintainable_automated_acceptance_tests.pdf
Automation test pyramid
Martin Fowler (2012), http://martinfowler.com/bliki/TestPyramid.html
Framework design
Carl Nagle, “The test framework should be application-independent”
http://safsdev.sourceforge.net/FRAMESDataDrivenTestAutomationFrameworks.htm
General automation methodology
Borland (2012), https://www.borland.com/_images/Silk-Test_WP_How-to-successfully-automate-the-functional-testing-process_tcm32-205735.pdf
IBM, http://public.dhe.ibm.com/common/ssi/rep_wh/n/GBW03035USEN/GBW03035USEN.PDF
Hybrid approach most popular
Yogi Gupta, general implementation methodology
http://www.slideshare.net/yogindernath/understanding-of-automation-framework-presentation
53. Resources continued…
IMPLEMENTATION
Good design ideas, technical implementation (2011)
http://technologyandleadership.com/30-feet-view-of-test-automation-framework-with-selenium/
Generic implementation principles
http://www.theeffectiveengineer.com/blog/what-makes-a-good-engineering-culture
Doug Hoffman on Automation
http://www.slideshare.net/Softwarecentral/software-test-automation-beyond-regression-testing
Failure and problems with automation
http://five-ways-to-make-test-automation-fail.pen.io/
Smartbear Guide to converting to automation testing -
desktop http://www2.smartbear.com/rs/smartbear/images/Your%20Guide%20to%20Converting%20to%20Automated%20Testing.pdf
Alan Page - The A Word - desktop https://leanpub.com/TheAWord
Automation Checklist - strategy, business, tests
https://hackpad.com/Tips-to-Approaching-Test-Automation-fDRllKNrRJ2
54. About the presenter
An independent contract tester who works primarily in
automation, performance and test management roles.
The current President of the Sydney Testers Meetup, and
facilitator of WeekendTesting.com/WTANZ.
He heads the facilitation of the Let’s Test Oz and CAST
conferences.
Blogs at www.hellotestworld.com and publishes his ideas
on SlideShare, and in testing magazines such as Testing
Trapeze.
A Black Belt Tester of the Miagi-do school of software
testing, and a founding member of the ISST.
Richard Robinson
@richrichnz
richrob79
Editor’s notes
Pep talk: Look, it’s a presentation, it’s entertainment. Please smile, and enjoy it. Remember, nobody is testing anyway.
Lots of material to cover (quickly, no boredom)
Slides available (so you don’t need to write it all down.)
Please hold your questions until after the presentation, and after the experience reports from Arial, Priyo and Paul.
I will go through at a good pace, so you don’t get bored.
Please, if you need a drink, food or facilities break, or you want to check out the board, please get up and wander about. Vote with your feet. If you want to meet others and network outside the room, that is fine too.
Confusing, undirected, out of control
Sell job - I have the answers (From experience, building, managing, and researching)
Get control back! (You want the principles! You can have the principles!)
20s
Here is the technical overview of my talk
If all you wanted was a list to compare to yours, this is it
This is my list, from my experience, in my situational context.
Lets see what they look like in a diagram… <<next slide>>
Lets go through each one, in more technical detail
Emotional statement: Parameterising is so much fun!
Impact statement: Say NO to hard coding!
Examples: parameterising with value lists! see next slides
Parameterising is about NOT hard-coding test values into tests. Instead, variables are used that can be controlled and manipulated to suit the test, idea, function, cycle, or tester.
Variables can come from many sources. The simplest way is to use a lookup list or table. The values in the table represent various valid values that hit right up to the boundaries of the field definition.
Mobile number SMS notification testing
so you can get away from the ‘automation is just checking’ argument somewhat
other project team members can add input values to your source parameter
scale to larger set as more is learned and known about the function; fill in the gaps later on
No hard-coding; you don’t have to go and update everywhere you used the name John Smith
can script in specific environmental constraints, such as not being able to actually use assigned mobile numbers
if there is a challenge about your data, you can say “Actually, that name is possible and is the former Samoan president”
Emotional sale: I love building stuff!
Impact message: Modularisation is chunking building blocks. It includes abstraction.
Example to run with: Many automation tools adopt this principle natively, such as web driver and soapUI.
Modularisation is about abstracting the layers of automation code in a similar fashion to object-oriented programming. It is a good practice to adopt when building a complex system. Smaller chunks of the system, or function, are scripted/coded and called upon by blocks of code with a wider scope.
Here we see a diagram that breaks out the levels of modules into high level, intermediate, and low level. This container approach is common to most automation tools, or can be built in.
We break things into modules to save time, as common modules can be repeated for different tests, and that means less maintenance.
Emotional sale: I like a clean run every time!
Impact statement: Take control of your execution! Manage your bugs!
Example to run with: checking for broken lines of code in soapUI; checking banking interfaces are up.
Taking control of automation means taking control of your environment and data. Make sure you do some setup and checking that all is in place before you launch a test.
Emotional sale: I love being efficient! Its so impressive to others!
Impact statement: Self-descriptive data is story-telling!
Example: SMS messages, OR bank statements
If you want to reduce your data management overhead, and align manual and automated testing, then using self-descriptive data is a great practice in the right context.
Emotional sale: Size matters!
Impact statement: Execution cycle speed is important! Say NO to Excel spreadsheets!
Example: WebDriver output of csv files!
An automation framework should be built so that multiple automation apps can be used, and swapped in and out over time.
The framework should be able to handle scaling of tests, functions and stories.
To do this, a central automation database is your ultimate solution.
To get the framework components of automation engine/application, working together with using data from a central database, and reporting back to that database allows for scalability and efficient processing power of results.
Emotional sale: I love having a living breathing automation solution!
Impact statement: Share your data records across multiple users
Example: Opal card ID data is limited (can be created using a binary and customers…
Test data can easily get out of hand. You need to find the constraints on your test data before deciding how to design your management of it.
Often, we are limited by the data records available - due to using multiple systems with only a subset of data from each. So we need to be clever with how we use and reuse data records.
Emotional sale: I get excited when others want to use my stuff!
Impact statement: Work with your people!
Example: Agile team working together on Acceptance Tests, test building, test values expansion, demo
Stakeholders should be able to get their hands on the automation framework where necessary. If you stand-up to good design principles, then there should be nothing to hide. Automation execution, and most testing, should be something that everyone thinks about, and contributes to.
Automation testing is only the checking of tests. There are not usually many surprises in terms of functional bugs, or design issues. Automation usually only checks the surface of the function or story. These types of tests are good for monitoring, so there is little risk of opening this up to other stakeholders to use. It’s a great way to get buy-in for automation in general and ramp up the investment.
Now I want you to think about your environment, job, automation framework…
And think about these questions…
(read them in an inquisitive tone mate ..)
Outcomes! Problems! What matters!
If you are seeking outcomes that have been proposed or suggested by others, then ask yourself why?
The outcomes are likely attempting to solve a problem or two… what could they be?
Once you know the problems, what constraints and opportunities are there that can help realise the outcomes.
Is anyone trying to achieve any of these?
I sincerely hope not…
What is your team and company trying to achieve? What is your MISSION.
But wait, do you even know why you are trying to achieve ANYTHING?
Often, we focus on the benefits without knowing why we are seeking them in the first place…
So how many of these are possible?
So now we know some testing, project and business problems that we’re trying to solve with automation.
But why do we have these problems? These are typically SYMPTOMS of the underlying problems.
Where do they come from?
If we know what is the basis of the problems, then our proposed solutions will be more realistic, achievable, and measurable.
So how do we uncover where the problems have come from?
Analyse key factors in the environment to identify how automation will SUCCEED in your environment.
This is the business case and test strategy.
Find out what it is, or build it yourself
Cost is probably the single most important factor of any automation implementation, or framework design.
You must know your cost issues and history, and how to navigate them.
eg. MISSED Milestone payment will reduce investment in initiatives
Where does the money come from?
People will make or break any automation attempt.
Who is available, who are they, how can you use them…
What does your company need to succeed with automation?
upskilling existing people
new people
getting access to the skills in your company
“Timing is everything”
When thinking tactically about your framework, your success will often bring more investment.
You need to show regular success to gain CREDIBILITY, buy-in, and further investment.
All systems come in different shapes and sizes. They have different platforms, technologies, 3rd party services, domain-specific constraints, contractual constraints.
Framework design principles need to be checked against whether the system is capable of achieving the outcome.
In some companies access to certain parts of the platform are forbidden, or unlikely. If you push access to the web server, or database server, you may find you are met with stiff resistance.
Quality is last because its often the hardest to measure.
Quality is your selling points to increase investment into the automation framework.
Quality measurement helps you to realise your outcomes.
If you don’t think about these things, then your manager will push them on to you, as it is some managers’ job to show realised outcomes, whether it be reduced cost, or reduced bugs found in production.