The document outlines 7 deadly sins of automated testing: envy, gluttony, lust, pride, sloth, rage, and greed. It discusses flaws in comparing manual and automated testing, overreliance on commercial tools, focus on user interfaces over architecture, lack of collaboration, poor maintenance, frustration with brittle tests, and attempts to cut costs rather than improve quality. The document advocates for collaboration, technical best practices, and treating tests as critical code.
7 DEADLY SINS OF AUTOMATED TESTING
1. 7 DEADLY SINS OF AUTOMATED TESTING
Dr Adrian Smith
September 2012
Engineering Innovation.
2. Adrian Smith
• Background in Engineering
• Software development using Agile and Lean
• Technical and Organisational Coach
• Founded a startup product development and consulting business
Diverse Experience
• Aerospace Engineering: commercial and military engineering design, analysis and manufacturing experience on major programs including A380 and F35.
• Agile Software Development: software development, architecture and management for engineering CAE, scientific and digital media.
• Systems Integration: integration of logistics, financial, engineering ...
9. Manual vs Automation
• A flawed comparison
• Assumes that automation can replace manual testing effort
• Automation generally doesn’t find new defects
• Testing is not merely a sequence of repeatable actions
• Testing requires thought and learning
10. Ideal automation targets
• Regression testing - assessing current state
• Automation of test support activities (see the sketch below)
• Data generation or sub-setting
• Load generation
• Non-functional testing
• Deterministic problems
• Big data problems
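As a concrete illustration of test-support automation, here is a minimal data-generation sketch. Python, the User record and the make_user helper are assumptions for illustration only; the deck names no language or domain.

```python
# A minimal sketch of automated test-data generation (hypothetical domain).
import random
import string
from dataclasses import dataclass

@dataclass
class User:
    name: str
    email: str
    age: int

def make_user(**overrides):
    """Build a valid User from random defaults; a test overrides only the
    fields its intent actually depends on."""
    name = overrides.pop("name", "".join(random.choices(string.ascii_lowercase, k=8)))
    return User(
        name=name,
        email=overrides.pop("email", f"{name}@example.com"),
        age=overrides.pop("age", random.randint(18, 90)),
    )

# Usage: the test states only the data that matters to it.
minor = make_user(age=15)
```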
11. Common symptoms
• Relying on automation as the basis for all testing activities
• All tests are built by developers
• Absence of code reviews
• Absence of exploratory testing
• Absence of user testing
12. Suggested approach
• Avoid comparison between manual and automated testing - both are needed
• Distinguish between the automation and the process that is being automated
• Use automation to provide a baseline
• Use automation in conjunction with manual techniques
13. GLUTTONY
Over-indulging in commercial test tools
14. Promise of automation
• Software vendors have sold automation as the capture-replay of manual testing processes
• Miracle tools that solve all testing problems
15. License barrier
• Commercial licenses restrict usage
• Not everyone can run the tests
• Typically, organisations create special groups or privileged individuals
16. Incompatible technology
• Underlying technology of commercial tools is often not compatible with the development toolchain
• Special file formats or databases
• Lack of version control for tests
• Tests cannot be versioned within the software
• Continuous integration problems
• Can’t be adapted or extended by the developers
17. Justifying the expense
• Financial commitments distort judgement
• Difficult to make objective decisions
• Tendency to use the tool for every testing problem
• People define their role by the tools they use
18. Common symptoms
• A commercial tool forms the basis of the testing strategy
• Only certain teams or individuals can access a tool or run tests
• Developers have not been consulted in the selection of testing tools
• “We always use <insert tool-name> for testing!”
19. Suggested approach
• Use Open Source software tools wherever possible
• Use tools that can easily be supported by the development team and play nicely with the existing development toolchain
• Ensure any commercial tools can be executed in command-line mode (see the sketch below)
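To illustrate the command-line requirement, a minimal sketch of wrapping a tool for unattended execution. The "acme-test-runner" command and its flags are hypothetical; the point is headless operation and a meaningful exit code, without which a tool cannot participate in Continuous Integration.

```python
# Run a (hypothetical) commercial test tool non-interactively and propagate
# its exit code so a CI server can run the tests and detect failure.
import subprocess
import sys

result = subprocess.run(
    ["acme-test-runner", "--suite", "regression", "--report", "results.xml"]
)
sys.exit(result.returncode)  # non-zero exit fails the CI build
```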
20. Lust
User interface forms the basis for all testing
21. Testing through the GUI
• Non-technical testers often approach testing through the user interface
• Ignores the underlying system and application architecture
• Resulting tests are slow and brittle
• Difficult to set up test context, resulting in sequence-dependent scripts
22. Investment profile
[Figure: test-investment pyramid. From the base up: Unit/Component tests (developer built, optimised for fast feedback), Integration tests (exercise components and systems), Acceptance tests (collaboratively built around system behaviour), and Manual Exploratory testing at the interface. Speed of feedback increases towards the base, confidence towards the top; investment/importance is weighted towards the base.]
23. Architecture
• Understanding application and system architecture improves test design
• Creates opportunities to verify functionality at the right level
24. Test design
[Diagram: the Test Intent (clearly identifies what the test is trying to verify) is realised by the Test Implementation (the implementation of the test, including usage of Test Data), which exercises the System Under Test.]
A sketch of this separation follows.
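A minimal sketch, assuming a hypothetical apply_discount() as the system under test: the test names carry the intent, the implementation stays small, and the data is explicit in each test.

```python
def apply_discount(total, customer_type):
    """Stand-in for the system under test (hypothetical)."""
    return total * 0.9 if customer_type == "gold" else total

# Intent lives in the name; implementation and data are separate and visible.
def test_gold_customers_receive_a_ten_percent_discount():
    assert apply_discount(100.0, "gold") == 90.0

def test_standard_customers_pay_full_price():
    assert apply_discount(100.0, "standard") == 100.0
```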
25. F.I.R.S.T. class tests
F Fast
I Independent
R Reliable
S Small
T Transparent
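A minimal sketch of what F.I.R.S.T. looks like in practice, assuming pytest and a toy Cart class (the deck names neither):

```python
import pytest

class Cart:
    """Toy system under test."""
    def __init__(self):
        self.items = []

    def add(self, price):
        self.items.append(price)

    def total(self):
        return sum(self.items)

@pytest.fixture
def cart():
    return Cart()  # Independent: fresh state per test, no sequence dependence

# Fast: in-memory. Reliable: no clock, network or shared data.
# Small: one behaviour. Transparent: the name states the expectation.
def test_total_is_the_sum_of_item_prices(cart):
    cart.add(2.50)
    cart.add(1.50)
    assert cart.total() == 4.00
```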
26. Common symptoms
• Testers cannot draw the application or system architecture
• Large proportion of tests are being run through the user interface
• Testers have limited technical skills
• No collaboration with developers
• Intent of tests is unclear
27. Suggested approach
• Limit the investment in automated tests that are executed through the user interface
• Collaborate with developers
• Focus investment in automation at the lowest possible level, with clear test intent (see the sketch below)
• Ensure automation gives fast feedback
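As an illustration of pushing a check to the lowest possible level: rather than driving a browser to exercise an email field, the rule is tested directly against a hypothetical validation function, so feedback takes milliseconds and no UI locators can break.

```python
import re

EMAIL = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def is_valid_email(address):
    """Hypothetical stand-in for the application's own validation layer."""
    return EMAIL.fullmatch(address) is not None

def test_rejects_an_address_without_a_domain():
    assert not is_valid_email("bob@")

def test_accepts_a_well_formed_address():
    assert is_valid_email("bob@example.com")
```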
28. Pride
Too proud to collaborate when creating tests
29. Poor collaboration
• Organisations often create specialisations of roles and skills
• Layers of management and control then develop
• Collaboration becomes difficult
• Poor collaboration = poor tests
30. Automating too much
• Delegating test automation to a special group inhibits collaboration
• Poor collaboration can result in duplicate test cases / coverage
• Duplication wastes effort and creates maintenance issues
• Setting performance goals based on the number of test cases automated leads to problems
31. No definition of quality
• Automated testing effort should match the desired system quality
• Risk that too much, too little, or not the right things will be tested
• Defining quality creates a shared understanding and can only be achieved through collaboration
32. Good collaboration
• Cross-functional teams build better software
• Collaboration improves definition and verification
[Diagram: Analyst, Tester and Developer collaborating around Acceptance Criteria, Specification and Elaboration, and Automation.]
33. Specification by Example
• Recognises the value of collaboration in testing
• More general than ATDD and/or BDD
• Based around building a suite of Living Documentation that can be executed (see the sketch below)
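A minimal sketch of the Given/When/Then style that Specification by Example builds on, in plain Python with pytest; teams typically layer a tool such as Cucumber, FitNesse or Concordion on top to publish the Living Documentation itself.

```python
import pytest

class Account:
    """Toy system under test."""
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

def test_a_withdrawal_exceeding_the_balance_is_refused():
    # Given an account with a balance of 50
    account = Account(balance=50)
    # When the holder tries to withdraw 80, then the withdrawal is refused
    with pytest.raises(ValueError):
        account.withdraw(80)
    # And the balance is unchanged
    assert account.balance == 50
```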
34. Common symptoms
• Automated tests are being built in isolation from the team
• Intent of tests is unclear or not matched to quality
• Poor automation design (abstraction, encapsulation, ...)
• Maintainability or compatibility issues
35. Suggested approach
• Collaborate to create good tests and avoid duplication
• Limit the investment in UI based automated tests
• Collaborate with developers to ensure good technical practices (encapsulation, abstraction, reuse, ...)
• Test code = Production code
36. SLOTH
Too lazy to properly maintain automated tests
37. Automated Test Failures
• Many potential causes of failure
• Unless maintained - value is slowly eroded
[Diagram: over time, system interface changes, reference data changes, new features and OS patches all act on the test suite, each a potential cause of failure.]
38. Importance of maintenance
[Graph: cost/effort versus time for manual test execution, maintained automation and unmaintained automation, contrasting the potential value of a maintained automated test suite with the eroding value of an unmaintained one.]
40. Common symptoms
• Test suite has not been recently run - state is unknown
• Continuous Integration history shows consistent failures following development changes / releases
• Test suite requires manual intervention
• Duplication within automation code
• Small changes trigger a cascade of failures
41. Suggested Approach
• Ensure automated tests are executed using a Continuous Integration environment
• Ensure tests are always running, even if the system is not being actively developed (see the sketch below)
• Make test results visible - create transparency of system health
• Ensure collaboration between developers and testers
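A minimal sketch of keeping a suite alive on a schedule (cron or a CI timer), assuming pytest; the file names are illustrative. Appending a dated record keeps the state of the suite visible rather than unknown.

```python
import datetime
import subprocess

# Run the whole suite and capture a machine-readable report for the CI server.
result = subprocess.run(["pytest", "--junitxml=nightly-results.xml"])
status = "PASS" if result.returncode == 0 else "FAIL"

# Append a dated health record so the history of the suite stays visible.
with open("test-health.log", "a") as log:
    log.write(f"{datetime.date.today().isoformat()} {status}\n")

raise SystemExit(result.returncode)  # let the scheduler see failures
```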
42. Rage
Frustration with slow, brittle or unreliable automated tests
43. Slow automation
• Large datasets
• Unnecessary integrations
• Inadequate hardware/environments
• Too many tests
• Reliance on GUI based tests
• Manual intervention
• ... many others
45. Brittle Tests
• Contain time-bound data (see the sketch below)
• Have external dependencies
• Rely on UI layout/style
• Rely on sequence of execution
• Based on production data or environments
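Time-bound data is the easiest of these to show in code. A minimal sketch with a hypothetical is_expired(): the brittle test bakes in "today", while the robust test injects the clock and gives the same answer on every run.

```python
import datetime

def is_expired(expiry, today=None):
    if today is None:
        today = datetime.date.today()  # hidden dependency on the real clock
    return expiry < today

def test_brittle_expiry_check():
    # Passes only until 1 January 2013, then fails with no code change.
    assert not is_expired(datetime.date(2013, 1, 1))

def test_unexpired_when_expiry_is_in_the_future():
    today = datetime.date(2012, 9, 20)  # the test controls time explicitly
    assert not is_expired(datetime.date(2013, 1, 1), today=today)
```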
47. Unreliable Tests
• False positives
• Wastes time investigating
• Failures start being ignored
• Creates uncertainty of system health
• Workarounds and alternate tests are created
48. Suggested approach
• Treat automated tests with the same importance as production code
• Review, refactor, improve ...
• Apply a “Stop the line” approach to test failures
• Eliminate (quarantine) unreliable tests (see the sketch below)
• Ensure collaboration with developers
• Up-skill / pair testers
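A minimal sketch of quarantining with a pytest marker; the "quarantine" name is a project convention (registered in pytest.ini), not a built-in, and the test itself is hypothetical.

```python
import pytest

@pytest.mark.quarantine  # register in pytest.ini: markers = quarantine: flaky
def test_search_results_refresh_under_load():
    ...  # known-flaky: fix or delete it, but keep it out of the main signal

# Main build excludes quarantined tests:  pytest -m "not quarantine"
# Periodic quarantine review:             pytest -m quarantine
```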
49. Avarice (Greed)
Trying to cut costs through automation
50. Lure of cheap testing
• Testing tool vendors often try to calculate ROI based on saving labour
• Analysis is unreliable and undervalues the importance of testing
51. Automation is not cheap
• Adopting test automation tools and techniques requires significant investment
• Investment in new ways of working
• Investment in skills
• Investment in collaboration
• Ongoing investment in maintenance
52. Common symptoms
• Investment in commercial tools using a business case based on reducing headcount
• Using a predicted ROI as a way of reducing the budget for testing
• Consolidating automated testing within a special group
53. Suggested approach
• Ensure the reasons for automation are clear and are NOT based purely on saving money/headcount
• Ensure the business case for automation includes costs for ongoing maintenance
54. 7 Deadly Sins
Envy Flawed comparison of manual testing and automation
Gluttony Over-indulging in commercial test tools
Lust User interface forms the basis for all testing
Pride Too proud to collaborate when creating tests
Sloth Too lazy to maintain automated tests
Rage Frustration with slow, brittle or unreliable tests
Greed Trying to cut costs through automation