2. About me
•Consultant for Centric Consulting, LLC
•Over 20 years in the IT industry
•10+ years of QA management
•Provided QA leadership for many large-scale projects at
Nationwide, Grange and currently CAS
•Mom/Grandmother
•Cat lover?
•Enthusiastic about:
•Music/concerts
•Movies
3. Today’s Discussion
•Takeaway today – the importance and value of using a test plan
to support agile
•Agile requirements challenges
•Agile Manifesto
•Examination of Manifesto as it relates to planning
•Test strategy purpose
•Test plan value in agile
•Testing phases within agile
•Brief wrap-up/conclusion
•Questions
4. Agile Requirements Challenges
•Sprint focus is story-specific and time-boxed, which may limit thinking
•Even focusing just on the sprint, requirements can be
misunderstood – user stories are too BIG
•QA is often seen as the SME and should be a key participant in
this process, but that knowledge should not remain in our heads
only
•Do you have your own set of challenges you’d like to share?
5. Agile Manifesto
•Individuals and interactions over processes and tools
•Working software over comprehensive documentation
•Customer collaboration over contract negotiation
•Responding to change over following a plan
•That is, while there is value in the items on the right, we value
the items on the left more
6. Further Examination
•By all means, individuals and interactions are important; let’s talk to
each other
•Nothing in this statement says don’t document
•What does “working software” mean?
•How do we know?
•Working and perfect are not equal
•Customer collaboration is very different in Agile
•Combine this with “responding to change” and this aspect is
critical to creating working software that meets expectations
•Responding to change is key as well
•When we follow the collaboration model, we’ll discover a lot of
changes because the customer will change
•Those changes will be facilitated by working software
Following the manifesto in the literal sense would create problems
8. Test Strategies serve a purpose,
but don’t focus on ‘the’ sprint
•Big picture
•Drives behavior and sets boundaries
•Outlines Automation Approach
•Test Data Management
•Defect/bug management
•Identifies large risks
Strategies are the big picture
9. Test Plan Value
• An agile test plan will solidify what the customer and the team
agreed upon
• Helps to easily identify gaps in thinking and expectations
• Provides details that will help to clarify the acceptance of
working software
Note: Customer collaboration happens throughout and continues at
demos and reviews. If something changes, make your
updates and move on; you can refer back to the updated
plan.
Remember: The goal is working software; if the idea of
what that means changes, your plan will probably change too.
Testing Scope
10. Quality Over Velocity
•Help the team think quality first
•Many think Agile is all about velocity
•Building quality in will improve velocity and efficiency and drive
down cost
•It doesn’t work the other way around
Creating a test plan encourages focus on quality
11. Agile Testing “Phases”
•Requirements and Design Phase
•Stories/Features Verification Phase
•System Verification Phase
•Acceptance Phase
A test plan should have components that help facilitate all these phases
12. Phase 1: Requirements and Design
•Scenarios to be tested
•Separation of testing activities (unit vs. functional?)
•Automation selection & plan
•Manual selection & plan
•Out of scope items
•Dependencies & Assumptions
•Test data needs
•Stubbing/Technical debt
•Acceptance section
Plan now and update throughout – Plan elements include:
13. Facilitated Discussion
•What tests does the Product Owner want to see to confirm
working software?
•What tests do the team feel are most important to run (based
on knowledge of the system)?
•What negative cases are necessary?
•What tests will be automated and with which tools (i.e., which
cases at the unit level, which at the UI level, etc.)?
•What tests can only be run manually due to complexity?
•Where do we expect to find the most issues?
•What parts need exploratory testing?
•Does anyone have an example of questions they might ask?
Some questions to ask during the discussion
14. Document the Scenario Details
• List your high-level test scenarios
• Should include individual unit-level components (easy to
get buy-in?)
• Should include functional tests/E2E scenarios
• Identify whether you will manually test or automate the test
(unit implies automation)
• Are we manually testing due to complexity/challenges?
• Summarize why you are manually testing a specific
scenario(s)
• Are we automating for ease of use and repeatability?
Start closing gaps and clarifying what is to be tested; a sketch of one possible format follows
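As a hedged illustration only (the scenarios, field names, and values below are hypothetical, not a template from this presentation), the scenario details might be captured in a lightweight structure that records the level, the manual/automate decision, and the summarized reason:

```python
# Hypothetical sketch of a scenario inventory for the sprint test plan.
# Scenario names, levels, and reasons are illustrative only.
scenarios = [
    {"scenario": "Premium calculation for a single-vehicle policy",
     "level": "unit",          # unit implies automation
     "approach": "automate",
     "reason": "Stable, repeatable calculation logic"},
    {"scenario": "Quote-to-issue end-to-end flow",
     "level": "E2E",
     "approach": "automate",
     "reason": "High reuse across sprints; regression candidate"},
    {"scenario": "Layout of the revised declarations page",
     "level": "functional",
     "approach": "manual",
     "reason": "Visual verification too complex to automate this sprint"},
]

# Quick summary for the plan: what is automated vs. manual, and why.
for s in scenarios:
    print(f"{s['approach']:8} | {s['level']:10} | {s['scenario']} ({s['reason']})")
```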
15. Automation/Manual Testing
•Add a summary within your plan noting why you chose to
test manually
•Favor automation first and foremost whenever possible
•Consider this automation decision criteria:
•Reuse
•Complexity
•Overlap
•Information Value
•Stability and Coupling
•Dependencies
Summary Format
16. Out of Scope Items
• What won’t be tested
• Why you won’t test it (for example, a specific function that isn’t
changing)
• If items aren’t listed, you can end up with scope creep and scrambling
at the end to test things that were never identified
Table Format
17. Dependencies and/or Assumptions
•Is there something the team needs to be successful in
accomplishing the testing or obtaining acceptance?
•For example:
•Are there multiple pods/teams developing, and do you need
another part of the solution in order to validate?
•Needing an updated software version is a good example
•Needing test data from another area
•Maybe you are dependent on a team outside of your
space to provide a service or support
•Would anyone like to share an example of a dependency?
•What about an example of an assumption?
List/Table Format:
Dependency – I need a test environment
Assumption – I assume the environment will be stable
18. Test Data
•Call out specific types of data you need in order to complete
the testing in the sprint for the scenarios
•Mocked data would be called out here if being used
Summary Format
Technical Debt
Summary Format
•Indicate anything in this section that will need to be updated or
changed later such as:
•Stubbing an interface
•Mocking test data
Note: Choose technical debt wisely. Postponing items until later can become costly.
If you manually test to get done now, will you ever have time to automate it later?
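As a hedged sketch of what the stubbing and mocked test data called out above could look like in practice (the rating service, its rate() call, and quote_premium() are hypothetical stand-ins, not something from this presentation):

```python
# Hypothetical sketch: stub an external rating service that is not yet
# available so the story's functional test can still run this sprint.
# Removing the stub later is the technical debt to capture and plan for.
from unittest.mock import Mock

def quote_premium(rating_client, policy):
    """Logic under test: applies a flat 5% discount to the rated premium."""
    base = rating_client.rate(policy)        # the external call being stubbed
    return round(base * 0.95, 2)

def test_quote_premium_with_stubbed_rating_service():
    stub_client = Mock()
    stub_client.rate.return_value = 1000.00  # mocked test data

    assert quote_premium(stub_client, {"vehicles": 1}) == 950.00
    stub_client.rate.assert_called_once()
```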
19. Acceptance Criteria
Table Format
•Input under Document Control History
•Sign-off of this sprint plan indicates acceptance of the plan.
Each sprint will have acceptance defined by the team
(including the BA or Product Owner, Tech Lead, and QA).
•Things the team should consider when signing off: have we
identified the right scenarios, and have we created the appropriate
test coverage to satisfy acceptance? This document should
be part of the conversation and included in acceptance.
•Securing approval helps to ensure alignment and increase
effectiveness of acceptance.
20. Acceptance Criteria
• Microsoft Press defines Acceptance Criteria as “Conditions that a software product must
satisfy to be accepted by a user, customer or other stakeholder.” Google defines them as
“Pre-established standards or requirements a product or project must meet.”
• Acceptance Criteria are a set of statements, each with a clear pass/fail result, that specify
both functional (e.g., minimal marketable functionality) and non-functional (e.g., minimal
quality) requirements applicable at the current stage of project integration. These
requirements represent “conditions of satisfaction.” There is no partial acceptance: either a
criterion is met or it is not.
• These criteria define the boundaries and parameters of a User Story/feature and determine
when a story is completed and working as expected. They add certainty to what the team is
building.
• Acceptance Criteria must be expressed clearly, in simple language the customer would use,
just like the User Story, without ambiguity as to what the expected outcome is: what is
acceptable and what is not acceptable. They must be testable: easily translated into one or
more manual/automated test cases.
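For example, a hypothetical acceptance criterion such as “a registered user can log in with valid credentials and is rejected with invalid credentials” (not a criterion from this deck) translates directly into pass/fail test cases along these lines:

```python
# Hypothetical sketch: one acceptance criterion expressed as two automated
# checks, each with a clear pass/fail result. login() is a self-contained
# stand-in for the real system under test.
def login(username, password):
    valid_users = {"casey": "correct-horse"}
    return valid_users.get(username) == password

def test_registered_user_can_log_in_with_valid_credentials():
    assert login("casey", "correct-horse") is True

def test_login_is_rejected_with_invalid_credentials():
    assert login("casey", "wrong-password") is False
```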
21. Phase 2: Story/Feature Verification
•Start your test execution based on the scenarios and
readiness of code
•Refer back to your plan
•Team will be creating unit tests
•Testing will be staggered based on development
•Write automation tests
•Update test data sets
•Conduct manual validation
•Create stubs
•Ensure the features are working
•Validate component interactions
•Discover defects and correct
Start test execution
22. Phase 3: System Verification
•Wrap up any end-to-end testing of features
•Run regression to validate prior features are not broken
•Ensure regression sets are updated if needed
•Remove stubbing if possible
•Continue to correct and fix defects (critical)
•Update the plan with any additions or changes and socialize them
with the team
Shift focus to prepare for acceptance
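One common way to keep that regression and smoke subset selectable is to tag tests; the sketch below uses pytest markers as an assumed approach (the markers would be registered in pytest.ini, and the discount logic is a hypothetical stand-in):

```python
# Hypothetical sketch: tag tests so a regression or smoke subset can be run
# on demand, e.g. `pytest -m smoke` or `pytest -m regression`.
import pytest

def apply_discount(base):
    # Stand-in for business logic delivered in a prior sprint.
    return round(base * 0.95, 2)

@pytest.mark.smoke
def test_discount_smoke_check():
    assert apply_discount(1000.00) == 950.00

@pytest.mark.regression
def test_prior_sprint_discount_unchanged():
    assert apply_discount(200.00) == 190.00
```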
23. Phase 4: Acceptance
Shift focus to finalizing tests and securing acceptance
•Agree on which defects will not make the release
•Conduct exploratory testing once the code/system is stable
•Run a final set of regression or smoke tests
•Conduct the demo or show & tells using the test plan as a
guide
24. Retrospective, Feedback and
Change
•Team should provide feedback on the plan’s effectiveness
•Discuss challenges with features or components so they can be
considered if there are changes around that feature in the
future
Reflection
25. Close the Loop - Technical Debt
•Ensure you capture it in a ticketing system
•Make sure you plan for technical debt in future sprints and
work it
•Groom this debt along with any other debt on the project
•Don’t let it sit too long; the longer it sits, the harder it is to work
it in
26. Conclusion
• Quality First
• Requirements and design work
• Create a plan
• Secure acceptance of the plan
• Use the plan as a guideline for demos to the product owner
• Seek feedback on the effectiveness of the plan and make
improvements
• Ensure you track your technical debt and understand costs
around postponing something in the sprint
28. QUESTIONS?
Contact Centric
To learn more about Centric Consulting Solutions:
CentricConsulting.com
Presenter’s Name: Michelle Williams
Email: michelle.williams@centricconsulting.com
Phone: 614-886-9930
Speaker notes
Factors to Consider
Reuse
Tests that will be run repeatedly should be considered for automation. Typically, the threshold is 7 executions; if a test is going to be executed fewer than 7 times, it generally is not a candidate for automation.
Complexity
Complex tests are generally multiplicatively more expensive than simpler tests, both in terms of initial establishment and ongoing maintenance.
Overlap
Tests that effectively overlap other tests, e.g. multiple negative condition boundary tests, should not be automated unless there is enough incremental value in doing so.
Information Value
The test should provide value in the feedback of both passing and failing. For example, a passing test around processing an eCommerce monetary transaction provides business-actionable information: we can collect money. Conversely, the same test failing shows the business can’t collect money. That test is a candidate for consideration. As a second example, consider a test that includes actions from adding an item to a cart through payment processing. A failure to add to cart represents actionable information. However, the act of adding an item to the cart is only implicitly tested, and therefore a pass does not necessarily mean the function works. In this case, the test should be re-designed to provide information on both passing and failing, as in the sketch that follows. A test that only provides feedback in one or the other is likely incomplete and/or unnecessary.
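A hedged sketch of that redesign (the cart and its methods are hypothetical stand-ins): asserting the add-to-cart step explicitly means both a pass and a fail carry actionable information, instead of the add being exercised only implicitly on the way to checkout.

```python
# Hypothetical sketch: make the add-to-cart step an explicit assertion so the
# test reports actionable information whether it passes or fails.
class Cart:
    def __init__(self):
        self.items = []

    def add_item(self, sku, price):
        self.items.append((sku, price))

    def checkout(self):
        return sum(price for _, price in self.items)

def test_add_to_cart_then_checkout():
    cart = Cart()

    cart.add_item("sku-123", 25.00)
    # Explicit check: a failure here points directly at add-to-cart rather
    # than surfacing later as an ambiguous payment/checkout failure.
    assert ("sku-123", 25.00) in cart.items

    assert cart.checkout() == 25.00
```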
Stability and Coupling
The test needs to be flexible enough to withstand some modifications to the system under test without having to refactor a test or suite of tests. Tests should not be dependent on the configuration or environment of the target system. At times this may mean a deferral of automation efforts.
Dependencies
Tests and test suites should not be constructed with dependencies. Consider re-designing tests to avoid dependencies before considering for automation.
Ever have those items that start to bleed outside of what you thought was the scope of the story?
Maybe you didn’t remember to discuss them with the team
Maybe you didn’t realize them until later
Best to document and review
One is true; one is thought to be true. One is a constraint; one is a helper for making a decision.
I find that the technical debt in projects, especially Agile, gets lost very easily or buried in a Jira ticket you never come back to
Make sure you create cards to track it (even though it’s here); this is point-in-time socialization
Documenting it allows it to be visible when planning future work