Full-Day Tutorial
9/30/2013 8:30:00 AM

"Getting Started with Risk-Based Testing"
Presented by:
Dale Perry
Software Quality Engineering

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Dale Perry
Software Quality Engineering
Dale Perry has more than thirty-six years of experience in information technology as a
programmer/analyst, database administrator, project manager, development manager, tester,
and test manager. Dale’s project experience includes large-system development and
conversions, distributed systems, and both web-based and client/server applications. A
professional instructor for more than twenty years, he has presented at numerous industry
conferences on development and testing.
Dale L. Perry

SQE Training
Dperry@SQE.com

FUNDAMENTALS OF RISK-BASED TESTING


©2013 SQE Training - STAR West 2013

©2013 SQE Training - STAR East 2013

Table of Contents - Agenda
1. Understanding risk
2. Risk and the testing process
3. Test planning (overview)
4. Test creation (analysis and design)
5. Test execution, reassessing risk, and reporting
1. UNDERSTANDING RISK

Risk – A General Definition

Risk
• A factor that could result in future negative consequences; usually expressed as impact and likelihood

Risk-based testing
• An approach to testing intended to reduce the level of product risks and inform stakeholders of their status, starting in the initial stages of a project. It involves the identification of product risks and their use in guiding the test process
Categories of Risk

Process risk
• Focus is on procedural issues, both managerial and technical

Project risk
• Typically includes operational, organizational, and contractual issues

Product risk
• Typically relates to technical issues and depends on the type of product being developed
Categories of Risk (cont.)
There are many approaches to
categorizing risk
Capers Jones documented sixty
different risk factors
• This list provides an extensive look at
many different areas that represent risk
within an organization, project and
software (process, project and product)
Understanding Process Risks
When we look at process risk, we focus on both management and technical procedures

Management processes that relate to risk include:
• Planning and staffing
• Tracking
• Quality assurance
• Configuration management

Technical processes that relate to risk include:
• Requirements analysis
• Design and code
• Testing
Understanding Project Risks
Project risks can originate from several areas
• External sources, such as supplier issues
• Issues related to the organization
  – Of the project
  – Of the team
  – Of the "company" itself (politics)
• Issues related to technical problems
• Documentation
Project Risks – Organizational Factors
Skill, training and staff shortages
Personnel issues
Political issues, such as:
• Problems with testers communicating their needs and test results
• Failure by the team to follow up on information found in testing
and reviews (e.g., not improving development and testing
practices)

Improper attitude toward or expectations of
testing
• (e.g., not appreciating the value of finding defects during testing)

Project Risk – Technical/Supplier Issues
Technical issues
• Problems in defining the right requirements
• The extent to which requirements cannot be
met given existing constraints
• Test environment not ready on time

Supplier issues
• Failure of a third party
• Contractual issues

Project Risk ― External Sources
Delivery date
Acquisition plan

Budget
Scope of test
Use of automation
Product Risks
Potential failure areas (adverse future events
or hazards) in the software or system are
known as product risks

They comprise a risk to the quality of the
product and may include:
• Delivering failure-prone software
• The potential that the software/hardware could
cause harm to an individual or company
• Poor software characteristics (e.g. functionality,
reliability, usability and performance)
• Software that does not meet specifications (perform
its intended functions)


Examples of Product Risk Areas
Some areas of product risk:
• Scalability
• Functional complexity
• Scope of use
• Performance
• Environment
• Safety
• Security
• Data selection
• Reliability
• Data integrity
• Usability
• Recoverability
• Interface complexity
• New technology
• Technical complexity
Attitudes Toward Risk
Three basic attitudes toward risk
Risk Averse
• Practical, dependable, focus on facts

Risk Seeker
• Speculative, adventurous, adaptable, looking
for new things

Risk Neutral
• Focus on the future, neither overly secure nor
speculative

Stakeholder Viewpoints on Software
Stakeholder viewpoint, along with attitude toward risk, affects our assessments
There are at least four differing views of software and software-related risks:
• Management
• Marketing
• Developer/technician
• Client/customer
Understanding Risk Management

Risk management
• The systematic application of procedures and practices to the tasks of identifying, analyzing, prioritizing, and controlling risk

The risk management cycle has three parts:
• Risk identification
• Risk analysis and prioritization
• Risk mitigation (risk control)

Risk Identification
There are many approaches and
methods for identifying and qualifying
risks
The method selected will depend, to a
large degree, on the type of system
or application being tested
• Some types of systems contain greater risks
(safety critical, mission critical, etc.)
Risk Identification Techniques
Expert interviews
Independent assessments
Use of risk templates
Lessons learned
• Metrics and data from past projects
• Develop a risk database or repository

Risk Identification Techniques (cont.)
Risk workshops
Brainstorming

Checklists
Calling on past experience
Formal analysis methods
• Failure modes and effects analysis (FMEA)
• Hazard analysis
• Cost of failure
• Others
Applying Risk Management
The attitudes, viewpoints, and focus of the individuals involved will affect risk management; each process is affected by the other processes:
• Risk identification
• Risk analysis
• Risk control (mitigation)

2. RISK AND THE TESTING PROCESS
Key Elements of Risk-Driven Testing
Using testing to influence and direct
software design and development
Testing starts at the beginning of the
lifecycle
Tests are used as requirements and
usage models
• Testware design leads software design

Key Elements of Risk-Driven Testing (cont.)
Defects are detected earlier or
prevented altogether
• Defects are systematically analyzed

Testers and developers work together
Implicit concurrent engineering of
testware and software

Testing and Development
To test effectively, testing and development
must be managed together
• Testing must become an integral part of the
development lifecycle
• There are many models used to develop software

For each model, testing will have to adjust
to the process in order to be successful
• The only difference is where and how to test


Lifecycle Models (Examples and Types)

Sequential
• Waterfall models
• "V" model
• Cleanroom® Engineering
• "W" model

Incremental
• Spiral models
• RAD (Rapid Application Development)
• RUP (Rational Unified Process)

Iterative
• Rapid prototyping
• Extreme programming (XP)
• Scrum (not an acronym)
• DSDM (Dynamic Systems Development Method)
The Testing Process
Test planning
• Project → master test plan → detail/level plans
Test creation
• Analyze → design/implement
Test execution
• Execute → report

Test Planning
Within a project there may be several levels of testing and test plans:
• Master test plan
• Acceptance test plan
• System test plan
• Build test plan
• Component test plan
• Others
Test Creation (Testware)
• Test analysis → inventories
• Test design → test specifications
• Test implementation → test data and procedures

Test Execution and Reporting
Tests and incident
logs
• Failures and anomalies

Checking adequacy
• Decision to stop testing

Evaluating processes
• Generate summary report

An Effective Testing Process
Focuses on preventive measures
Is lifecycle-based
• Preventing a defect/fault is cheaper and more effective than finding and fixing it later in the process
Can be implemented incrementally in an organization
• You don't have to change everything at once

3. TEST PLANNING OVERVIEW
Test Planning - Overview
Planning is a critical part of the testing process
• It is introduced here as part of the overall testing process; however, we will not go into great detail
• Test planning is covered in greater detail in the tutorial on Test Management and Planning

Test Planning - Strategic Issues
• Scope of the testing
  – Breakdown of the testing
  – Levels or other approaches
• Software risks
  – Both technical and requirements based
• Planning risks
  – Budget
  – Resources (skills, time, budget/money, people)
  – Dependencies
• Test environment
  – Complexity
  – Accuracy
• Testing tools
• Test requirements
  – Objects and goals
• Configuration management
• Relationships between other organizations
• Measurements/metrics
• Software life-cycle in use for the project
• Documentation
• Need for regression
• Schedule
• Staffing
  – Internal
  – Third party
  – Consultants
4. TEST CREATION (ANALYSIS AND DESIGN)

Focus of Testing
Testing has positive, negative, and other focuses
• Positive: does the software do what it's supposed to do?
• Negative: are there unexpected results or events?
• Other…
Approaches to Testing
• Static testing - prevent problems when applied early in the process
• Dynamic testing - detect problems once the software is developed
• Learning - explore and learn about the product

Complete/Exhaustive Testing Is Impossible
No amount of testing will:
• Prevent all defects
• Detect all defects
• Prove a product is defect-free
Complete/Exhaustive Testing ― Example
There are too many possibilities to completely test a system
• For example, consider a single screen (GUI) or data stream with thirteen variables, each with three values
• Testing every possible combination means 3^13 = 1,594,323 tests
• Plus testing of interfaces, etc.
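The arithmetic behind this example is easy to verify. A minimal sketch, using the slide's hypothetical screen of thirteen variables with three values each:

```python
from itertools import product

# Hypothetical screen from the example: 13 variables, 3 values each.
values_per_variable = 3
num_variables = 13

# Exhaustive testing means every combination -- it grows exponentially.
total = values_per_variable ** num_variables
print(total)  # 1594323

# Enumerating them all is infeasible in practice; risk analysis is what
# tells us which small subset of these is worth running.
first = next(product(range(values_per_variable), repeat=num_variables))
print(len(first))  # 13 -- each single test fixes a value for all 13 variables
```

Even at one test per second, running all 1,594,323 combinations would take over eighteen days, before any interface testing.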


Focusing the Testing
A primary goal in testing is to have the
critical features developed first
• Focus the testing effort where the potential risks
exist within the software
• Knowledge of risks guides selection of what and
where to test

The Test Inventory Process
A model designed for test analysis within a project
• Applicable to virtually all development lifecycle models

Designed to interact with the development processes used within a project, focusing on key issues:
• Understanding testing issues and risks
• Determining the scope of testing
• Improving estimates and budgets

Inventory Activities
Gather the reference
materials
Form a team
Select the test objects and
conditions
Create an inventory
Prioritize the objects
Inventory Activities (cont.)
These activities can be incorporated
into existing development activities
within the project
• At the beginning of every project there are
activities during which the software to be
developed is outlined or defined
• Requirements activities
• Architectural activities
• Detail design and coding activities

• All of these can incorporate the inventory concept

Gather Reference Materials
Inputs to the inventory include:
• User requests
• Requirements, stories, use cases
• Release information
• Architecture, system designs
• Detail designs
A Well-Balanced Team
The inventory process draws on:
• Analysts, architects
• Testers/QA
• Developers
• Users/clients
• Marketing
• Management

Determining Objects and Conditions
The goal is to gather as much information as early as possible about the testing goals, objects, and conditions
• This enables earlier test design decisions, locates potential problems sooner, and possibly prevents them from occurring
What Is a Test Object/Condition?
Any aspect of the software, its execution
environment, its usage patterns, etc.,
that should be covered by the testing
Basic functions or methods
Configurations
Usability
Interfaces
etc.

Identifying Test Objects/Conditions
The inventory process is an incremental,
iterative process starting at the requirements

Requirements (specifications,
models, etc.)
Architecture (high level
design)
Detail design
What Is an Inventory?
An inventory is a list or index of test objects
It is a detailed answer to the question of what should (must) be tested

Example:
1. Transactions
• Add
• Change
• Delete
2. Screens
• Daily
• Weekly
• Monthly
3. Reports
4. Constraints
5. Security
6. etc.
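One lightweight way to hold such an inventory is a plain mapping from category to test objects. This sketch uses the slide's example entries; the structure is illustrative, not a prescribed format:

```python
# Inventory of test objects, keyed by category (from the slide's example).
inventory = {
    "Transactions": ["Add", "Change", "Delete"],
    "Screens": ["Daily", "Weekly", "Monthly"],
    "Reports": [],
    "Constraints": [],
    "Security": [],
}

# The inventory answers "what must be tested": flatten it to a checklist,
# keeping categories with no detail yet as bare entries.
checklist = [
    f"{category}: {item}"
    for category, items in inventory.items()
    for item in items
] + [category for category, items in inventory.items() if not items]

print(checklist[0])   # Transactions: Add
print(checklist[-1])  # Security
```

Keeping the inventory as data makes it easy to expand or change as the project evolves, as the next slide describes.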

Inventory Process ― Flexibility
Inventories can be created easily during the
product risk identification and assessment
process
• They can be expanded or changed as needed

Inventories help identify both functional and
non-functional and structural areas to test
• This improves the quality and effectiveness of the
testing and helps avoid problems later in the process

Applying the Inventory Process
There are some common aspects of
applications/systems that can be used as a
starting point (model) in determining the
test objects/conditions
• Even if there are no formal requirements specifications

The more detailed the initial information, the
more accurate the inventory of test
objects/conditions, and the better the
understanding of risk

Applying the Inventory Process (cont.)
As noted earlier, building an inventory is an incremental process
It can be applied at the project level or at an individual level (system, unit, etc.)
Start with a global view
• A broader view can help put elements into proper perspective
• "You need to see the forest before the trees"
Building the Inventory – Requirements
While creating an inventory of test objects from the requirements, use cases, stories, etc., you generate two key elements:
• The inventory of test objects
• A list of issues and questions
  – Potential errors and defects

General Requirements-based Objects/Conditions
Features, functions, and methods
Constraints (limits on application, environment,
etc.)

Inputs and outputs
Behavior and business rules
Interfaces
Configurations (hardware, software, etc.)
Scenarios
Other elements…
Design Analysis ― Inputs
Requirements-based inventories

Questions and issues from requirements
can be used to verify design

Items remaining unresolved may result
in major rework later if left open


Building the Inventory – Architecture
The requirements, use cases, and stories flow into the high-level design (architecture) and the detail-level design; each stage feeds the inventory of test objects and the list of issues and questions
• The process can be continued as long as required to ensure accurate information
Design Analysis ― Steps
Study the software design
Request clarification
Investigate with sample test cases
Request design enhancements
Inventory and prioritize design-based
objects
Review and revise the inventory

General Design-based Objects/Conditions
Detailed inputs and outputs
System and design states
Process paths
Data and communications paths
Design limits and exceptions
etc.
Investigating Using Tests
Consider a credit card management
application with the following capabilities
• Add, change, delete and inquire on customers
• Add, change, delete and inquire on credit cards

Consider the following scenarios
• Update customer address, inquire on same customer
to validate change and inquire on card balance
• Inquire on customer, delete customer (customer has
multiple cards on same account), delete credit card


Investigating Using Tests (cont.)
Using the two scenarios on the previous page and the design diagram on the next page:
• Do you see any problems in the design?
• If so, what are they?
• What might happen if they are not corrected?
Investigating Using Tests (cont.)
The design is a menu hierarchy:
• DISPLAY Main Menu
  – DISPLAY Customer Menu: ADD Customer, CHANGE Customer, DELETE Customer, Inquire/Print
  – DISPLAY Card Menu: ADD Card, CHANGE Card, DELETE Card, Inquire/Print

Prioritize the Objects/Conditions
Using our list of test objects and conditions, we must determine which elements are critical and which are not

To do this we need to understand risk analysis
Risk Analysis ― Purpose
Focus the software design and test/evaluation activities on those elements/functions that are:
• Critical to the mission of the organization
• Essential to the physical or economic well-being of users and their environment

Using Risk Analysis
Risk analysis is the study of the identified risks
• Categorizing each risk
• Determining the likelihood and impact

Each identified risk should be placed into an appropriate class
• Classes can be internally defined or may be drawn from a standard such as ISO 9126
• The use of checklists can combine the identification and classification processes
Risk Analysis ― Characteristics of Risk
Each risk is rated on two characteristics:
• Likelihood: High, Medium, Low
• Impact: High, Medium, Low

Factors Influencing Likelihood
Many technical factors influence risk
potential and vary from project to project
and team to team within a project
Areas to investigate may include:
Complexity of technology and teams
Personnel and training issues
Conflict within the team or between teams
Contractual problems (vendors, suppliers,
contractors, etc.)
Factors Influencing Likelihood (cont.)
Areas to investigate (continued)
Geographical separation of the team
Legacy versus new ideas and approaches
Use of tools and new technologies
Lack of managerial or technical leadership
Pressures on time, resources and management
Lack of earlier quality assurance
• No focus on quality at early levels of testing


Factors Influencing Likelihood (cont.)
Areas to investigate (continued)
A high frequency of change requests
Poor early quality (high defect rates)
Issues with interfacing and integrating software

Factors Influencing Impact
Risk impact comes down to the impact on
the customer or user of the software under
test
Influencing factors may include:
Frequently-used features
• If it’s not available or corrupt, it will have adverse impacts

Impact on the corporate or system image
• Visibility of failure may lead to negative publicity


Factors Influencing Impact (cont.)
Influencing factors (continued):
• Potential loss of business
• Potential financial, ecological, or even social losses
• Legal liabilities (civil or criminal)
• Potential loss of an operational license
• Work stoppages due to a lack of reasonable workarounds
Approach to Risk Analysis

Quantitative
• Assign a value to each aspect (likelihood, impact) and multiply the two values together to calculate the cost of exposure: the expected loss associated with a particular risk

Qualitative
• Use categories such as very high, high, medium, low, or very low

Approach to Risk Analysis (cont.)
Most organizations use a combination of both methods
• Qualitative helps express risk in terms a person can understand
• Quantitative is then used in conjunction with the qualitative categories to create a matrix, with each risk given a weighted priority
Initial Risk Priority ― A Risk Model
Rate each risk on both scales:
• Likelihood: High (3), Medium (2), Low (1)
• Impact: High (3), Medium (2), Low (1)

Impact × likelihood = potential loss ($)
(helps set the test priority)

Adjusting the Assessment
What if we feel one factor is of greater importance? Weight its scale more heavily:
• Likelihood: High (3), Medium (2), Low (1)
• Impact: High (5), Medium (3), Low (2)

Impact × likelihood = potential loss ($)
(helps set the test priority)
Example ― Risk Analysis

Test Object        Likelihood   Impact   Priority
Client update      L            M        LM=2
Client delete      M            L        ML=2
Client add         H            H        HH=9
Client inquiry     L            L        LL=1
Status report      L            L        LL=1
Performance        M            H        MH=6
Monthly report     L            L        LL=1
Sales report       M            L        ML=2
Calc. sales com.   M            M        MM=4
Mkt. report        M            M        MM=4
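Computing the Priority column is mechanical once the qualitative ratings are mapped to numbers (H=3, M=2, L=1, as in the risk model). A sketch using a few of the table's rows:

```python
# Qualitative-to-quantitative mapping used in the example: H=3, M=2, L=1.
SCALE = {"H": 3, "M": 2, "L": 1}

rows = [
    ("Client add", "H", "H"),
    ("Performance", "M", "H"),
    ("Calc. sales com.", "M", "M"),
    ("Client update", "L", "M"),
    ("Client inquiry", "L", "L"),
]

# Priority = likelihood value * impact value (the cost-of-exposure idea).
priorities = {name: SCALE[l] * SCALE[i] for name, l, i in rows}
print(priorities["Client add"])   # 9  (HH)
print(priorities["Performance"])  # 6  (MH)

# Sorting by priority yields a prioritized risk inventory.
ranked = sorted(priorities, key=priorities.get, reverse=True)
print(ranked[0])  # Client add
```

The same computation with the adjusted scale (impact H=5, M=3, L=2) would weight impact more heavily, as described earlier.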


Risk Inventory Prioritized

Test Object        Likelihood   Impact   Priority
Client add         H            H        HH=9
Performance        M            H        MH=6
Calc. sales com.   M            M        MM=4
Mkt. report        M            M        MM=4
Client update      L            M        LM=2
Client delete      M            L        ML=2
Sales report       M            L        ML=2
Client inquiry     L            L        LL=1
Status report      L            L        LL=1
Monthly report     L            L        LL=1

Planning for Contingencies

Test Object        Test Case Number          Priority
Client add         23, 45, 47, 48, 10, 12…   9
Performance        15, 22, 31, 12…           6
Calc. sales com.   52, 53, 101, 17, 19…      4
Mkt. report        4, 7, 33, 37, 42…         4
Client update      57, 58, 41, 9, 8…         2
Client delete      62, 110, 82, 63…          2
Sales report       13, 16, 77, 68…           2
Client inquiry     19, 3, 5, 102…            1
Status report      73, 64, 56, 7…            1
Monthly report     74, 98, 92…               1
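With test cases tied to priorities, contingency planning becomes a cutoff decision: if the schedule is cut, drop the lowest-priority sets first. A sketch using a subset of the table's objects (the case-number lists are abbreviated as shown there):

```python
# Test objects -> (priority, test case numbers), subset of the table above.
test_sets = {
    "Client add": (9, [23, 45, 47, 48, 10, 12]),
    "Performance": (6, [15, 22, 31, 12]),
    "Calc. sales com.": (4, [52, 53, 101, 17, 19]),
    "Client update": (2, [57, 58, 41, 9, 8]),
    "Status report": (1, [73, 64, 56, 7]),
}

def in_scope(min_priority: int) -> list[str]:
    """Objects still tested when only priorities >= min_priority fit."""
    return [name for name, (p, _) in test_sets.items() if p >= min_priority]

print(len(in_scope(1)))  # 5 -- full scope
print(in_scope(4))       # ['Client add', 'Performance', 'Calc. sales com.']
```

Raising the cutoff trades coverage for schedule in a controlled, visible way instead of dropping tests arbitrarily.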


Risk Control – Scope of Testing
Once we have prioritized the list of test objects/conditions, the next issue we have to address is the scope of testing for each element: how will we control the risk?
• How much testing is required for each object?
• What types of tests can be created?
• This will be affected by the development model employed and the completeness of the information available
Approaches to Test Design
Depending on the goals of the testing
there are several ways to approach the
design process
• Formal and informal approaches

The approach chosen depends on many
factors
• Type of development process
• Types of defects anticipated
• Risk issues to be addressed
• Other factors

Formal Test Design
Test design using defined document
formats and analytical methods
Static methods
• (Inspections, walkthroughs, etc.)

Formal test techniques and
methods
• Equivalence partitioning and boundary
testing
• Pair-based methods (orthogonal and
combinatorial)
• Decision tables and state diagrams
• etc.
Formal Test Techniques and Risk
Different test techniques are good at identifying
and resolving certain types of problems (risks)
Static methods (reviews)
Help eliminate errors and mistakes at their
source
Find things that are unclear, mismatched,
vague, ambiguous, and possibly un-testable

Identify overlooked and missing elements
early in the process

Selecting a Technique (cont.)
Equivalence partitions
Help define and clarify requirements
Provide a solid basis for test condition selection

Boundary value testing
Improves the focus of test conditions selected
using equivalence partitions
Helps identify and eliminate some of the most
common mistakes made in coding
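Equivalence partitioning plus boundary value analysis can be made concrete with a small sketch. The 1–100 range here is a hypothetical rule, not from the tutorial:

```python
# Hypothetical requirement: a field accepts integers from 1 to 100 inclusive.
LOW, HIGH = 1, 100

def is_valid(n: int) -> bool:
    # One valid partition [1, 100]; two invalid partitions outside it.
    return LOW <= n <= HIGH

# Boundary value analysis picks values at and just beyond each partition
# edge, where off-by-one coding mistakes cluster.
cases = [LOW - 1, LOW, LOW + 1, HIGH - 1, HIGH, HIGH + 1]
print([(n, is_valid(n)) for n in cases])
# [(0, False), (1, True), (2, True), (99, True), (100, True), (101, False)]
```

Six targeted cases replace testing all hundred-plus values while still exercising every partition and every edge.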

Test Techniques and Risk (cont.)
Decision tables
Help simplify complex business rules
Identify incomplete and contradictory logic in
rules
Help identify elements in each rule that must be
tested

State diagrams
Help identify critical sequences of events and
those events that can have a significant impact on
the system or application
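A decision table can be held as data and checked for the completeness the text mentions. A sketch with hypothetical discount rules (not from the tutorial):

```python
from itertools import product

# Hypothetical business rules: (is_member, order_over_100) -> discount %.
rules = {
    (True, True): 15,
    (True, False): 10,
    (False, True): 5,
    (False, False): 0,
}

# Completeness check: every combination of condition values must have a
# rule -- this is exactly the gap decision-table analysis exposes.
all_combos = set(product([True, False], repeat=2))
assert all_combos == set(rules), "incomplete decision table"

def discount(is_member: bool, over_100: bool) -> int:
    return rules[(is_member, over_100)]

print(discount(True, False))  # 10
```

Each rule in the table then becomes at least one test case, as the slide notes.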


Test Techniques and Risk (cont.)
Pair-based methods (orthogonal arrays
and combinatorial analysis)
Handle large amounts of interacting values where there
are limited or no dependencies between elements
• Decision tables are better at complex relationships

Informal methods
Help identify errors and mistakes caused by human
activity
Scientific techniques deal with logic very well but the
real world tends towards chaos
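The pairwise idea can be made concrete by counting: exhaustive testing covers every full combination, while pairwise methods only require every value pair, across every pair of parameters, to appear somewhere. A sketch with hypothetical parameters:

```python
from itertools import combinations, product

# Hypothetical interacting parameters with no dependencies between them.
params = {
    "browser": ["Chrome", "Firefox"],
    "os": ["Windows", "Linux", "macOS"],
    "locale": ["en", "fr"],
}

# Exhaustive: every full combination of all parameter values.
exhaustive = list(product(*params.values()))
print(len(exhaustive))  # 2 * 3 * 2 = 12 tests

# Pairwise: every value pair across every pair of parameters must appear
# in at least one test -- far fewer constraints to satisfy.
required = {
    ((p1, v1), (p2, v2))
    for (p1, v1s), (p2, v2s) in combinations(params.items(), 2)
    for v1, v2 in product(v1s, v2s)
}
print(len(required))  # 16 pairs; a pairwise tool can cover them in ~6 tests
```

The gap widens rapidly with more parameters, which is why these methods matter for large sets of interacting values.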
Informal Test Design Approach
Test analysis using the skills and
knowledge of the test team
Exploratory testing
Error guessing
Fault attacks
Defect-based (defect and failure history)


Selecting the Test Approach
If the information available is relatively
complete, formal methods are a good option
Sequential models tend to focus on complete
requirements as early as possible
• At least that is the intent

Informal methods can be used to supplement the more
formal techniques
Initial test design can be more rigorous and the
documentation can be more detailed
• Care must be taken not to overdo the documentation aspect of testing

Selecting the Test Approach (cont.)
If the information is incomplete or
intended to evolve, the use of informal
methods may be the best initial choice
supplemented by formal techniques
In iterative and some incremental models, test
design, like the software itself, must be
evolutionary
• Initial tests will be high level and subject to constant change
• The formality of test design and documentation must be
flexible


Selecting the Test Approach (cont.)
The nature of the test
object/condition and its level of
risk will affect the depth and
breadth of the test design
• The more flexible the tests must be due to
partial information, the more time it will take
to complete the testing
• Increases risk that something will go untested

Test Approaches and Risk
Tests that have to evolve with the software require testers with a greater depth of skill and knowledge
• Methods like exploratory testing rely on the skills and
knowledge of the testers
• An in-depth set of mental models (both formal and
informal) is essential to success

• Documentation and the repeatability of the tests
become more problematic
• Enough information has to be retained to allow someone
other than the original tester to repeat the test


Completing the Test Design Process
Organize the test objects and conditions

Document the test design decisions
• How much documentation is required, and to what level of detail, is itself a type of risk
Organize the Test Objects
Select which objects and conditions
will be tested as a logical set at
which stage/level of testing
Define the test sets for the grouped
test objects and conditions
• Select a design strategy (formal, informal)
• Document the test architecture and design
decisions


Example Test Set Definition
A test set for a screen groups test cases by area, for example:
• Input (text, fields)
• Output
• Navigation (internal, external)
• Security
• Other
Test Creation - Summary
There are many test objects/conditions
to be considered
• Different types of projects have
different sources but all have
requirements, designs, etc.
The inventoried objects provide a
means for maintaining, evaluating and
updating test requirements


Implementing the Test Design
• All tests require that the proper data and environment be in place prior to test execution
Implementation Risk Areas
• Environment
• Data
• Software

Key Data Source Characteristics
Production Generated

Captured

Created

Volume

Too Much

Controllable

Controllable

Too Little

Variety

Mediocre

Varies

Varies

Good

Acquisition

Easy

Varies

Fairly Easy

Difficult

Validation
(Calibration)

Hard

Hard

Fairly Hard

Easy

Varies

Usually Easy

Varies

Easy

Change

• Acquiring the necessary data may be the most
time-consuming part of test development
– Requires careful planning and the right tools
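The "Created" column above has the key advantages: volume is fully controllable and variety is good. A minimal sketch of generating such data in Python (the field names and values are hypothetical, not from the tutorial):

```python
from itertools import cycle

def created_data(fields, n_rows):
    """Generate 'created' test data: volume (n_rows) is controllable,
    and cycling through each field's values guarantees good variety."""
    iters = {name: cycle(values) for name, values in fields.items()}
    return [{name: next(it) for name, it in iters.items()}
            for _ in range(n_rows)]

# Hypothetical fields for an order-entry screen
rows = created_data({"status": ["new", "open", "closed"],
                     "amount": [0, 1, 9999]}, n_rows=6)
```

Cycling rather than random sampling is the design choice here: every value is guaranteed to appear once the row count reaches the largest value list, which is what makes created data easy to validate (calibrate).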
The Test Environment and Software
• Hardware
• Firmware (system software)
• Application software
• Application/system documentation
• Testware

Preparing The Test Environment
Establish, manage, and maintain
the environment identified in the
test plan
• Publications
• Supplies
• Personnel
• Facilities
• Hardware
• Software

5
TEST EXECUTION,
REASSESSING RISK, AND
REPORTING

Test Execution and Risk
The earlier risk identification, analysis,
and prioritization activities helped
identify potential product risk areas

With test execution we begin to
discover, analyze, assess, and reassess
the potential product risks

Test Execution – Key Questions
Are we running the critical tests early?
Are our tests achieving the level of
coverage we desire?
Are the tests experiencing a high or low
failure rate?
What does the defect information
suggest?

Assessing and Reporting
Three key elements related to risk feed the overall
assessment of test effectiveness:
• Testing status
• Defect status
• Coverage assessment

Coverage Assessment
Coverage is relative to a specified
inventory and risk assessment
• Effectiveness
• Thoroughness
• Comprehensiveness


Which Coverage Measures Matter?
Requirements
• Which requirements have been tested?
Risk
• Which risk areas have been tested?
• What risks are still unaddressed?
Design
• What parts of the architecture have been tested?
• Have we addressed key design issues?
Code
• What percentage of the code has been tested?
• Paths, conditions, decisions, and statements
Testing Status
Which tests are we running?
Which are passing and which are failing?
When executing the tests, what do the
numbers tell us?
• Initially planned 10,000 tests, currently executed
1,500—what does this mean?
• Are we 15% finished?
• If so, finished with what?
• On the next two slides, let’s look at variations of this

Test Status ― Example 1
              Planned   Executed   Passed   Failed
High risk       5,000
Medium risk     3,000
Low risk        2,000
Total          10,000      1,500    1,465       35

• What does this example tell us about test
coverage? About risk?
• Why might this be happening?
• What is the potential impact on the project and the
product?
Test Status ― Example 2
              Planned   Executed   Passed   Failed
High risk       2,000        500      491        9
Medium risk     3,000        250      243        7
Low risk        5,000        750      731       19
Total          10,000      1,500    1,465       35

• Now, what does this example potentially
tell us?
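The comparison between the two examples can be made mechanically. A small sketch (figures taken from the slides) that computes per-risk execution coverage and failure rates:

```python
def status_report(rows):
    """rows: {risk: (planned, executed, passed, failed)} -> per-risk metrics."""
    report = {}
    for risk, (planned, executed, passed, failed) in rows.items():
        report[risk] = {
            # how much of the planned work at this risk level has run
            "executed_pct": 100.0 * executed / planned,
            # how often the executed tests at this risk level fail
            "failure_pct": 100.0 * failed / executed if executed else None,
        }
    return report

# Example 2 figures from the slide
example2 = {
    "high":   (2000, 500, 491, 9),
    "medium": (3000, 250, 243, 7),
    "low":    (5000, 750, 731, 19),
}
report = status_report(example2)
# High risk: 25% of planned tests executed, 1.8% of those failed
```

The aggregate total (1,500 of 10,000 executed) is identical in both examples; only the per-risk breakdown shows whether the critical tests are being run early.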


Testing Status – Context
Tests move through a sequence of states, and each
count must be read in that context:
Specified → Planned → Implemented → Executed → Passed/Failed
Test Effectiveness Issues
Tests planned versus executed
On or off schedule
Time spent on key activities and their value
• Documenting the tests versus executing the tests

Issues preventing execution
• Blocking incidents
• Lack of resources (equipment, personnel, data, etc.)


Assessing Defect Status
Failing tests represent risks to
the quality of the software
• What is the severity/impact of the defects
that occur?
• What is the failure rate?
• What are the distribution, density, and other
characteristics of the failures?
• Are they grouped in a few areas or distributed generally?

• How much effort (rework) is being expended
to correct and re-test the failed element?
• Too much rework can imperil the schedule
Classification of Defects
• Is there a standard documented scale?
– Recognized and used by all projects
• Fatal
• Critical
• Major
• Minor
• Cosmetic
• Incident


Defect Analysis – What are the Risks?
Where are the defects occurring?
What is being missed?
Why are the defects occurring?
What is the cost and impact?
Defect estimation—how many defects
are left?
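One common way to answer "how many defects are left?" (a general technique, not one prescribed by this tutorial) is capture-recapture estimation: if two independent testers or test cycles find overlapping defect sets, the size of the overlap suggests the total population. A sketch under that assumption:

```python
def lincoln_petersen(found_a, found_b):
    """Estimate total defects from two independent detection samples.
    found_a, found_b: sets of defect IDs found by each tester/cycle."""
    overlap = len(found_a & found_b)
    if overlap == 0:
        raise ValueError("no overlap: samples too small to estimate")
    # Lincoln-Petersen estimator: N ~= (n1 * n2) / m
    total_est = len(found_a) * len(found_b) / overlap
    remaining = total_est - len(found_a | found_b)
    return total_est, remaining

# Hypothetical defect IDs from two testers
total, remaining = lincoln_petersen({"D1", "D2", "D3", "D4"},
                                    {"D3", "D4", "D5"})
# 4 * 3 / 2 = 6 estimated defects; 5 already found, so ~1 remaining
```

The independence assumption rarely holds perfectly in practice, so treat the estimate as a risk indicator rather than a precise count.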
Analyzing Defect Characteristics
• Patterns
• Trends
• Recurrence
• Density
• Distribution


Defect Assessment – Example
Component   Risk   Planned   Executed   Passed   Failed
Module1     High       300        100       97        3
Module2     High       350        125      123        2
Module3     High       250         75       75        0
Module4     High       300        150      145        5
Module5     High       175         50       48        2
Module6     High       225          0
Module7     High       400          0

• What does this example tell us about our
defects and risk?

Assessing Defect Status
• Open
• Pending
• Closed
• Cancelled


Stopping the Testing
Exit criteria met:
• Coverage criteria
• Testing status
• Defect status
• Remaining-defects estimation criteria
• Diminishing-returns criteria
• Combination of criteria
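A combined exit decision can be expressed directly in code. A minimal sketch; the thresholds are illustrative assumptions, not values prescribed by the tutorial:

```python
def may_stop(coverage_pct, open_critical, failure_rate_pct, est_remaining,
             min_coverage=95.0, max_failure_rate=2.0, max_remaining=10):
    """Combine several exit criteria; all must hold before stopping.
    Thresholds are illustrative defaults a team would tune per product risk."""
    return (coverage_pct >= min_coverage        # coverage criteria
            and open_critical == 0              # defect status
            and failure_rate_pct <= max_failure_rate   # testing status
            and est_remaining <= max_remaining) # remaining-defects estimate

# e.g. 97% coverage, no open criticals, 1.5% failure rate, ~5 defects left
decision = may_stop(97.0, 0, 1.5, 5)
```

In practice each criterion would be weighted by the product, the environment, and the attitude to risk rather than applied as a hard AND.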

Stopping the Testing
“There is no single, valid, rational
criterion for stopping.
Furthermore, given any set of
applicable criteria, how each is
weighed depends very much on
the product, the environment, the
culture, and the attitude to risk.”

— Boris Beizer

The Key to Success
Risk-driven testing provides the critical,
objective information to allow
stakeholders to make informed
decisions
When is risk-driven testing unnecessary?
• The software will never be used
• You have unlimited resources
Tutorial Evaluation
How did we do?
• Great • Wonderful • Fantastic • Terrific • Superior • Excellent


Thank You

• On behalf of SQE
Training, we appreciate
your attendance at this
course
• If you have additional
training or consulting
needs, please think of
us

Bibliography ― Books
Boehm, Barry W. Software Risk Management, IEEE Press, 1989.
Boehm, Barry W. Software Engineering Economics, Prentice Hall, 1981.
Brooks Jr., Frederick P. The Mythical Man-Month, Addison-Wesley, 1995.
DeMarco, Tom and Timothy Lister. Waltzing with Bears: Managing Risk on Software Projects, Dorset House, 2003.


Bibliography ― Books (cont.)
Down, Alex, Michael Coleman, and Peter Absolon. Risk Management for Software Projects, McGraw-Hill, 1994.
Gerrard, Paul and Neil Thompson. Risk-Based E-Business Testing, Artech House, 2002.
Grey, Stephen. Practical Risk Assessment for Project Management, Wiley, 1995.
Hall, Elaine M. Managing Risk: Methods for Software Systems Development, Addison-Wesley, 1997.
Bibliography ― Books (cont.)
Jones, Capers. Assessment and Control of Software Risks, Yourdon Press, 1994.
McManus, John. Risk Management in Software Development Projects, Elsevier, 2004.
Myerson, Marian. Risk Management Processes for Software Engineering Models, Artech House, 1996.
Neumann, Peter G. Computer-Related Risks, Addison-Wesley, 1995.


Bibliography ― Articles and Papers
Bach, James. Troubleshooting Risk-Based Testing. Better Software magazine, May/June 2003.
Gardiner, Julie. Risk: The Tester's Favorite Four-Letter Word. Track Session, STAREAST 2005.
Kaner, Cem. The Context-Driven Approach to Software Testing. Track Session, STAREAST 2002.
Rice, Randy. The Risks of Risk-Based Testing. Track Session, STAREAST 2007.
Bibliography ― Articles and Papers (cont.)
Sliger, Michele. ScrumBut: Failure to Deliver.
StickyMinds.com column, June 2009.
Sliger, Michele. Risk Management on an Agile
Project. Track Session, Better Software
Conference 2006.
Sliger, Michele. The Agile-Traditional
Development Cooperative. Track Session,
Better Software Conference 2007.


Appendix A
RISK-BASED DESIGN
USING FORMAL
TECHNIQUES

A Solution to the Exhaustive Example
• A 13-variable, 3-value problem
– 1,594,323 (3^13) possible combinations
• Here we have the 13 fields and their
values ready for PICT


Results from PICT

• These results from PICT will be pasted
into Excel and we will add a results
column to create a test case matrix
Results in Excel with Results Column


Limits to All Pairs
• All-pairs methods deal very well with valid
values, and with tools like PICT we can add
restrictions
– But what about testing invalid values?
• Next, we duplicate the initial PICT output in
Excel and modify selected values to create
invalid test cases
– We will use equivalence partitioning and boundary
analysis to create the invalid tests
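The boundary-based derivation of invalid values can be sketched as a small helper. The concrete values and field shapes here mirror the appendix tables but are otherwise illustrative:

```python
def invalid_values_for_range(lo, hi):
    """Boundary analysis for a numeric range [lo, hi]."""
    return [lo - 1,   # just below the lower boundary
            hi + 1,   # just above the upper boundary
            "D",      # alpha value: wrong type
            "@"]      # special character: wrong type

def invalid_value_for_set(valid_set):
    """For a set-valued field, a single non-member is enough."""
    candidate = "@"   # assumed not to be a legal member
    assert candidate not in valid_set
    return candidate

# Field F2 in the appendix is the range 1..3, so its invalid tests use:
print(invalid_values_for_range(1, 3))   # [0, 4, 'D', '@']
</```

Each invalid value becomes one test row: start from a valid all-pairs row, substitute the single invalid value, and record the expected field-level error message as the result.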
Creating Invalid Test Cases
• On the following two slides, the following
method was applied:
– Invalid values are shown in red italics
– The first field, F1, is a set, so a single
non-member value is enough
– Field F2 is a range
• Boundary values just below and just above the range
were used, plus alpha and special-character values
– These steps were repeated for all fields


Invalid Test Cases for All Pairs Data
[Table: tests 21–36 across fields F1–F13. Each row copies a valid
all-pairs combination and substitutes one invalid value for a single
field (red italics on the slide): a non-set member for F1;
below-boundary (0), above-boundary (4), alpha (D), and
special-character (@) values for the range field F2; and so on through
the remaining fields. The RESULTS column gives the expected
field-level error message for each row, FIELD MSG1 through FIELD MSG7.]

Invalid Test Cases for All Pairs Data (cont.)
[Table: tests 37–51, continuing the same method through the remaining
fields. The RESULTS column gives the expected error message for each
row, FIELD MSG8 through FIELD MSG13.]

• With 51 test cases, using all pairs, equivalence classes,
boundaries, and experience-based techniques, we now have a
reasonable set of test cases for a large problem

A Simple Procedure

Test Procedure
1. Navigate to the input screen/page
2. Start at row 1 in the first table
3. Enter the data from each column into the matching field element on the screen/page
4. Verify the indicated result
5. Repeat steps 2-4 for all rows in the tables
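The five steps above amount to a data-driven loop. A minimal sketch, with a stubbed `submit` function standing in for the real screen (all names here are hypothetical):

```python
def run_data_driven(rows, submit):
    """Execute the procedure: for each row of test data,
    enter it on the screen and verify the indicated result."""
    failures = []
    for n, row in enumerate(rows, start=1):
        expected = row.pop("RESULT")      # the row's expected outcome
        actual = submit(row)              # step 3: enter the data
        if actual != expected:            # step 4: verify
            failures.append((n, expected, actual))
    return failures

# Stub standing in for the application under test:
# F2 is valid only in the range 1..3
def submit(fields):
    return "OK" if fields["F2"] in (1, 2, 3) else "FIELD MSG2"

rows = [{"F2": 2, "RESULT": "OK"},
        {"F2": 4, "RESULT": "FIELD MSG2"}]
print(run_data_driven(rows, submit))   # [] -> every row behaved as expected
```

Keeping the procedure data-driven is what lets the 51-row matrix grow without the procedure itself changing.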

APPENDIX B
EXAMPLE TEST PLANS
AND INVENTORY

This Slide Is Hidden

 
Test Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTest Design for Fully Automated Build Architecture
Test Design for Fully Automated Build Architecture
 
System-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartSystem-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good Start
 
Build Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyBuild Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test Strategy
 
Testing Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTesting Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for Success
 
Implement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowImplement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlow
 
Develop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityDevelop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your Sanity
 
Ma 15
Ma 15Ma 15
Ma 15
 
Eliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyEliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps Strategy
 
Transform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTransform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOps
 
The Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipThe Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—Leadership
 
Resolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsResolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile Teams
 
Pin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GamePin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile Game
 
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsAgile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
 
A Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationA Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps Implementation
 
Databases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessDatabases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery Process
 
Mobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateMobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to Automate
 
Cultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessCultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for Success
 
Turn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTurn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile Transformation
 

Dernier

Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...Neo4j
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyKhushali Kathiriya
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...apidays
 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobeapidays
 
HTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesHTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesBoston Institute of Analytics
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CVKhem
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘RTylerCroy
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MIND CTI
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUK Journal
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoffsammart93
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...
Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...
Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...Principled Technologies
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
Top 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live StreamsTop 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live StreamsRoshan Dwivedi
 

Dernier (20)

Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : Uncertainty
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...
 
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, AdobeApidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
Apidays New York 2024 - Scaling API-first by Ian Reasor and Radu Cotescu, Adobe
 
HTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation StrategiesHTML Injection Attacks: Impact and Mitigation Strategies
HTML Injection Attacks: Impact and Mitigation Strategies
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CV
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...
Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...
Deploy with confidence: VMware Cloud Foundation 5.1 on next gen Dell PowerEdg...
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 
Top 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live StreamsTop 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live Streams
 

Getting Started with Risk-Based Testing

  • 1. MC Full-Day Tutorial, 9/30/2013, 8:30:00 AM
"Getting Started with Risk-Based Testing"
Presented by: Dale Perry, Software Quality Engineering
Brought to you by: 340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
  • 2. Dale Perry Software Quality Engineering Dale Perry has more than thirty-six years of experience in information technology as a programmer/analyst, database administrator, project manager, development manager, tester, and test manager. Dale’s project experience includes large-system development and conversions, distributed systems, and both web-based and client/server applications. A professional instructor for more than twenty years, he has presented at numerous industry conferences on development and testing.
  • 3. Dale L. Perry, SQE Training, Dperry@SQE.com
FUNDAMENTALS OF RISK-BASED TESTING
©2013 SQE Training - STAR West 2013 / STAR East 2013
  • 4. Table of Contents - Agenda
1. Understanding risk
2. Risk and the testing process
3. Test planning (overview)
4. Test creation (analysis and design)
5. Test execution, reassessing risk, and reporting
  • 5. 1 UNDERSTANDING RISK
Risk – A General Definition
Risk: a factor that could result in future negative consequences; usually expressed as impact and likelihood.
Risk-based testing: an approach to testing intended to reduce the level of product risk and inform stakeholders of its status, starting in the initial stages of a project. It involves the identification of product risks and their use in guiding the test process.
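The definition above expresses risk as impact and likelihood. One common way to make that concrete is a numeric risk-exposure score, exposure = likelihood × impact, used to rank risks. A minimal sketch; the 1–5 rating scales and the example risk items are illustrative assumptions, not part of the tutorial:

```python
# Risk exposure = likelihood x impact, each rated on an assumed 1-5 scale.
def risk_exposure(likelihood: int, impact: int) -> int:
    """Return a simple exposure score; higher means riskier."""
    assert 1 <= likelihood <= 5 and 1 <= impact <= 5
    return likelihood * impact

# Hypothetical product risks, rated for illustration only.
risks = {
    "payment calculation wrong": risk_exposure(likelihood=2, impact=5),
    "slow report rendering":     risk_exposure(likelihood=4, impact=2),
    "typo in help text":         risk_exposure(likelihood=5, impact=1),
}

# Rank riskiest first, the order in which testing attention is allocated.
for name, score in sorted(risks.items(), key=lambda kv: -kv[1]):
    print(f"{score:2d}  {name}")
```

Note how a likely-but-minor issue can score below an unlikely-but-severe one; that trade-off is exactly what the impact/likelihood framing captures.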
  • 6. Categories of Risk
Process risk: focus is on procedural issues, both managerial and technical.
Project risk: typically includes operational, organizational, and contractual issues.
Product risk: typically relates to technical issues and depends on the type of product being developed.
Categories of Risk (cont.)
There are many approaches to categorizing risk. Capers Jones documented sixty different risk factors; this list provides an extensive look at many areas that represent risk within an organization, project, and software (process, project, and product).
  • 8. Understanding Process Risks
When we look at process risk, we focus on both management and technical procedures.
Management processes that relate to risk include: planning and staffing, tracking, quality assurance, and configuration management.
Technical processes that relate to risk include: requirements analysis, design and code, and testing.
  • 9. Understanding Project Risks
Project risks can originate from several areas:
• External sources, such as supplier issues
• Issues related to the organization: of the project, of the team, of the "company" itself (politics)
• Issues related to technical problems
• Documentation
Project Risks – Organizational Factors
• Skill, training, and staff shortages
• Personnel issues
• Political issues, such as problems with testers communicating their needs and test results, or failure by the team to follow up on information found in testing and reviews (e.g., not improving development and testing practices)
• Improper attitude toward or expectations of testing (e.g., not appreciating the value of finding defects during testing)
  • 10. Project Risk – Technical/Supplier Issues
Technical issues:
• Problems in defining the right requirements
• The extent to which requirements cannot be met given existing constraints
• Test environment not ready on time
Supplier issues:
• Failure of a third party
• Contractual issues
Project Risk ― External Sources
• Delivery date
• Acquisition plan
• Budget
• Scope of test
• Use of automation
  • 11. Product Risks
Potential failure areas (adverse future events or hazards) in the software or system are known as product risks. They comprise a risk to the quality of the product and may include:
• Delivering failure-prone software
• The potential that the software/hardware could cause harm to an individual or company
• Poor software characteristics (e.g., functionality, reliability, usability, and performance)
• Software that does not meet specifications (perform its intended functions)
Examples of Product Risk Areas
Scalability, functional complexity, scope of use, performance, environment, safety, security, data selection, reliability, data integrity, usability, recoverability, interface complexity, new technology, and technical complexity.
  • 12. Attitudes Toward Risk
Three basic attitudes toward risk:
• Risk averse: practical, dependable, focus on facts
• Risk seeker: speculative, adventurous, adaptable, looking for new things
• Risk neutral: focus on the future, neither overly secure nor speculative
Stakeholder Viewpoints on Software
Stakeholder viewpoint, along with attitude toward risk, affects our assessments. There are at least four differing views of software and software-related risks: management, marketing, developer/technician, and client/customer.
  • 13. Understanding Risk Management
Risk management: the systematic application of procedures and practices to the tasks of identifying, analyzing, prioritizing, and controlling risk. Risk identification, risk analysis and prioritization, and risk mitigation (risk control) form a cycle.
Risk Identification
There are many approaches and methods for identifying and qualifying risks. The method selected will depend, to a large degree, on the type of system or application being tested; some types of systems contain greater risks (safety critical, mission critical, etc.).
  • 14. Risk Identification Techniques
• Expert interviews
• Independent assessments
• Use of risk templates
• Lessons learned: metrics and data from past projects; develop a risk database or repository
Risk Identification Techniques (cont.)
• Risk workshops
• Brainstorming
• Checklists
• Calling on past experience
• Formal analysis methods: failure modes and effects analysis (FMEA), hazard analysis, cost of failure, and others
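FMEA, one of the formal analysis methods listed above, is commonly applied by rating each failure mode for severity, occurrence, and detection (often on 1–10 scales) and multiplying them into a Risk Priority Number (RPN) used for ranking. A sketch of that arithmetic; the failure modes and ratings below are invented for illustration, not from the tutorial:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int    # how bad the effect is (1-10)
    occurrence: int  # how often the cause arises (1-10)
    detection: int   # how UNLIKELY current controls are to catch it (1-10)

    @property
    def rpn(self) -> int:
        # Risk Priority Number: the conventional FMEA product.
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for illustration only.
modes = [
    FailureMode("lost transaction on timeout", severity=9, occurrence=3, detection=4),
    FailureMode("garbled label in UI",         severity=2, occurrence=6, detection=2),
    FailureMode("duplicate email sent",        severity=4, occurrence=4, detection=5),
]

# Highest RPN first: these failure modes get test attention first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(m.rpn, m.description)
```

The ranking, not the absolute numbers, is the useful output: it turns a brainstormed list into an ordered test focus.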
  • 15. Applying Risk Management
The attitudes, viewpoints, and focus of the individuals involved will affect risk management; each process (identification, analysis, control/mitigation) is affected by the others.
2 RISK AND THE TESTING PROCESS
  • 16. Key Elements of Risk-Driven Testing
• Using testing to influence and direct software design and development
• Testing starts at the beginning of the lifecycle
• Tests are used as requirements and usage models; testware design leads software design
Key Elements of Risk-Driven Testing (cont.)
• Defects are detected earlier or prevented altogether, and are systematically analyzed
• Testers and developers work together
• Implicit concurrent engineering of testware and software
  • 17. Testing and Development
To test effectively, testing and development must be managed together:
• Testing must become an integral part of the development lifecycle
• There are many models used to develop software; for each model, testing will have to adjust to the process in order to be successful. The only difference is where and how to test.
Lifecycle Models (Examples and Types)
Sequential, incremental, and iterative models, including: waterfall models, "V" model, Cleanroom® Engineering, "W" model, spiral models, RAD (Rapid Application Development), RUP (Rational Unified Process), rapid prototyping, Extreme Programming (XP), Scrum (not an acronym), and DSDM (Dynamic Systems Development Method).
  • 18. The Testing Process
The process flows from the project master test plan and detail/level plans (test planning), through analyze and design/implement (test creation), to execute and report (test execution).
Test Planning
Within a project there may be several levels of testing and test plans: master test plan, acceptance test plan, system test plan, build test plan, component test plan, and others.
  • 19. Test Creation (Testware)
Test analysis produces inventories; test design produces test specifications; test implementation produces test data and procedures.
Test Execution and Reporting
• Tests and incident logs: failures and anomalies
• Checking adequacy: decision to stop testing
• Evaluating processes: generate summary report
  • 20. An Effective Testing Process
• Focuses on preventive measures
• Is lifecycle-based: preventing a defect/fault is cheaper and more effective than finding and fixing it later in the process
• Can be incrementally implemented in an organization; you don't have to change everything at once
3 TEST PLANNING OVERVIEW
  • 21. Test Planning - Overview
Planning is a critical part of the testing process. It is introduced here as part of the overall testing process; however, we will not go into great detail. Test planning is covered in greater depth in the tutorial on Test Management and Planning.
Test Planning - Strategic Issues
• Scope of the testing: breakdown of the testing; levels or other approaches
• Software risks: both technical and requirements based
• Planning risks: budget, resources (skills, time, money, people), dependencies
• Test environment: complexity, accuracy
• Testing tools
• Test requirements: objects and goals
• Configuration management
• Relationships between other organizations
• Measurements/metrics
• Software lifecycle in use for the project
• Documentation
• Need for regression
• Schedule
• Staffing: internal, third party, consultants
  • 22. 4 TEST CREATION (ANALYSIS AND DESIGN)
Focus of Testing
• Positive: does the software do what it's supposed to do?
• Negative: are there unexpected results or events?
• Other…
  • 23. Approaches to Testing
• Static testing: prevents problems when applied early in the process
• Dynamic testing: detects problems once the software is developed
• Learning: explore and learn about the product
Complete/Exhaustive Testing Is Impossible
No amount of testing will prevent all defects, detect all defects, or prove a product is defect-free.
  • 24. Complete/Exhaustive Testing ― Example
There are too many possibilities to completely test a system. For example, a single screen (GUI) or data stream with thirteen variables, each with three values, requires 3^13 = 1,594,323 tests to cover every possible combination, plus testing of interfaces, etc.
Focusing the Testing
A primary goal in testing is to have the critical features developed first: focus the testing effort where the potential risks exist within the software. Knowledge of risks guides selection of what and where to test.
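The slide's arithmetic is easy to verify:

```python
# 13 variables with 3 values each: exhaustive coverage needs 3**13 combinations.
variables = 13
values_per_variable = 3
exhaustive = values_per_variable ** variables
print(f"{exhaustive:,}")  # 1,594,323
```

Combinatorial selection techniques such as pairwise (all-pairs) testing are a standard response to this explosion: they cover every interaction between each pair of variables with a tiny fraction of the exhaustive count, which is one concrete way risk knowledge can shrink the test space.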
  • 25. The Test Inventory Process
A model designed for test analysis within a project, applicable to virtually all development lifecycle models. It is designed to interact with the development processes used within a project, focusing on key issues:
• Understanding testing issues and risks
• Determining the scope of testing
• Improving estimates and budgets
Inventory Activities
• Gather the reference materials
• Form a team
• Select the test objects and conditions
• Create an inventory
• Prioritize the objects
  • 26. Inventory Activities (cont.)
These activities can be incorporated into existing development activities within the project. At the beginning of every project there are activities during which the software to be developed is outlined or defined (requirements activities, architectural activities, detail design and coding activities); all of these can incorporate the inventory concept.
Gather Reference Materials
Inputs to the inventory include user requests; requirements, stories, and use cases; release information; architecture and system designs; and detail designs.
  • 27. A Well-Balanced Team
The inventory process draws on analysts and architects, testers/QA, developers, users/clients, marketing, and management.
Determining Objects and Conditions
The goal is to gather as much information as early as possible about the testing goals, objects, and conditions. This enables earlier test design decisions, locates potential problems sooner, and possibly prevents them from occurring.
  • 28. What Is a Test Object/Condition?
Any aspect of the software, its execution environment, its usage patterns, etc., that should be covered by the testing: basic functions or methods, configurations, usability, interfaces, and so on.
Identifying Test Objects/Conditions
The inventory process is an incremental, iterative process starting at the requirements (specifications, models, etc.), then moving through architecture (high-level design) and detail design.
  • 29. What Is an Inventory?
An inventory is a list or index of test objects. It is a detailed answer to the question of what should (must) be tested. Example:
1. Transactions: Add, Change, Delete
2. Screens: Daily, Weekly, Monthly
3. Reports
4. Constraints
5. Security
6. etc.
Inventory Process ― Flexibility
Inventories can be created easily during the product risk identification and assessment process, and can be expanded or changed as needed. Inventories help identify functional, non-functional, and structural areas to test; this improves the quality and effectiveness of the testing and helps avoid problems later in the process.
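The example inventory above maps naturally onto a small data structure, which also supports the "prioritize the objects" activity. A sketch; the areas and items echo the slide's example, while the priority ratings are assumptions added for illustration:

```python
# A test inventory as structured data: test objects grouped by area,
# each carrying a priority so high-risk items are tested first.
inventory = {
    "Transactions": [("Add", "high"), ("Change", "medium"), ("Delete", "high")],
    "Screens":      [("Daily", "medium"), ("Weekly", "low"), ("Monthly", "low")],
    "Reports":      [("Monthly summary", "medium")],
}

def items_by_priority(inv, priority):
    """Return (area, item) pairs at the given priority level."""
    return [(area, name)
            for area, items in inv.items()
            for name, prio in items
            if prio == priority]

# The high-priority slice is what gets designed and executed first.
print(items_by_priority(inventory, "high"))
```

Keeping the inventory as data rather than prose makes the later steps (expanding it as design detail arrives, re-prioritizing as risk is reassessed) a matter of editing entries instead of rewriting documents.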
  • 30. Applying the Inventory Process
There are some common aspects of applications/systems that can be used as a starting point (model) in determining the test objects/conditions, even if there are no formal requirements specifications. The more detailed the initial information, the more accurate the inventory of test objects/conditions, and the better the understanding of risk.
Applying the Inventory Process (cont.)
As noted earlier, building an inventory is an incremental process. It can be applied at the project level or at an individual level (system, unit, etc.). Start with a global view: a broader view can help put elements into proper perspective. "You need to see the forest before the trees."
Building the Inventory – Requirements
• While creating an inventory of test objects you generate two key elements:
  – The test objects themselves
  – A list of issues and questions (potential errors and defects)
• Requirements, use cases, stories, etc., feed both the issues-and-questions list and the inventory of test objects

General Requirements-based Objects/Conditions
• Features, functions, and methods
• Constraints (limits on application, environment, etc.)
• Inputs and outputs
• Behavior and business rules
• Interfaces
• Configurations (hardware, software, etc.)
• Scenarios
• Other elements…
Design Analysis ― Inputs
• Requirements-based inventories
• Questions and issues from requirements can be used to verify the design
• Items remaining unresolved may result in major rework later if left open

Building the Inventory – Architecture
• Requirements, use cases, and stories feed issues and questions into the inventory of test objects
• High-level design (architecture) and detail-level design feed the same flow
• The process can be continued as long as required to ensure accurate information
Design Analysis ― Steps
• Study the software design
• Request clarification
• Investigate with sample test cases
• Request design enhancements
• Inventory and prioritize design-based objects
• Review and revise the inventory

General Design-based Objects/Conditions
• Detailed inputs and outputs
• System and design states
• Process paths
• Data and communications paths
• Design limits and exceptions
• etc.
Investigating Using Tests
• Consider a credit card management application with the following capabilities:
  – Add, change, delete, and inquire on customers
  – Add, change, delete, and inquire on credit cards
• Consider the following scenarios:
  – Update a customer address; inquire on the same customer to validate the change; inquire on the card balance
  – Inquire on a customer; delete the customer (who has multiple cards on the same account); delete a credit card

Investigating Using Tests (cont.)
• Using the two scenarios above and the design diagram on the next slide:
  – Do you see any problems in the design? If so, what are they?
  – What might happen if they are not corrected?
Investigating Using Tests (cont.)
• Design diagram: a Main Menu leads to a Customer Menu (Add, Change, Delete, Inquire/Print Customer) and a Card Menu (Add, Change, Delete, Inquire/Print Card)

Prioritize the Objects/Conditions
• Using our list of test objects and conditions, we must determine which elements are critical and which are not
• To do this we need to understand risk analysis
Risk Analysis ― Purpose
• Focus the software design and test/evaluation activities on those elements/functions that are:
  – Critical to the mission of the organization
  – Essential to the physical or economic well-being of users and their environment

Using Risk Analysis
• Risk analysis is the study of the identified risks
  – Categorizing each risk
  – Determining the likelihood and impact
• Each identified risk should be placed into an appropriate class
  – Classes can be internally defined or drawn from a standard such as ISO 9126
  – Checklists can combine the identification and classification processes
Risk Analysis ― Characteristics of Risk
• Likelihood: High, Medium, Low
• Impact: High, Medium, Low

Factors Influencing Likelihood
• Many technical factors influence risk potential and vary from project to project and from team to team within a project
• Areas to investigate may include:
  – Complexity of technology and teams
  – Personnel and training issues
  – Conflict within the team or between teams
  – Contractual problems (vendors, suppliers, contractors, etc.)
Factors Influencing Likelihood (cont.)
• Areas to investigate (continued):
  – Geographical separation of the team
  – Legacy versus new ideas and approaches
  – Use of tools and new technologies
  – Lack of managerial or technical leadership
  – Pressures on time, resources, and management
  – Lack of earlier quality assurance (no focus on quality at early levels of testing)
  – A high frequency of change requests
  – Poor early quality (high defect rates)
  – Issues with interfacing and integrating software
Factors Influencing Impact
• Risk impact comes down to the impact on the customer or user of the software under test
• Influencing factors may include:
  – Frequently used features: if such a feature is unavailable or corrupt, the adverse impact is widespread
  – Impact on the corporate or system image: visibility of failure may lead to negative publicity

Factors Influencing Impact (cont.)
• Influencing factors (continued):
  – Potential loss of business
  – Potential financial, ecological, or even social losses
  – Legal liabilities (civil or criminal)
  – Potential loss of an operational license
  – Work stoppages due to a lack of reasonable workarounds
Approach to Risk Analysis
• Quantitative: assign a value to each aspect (likelihood, impact) and multiply the two values to calculate the cost of exposure, the expected loss associated with a particular risk
• Qualitative: use categories such as very high, high, medium, low, or very low

Approach to Risk Analysis (cont.)
• Most organizations use a combination of both methods
  – Qualitative analysis helps express risk in terms a person can understand
  – Quantitative values are then used in conjunction with the qualitative categories to create a matrix in which each risk is given a weighted priority
Initial Risk Priority ― A Risk Model
• Likelihood scores: High (3), Medium (2), Low (1); impact scores: High (3), Medium (2), Low (1)
• Impact × likelihood = potential loss ($), which helps set the test priority

Adjusting the Assessment
• What if we feel one factor is of greater importance?
• Example weighting: likelihood High (3), Medium (2), Low (1); impact High (5), Medium (3), Low (2)
• Impact × likelihood = potential loss ($), which helps set the test priority
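The two scoring schemes above can be expressed as a small function. This is an illustrative sketch in Python; the function name and keyword argument are invented, while the score values (3/2/1 and the weighted 5/3/2 impact scale) come from the slides.

```python
# A minimal sketch of the slide's risk model. The weighted impact
# scores (5/3/2) mirror the "Adjusting the Assessment" example.
LIKELIHOOD = {"H": 3, "M": 2, "L": 1}
IMPACT = {"H": 3, "M": 2, "L": 1}
WEIGHTED_IMPACT = {"H": 5, "M": 3, "L": 2}

def priority(likelihood, impact, weighted=False):
    """Impact x likelihood = relative potential loss (test priority)."""
    scale = WEIGHTED_IMPACT if weighted else IMPACT
    return LIKELIHOOD[likelihood] * scale[impact]

print(priority("H", "H"))                 # 9 in the unweighted model
print(priority("H", "H", weighted=True))  # 15 once impact is weighted
```

Weighting impact more heavily pushes high-impact items further up the priority list without changing the relative order of items that share the same impact rating.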
Example ― Risk Analysis

Test Object      | Likelihood | Impact | Priority
-----------------|------------|--------|---------
Client update    | L          | M      | LM = 2
Client delete    | M          | L      | ML = 2
Client add       | H          | H      | HH = 9
Client inquiry   | L          | L      | LL = 1
Status report    | L          | L      | LL = 1
Performance      | M          | H      | MH = 6
Monthly report   | L          | L      | LL = 1
Sales report     | M          | L      | ML = 2
Calc. sales com. | M          | M      | MM = 4
Mkt. report      | M          | M      | MM = 4

Risk Inventory Prioritized

Test Object      | Likelihood | Impact | Priority
-----------------|------------|--------|---------
Client add       | H          | H      | HH = 9
Performance      | M          | H      | MH = 6
Calc. sales com. | M          | M      | MM = 4
Mkt. report      | M          | M      | MM = 4
Client update    | L          | M      | LM = 2
Client delete    | M          | L      | ML = 2
Sales report     | M          | L      | ML = 2
Client inquiry   | L          | L      | LL = 1
Status report    | L          | L      | LL = 1
Monthly report   | L          | L      | LL = 1
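The prioritized inventory is simply the example table sorted by the likelihood-times-impact score. A sketch of that step, using the slide's own data:

```python
# Sorting the slide's example inventory by priority (a sketch).
risks = [
    ("Client update", "L", "M"),    ("Client delete", "M", "L"),
    ("Client add", "H", "H"),       ("Client inquiry", "L", "L"),
    ("Status report", "L", "L"),    ("Performance", "M", "H"),
    ("Monthly report", "L", "L"),   ("Sales report", "M", "L"),
    ("Calc. sales com.", "M", "M"), ("Mkt. report", "M", "M"),
]
SCORE = {"H": 3, "M": 2, "L": 1}
ranked = sorted(risks, key=lambda r: SCORE[r[1]] * SCORE[r[2]], reverse=True)
for name, lik, imp in ranked:
    print(f"{name:18} {lik}{imp} = {SCORE[lik] * SCORE[imp]}")
```

The output reproduces the "Risk Inventory Prioritized" table: Client add (9) first, Performance (6) second, and the three LL = 1 items last.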
Planning for Contingencies

Test Object      | Test Case Numbers        | Priority
-----------------|--------------------------|---------
Client add       | 23, 45, 47, 48, 10, 12…  | 9
Performance      | 15, 22, 31, 12…          | 6
Calc. sales com. | 52, 53, 101, 17, 19…     | 4
Mkt. report      | 4, 7, 33, 37, 42…        | 4
Client update    | 57, 58, 41, 9, 8…        | 2
Client delete    | 62, 110, 82, 63…         | 2
Sales report     | 13, 16, 77, 68…          | 2
Client inquiry   | 19, 3, 5, 102…           | 1
Status report    | 73, 64, 56, 7…           | 1
Monthly report   | 74, 98, 92…              | 1

Risk Control – Scope of Testing
• Once we have prioritized the list of test objects/conditions, the next issue is the scope of testing for each element: how will we control the risk?
  – How much testing is required for each object?
  – What types of tests can be created?
  – This will be affected by the development model employed and the completeness of the information available
Approaches to Test Design
• Depending on the goals of the testing, there are several ways to approach the design process
  – Formal and informal approaches
• The approach chosen depends on many factors:
  – Type of development process
  – Types of defects anticipated
  – Risk issues to be addressed
  – Other factors

Formal Test Design
• Test design using defined document formats and analytical methods
• Static methods (inspections, walkthroughs, etc.)
• Formal test techniques and methods:
  – Equivalence partitioning and boundary testing
  – Pair-based methods (orthogonal and combinatorial)
  – Decision tables and state diagrams
  – etc.
Formal Test Techniques and Risk
• Different test techniques are good at identifying and resolving certain types of problems (risks)
• Static methods (reviews):
  – Help eliminate errors and mistakes at their source
  – Find things that are unclear, mismatched, vague, ambiguous, and possibly untestable
  – Identify overlooked and missing elements early in the process

Selecting a Technique (cont.)
• Equivalence partitions:
  – Help define and clarify requirements
  – Provide a solid basis for test condition selection
• Boundary value testing:
  – Improves the focus of test conditions selected using equivalence partitions
  – Helps identify and eliminate some of the most common mistakes made in coding
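For a numeric equivalence partition, boundary value testing reduces to a mechanical selection of the edge values. A minimal sketch, where the range 1..100 is an invented example:

```python
# Boundary-value selection for a numeric equivalence partition
# (a sketch; the range 1..100 is an invented example).
def boundary_values(low, high):
    """Values just below, at, and just above each partition edge,
    the classic places where off-by-one coding mistakes surface."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```

The two values outside the range (0 and 101) become invalid-input tests; the four inside probe the comparisons at each edge.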
Test Techniques and Risk (cont.)
• Decision tables:
  – Help simplify complex business rules
  – Identify incomplete and contradictory logic in rules
  – Help identify the elements in each rule that must be tested
• State diagrams:
  – Help identify critical sequences of events and those events that can have a significant impact on the system or application
• Pair-based methods (orthogonal arrays and combinatorial analysis):
  – Handle large numbers of interacting values where there are limited or no dependencies between elements
  – Decision tables are better at complex relationships
• Informal methods:
  – Help identify errors and mistakes caused by human activity
  – Scientific techniques deal with logic very well, but the real world tends toward chaos
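The value of pair-based methods is easiest to see in code. The sketch below is a naive greedy pairwise generator, not PICT's algorithm (which the appendix uses); it is only practical for small parameter sets, but it shows the idea: cover every pair of values without enumerating every full combination. For the appendix's 13-field, 3-value example, exhaustive testing would need 3**13 = 1,594,323 combinations.

```python
from itertools import combinations, product

def all_pairs(params):
    """Greedy pairwise generator (a sketch, not PICT's algorithm):
    repeatedly pick the full combination that covers the most
    still-uncovered value pairs."""
    values = list(params.values())
    uncovered = {(i, a, j, b)
                 for i, j in combinations(range(len(values)), 2)
                 for a in values[i] for b in values[j]}
    candidates = list(product(*values))
    tests = []
    while uncovered:
        best = max(candidates, key=lambda row: sum(
            (i, row[i], j, row[j]) in uncovered
            for i, j in combinations(range(len(row)), 2)))
        tests.append(best)
        for i, j in combinations(range(len(best)), 2):
            uncovered.discard((i, best[i], j, best[j]))
    return tests

# Three 2-value fields: 8 exhaustive combinations, 4 pairwise tests.
tests = all_pairs({"F1": ["A", "B"], "F2": [1, 2], "F3": ["X", "Y"]})
print(len(tests))  # 4
```

Even at this toy scale the saving is visible (4 tests instead of 8); it grows dramatically as fields and values are added, which is why tools like PICT exist.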
Informal Test Design Approach
• Test analysis using the skills and knowledge of the test team:
  – Exploratory testing
  – Error guessing
  – Fault attacks
  – Defect based (defect and failure history)

Selecting the Test Approach
• If the information available is relatively complete, formal methods are a good option
• Sequential models tend to focus on complete requirements as early as possible
  – At least that is the intent
• Informal methods can be used to supplement the more formal techniques
• Initial test design can be more rigorous and the documentation more detailed
  – Take care not to overdo the documentation aspect of testing
Selecting the Test Approach (cont.)
• If the information is incomplete or intended to evolve, informal methods may be the best initial choice, supplemented by formal techniques
• In iterative and some incremental models, test design, like the software itself, must be evolutionary
  – Initial tests will be high level and subject to constant change
  – The formality of test design and documentation must be flexible
• The nature of the test object/condition and its level of risk will affect the depth and breadth of the test design
  – The more flexible the tests must be due to partial information, the more time it will take to complete the testing
  – This increases the risk that something will go untested
Test Approaches and Risk
• Tests that have to evolve with the software require testers with a greater depth of skill and knowledge
  – Methods like exploratory testing rely on the skills and knowledge of the testers
  – An in-depth set of mental models (both formal and informal) is essential to success
  – Documentation and the repeatability of the tests become more problematic
  – Enough information has to be retained to allow someone other than the original tester to repeat the test

Completing the Test Design Process
• Organize the test objects and conditions
• Document the test design decisions
  – How much documentation is required, and to what level of detail, represents a type of risk
Organize the Test Objects
• Select which objects and conditions will be tested as a logical set, and at which stage/level of testing
• Define the test sets for the grouped test objects and conditions
  – Select a design strategy (formal, informal)
  – Document the test architecture and design decisions

Example Test Set Definition
• A test set for a screen might group test cases by input (text, fields), output, navigation (internal, external), security, and other concerns
Test Creation ― Summary
• There are many test objects/conditions to be considered
  – Different types of projects have different sources, but all have requirements, designs, etc.
• The inventoried objects provide a means for maintaining, evaluating, and updating test requirements

Implementing the Test Design
• All tests require that the proper data and environment be in place prior to test execution
Implementation Risk Areas
• Environment
• Data
• Software
• Implementation

Key Data Source Characteristics

Characteristic           | Production | Generated    | Captured     | Created
-------------------------|------------|--------------|--------------|----------
Volume                   | Too much   | Controllable | Controllable | Too little
Variety                  | Mediocre   | Varies       | Varies       | Good
Acquisition              | Easy       | Varies       | Fairly easy  | Difficult
Validation (calibration) | Hard       | Hard         | Fairly hard  | Easy
Change                   | Varies     | Usually easy | Varies       | Easy

• Acquiring the necessary data may be the most time-consuming part of test development
  – It requires careful planning and the right tools
The Test Environment and Software
• Application/system, hardware, documentation, testware, firmware (system software), and application software

Preparing the Test Environment
• Establish, manage, and maintain the environment identified in the test plan:
  – Publications
  – Supplies
  – Personnel
  – Facilities
  – Hardware
  – Software
5. TEST EXECUTION, REASSESSING RISK, AND REPORTING

Test Execution and Risk
• The earlier risk identification, analysis, and prioritization activities helped identify potential product risk areas
• With test execution we begin to discover, analyze, assess, and reassess the potential product risks
Test Execution – Key Questions
• Are we running the critical tests early?
• Are our tests achieving the level of coverage we desire?
• Are the tests experiencing a high or low failure rate?
• What does the defect information suggest?

Assessing and Reporting
• Three key elements related to risk:
  – Testing status
  – Defect status
  – Coverage assessment
• Test effectiveness and risk
Coverage Assessment
• Coverage is relative to a specified inventory and risk assessment
  – Effectiveness
  – Thoroughness
  – Comprehensiveness

Which Coverage Measures Matter?
• Requirements: which requirements have been tested?
• Risk: which risk areas have been tested? What risks are still unaddressed?
• Design: what parts of the architecture have been tested? Have we addressed key design issues?
• Code: what percentage of the code has been tested (paths, conditions, decisions, and statements)?
Testing Status
• Which tests are we running? Which are passing and which are failing?
• When executing the tests, what do the numbers tell us?
  – We initially planned 10,000 tests and have currently executed 1,500. What does this mean?
  – Are we 15% finished? If so, finished with what?
  – The next two slides look at variations of this

Test Status ― Example 1
• Planned: High risk 5,000; Medium risk 3,000; Low risk 2,000; Total 10,000
• Executed: 1,500 (1,465 passed, 35 failed)
• What does this example tell us about test coverage? About risk?
• Why might this be happening?
• What is the potential impact on the project and the product?
Test Status ― Example 2

Risk level  | Planned | Executed | Passed | Failed
------------|---------|----------|--------|-------
High risk   | 2,000   | 500      | 491    | 9
Medium risk | 3,000   | 250      | 243    | 7
Low risk    | 5,000   | 750      | 731    | 19
Total       | 10,000  | 1,500    | 1,465  | 35

• Now, what does this example potentially tell us?

Testing Status – Context
• Planned → specified → implemented → executed tests → passed/failed
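Reading Example 2's numbers per risk level makes the point concrete: raw totals hide where the execution effort is actually going. A sketch using the slide's data:

```python
# Risk-weighted view of Example 2's numbers (data from the slide).
status = {  # risk level: (planned, executed)
    "High":   (2_000, 500),
    "Medium": (3_000, 250),
    "Low":    (5_000, 750),
}
for level, (planned, executed) in status.items():
    pct = executed / planned
    print(f"{level:6} {pct:5.1%} of planned tests executed")
```

Overall we are "15% done" (1,500 of 10,000), but per level the picture differs: 25% of high-risk tests have run versus about 8% of medium-risk tests, which is exactly the kind of risk-weighted status the slides argue for reporting.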
Test Effectiveness Issues
• Tests planned versus executed
• On or off schedule
• Time spent on key activities and their value
  – Documenting the tests versus executing the tests
• Issues preventing execution
  – Blocking incidents
  – Lack of resources (equipment, personnel, data, etc.)

Assessing Defect Status
• Failing tests represent risks to the quality of the software
  – What is the severity/impact of the occurring defects?
  – What is the failure rate?
  – What are the distribution, density, and other characteristics of the failures? Are they grouped or generally distributed?
  – How much effort (rework) is being expended to correct and re-test the failed element? Too much rework can imperil the schedule
Classification of Defects
• Is there a standard documented scale, recognized and used by all projects?
  – Fatal, Critical, Major, Minor, Cosmetic, Incident

Defect Analysis – What Are the Risks?
• Where are the defects occurring? What is being missed?
• Why are the defects occurring?
• What is the cost and impact?
• Defect estimation: how many defects are left?
Analyzing Defect Characteristics
• Patterns, trends, recurrence, density, and distribution

Defect Assessment – Example

Component | Risk | Planned | Executed | Passed | Failed
----------|------|---------|----------|--------|-------
Module1   | High | 300     | 100      | 97     | 3
Module2   | High | 350     | 125      | 123    | 2
Module3   | High | 250     | 75       | 75     | 0
Module4   | High | 300     | 150      | 145    | 5
Module5   | High | 175     | 50       | 48     | 2
Module6   | High | 225     | 0        |        |
Module7   | High | 400     | 0        |        |

• What does this example tell us about our defects and risk?
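One way to read the table above is to compute a failure rate per executed test for each module, which highlights where defects may be clustering. A sketch using the slide's data (Module6 and Module7, with nothing executed yet, are left out):

```python
# Failure rate per executed test for the slide's example modules.
modules = [  # (component, executed, failed)
    ("Module1", 100, 3), ("Module2", 125, 2), ("Module3", 75, 0),
    ("Module4", 150, 5), ("Module5", 50, 2),
]
rates = {name: failed / executed for name, executed, failed in modules}
worst = max(rates, key=rates.get)
print(worst, f"{rates[worst]:.1%}")  # Module5 4.0%, a possible cluster
```

Module4 has the most raw failures (5), but Module5 has the highest rate (4.0%) on the fewest executed tests, and two high-risk modules have not been exercised at all: three different risk signals from one small table.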
Assessing Defect Status
• Status: open, pending, closed, cancelled

Stopping the Testing
• Exit criteria met:
  – Coverage criteria
  – Testing status
  – Defect status
  – Remaining-defects estimation criteria
  – Diminishing-returns criteria
  – A combination of criteria
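A combination of exit criteria can be expressed as a simple predicate. This is an illustrative sketch; the function name, arguments, and thresholds (95% coverage, at most 5 estimated remaining defects) are invented examples, not recommended values.

```python
# Combining exit criteria (a sketch; thresholds are invented examples).
def can_stop(inventory_coverage, open_criticals, est_remaining_defects,
             min_coverage=0.95, max_remaining=5):
    """Stop only when coverage of the prioritized inventory is high
    enough, no critical defects remain open, and the estimated number
    of remaining defects is below the agreed ceiling."""
    return (inventory_coverage >= min_coverage
            and open_criticals == 0
            and est_remaining_defects <= max_remaining)

print(can_stop(0.97, 0, 3))  # True
print(can_stop(0.97, 2, 3))  # False: critical defects still open
```

In practice, as the Beizer quotation on the next slide warns, the weighting of such criteria is a stakeholder decision, not a formula; code like this only makes the agreed criteria explicit and checkable.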
Stopping the Testing
• "There is no single, valid, rational criterion for stopping. Furthermore, given any set of applicable criteria, how each is weighed depends very much on the product, the environment, the culture, and the attitude to risk." — Boris Beizer

The Key to Success
• Risk-driven testing provides the critical, objective information that allows stakeholders to make informed decisions
• When is risk-driven testing unnecessary?
  – The software will never be used
  – You have unlimited resources
Tutorial Evaluations
• How did we do? (Great, Wonderful, Fantastic, Terrific, Superior, Excellent)

Thank You
• On behalf of SQE Training, we appreciate your attendance at this course
• If you have additional training or consulting needs, please think of us
Bibliography ― Books
• Boehm, Barry W. Software Risk Management. IEEE Press, 1989.
• Boehm, Barry W. Software Engineering Economics. PTR Prentice Hall, 1981.
• Brooks Jr., Frederick P. The Mythical Man-Month. Addison-Wesley, 1995.
• DeMarco, Tom, and Timothy Lister. Waltzing with Bears: Managing Risk on Software Projects. Dorset House, 2003.
• Down, Alex, Michael Coleman, and Peter Absolon. Risk Management for Software Projects. McGraw-Hill, 1994.
• Gerrard, Paul, and Neil Thompson. Risk-Based E-Business Testing. Artech House, 2002.
• Grey, Stephen. Practical Risk Assessment for Project Management. Wiley, 1995.
• Hall, Elaine M. Managing Risk: Methods for Software Systems Development. Addison-Wesley, 1997.
Bibliography ― Books (cont.)
• Jones, Capers. Assessment and Control of Software Risks. Yourdon Press, 1994.
• McManus, John. Risk Management in Software Development Projects. Elsevier, 2004.
• Myerson, Marian. Risk Management Processes for Software Engineering Models. Artech House, 1996.
• Neumann, Peter G. Computer-Related Risks. Addison-Wesley, 1995.

Bibliography ― Articles and Papers
• Bach, James. "Troubleshooting Risk-Based Testing." Better Software magazine, May/June 2003.
• Gardiner, Julie. "Risk: The Tester's Favorite Four-Letter Word." Track session, STAREAST 2005.
• Kaner, Cem. "The Context-Driven Approach to Software Testing." Track session, STAREAST 2002.
• Rice, Randy. "The Risks of Risk-Based Testing." Track session, STAREAST 2007.
Bibliography ― Articles and Papers (cont.)
• Sliger, Michele. "ScrumBut: Failure to Deliver." StickyMinds.com column, June 2009.
• Sliger, Michele. "Risk Management on an Agile Project." Track session, Better Software Conference 2006.
• Sliger, Michele. "The Agile-Traditional Development Cooperative." Track session, Better Software Conference 2007.

Appendix A: RISK-BASED DESIGN USING FORMAL TECHNIQUES
A Solution to the Exhaustive Example
• A 13-variable, 3-value problem: 3^13 = 1,594,323 possible combinations
• Here we have 13 fields and their values ready for PICT

Results from PICT
• The results from PICT are pasted into Excel, and a results column is added to create a test case matrix
Results in Excel with Results Column
• (Screenshot of the PICT output in Excel with the added results column)

Limits to All Pairs
• All-pairs methods deal very well with valid values, and with tools like PICT we can create restrictions
  – But what about testing invalid values?
• Next, using the initial data from the PICT tool and duplicating it in Excel, we modify the results to create invalid test cases
  – We use equivalence partitioning and boundary analysis to create the invalid tests
Creating Invalid Test Cases
• On the following two slides the following method was applied:
  – Invalid values are shown in red italics
  – The first field, F1, is a set, so a single non-member value suffices
  – Field F2 is a range: using boundaries, values above and below the range were used, plus a numeric and an alpha case
  – These steps were repeated for all fields

Invalid Test Cases for All Pairs Data
• (Table of tests 21–36: each row takes a valid pairwise row for fields F1–F13 and replaces one field with an invalid value; the RESULTS column records the expected field message, FIELD MSG1 through FIELD MSG7)
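The method on this slide, clone a valid pairwise row and corrupt exactly one field, is mechanical enough to sketch in code. The field names and the assumed F2 range (4..6) below are illustrative, modeled on the appendix's data rather than taken from it.

```python
# Deriving invalid tests from a valid pairwise row, one bad field per
# test, as the slides describe (field names and ranges are invented).
def invalid_variants(valid_row, field, bad_values):
    rows = []
    for bad in bad_values:
        row = dict(valid_row)   # clone the valid combination
        row[field] = bad        # corrupt exactly one field
        rows.append(row)
    return rows

valid = {"F1": "B", "F2": 4, "F3": "F"}
# F2 assumed to be the range 4..6: below/above boundaries plus an
# alpha character and a special character, as in the slide's method.
bad = invalid_variants(valid, "F2", [3, 7, "A", "@"])
print(len(bad))  # 4 invalid tests, each differing from valid in F2 only
```

Keeping all other fields valid means each test isolates one error condition, so the expected result is a single, predictable field message.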
Invalid Test Cases for All Pairs Data (cont.)
• (Table of tests 37–51 continues the pattern for the remaining fields, with expected results FIELD MSG8 through FIELD MSG13)
• With 51 test cases, using all pairs, equivalence classes, boundaries, and experience-based techniques, we now have a reasonable set of test cases for a large problem

A Simple Procedure
Test procedure:
1. Navigate to the input screen/page
2. Start at row 1 in the first table
3. Enter the data from each column in the matching field element on the screen/page
4. Verify the indicated result
5. Repeat steps 2–4 for all rows in the tables
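The five-step procedure above is a classic data-driven loop. The sketch below illustrates it in Python; `enter_data` and `verify` are invented stand-ins for real UI-driver calls, and the demo rows and messages are made up.

```python
# The five-step procedure as a data-driven loop (a sketch; enter_data
# and verify stand in for real UI-driver calls).
def run_data_driven(rows, enter_data, verify):
    failures = []
    for n, row in enumerate(rows, start=1):
        expected = row["RESULT"]
        # step 3: enter each column's value in the matching field
        enter_data({k: v for k, v in row.items() if k != "RESULT"})
        if verify() != expected:          # step 4: check the result
            failures.append(n)
    return failures                       # step 5 is the loop itself

# Demo with stub driver functions (hypothetical)
state = {}
rows = [{"F1": "A", "RESULT": "OK"}, {"F1": "@", "RESULT": "FIELD MSG1"}]
def enter_data(fields): state.update(fields)
def verify(): return "OK" if state["F1"].isalnum() else "FIELD MSG1"
print(run_data_driven(rows, enter_data, verify))  # [] -> every row passed
```

Because the procedure is driven entirely by the data tables, the same short script covers all 51 valid and invalid cases; only the tables change.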
Appendix B: EXAMPLE TEST PLANS AND INVENTORY
(The slides in this appendix are hidden in the handout.)