Quality Assurance
Presented By
Deepak Rathod
2
About Software.
● What is software?
– Software is a collection of computer programs that helps us to perform a task
● Types of Software
– System software
● Ex: Device drivers, OS, Servers, Utilities, etc.
– Programming software
● Ex: compilers, debuggers, interpreters, etc.
– Application software
● Ex: industrial automation, business software, games, telecoms, etc.
3
Type of Application
● Product Vs Project
– If a software application is developed for a specific customer's requirements, it is
called a Project.
● Example: ABC company's website for selling their products, such as toys,
beauty products, health products, etc.
– If a software application is developed for multiple customers' requirements, it is called
a Product.
● Example: ABC company offers its service to end users, such as WhatsApp,
Gmail, Facebook, etc.
4
Software Testing
● What is Software Testing?
– Software Testing is a part of software development process.
– Software Testing is an activity to detect and identify the defects in the software.
– The objective of testing is to release quality product to the client.
● Why do we need testing?
– Ensure that software is bug free.
– Ensure that system meets customer requirements and software specifications.
– Ensure that system meets end user expectations.
– Fixing the bugs identified after release is expensive.
5
Software Quality
● Quality:
– Quality is defined as the satisfaction of all the requirements of a customer in a product.
● Note:
– Quality is not defined in the product. It is defined in the customer's mind.
● Quality software is reasonably
– Bug-free.
– Delivered on time.
– Within budget.
– Meets requirements and/or expectations.
– Maintainable.
6
Software Quality
● Error, bug & failure
– Error
● Any incorrect human action that produces a problem in the system is called an
error.
– Defect/Bug
● Deviation from the expected behavior to the actual behavior of the system is
called defect.
– Failure
● The deviation identified by end-user while using the system is called a failure.
7
Why Bugs?
● Why there are bugs in software?
– Miscommunication or no communication
– Software complexity
– Programming errors
– Changing requirements
– Lack of skilled testers Etc.
8
SDLC
● Software Development Life Cycle (SDLC)
– SDLC, the Software Development Life Cycle, is a process used by the software industry to
design, develop and test high-quality software.
– The SDLC aims to produce high-quality software that meets customer
expectations.
● SDLC Models
– Waterfall Model
– Incremental Model
– Spiral Model
– V- Model
– Agile Model
9
Waterfall Model
10
Incremental/Iterative Model
11
Spiral Model
12
V- Model
13
Agile Model
14
QA Vs QC/QE
● QA is Process related.
● QC is the actual testing of the software.
● QA focuses on building in quality.
● QC focuses on testing for quality.
● QA is preventing defects.
● QC is detecting defects.
● QA is process oriented.
● QC is Product oriented.
● QA for entire life cycle.
● QC for testing part in SDLC.
15
Verification V/S Validation
● Verification checks whether we are building the system right.
● Verification typically involves
– Reviews.
– Walkthroughs.
– Inspections.
● Validation checks whether we are building the right system.
– Takes place after verifications are completed.
– Validation typically involves actual testing.
● System Testing.
16
Static V/S Dynamic Testing
● Static testing
– is an approach to test project documents in the form of Reviews, Walkthroughs and
Inspections.
● Dynamic testing
– is an approach to test the actual software by giving inputs and observing results.
17
Review, Walkthrough & Inspection
● Reviews:
– Conducted on documents to ensure correctness and completeness.
– Example:
– Requirement Reviews
– Design Reviews
– Code Reviews
– Test plan reviews
– Test cases reviews etc.
● Walkthroughs:
– It is an informal review in which issues can be discussed/raised at peer level.
– A walkthrough does not have minutes of the meeting; it can happen at any time and
conclude informally, with no schedule as such.
18
Review, Walkthrough & Inspection
● Inspections:
– It is a formal review with a defined schedule.
– At least 3-8 people sit in the meeting: a reader, a writer (scribe) and a moderator, plus
the concerned people.
– An inspection has a proper schedule, which is communicated via email to the
concerned developer/tester.
19
Levels of Software Testing
● Unit Testing
● Integration Testing
● System Testing
● User Acceptance Testing(UAT)
20
Level of Software Testing
21
Unit Testing
● A unit is the smallest testable part of software. It usually has one or a few inputs and
usually a single output.
● Unit testing is conducted on a single program or single module.
● Unit testing is a white-box testing technique.
● Unit testing is conducted by the developers.
● Unit testing techniques:
– Basis path testing
– Control structure testing
● Conditional coverage
● Loops Coverage
– Mutation Testing
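As an illustration of the idea (not part of the original slides), a unit test for a single function might look like the following, using Python's built-in unittest module; the `add` function is a hypothetical unit under test:

```python
import unittest

def add(a, b):
    # Hypothetical unit under test: the smallest testable piece of code.
    return a + b

class TestAdd(unittest.TestCase):
    # One test method per condition exercised on the unit.
    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_numbers(self):
        self.assertEqual(add(-1, -1), -2)

# Run the unit tests programmatically and collect the result.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Techniques such as basis path and conditional coverage then guide which test methods to write, so that every independent path through the unit is exercised.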
22
Integration Testing
● In Integration Testing, individual software modules are integrated logically and tested
as a group.
● Integration testing focuses on checking data communication amongst these modules.
● Integration testing is a white-box testing technique.
● Integration testing is conducted by the developers.
● Approaches:
– Top Down Approach
– Bottom Up Approach
– Sandwich Approach(Hybrid)
● Stub: Is called by the Module under Test.
● Driver: Calls the Module to be tested.
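To make the stub/driver distinction concrete, here is a minimal Python sketch; `OrderService` and `PaymentStub` are hypothetical names invented for this example:

```python
class PaymentStub:
    # Stub: stands in for a lower-level module CALLED BY the module under test
    # (used in top-down integration when the real module is not ready).
    def charge(self, amount):
        return "approved"  # canned answer instead of real payment logic

class OrderService:
    # Module under test: depends on the (stubbed) payment module.
    def __init__(self, payment):
        self.payment = payment

    def place_order(self, amount):
        return self.payment.charge(amount) == "approved"

def driver():
    # Driver: CALLS the module under test, supplying inputs and checking results
    # (bottom-up integration uses drivers like this for lower-level modules).
    service = OrderService(PaymentStub())
    return service.place_order(100)
```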
23
Bottom-Up Integration
● In the bottom-up strategy, lower-level modules are tested first and then integrated with
higher-level modules until all modules are tested.
● It takes help of Drivers for testing.
24
Top down Integration
● In the top-down approach, testing takes place from top to bottom, following the control
flow of the software system.
● Takes help of stubs for testing.
25
System Testing
● Testing the overall functionality of the application with respect to client requirements.
● It is a black box testing technique.
● This testing is conducted by testing team.
● Before conducting system testing we should know the requirements.
● System Testing focuses on the below aspects.
– User Interface Testing (GUI)
– Functional Testing
– Non-Functional Testing
– Usability Testing
26
User Acceptance Testing
● After completion of system testing, the UAT team conducts acceptance testing at two
levels.
– Alpha testing
● Alpha testing performed by Testers who are usually internal employees of the
organization
● Alpha Testing performed at developer's site
● Alpha testing is to ensure the quality of the product before moving to Beta
testing.
– Beta testing
● Beta testing is performed by Clients or End Users who are not employees of the
organization
● Beta testing is performed at the client's location or by the end users of the product
● Beta version of the software is released to a limited number of end-users of the
product to obtain feedback on the product quality.
27
Testing Methodologies
● White box Testing
● Black box Testing
● Grey box Testing
28
White Box Testing
● White Box Testing is conducted on the internal logic of the programs.
● Programming Skills are required.
● Example: Unit Testing & Integration Testing
29
Black Box Testing
● Testing is conducted on the functionality of the application, to check whether it is
working according to customer requirements or not.
● Example
– System Testing & UAT Testing
● This method attempts to find errors in the following categories:
– Incorrect or missing functions
– Interface errors
– Behavior or performance errors
– Initialization and termination errors
30
Grey Box Testing
● A combination of both white-box and black-box testing.
● Example:
– Database Testing
31
Black Box Testing Types
● GUI Testing
● Usability Testing
● Functional Testing
● Non-Functional Testing
32
What is GUI?
● There are two types of interfaces in a computer application.
– Command Line Interface is where you type text and computer responds to that
command.
– GUI stands for Graphical User Interface where you interact with the computer using
images rather than text.
33
GUI Testing
● Graphical User Interface (GUI) testing is the process of testing the system’s GUI.
● GUI testing involves checking the screens with the controls like menus, buttons, icons,
and all types of bars – tool bar, menu bar, dialog boxes and windows etc.
● During GUI testing, Test Engineers validate the user interface of the application for the
following aspects:
– Look & Feel
– Easy to use
– Navigations & Shortcut keys
● GUI Objects:
– Window, Dialog Box, Push Button, Radio Button, Radio Group, Tool bar, Edit Box,
Text Box, Check Box, List Box, Drop down Box, Combo Box, Tab, Tree view,
progress bar, Table, Scroll bar Etc.
34
Check list for GUI Testing
● Checks if all the basic elements are available on the page.
● Checks the spelling of the objects.
● Checks the alignment of the objects.
● Checks the content displayed in web pages.
● Checks if the mandatory fields are highlighted.
● Checks consistency of background color, font type and font size, etc.
35
Usability Testing
● Validates whether the application provides context-sensitive help to the user.
● Checking how easily end users are able to understand and operate the application is
called usability testing.
36
Functional Testing
● Object Properties Coverage
● Input Domain Coverage (BVA, ECP)
● Database Testing/Backend Coverage
● Error Handling Coverage
● Links Existence & Links Execution
● Cookies & Sessions
37
Object Properties Testing
● Every object has certain properties.
– Example
● Enable, Disable, Focus, Text etc..
● During functional testing, Test Engineers validate the properties of objects at run time.
38
Input Domain Testing
● During input domain testing, Test Engineers validate the data provided to the application
for value and length.
● There are two input domain techniques:
– Equivalence Class Partitioning (ECP)
– Boundary Value Analysis (BVA)
39
ECP & BVA
● Requirement:
– User name field allows only lower case letters, minimum 6 and maximum 8 characters.
● ECP for User Name:
– Valid: a-z
– Invalid: A-Z, 0-9, special characters (@, #, $, %, !, &, *, etc.)
● BVA for User Name:
– Min (6): Valid
– Min+1 (7): Valid
– Min-1 (5): Invalid
– Max (8): Valid
– Max+1 (9): Invalid
– Max-1 (7): Valid
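Both techniques translate directly into executable checks. A minimal Python sketch for the slide's user-name requirement (lowercase only, 6 to 8 letters); the `valid_username` helper is invented for this example:

```python
import re

def valid_username(name):
    # Requirement from the slide: only lowercase letters, length 6..8.
    return re.fullmatch(r"[a-z]{6,8}", name) is not None

# Boundary Value Analysis: min-1, min, min+1, max, max+1
assert not valid_username("a" * 5)   # min-1 (5) -> invalid
assert valid_username("a" * 6)       # min   (6) -> valid
assert valid_username("a" * 7)       # min+1 (7) -> valid
assert valid_username("a" * 8)       # max   (8) -> valid
assert not valid_username("a" * 9)   # max+1 (9) -> invalid

# Equivalence Class Partitioning: one representative value per class
assert valid_username("abcdef")      # valid class: a-z
assert not valid_username("ABCDEF")  # invalid class: A-Z
assert not valid_username("123456")  # invalid class: 0-9
assert not valid_username("abc@#$")  # invalid class: special characters
```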
40
Database Testing
● During Database testing Test Engineers validate the data w.r.t database.
● Validates DML operations (Insert, Update, Delete & Select).
● SQL language categories: DDL, DML, DCL, etc.
– DDL (Data Definition Language): Create, Alter, Drop
– DML (Data Manipulation Language): Insert, Update, Select, Delete
– DCL: Commit, Rollback, etc.
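As a sketch of what backend validation can look like, the DML operations above can be exercised against an in-memory SQLite database using Python's built-in `sqlite3` module; the `users` table is invented for this example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: Create
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# DML: Insert, Update, Select, Delete
cur.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
cur.execute("UPDATE users SET name = ? WHERE name = ?", ("bob", "alice"))
row = cur.execute("SELECT name FROM users").fetchone()
cur.execute("DELETE FROM users WHERE name = ?", ("bob",))
count = cur.execute("SELECT COUNT(*) FROM users").fetchone()[0]

# Commit (roll back instead with conn.rollback())
conn.commit()
```

A database test would assert that each operation left the table in the expected state, e.g. that the update produced `("bob",)` and the delete left a row count of zero.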
41
Error Handling Testing
● Validates the error messages thrown by the application when we provide invalid data.
● The error messages should be clear and easy for the user to understand.
42
Links Coverage
● Links existence: whether links are placed in the appropriate location or not.
● Links execution: whether a link navigates to the appropriate page or not.
● Types of links:-
– Internal links
– External links
– Broken links
43
Cookies & Sessions
● Cookie: temporary internet files created on the client side when we open websites.
These files contain user data.
● Session: time slots allocated to the user on the server side.
44
Non-Functional Testing
● Performance Testing
– Load Testing
– Stress Testing
– Volume Testing
● Security Testing
● Recovery Testing
● Compatibility Testing
● Configuration Testing
● Installation Testing
● Sanitation Testing
45
Performance Testing
● Load: Testing the speed of the system while increasing the load gradually up to the
customer-expected number.
● Stress: Testing the speed of the system while increasing/reducing the load on the system
to check where it breaks.
● Volume: Checking how much volume of data the system is able to handle.
46
Security Testing
● Testing security provided by the system
● Types:
– Authentication
– Access Control/Authorization
– Encryption/Decryption
47
Recovery Testing
● Testing the recovery provided by the system: whether the system recovers from an
abnormal condition to a normal condition or not.
48
Compatibility Testing
● Testing Compatibility of the system w.r.t OS, H/W & Browsers.
● Operating System Compatibility
● Hardware Compatibility (Configuration Testing)
● Browser Compatibility
● Forward & Backward Compatibility
49
Installation Testing
● Testing installation of the application on customer-expected platforms: check the
installation steps, navigation, and how much memory space is occupied.
● Check Un-Installation.
50
Sanitation/Garbage Testing
● Checks whether the application provides any extra/additional features beyond the
customer requirements.
51
Testing Terminology
● Adhoc Testing:
– Software testing performed without proper planning and documentation.
– Testing is carried out with the tester's knowledge of the application; the tester
tests randomly, without following the specifications/requirements.
● Monkey Testing:
– Testing the functionality randomly, without knowledge of the application or of test cases,
is called Monkey Testing.
52
Testing Terminology cont...
● Re- Testing:
– Testing functionality repetitively is called re-testing.
– Re-testing happens in the following two scenarios:
● Testing a functionality with multiple inputs to confirm whether the business validations
are implemented or not.
● Testing functionality in the modified build to confirm whether the bug fixes were made
correctly or not.
● Regression Testing:
– It is a process of identifying the features in the modified build where there is a
chance of side-effects, and re-testing those features.
– New functionality added to the existing system, or modifications made to it, may
introduce side-effects; a bug fix itself might introduce them, and regression
testing helps identify these side-effects.
53
Testing Terminology cont...
● Sanity Testing:
– After receiving a build with minor changes in code or functionality, a subset of
regression test cases is executed to check whether the changes rectified the software
bugs or issues and introduced no new ones.
– Sanity testing of the software can be done in later cycles, after thorough regression
test cycles.
– When moving a build from the staging/testing server to the production server, sanity
testing can be done to check whether the build is sane enough to move further to the
production server.
● Smoke Testing:
– Software testing done to ensure that the build can be accepted for thorough
software testing. Basically, it is done to check the stability of the build
received for testing.
– It is wide in scope: we deploy the new build with a clean database and then
perform CRUD operations across the entire application.
54
Testing Terminology cont...
● End to End Testing:
– Testing the overall functionalities of the system including the data integration among
all the modules is called end-to-end testing.
● Exploratory Testing:
– Exploring the application and understanding the functionalities adding or modifying
the existing test cases for better testing is called exploratory testing.
55
Testing Terminology cont...
● Globalization Testing (or) Internationalization Testing(I18N):
– Checks whether the application has a provision for setting and changing languages,
date and time formats, currency, etc. When the application is designed for global
users, this is called globalization testing.
● Localization Testing:
– Checks the default language, currency, date and time formats, etc. When the
application is designed for a particular locality of users, this is called localization testing.
56
Testing Terminology cont...
● Positive Testing (+ve ):
– Testing conducted on the application with a positive approach, to determine what the
system is supposed to do, is called positive testing.
– Positive testing helps check whether the application satisfies the customer
requirements.
● Negative Testing ( -ve):
– Testing a software application with a negative perception, to check what the system is
not supposed to do, is called negative testing.
57
Software Testing Life Cycle (STLC)
● Requirement Analysis
● Test Planning
● Test Designing
● Test Execution
● Test Closure
58
STLC (Software Testing Life Cycle)
59
Test Plan Definition
● A document describing the scope, approach, resources and schedule of testing
activities.
● It identifies test items, the feature to be tested, the testing tasks, who will do each
task, and any risks requiring contingency planning.
● Details of how the test will proceed, who will do the testing, what will be tested, how
much time the test will take, and to what quality level the test will be
performed.
60
Test Plan Contents
● Objective
● Features to be tested
● Features not to be tested
● Test Approach
● Entry Criteria
● Exit Criteria
● Suspension & Resumption criteria
● Test Environment
● Test Deliverables & Milestones
● Schedules
● Risks & Mitigations
● Approvals
61
Use case, Test Scenario & Test Case
● Use Case:
– Use case describes the requirement.
– A use case contains three items:
● Actor
● Action/Flow
● Outcome
● Test Scenario:
– Possible area to be tested (What to test)
● Test Case:
– How to test
– Describes test steps, expected result & actual result.
62
Use Case V/s Test Case
● Use Case
– Describes functional requirement, prepared by
Business Analyst(BA).
● Test Case
– Describes Test Steps/ Procedure, prepared by Test Engineer.
63
Test Case V/s Test Scenario
● Test Case:
– A test case consists of a set of input values, execution preconditions, expected
results and execution postconditions, developed to cover a certain test condition.
– Test cases are derived (or written) from test scenarios.
– A test case is 'How to be tested'.
● Test Scenario:
– A test scenario is nothing but a test procedure.
– Test scenarios are derived from use cases.
– A test scenario is 'What to be tested'.
64
Sample Login Test Scenario
● Example:-
– Test Scenario: Checking the functionality of Login button
– TC1: Click the button without entering user name and password.
● Expected Result: show a validation error message.
– TC2: Click the button only entering User name.
● Expected Result: show validation message “Password Required”.
– TC3: Click the button while entering wrong user name and wrong password.
● Expected Result: show validation message “Invalid username or password”.
– TC4: Click the button while entering valid user name and password.
● Expected Result: redirect on next landing page with successful login.
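The four test cases above can be scripted against a login function. The `login` implementation below is purely hypothetical (including the `USERS` credential store), invented only to mirror the expected results on this slide:

```python
# Hypothetical credential store; "deepak"/"secret123" is an assumed valid login.
USERS = {"deepak": "secret123"}

def login(username, password):
    # Returns the message the slide expects for each test case.
    if not username and not password:
        return "error message"
    if username and not password:
        return "Password Required"
    if USERS.get(username) != password:
        return "Invalid username or password"
    return "login successful"

assert login("", "") == "error message"                           # TC1
assert login("deepak", "") == "Password Required"                 # TC2
assert login("wrong", "wrong") == "Invalid username or password"  # TC3
assert login("deepak", "secret123") == "login successful"         # TC4
```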
65
Positive V/s Negative Test Cases
● Requirement:
– For example, a text box is listed as a feature, and the SRS mentions that the text box
accepts 6-20 characters and only alphabets.
● Positive Test Cases:
– Textbox accepts 6 characters.
– Textbox accepts up to 20 chars length.
– Textbox accepts any value in between 6-20 chars length.
– Textbox accepts all alphabets.
● Negative Test Cases:
– Textbox should not accept less than 6 chars.
– Textbox should not accept chars more than 20 chars.
– Textbox should not accept special characters. (~,@,#,$,%,^,&,* ,<,>,/,;,!,|,)
– Textbox should not accept numerals. (123456)
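A minimal sketch of how these positive and negative cases might be automated, assuming a hypothetical `valid_textbox` validator that implements the SRS rule (6-20 characters, alphabets only):

```python
def valid_textbox(value):
    # SRS rule from the slide: 6-20 characters, alphabets only.
    return 6 <= len(value) <= 20 and value.isalpha()

# Positive test cases
assert valid_textbox("abcdef")        # exactly 6 chars
assert valid_textbox("a" * 20)        # exactly 20 chars
assert valid_textbox("HelloWorld")    # in-between length, all alphabets

# Negative test cases
assert not valid_textbox("abcde")     # fewer than 6 chars
assert not valid_textbox("a" * 21)    # more than 20 chars
assert not valid_textbox("abc@def")   # special characters
assert not valid_textbox("123456")    # numerals
```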
66
Test case Design Techniques
● Boundary Value Analysis (BVA)
● Equivalence Class Partitioning (ECP)
● Decision Table Testing
● State Transition Diagram
● Use Case Testing
67
Decision Table
68
State Transition Diagram
69
Use Case
70
Test Suite
● A Test Suite is a group of test cases that belong to the same category.
71
Test Case Contents
● Test Case ID
● Test Case Title
● Description
● Pre-condition
● Priority ( P0, P1,P2,P3) – order
● Requirement ID
● Steps/Actions
● Expected Result
● Actual Result
● Test data
72
Test Case Template
73
Requirement Traceability Matrix(RTM)
● RTM – Requirement Traceability Matrix
● Used to map requirements to test cases.
74
RTM
75
Types of RTM
76
Advantage Of RTM
77
Characteristics of good Test Case
● A good test case has certain characteristics which are:
– Should be accurate and tests what it is intended to test.
– No unnecessary steps should be included in it.
– It should be reusable.
– It should be traceable to requirements.
– It should be compliant to regulations.
– It should be independent i.e. You should be able to execute it in any order without
any dependency on other test cases.
– It should be simple and clear, any tester should be able to understand it by reading
once.
78
Test Data
● Test data is required for test cases.
● Test data is provided as input to the test cases.
● Prepared by Test Engineers.
● Maintained in Excel sheets.
79
Test Environment Setup
● Test environment decides the software and hardware conditions under which a work
product is tested.
● Activities
– Understand the required architecture and environment set-up, and prepare the hardware
and software requirement list for the Test Environment.
– Setup test Environment and test data
– Perform smoke test on the build
● Deliverables
– Environment ready with test data set up
– Smoke Test Results.
80
Test Execution
● During this phase test team will carry out the testing based on the test plans and the
test cases prepared.
● Bugs will be reported back to the development team for correction and retesting will be
performed.
81
Defect Reporting
● Any mismatched functionality found in an application is called a Defect/Bug/Issue.
● During test execution, Test Engineers report mismatches as defects to
developers through templates or using tools.
● Defect Reporting Tools:
– Clear Quest
– DevTrack
– Jira
– Quality Center
– Bugzilla, etc.
82
Defect Report Contents
● Defect_ID - Unique identification number for the defect.
● Defect Description - Detailed description of the defect including information about the module in
which defect was found.
● Version - Version of the application in which defect was found.
● Steps - Detailed steps along with screenshots with which the developer can reproduce the defects.
● Date Raised - Date when the defect is raised
● Reference - Provide references to documents like requirements, design, architecture, or
even screenshots of the error, to help understand the defect
● Detected By - Name/ID of the tester who raised the defect
● Status - Status of the defect (more on this later)
● Fixed by - Name/ID of the developer who fixed it
● Date Closed - Date when the defect is closed
● Severity - Describes the impact of the defect on the application
● Priority - Related to the urgency of fixing the defect. Severity and Priority can each be
High/Medium/Low, based on the impact and on the urgency at which the defect should be
fixed, respectively
83
Defect Management Process
84
Defect Classification
85
Defect Severity
● Severity describes the seriousness of defect.
● In software testing, defect severity can be defined as the degree of impact a defect has
on the development or operation of a component application being tested.
● Defect severity can be categorized into four class
– Critical: This defect indicates complete shut-down of the process; nothing can
proceed further
– Major/High: A highly severe defect that collapses the system; however, certain
parts of the system remain functional
– Minor/Medium: It causes some undesirable behavior, but the system is still
functional
– Low: It won't cause any major break-down of the system
86
Defect Priority
● Priority describes the importance of defect.
● Defect Priority states the order in which defects should be fixed. The higher the
priority, the sooner the defect should be resolved.
● Defect priority can be categorized into three class
– P1 (Immediate/ High) : The defect must be resolved as soon as possible as it
affects the system severely and cannot be used until it is fixed
– P2 (Medium): The defect should be resolved in the normal course of development
activities. It can wait until a new version is created
– P3 (Low): The defect is an irritant, but repair can be done once the more serious
defects have been fixed
87
Tips for determining Severity of a Defect
88
A software system can have a
89
Severity & Priority
● A very low severity with a high priority:
– A logo error on a shipment website can be of low severity, as it is not going to affect
the functionality of the website, but of high priority, as you don't want any
further shipment to proceed with the wrong logo.
● A very high severity with a low priority:
– For a flight-operating website, a defect in the reservation functionality may be of high
severity but low priority, as it can be scheduled for release in a next cycle.
90
Some more examples...
● Example for High Priority & High Severity defect:
– If 'Login' is required for an application and users are unable to log in to the
application with valid credentials, such defects need to be fixed with high
importance, since they stop the customer from progressing further.
● Example for Low Priority and High Severity defect:
– If an application crashes after repeated use of a functionality, e.g. the 'Save' button
is used 200 times and then the application crashes, such defects have High Severity
because the application crashes, but Low Priority because there is no need to debug
right now; it can be debugged after some days.
● Example for High Priority & Low Severity defect:
– If, in a web application, the company name is misspelled, or the text "User Nam:" is
displayed instead of "User Name:" on the login page. In this case, defect severity is
low, as it is a spelling mistake, but priority is high because of its high visibility.
91
Conti...
● Low priority-Low severity
– A spelling mistake in a page not frequently navigated by users.
● Low priority-High severity
– Application crashing in a very rare corner case.
● High priority-Low severity
– Slight change in logo color or spelling mistake in company name.
● High priority-High severity
– Issue with login functionality.
92
Defect Resolution
● After receiving the defect report from the testing team, the development team conducts a
review meeting to fix defects. Then they send a resolution type to the testing team for
further communication.
● Resolution Types:-
– Accept
– Reject
– Duplicate
– Enhancement
– Need more information
– Not Reproducible
– Fixed
– As Designed
93
Defect Triage
● Defect triage is a process that tries to re-balance the workload when the test
team faces limited availability of resources.
● When there are a large number of defects and limited testers to verify them, defect
triage helps get as many defects as possible resolved, based on defect parameters like
severity and priority.
● Defect Triage Process:
94
The triage process includes the following steps
● Reviewing all the defects, including defects rejected by the team
● Initial assessment of the defects based on their content and the respective priority and
severity settings
● Prioritizing the defects based on the inputs
● Assigning the defect to the correct release (by the product manager)
● Redirecting the defect to the correct owner/team for further action
95
Guidelines that every tester should consider
before selecting severity
● Understand the concept of priority and severity well
● Always assign the severity level based on the issue type as this will affect its priority
● Understand how a particular scenario or test case would affect the end-user
● Consider how much time it would take to fix the defect, based on its complexity
and the time to verify the defect
96
Exercise
● Assign the severity for the following issues.
1. The website performance is too slow.
2. The login function of the website does not work properly.
3. The GUI of the website does not display correctly on mobile devices.
4. The website could not remember the user login session.
5. Some links don't work.
97
Recommended Answers
● 1. The website performance is too slow
– Severity: High. A performance bug can cause huge inconvenience to users.
● 2. The login function of the website does not work properly
– Severity: Critical/High. Login is one of the main functions of a banking website; if this
feature does not work, it is a serious bug.
● 3. The GUI of the website does not display correctly on mobile devices
– Severity: Medium. The defect affects users who view the website on a smartphone.
● 4. The website could not remember the user login session
– Severity: High. This is a serious issue, since the user will be able to log in but not be
able to perform any further transactions.
● 5. Some links don't work
– Severity: Low. This is an easy fix for the development team, and users can still access
the site without these links.
98
Tips for good Bug report
● Structure: test carefully (Use deliberate, careful approach to testing)
● Reproduce: test it again (rule of thumb 3 times)
● Generalize: test it elsewhere Does the same failure occur in other modules or locations?
● Compare: review results of similar tests (Same test run against earlier versions)
● Condense: trim unnecessary information (Eliminate extraneous words or steps)
● Disambiguate: use clear words (Goal: Clear, indisputable statements of fact)
● Neutralize: express problem impartially (Deliver bad news gently)
● Review: be sure (Peer review)
99
Bug Life Cycle
100
Jira Defect Workflow
101
States of Defects
● New: A bug is reported and is yet to be assigned to developer
● Open: The test lead approves the bug
● Assigned: Assigned to a developer and a fix is in progress
● Need more info: When the developer needs information to reproduce the bug or to fix
the bug
● Fixed: Bug is fixed and waiting for validation
● Closed: The bug is fixed, validated and closed
● Rejected: Bug is not genuine
● Deferred: Fix will be placed in future builds
● Duplicate: Two bugs mention the same concept
● Invalid: Not a valid bug
● Reopened: Bug still exists even after the bug is fixed by the developer
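One way to picture these states is as an allowed-transition map. The transitions below are an assumption chosen for illustration; real workflows (e.g. in Jira) are configured per team:

```python
# Assumed bug life cycle: which target states each state may move to.
TRANSITIONS = {
    "New":            {"Open", "Rejected", "Duplicate", "Invalid"},
    "Open":           {"Assigned", "Deferred"},
    "Assigned":       {"Fixed", "Need more info"},
    "Need more info": {"Assigned"},
    "Fixed":          {"Closed", "Reopened"},
    "Reopened":       {"Assigned"},
    "Deferred":       {"Open"},
    "Closed":         set(),
    "Rejected":       set(),
    "Duplicate":      set(),
    "Invalid":        set(),
}

def can_move(current, target):
    # True if the workflow allows moving the defect from current to target.
    return target in TRANSITIONS.get(current, set())

assert can_move("New", "Open")
assert can_move("Fixed", "Reopened")
assert not can_move("New", "Closed")  # a new bug is not closed directly here
```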
102
Test Cycle Closure
● Activities
– Evaluate cycle completion criteria based on Time, Test coverage, Cost, Software,
Critical Business Objectives , Quality
– Prepare test metrics based on the above parameters.
– Document the learning out of the project
– Prepare Test closure report
– Qualitative and quantitative reporting of quality of the work product to the customer.
– Test result analysis to find out the defect distribution by type and severity.
● Deliverables
– Test Closure report
– Test metrics
103
QA/Testing Activities
● Understanding the requirements and functional specifications of the application.
● Identifying required Test Scenario’s.
● Designing Test Cases to validate application.
● Execute Test Cases to validate the application.
● Log Test results ( How many test cases pass/fail ).
● Defect reporting and tracking.
● Retest fixed defects of previous build
● Perform various types of testing assigned by the Test Lead (functionality, usability, user
interface and compatibility), etc.
● Report to the Test Lead on the status of assigned tasks.
● Participate in regular team meetings.
● Creating automation scripts for Regression Testing.
● Provides recommendation on whether or not the application / system is ready for production.
104
Test Metrics
Sr. No. Required Data
1. No. of Requirements
2. Avg. No. of Test Cases written per Requirement
3. Total No. of Test Cases written for all Requirements
4. Total No. of Test Cases Executed
5. No. of Test Cases Passed
6. No. of Test Cases Failed
7. No. of Test Cases Blocked
8. No. of Test Cases Unexecuted
9. Total No. of Defects Identified
10. Critical Defects Count
11. High Defects Count
12. Medium Defects Count
13. Low Defects Count
14. Customer Defects
15. No. of Defects found in UAT
105
Continue
● % of Test cases Executed:
– (No. of Test cases executed / Total No. of Test cases written) * 100
– Example: (50/100)*100 = 50%
● % of test cases NOT executed:
– (No.of Test cases NOT executed/Total No. of Test cases written) * 100
– (10/100)*100 = 10%
● % Test cases passed
– (No.of Test cases Passed /Total Test cases executed) * 100
– (40/50)*100 = 80%
● % Test cases failed
– (No.of Test cases failed / Total Test cases executed) * 100
– (10/50)*100 = 20%
● %Test cases blocked
– (No.of test cases blocked / Total Test cases executed ) * 100
– (5/50) * 100 = 10%
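These percentage formulas are simple enough to compute directly; the sketch below reuses the example figures from the slide:

```python
def pct(part, whole):
    # Percentage helper behind each of the slide's formulas.
    return part / whole * 100

# Example figures from the slide
written, executed = 100, 50
passed, failed, blocked = 40, 10, 5

executed_pct = pct(executed, written)  # 50.0 -> % of test cases executed
passed_pct = pct(passed, executed)     # 80.0 -> % test cases passed
failed_pct = pct(failed, executed)     # 20.0 -> % test cases failed
blocked_pct = pct(blocked, executed)   # 10.0 -> % test cases blocked
```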
106
Continue
● Defect Density: Number of defects identified per requirement.
– No. of defects found / Size (No. of requirements)
– Example: 25/5 = 5 defects per requirement
● Defect Removal Efficiency (DRE):
– (A / A+B ) * 100
– (Fixed Defects / (Fixed Defects + Missed defects) ) * 100
● A- Defects identified during testing/ Fixed Defects (Example - 25)
● B- Defects identified by the customer/Missed defects (Example - 5)
– (25/(25+5))
● (25/30) * 100 = 83.33%
● Defect Leakage:
– (No.of defects found in UAT / No. of defects found in Testing) * 100
– (5/25) * 100 = 20%
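These three defect metrics can be sketched in Python; the counts are the slide's worked example, not real project data:

```python
# Hypothetical defect counts, matching the slide's example.
requirements = 5
defects_in_testing = 25   # defects found (and fixed) during testing
defects_in_uat = 5        # defects missed by testing, found by the customer in UAT

# Defects per requirement (a ratio, not a percentage).
defect_density = defects_in_testing / requirements  # 5.0

# Share of all defects that were caught before release.
dre = defects_in_testing / (defects_in_testing + defects_in_uat) * 100  # ~83.33

# UAT defects relative to defects caught in testing.
defect_leakage = defects_in_uat / defects_in_testing * 100  # 20.0
```

A higher DRE and lower leakage both indicate that the test phase is catching most defects before the customer does.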
107
Continued
● Defect Rejection Ratio:
– (No. of Defects Rejected / Total No. of Defects Raised) * 100
– Example: (4/29) * 100 = 13.79%
● Defect Age:
– Fixed Date - Reported Date
● Customer Satisfaction:
– No. of complaints per period of time
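The rejection ratio and defect age can likewise be computed directly; the counts and dates below are hypothetical illustrations:

```python
from datetime import date

# Hypothetical counts, matching the slide's example.
defects_rejected = 4
defects_raised = 29
rejection_ratio = defects_rejected / defects_raised * 100  # ~13.79

# Hypothetical reported/fixed dates for one defect.
reported = date(2024, 3, 1)
fixed = date(2024, 3, 15)
defect_age_days = (fixed - reported).days  # 14
```

Defect age is usually tracked per defect and then averaged across a release to spot slow fix turnaround.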
108
Test Efficiency Vs Test Effectiveness
● Effectiveness
– How completely and accurately users achieve their goals using the system (the
process).
● Efficiency
– The resources consumed to achieve those goals.
109
What is Agile?
● Agile is a methodology that promotes continuous iteration of development and
testing throughout the software development life cycle of the project.
● Agile software development emphasizes four core values:
– Individuals and interactions over processes and tools
– Working software over comprehensive documentation
– Customer collaboration over contract negotiation
– Responding to change over following a plan
110
What are Agile Promises?
111
Agile
112
Agile
113
Agile
114
Thank You...
Any Questions?

  • 2. 2 About Software. ● What is software? – A Software is a collection of computer programs that helps us to perform a task ● Types of Software – System software ● Ex: Device drivers, OS, Servers, Utilities, etc. – Programming software ● Ex: compilers, debuggers, interpreters, etc. – Application software ● Ex: industrial automation, business software, games, telecoms, etc.
  • 3. 3 Type of Application ● Product Vs Project – If software application is developed for specific customer requirement then it is called Project. ● Example: ABC company’s webside for selling theire product such as toys, beatuy product, health product, etc. – If software application is developed for multiple customers requirement then it called Product. ● Example: ABC company offer their service to end users such as whats app, gmail, facebook, etc.
  • 4. 4 Software Testing ● What is Software Testing? – Software Testing is a part of software development process. – Software Testing is an activity to detect and identify the defects in the software. – The objective of testing is to release quality product to the client. ● Why do we need testing? – Ensure that software is bug free. – Ensure that system meets customer requirements and software specifications. – Ensure that system meets end user expectations. – Fixing the bugs identified after release is expensive.
  • 5. 5 Software Quality ● Quality: – Quality is defined as justification of all the requirements of a customer in a product. ● Note: – Quality is not defined in the product. It is defined in the customer`s mind. ● Quality software is reasonably – Bug-free. – Delivered on time. – Within budget. – Meets requirements and/or expectations. – Maintainable.
  • 6. 6 Software Quality ● Error, bug & failure – Error ● Any incorrect human action that produces a problem in the system is called an error. – Defect/Bug ● Deviation from the expected behavior to the actual behavior of the system is called defect. – Failure ● The deviation identified by end-user while using the system is called a failure.
  • 7. 7 Why Bugs? ● Why there are bugs in software? – Miscommunication or no communication – Software complexity – Programming errors – Changing requirements – Lack of skilled testers Etc.
  • 8. 8 SDLC ● Software Development Life Cycle (SDLC) – SDLC, Software Development Life Cycle is a process used by software industry to design, develop and test high quality software's. – The SDLC aims to produce a high quality software that meets customer expectations. ● SDLC Models – Waterfall Model – Incremental Model – Spiral Model – V- Model – Agile Model
  • 14. 14 QA Vs QC/QE ● QA is Process related. ● QC is the actual testing of the software. ● QA focuses on building in quality. ● QC focuses on testing for quality. ● QA is preventing defects. ● QC is detecting defects. ● QA is process oriented. ● QC is Product oriented. ● QA for entire life cycle. ● QC for testing part in SDLC.
  • 15. 15 Verification V/S Validation ● Verification checks whether we are building the right system. ● Verification typically involves – Reviews. – Walkthroughs. – Inspections. ● Validation checks whether we are building the system right. – Takes place after verifications are completed. – Validation typically involves actual testing. ● System Testing.
  • 16. 16 Static V/S Dynamic Testing ● Static testing – is an approach to test project documents in the form of Reviews, Walkthroughs and Inspections. ● Dynamic testing – is an approach to test the actual software by giving inputs and observing results.
  • 17. 17 Review, Walkthrough & Inspection ● Reviews: – Conducts on documents to ensure correctness and completeness. – Example: – Requirement Reviews – Design Reviews – Code Reviews – Test plan reviews – Test cases reviews etc. ● Walkthroughs: – It is a formal review and we can discuss/raise the issues at peer level. – Also walkthrough does not have minutes of the meet. It can happen at any time and conclude just like that no schedule as such.
  • 18. 18 Review, Walkthrough & Inspection ● Inspections: – Its a formal approach to the requirements schedule. – At least 3- 8 people will sit in the meeting 1- reader 2-writer 3- moderator plus concerned. – Inspection will have a proper schedule which will be intimated via email to the concerned developer/tester.
  • 19. 19 Levels of Software Testing ● Unit Testing ● Integration Testing ● System Testing ● User Acceptance Testing(UAT)
  • 21. 21 Unit Testing ● A unit is the smallest testable part of software. It usually has one or a few inputs and usually a single output. ● Unit testing conducts on a single program or single module. ● Unit Testing is white box testing technique. ● Unit testing is conducted by the developers. ● Unit testing techniques: – Basis path testing – Control structure testing ● Conditional coverage ● Loops Coverage – Mutation Testing
  • 22. 22 Integration Testing ● In Integration Testing, individual software modules are integrated logically and tested as a group. ● Integration testing focuses on checking data communication amongst these modules. ● Integrated Testing is white box testing technique. ● Integrated testing is conducted by the developers. ● Approaches: – Top Down Approach – Bottom Up Approach – Sandwich Approach(Hybrid) ● Stub: Is called by the Module under Test. ● Driver: Calls the Module to be tested.
  • 23. 23 Bottom-Up Integration ● In the bottom up strategy, each module at lower levels is tested with higher modules until all modules are tested. ● It takes help of Drivers for testing.
  • 24. 24 Top down Integration ● In Top to down approach, testing takes place from top to down following the control flow of the software system. ● Takes help of stubs for testing.
  • 25. 25 System Testing ● Testing over all functionality of the application with respective client requirements. ● It is a black box testing technique. ● This testing is conducted by testing team. ● Before conducting system testing we should know the requirements. ● System Testing focusses on below aspects. – User Interface Testing (GUI) – Functional Testing – Non-Functional Testing – Usability Testing
  • 26. 26 User Acceptance Testing ● After completion of system testing UAT team conducts acceptance testing in two levels. – Alpha testing ● Alpha testing performed by Testers who are usually internal employees of the organization ● Alpha Testing performed at developer's site ● Alpha testing is to ensure the quality of the product before moving to Beta testing. – Beta testing ● Beta testing is performed by Clients or End Users who are not employees of the organization ● Beta testing is performed at client location or end user of the product ● Beta version of the software is released to a limited number of end-users of the product to obtain feedback on the product quality.
  • 27. 27 Testing Methodologies ● White box Testing ● Black box Testing ● Grey box Testing
  • 28. 28 White Box Testing ● White Box Testing conducts on internal logic of the programs. ● Programming Skills are required. ● Example: Unit Testing & Integration Testing
  • 29. 29 Block Box Testing ● Testing conducts on functionality of the application whether it is working according to customer requirements or not. ● Example – System Testing & UAT Testing ● This method attempts to find errors in the following categories: – Incorrect or missing functions – Interface errors – Behavior or performance errors – Initialization and termination errors
  • 30. 30 Grey Box Testing ● Both combination of white box and black box testing. ● Example: – Database Testing
  • 31. 31 Black Box Testing Types ● GUI Testing ● Usability Testing ● Functional Testing ● Non-Functional Testing
  • 32. 32 What is GUI? ● There are two types of interfaces in a computer application. – Command Line Interface is where you type text and computer responds to that command. – GUI stands for Graphical User Interface where you interact with the computer using images rather than text.
  • 33. 33 GUI Testing ● Graphical User Interface (GUI) testing is the process of testing the system’s GUI. ● GUI testing involves checking the screens with the controls like menus, buttons, icons, and all types of bars – tool bar, menu bar, dialog boxes and windows etc. ● During GUI Testing Test Engineers validates user interface of the application as following aspects: – Look & Feel – Easy to use – Navigations & Shortcut keys ● GUI Objects: – Window, Dialog Box, Push Button, Radio Button, Radio Group, Tool bar, Edit Box, Text Box, Check Box, List Box, Drop down Box, Combo Box, Tab, Tree view, progress bar, Table, Scroll bar Etc.
  • 34. 34 Check list for GUI Testing ● It checks if all the basic elements are available in the page or not. ● It checks the spelling of the objects. ● It checks alignments of the objects. ● It checks content display in web pages. ● It checks if the mandatory fields are highlights or not. ● It checks consistency in background color and color font type and fond size etc.
  • 35. 35 Usability Testing ● During this testing validates application provided context sensitive help or not to the user. ● Checks how easily the end users are able to understand and operate the application is called usability testing.
  • 36. 36 Functional Testing ● Object Properties Coverage ● Input Domain Coverage (BVA, ECP) ● Database Testing/Backend Coverage ● Error Handling Coverage ● Links Existence & Links Execution ● Cookies & Sessions
  • 37. 37 Object Properties Testing ● Every object has certain properties. – Example ● Enable, Disable, Focus, Text etc.. ● During Functional testing Test Engineers validate properties of objects in run time.
  • 38. 38 Input Domain Testing ● During Input domain testing Test Engineers validate data provided to the application value and length. ● There are two techniques in Input domain Techniques. – Equalance Class Partition (ECP) – Boundary Value Analysis(BVA)
  • 39. 39 ECP & BVA ● Requirement:- – • User name field allows only lower case with min 6 max 8 letters. – ECP for User Name BVA for User Name Valid Invalid a....z A.....Z 0......9 Spacial Characters (@,#,$,%,!,&,*,etc) Parameters Value Result Min 6 Valid Min+1 7 Valid Min-1 5 Invalid Max 8 Valid Max+1 9 Invalid Max-1 7 Valid
  • 40. 40 Database Testing ● During Database testing Test Engineers validate the data w.r.t database. ● ValidatesDML operations( Insert , Update, Delete & Select) ● SQL Language: DDL, DML, DCL etc.. – DDL - Data Definition Langue - Create , alter, drop – DML - Data Manipulation language - Insert, update, select, delete – DCL - Commit, roll back etc.
  • 41. 41 Error Handling Testing ● Validate error messages thrown by the application when we provide invalid data. ● The error messages should be clear and easy to understand to the user.
  • 42. 42 Links Coverage ● Links existence - Links placed in the appropriate location or not. ● Links execution - link is navigating to appropriate page or not. ● Types of links:- – Internal links – External links – Broken links
  • 43. 43 Cookies & Sessions ● Cookie - Temporary internet files which are created on the client side when we open web sites. These files contain user data. ● Session - Sessions are time slots which are allocated to the user on the server side.
  • 44. 44 Non-Functional Testing ● Performance Testing – Load Testing – Stress Testing – Volume Testing ● Security Testing ● Recovery Testing ● Compatibility Testing ● Configuration Testing ● Installation Testing ● Sanitation Testing
  • 45. 45 Performance Testing ● Load: Testing the speed of the system while increasing the load gradually up to the customer-expected number of users. ● Stress: Testing the speed of the system while increasing/reducing the load on the system to check where it breaks. ● Volume: Checking how much volume of data the system is able to handle.
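The load-testing idea above can be illustrated with a toy sketch: measure elapsed time while gradually increasing the number of requests. `handle_request` is a made-up stand-in for the system under test; real load tests use dedicated tools (e.g. JMeter) and concurrent users:

```python
import time

def handle_request(payload_size: int) -> int:
    # Simulated unit of work standing in for one request.
    return sum(range(payload_size))

def measure(load: int) -> float:
    # Total time to serve `load` sequential requests.
    start = time.perf_counter()
    for _ in range(load):
        handle_request(10_000)
    return time.perf_counter() - start

# Gradually increase the load toward the "customer expected number".
for load in (1, 10, 100):
    print(f"load={load:>3} requests -> {measure(load):.4f}s")
```

Load testing stops at the expected peak; stress testing keeps increasing the load beyond it until the system breaks.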
  • 46. 46 Security Testing ● Testing security provided by the system ● Types: – Authentication – Access Control/Authorization – Encryption/Decryption
  • 47. 47 Recovery Testing ● Testing the recovery provided by the system, i.e. whether the system recovers from an abnormal condition to a normal condition or not.
  • 48. 48 Compatibility Testing ● Testing Compatibility of the system w.r.t OS, H/W & Browsers. ● Operating System Compatibility ● Hardware Compatibility (Configuration Testing) ● Browser Compatibility ● Forward & Backward Compatibility
  • 49. 49 Installation Testing ● Testing installation of the application on customer-expected platforms: check the installation steps, navigation, and how much memory space is occupied. ● Check un-installation.
  • 50. 50 Sanitation/Garbage Testing ● Checks whether the application provides any extra/additional features beyond the customer requirements.
  • 51. 51 Testing Terminology ● Adhoc Testing: – Software testing performed without proper planning and documentation. – Testing is carried out using the tester's knowledge of the application, and the tester tests randomly without following the specifications/requirements. ● Monkey Testing: – Testing the functionality randomly, without knowledge of the application and without test cases, is called Monkey Testing.
  • 52. 52 Testing Terminology cont... ● Re-Testing: – Testing functionality repetitively is called re-testing. – Re-testing is introduced in the following two scenarios. ● Testing functionality with multiple inputs to confirm whether the business validations are implemented (or) not. ● Testing functionality in the modified build to confirm whether the bug fixes are made correctly (or) not. ● Regression Testing: – It is the process of identifying various features in the modified build where there is a chance of side-effects, and retesting these features. – This applies to new functionalities added to the existing system (or) modifications made to the existing system. It must be noted that a bug fix might introduce side-effects, and regression testing is helpful to identify these side-effects.
  • 53. 53 Testing Terminology cont... ● Sanity Testing: – After receiving a build with minor changes in the code or functionality, a subset of regression test cases is executed to check whether the reported bugs are rectified and no other bug is introduced by the changes. – Sanity testing can be done in later cycles, after thorough regression test cycles. – If we are moving a build from the staging/testing server to the production server, sanity testing can be done to check whether the build is sane enough to move further to the production server or not. ● Smoke Testing: – Software testing done to ensure that the build can be accepted for thorough software testing. Basically, it is done to check the stability of the build received for testing. – This has a wider scope: we deploy the new build with a clean database and then perform CRUD operations across the entire application.
  • 54. 54 Testing Terminology cont... ● End to End Testing: – Testing the overall functionalities of the system including the data integration among all the modules is called end-to-end testing. ● Exploratory Testing: – Exploring the application and understanding the functionalities adding or modifying the existing test cases for better testing is called exploratory testing.
  • 55. 55 Testing Terminology cont... ● Globalization Testing (or) Internationalization Testing (I18N): – Checks whether the application has a provision for setting and changing the language, date and time format, currency, etc. If it is designed for global users, this is called globalization testing. ● Localization Testing: – Checks the default language, currency, date and time format, etc. If it is designed for a particular locality of users, this is called localization testing.
  • 56. 56 Testing Terminology cont... ● Positive Testing (+ve): – Testing conducted on the application with a positive approach, to determine what the system is supposed to do, is called positive testing. – Positive testing helps in checking whether the application justifies the customer requirements or not. ● Negative Testing (-ve): – Testing a software application with a negative perception, to check what the system is not supposed to do, is called negative testing.
  • 57. 57 Software Testing Life Cycle (STLC) ● Requirement Analysis ● Test Planning ● Test Designing ● Test Execution ● Test Closure
  • 59. 59 Test Plan Definition ● A document describing the scope, approach, resources and schedule of testing activities. ● It identifies test items, the feature to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning. ● A detail of how the test will proceed, who will do the testing, what will be tested, in how much time the test will take place, and to what quality level the test will be performed.
  • 60. 60 Test Plan Contents ● Objective ● Features to be tested ● Features not to be tested ● Test Approach ● Entry Criteria ● Exit Criteria ● Suspension & Resumption criteria ● Test Environment ● Test Deliverables & Milestones ● Schedules ● Risks & Mitigations ● Approvals
  • 61. 61 Use case, Test Scenario & Test Case ● Use Case: – Use case describes the requirement. – Use case contains THREE Items . ● Actor ● Action/Flow ● Outcome ● Test Scenario: – Possible area to be tested (What to test) ● Test Case: – How to test – Describes test steps, expected result & actual result.
  • 62. 62 Use Case V/s Test Case ● Use Case – Describes functional requirement, prepared by Business Analyst(BA). ● Test Case – Describes Test Steps/ Procedure, prepared by Test Engineer.
  • 63. 63 Test Case V/s Test Scenario
  ● Test Case:
  – Consists of a set of input values, execution preconditions, expected results and execution postconditions, developed to cover a certain test condition; it is nothing but the test procedure.
  – Test cases are derived (or written) from test scenarios.
  – Test Case is ‘How to be tested’.
  ● Test Scenario:
  – A possible area to be tested.
  – Scenarios are derived from use cases.
  – Test Scenario is ‘What to be tested’.
  • 64. 64 Sample Login Test Scenario ● Example:- – Test Scenario: Checking the functionality of the Login button – TC1: Click the button without entering user name and password. ● Expected Result: show a validation error message. – TC2: Click the button after entering only the user name. ● Expected Result: show validation message “Password Required”. – TC3: Click the button after entering a wrong user name and wrong password. ● Expected Result: show validation message “Invalid username or password”. – TC4: Click the button after entering a valid user name and password. ● Expected Result: redirect to the next landing page with a successful login.
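The four login test cases above can be sketched as automated checks. This assumes a hypothetical `validate_login` function and illustrative test data; the message strings are assumptions chosen to match the scenario:

```python
# Illustrative credential store for the system under test.
VALID_USERS = {"deepak": "secret123"}

def validate_login(username: str, password: str) -> str:
    # Hypothetical implementation of the login behaviour under test.
    if not username and not password:
        return "Username and Password Required"
    if username and not password:
        return "Password Required"
    if VALID_USERS.get(username) != password:
        return "Invalid username or password"
    return "Login Successful"

assert validate_login("", "") == "Username and Password Required"          # TC1
assert validate_login("deepak", "") == "Password Required"                 # TC2
assert validate_login("wrong", "wrong") == "Invalid username or password"  # TC3
assert validate_login("deepak", "secret123") == "Login Successful"         # TC4
```

Note how TC1-TC3 are negative test cases and TC4 is the positive case, mirroring the positive/negative split on the next slide.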
  • 65. 65 Positive V/s Negative Test Cases ● Requirement: – For example, if a text box is listed as a feature and the SRS states that the text box accepts 6 - 20 characters and only alphabets. ● Positive Test Cases: – Textbox accepts 6 characters. – Textbox accepts up to 20 characters. – Textbox accepts any value between 6-20 characters in length. – Textbox accepts all alphabets. ● Negative Test Cases: – Textbox should not accept less than 6 characters. – Textbox should not accept more than 20 characters. – Textbox should not accept special characters. (~,@,#,$,%,^,&,* ,<,>,/,;,!,|,) – Textbox should not accept numerals. (123456)
  • 66. 66 Test case Design Techniques ● Boundary Value Analysis (BVA) ● Equivalence Class Partitioning (ECP) ● Decision Table Testing ● State Transition Diagram ● Use Case Testing
  • 70. 70 Test Suite ● Test Suite is group of test cases which belongs to same category.
  • 71. 71 Test Case Contents ● Test Case ID ● Test Case Title ● Description ● Pre-condition ● Priority ( P0, P1,P2,P3) – order ● Requirement ID ● Steps/Actions ● Expected Result ● Actual Result ● Test data
  • 73. 73 Requirement Traceability Matrix (RTM) ● RTM – Requirement Traceability Matrix ● Used for mapping requirements w.r.t. test cases, so every requirement can be traced to the test cases that cover it.
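In its simplest form an RTM is just a requirement-to-test-case mapping, which makes coverage gaps easy to find. A minimal sketch with made-up requirement and test case IDs:

```python
# Hypothetical RTM: requirement ID -> test case IDs covering it.
rtm = {
    "REQ-001": ["TC-001", "TC-002"],
    "REQ-002": ["TC-003"],
    "REQ-003": [],  # no test case yet -> coverage gap
}

# Traceability check: every requirement must map to at least one test case.
uncovered = [req for req, tcs in rtm.items() if not tcs]
print("Requirements without coverage:", uncovered)  # ['REQ-003']
```

In practice the RTM usually lives in a spreadsheet or test-management tool, but the underlying check is exactly this mapping.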
  • 77. 77 Characteristics of a good Test Case ● A good test case has certain characteristics: – It should be accurate and test what it is intended to test. – It should include no unnecessary steps. – It should be reusable. – It should be traceable to requirements. – It should be compliant with regulations. – It should be independent, i.e. you should be able to execute it in any order without any dependency on other test cases. – It should be simple and clear; any tester should be able to understand it by reading it once.
  • 78. 78 Test Data ● Test data is required for test cases. ● Test data is provided as input to the test cases. ● Prepared by Test Engineers. ● Maintained in Excel sheets.
  • 79. 79 Test Environment Setup ● Test environment decides the software and hardware conditions under which a work product is tested. ● Activities – understand the required architecture, environment set-up and prepare hardware and software requirement list for the Test Environment. – Setup test Environment and test data – Perform smoke test on the build ● Deliverables – Environment ready with test data set up – Smoke Test Results.
  • 80. 80 Test Execution ● During this phase test team will carry out the testing based on the test plans and the test cases prepared. ● Bugs will be reported back to the development team for correction and retesting will be performed.
  • 81. 81 Defect Reporting ● Any mismatched functionality found in an application is called a Defect/Bug/Issue. ● During test execution, Test Engineers report mismatches as defects to developers through templates or using tools. ● Defect Reporting Tools: – ClearQuest – DevTrack – Jira – Quality Center – Bugzilla etc.
  • 82. 82 Defect Report Contents ● Defect_ID - Unique identification number for the defect. ● Defect Description - Detailed description of the defect, including information about the module in which it was found. ● Version - Version of the application in which the defect was found. ● Steps - Detailed steps, along with screenshots, with which the developer can reproduce the defect. ● Date Raised - Date when the defect was raised. ● Reference - References to documents like requirements, design, architecture, or even screenshots of the error, to help understand the defect. ● Detected By - Name/ID of the tester who raised the defect. ● Status - Status of the defect (more on this later). ● Fixed By - Name/ID of the developer who fixed it. ● Date Closed - Date when the defect was closed. ● Severity - Describes the impact of the defect on the application (High/Medium/Low). ● Priority - Describes the urgency with which the defect should be fixed (High/Medium/Low).
  • 85. 85 Defect Severity ● Severity describes the seriousness of a defect. ● In software testing, defect severity can be defined as the degree of impact a defect has on the development or operation of the component or application being tested. ● Defect severity can be categorized into four classes: – Critical: This defect indicates a complete shut-down of the process; nothing can proceed further – Major/High: A highly severe defect that collapses the system. However, certain parts of the system remain functional – Minor/Medium: It causes some undesirable behavior, but the system is still functional – Low: It won't cause any major break-down of the system
  • 86. 86 Defect Priority ● Priority describes the importance of a defect. ● Defect priority states the order in which defects should be fixed: the higher the priority, the sooner the defect should be resolved. ● Defect priority can be categorized into three classes: – P1 (Immediate/High): The defect must be resolved as soon as possible, as it affects the system severely and the system cannot be used until it is fixed – P2 (Medium): The defect should be resolved during the normal course of development activities. It can wait until a new version is created – P3 (Low): The defect is an irritant, but the repair can be done once the more serious defects have been fixed
  • 87. 87 Tips for determining Severity of a Defect
  • 89. 89 Severity & Priority ● A very low severity with a high priority: – A logo error on a shipment website can be of low severity, as it is not going to affect the functionality of the website, but of high priority, as you don't want any further shipments to proceed with the wrong logo. ● A very high severity with a low priority: – For a flight operating website, a defect in the reservation functionality may be of high severity but can be a low priority, as it can be scheduled for release in the next cycle.
  • 90. 90 Some more examples... ● Example of a High Priority & High Severity defect: – 'Login' is required for an application and users are unable to log in with valid credentials. Such defects need to be fixed with high importance, since they stop the customer from progressing further. ● Example of a Low Priority and High Severity defect: – An application crashes after repeated use of some functionality, i.e. the 'Save' button is used 200 times and then the application crashes. Such defects have High Severity because the application crashes, but Low Priority because there is no need to debug it right away. ● Example of a High Priority & Low Severity defect: – In a web application, the company name is misspelled, or the text “User Nam:” is displayed instead of “User Name:” on the login page. In this case, defect severity is low, as it is a spelling mistake, but priority is high because of its high visibility.
  • 91. 91 Conti... ● Low priority - Low severity – A spelling mistake on a page not frequently visited by users. ● Low priority - High severity – Application crashing in a rare corner case. ● High priority - Low severity – A slight change in logo color or a spelling mistake in the company name. ● High priority - High severity – An issue with the login functionality.
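The examples above imply an ordering rule for the defect backlog: fix by priority first, then by severity within the same priority. A minimal sketch (the rank values and defect IDs are illustrative assumptions):

```python
# Lower rank = more urgent / more serious.
PRIORITY = {"High": 0, "Medium": 1, "Low": 2}
SEVERITY = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

defects = [
    {"id": "D1", "priority": "Low", "severity": "High"},   # corner-case crash
    {"id": "D2", "priority": "High", "severity": "Low"},   # misspelled logo
    {"id": "D3", "priority": "High", "severity": "High"},  # login broken
    {"id": "D4", "priority": "Low", "severity": "Low"},    # rare-page typo
]

# Sort by priority first, then severity as the tie-breaker.
defects.sort(key=lambda d: (PRIORITY[d["priority"]], SEVERITY[d["severity"]]))
print([d["id"] for d in defects])  # ['D3', 'D2', 'D1', 'D4']
```

Note that the high-priority/low-severity logo defect (D2) is fixed before the low-priority/high-severity crash (D1), matching the slide's examples.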
  • 92. 92 Defect Resolution ● After receiving the defect report from the testing team, development team conduct a review meeting to fix defects. Then they send a Resolution Type to the testing team for further communication. ● Resolution Types:- – Accept – Reject – Duplicate – Enhancement – Need more information – Not Reproducible – Fixed – As Designed
  • 93. 93 Defect Triage ● Defect triage is a process that tries to re-balance the workload when the test team faces limited availability of resources. ● When there is a large number of defects and limited testers to verify them, defect triage helps get as many defects as possible resolved, based on defect parameters like severity and priority. ● Defect Triage Process:
  • 94. 94 The triage process includes following steps ● Reviewing all the defects including rejected defects by the team ● Initial assessment of the defects is based on its content and respective priority and severity settings ● Prioritizing the defect based on the inputs ● Assign the defect to correct release by product manager ● Re-directs the defect to the correct owner/team for further action
  • 95. 95 Guidelines that every tester should consider before selecting severity ● Understand the concept of priority and severity well ● Always assign the severity level based on the issue type as this will affect its priority ● Understand how a particular scenario or test case would affect the end-user ● Need to consider how much time it would take to fix the defect based on its complexity and time to verify defect
  • 96. 96 Exercise ● Assign the severity for the following issues. – 1. The website performance is too slow – 2. The login function of the website does not work properly – 3. The GUI of the website does not display correctly on mobile devices – 4. The website could not remember the user login session – 5. Some links don't work
  • 97. 97 Recommended Answers
  ● 1. The website performance is too slow - High – The performance bug can cause huge inconvenience to the user.
  ● 2. The login function of the website does not work properly - Critical/High – Login is one of the main functions of the banking website; if this feature does not work, it is a serious bug.
  ● 3. The GUI of the website does not display correctly on mobile devices - Medium – The defect affects users who use a smartphone to view the website.
  ● 4. The website could not remember the user login session - High – This is a serious issue, since the user will be able to log in but not be able to perform any further transactions.
  ● 5. Some links don't work - Low – This is an easy fix for the development team, and the user can still access the site without these links.
  • 98. 98 Tips for good Bug report ● Structure: test carefully (Use deliberate, careful approach to testing) ● Reproduce: test it again (rule of thumb 3 times) ● Generalize: test it elsewhere Does the same failure occur in other modules or locations? ● Compare: review results of similar tests (Same test run against earlier versions) ● Condense: trim unnecessary information (Eliminate extraneous words or steps) ● Disambiguate: use clear words (Goal: Clear, indisputable statements of fact) ● Neutralize: express problem impartially (Deliver bad news gently) ● Review: be sure (Peer review)
  • 101. 101 States of Defects ● New: A bug is reported and is yet to be assigned to a developer ● Open: The test lead approves the bug ● Assigned: Assigned to a developer and a fix is in progress ● Need more info: The developer needs information to reproduce or fix the bug ● Fixed: The bug is fixed and waiting for validation ● Closed: The bug is fixed, validated and closed ● Rejected: The bug is not genuine ● Deferred: The fix will be placed in a future build ● Duplicate: Two bugs report the same issue ● Invalid: Not a valid bug ● Reopened: The bug still exists even after being fixed by the developer
  • 102. 102 Test Cycle Closure ● Activities – Evaluate cycle completion criteria based on Time, Test coverage, Cost, Software, Critical Business Objectives , Quality – Prepare test metrics based on the above parameters. – Document the learning out of the project – Prepare Test closure report – Qualitative and quantitative reporting of quality of the work product to the customer. – Test result analysis to find out the defect distribution by type and severity. ● Deliverables – Test Closure report – Test metrics
  • 103. 103 QA/Testing Activities ● Understanding the requirements and functional specifications of the application. ● Identifying required test scenarios. ● Designing test cases to validate the application. ● Executing test cases to validate the application. ● Logging test results (how many test cases passed/failed). ● Defect reporting and tracking. ● Retesting fixed defects of the previous build. ● Performing various types of testing assigned by the Test Lead (Functionality, Usability, User Interface and Compatibility) etc. ● Reporting to the Test Lead about the status of assigned tasks. ● Participating in regular team meetings. ● Creating automation scripts for regression testing. ● Providing a recommendation on whether or not the application/system is ready for production.
  • 104. 104 Test Metrics Sr. No. Required Data 1 No. of Requirements 2 Avg. No. of Test Cases written per Requirement 3 Total No. of Test Cases written for all Requirements 4 Total No. of Test Cases Executed 5 No. of Test Cases Passed 6 No. of Test Cases Failed 7 No. of Test Cases Blocked 8 No. of Test Cases Not Executed 9 Total No. of Defects Identified 10 Critical Defects Count 11 High Defects Count 12 Medium Defects Count 13 Low Defects Count 14 Customer Defects 15 No. of Defects found in UAT
  • 105. 105 Continue ● % of Test cases Executed: – (No. of Test cases executed / Total No. of Test cases written) * 100 – Example: (50/100)*100 = 50% ● % of Test cases NOT executed: – (No. of Test cases NOT executed / Total No. of Test cases written) * 100 – (10/100)*100 = 10% ● % Test cases passed – (No. of Test cases passed / Total Test cases executed) * 100 – (40/50)*100 = 80% ● % Test cases failed – (No. of Test cases failed / Total Test cases executed) * 100 – (10/50)*100 = 20% ● % Test cases blocked – (No. of Test cases blocked / Total Test cases executed) * 100 – (5/50)*100 = 10%
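The worked percentages above can be reproduced directly in code, using the counts from the slide:

```python
# Counts taken from the slide's worked example.
written, executed = 100, 50
passed, failed, blocked, not_executed = 40, 10, 5, 10

def pct(part: int, whole: int) -> float:
    return part / whole * 100

assert pct(executed, written) == 50.0      # % of test cases executed
assert pct(not_executed, written) == 10.0  # % not executed
assert pct(passed, executed) == 80.0       # % passed (of those executed)
assert pct(failed, executed) == 20.0       # % failed (of those executed)
assert pct(blocked, executed) == 10.0      # % blocked (of those executed)
```

Note the different denominators: execution percentages divide by test cases *written*, while pass/fail/blocked percentages divide by test cases *executed*.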
  • 106. 106 Continue ● Defect Density: Number of defects identified per requirement – No. of defects found / Size (No. of requirements) – 25/5 = 5 defects per requirement ● Defect Removal Efficiency (DRE): – (A / (A+B)) * 100 – (Fixed Defects / (Fixed Defects + Missed Defects)) * 100 ● A - Defects identified during testing / Fixed Defects (Example - 25) ● B - Defects identified by the customer / Missed Defects (Example - 5) – (25/(25+5)) ● (25/30) * 100 = 83.33% ● Defect Leakage: – (No. of defects found in UAT / No. of defects found in Testing) * 100 – (5/25) * 100 = 20%
  • 107. 107 Continue ● Defect Rejection Ratio: – (No. of defects rejected / Total No. of defects raised) * 100 – (4/29) * 100 = 13.79% ● Defect Age: – Fixed date - Reported date ● Customer satisfaction – No. of complaints per period of time
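These defect metrics can also be reproduced in code; the counts come from the worked examples above, while the dates in the defect-age calculation are illustrative assumptions:

```python
from datetime import date

# Defect Removal Efficiency: fixed / (fixed + missed) * 100.
fixed, missed = 25, 5
dre = fixed / (fixed + missed) * 100
assert round(dre, 2) == 83.33

# Defect Leakage: defects found in UAT / defects found in testing * 100.
uat_defects, test_defects = 5, 25
assert uat_defects / test_defects * 100 == 20.0

# Defect Rejection Ratio: rejected / raised * 100.
rejected, raised = 4, 29
assert round(rejected / raised * 100, 2) == 13.79

# Defect Age: fixed date minus reported date (dates are made up).
age = date(2024, 3, 10) - date(2024, 3, 1)
assert age.days == 9
```

Tracking these numbers per release makes it easy to see whether testing is catching defects before they leak to the customer.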
  • 108. 108 Test Efficiency Vs Test Effectiveness ● Effectiveness – How well the user achieves the goals they set out to achieve using the system (process). ● Efficiency – The resources consumed in order to achieve their goals.
  • 109. 109 What is Agile? ● AGILE is a methodology that promotes continuous iteration of development and testing throughout the software development life cycle of the project. ● Agile software development emphasizes four core values. – Individuals and interactions over processes and tools – Working software over comprehensive documentation – Customer collaboration over contract negotiation – Responding to change over following a plan
  • 110. 110 What are Agile Promises?