17. Test Execution Flow
Test Case Execution → Report & Re-test Defects → Regression Testing (Manual & Automated) → Smoke Testing
18. Test Types
Unit Testing
  Who: Developers
  Where: Local PC, QAAlpha
  Why: To make sure individual components are working
Feature Testing
  Who: Testers
  Where: QABeta
  Why: To make sure new features are working as intended
Regression Testing
  Who: Testers
  Where: QABeta, Pre-production
  Why: To make sure existing features are still working
Smoke Testing
  Who: Testers
  Where: Cold Server, Production
  Why: To make sure core functionalities are working
26. Test Automation - Framework
Custom-built framework on top of:
  Telerik Testing Framework (C#.NET) for Silverlight automation
  Selenium (Java) for HTML automation
27. Custom Automation Framework - Reasons
More control
Freedom over the end-to-end process
Custom logger
Less maintenance effort
Smaller learning curve
28. Thinking About Layers
Tests
Custom Framework
Automation Framework (Selenium / Telerik Testing Framework)
Browser (Web App)
Each layer only interacts with the layer immediately below it.
29. Thinking About Layers (Cont.)
Stack: Tests > Framework > Selenium > Browser (Web App)
The framework itself is divided into layers of functionality: Page Tests, Data Layer, Logger, Page Repositories, Entities, UI Object Repositories, Testlink Updater, Helper, Factory.
30. The Page Objects Model
Test → LoginPage::Code → LoginPage::Browser
LoginPage::Code exposes methods such as login(), ResetPassword(), ToggleRememberMe().
Selenium: org.openqa.selenium.support.PageFactory
.NET: custom design
Pages are a good way to model the functionality of an application.
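The Page Object idea on this slide can be sketched in a few lines. The real framework is C#/Java; this is a minimal Python illustration, and the locator names, credentials, and fake driver are made up for the example. The point is that the test talks only to intent-level page methods, and only the page object knows the locators.

```python
class FakeDriver:
    """Stand-in for a real WebDriver so the sketch runs without a browser."""
    def __init__(self):
        self.fields = {}
        self.logged_in = False

    def type_into(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        # Hypothetical login logic: succeeds only for the expected credentials.
        if locator == "login_button":
            self.logged_in = (
                self.fields.get("username_field") == "tester"
                and self.fields.get("password_field") == "secret"
            )


class LoginPage:
    """Page object: models the login page's behaviour, hides its locators."""
    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        self.driver.type_into("username_field", username)
        self.driver.type_into("password_field", password)
        self.driver.click("login_button")
        return self.driver.logged_in


# A test interacts with the page, never with raw locators.
page = LoginPage(FakeDriver())
assert page.login("tester", "secret") is True
```

If a locator changes, only the page object is edited; the tests stay untouched, which is the main maintenance benefit of the model.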
31. Execution Flow
Begin test → read CSV data → loop for configured countries:
  For each row, check IsRun. If IsRun = true: run test initialization, then test execution (Page > UI repo); the HTML logger writes a detailed log with screenshot, and Testlink is updated with the pass/fail result.
  If IsRun = false (No): proceed to the next row.
  After the last row (Yes), the HTML logger writes a summary log and execution ends.
Green icon: Telerik Testing Framework (Silverlight automation).
Selenium is used for automating the HTML site.
Jenkins helps us automate our build process.
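The CSV-driven loop on the Execution Flow slide can be sketched as follows. This is an illustrative Python skeleton, not the real C#/Java framework code; the column names, `run_test` stub, and sample rows are assumptions, and the real detailed/summary HTML logging and Testlink update are reduced to comments.

```python
import csv
import io

# Sample data-driven input: each row carries an IsRun flag, as in the flow above.
CSV_DATA = """country,IsRun,test_name
MY,true,Login
SG,false,Login
MY,true,Checkout
"""


def run_test(row):
    # Placeholder for the real test execution (Page > UI repo).
    return "pass"


def execute(csv_text):
    results = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["IsRun"].lower() != "true":
            continue  # IsRun = false: proceed to the next row
        outcome = run_test(row)  # test initialization + execution
        results.append((row["country"], row["test_name"], outcome))
        # Here the real framework writes a detailed HTML log with a
        # screenshot, and updates Testlink with the pass/fail result.
    # After the last row: write the summary log.
    summary = {"total": len(results),
               "passed": sum(1 for r in results if r[2] == "pass")}
    return results, summary
```

Running `execute(CSV_DATA)` skips the `IsRun = false` row and reports two executed cases, mirroring the loop-until-last-row shape of the diagram.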
Lack of testing framework, processes and templates
Different approaches, templates and terminology
No agreed framework for how to approach, estimate, deliver and monitor testing
Mention tools that have been implemented, such as TestLink, Bugzilla, etc.
Methodology for how to deliver testing, end-to-end, with accompanying tools and templates
Process tailored to fit the organization needs. Logically sequential but may overlap or take place concurrently
1: Test Planning. What will be tested and why? When? Who?
2: Test Design. How and when will it be tested? What?
3: Test Execution. Who will be testing? What was discovered? In action.
4: Test Closure. What is the exit criteria? Is it met?
Defines “what” should be tested.
Identify Test activities:
Environments (e.g. hardware)
Is there a need for data migration?
Test data preparation?
Identify test scope (in and out) by identifying the risks (how is discussed on the next slide)
Resource & Scheduling
- Assign resources to the activities
Exit Criteria:
Example: All high priority and blocking defects must be fixed
FACTORS to Consider when Identifying the Test Scope:
Frequency of Use
Feature is used by the business on a daily basis, or even more frequently (e.g. Check-in)
Failure Probability
Based on past experience, which features have a higher chance of failure?
Defect Clustering
Defects tend to cluster in a small number of modules, so past defect hotspots deserve extra testing.
Business Impact
The feature may not be used frequently, but its failure may cause the business to lose money.
1. Track progress of user stories across all teams
2. Easily attach prototypes, requirements, or any other relevant documentation
3. Convert tasks into cards
4. Monitor them daily
5. Monitor issues
Defines “how” something is to be tested.
Requirements Review: Static Testing
Knowledge Sharing – share stories & how the tester would want to test. Get team feedback on the scenarios. Another form of Static Testing.
Test Cases & Scripts
Review: Internal and External
Regression test plan: use the risk-based analysis technique described on the next slide
Test Data: if needed (for both Feature and Regression Testing)
For any software product, an infinite number of test scenarios may exist. Sadly, we don't have forever to test.
So when preparing the regression test plan, we base it on the regression test scope identified earlier.
We consider the risks and are smart and selective about what to test.
When writing test cases, we tag them as High, Medium, or Low for easy identification.
This helps us test the right requirements in the limited time available.
High: must test
Medium: probably should test
Low: test only if time permits
It is not possible to test everything.
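The High/Medium/Low tagging above can drive a simple selection rule: fill the available time with the highest-priority cases first. This is an illustrative sketch; the case names and durations are made up, not real test cases from the plan.

```python
# Hypothetical tagged test cases: (name, priority, estimated minutes).
cases = [
    ("Check-in flow", "High", 30),
    ("Payment", "High", 45),
    ("Profile photo upload", "Low", 20),
    ("Report export", "Medium", 25),
]


def select(cases, budget_minutes):
    """Pick cases in priority order until the time budget is spent."""
    order = {"High": 0, "Medium": 1, "Low": 2}
    chosen, used = [], 0
    for name, prio, mins in sorted(cases, key=lambda c: order[c[1]]):
        if used + mins <= budget_minutes:
            chosen.append(name)
            used += mins
    return chosen


# With 100 minutes, the "must test" cases go in first, then what still fits.
print(select(cases, 100))
```

A greedy pass like this is deliberately simple; the point is that the priority tags turn "not possible to test everything" into an explicit, defensible cut line.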
In-house migration tool (internal tool)
Generatedata.com (free)
Manual approach
Note: huge data sets
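For huge data sets, generating test data in code is often quicker than a manual approach. This is a minimal stand-in, similar in spirit to generatedata.com; the record fields (`name`, `email`, `member_id`) are invented for the example, not the real schema.

```python
import random
import string

random.seed(7)  # fixed seed so the generated data is reproducible


def fake_member():
    """Produce one fake member record for test data preparation."""
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return {
        "name": name.title(),
        "email": f"{name}@example.com",
        "member_id": random.randint(100000, 999999),
    }


# Large data sets are cheap to generate this way.
rows = [fake_member() for _ in range(1000)]
```

A fixed seed keeps runs reproducible, which matters when a defect found against generated data needs to be re-tested on the same rows.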
The main activities for test execution are:
1. Execute the test cases/test scripts.
2. Report/Re-test defects.
3. Regression testing
4. Smoke testing
- Execute the test cases or test scripts that were developed during the design phase.
- While testing, if any defects are found, we report them. Once the developer has fixed a defect, we re-test it.
- We also do regression testing, which tests the existing features. The purpose is to ensure the existing features won't be affected by the new changes.
- Smoke testing tests the basic functions of the product. The purpose is to ensure the core functions of the product are still working. In our company, we perform smoke testing on release day, after the code is deployed to production.
Explain hardware testing, data migration testing, and exploratory testing.
Mention that regression testing is both manual and automated.
To ensure quality, we have several levels of testing and different environments for each one of them.
Here are a few examples of test types that we practice here:
Unit testing is the testing of an individual module in the product, and is done by the developers.
Why do we perform unit testing? Unit tests find problems early in the development stage, helping developers put out good-quality code before handing over to us.
Feature testing is usually done by testers to ensure a feature is developed as per the requirements. Apart from that, if a new feature requires database changes, we also perform data migration testing to ensure the data is migrated properly.
---How do we perform data migration testing?
First, we need to understand what data is being migrated, where it is stored, and the new destination for the data. Before the migration, we extract and duplicate the data. The developers then implement the data migration script and inform the testers when it is ready for testing. The testers test and validate the migrated data to ensure it is accurate.---
Regression testing, as I explained earlier, tests the existing features.
During regression testing, we perform various types of testing for our product, including application testing and hardware testing of the gate controller, POS system, and signature tablet.
For smoke testing, a set of test cases in the smoke test plan is executed after each deployment to production. Smoke testing is like a health check for an application, ensuring the core features of the application are still working after the deployment.
We use TestLink execution metrics to monitor test progress and analyse the results, so we know where we are.
TestLink is the test management tool we use in iconnect360 for tracking test cases and their execution. We monitor test execution progress by the number of test cases already marked as passed or failed and how many have not yet been executed. Monitoring test progress gives feedback on how the testing is going, provides the project team with visibility of the test results, and helps in estimating future test efforts.
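The progress numbers pulled from TestLink reduce to a small calculation. The counts below are illustrative, not real project figures; the formulas are just the standard executed-percentage and pass-rate arithmetic.

```python
def progress(passed, failed, blocked, total):
    """Summarise test execution progress from TestLink-style counts."""
    executed = passed + failed + blocked
    return {
        # How much of the test plan has been run so far.
        "executed_pct": round(100 * executed / total, 1),
        # Of what was run, how much passed.
        "pass_rate_pct": round(100 * passed / executed, 1) if executed else 0.0,
        # Cases still waiting to be executed.
        "remaining": total - executed,
    }


# Example snapshot partway through a cycle (made-up numbers).
print(progress(passed=120, failed=15, blocked=5, total=200))
```

Tracking these two percentages over time is what lets the team see both velocity (executed %) and product health (pass rate) at a glance.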
In test closure phase:
Before the release, we hold a release readiness meeting to make sure all stakeholders are updated with the latest status and fully aware of all pending releases and potential risks. The test manager prepares the release readiness document to give the stakeholders the insight to decide whether the product is ready for release.
If the decision is to release the product, on release day we execute smoke testing, based on the smoke test plan prepared earlier, on the production, sales, and demo environments.
Post Release, we have a release retrospective to discuss what went well, what could be improved and how to incorporate successes and improvements into future releases.
Bugzilla metrics will be used for analysis and for estimating future test efforts.
That's all for my part; now I will pass it to my teammate Rizwan to continue with the automation framework.
Apart from using Bugzilla's built-in reporting, we have customized the reporting to support flexible metric definitions on the defects stored in Bugzilla, such as the number of defects found, the number fixed, the total number open, and the number of defects by severity and priority. This allows us to quickly identify high-risk modules where many defects were raised, or how many outstanding defects we have in a particular release, giving us a view from which to decide whether a feature is ready to release.
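The defect metrics described above amount to counting and grouping records exported from Bugzilla. This sketch uses invented defect records and field names; real Bugzilla fields and statuses would differ, but the aggregation is the same shape.

```python
from collections import Counter

# Illustrative defect records (fields loosely mirror a Bugzilla export).
defects = [
    {"module": "Check-in", "status": "OPEN",  "severity": "major"},
    {"module": "Check-in", "status": "FIXED", "severity": "critical"},
    {"module": "Billing",  "status": "OPEN",  "severity": "minor"},
    {"module": "Check-in", "status": "OPEN",  "severity": "major"},
]

# Number of open defects overall.
open_total = sum(1 for d in defects if d["status"] == "OPEN")

# Open defects grouped by module: the module with the most is the risk hotspot.
by_module = Counter(d["module"] for d in defects if d["status"] == "OPEN")
high_risk = by_module.most_common(1)[0][0]

# All defects grouped by severity, for the severity/priority breakdown.
by_severity = Counter(d["severity"] for d in defects)
```

Feeding these counts into the release readiness discussion is what turns raw defect data into a ship/no-ship signal per module.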
Identify the manual tests to automate.
Develop automation scripts and store them in SVN.
Create test suites in TestLink, and use Jenkins to trigger the test run.
Jenkins grabs the latest scripts from SVN.
Automated execution starts.
The test statuses are updated in TestLink.
We can review the test results and logs through TestLink.
Log defects through Bugzilla if needed.