CALIFORNIA INSTITUTE OF TECHNOLOGY



INFORMATION MANAGEMENT SYSTEMS &
                        SERVICES
              SOFTWARE TESTING GUIDE

Primary Author:                David Gardner
Creation Date:                 3/16/05 1:10:00 PM
Last Updated:                  3/2/06 4:40:00 PM
File Name:                     IMSS Software Testing Guide.doc
Approvals:
 David Gardner:
 Testing Manager,
 Information Management Systems & Services
 TBD:
 Project Manager,
 Information Management Systems & Services

 TBD:
 Director,
 Information Management Systems & Services
Information Management Systems & Services
                                                         Software Development Testing Guide



                                                      Table of Contents
INTRODUCTION..............................................................................................................................3
     PHILOSOPHY OF TESTING ............................................................................ 4
     OVERALL GOALS OF THE TESTING EFFORT .................................................... 4
     TESTING ASSUMPTIONS .............................................................................. 6
     TESTING ROLES ........................................................................................ 7
TESTING STRATEGY (METHODOLOGY) ....................................................................................9
     INTRODUCTION ......................................................................................... 9
     STAGES OF THE TESTING LIFE CYCLE ........................................................... 9
     TEST PLAN ............................................................................................. 11
     SAMPLE TEST STRATEGY (ORACLE UPGRADE) ............................................. 11
     SAMPLE TEST STRATEGY (DATA WAREHOUSE) ............................................ 14
TESTING PROCESS AND PROCEDURES .................................................................................18
     INTRODUCTION ....................................................................................... 18
     CREATE TEMPLATES ................................................................................ 18
     CREATE SCENARIOS (TEST CASES) ............................................................. 18
     CREATE TEST DATA ................................................................................ 18
     CREATE SCRIPTS ..................................................................................... 18
     EXECUTE SCRIPTS ................................................................................... 19
     DOCUMENT RESULTS/CAPTURE BUGS ........................................................ 19
     ENTERING AND TRACKING BUGS IN REMEDY ............................................... 20
     LOGGING TARS (ORACLE ONLY) ............................................................... 22
     CODE MIGRATION AND PATCHING BUGS ..................................................... 22
GENERAL INFORMATION ...........................................................................................................25
     ENVIRONMENT ........................................................................................ 25
     COMMUNICATION .................................................................................... 25
     MEETINGS .............................................................................................. 25
     TESTING ROOM/TEST BED ....................................................................... 25
     STATUS REPORTING ................................................................................. 26
     STANDARD HEADER/FOOTER .................................................................... 26
     LOCATIONS OF USEFUL RELATED DOCUMENTATION ....................................... 26
APPENDIX .....................................................................................................................................28
     FILE NAMING STANDARDS AND STORAGE LOCATIONS (EXAMPLE) ................... 28
     IMSS SYSTEMS AND MODULES (ACRONYMS) .............................................. 29

Y:TestingMgmt Support Info Testing GuidesIMSS Software Testing Guide_Backup.doc
                                                          Saved Date: 3/2/06 4:40 PM

Page 2 of 32
                                                           INTRODUCTION
The purpose of this document is to provide guidance to software testers involved in IMSS software
development projects, whether new implementations or upgrades. Because this document is intended to
be general in nature, project-specific information will be provided elsewhere, although some
project-specific examples may be used. The goal of this document is to give those involved an
understanding and a set of tools to make the software testing effort more effective and consistent. This
document does not necessarily cover all aspects of testing but will address the following:

     - Testing Philosophy – reasons for testing, goals, and assumptions
     - Testing Strategy, Risks, and Planning
     - Testing Roles and Responsibilities – roles are defined to help ensure quality and to clarify who
       is responsible for the different areas of the testing effort
     - Testing Processes, Procedures, and Documentation Requirements
     - Testing Environment and Tools – the environment considers the software and hardware required
       to drive the system







PHILOSOPHY OF TESTING
Why is testing necessary? Testing minimizes bugs before production. This applies to both purchased
software and software developed in-house. Since purchased software is not always delivered to meet
customer requirements, customizations, interfaces, and other software modifications must be tested.
Testing is needed to find faults (bugs) in the software, to ensure user requirements are met, and to
deliver a quality product to the customer.
The success of the testing effort, and ultimately of the software development project, depends on
several factors. Strategy, goals, assumptions, and roles are all important in this effort and are described
below.


OVERALL GOALS OF THE TESTING EFFORT
1. Minimize bugs and work-arounds: No system is bug free, but we can be aware of the majority of bugs
   and know how to either avoid or work around them. As the deadline for implementation approaches it
   will become more important to have system stability than to eliminate every non-critical bug.
     One of the goals of a software upgrade rather than a new implementation is to eliminate several
     persistent problems or bugs from the current software version. As a result, we need to specifically
     ensure bug fixes are applied and working as intended.
2. Ensure business processes are tested and the application is usable. The functional users of the system
   must be able to conduct specific transactions to successfully complete their business processes. It is
   the responsibility of the IMSS Development team to ensure the system meets those needs and that no
   required functionality has broken in the process. In addition, it is the responsibility of the functional
   users to identify their business processes to test.
     While a development effort has far reaching impact, not all administrative processes are necessarily
     affected. Testing should focus on those with known impact. If the users of the system can exercise all
     of their needed processes during testing, there should be a high confidence in the ability of the system
     to operate in a production environment.
3. Ensure data are valid. Validation of data will be a major goal of the testing effort. Accuracy of data
   passing between and entering systems must be ensured.
4. Aim for no user surprises at go-live. Testing and training are both required to meet this goal. Testing
   is a process that can identify potential surprises and result in their elimination. Training aims to tell
   users of any surprises not eliminated during the testing processes. Training will be the responsibility
   of the Training Manager and is not covered in this document.
5. Confirm that system performance is acceptable. One of the goals of the testing effort will be to ensure
   query results are returned within an acceptable time frame, especially during peak periods of use.
6. Ensure data security is satisfied. Data should be accessible only to those with a need to know and
   must meet the requirements of the Institute and the IMSS Security group.




7. Validate what worked by testing and documenting. As new problems are found in the system it is
   extremely important to determine whether they are new problems or not. This determination helps
   narrow down what has changed since the test was previously executed. Documenting both the
   successes and failures of testing eases the process of determining what has changed. Documentation
   requirements are discussed in a later section.
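Goal 5 above calls for acceptable query response times. As an illustration only, the following sketch times a query against a small in-memory SQLite table; the table name, query, and two-second threshold are hypothetical stand-ins for a real application database and the response-time targets agreed for a project.

```python
import sqlite3
import time

def time_query(conn, sql, threshold_seconds):
    """Execute a query, measure wall-clock time, and flag whether it
    meets the agreed response-time threshold."""
    start = time.perf_counter()
    rows = conn.execute(sql).fetchall()
    elapsed = time.perf_counter() - start
    return {"rows": len(rows), "seconds": elapsed,
            "acceptable": elapsed <= threshold_seconds}

if __name__ == "__main__":
    # Hypothetical stand-in for an application table under test.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (n INTEGER)")
    conn.executemany("INSERT INTO invoices VALUES (?)",
                     [(i,) for i in range(1000)])
    print(time_query(conn, "SELECT * FROM invoices WHERE n % 7 = 0",
                     threshold_seconds=2.0))
```

In practice the same check would be repeated during peak-load periods, since a query that is acceptable on an idle instance may not be acceptable under load.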








TESTING ASSUMPTIONS
There are assumptions that underlie our overall approach to testing:
1. User Requirements are known and design is stable. Testing should be performed against the user
   requirements. Test cases are identified from those user requirements. The test case scenarios shall be
   written and scripted before testing begins. Changes to the software design once testing has begun can
   delay the project and ultimately place it at risk.
2. Most testing will be scripted. This helps to avoid difficulties in proving what worked and what did
   not, and also provides a record of what was tested. While some ad-hoc testing is anticipated, it
   should be done only in specific coordination with the Testing Manager.
3. Testing could occur in a variety of settings. Testing may occur in a controlled test environment
   (preferable) or at a user's personal workstation. Because coordination and the timely execution of test
   scenarios are more difficult outside a controlled test area, it becomes extremely important to identify
   test cases in advance, write the scenarios, enter them onto the test script, execute, and communicate
   hand-off to others affected in a timely manner.
4. Tests will be executed using a variety of client computer configurations. While no specific effort will
   be made to try every scenario or function on every combination of browser and operating system and
   platform, using a variety of machines and reflecting typical configurations should help to identify any
   problems that are related to client computer configuration. The recommended minimum system
   requirements should be tested, as well as any other advertised configuration. Contact the Testing
   Manager for test bed access and availability.
5. Development staff shall be available to assist testers with data validation. Because accurate data is a
   key success factor, validating test scenarios is extremely important. Some validation may require
   back-end querying. Should this be required, IMSS developers will run the queries and be available to
   assist with validating scenario results.
6. Functional testers will be available. Experienced functional system users will be available for testing.
   Preferably the same set of users will be available throughout the testing effort.
7. Whenever possible, automated regression testing will be used. Tools exist that allow automation of
   certain portions of the test execution process. If possible, such tools will be used during the testing
   effort.
8. The testing instance shall remain controlled. There will be a high degree of control in changes to the
   test instance. Patches and migrations shall be coordinated with the Testing Manager. Design changes
   shall be communicated before they are applied.
9. There will be a high degree of control in patches and migrations. A good testing process requires an
   understanding of the changes in the testing environment over time. Two key items for our
   environment are the application of vendor patches and custom development migrations. We will
   coordinate all patch applications and migrations in order to understand their potential impact.
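Assumption 7 above anticipates automating portions of test execution. A dedicated tool would normally be used, but the core idea can be sketched in a few lines: run a set of named checks and compare each actual result to its expected value. All table names, checks, and expected values below are hypothetical illustrations.

```python
import sqlite3

def run_regression_checks(conn, checks):
    """Run each named SQL check and compare its single-value result to
    the expected value; return PASS/FAIL per check."""
    results = {}
    for name, (sql, expected) in checks.items():
        actual = conn.execute(sql).fetchone()[0]
        results[name] = "PASS" if actual == expected else "FAIL"
    return results

if __name__ == "__main__":
    # Hypothetical test instance with one small table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO invoices VALUES (?, ?)",
                     [(1, 100.0), (2, 50.0)])
    checks = {
        "invoice_count": ("SELECT COUNT(*) FROM invoices", 2),
        "invoice_total": ("SELECT SUM(amount) FROM invoices", 150.0),
    }
    for name, outcome in run_regression_checks(conn, checks).items():
        print(f"{name}: {outcome}")
```

Re-running the same checks after each patch or migration gives a quick regression signal before manual retesting begins.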






TESTING ROLES
It is extremely important for the success of this project to have participation from several groups of
people. A high level of communication between these groups will increase the success of the test effort.
Below is a list of roles and their descriptions:


Testing Manager/Coordinator –
     - Directs the testing effort
     - Coordinates the development and execution of the testing methodology
     - Coordinates the development of a testing guide document
     - Determines testing tasks, coordinates resources, and establishes the testing schedule


Internal Testers –
     - Typically IMSS systems analysts and high-level functional testers
     - Support testing and issue resolution
     - Identify and document test cases, testing scenarios or business processes, and prepare test scripts
     - Execute and document test results


Functional Testers –
     - Super users from organizations outside of IMSS (typically orgs within VP Business & Finance and
       Student Affairs)
     - Work closely with IMSS systems analysts
     - Support testing and issue resolution
     - Identify and document test cases, testing scenarios or business processes, and prepare test scripts
     - Facilitate acceptance testing
     - Execute and document test results


End User Testers –
     - Typically led by the functional testers
     - Support testing and issue resolution



     - May identify and document testing scenarios or business processes, and prepare test scripts
     - Facilitate acceptance testing
     - Document test results


IMSS Developers –
     - Support testing, testers, and issue resolution
     - Document test results








                                  TESTING STRATEGY (METHODOLOGY)

INTRODUCTION
A test strategy describes the overall approach, objectives, and direction of the testing effort. The purpose
of a testing strategy or methodology is to limit risk and ultimately deliver the best possible software to
the customer. The testing strategy chosen for a particular application will vary depending on the software,
its amount of use, and its particular goals. For instance, the testing strategy for a transactional system like
Oracle will be very different from the strategy developed for testing an analytical tool like the Data
Warehouse. Similarly, a campus-wide purchasing system versus a limited-user-base tool for Housing will
entail much different test strategies. Because some of these examples have higher exposure, they also
carry higher risk.


STAGES OF THE TESTING LIFE CYCLE
IMSS has employed several variations of one or more testing methodologies. Typical stages have
included preparation, conference room pilot (CRP), unit, integration, system, and user acceptance testing.
These stages are also referred to as phases or levels. The project manager should review the phases below
and consider similar terminology and sequence. If it makes sense, some phases and tasks may be deleted;
in other cases, tasks and phases may need to be added. Some tasks may run in parallel, and some stages
might be combined. In most cases, each phase must be completed before the next can begin. The duration
and execution of tasks will vary depending on the schedule and the risk the project manager is willing to
absorb.


Test Preparation Phase (before testing begins)
   - Task – Develop test strategy
   - Task – Develop high level test plan
   - Task – Identify test cases
   - Task – Develop scenarios, test scripts
   - Task – Identify and share test data
   - Task – Determine processes, procedures, standards, documentation requirements
   - Task – Identify and create test environment
   - Task – Identify testing team(s)
   - Task – Train testers






Unit Test Phase - The purpose of this test phase is to verify and validate that independent modules
function properly. This is completed by the developers and must be finished before future phases can
begin. The Testing Manager is not normally involved in this phase.


CRP Phase (Conference Room Pilot - Optional). The purpose of this phase is to verify proof-of-concept.
A CRP is generally needed for new, large, and unproven projects.
   - Assumption – Test instance is ready
   - Assumption – Metadata is inserted into test instance
   - Assumption – Unit testing and mock up is complete
   - Assumption – Test scenarios have been identified (scripted or ad hoc)
   - Task – Identify CRP participants
   - Task – Determine and finalize CRP logistics
   - Task – Set expectations
   - Task – Begin CRP
   - Task – Gather and document feedback
   - Task – End CRP
   - Task – Obtain phase completion approval/sign-off
   - Task – Gather/share/integrate lessons learned; incorporate necessary changes
   - Task – Tune/revise/re-approve test plan

Integration Test Phase - The purpose of this test phase is to verify and validate that all modules are
interfaced and work together.
    - Assumption – Requirements are frozen and design is determined
    - Assumption – Application is ready for integration testing
    - Assumption – Metadata has been populated into test instance tables
    - Assumption – Unit testing is complete
    - Task – Test system and document using test scripts
    - Task – Test interfaces
    - Task – Identify and report bugs
    - Task – Retest fixed bugs/regression test
    - Task – Test security
    - Task – Test browsers/platforms/operating systems
    - Task – Obtain phase completion approval/sign-off
    - Task – Gather/share/integrate lessons learned
    - Task – Tune/revise/re-approve test plan


System Test Phase - The purpose of this test phase is to verify and validate that the system works as if it
were production.
    - Assumption – Metadata has been populated into test instance




     - Assumption – Application is ready and has successfully completed integration testing
     - Task – Test system and document using test scripts
     - Task – Identify and report bugs
     - Task – Retest fixed bugs/regression test
     - Task – Test business processes and reports
     - Task – Stress test
     - Task – Performance test (e.g. screen refreshes)
     - Task – Test security: logins, responsibilities, hacking
     - Task – Obtain phase completion approval/sign-off
     - Task – Gather/share/integrate lessons learned
     - Task – Tune/revise/re-approve test plan

User Acceptance Phase - The purpose of this test phase is to verify and validate that the system works by
and for the end users as if it were production.
     - Assumption – Show-stoppers and most high-level bugs have been fixed, or work-arounds have
       been identified and approved
     - Assumption – All other phases have been signed off
     - Assumption – Application is ready for user acceptance testing
     - Assumption – Metadata has been populated into test instance tables
     - Task – Train end user testers
     - Task – Populate and approve test scripts
     - Task – Test system and document using test scripts
     - Task – Obtain phase completion approval/sign-off
     - Task – Gather/share/integrate lessons learned



TEST PLAN
A test plan should be created for all significant projects. The test plan documents the tasks that will be
performed, the sequence of testing, the schedule, and who is responsible for completing each task. It is
also a reflection of the strategy chosen for the testing effort. This plan should be linked to the overall
project plan. MS Project is often used to create a test plan.


SAMPLE TEST STRATEGY (ORACLE UPGRADE)
In the case of an Oracle upgrade we may require several cycles of testing. Each cycle would represent an
actual upgrade. Several phases of testing may occur during each cycle. Typically, each cycle builds on the
previous, and yet is intended to be shorter, reflecting the assumption that the system will be more stable as
we progress to the next cycle, users will be more familiar with the system, and generally less rework will
be required.





Dry Run
Before formal testing begins, a dry run upgrade may be performed for the development and infrastructure
teams. During this time several tasks may be performed, including validation of system usability, testing
of third-party systems, testing of database compatibilities (links) for Exeter and FAMIS, code
remediation, and identification of new functionality. Button and form testing can be performed during the
dry run cycle in order to minimize impact to the integration testing phase of cycle one. Testing and fixing
issues early should help expedite the formal testing process.


     - Goals
          o Confirm the basic application works
          o Validate data integrity
          o Ensure third-party systems are compatible
          o Ensure database compatibilities (links) for Exeter and FAMIS
          o Remediate and migrate code, including interfaces
          o Identify new functionality
          o Test buttons and forms
          o Report bugs encountered


Cycle 1
The first major testing cycle will include two phases after the application has been validated: integration
and system testing. The goal of integration testing is to test existing functionality, tweaks and
customizations (interfaces), address open help desk tickets that the upgrade may resolve, and test custom
and standard reports as well as new functionality.
The second phase will be system testing. The purpose of this phase is to validate and verify that the
system works as if it were production. This phase will also focus on finding and reducing bugs in the
system found during earlier phases and before go-live. It is expected that only IMSS and functional
analysts will participate in cycle one testing. Remember, these tasks can change or move to other phases
at any time.


     - Application Validation Goals
          o Confirm the basic application works
          o Abbreviated Button/Form testing





     - Integration Testing Goals
          o Confirm the basic application works
          o Test Existing Functionality
          o Test Tweaks/Customizations
          o Test Open HD tickets pending upgrade
          o Test custom/standard reports
          o Test new functionality
          o Test interfaces
          o Report bugs encountered


     - System Testing Goals
          o Test Business Processes
          o Test Web Apps
          o Test Security
          o Test CIT Developed Apps
          o Test Internal Charges (Auto Feeds - I/F's)
          o Test Third Party Apps



Cycle 2
Cycle two will repeat integration and system test phases, but at a much quicker pace and with additional
goals and testers. Moreover, an abbreviated application validation should be performed to ensure the
application works before users begin testing. It is expected that most bugs will have been resolved in
cycle one.


     - Application Validation Goals
          o Confirm the basic application works
          o Abbreviated Button/Form testing


     - Integration Testing Goals
          o Test Existing Functionality (including new functionality)




          o Test Tweaks/Customizations
          o Regression test fixed bugs
          o Test interfaces
          o Test Business Processes
          o Test Responsibilities
          o Test custom/standard reports
          o Report bugs encountered


     - System Testing Goals
          o Test Web Apps
          o Test Security
          o Test CIT Developed Apps
          o Test Internal Charges
          o Test Third Party Apps
          o Test Appworx




SAMPLE TEST STRATEGY (DATA WAREHOUSE)

Introduction
Due to the nature of a Data Warehouse, testing the validity of each data mart will require an approach
different from that of testing a transactional system (e.g. Oracle Applications). Because the data
warehouse and the source system are designed to perform different functions, the table structures of the
two systems differ greatly. The main difficulty in testing is validating query results between the systems.
Not all of the data in the source system is loaded into the warehouse, and the data that is loaded is often
transformed. Therefore, comparisons between the systems are difficult, and troubleshooting becomes
extremely complex when trying to identify points of failure. The testing method introduced in this plan is
designed to streamline the process, making it easier to pinpoint problems and cut down on confusion for
the testers while expediting the testing phases.


Testing Phases




Data mart testing should be divided into three distinct phases: Instance Validation, Data
Validation, and Application Usability. This testing is designed to test the completeness,
correctness, and performance of each data mart.


Instance Validation (a combination of unit and system test phases)


     o     Initial and incremental load: The designated development team will compare the row counts
           between the staging tables and the DW tables to make sure all pushed records are pulled over
           to the DW tables. Through incremental loads, the DW tables should capture all changes that
           occur in the source tables. A script should be used to capture this information.
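The row-count comparison described above can be scripted. The sketch below is a minimal illustration against an in-memory SQLite database; the staging and DW table names are hypothetical, and a production script would run the same counts against the actual instances.

```python
import sqlite3

def compare_row_counts(conn, table_pairs):
    """For each (staging, dw) table pair, compare row counts and report
    whether the DW load captured every staging row."""
    report = []
    for staging, dw in table_pairs:
        staging_count = conn.execute(f"SELECT COUNT(*) FROM {staging}").fetchone()[0]
        dw_count = conn.execute(f"SELECT COUNT(*) FROM {dw}").fetchone()[0]
        report.append({"staging": staging, "dw": dw,
                       "staging_count": staging_count, "dw_count": dw_count,
                       "match": staging_count == dw_count})
    return report

if __name__ == "__main__":
    # Hypothetical staging and DW tables; one DW row is missing.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_invoices (id INTEGER)")
    conn.execute("CREATE TABLE dw_invoices (id INTEGER)")
    conn.executemany("INSERT INTO stg_invoices VALUES (?)", [(1,), (2,), (3,)])
    conn.executemany("INSERT INTO dw_invoices VALUES (?)", [(1,), (2,)])
    for row in compare_row_counts(conn, [("stg_invoices", "dw_invoices")]):
        print(row)
```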


     o     Push logic: All conditions, translations, formulas, and calculations that are part of the push load
           are tested to make sure that they provide useful and valid data to the DW. These scenarios are
           pulled directly from the code and business rules noted in the design document.
           For example:
                 a) Can a PO agent be inactive in the HR table but still active in the PO agent table?
                 b) Are there any payments being pushed over to the DW with no invoice associated
                    with them?
                 c) Does the PO distribution amount calculate correctly?
                 d) If the invoice status is CANCELLED, does the sum of all payment amounts equal 0?
                 e) Are there any translation or mapping errors? For example, ATTRIBUTE9 becomes
                    TRAVELER_NAME in the DW.
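Example d) above lends itself to a direct back-end check. The sketch below uses hypothetical `invoices` and `payments` tables in an in-memory SQLite database; it returns every CANCELLED invoice whose payments do not sum to zero, and any row returned is a candidate push-logic bug.

```python
import sqlite3

def cancelled_invoices_with_payments(conn):
    """Flag CANCELLED invoices whose payments do not sum to zero --
    each row returned is a potential push-logic defect."""
    sql = """
        SELECT i.id, COALESCE(SUM(p.amount), 0) AS paid
        FROM invoices i
        LEFT JOIN payments p ON p.invoice_id = i.id
        WHERE i.status = 'CANCELLED'
        GROUP BY i.id
        HAVING COALESCE(SUM(p.amount), 0) <> 0
    """
    return conn.execute(sql).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (id INTEGER, status TEXT)")
    conn.execute("CREATE TABLE payments (invoice_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO invoices VALUES (?, ?)",
                     [(1, "CANCELLED"), (2, "CANCELLED"), (3, "APPROVED")])
    conn.execute("INSERT INTO payments VALUES (1, 25.0)")  # bad: cancelled but paid
    print(cancelled_invoices_with_payments(conn))
```

The other lettered examples can be expressed as similar queries, each written so that a correct load returns zero rows.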
     o     Pull logic: All logic, conditions, translations, transformations and exceptions that are part of the
           pull load are tested to make sure that the program is pulling the right data from staging tables into
           the DW. These scenarios are pulled directly from the code and business rules noted in the design
           document.
           For example:
                 a) If a record in the Fact tables has no corresponding record in the Dimension tables, would
                    it still be loaded or does it get sent to the error table? For example, should a Vendor
                    exist in the fact table but not in the dimension table or can a PTA exist in the fact table
                    but not in the dimension table, etc.
                 b) If there are supporting Fact tables, do they need to have corresponding records in another
                    Fact table? For example, can a record exist in the Payment table when no corresponding
                    invoice exists in Invoice Overview and Invoice Detail tables? Or can a record appear in
                    the Detail table but not in the Overview table, etc.




                 c) Because data is displayed differently in the DW than in the source system, records may
                    at times appear to be duplicated. Sufficient testing should be done to ensure that there
                    are no duplicate records in the DW. Any records that appear to be duplicated (but are
                    not) should be noted in the design and training documents.


     The successful completion of this phase indicates that the load programs (push/pull) are
     working the way they should and that the data is in sync in both systems.
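One simple script-based sync check is a row-count comparison between the source staging table and its DW target after a load. The sketch below uses two in-memory SQLite databases as stand-ins; the table names (`po_stage`, `po_fact`) are hypothetical.

```python
import sqlite3

# Stand-ins for the source staging database and the DW.
source = sqlite3.connect(":memory:")
dw = sqlite3.connect(":memory:")

source.executescript("""
    CREATE TABLE po_stage (po_num TEXT);
    INSERT INTO po_stage VALUES ('PO374021'), ('PO374022'), ('PO374023');
""")
dw.executescript("""
    CREATE TABLE po_fact (po_num TEXT);
    INSERT INTO po_fact VALUES ('PO374021'), ('PO374022'), ('PO374023');
""")

src_count = source.execute("SELECT COUNT(*) FROM po_stage").fetchone()[0]
dw_count = dw.execute("SELECT COUNT(*) FROM po_fact").fetchone()[0]

# A mismatch means the load dropped or duplicated rows and the two
# systems are not in sync.
print(src_count == dw_count)  # expected: True
```

In practice the same comparison would be run per table (and per incremental load window) against the real source and DW connections.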


Data Validation (a combination of unit, system and integration test phases)


     o     Forms – Front-end
           By comparing results for the same selection criteria between the source application
           front end and the reporting tool (e.g., Discoverer, Cognos, Webster), we verify that the
           DW contains reliable information that can support transactional-level inquiries.


           For example:
                 a) What is the status of invoice number 198273?
                 b) Who was the buyer that processed purchase order number PO374021?
                 c) What check number paid invoice number 198272? What was the payment date?


     o     Queries – Back-end
           SQL is used to perform complex test scenarios, and the results are compared against the
           tables in both databases. Since not all of the data from the source system is loaded into the
           DW, testing must ensure that the information that is loaded is sufficient to support the
           business activities and decision-making processes of the Institute.


           For example:
                 a) Can I see all invoices and payments associated with OFFICE DEPOT for this fiscal year?
                 b) Can I see all purchase orders for this award number for this month, and any invoice or
                    payment associated with each?
                 c) What purchase orders have invoices on hold and for what reasons?
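A back-end query for a scenario like (a) can be sketched as follows, again against an in-memory SQLite stand-in; the schema (`invoices`, `payments`) and values are invented examples, not the production tables.

```python
import sqlite3

# Miniature source/DW tables for one vendor's invoices and payments.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE invoices (invoice_num TEXT, vendor TEXT, amount REAL);
    CREATE TABLE payments (invoice_num TEXT, check_num TEXT, paid REAL);
    INSERT INTO invoices VALUES ('198272', 'OFFICE DEPOT', 100.0);
    INSERT INTO invoices VALUES ('198273', 'OFFICE DEPOT', 250.0);
    INSERT INTO payments VALUES ('198272', 'CHK-4411', 100.0);
""")

# LEFT JOIN keeps unpaid invoices in the result, which is what a tester
# needs in order to compare the full picture against the source system.
rows = conn.execute("""
    SELECT i.invoice_num, i.amount, p.check_num
    FROM invoices i
    LEFT JOIN payments p ON p.invoice_num = i.invoice_num
    WHERE i.vendor = 'OFFICE DEPOT'
    ORDER BY i.invoice_num
""").fetchall()

print(rows)  # expected: [('198272', 100.0, 'CHK-4411'), ('198273', 250.0, None)]
```

The same query run against the source tables and the DW tables should return identical result sets; any difference is a candidate bug.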






           The successful completion of this phase of testing indicates that the databases are in sync.
           This step eliminates the need to use two systems to validate results. Validation for the
           testing that occurs in the next phase is contained to the reporting tool and any custom
           views used by that tool.


Application Usability (a combination of integration and user acceptance test phases)


           This testing determines the usability of the information in the DW when using the front-
           end tool to perform multi-level (transactional and analytical) inquiries. To minimize the
           bugs found by end-user testers, internal and functional super-users should conduct this
           testing first.
           It should be assumed that any bug found in this phase is related to the business area, a
           join, or a limitation of the tool. If a query produces unexpected results, troubleshooting
           should begin with the tool. Further troubleshooting using back-end tables should occur
           only after the tool has been ruled out.


     o     Business Areas – Ad-Hoc Reporting
           This step tests the relationships between the folders. It is strictly scenario driven, and each
           scenario should answer a valid business question. The original user requirements should
           be the basis for the scenarios that are tested.
           For example:
                 a) What are the top ten vendors in dollar volume over the last year?
                 b) How many purchase orders did a particular buyer process in the last month?
                 c) Which invoices were paid against a particular PTA in the last year?


     o     Reports – Canned Reports
           This step tests the validity of canned (pre-built) reports that have been developed to meet
           additional requirements and provide easy access to commonly run queries. Testers
           should use a variety of parameters when testing the reports, and the data returned should
           produce meaningful results.







                                   TESTING PROCESS AND PROCEDURES


INTRODUCTION
Most testing will be scripted. Using scripts helps avoid confusion and increases coordination during
testing. While some open-ended testing is anticipated, it should be done only with the specific approval
of the Testing Manager. Prior to the execution of tests, test cases should be identified and scripts should
be created. At a minimum, scripts should address user requirements.


CREATE TEMPLATES
The creation of test script template(s) provides a consistent approach to the documentation of the testing
effort. Example templates can be found at Y:TestingTest Script TemplatesBlank Templates and an
example naming convention is provided in the Appendix. After obtaining a copy of the template users
should perform a “save as” and create a test script file according to the naming and saving conventions.
Use the templates according to which phase of the testing phase or cycle you are in.



CREATE SCENARIOS (TEST CASES)
Prior to testing, scenarios should be determined. Scenarios should test user requirements. Refer to the user
requirements document.



CREATE TEST DATA
For most testing efforts, test data should be identified and shared among the testers. Test data can
include employee names, employee IDs, PTAs, purchase requisition or order numbers, etc. This data is
useful when handing off to the next tester, who can then anticipate the data coming from the previous
tester. A document containing this information should be created and distributed to all testers.


CREATE SCRIPTS
The test scripts themselves should be derived from the test cases, and each scenario should be entered
onto the test script. Columns to be completed in advance include requirement number, action, expected
outcome, navigation, test data used, and tester information.
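One way to picture a script row is as a structured record with the columns just listed. The sketch below writes such a row to CSV; the field values are invented examples, not real test data.

```python
import csv
import io

# Column names mirror the guide's list of pre-filled and result columns.
FIELDS = ["requirement_no", "action", "expected_outcome", "navigation",
          "test_data", "tester", "pass", "fail", "comments"]

# One hypothetical scenario row; pass/fail/comments stay empty until
# the script is executed.
row = {
    "requirement_no": "REQ-012",
    "action": "Query invoice by number",
    "expected_outcome": "Invoice status displayed",
    "navigation": "AP > Invoices > Inquiry",
    "test_data": "invoice 198273",
    "tester": "jdoe",
    "pass": "", "fail": "", "comments": "",
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerow(row)

header = buf.getvalue().splitlines()[0]
print(header)
```

In practice the scripts are Word/Excel templates rather than CSV files, but the column structure is the same.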







EXECUTE SCRIPTS
Once the scenarios have been entered onto the scripts, testing may begin. The scripts are meant to be
both a roadmap for testing and a record of its results. Several important areas should be documented as
a result of testing: the testing date, pass/fail results, performance results, the responsibility used, and
pertinent comments (especially a Remedy ticket number for failures – more on that later).
While testing, keep the appropriate script open so that you can note necessary data as you go. Save
often to avoid losing work. Keeping your script information up to date will ease status reporting. The
Testing Manager will review scripts periodically to obtain a count of passes and fails and report this at
status meetings.


DOCUMENT RESULTS / CAPTURE BUGS
When you run your scenarios, you will find they have either passed or failed. This information should be
captured on the test script. For all testing efforts, it is necessary to document your results by tracking all
bugs and anomalies encountered during testing. Remedy will serve as the tool to report and track all
testing items that have failed. Testers without access to Remedy should complete a bug report document
and submit it to the Testing Manager.
The Remedy Help Desk ticket should track the bug from the initial report, to assignment, to resolution, to
re-test, to status reporting for management purposes, and finally to closure.
Before testing, please check the bug report. It will help avoid wasted time by showing you which bugs
are already known and their status.


Pass/Fail Entries
On the test script, enter the test results for each scenario as follows:

     If the scenario Passes:
           - Identify the data tested on your script.
           - Enter the number “1” in the “Pass” column of the script.
           - Move to the next step.


     If a scenario Fails:
           - Identify the data tested on your script.




           - Enter the number “1” in the “Fail” column of the script.
           - Check the bug report to make sure someone hasn’t already reported the problem.
           - Capture the failure with a screen shot, using either Word or Snag-it. (See below for more
             information, and the Appendix for the file name convention and storage location.)
           - Make sure the Testing Manager or the IMSS Development team knows as soon as possible
             that you have encountered a new problem.
           - If you do not have access to Remedy, submit a bug report using the form found at
             http://support.caltech.edu/testingsupport/
           - The Testing Manager or designee will open a Remedy ticket from the bug reports submitted
             and include the information received on the report form.


     Re-testing scenarios that previously Failed:
           - After a bug has been fixed, the original tester will be asked to perform a re-test.
           - Reply to the email with your results. This information will be posted to the Remedy ticket
             work log, and the ticket will either be resolved (if the re-test passes) or returned to the
             developer (if it fails).
           - If the scenario is now successful (i.e., the previous problem is fixed), move the “1” from
             the “Fail” column to the “Pass” column.
           - DO NOT erase the original results in the “Comments” column. Add the new pass
             documentation to that column, or add any new information if the scenario still fails.
           - Notify the Testing Manager of the outcome immediately.


Should you need to capture a screen shot, use Snag-it or MS Word. Snag-it is preferred over Word
because it produces smaller files; however, Word can be used if you do not already have Snag-it
installed. Contact the Testing Manager should you need instructions for Snag-it.


ENTERING AND TRACKING BUGS IN REMEDY
Not all testers will enter bugs directly into Remedy: internal testers typically do, while most external
testers will complete a bug report and submit it to the Testing Manager. For those who have access to
Remedy, tickets should be entered in a standard way so that the necessary reports can be easily
generated; this should reduce the need for ad hoc status requests if the cases are kept up to date.
Process
     1. Once it has been determined that a bug (error) has been found and/or that a testing step has failed,
        the tester should either open a Remedy ticket or create a Bug Report. The bug report form for
           non-Remedy users can be found at http://support.caltech.edu/. All fields should be completed.
           Once complete submit and it will be automatically forwarded to the Testing Manager. The
           Testing Manager will open a Remedy Help Desk (HD) ticket. The ticket will be assigned to the
           appropriate IMSS development team for further analysis.


     The following Remedy categorizations should be used:

                       Category            - Application Upgrade
                       Type                - Application Testing
                       Item                - Bug, Functional Gap, Perform Issue, Training/Doc, Unclassified
                       Module              - [Enter Project Name from LOV]
                       Summary             - Include the module acronym in the Summary field along with a short
                                             description of the problem, e.g. “GL – Unable to Enter Journal.”
                       Vendor Tracking #   - Enter the TAR number, if applicable
                       Application Version - [Enter from LOV]

     2. The Testing Manager, team lead or his or her designee assigns the ticket to an individual or
        group. That individual is responsible for updating the ticket. If appropriate, updates can include
        changes to description, summary, or categorization.
     3. The tester shall create a screen shot of the error and file it. The file name and location of the
        screen shot, plus any other backup information, should be entered on the test script. The Bug
        Report shall likewise include the file names and locations of screen shots, query names, and
        backup information. Be sure to follow the naming convention and enter the requested
        information previously described.
     4. Remedy prioritization is defined as follows:


                 Urgent:           Show stopper; cannot go live with the bug.
                 High:             Time-critical and impacting more than one person, but not of Institute-level
                                   impact.
                 Med:              The default priority. Items that affect one or two people and are not
                                   immediately time-critical.
                 Low:              An annoyance with no major impact; a workaround has been found.


     5. Only after a fix has been confirmed should the HD ticket be set to ‘Resolved’ status. The ticket
        shall be assigned back to the Testing Manager, who coordinates confirmation of the fix with the
        end user.





     6. The Testing Manager will review open tickets on a regular basis to confirm progress is being
        made. Close attention will be paid to tickets that have been open for a long time, have urgent or
        high priority, or have not been addressed or received follow-up.
     7. Always log a Remedy ticket for bugs found, and document, document, document! It is highly
        recommended that you communicate via email when discussing the problem and its resolution;
        you can then cut and paste your emails into the Remedy work log. This is an excellent way to
        document and a great reference should the problem recur in the future.



LOGGING TARS (ORACLE ONLY)

Only log a TAR after a Remedy ticket has been created. A TAR is used when an Oracle bug is found that
cannot be resolved by IMSS. TARs are logged in Metalink, the Oracle tracking system. Only IMSS
Analysts, Developers, and DBAs will log TARs; all other testers should seek out a member of one of
these groups if a TAR is required.
After requesting a user name and password from IMSS Infrastructure, go to www.oracle.com and navigate
as follows:
     - Select Support in the Resources menu.
     - Select Metalink from the Support Services menu.
     - Select Metalink login (for registered users).
     - Enter your user name and password.
     - Begin entering your TAR as instructed.
When logging a TAR with Oracle, be sure to use the correct CSI number. Contact the Infrastructure
manager to obtain this number.
Reminder: Please be sure a Remedy ticket is created before the TAR is logged. This becomes very
important after the TAR is removed from Metalink and can no longer be retrieved. Be sure to update the
“Vendor Bug Number” field in the Remedy ticket. After logging your TAR, please notify others of the
problem and its status.



CODE MIGRATION AND PATCHING

Custom code migrations into the TEST instance will need to be closely coordinated to ensure that proper
handoffs are made and that the test instance is not corrupted. For the most current version of the migration
procedure, please contact the Development Manager.
Patching is often necessary to address bugs found during testing. Patches are normally provided by the
vendor. Like code migration, patching will be closely coordinated to minimize unnecessary code changes
and to prevent corruption of the test instance. The patching process will be tracked in Remedy via the
change request process. The help desk ticket opened for the bug found should be related to the change
request ticket. A patch document must be created for each patch under consideration. A meeting is
normally held to determine whether the patch is viable and whether it will be applied. The Testing
Manager should be kept well informed of the application of a patch to the test instance and should
coordinate instance availability and regression testing. For the most current version of the patching
procedure, please contact the Development Manager.







Testing Process Summary
All testing should follow the defined process described below.

     - Create Test Templates
     - Create Test Scenarios
     - Execute Scripts
     - Document Results
     - Capture Bugs
     - Hand-off to next tester
     - Fix Bugs
     - Retest and Update Test Scripts


                                                            Testing Process Diagram

     Create Test Templates → Create Test Scenarios → Execute Scripts → Document Results
     On success: Hand-off to next tester
     On failure: Capture Bugs → Fix Bugs → Retest and Update Test Scripts






                                                 GENERAL INFORMATION

ENVIRONMENT
The testing environment includes, but is not limited to, the system architecture, testing tools, system
access, a test instance, and the location where the testing will be performed. Access to the test instance, a
computer, the ‘Y’ drive, and a printer will be necessary.
     - IMSS Security or the Testing Manager will grant access to the testing instance (application) with
       the appropriate responsibilities.
     - If you have access problems, please contact the Testing Manager.
Familiarize yourself with the ‘Y’ drive structure. Note which phase or cycle of testing you are in and
where to store your files; standards are described below. Understand where to find script templates and
where to store actual scripts with results, and note where to store screen-shot backup. Contact the
Testing Manager if you have questions about files or the ‘Y’ drive structure.
http://atcdba-support.caltech.edu/tnd_apps.htm


COMMUNICATION
Communicating hand-offs and other events affecting other testers and developers is extremely important
to the success and on-time delivery of a project. In order to maintain a continuous flow of effort, a
tester/developer should notify others when an action has been completed so the next action can begin.
Email, personal communication, telephone, and voice mail can all play an integral part in communicating
an event.

An email distribution list will be created for most projects. This list is intended to keep testers informed of
the various testing activities, including status, updates, reminders, instructions, procedures, etc. If you
need to use the distribution list, it is suggested that you first contact the Project or Testing Manager.



MEETINGS
For larger projects that involve many testers, a training/kickoff meeting should be held. Periodic status
meetings are especially important for the testers and management to attend. The discussion should focus
on the following issues: pass/fail status, urgent and high-priority bugs/issues, schedule, and other topics
as required.







TESTING ROOM / TEST BED
A testing facility will be reserved if available. A test bed has been created to test the minimum system
requirements for most software applications; platforms include Macintosh, Linux, and PC with various
operating systems and browsers. Contact the Testing Manager to reserve this facility.


STATUS REPORTING
The Testing Manager will provide the testing team and management with periodic status reports. Because
the information is taken from the test scripts, it is extremely important to keep them current. Several
pieces of data will be tracked:


     - Total number of test scenarios.
     - Number of scenarios passed.
     - Number of scenarios failed.
     - Number of scenarios incomplete.
     - Percent of scenarios completed.


Other information may be included:
     - How many times was each scenario performed?
     - Are we on schedule?
     - If not, what is the hold-up, and what is the recovery plan?
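The tracked metrics follow mechanically from the per-scenario “1” entries in the Pass and Fail columns. A small sketch with invented data:

```python
# Each dict mirrors one test-script row's result columns; the scenario
# IDs and values are made up for illustration.
scenarios = [
    {"id": "S1", "pass": 1, "fail": 0},
    {"id": "S2", "pass": 0, "fail": 1},
    {"id": "S3", "pass": 1, "fail": 0},
    {"id": "S4", "pass": 0, "fail": 0},  # not yet run
]

total = len(scenarios)
passed = sum(s["pass"] for s in scenarios)
failed = sum(s["fail"] for s in scenarios)
incomplete = total - passed - failed
pct_complete = 100.0 * (passed + failed) / total

print(total, passed, failed, incomplete, pct_complete)  # 4 2 1 1 75.0
```

Because the report is only as good as the scripts it is computed from, keeping the Pass/Fail columns current is what makes these numbers trustworthy.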


STANDARD HEADER/FOOTER
All files created, such as Word and Excel documents, should contain specific information in the header
and footer. This will assist testers and others in finding files (by displaying the full path and file name),
determining when they were last modified (by coding in the save date, not the print date), and identifying
the author. In summary, include the following information on every document:

     - Full path and file name
     - Creator’s name
     - Save date
     - Number of pages (Page X of Y)






LOCATIONS OF USEFUL RELATED DOCUMENTATION
Refer to the application testing support site for additional information and assistance. It is found at
http://www.support.caltech.edu/testingsupport/index.htm.







                                                                 APPENDIX


FILE NAMING STANDARDS AND STORAGE LOCATIONS (EXAMPLE)








IMSS SYSTEMS AND MODULES (ACRONYMS)
For the most current version of the systems and modules document, please see P:\Documentation
(CIT)\Policy-Standards\Development Standards\ATC Development Standards.doc.


System                                  Module              Module Description
                                        Acronym
DATA WAREHOUSE
                                        DIS                 Discoverer
                                        LDQ                 LD Query
                                        RDB                 Relational Database Management
                                        WBS                 Webster
EXETER
                                        CMN                 Common
                                        CMS                 Career Management Services
                                        GEN                 General
                                        MLS                 CIT Mail Services
                                        SAS                 Student Aid (Financial Aid) System
                                        SBS                 Student Billing (Bursar) System
                                        SHS                 Student Housing System
                                        SMS                 Student Marketing (Undergraduate Admissions) System
                                        SSS                 Student Services (Registrar) System
                                        SWS                 Student Web System
FAMIS
                                        FWB                 Famis Web
                                        INC                 Inventory Control
                                        KC                  Key Control
                                        MMM                 Maintenance Management
                                        SPM                 Space Management





IMSS Systems and Modules (continued)


System                                  Module              Module Description
                                        Acronym
MISCELLANEOUS
                                        APX                 AppWorx
                                        BK                  Bookstore Web Utility
                                        BTX                 BudgeText
                                        CAT                 Catermate
                                        CBD                 CBord
                                        CC                  Counseling Center
                                        CD                  Catalog Database
                                        CEI                 Campus Card
                                        DMS                 Athenaeum
                                        FLS                 Federal Liaison Services
                                        GAT                 GATES (graduate office)
                                        IPS                 Interim Pre-Award System
                                        KRO                 Kronos
                                        MMS                 Win Mail Management System
                                        NPW                 Northern Prowatch
                                        PAS                 Passport (Pcard)
                                        SIG                 Sig Forms
                                        TEL                 Telecomm
                                        UID                 Universal Identifier
                                        WNS                 Windstar



ORACLE
                                        AOL                 Application Objects Library
                                        AP                  Accounts Payable




                                        AR                  Accounts Receivable
                                        BB                  Benefit Billing
                                        CM                  Cash Management
                                        CWS                 Workstudy
                                        FA                  Fixed Assets
                                        GL                  General Ledger
                                        GMS                 Grants Management System
                                        HR                  Human Resources
                                        IC                  Internal Charges
                                        IN                  Inventory
                                        LD                  Labor Distribution
                                        PAN                 EPAN
                                        PAY                 Payroll
                                        PO                  Purchasing
                                        PRK                 Parking Database
See below for breakout                  WA                  Web Apps


ORACLE Web Apps
                                        WIC                 Web Internal Charges
                                        PIC                 Personal Internal Charges
                                        EPAN                Web Personal Action Notification
                                        EVAL                Performance Evaluation
                                        ASI                 Annual Salary Increase
                                        WREQ                Web Requisition
                                        EXPDEF              Expenditure Definitions
                                        PTA                 PTA Query





Testing guide

  • 1. CALIFORNIA INSTITUTE OF TECHNOLOGY INFORMATION MANAGEMENT SYSTEMS & SERVICES SOFTWARE TESTING GUIDE Primary Author: David Gardner Creation Date: 3/16/05 1:10:00 PM Last Updated: 3/2/06 4:40:00 PM File Name: IMSS Software Testing Guide.doc Approvals: David Gardner: Testing Manager, Information Management Systems & Services TBD: Project Manager, Information Management Systems & Services TBD: Director, Information Management Systems & Services
  • 2. Information Management Systems & Services Software Development Testing Guide Table of Contents INTRODUCTION..............................................................................................................................3 PHILOSOPHY OF T ESTING............................................................................ 4 OVERALL GOALS OF THE T ESTING EFFORT .................................................... 4 T ESTING ASSUMPTIONS.............................................................................. 6 T ESTING ROLES ........................................................................................ 7 TESTING STRATEGY (METHODOLOGY) ....................................................................................9 INTRODUCTION ......................................................................................... 9 STAGES OF THE T ESTING LIFE CYCLE ........................................................... 9 T EST PLAN............................................................................................. 11 SAMPLE T EST STRATEGY (ORACLE UPGRADE) ............................................. 11 SAMPLE T EST STRATEGY (DATA WAREHOUSE )............................................ 14 TESTING PROCESS AND PROCEDURES .................................................................................18 INTRODUCTION ....................................................................................... 18 CREATE T EMPLATES................................................................................ 18 CREATE SCENARIOS (TEST CASES)............................................................. 18 CREATE T EST DATA ................................................................................ 18 CREATE SCRIPTS..................................................................................... 18 EXECUTE SCRIPTS................................................................................... 
19 DOCUMENT RESULTS/CAPTURE BUGS ........................................................ 19 ENTERI NG AND T RACKING BUGS IN REMEDY ............................................... 20 LOGGING TAR S (ORACLE ONLY)............................................................... 22 CODE MIGRATION AND PATCHING BUGS ..................................................... 22 GENERAL INFORMATION ...........................................................................................................25 ENVIRONMENT........................................................................................ 25 COMMUNICATION.................................................................................... 25 MEETINGS .............................................................................................. 25 T ESTING ROOM /TEST BED ....................................................................... 25 STATUS REPORTING................................................................................. 26 STANDARD HEADER/FOOTER .................................................................... 26 LOCATIONS OF USEFUL RELATED DOCUMENTATION....................................... 26 APPENDIX .....................................................................................................................................28 FILE NAMING STANDARDS AND STORAGE LOCATIONS (EXAMPLE )................... 28 IMSS SYSTEMS AND MODULES (ACRONYMS) .............................................. 29 Y:TestingMgmt Support Info Testing GuidesIMSS Software Testing Guide_Backup.doc Saved Date: 3/2/06 4:40 PM Page 2 of 32
  • 3. Information Management Systems & Services Software Development Testing Guide INTRODUCTION The purpose of this document is to provide guidance to software testers involved in IMSS software development projects that may be either a new implementation or upgrade. Because this document is intended to be general in nature, project specific information will be provided elsewhere, although some project specific examples may be used. The goal of this document is to give those involved an understanding and a set of tools to make the software testing effort more effective and consistent. This document does not necessarily cover all aspects of testing but will address the following: ? Testing Philosophy – Reasons for testing, goals, and assumptions ? Testing Strategy, Risks, Planning ? Testing Roles and Responsibilities – roles are defined to help ensure quality and understand who is responsible for the different areas of the testing effort ? Testing Processes, Procedures, and Documentation Requirements ? Testing Environment and Tools – the environment considers the software and hardware required to drive the system Y:TestingMgmt Support Info Testing GuidesIMSS Software Testing Guide_Backup.doc Saved Date: 3/2/06 4:40 PM Page 3 of 32
  • 4. Information Management Systems & Services Software Development Testing Guide PHILOSOPHY OF TESTING Why is testing necessary? Testing minimizes bugs before production. This applies to both purchased software and software developed in -house. Since purchased software is not always delivered to meet customer requirements, customizations, interfaces, and other soft ware modifications must be tested. Testing is needed to find faults (bugs) in the software, ensure user requirements are being met, and to deliver a quality product to the customer. The success of the testing effort and ultimately the success of the software development project depend on several factors. Strategy, goals, assumptions, and roles are important in this effort and are described below. OVERALL GOALS OF THE TESTING EFFORT 1. Minimize bugs and work-arounds: No system is bug free, but we can be aware of the majority of bugs and know how to either avoid or work around them. As the deadline for implementation approaches it will become more important to have system stability than to eliminate every non-critical bug. One of the goals of a software upgrade rather than a new implementation is to eliminate several persistent problems or bugs from the current software version. As a result, we need to specifically ensure bug fixes are applied and working as intended. 2. Ensure business processes are tested and application is useable. The functional users of the system must be able to conduct specific transactions to successfully complete their business processes. It is the responsibility of the IMSS Development team to ensure the system meets those needs and that no required functionality has broken in the process. In addition, it is the responsibility of the functional users to identify their business processes to test. While a development effort has far reaching impact, not all administrative processes are necessarily affected. Testing should focus on those with known impact. 
If the users of the system can exercise all of their needed processes during testing, there should be a high confidence in the ability of the system to operate in a production environment. 3. Ensure data are valid. Validation of data will be a major goal of the testing effort. Accuracy of data passing between and entering systems must be ensured. 4. Aim for no user surprises at go-live. Testing and training are both required to meet this goal. Testing is a process that can identify potential surprises and result in their elimination. Training aims to tell users of any surprises not eliminated during the testing processes. Training will be the responsibility of the Training Manager and is not covered in this document. 5. Confirm that system performance is acceptable. One of the goals of the testing effort will be to ensure query results are returned within an acceptable time frame, especially during peak periods of use. 6. Ensure data security is satisfied. Data should only be accessible to those with a need to know and meets the requirements of the Institute and the IMSS Security group. Y:TestingMgmt Support Info Testing GuidesIMSS Software Testing Guide_Backup.doc Saved Date: 3/2/06 4:40 PM Page 4 of 32
  • 5. Information Management Systems & Services Software Development Testing Guide 7. Validate what worked by testing and documenting. As new problems are found in the system it is extremely important to det ermine if they are new problems or not. This determination helps to reduce the problem of finding what has changed since the test was previously executed. Documenting both the successes and failures of testing helps to ease the process of determining what has changed. More on documentation requirements will be discussed in a later section. Y:TestingMgmt Support Info Testing GuidesIMSS Software Testing Guide_Backup.doc Saved Date: 3/2/06 4:40 PM Page 5 of 32
TESTING ASSUMPTIONS

There are assumptions that underlie our overall approach to testing:

1. User requirements are known and the design is stable. Testing should be performed against the user requirements, and test cases are identified from them. The test case scenarios shall be written and scripted before testing begins. Changes to the software design once testing has begun can cause delay and ultimately place the project at risk.

2. Most testing will be scripted. Scripting helps to avoid difficulties in proving what worked and what did not, and also provides a record of what was tested. While some ad-hoc testing is anticipated, it should be done only in specific coordination with the Testing Manager.

3. Testing could occur in a variety of settings. Testing may occur in a controlled test environment (preferable) or at a user's personal workstation. Because coordination and the timely execution of test scenarios are more difficult outside a controlled test area, it becomes extremely important to identify test cases in advance, write the scenarios, enter them onto the test script, execute, and communicate hand-off to others affected in a timely manner.

4. Tests will be executed using a variety of client computer configurations. While no specific effort will be made to try every scenario or function on every combination of browser, operating system, and platform, using a variety of machines that reflect typical configurations should help to identify any problems related to client computer configuration. The recommended minimum system requirements should be tested, as well as any other advertised configuration. Contact the Testing Manager for test bed access and availability.

5. Development staff shall be available to assist testers with data validation. Because accurate data is a key success factor, validating test scenarios is extremely important. Some validation may require back-end querying. Should this be required, IMSS developers will run the queries and be available to assist with validating scenario results.

6. Functional testers will be available. Experienced functional system users will be available for testing. Preferably the same set of users will be available throughout the testing effort.

7. Whenever possible, automated regression testing will be used. Tools exist that allow automation of certain portions of the test execution process. If possible, such tools will be used during the testing effort.

8. The testing instance shall remain controlled. There will be a high degree of control over changes to the test instance. Patches and migrations shall be coordinated with the Testing Manager. Design changes shall be communicated before they are applied.

9. There will be a high degree of control over patches and migrations. A good testing process requires an understanding of changes in the testing environment over time. Two key items for our environment are the application of vendor patches and custom development migrations. We will coordinate all patch applications and migrations in order to understand their potential impact.
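As an illustration of the back-end validation mentioned in assumption 5, a developer's check might look like the following sketch. It uses SQLite and an invented invoices table purely for illustration; in practice IMSS developers would run an equivalent query against the application database, and the table and column names here are assumptions, not actual IMSS schema.

```python
import sqlite3

# Hypothetical schema for illustration only -- real validation would run
# against the application database, not an in-memory SQLite store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (invoice_num TEXT, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?, ?)",
    [("198273", "PAID", 120.00), ("198274", "CANCELLED", 0.00)],
)

def validate_scenario(invoice_num, expected_status):
    """Back-end check: confirm the status a tester saw on screen
    matches what is actually stored in the database."""
    row = conn.execute(
        "SELECT status FROM invoices WHERE invoice_num = ?", (invoice_num,)
    ).fetchone()
    return row is not None and row[0] == expected_status

print(validate_scenario("198273", "PAID"))   # expected: True
print(validate_scenario("198274", "PAID"))   # expected: False
```

A query like this gives the tester an objective pass/fail answer for a scenario result instead of relying on what the screen appears to show.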
TESTING ROLES

It is extremely important for the success of this project to have participation from several groups of people. A high level of communication between these groups will increase the success of the test effort. Below is a list of roles and their descriptions:

Testing Manager/Coordinator
- Directs the testing effort
- Coordinates the development and execution of the testing methodology
- Coordinates the development of a testing guide document
- Determines testing tasks, coordinates resources, and establishes the testing schedule

Internal Testers
- Typically IMSS systems analysts and high-level functional testers
- Support testing and issue resolution
- Identify and document test cases, testing scenarios or business processes, and prepare test scripts
- Execute and document test results

Functional Testers
- Super users from organizations outside of IMSS (typically orgs within VP Business & Finance and Student Affairs)
- Work closely with IMSS systems analysts
- Support testing and issue resolution
- Identify and document test cases, testing scenarios or business processes, and prepare test scripts
- Facilitate acceptance testing
- Execute and document test results

End User Testers
- Typically led by the functional testers
- Support testing and issue resolution
- Could identify and document testing scenarios or business processes, and prepare test scripts
- Facilitate acceptance testing
- Document test results

IMSS Developers
- Support testing, testers, and issue resolution
- Document test results
TESTING STRATEGY (METHODOLOGY)

INTRODUCTION

A test strategy describes the overall approach, objectives, and direction of the testing effort. The purpose of a testing strategy or methodology is to limit risk and ultimately deliver the best possible software to the customer. The testing strategy chosen for a particular application will vary depending on the software, the amount of use, and its particular goals. For instance, the testing strategy for a transactional system like Oracle will be very different from the strategy developed for testing an analytical tool like the Data Warehouse. Likewise, a campus-wide purchasing system versus a limited-user-base tool for Housing will entail much different test strategies. Because some of these examples have higher exposure, they also carry higher risk.

STAGES OF THE TESTING LIFE CYCLE

IMSS has employed several variations of one or more testing methodologies. Typical stages have included preparation, conference room pilot (CRP), unit, integration, system, and user acceptance testing. These stages are also referred to as phases or levels. The project manager should review the phases below and consider similar terminology and sequence. If it makes sense, some phases and tasks may be deleted; in other cases, tasks and phases may need to be added. Some tasks may run in parallel, and some stages might be combined. In most cases, each phase must be completed before the next can begin. The duration and execution of tasks will vary depending on schedule and the risk the project manager is willing to absorb.

Test Preparation Phase (before testing begins)
- Task – Develop test strategy
- Task – Develop high-level test plan
- Task – Identify test cases
- Task – Develop scenarios, test scripts
- Task – Identify and share test data
- Task – Determine processes, procedures, standards, documentation requirements
- Task – Identify and create test environment
- Task – Identify testing team(s)
- Task – Train testers
Unit Test Phase – The purpose of this test phase is to verify and validate that independent modules function properly. It is completed by the developers and must be finished before future phases can begin. The Testing Manager is not normally involved in this phase.

CRP Phase (Conference Room Pilot – Optional) – The purpose of this phase is to verify proof of concept. A CRP is generally needed for new, large, and unproven projects.
- Assumption – Test instance is ready
- Assumption – Metadata is inserted into test instance
- Assumption – Unit testing and mock-up are complete
- Assumption – Test scenarios have been identified (scripted or ad hoc)
- Task – Identify CRP participants
- Task – Determine and finalize CRP logistics
- Task – Set expectations
- Task – Begin CRP
- Task – Gather and document feedback
- Task – End CRP
- Task – Obtain phase completion approval/sign-off
- Task – Gather/share/integrate lessons learned; incorporate necessary changes
- Task – Tune/revise/re-approve test plan

Integration Test Phase – The purpose of this test phase is to verify and validate that all modules are interfaced and work together.
- Assumption – Requirements are frozen and design is determined
- Assumption – Application is ready for integration testing
- Assumption – Metadata has been populated into test instance tables
- Assumption – Unit testing is complete
- Task – Test system and document using test scripts
- Task – Test interfaces
- Task – Identify and report bugs
- Task – Retest fixed bugs/regression test
- Task – Test security
- Task – Test browsers/platforms/operating systems
- Task – Obtain phase completion approval/sign-off
- Task – Gather/share/integrate lessons learned
- Task – Tune/revise/re-approve test plan

System Test Phase – The purpose of this test phase is to verify and validate that the system works as if it were production.
- Assumption – Metadata has been populated into test instance
- Assumption – Application is ready and has successfully completed integration testing
- Task – Test system and document using test scripts
- Task – Identify and report bugs
- Task – Retest fixed bugs/regression test
- Task – Test business processes and reports
- Task – Stress test
- Task – Performance test (e.g., screen refreshes)
- Task – Test security: logins, responsibilities, hacking
- Task – Obtain phase completion approval/sign-off
- Task – Gather/share/integrate lessons learned
- Task – Tune/revise/re-approve test plan

User Acceptance Phase – The purpose of this test phase is to verify and validate that the system works by and for the end users as if it were production.
- Assumption – Show-stoppers and most high-severity bugs have been fixed, or work-arounds have been identified and approved
- Assumption – All other phases have been signed off
- Assumption – Application is ready for user acceptance testing
- Assumption – Metadata has been populated into test instance tables
- Task – Train end user testers
- Task – Populate and approve test scripts
- Task – Test system and document using test scripts
- Task – Obtain phase completion approval/sign-off
- Task – Gather/share/integrate lessons learned

TEST PLAN

A test plan should be created for all significant projects. The test plan documents the tasks that will be performed, the sequence of testing, the schedule, and who is responsible for completing each task. It is also a reflection of the strategy chosen for the testing effort. This plan should be linked to the overall project plan. MS Project is often used to create a test plan.

SAMPLE TEST STRATEGY (ORACLE UPGRADE)

In the case of an Oracle upgrade we may require several cycles of testing. Each cycle would represent an actual upgrade, and several phases of testing may occur during each cycle. Typically, each cycle builds on the previous one, yet is intended to be shorter, reflecting the assumptions that the system will be more stable as we progress to the next cycle, that users will be more familiar with the system, and that generally less rework will be required.
Dry Run

Before formal testing begins, a dry run upgrade may be created for the development and infrastructure teams. During this time several tasks may be performed, including validation of system usability, testing of third-party systems, testing database compatibilities (links) for Exeter and FAMIS, code remediation, and identification of new functionality. Button and form testing can be performed during the dry run cycle in order to minimize impact to the integration testing phase of cycle one. Testing and fixing issues early should help expedite the formal testing process.

- Goals
  o Confirm the basic application works
  o Validate data integrity
  o Ensure third-party systems are compatible
  o Ensure database compatibilities (links) for Exeter and FAMIS
  o Remediate and migrate code, including interfaces
  o Identify new functionality
  o Test buttons and forms
  o Report bugs encountered

Cycle 1

The first major testing cycle will include two phases after the application has been validated: integration testing and system testing. The goal of integration testing is to test existing functionality, tweaks and customizations (interfaces), address open help desk tickets that the upgrade may resolve, test custom and standard reports, and test new functionality. The second phase is system testing. The purpose of this phase is to validate and verify that the system works as if it were production. This phase will also focus on finding and reducing bugs in the system found during earlier phases and before go-live. It is expected that only IMSS and functional analysts will participate in cycle one testing. Remember, these tasks can change or move to other phases at any time.

- Application Validation Goals
  o Confirm the basic application works
  o Abbreviated button/form testing
- Integration Testing Goals
  o Confirm the basic application works
  o Test existing functionality
  o Test tweaks/customizations
  o Test open HD tickets pending upgrade
  o Test custom/standard reports
  o Test new functionality
  o Test interfaces
  o Report bugs encountered

- System Testing Goals
  o Test business processes
  o Test web apps
  o Test security
  o Test CIT-developed apps
  o Test internal charges (auto feeds – I/F's)
  o Test third-party apps

Cycle 2

Cycle two will repeat the integration and system test phases, but at a much quicker pace and with additional goals and testers. Moreover, an abbreviated application validation should be performed to ensure the application works before users begin testing. It is expected that most bugs will have been resolved in cycle one.

- Application Validation Goals
  o Confirm the basic application works
  o Abbreviated button/form testing

- Integration Testing Goals
  o Test existing functionality (including new functionality)
  o Test tweaks/customizations
  o Regression test fixed bugs
  o Test interfaces
  o Test business processes
  o Test responsibilities
  o Test custom/standard reports
  o Report bugs encountered

- System Testing Goals
  o Test web apps
  o Test security
  o Test CIT-developed apps
  o Test internal charges
  o Test third-party apps
  o Test Appworx

SAMPLE TEST STRATEGY (DATA WAREHOUSE)

Introduction

Due to the nature of a Data Warehouse, testing the validity of each data mart will require an approach different from that of testing a transactional system (e.g., Oracle Applications). Because the data warehouse and the source system are designed to perform different functions, the table structures of the two systems differ greatly. The main difficulty found when testing is validating query results between the systems. Not all of the data in the source system is loaded into the warehouse, and the data that is loaded is often transformed. Therefore, comparisons between the systems are difficult, and troubleshooting becomes extremely complex when trying to identify points of failure. The testing method introduced in this plan is designed to help streamline the process, making it easier to pinpoint problems and cut down on confusion for the testers while at the same time expediting the testing phases.

Testing Phases
Data mart testing should be divided into three distinct phases: Instance Validation, Data Validation, and Application Usability. This testing is designed to test the completeness, correctness, and performance of each data mart.

Instance Validation (a combination of unit and system test phases)

  o Initial and incremental load: The designated development team will compare the row counts between the staging tables and the DW tables to make sure all pushed records are pulled over to the DW tables. The DW tables should capture, through incremental loads, all changes that occur in the source tables. A script should be used to capture this information.

  o Push logic: All conditions, translations, formulas, and calculations that are part of the push load are tested to make sure they provide useful and valid data to the DW. These scenarios are pulled directly from the code and business rules noted in the design document. For example:
    a) Can a PO agent be inactive in the HR table but still active in the PO agent table?
    b) Are there any payments being pushed over to the DW with no invoice associated with them?
    c) Does the PO distribution amount calculate correctly?
    d) If the invoice status is CANCELLED, should the sum of all payment amounts equal 0?
    e) Are there any translation or mapping errors? For example, ATTRIBUTE9 becomes TRAVELER_NAME in the DW.

  o Pull logic: All logic, conditions, translations, transformations, and exceptions that are part of the pull load are tested to make sure that the program is pulling the right data from the staging tables into the DW. These scenarios are pulled directly from the code and business rules noted in the design document. For example:
    a) If a record in the fact tables has no corresponding record in the dimension tables, would it still be loaded or does it get sent to the error table? For example, should a vendor exist in the fact table but not in the dimension table, or can a PTA exist in the fact table but not in the dimension table, etc.?
    b) If there are supporting fact tables, do they need to have corresponding records in another fact table? For example, can a record exist in the Payment table when no corresponding invoice exists in the Invoice Overview and Invoice Detail tables? Or can a record appear in the Detail table but not in the Overview table, etc.?
    c) Because data is displayed differently in the DW than in the source system, records may at times appear to be duplicated. Sufficient testing should be done to ensure that there are no duplicate records in the DW. Any records that appear to be duplicated (but are not) should be noted in the design and training documents.

The successful completion of this phase indicates that the load programs (push/pull) are working the way they should and that the data is in sync in both systems.

Data Validation (a combination of unit, system, and integration test phases)

  o Forms – Front-end: By comparing results using the same selection criteria between the source application front-end and the application tool (e.g., Discoverer, Cognos, Webster), we will know that the DW contains reliable information that can support transactional-level inquiries. For example:
    a) What is the status of invoice number 198273?
    b) Who was the buyer that processed purchase order number PO374021?
    c) What check number paid invoice number 198272? What was the payment date?

  o Queries – Back-end: By using SQL to perform complex test scenarios, results are compared against the tables in both databases. Since not all of the data from the source system is loaded into the DW, testing must be done to ensure that the information that is loaded is sufficient to support the business activities and decision-making processes of the Institute. For example:
    a) Can I see all invoices and payments associated with OFFICE DEPOT for this fiscal year?
    b) Can I see all purchase orders for this award number for this month, and any invoice or payment associated with each?
    c) Which purchase orders have invoices on hold, and for what reasons?
The successful completion of this phase of testing indicates that the databases are in sync. This step eliminates any further need to use two systems to validate results. Validation for the testing that occurs in the next phase is contained to the reporting tool and any custom views used by that tool.

Application Usability (a combination of integration and user acceptance test phases)

This testing determines the usability of the information in the DW when using the front-end tool to perform multi-level (transactional and analytical) inquiries. To minimize bugs found by end user testers, internal and functional super-users should conduct this testing first. It should be assumed that any bug found in this phase is directly related to the business area, a join, or a limitation of the tool. If a query produces unexpected results, troubleshooting should begin with the tool. Further troubleshooting using back-end tables should occur only after the tool has been ruled out.

  o Business Areas – Ad-Hoc Reporting: This step tests the relationships between the folders. It is strictly scenario driven and should answer a valid business question. Original user requirements should be the basis for the scenarios that are tested. For example:
    a) What are the top ten vendors in dollar volume over the last year?
    b) How many purchase orders did a particular buyer process in the last month?
    c) Which invoices were paid against a particular PTA in the last year?

  o Reports – Canned Reports: This step tests the validity of canned (pre-built) reports that have been developed to meet further requirements and provide easy access for commonly asked queries. Testers should use a variety of parameters when testing the reports. Data returned should produce meaningful results.
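The Instance Validation row-count comparison described above can be captured in a script along these lines. This is a sketch only: it uses SQLite and hypothetical staging/warehouse table names (stg_invoices, dw_invoices); a real script would compare the actual staging schema against the DW tables.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical staging and warehouse tables -- names are illustrative only.
conn.execute("CREATE TABLE stg_invoices (invoice_num TEXT)")
conn.execute("CREATE TABLE dw_invoices (invoice_num TEXT)")
conn.executemany("INSERT INTO stg_invoices VALUES (?)", [("1",), ("2",), ("3",)])
conn.executemany("INSERT INTO dw_invoices VALUES (?)", [("1",), ("2",)])

def count(table):
    """Row count for one table (used to compare staging vs. DW)."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def missing_from_dw():
    """Rows pushed to staging that never arrived in the warehouse."""
    rows = conn.execute(
        "SELECT invoice_num FROM stg_invoices "
        "EXCEPT SELECT invoice_num FROM dw_invoices"
    )
    return [r[0] for r in rows]

stg, dw = count("stg_invoices"), count("dw_invoices")
print(f"staging={stg} warehouse={dw} missing={missing_from_dw()}")
```

Capturing the counts and the missing keys in one report gives testers a concrete starting point for troubleshooting, instead of only knowing that the totals disagree.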
TESTING PROCESS AND PROCEDURES

INTRODUCTION

Most testing will be scripted. Using scripts helps to avoid confusion and increase coordination during testing. While some open-ended testing is anticipated, it should be done only with the specific approval of the Testing Manager. Prior to the execution of tests, test cases should be identified and scripts should be created. At a minimum, scripts should address user requirements.

CREATE TEMPLATES

The creation of test script template(s) provides a consistent approach to the documentation of the testing effort. Example templates can be found at Y:\Testing\Test Script Templates\Blank Templates, and an example naming convention is provided in the Appendix. After obtaining a copy of the template, testers should perform a "save as" and create a test script file according to the naming and saving conventions. Use the template that corresponds to the testing phase or cycle you are in.

CREATE SCENARIOS (TEST CASES)

Prior to testing, scenarios should be determined. Scenarios should test user requirements; refer to the user requirements document.

CREATE TEST DATA

For most testing efforts, test data should be identified and shared among the testers. Test data can range from employee names, employee IDs, and PTAs to purchase requisition or order numbers. This data is useful when handing off to the next tester, because it helps testers anticipate data from the previous tester. A document with this information should be created and distributed to all testers.

CREATE SCRIPTS

The test scripts themselves should be derived from the test cases. These scenarios should be entered onto the test script. Columns to be entered in advance include requirement number, action, expected outcome, navigation, test data used, and tester information.
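As an illustration, the advance-entry columns just described, together with the results columns filled in during execution, could be modeled as a simple record. This is a sketch only; the field names are assumptions for illustration, not the actual IMSS template headings.

```python
from dataclasses import dataclass, field

@dataclass
class ScriptStep:
    requirement: str             # requirement number the step traces to
    action: str                  # what the tester does
    expected: str                # expected outcome
    navigation: str = ""         # navigation path used
    test_data: str = ""          # shared test data for the step
    tester: str = ""             # tester information
    passed: int = 0              # "1" in the Pass column
    failed: int = 0              # "1" in the Fail column
    comments: list = field(default_factory=list)  # never erased, only appended

    def record(self, ok: bool, note: str = "") -> None:
        # On a successful re-test the "1" moves from Fail to Pass,
        # but earlier failure comments are kept for the history.
        self.passed, self.failed = (1, 0) if ok else (0, 1)
        if note:
            self.comments.append(note)

step = ScriptStep("REQ-12", "Enter journal", "Journal saves without error")
step.record(False, "Error on save; ticket opened")
step.record(True, "Re-test after fix: passes")
print(step.passed, step.failed, len(step.comments))  # 1 0 2
```

Keeping the pass/fail marks and the comment history in one structure mirrors the script spreadsheet and makes it easy to total passes and fails for status reporting.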
EXECUTE SCRIPTS

Once the scenarios have been entered onto the scripts, testing may begin. The scripts are meant to be both a roadmap for the testing and a way of documenting the results. Several important areas should be documented as a result of testing: the testing date, pass/fail results, performance results, responsibility used, and pertinent comments (especially a Remedy ticket number for failures – more on that later). While testing, keep the appropriate script open so that you can note necessary data in the script as you go. Save often to avoid losing work. Keeping your script information up to date will ease status reporting. The Testing Manager will review scripts periodically to obtain a count of the passes and fails and report this at status meetings.

DOCUMENT RESULTS/CAPTURE BUGS

When you run your scenarios you will find that they have either passed or failed. This information should be captured on the test script. For all testing efforts, it is necessary to document your results by tracking all bugs and anomalies encountered during testing. Remedy will serve as the tool to report and track all testing items that have failed. Testers without access to Remedy should complete a bug report document and submit it to the Testing Manager. The Remedy Help Desk ticket should track the bug from the initial report, to assignment, to resolution, to re-test, to status reporting for management purposes, and finally to closure. Before testing, please check the bug report. This report will help avoid wasted time by showing you which bugs are known and their status.

Pass/Fail Entries

On the test script, enter the test results for each scenario as follows:

If the scenario passes:
- Identify the data tested on your script
- Enter the number "1" in the "Pass" column of the script
- Move to the next step
If the scenario fails:
- Identify the data tested on your script
- Enter the number "1" in the "Fail" column of the script
- Check the bug report to make sure someone hasn't already reported the problem
- Capture the failure with a screen shot, using either Word or Snag-it (see below for more information, and the Appendix for the file name convention and storage location)
- Make sure the Testing Manager or the IMSS development team knows that you have encountered a new problem as soon as possible
- If you do not have access to Remedy, submit a bug report using the form found at http://support.caltech.edu/testingsupport/
- The Testing Manager or designee will open a Remedy ticket from the bug reports submitted and include the information received on the report form

Re-testing scenarios that previously failed:
- After a bug has been fixed, the original tester will be asked to perform a re-test
- Reply to the email with your results. This information shall be posted to the Remedy ticket work log, and the ticket can either be resolved (if the re-test passes) or returned to the developer (if it fails)
- If the scenario is now successful (i.e., the previous problem is fixed), move the "1" from the "Fail" column to the "Pass" column
- DO NOT erase the original results in the "Comments" column. Add the new pass documentation to that column, or add any new information if the scenario still fails
- Notify the Testing Manager of the outcome immediately

Should you need to capture a screen shot, use Snag-it or MS Word. Snag-it is preferred over Word for saving screen shots due to file size; however, Word can be used if you do not already have Snag-it installed. Contact the Testing Manager should you need instructions for Snag-it.

ENTERING AND TRACKING BUGS IN REMEDY

Not all testers will enter bugs directly into Remedy. Many testers will complete a Bug Report.
For those who have access to Remedy, tickets should be entered in a standard way so that the necessary reports can be easily generated; this should reduce the need for ad hoc status requests if the cases are kept up to date. Internal testers typically enter bugs directly into Remedy. Most external testers will complete a bug report and submit it to the Testing Manager.

Process

1. Once it has been determined that a bug (error) has been found and/or that a testing step has failed, the tester should either open a Remedy ticket or create a Bug Report. The bug report form for non-Remedy users can be found at http://support.caltech.edu/. All fields should be completed. Once complete, submit the form and it will be automatically forwarded to the Testing Manager. The Testing Manager will open a Remedy Help Desk (HD) ticket, which will be assigned to the appropriate IMSS development team for further analysis. The following Remedy categorizations should be used:

   Category – Application Upgrade
   Type – Application Testing
   Item – Bug, Functional Gap, Perform Issue, Training/Doc, Unclassified
   Module – [Enter Project Name from LOV]
   Summary – Include the module acronym in the Summary field along with a short description of the problem, e.g., "GL – Unable to Enter Journal."
   Vendor tracking # – Enter the TAR number, if applicable
   Application Version – [Enter from LOV]

2. The Testing Manager, team lead, or his or her designee assigns the ticket to an individual or group. That individual is responsible for updating the ticket. If appropriate, updates can include changes to the description, summary, or categorization.

3. The tester shall create a screen shot of the error and file it. The file name and location of the screen shot, plus any other backup information, should be entered on the test script. The Bug Report shall likewise include file names and locations of screen shots, query names, and backup information. Be sure to follow the naming convention and enter the requested information previously described.

4. Remedy prioritization is defined as follows:
   Urgent: Show stopper; cannot go live with the bug.
   High: Time-critical and impacting more than one person, but not of Institute-level impact.
   Med: The default priority. Items that affect one or two people and are not immediately time-critical.
   Low: An annoyance with no major impact; a work-around has been found.

5. Only after a fix has been confirmed should the HD ticket be set to 'Resolved' status.
The ticket shall be assigned back to the Testing Manager for coordination with, and confirmation of the fix by, the end user.
6. The Testing Manager will review open tickets on a regular basis to confirm progress is being made. Close attention will be paid to tickets that have been open for a long time, have urgent or high priority, or have not been addressed or received follow-up.

7. Always log a Remedy ticket for bugs found, and document, document, document! It is highly recommended that you communicate via email when discussing a problem and its resolution. You can then cut and paste your emails into the Remedy work log. This is an excellent way to document, and a great reference should the problem occur again in the future.

LOGGING TARS (ORACLE ONLY)

Only log a TAR after a Remedy ticket has been created. A TAR is used when an Oracle bug is found that cannot be resolved by IMSS. TARs are logged in the Oracle tracking system called Metalink. Only IMSS analysts, developers, and DBAs will log TARs. All other testers should seek out a member of one of these groups if a TAR is required. After requesting a user name and password from IMSS Infrastructure, go to www.oracle.com and navigate as follows:
- Select Support in the Resources menu
- Select Metalink from the Support Services menu
- Select Metalink login (for registered users)
- Enter your user name and password
- Begin entering your TAR as instructed

When logging a TAR with Oracle, be sure to use the correct CSI number. Contact the Infrastructure Manager to obtain this number. Reminder: please be sure a Remedy ticket is created before the TAR is logged. This becomes very important after the TAR is removed from Metalink and you can no longer retrieve it. Be sure to update the "Vendor Bug Number" field in the Remedy ticket. After logging your TAR, please notify others of the problem and its status.
CODE MIGRATION AND PATCHING

Custom code migrations into the TEST instance will need to be closely coordinated to ensure proper hand-offs are made and that the test instance is not corrupted. For the most current version of the migration procedure, please contact the Development Manager.

Patching is often necessary to address bugs found during testing. Patches are normally provided by the vendor. Like code migration, patching will be closely coordinated to minimize unnecessary code changes and to prevent corruption of the test instance. The patching process will be tracked in Remedy via the change request process. The help desk ticket opened for the bug should be related to the change request ticket. A patch document must be created for each patch that is under consideration. A meeting is normally held to determine whether the patch is viable and whether it will be applied. The Testing Manager should be well informed of the application of the patch into the test instance and should coordinate instance availability and regression testing. For the most current version of the patching procedure, please contact the Development Manager.
Testing Process Summary

All testing should follow the defined process described below:

- Create Test Templates
- Create Test Scenarios
- Execute Scripts
- Document Results
- Capture Bugs
- Hand off to the next tester
- Fix Bugs
- Retest and Update Test Scripts

Testing Process Diagram: Create Test Templates → Create Test Scenarios → Execute Scripts → Document Results → on success, hand off to the next tester; on failure, Capture Bugs → Fix Bugs → Retest and Update Test Scripts.
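The execute/document/capture/retest loop above can be sketched in code. This is a hypothetical illustration only; the function and field names (execute_test_cycle, run_script, the bug dictionary shape) are assumptions for the sketch and are not part of the IMSS process.

```python
# Hypothetical sketch of the testing loop described above.
# Names are illustrative, not defined by the IMSS guide.

def execute_test_cycle(scenarios, run_script):
    """Run each scenario; document the result, and on failure
    capture a bug and queue the scenario for retest."""
    results = []       # documented pass/fail results per scenario
    bugs = []          # captured bugs awaiting a fix
    retest_queue = []  # scenarios to rerun once the bug is fixed
    for scenario in scenarios:
        passed = run_script(scenario)
        results.append((scenario, "pass" if passed else "fail"))
        if not passed:
            bugs.append({"scenario": scenario, "status": "open"})
            retest_queue.append(scenario)
    return results, bugs, retest_queue
```

On success the scenario is handed off to the next tester; on failure the captured bug feeds the fix/retest branch of the diagram.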
GENERAL INFORMATION

ENVIRONMENT

Environment means many things. The testing environment includes, but is not limited to, the system architecture, testing tools, system access, a test instance, and the location where the testing will be performed. Access to the test instance, a computer, the 'Y' drive, and a printer will be necessary.

- IMSS Security or the Testing Manager will grant access to the testing instance (application) with the appropriate responsibilities.
- If you have access problems, please contact the Testing Manager.

Familiarize yourself with the 'Y' drive structure. Note which phase or cycle of testing you are in and where to store your files; standards are described below. Understand where to find script templates and where to store actual scripts with results. Also note where to store screen-shot backups. Contact the Testing Manager if you have questions about files or the 'Y' drive structure.

http://atcdba-support.caltech.edu/tnd_apps.htm

COMMUNICATION

Communicating hand-offs and other events affecting other testers and developers is extremely important to the success and on-time delivery of the project. To maintain a continuous flow of effort, a tester or developer should notify others when an action has been completed so the next action can begin. Email, personal communication, telephone, and voice-mail can all play an integral part in communicating an event.

An email distribution list will be created for most projects. This list is intended to keep testers informed of the various testing activities, including status, updates, reminders, instructions, procedures, etc. If you need to use the distribution list, it is suggested that you first contact the Project or Testing Manager.

MEETINGS

For larger projects that involve many testers, a training/kickoff meeting should be held. Periodic status meetings are especially important for the testers and management to attend. The focus of the discussion should be on the following issues: pass/fail status, urgent and high-priority bugs/issues, schedule, and other topics as required.
TESTING ROOM / TEST BED

A testing facility will be reserved if available. A test bed has been created to test the minimum system requirements for most software applications; platforms include Macintosh, Linux, and PC with various operating systems and browsers. Contact the Testing Manager to reserve this facility.

STATUS REPORTING

The Testing Manager will provide the testing team and management with periodic status. Because this information is taken from the test scripts, it is extremely important to keep them current. Several pieces of data will be tracked:

- Total number of test scenarios
- Number of scenarios passed
- Number of scenarios failed
- Number of scenarios incomplete
- Percent of scenarios completed

Other information may be included:

- How many times was each scenario performed?
- Are we on schedule?
- If not, what is the hold-up and what is the recovery plan?

STANDARD HEADER/FOOTER

All files created, such as Word and Excel documents, should contain specific information in the header and footer. This will assist testers and others with finding files (by displaying the full path and file name), determining when they were last modified (by coding in the save date, not the print date), and identifying the author. In summary, include the following information on every document:

- Full path and file name
- Creator's name
- Save date
- Number of pages (Page X of Y)
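The tracked status metrics above can be derived directly from per-scenario results. The sketch below is a minimal illustration, assuming each scenario is recorded as "pass", "fail", or "incomplete"; the guide does not mandate any particular data format, and the function name is hypothetical.

```python
# Hedged sketch: computing the status-report metrics listed above.
# The "pass"/"fail"/"incomplete" labels are an assumed convention.

def status_report(scenario_results):
    """scenario_results maps scenario name -> 'pass' | 'fail' | 'incomplete'."""
    total = len(scenario_results)
    passed = sum(1 for s in scenario_results.values() if s == "pass")
    failed = sum(1 for s in scenario_results.values() if s == "fail")
    incomplete = sum(1 for s in scenario_results.values() if s == "incomplete")
    # A scenario counts as completed once it has a pass or fail result.
    percent_completed = 100.0 * (passed + failed) / total if total else 0.0
    return {
        "total": total,
        "passed": passed,
        "failed": failed,
        "incomplete": incomplete,
        "percent_completed": round(percent_completed, 1),
    }
```

Keeping the test scripts current means these numbers can be regenerated at any time for a status meeting, rather than estimated.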
LOCATIONS OF USEFUL RELATED DOCUMENTATION

Refer to the application testing support site for additional information and assistance. It is found at http://www.support.caltech.edu/testingsupport/index.htm.
APPENDIX

FILE NAMING STANDARDS AND STORAGE LOCATIONS (EXAMPLE)
IMSS SYSTEMS AND MODULES (ACRONYMS)

For the most current version of the systems and modules document, please see P:Documentation (CIT)Policy-StandardsDevelopment StandardsATC Development Standards.doc.

DATA WAREHOUSE
- DIS: Discoverer
- LDQ: LD Query
- RDB: Relational Database Management
- WBS: Webster

EXETER
- CMN: Common
- CMS: Career Management Services
- GEN: General
- MLS: CIT Mail Services
- SAS: Student Aid (Financial Aid) System
- SBS: Student Billing (Bursar) System
- SHS: Student Housing System
- SMS: Student Marketing (Undergraduate Admissions) System
- SSS: Student Services (Registrar) System
- SWS: Student Web System

FAMIS
- FWB: Famis Web
- INC: Inventory Control
- KC: Key Control
- MMM: Maintenance Management
- SPM: Space Management
IMSS Systems and Modules (continued)

MISCELLANEOUS
- APX: AppWorx
- BK: Bookstore Web Utility
- BTX: BudgeText
- CAT: Catermate
- CBD: CBord
- CC: Counseling Center
- CD: Catalog Database
- CEI: Campus Card
- DMS: Athenaeum
- FLS: Federal Liaison Services
- GAT: GATES (graduate office)
- IPS: Interim Pre-Award System
- KRO: Kronos
- MMS: Win Mail Management System
- NPW: Northern Prowatch
- PAS: Passport (Pcard)
- SIG: Sig Forms
- TEL: Telecomm
- UID: Universal Identifier
- WNS: Windstar

ORACLE
- AOL: Application Objects Library
- AP: Accounts Payable
- AR: Accounts Receivable
- BB: Benefit Billing
- CM: Cash Management
- CWS: Workstudy
- FA: Fixed Assets
- GL: General Ledger
- GMS: Grants Management System
- HR: Human Resources
- IC: Internal Charges
- IN: Inventory
- LD: Labor Distribution
- PAN: EPAN
- PAY: Payroll
- PO: Purchasing
- PRK: Parking Database
- WA: Web Apps (see breakout below)

ORACLE WEB APPS
- WIC: Web Internal Charges
- PIC: Personal Internal Charges
- EPAN: Web Personal Action Notification
- EVAL: Performance Evaluation
- ASI: Annual Salary Increase
- WREQ: Web Requisition
- EXPDEF: Expenditure Definitions
- PTA: PTA Query