Intro to Software Engineering - Software Testing
- 2. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 2
About this module
Targeted tests are more efficient than random tests.
Here we discuss
• Testing concepts
• Unit testing
• Integration testing
• System testing
- 3. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 3
Example
A program that reads three sides of a triangle and determines whether
the triangle is scalene, isosceles, or equilateral
• [Myers–The Art of Software Testing]
• How many test cases are necessary?
- 4.–43. [Slides 4–43 walk through test driver code for the triangle example; the text is garbled in this transcript. The recoverable fragments read roughly:
boolean testIUT ... triangle
// set inputs
triangle.setLength(...)
// check output
if (triangle.isScalene(...)) {
...
} else {
System.out.println("Test fails");
} ]
- 44. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 4
Testing
Definitions
• The process of running a program on a set of test cases [Liskov]
• Finding differences between the system and its models (specs) [BD]
• Executing an implementation with test data and examining the outputs
and operational behavior vs. requirements [Somm]
Importance
• Manifest faults
Different from manual checks
• Can be automated
• Decide completion of a project
Sign-off
- 45. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 5
Types of testing
Different purposes
• Fault (defect) testing: discover faults by comparing to spec
• Regression testing: ensure that new faults have not been introduced
• Statistical testing: measure average-case parameters on operational
profile
• Usability testing: measure ease of use
• Performance testing: determine performance
• User acceptance testing: system sign-off
• ...
- 46. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 6
Types of testing
Different levels of integration
• Unit testing
Test atomic units (classes, routines) in isolation from other units
• Integration testing
Test the interoperability of several units
• System testing
Test the system as a whole
- 47. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 7
Types of testing
Different levels of knowledge / different coverage criteria
• Blackbox testing
Focus on input/output behavior
Equivalence testing
Boundary testing
• Structural testing
Use information regarding internal structure
Path testing
Define-use testing
State-based testing
A.k.a. white-box, glass-box, clear-box (“see inside the box”)
Different conditions of application
• Alpha testing
• Beta testing
• …
- 48. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 8
Test cases
Testing is organized around test cases
• Test case = input data + expected result
Elements of a test case
• Input
• Oracle
Expected output
Inputs and outputs may alternate in a test pattern
• Identification (name)
Derived from component or set of components under test
E.g. Test_A to test component A in isolation
E.g. Test_AB to test components A and B together
• Various logistics
The actual output record, or records, or a placeholder for it
Instructions on how to apply the test (e.g. location of the test case program)
- 49. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 9
Test cases
Example: System test case for a coffee vending machine
SysTestMakeCapuccino
pressSelectionButton(“Capuccino”)
display(“Capuccino”)
insert(1.00)
display(“1.00”)
insert(0.25)
display(“1.25”)
makeCoffee(“Capuccino”)
display(“making”)
display(“Thanks”)
Input: the sequence of coin inserts is fully specified
Output: output to the hardware driver
Name: should facilitate test maintenance
- 50. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 10
Test cases
A test case may aggregate several finer-grained test cases
• E.g. a regression test contains several individual tests
• A.k.a. “test suite”
Test cases may have precedence associations
• E.g. test units before testing subsystems
• The fewer the better
- 51. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 11
Test harnesses
A test harness includes test drivers, stubs, test case files, and other tools
to support test execution
• Program or script
Implementation under test (IUT)
• System, component, or combination of components being tested
• Messages need to be sent between the IUT and its environment
(actors or other parts of the system)
• Need to create substitutes for the environment of the IUT
- 52. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 12
Test stubs and drivers
A test driver calls the IUT services
• Pass inputs to the IUT
Test data generators
• Display and/or check results
A test stub simulates parts of the system that are called by the IUT
• Provides the same API as the real components
• Returns a value compliant with the result type
• E.g. application of the bridge pattern
[Diagram: the User depends on a Database interface; a DatabaseTest stub and the real Database implementation both realize that interface — see the sketch below]
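A minimal sketch of the stub idea, with hypothetical names: the stub implements the same interface as the real database and returns a value compliant with the result type, so the User component (the IUT) can be exercised in isolation.
  interface Database {
      String lookup(String key);                  // interface shared by stub and real implementation
  }
  class DatabaseStub implements Database {
      public String lookup(String key) {
          return "stub-value";                    // canned result, compliant with the return type
      }
  }
  class User {
      private final Database db;
      User(Database db) { this.db = db; }         // the IUT receives its environment via the interface
      String describe(String key) { return "value = " + db.lookup(key); }
  }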
- 53. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 13
Test harnesses
Example harness structures
[Diagrams: (a) a TestDriver with methods testCase001(), testCase002(), testCase003(), ... exercising one ClassUnderTest; (b) a TestDriver with runTestCases() holding many TestCase objects, each run(CUT) against a CUT whose environment is replaced by TestStubs — structure (a) is sketched below]
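A minimal sketch of structure (a), with hypothetical names: one driver class whose methods each encode a test case against a trivial class under test.
  class ClassUnderTest {
      int doubleIt(int x) { return 2 * x; }       // trivial CUT, for illustration only
  }
  class TestDriver {
      private int failures = 0;
      private void check(boolean ok, String name) {
          if (!ok) { failures++; System.out.println("Test fails: " + name); }
      }
      void testCase001() { check(new ClassUnderTest().doubleIt(0) == 0, "zero"); }
      void testCase002() { check(new ClassUnderTest().doubleIt(2) == 4, "positive"); }
      void testCase003() { check(new ClassUnderTest().doubleIt(-3) == -6, "negative"); }
      public static void main(String[] args) {
          TestDriver d = new TestDriver();
          d.testCase001();
          d.testCase002();
          d.testCase003();
          System.out.println(d.failures == 0 ? "All tests pass" : d.failures + " test(s) fail");
      }
  }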
- 54. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 14
Testing activities
Testing activities
• Planning (selecting) test cases
Reviewing the test plan
• Test harness development
• Applying (running, re-running) the tests
Tradeoff between cost and completeness
• “Good” results: accuracy, coverage
Cannot cover all possible inputs (exhaustive testing)
• Time and budget limitations
Coverage metrics
• Modules / collaborations / subsystems
• Routines
• Statements / code
• Define-use combinations
- 55. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 15
Fault testing
The purpose of fault testing is to find faults (defects)
• Make the system fail (manifest failures)
• Locate the faults that caused the failure
Compare to manufacturing tests for circuits:
• Coverage of faults introduced by fabrication process
Levels of fault testing
• System testing: based on requirements
• Integration testing: based on design / module associations
• Unit testing: based on module interfaces and internals
- 56. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 16
Fault testing
Misconception: “My code is fully tested, therefore it has no bugs.”
Reality: “Program testing can be used to show the presence of bugs, but
never to show their absence.” [Edsger Dijkstra]
- 57. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 17
Fault testing
Edsger Wybe Dijkstra 1930-2002
• A pioneer of structured and disciplined programming
• “Goto considered harmful”
• Key contributions to concurrency, system design, formal methods,
software engineering, ...
- 58. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 18
Unit testing
Testing a component in isolation from others
• Focus on small units
System units vs. project units
• Reduce the scope of fault finding
• Reduce the impact of changes entailed by fixing the fault
• Allows parallelism: independent testing of each component
Test case design techniques
• Black-box
Equivalence testing
Boundary testing
• Structural
Path testing
Dataflow testing
State-based testing
Polymorphic bindings
...
- 59. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 19
Equivalence testing
Select at least one test input from each equivalence class
Example (modified from [BD]): getNumDaysInMonth(int month, int year)
• Equivalence classes for each input
3 classes of year: leap, non-leap, ≤ 0 (off range)
5 classes of month: ≤ 0; 30-day; 31-day; February; > 12
• Equivalence classes for combinations of inputs
Total 3x5 = 15 classes
Including (leap, 30-day) and (non-leap, 30-day)
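A minimal sketch of one test input per equivalence class, assuming the getNumDaysInMonth method shown on a later slide is wrapped in a hypothetical CalendarUtil class; expected values follow the usual calendar rules, and an expected value of -1 means an exception is expected.
  class EquivalenceTests {
      static void expectDays(int month, int year, int expected) {
          try {
              int actual = CalendarUtil.getNumDaysInMonth(month, year);
              System.out.println((actual == expected ? "pass " : "FAIL ") + month + "/" + year);
          } catch (Exception e) {
              System.out.println((expected < 0 ? "pass " : "FAIL ") + month + "/" + year);
          }
      }
      public static void main(String[] args) {
          expectDays(4, 1901, 30);   // non-leap year, 30-day month
          expectDays(4, 1904, 30);   // leap year, 30-day month
          expectDays(1, 1901, 31);   // 31-day month
          expectDays(2, 1901, 28);   // non-leap February
          expectDays(2, 1904, 29);   // leap February
          expectDays(0, 1901, -1);   // month off range (low)
          expectDays(13, 1901, -1);  // month off range (high)
          expectDays(6, 0, -1);      // year off range
      }
  }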
- 60. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 20
Black-box testing
Partition the input data into equivalence classes
• Class = set of combinations of inputs
• Cover all possible input
• Disjoint classes
• Invalid inputs may also be regarded as separate classes
Slightly different twist from BD
[Diagram: the set of all inputs partitioned into disjoint classes 1–6; the valid inputs form a subset covered by some of those classes]
- 61. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 21
Input classes
Valid data
• By input ranges or values
E.g. long string, short string
E.g. integers: [-32768..-1] [0] [1..32767] [off-range pos] [off-range neg]
• By term (conjunct, disjunct) of the precondition
E.g. a two-term precondition such as (a=0) or (b=2) generates three valid classes: only the first term holds, only the second holds, or both hold
• By output values
E.g. input data that will generate positive output
E.g. input data that will generate output out of representable range
E.g. input levels that would cause the alarm to sound
• By values of the object data
E.g. default password
• By user-supplied commands, mouse picks
One class for each possible command
• By physical parameters
- 62. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 22
Input classes
Invalid data
• Data that invalidates the precondition
• Data outside the represented range
• Physically impossible data
E.g. temperatures less than 0 Kelvin
Valid vs. invalid data
• If a precondition requires a range (e.g. year between 1 and 9999),
then one valid and two invalid classes are defined
• If a precondition requires a set (e.g. month name), then one valid
and one invalid classes are defined
- 63. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 23
Boundary testing
Focuses on boundaries between equivalence classes
• Composite boundaries: corners
• “Bugs lurk in corners and congregate at boundaries.” – Boris Beizer
Example: getNumDaysInMonth(int month, int year)
• Test values 0, 1, 12, 13 for the month parameter
• Test values 0, 1, 1999, 2000, 2001 for the year parameter
Boundaries can be seen as classes on their own
• Which have boundaries
And so on
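A small sketch that exercises the boundary values listed above, again assuming the hypothetical CalendarUtil wrapper from the equivalence sketch; a real test plan would also record the expected outcome for each combination.
  class BoundaryTests {
      public static void main(String[] args) {
          int[] monthBoundaries = { 0, 1, 12, 13 };
          int[] yearBoundaries  = { 0, 1, 1999, 2000, 2001 };
          // print the outcome for each pairwise combination of boundary values
          for (int month : monthBoundaries) {
              for (int year : yearBoundaries) {
                  try {
                      System.out.println(month + "/" + year + " -> "
                              + CalendarUtil.getNumDaysInMonth(month, year));
                  } catch (Exception e) {
                      System.out.println(month + "/" + year + " -> " + e);
                  }
              }
          }
      }
  }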
- 64. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 24
Equivalence and boundary partitioning
Limitation
• The number of classes grows exponentially with the number and
complexity of inputs
What to do if there are too many equivalence or boundary classes
• Limit the number of criteria for partitioning at each level of test
Mostly used in unit testing
• Cover individual classes but not all combinations of classes
Example: classes to be considered to check validity of credit card expiry
dates
Credit card expiry dates (month/year combinations; columns "02-11" and "large" use representative values 03 and 65):
              Month:  00     01     02-11  12     13     large  99
  Year ≤ 98:         00/98  01/98  03/98  12/98  13/98  65/98  99/98
  Year 99:           00/99  01/99  03/99  12/99  13/99  65/99  99/99
  Year 00:           00/00  01/00  03/00  12/00  13/00  65/00  99/00
  Year 01:           00/01  01/01  03/01  12/01  13/01  65/01  99/01
  Year ≥ 02 (e.g. 08): 00/08  01/08  03/08  12/08  13/08  65/08  99/08
  Year 50:           00/50  01/50  03/50  12/50  13/50  65/50  99/50
- 65. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 25
Path testing
Popular coverage criteria
• Statement coverage: all or almost all statements
• Branch coverage: all or almost all condition outcomes
• Path coverage: all or almost all possible execution paths
Determine a set of test cases that produce good coverage
• Main test (“common case”, “best test”, etc.)
• Targeted tests
Testing tools
• Coverage
• Test derivation
- 66. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 26
Path testing
Example: count words in a sentence
public int wordCount(String sentence) {
boolean inWord = false;
int currentCount = 0;
for (int i = 0; i < sentence.length(); i ++) {
if (sentence.charAt(i) != ' ') {
if (! inWord) {
currentCount ++;
inWord = true;
}
} else {
inWord = false;
}
}
return currentCount;
}
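A few targeted tests that, taken together, exercise both outcomes of every condition in wordCount (loop entered or not, space or non-space character, inside or outside a word). This is a sketch; the enclosing class name WordCounter is an assumption.
  class WordCountTests {
      static void check(boolean ok, String name) {
          System.out.println((ok ? "pass: " : "FAIL: ") + name);
      }
      public static void main(String[] args) {
          WordCounter wc = new WordCounter();      // hypothetical class holding wordCount
          check(wc.wordCount("") == 0, "empty string (loop body never runs)");
          check(wc.wordCount("word") == 1, "single word");
          check(wc.wordCount("two words") == 2, "space between words");
          check(wc.wordCount("  spaced  out ") == 2, "leading/trailing/multiple spaces");
      }
  }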
- 67. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 27
Cyclomatic complexity
A measure of the complexity of control flow
• Flow graph
Nodes represent statements, blocks, or condition terms
Extra nodes: start, stop
Edges represent flow of control
• Cyclomatic complexity = number of edges - number of nodes + 2
• CC gives an upper bound on the minimum number of tests necessary
for complete branch coverage
Calculate CC by counting keywords
• Count 1 for each of if, and, or, for, while, case (or equivalents)
• Add 1
• Subtract 1 for each switch statement without a default clause
• Why/how does this work?
Variation: one node for each decision instead of condition term
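As a worked example of the keyword count, applied to the wordCount method on the previous slide: one for plus two if keywords gives 3, add 1, for a cyclomatic complexity of 4; so at most 4 well-chosen test cases are needed for complete branch coverage of that method.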
- 68. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 28
Cyclomatic complexity
Example [after BD] (contains several faults on purpose):
public static int getNumDaysInMonth(int month, int year)
throws MonthOutOfBounds, YearOutOfBounds {
int numDays;
if (year < 1) {
throw new YearOutOfBounds(year);
}
if (month == 1 || month == 3 || month == 5 || month == 7 ||
month == 10 || month == 12) {
numDays = 32;
} else if (month == 4 || month == 6 || month == 9 ||
month == 11) {
numDays = 30;
} else if (month == 2) {
if (isLeapYear(year)) {
numDays = 29;
} else {
numDays = 28;
}
} else {
throw new MonthOutOfBounds(month);
}
return numDays;
}
- 69. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 29
Cyclomatic complexity
Example: flow graph (condition-based)
[Flow graph, one node per condition term: decision nodes year < 1, month = 1 3 5 7 10 12, month = 4 6 9 11, month == 2, leap(year); action nodes throw1, throw2, n=32, n=30, n=29, n=28; exit node return]
- 70. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 30
Cyclomatic complexity
Example: flow graph [BD] (statement-based)
[Flow graph, one node per decision: decision nodes [year < 1], [month in (1,3,5,7,10,12)], [month in (4,6,9,11)], [month == 2], [leap(year)]; action nodes throw1, throw2, n=32, n=30, n=29, n=28; exit node return]
- 71. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 31
Path testing
Limitations
• Shorter methods, polymorphism in object-oriented programs
Integration testing becomes important for object-oriented testing
• Faults associated with violating invariants of data structures
E.g. array out of bounds
• Corner and boundary cases
E.g. zero iterations of a loop
- 72. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 32
Data-flow testing
Data states
• Defined: initialized, but not evaluated yet
Declared: pointer allocated, but no valid value yet
• Used: value evaluated
• Killed: pointer freed
In Java, killed = out of context
A use chain contains
• A statement where a variable is defined (assigned a value)
• A statement where that variable is used with that definition active
- 73. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 33
Data-flow testing
Define-use testing strategy:
• Each use chain must be tested at least once
• Guard against faults where the wrong value is used
• Usually results in more tests than complete condition coverage
Example:
if (cond1) {
x = 5;
} else {
x = 6;
}
if (cond2) {
y = x + 1;
} else {
y = x - 1;
}
All condition outcomes: 2
All use chains: 4
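To spell the example out: x is defined in two places (x = 5, x = 6) and used in two places (y = x + 1, y = x - 1), giving 2 × 2 = 4 define-use chains. Covering them all needs the four (cond1, cond2) combinations (T,T), (T,F), (F,T), (F,F), whereas two tests such as (T,T) and (F,F) already cover every condition outcome.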
- 74. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 34
Testing objects
Challenge: a lot goes on behind the scenes
• More methods, less code in each method
• Inherited methods need to be retested
Even if the method code hasn’t changed
Use local overridden methods and fields
• Polymorphism may introduce inconsistencies
• Some methods affect the internal state (fields) of an object but do
not return a value
- 75. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 35
Testing objects
Polymorphism
• All possible bindings should be identified and tested
Method overriding
Parameter inheritance
• If that is infeasible, at least test every method call and every
parameter type
[Diagram: class hierarchies A (with subclasses B, C), D (with subclasses E, F), and a parameter hierarchy X (with subclasses Y, Z); a method foo(X) gives rise to bindings such as A–foo(X)–D, A–foo(Y)–D, ..., C–foo(Z)–F, each of which should be identified and tested]
- 76. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 36
State-based testing
Observe not just return values but also the state of the object
• Instrumentation of class attributes, to see their values
• Test each transition from each state
Use a sequence of inputs/messages to put the object in the desired state
Use a representative set of stimuli for each transition from that state
• Example:
[State diagram (watch example): states MeasureTime, SetTime, DeadBattery; labelled transitions 1, 2, 6: pressButtonL / pressButtonR; 3: pressButtonsLAndR; 4: after 2 min.; 5: pressButtonsLAndR/beep; 7, 8: after 20 years (to DeadBattery)]
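A minimal sketch of one state-based test case, assuming a hypothetical Watch class whose state attribute is instrumented via getState(): a message sequence puts the object in the desired state, one transition is fired, and the resulting state is checked.
  class WatchStateTest {
      public static void main(String[] args) {
          Watch watch = new Watch();               // hypothetical class; assumed to start in MeasureTime
          watch.pressButtonsLAndR();               // stimulus for the MeasureTime -> SetTime transition
          if (!"SetTime".equals(watch.getState())) {
              System.out.println("Test fails: expected SetTime, got " + watch.getState());
          } else {
              System.out.println("Test passes");
          }
      }
  }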
- 77. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 37
Testing common faults
Target commonly encountered faults
• Use inspection checklists
Example: GUI testing
• Windows not opening properly
• Window cannot be resized, scrolled, or moved
• Window does not properly regenerate after overwriting
Generic examples
• Too little data (or no data):
E.g. zero or one lines for a sorting routine
• Too much data
• The wrong size of data
E.g. integer input that exceeds the representable range
• Uninitialized data
• Compatibility with old data
- 78. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 38
Integration testing
Integration testing
• Focus on groups of components
• Tradeoff
Test coverage (more groups)
Effort to build drivers and stubs
Effort to locate a fault in a group
• Tradeoff impacted by the selection and ordering of groups
Why test several components together after we have tested each of
them individually?
• Impossible to test all input combinations
Test stubs and drivers are only approximations of real component behaviors
• Interface mismatches
E.g. caller’s parameter list differs from callee’s
- 79. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 39
Big-bang integration testing
Perform system testing right after unit testing
Pros:
Simple
Low cost of test set-up
Can detect a few obvious failures quickly
Cons:
Difficult to locate and fix each fault
Difficult to detect more obscure failures
⇒ a more elaborate approach is needed
[Diagram: Bot, Target, and Top are combined into the System in a single step]
- 80. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 40
Layers
Top layer
• User/actor interface objects
• Close to user, visible
• Needs thorough testing
Bottom layer
• System drivers
• Communications
• Part of the output to user
Middle (target) layer
• Application objects
• Use services of bottom layer
• Offer services to top layer
[Diagram: layered system — User interface (top), Application logic (target), System drivers (bottom)]
- 81. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 41
Bottom-up testing
Integrates components from the bottom layer first
Pros:
Fewer test stubs necessary
Interface faults tend to be more easily found
Cons:
UI is tested last, and least
More likely to contain faults immediately visible to the user
[Diagram: integration proceeds Bot → Bot + Target → System, with the Top layer added last]
- 82. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 42
Top-down testing
Integrates components from the top layer first
Pros:
No special test drivers
It favors testing of user interface components
Cons:
Test stubs are costly to create, and error prone; many are needed
Test stubs are imperfect
[Diagram: integration proceeds Top → Top + Target → System, with the Bot layer added last]
- 83. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 43
Sandwich testing
Strategy
• Top layer and bottom layer are integrated incrementally with
components of the target layer
• This can be done separately for top and bottom layers
Pros
• No need to write drivers and stubs for top and bottom layers
• Proper emphasis on testing UI components
Cons
• Does not thoroughly test the target layer before integration
- 84. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 44
Modified sandwich testing
Strategy
• Individual layer tests
Top layer with stubs for target layer components
Bottom layer with drivers replacing the target layer
Target layer with bottom layer stubs, and drivers replacing the top layer
• Top layer + target layer + stubs
• Target layer + bottom layer + drivers
• Reuse target layer test drivers and stubs
Pros
• Lots of parallelism
• Allows early testing of user interface components
• Saves time by more coverage with the same stubs
Cons
• Many stubs and drivers need to be developed
- 85. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 45
(Modified) sandwich testing
Sandwich: [Diagram: Bot → Bot + Target and Top → Top + Target, then both integrate into the System]
Modified: [Diagram: Bot, Target, and Top are each tested individually, then Bot + Target and Top + Target, then the System]
- 86. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 46
System testing
Types of system testing
• Functional testing
• Non-functional (performance) testing
• Pilot testing
• Acceptance testing
• Installation testing
- 87. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 47
Functional system testing
Interactive test cases
• Exact input and output should still be specified
Test cases selected from the use case model
• Coverage targeted to use cases
Knows nothing about design – complementary to unit and integration testing
But can guess at future design possibilities – likely to uncover faults
Focuses on mismatches between system and specification
• Does not test the appropriateness of specification
No need to prototype
- 88. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 48
Black-box techniques in system testing
Incomplete coverage of equivalence classes or boundaries
• Common and exceptional use cases:
Common usage scenario: equivalence testing
Exceptional conditions: boundary testing
• For each use case:
Principal flow (common usage)
Secondary flows (exceptions)
Example: e-mail program
• Common usage use cases: send/receive
• Exceptional condition use cases: network down
• Receive use case principal flow: download mail, select message, ...
• Receive use case secondary flow: close program and recovery
• Incomplete coverage: not all combinations of user and network actions
- 89. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 49
Structural techniques in system testing
Target elements of a use case flow
• States, transitions
• Similar to object testing
• Focus on the interface only; ignore the internals
Example [BD, pp. 360-361]
- 90. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 50
Non-functional testing
Stress testing: many simultaneous requests
• E.g. number of simultaneous accesses
Volume testing: large sizes
• Large data sizes
• Unusually complex data
• Hard disk fragmentation
Timing testing:
• Latency, throughput, averages, ...
- 91. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 51
Non-functional testing
Security testing:
• “Tiger teams”
• Formally defined privacy properties
Recovery testing: recover from errors, resource or system failures
• System perturbers (memory occupation, etc.)
• Consistency of data after hardware or network failures
- 92. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 52
User tests
Pilot testing
• Use field data, exercised by users
• Alpha tests: in the development environment
• Beta tests: a limited number of end users
• Not very thorough because they are non-systematic
Acceptance testing
• Benchmark tests: typical conditions
• Competitor and shadow tests: back-to-back
Installation testing
• Ease of reconfiguration from the development environment to the
target environment
• Repeats the test cases from function and performance testing
- 93. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 53
Usability testing
Tests the software specification (SRS, especially use case model) against
the users’ expectations
• Does the user’s mental model match the actual system image?
• Let the user explore the system or part of the system
• Collect statistics on the user’s response
Time to complete a task
Rate of user errors
Ease of learning
- 94. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 54
Levels of usability testing
Scenario tests
• Present the users with a visionary scenario
• Watch for reactions: understanding, match to the work flow,
subjective opinions
Prototype tests
• Vertical: per use case (especially the critical use cases)
• Horizontal: user interface, high technical risk, etc
• Instrument the code to profile user actions and collect operational
statistics
• Limitations: can be expensive; tempting to transform the prototype into the system
Product tests
• “Dog and pony show”
• Focus on validation
• Limitation: needs the product to be developed first
- 95. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 55
Collecting usage statistics
Example design using Observers and event passing:
[Class diagram: a Source (register(Listener), unregister(Listener), cast(Event)) holds registered Listeners (handle(Event)); cast(Event) creates a new Event e and calls l.handle(e) for each registered Listener l. A ConcreteSource carries subjectState; a ConcreteListener carries listenerState and, in handle(Event), does listenerState = crtEvent.getState() and history.collectEvent(crtEvent). Event carries copyState and offers getState(). Statistics (collectEvent(Event), deleteEvent(Event)) keeps the event history; ConcreteStatistics holds a measure with computeMeasure() and getMeasure(). A sketch follows below.]
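A minimal, compilable sketch of the design above; class and method names follow the diagram, but the method bodies, and the collapsing of Statistics/ConcreteStatistics into one class, are assumptions for illustration.
  import java.util.ArrayList;
  import java.util.List;
  class Event {
      private final String copyState;
      Event(String state) { this.copyState = state; }
      String getState() { return copyState; }
  }
  interface Listener {
      void handle(Event e);
  }
  class Source {
      private final List<Listener> listeners = new ArrayList<>();
      void register(Listener l) { listeners.add(l); }
      void unregister(Listener l) { listeners.remove(l); }
      void cast(Event e) {
          // broadcast the event to every registered listener
          for (Listener l : listeners) l.handle(e);
      }
  }
  class Statistics {
      private final List<Event> history = new ArrayList<>();
      void collectEvent(Event e) { history.add(e); }
      int getMeasure() { return history.size(); }  // e.g. count of collected user actions
  }
  class ConcreteListener implements Listener {
      private String listenerState;
      private final Statistics history = new Statistics();
      public void handle(Event crtEvent) {
          listenerState = crtEvent.getState();
          history.collectEvent(crtEvent);
      }
      Statistics getHistory() { return history; }
  }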
- 96. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 56
Usability testing
Usability testing includes not just test design, but also:
• Development of test objectives
• Selection of a representative sample of users
• Use of the system in work environment
• Debriefing: interrogation and probing of users
• Collection of quantitative data and recommendations of improvement
• Interpretation and aggregation of raw data
Usability metrics
- 97. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 57
Managing testing
Develop test cases (plan tests) as early as possible
• SRS – system tests
• DD – integration tests, black-box unit tests
• Code – structural unit tests
• E.g. V-model of software life cycles
Pro: an opportunity to think of likely faults, leading to fault prevention
Con: maintenance of test cases, drivers and stubs
Parallelize tests
• E.g. [BD p. 364]
• Test batches may need to wait for defect fixes
• Start each as early as possible
• Few dependencies between test cases
- 98. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 58
Managing testing
Test documentation
• Test plan [BD, p. 365]
• Test case specifications [BD, p. 366]
Each test case is a fully-detailed scenario
Includes inputs and expected outputs
• Reports of test incidence and failures discovered
Responsibilities
• Separate QA team (up to 1 tester/developer)
• Swap development and testing roles
- 99. McGill University ECSE 321 © 2003 Radu Negulescu
Introduction to Software Engineering Software testing—Slide 59
References
Unit, integration, and system testing:
• BD: 9.1-9.3, 9.4.2-9.4.4, 9.5
• McConnell: ch. 25
• Sommerville: ch. 20
Usability testing:
• BD: 4.5.3
• Sommerville: 15.5