SIGiST
Specialist Interest Group in
Software Testing 15 Sep 2016
Test Data, Information, Knowledge, Wisdom:
the past, present & future of
standing, running, driving & flying
Neil Thompson @neilttweet
Thompson information Systems Consulting Ltd
© Thompson information Systems Consulting Ltd
v1.2 (v1.0 was the handout, v1.1 was presented on the day; this v1.2 has erratum & appendix)
Agenda
• Part A (past & present): The basics of test data
• Part B (past & present): Data structures & Object Orientation
• Part C (the present): Agile & Context-Driven
• Part D (present & future): Cloud, Big Data, Internet of Things / “Everything” (oh, and Artificial Intelligence, still)
• Summary, takeaways etc
Part A (past & present): The basics of test data
The poor relation of the
artefact family?
Image credits: (extracted from) slideshare.net/softwarecentral (repro from ieeeexplore.ieee.org)
thenamiracleoccurs.wordpress.com
ISO 29119 on test data
• Definition: data created or selected to satisfy the input
requirements for executing one or more test cases,
which may be defined in the Test Plan, test case or test
procedure
• Note: could be stored within the product under test
(e.g. in arrays, flat files, or a database), or could be
available from or supplied by external sources, such as
other systems, other system components, hardware
devices, or human operators
• Status of each test data requirement may be
documented in a Test Data Readiness Report
• Hmm... so, what about data during and after a test?
ISO 29119 on test data (continued)
• “Actual results”:
– Definition – set of behaviours or conditions of a test item, or set
of conditions of associated data or the test environment,
observed as a result of test execution
– Example: Outputs to screen, outputs to hardware, changes to
data, reports and communication messages sent
• Overall processes (test data information builds through three of these in particular...)
[Diagram of the test processes] Source: ISO/IEC/IEEE 29119-2
ISO 29119 on test data (continued)
• Test Planning (process, in Test Management)
identifies strategy, test environment, test tool &
test data needs:
– Design Test Strategy (activity, which contributes to
Test Plan) includes:
• “identifying” test data
• Example: factors to consider include regulations on data
confidentiality (it could require data masking or encryption),
volume of data required and data clean-up upon completion
• test data requirements could identify the origin of test data and state where specific test data is located, whether it has to be disguised for confidentiality reasons, and/or the role responsible for the test data
• test input data and test output data may be
identified as deliverables
ISO 29119 on test data (continued)
• Within Test Design & Implementation (process):
– Derive Test Cases (activity):
• preconditions include existing data (e.g. databases)
• inputs are the data / information used to drive test execution
– may be specified by value or name, eg constant tables,
transaction files, databases, files, terminal messages,
memory resident areas, and values passed by the operating
system
– Derive Test Procedures (activity) includes:
• identifying any test data not already included in the Test
Plan
• note: although test data might not be finalized until test procedures are complete, it could often start far earlier, even as early as when test conditions are agreed
ISO 29119 on test data (continued)
• Within Test Design & Implementation process (continued):
– Test Data Requirements describe the properties of the test data
needed to execute the test procedures:
• eg simulated / anonymised production data, such as customer data and
user account data
• may be divided into elements reflecting the data structure of the test item,
eg defined in a class diagram or an entity-relationship diagram
• specific name and required values or ranges of values for each test data
element
• who is responsible, resetting needs, period needed, archiving / disposal
• Test Environment Set-Up & Maintenance (process) produces
an established, maintained & communicated test
environment:
– Establish Test Environment (activity) includes:
• Set up test data to support the testing (where appropriate)
– Test Data Readiness Report documents:
• status wrt Test Data Requirements, eg if & how the actual
test data deviates from the requirements, e.g. in terms of
values or volume
My thoughts: Standing & Running
data; CRUD build-up
• Consider a new system under test: the System Creates new data (software checks validity wrt reference data), building up Standing data and Running data
• Moving on to test an “in-use” system: Running test data is selected from / used in full, entered via tools / interfaces; the system performs Create, Read, Update, Delete (C,R,U,D) operations and produces Outputs
• Now, what about test coverage?...
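The CRUD build-up above can be sketched in code. A minimal, hypothetical in-memory “system under test” — all names and the country-code validity rule are invented for illustration:

```python
# Illustrative sketch only: standing (reference) data seeded first,
# then running (transactional) data built up via CRUD operations.
class SystemUnderTest:
    def __init__(self, standing_data):
        # Standing data: loaded once, used to validate new records
        self.standing = standing_data      # e.g. {"GB": "United Kingdom"}
        self.running = {}                  # running (transactional) data

    def create(self, key, record):
        # Create: software checks validity wrt reference data
        if record["country"] not in self.standing:
            raise ValueError("unknown country code")
        self.running[key] = record

    def read(self, key):
        return self.running[key]

    def update(self, key, **changes):
        self.running[key].update(changes)

    def delete(self, key):
        del self.running[key]

sut = SystemUnderTest({"GB": "United Kingdom", "FR": "France"})
sut.create(1, {"name": "Ada", "country": "GB"})   # C
assert sut.read(1)["name"] == "Ada"               # R
sut.update(1, country="FR")                       # U
sut.delete(1)                                     # D
```

A test run then covers each CRUD operation at least once before asking about deeper coverage.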
A little industrial archaeology: 1993!
• “Organisation before Automation”
... and 1999...
• “Zen and the Art of Object-Oriented Risk Management”
• Added the concept of input & output data spreads
... oh, and 2007!
• “Holistic Test Analysis & Design”
(with Mike Smith)
[Diagram: holistic test analysis & design grid across test levels (ACCEPTANCE, SYSTEMS, SYSTEM, INTEGRATION, COMPONENT). Columns: Service to stakeholders → TEST ITEMS (streams, threads, pairs/clusters of modules, modules) → TEST FEATURES, BEHAVIOURAL / STRUCTURAL (Functional: Online/Batch, Validation, Navigation…; Non-Functional: Performance, Security…; Service Levels) → TEST BASIS REFERENCES (Func Req’ts, Non-Func Req’ts, Func Spec, Tech Design, I’face Spec, Module Specs, Programming Standards, Workshops) → PRODUCT RISKS ADDRESSED → TEST CONDITIONS]
... (2007 part 2 of 2)
• “Holistic Test Analysis & Design”
(with Mike Smith)
[Diagram continues across ACCEPTANCE, SYSTEM(S), INTEGRATION and COMPONENT levels. Columns: TEST CONDITIONS → TEST CASE DESIGN TECHNIQUES (eg Use Cases: main success scenario, extensions 2a, 4a, 4b, 6a; State Transitions: all transitions, Chow 0-switch; Boundary Value Analysis: 1.0 over, 0.1 over, on, 0.1 under, 1.0 under) → TEST SUITE / TEST / TEST CASE OBJECTIVES → TEST CASE / SCRIPT / PROCEDURE IDENTIFIERS (eg AT-8.5.1…, ST-9.7.1…, CT-2.4.1…) → MANUAL/AUTOMATED, VERIFICATION/VALID’N, RESULT CHECK METHOD (eg manual: screen images viewed, test log hand-written, database spot-checks, examine interface log prints; auto regression: in-house test harness, threads or per component with approved tool; manual + auto varies under team / individual control) → TEST DATA CONSTRAINTS / INDICATIONS (eg update with care, documentation out of date; copy of live data, timing important, users all have access; ad-hoc data, unpredictable content, check early with system contacts; contains sanitised live data extracts; tailored to each component; arrange data separation between teams) → TEST DATA IN SCRIPTS?]
• If you use an informal technique, state so here.
• You may even invent new techniques!
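The Boundary Value Analysis values in the grid (1.0 over, 0.1 over, on, 0.1 under, 1.0 under) can be generated mechanically. A sketch — the step sizes are illustrative granularities, not prescribed by any standard:

```python
# Sketch: derive the five BVA test data values around a numeric boundary,
# mirroring the slide's "1.0 over / 0.1 over / on / 0.1 under / 1.0 under".
def bva_values(boundary, step=0.1):
    # round() only tidies floating-point noise in this illustration
    return [
        round(boundary + 1.0, 10),   # 1.0 over
        round(boundary + step, 10),  # 0.1 over
        boundary,                    # on the boundary
        round(boundary - step, 10),  # 0.1 under
        round(boundary - 1.0, 10),   # 1.0 under
    ]

assert bva_values(100) == [101.0, 100.1, 100, 99.9, 99.0]
```

Each generated value then becomes one test condition (CT-x.y.z style) in the grid.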
Now, a data-oriented view of test
coverage: but relation to techniques?
[Diagram: the CRUD picture again – Standing data and Running test data, selected from / used in full via tools / interfaces, with Create, Read, Update, Delete (C,R,U,D) operations and Outputs; Input / Processing / Output transactions carry input / stored / output data spreads. Annotations: BLACK-BOX techniques? on the inputs, GLASS-BOX techniques? (etc) on the processing]
• However: glass-box techniques still need data to drive them!
More about test data sources
[Diagram: the System fed Running test data by direct input by tester, or via tools / interfaces – eg messages, transactions, records, files/tables, whole databases]
POTENTIAL SOURCES OF TEST DATA – CHARACTERISTICS & handling
Adapted from: Craig & Jaskiel 2002 (Table 6-2 and associated text) “OTHER BOOKS ARE AVAILABLE”!

Characteristic             Manually created   Captured (by tool)   Tool/utility generated   Random         Production
Volume                     Controllable       Too much             Too little               Controllable   Controllable
Variety                    Good               Varies               Varies                   Mediocre       Mediocre
Acquisition                Difficult          Easy                 Easy                     Varies         Fairly easy
Change                     Easy               Fairly difficult     Difficult                Difficult      Very difficult
Validation (calibration)   Varies             Varies               Easy                     Easy           Usually easy
Test data and the V/W-model
[V/W-model diagram – levels of testing matched to levels of test data, review, specification, stakeholders and integration:
• Unit Testing – Contrived data; spec: Detailed designs; stakeholders: Developers, unit testers
• Integration Testing – Contrived data; spec: Technical spec, Hi-level design; stakeholders: Designers, integration testers
• System Testing – Contrived then/and live-like data (Func & NF); spec: Functional & NF specifications; stakeholders: Architects, “independent” testers
• Acceptance Testing – Maybe live, else live-like data; spec: Requirements; stakeholders: Business, Users, Business Analysts, Acceptance Testers; + Business processes as a further level of integration
• Pilot / progressive rollout – Live data]
Remember: not only for waterfall or V-model SDLCs; rather, iterative / incremental go down & up through layers of stakeholders, specifications & system integrations
The V-model and techniques
[V-model diagram annotated with technique families:
Techniques contrived for coverage –
• GLASS-BOX → STRUCTURE-BASED (source: BS 7925-2)
• BLACK-BOX → BEHAVIOUR-BASED (source: ISO/IEC/IEEE 29119-4): Cause-Effect Graphing, Combinatorial (All, Pairs, Choices), Classification Tree, Decision Table, Boundary Value Analysis, Equivalence Partitioning, Random, Scenarios, State Transitions
• EXPERIENCE-BASED (source: ISO/IEC/IEEE 29119-4): Error Guessing
• Domain testing (source: BBST Test Design): input & output; primary & secondary; filters & consequences; multiple variables
Techniques for being live-like? – MAYBE EXPLORATORY, RISK-ORIENTED:
• Tester-based, eg α, β, Paired
• Coverage-based, eg Functions, Tours; [Para-func] risks, eg Stress, Usability
• Activity-based, eg Use Cases, All-pairs
• Evaluation-based, eg Math oracle
• Desired-result, eg Build verification]
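One of the black-box techniques listed above, Equivalence Partitioning, amounts to a table of partitions with one representative test data value each. A sketch — the age field, its ranges and the system behaviour are invented for illustration:

```python
# Sketch: equivalence partitions -> one representative test data value each.
partitions = {
    "age below minimum (invalid)": -1,
    "age 0-17 minor (valid)":      10,
    "age 18-65 adult (valid)":     40,
    "age over 65 senior (valid)":  70,
    "age non-numeric (invalid)":   "abc",
}

def classify(age):
    # hypothetical behaviour of the system under test
    if not isinstance(age, int) or age < 0:
        return "reject"
    if age < 18:
        return "minor"
    if age <= 65:
        return "adult"
    return "senior"

expected = ["reject", "minor", "adult", "senior", "reject"]
actual = [classify(v) for v in partitions.values()]
assert actual == expected
```

The partition names double as test condition descriptions; combining this with BVA picks the representatives at the partition edges.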
Test data for Unit Testing
[Diagram: Contrived... data; the unit surrounded by Drivers & Stubs; standing data plus running test data across input / processing / output transactions (input / stored / output data spreads), producing outputs]
• DATA to drive all techniques in use
• Measure GLASS-BOX coverage (manually? / by instrumentation)
• Functional, eg validity checks: intra-field & inter-field
• Any Non-Func wanted & feasible, eg: local performance, usability
Test data for Integration Testing
[Diagram: Contrived... data; Drivers & Stubs still used, other units added when ready; data flows across input interfaces, processing transactions and output interfaces (input / stored / output data spreads), with standing data and running test data; some interfaces UNIDIRECTIONAL, some BIDIRECTIONAL]
• Functional, eg boundary conditions: null transfer, single-record, duplicates
• Any Non-Func wanted & feasible, eg: local performance, security
Test data for System Testing
[Diagram: Contrived then Live-like data; some FUNCTIONAL, some NON / PARA-FUNCTIONAL; input / processing / output transactions with input / stored / output data spreads, standing data, running test data and outputs]
• Live-like, eg: performance (eg response times), peak business volumes, usability, user access security
• All (believed) contrivable, eg: stress, volumes over-peak, contention, anti-penetration security
• Surprise?!
Test data for Acceptance Testing etc
[Diagram: Maybe live, else live-like data, then full live running; some FUNCTIONAL, some NON / PARA-FUNCTIONAL; input / processing / output transactions with input / stored / output data spreads, standing data, running test data and outputs]
• NF acceptance criteria, eg: performance, volume, security
• Live-like: any surprises here won’t cause failures?
• Pilot / progressive rollout, Live: follow up any issues – but any surprises here may cause failures!
Test data often needs planning &
acquiring earlier than you may think!
• Data to check validity? (esp. inter-field)
• Who writes stubs & drivers, and when?
• How get enough data to test perf early?
• How to select location, scope etc of pilot
• Planning rollout sequence
• Planning containment & fix of any failures
• Deciding whether live and/or live-like
• For a new system, how much live exists yet?
• Rules, permissions & tools for obfuscation
• Top-down / bottom-up affects data coord?
• Still stubs & drivers, but also harnesses,
probes, analysers?
• Any live-like avail yet? How know like live?
• Tools to replicate data to volumes, while
keeping referential integrity
[V-model diagram as before: Unit & Integration Testing – Contrived; System Testing – Contrived then/and live-like (Func & NF); Acceptance Testing – Maybe live, else live-like; Pilot / progressive rollout – Live]
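The obfuscation and referential-integrity questions above can be combined: mask personal data deterministically, so foreign keys still join up afterwards. A sketch — tables, columns and the salted-hash scheme are all invented for illustration (a real project would also weigh re-identification risk and masking-tool policy):

```python
# Sketch: deterministic pseudonymisation that preserves referential integrity.
import hashlib

def mask(value, salt="demo-salt"):
    # same input -> same token, so masked foreign keys still match
    return hashlib.sha256((salt + value).encode()).hexdigest()[:8]

customers = [{"cust_id": "C001", "name": "Alice Smith"}]
orders    = [{"order_id": 1, "cust_id": "C001", "amount": 42.0}]

masked_customers = [
    {**c, "cust_id": mask(c["cust_id"]), "name": mask(c["name"])}
    for c in customers
]
masked_orders = [{**o, "cust_id": mask(o["cust_id"])} for o in orders]

# referential integrity preserved: the masked foreign key still joins
assert masked_orders[0]["cust_id"] == masked_customers[0]["cust_id"]
```

The same deterministic trick is what makes subsetting a large live database tractable: mask and extract related rows together, and the joins survive.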
“Experience report”: general advice,
and pitfalls / concerns
• Start tests with empty data stores, then progressively add data
• Standing data can/should be complete, but running (and transactional) data
need only be a representative subset – until volume etc & acceptance testing
• If different teams can own data, pre-agree codings & value ranges
• If able to automate tests (usual caveats applying), use a data-driven
framework (more later about targeted test data tools)
• Keep backups, and/or use database checkpointing facilities, to be able to
refresh back to known data states (but remember this is not live-like!)
• Regression testing should occur at all levels, and needs stable data baselines
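The data-driven framework suggested above separates the test logic (written once) from a table of inputs and expected outputs. A minimal sketch — the discount rule under test is invented for illustration:

```python
# Sketch: minimal data-driven testing - one loop, one table.
def discount(order_total):
    # hypothetical function under test
    if order_total >= 100:
        return 0.10
    if order_total >= 50:
        return 0.05
    return 0.0

test_table = [
    # (input, expected) - reviewable by non-programmers too
    (100.0, 0.10),   # on the boundary
    (99.99, 0.05),
    (50.0,  0.05),
    (49.99, 0.0),
    (0.0,   0.0),
]

for value, expected in test_table:
    actual = discount(value)
    assert actual == expected, f"discount({value}) == {actual}, expected {expected}"
```

New cases become new table rows, not new code — which is exactly what makes the data easier to understand, review & control.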
• Pitfalls:
– not having planned test data carefully / early enough
– insufficiently rich, traceable, and/or embarrassingly silly test data values
– difficulties with referential integrity
– despite huge efforts, not getting permission to use live data (even obfuscated)
– if actual live data is too large, subsetting is not trivial (eg integrity)
– test data leaking into live!!
• Concerns:
– (from personal experience, also anecdotally) real projects we meet don’t have the time/expertise to craft test data with techniques – at least, not very explicitly
Summary of “old school” approach
to test data
[Summary diagram over the V-model and CRUD pictures:]
1. POOR RELATION OF ARTEFACT FAMILY
2. (IN THEORY) DRIVEN BY TECHNIQUES: MAINLY BLACK-BOX
3. DIFFERENT STYLES/EMPHASES AT DIFFERENT LEVELS (Unit & Integration – Contrived; System – Contrived then/and live-like, Func & NF; Acceptance – Maybe live, else live-like; Pilot / progressive rollout – Live)
4. MAY BE THOUGHT OF AS BUILDING CRUD USAGE... ...ACROSS NOT ONLY INPUTS, BUT ALSO PROCESSING & OUTPUTS (EG VIA DOMAIN TESTING)
5. SOME TOOL USE, EG FOR FUNC TEST AUTOMATION (eg TestFrame), REPLICATING DATA FOR PERF/VOLUME TESTS
Part B (past & present): Data structures & Object Orientation
Test data in object-oriented methods
• The famous 1191-page book:
– no “test data” in index! (nor data, nor persistence –
but maybe because data is/are “encapsulated”)
– emphasis on automated testing, though “manual
testing, of course, still plays a role”
– structure of book is Models, Patterns & Tools:
• applying combinational test models (decision/truth
tables, Karnaugh-Veitch matrices, cause-effect graphs) to
Unified Modelling Language (UML) diagrams (eg state
transitions)
• Patterns – “results-oriented” (*not* glass/black box) test
design at method, class, component, subsystem,
integration & system scopes
• Tools – assertions, oracles & test harnesses
• (more industrial archaeology!) A UML V-model
[Diagram examples: agilemodeling.com. BEHAVIOURAL diagrams – Use case, Activity, Sequence, State, Collaboration (now Communication); STRUCTURAL diagrams – Class, Object, Component, Deployment; mapped to test scopes: Method, Class, Component, Subsystem, Integration, System]
• Since then, UML (now v2) has expanded to 14 diagram types, but anyway how many of these 9 do you typically see?
If functional / data / OO diagrams not
provided: you could build your own
• Note: parts of UML extend/develop
concepts from older-style models, eg
SSADM (Structured Systems Analysis
and Design Method):
– Use Cases built on Business Activity
Model, BAM
– Class diagrams built on Logical Data
Model, LDM (entity relationships)
– Activity diagrams on Data Flow Model,
DFM
– Interaction diagrams on Entity Life
History, ELH (entity event modelling)
• And (according to Beizer and many
since) testers may build their own
models – even potentially invent new
ones
Diagram examples: visionmatic.co.uk, umsl.edu, paulherber.co.uk, jacksonworkbench.co.uk
And, if you have much time / a desire
for selecting carefully...
Version of Zachman framework from icmgworld.com
See also modified expansion in David C. Hay, Data Model Patterns – a Metadata Map
Part C (the present): Agile & Context-Driven
Analogy with scientific experiment:
hypothesis “all swans are white”
• So far, we have been setting
out conditions (→ cases) we
want to test, then
contriving/procuring test data
to trigger those conditions
[Diagram: top-down flow – Test Strategy → Test Plan → Test Conditions → Test Cases → Test Procedures/Scripts → Test Data]
• This is like scientific
hypothesis, then experiment
to confirm/falsify:
– test whiteness of swans
(hmm: cygnets grey, adult
birds may be dirty)
– but only by going to Australia
could early observers have
found real, wild black swans
• See also “Grounded Theory”
From top-down to bottom-up: what if
data triggers unplanned conditions?
• “Old-school” methods don’t seem to consider
this? Neither do OO & UML in themselves?
• But agile can, and Context-Driven does...
[Diagram: the same top-down flow – Test Strategy → Test Plan → Test Conditions → Test Cases → Test Procedures/Scripts → Test Data – contrasted with a bottom-up flow where Test Data drives Testing]
Agile on test data
• The Agile/agile methods/”ecosystems” ?
– Prototyping, Spiral, Evo, RAD, DSDM
– USDP, RUP, EUP, AUP, EPF-OpenUP, Crystal, Scrum, XP,
Lean, ASD, AM, ISD, Kanban, Scrumban
• The “Drivens”:
– A(T)DD, BDD, CDD, DDD, EDD, FDD, GDD, HDD, IDD,
JSD, KDS, LFD, MDD, NDD, ODD, PDD, QDD, RDD, SDD,
TDD, UDD, VDD, WDD, XDD, YDD, ZDD*
• Scalable agile frameworks (SAFe, DAD, LeSS etc) ?
• Lisa Crispin & Janet Gregory:
– Agile Testing; More Agile Testing
* I fabricated only four of these – can you guess which?
Driven by test data?
[Diagram – ATDD & SBE templates:
• eg ACCEPTANCE TESTS: As a <Role> I want <Feature> so that <Benefit> (source: Gojko Adzic via neuri.co.uk)
• eg UNIT TESTS: Setup / Execution / Validation / Cleanup (source: Wikipedia [!])
• ACCEPTANCE CRITERIA: Given <Initial context> when <Event occurs> then <ensure some Outcomes> (source: Aaron Kromer via github.com)
• SPECIFICATION, “a good acceptance test”: When <Executable example 1> and <Executable example 2> then <Expected behaviour> (sources: Gojko Adzic via gojko.net; The RSpec Book, David Chelimsky, Dan North etc)]
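The Setup / Execution / Validation / Cleanup shape quoted above maps directly onto xUnit-style tests, and Given/When/Then reads the same way. A sketch using Python’s unittest — the Account object is invented for illustration:

```python
# Sketch: the four-phase unit test pattern, annotated with both vocabularies.
import unittest

class Account:
    def __init__(self, balance=0):
        self.balance = balance
    def deposit(self, amount):
        self.balance += amount

class DepositTest(unittest.TestCase):
    def setUp(self):
        # Setup / Given an initial context
        self.account = Account(balance=100)

    def test_deposit_increases_balance(self):
        # Execution / When the event occurs
        self.account.deposit(25)
        # Validation / Then ensure the outcome
        self.assertEqual(self.account.balance, 125)

    def tearDown(self):
        # Cleanup - return test data to a known state
        self.account = None

unittest.main(argv=["ignored"], exit=False)
```

In either vocabulary, the “Given”/Setup phase is where the test data of this talk lives.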
Crispin & Gregory on test data
• These books were written partly because so
many agile methods say (often deliberately)
so little about testing
• Within “Strategies for writing tests” – test
genesis/design patterns include:
– Build-Operate-Check, using multiple input data
values
– Data-Driven testing
• Much on TDD, BDD & ATDD (mentioning
Domain Specific Languages, Specification By
Example etc)
• Guest article by Jeff Morgan on test data
management
Beyond agile to DevOps
• DevOps extends (“Shifts Right”) agile concepts into operations, ie live production
• Multiple aspects, but especially needs more specialised tools, eg:
[tool-landscape images: continuousautomation.com, techbeacon.com, techarcis.com]
• This includes test data generation & management tools, eg:
[vendor logo] (formerly Grid Tools)
Context-Driven on test data
• The books:
– actually there *are* more than one!
• BBST courses (BBST is a registered trademark of Kaner, Fiedler & Associates, LLC):
– Foundations
– Bug Advocacy
– Test Design
• Rapid Software Testing course
(James Bach &
Michael Bolton)
Context-Driven on test data: books
• Jerry Weinberg:
– rain gauge story
– composition & decomposition fallacies
• Kaner, Falk & Nguyen:
– [static] testing of data structures & access
• Kaner, Bach (James) & Pettichord:
– 103 & 129 Use automated techniques to
extend reach of test inputs, eg:
• models, combinations, random, volume
– 127 & 130 Data-driven automation
separating generation & execution:
• tabulate inputs & expected outputs
• easier to understand, review & control
Context-Driven on test data: courses
• BBST:
– Foundations:
• a typical context includes creating test data sets with well-understood
attributes, to be used in several tests
– Bug Advocacy:
• may need to analyse & vary test data during the “Replicate, Isolate, Generalise,
Externalise” reporting elements
– Test Design:
• significant emphasis on Domain Testing
• Rapid Software Testing:
– use diversified, risk-based strategy, eg:
• “sample data” is one of the tours techniques
– “easy input” oracles include:
• populations of data which have distinguishable
statistical properties
• data which embeds data about itself
• where output=input but state may have changed
– if “repeating” tests, exploit variation to find
more bugs, eg:
• substitute different data
• vary state of surrounding system(s)
NB this illustration is from csestudyzone.blogspot.co.uk
Random testing & Hi-Volume Automation
• Random testing:
– counts as a named technique in many taxonomies
– may be random / pseudo-random (advantages of no bias)...
– ...wrt data values generated, selection of data from pre-populated tables,
sequence of functions triggered etc
– may be guided by heuristics (partial bias)
– “monkey testing” does not do it justice, but it is difficult to specify oracles/expected results
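Pseudo-random generation as described above keeps the no-bias advantage while staying reproducible if the seed is fixed. A sketch — field names and value ranges are invented for illustration:

```python
# Sketch: seeded pseudo-random test data - unbiased values, repeatable runs.
import random
import string

def random_customer(rng):
    return {
        "name": "".join(rng.choices(string.ascii_uppercase, k=8)),
        "age": rng.randint(0, 120),
        "balance": round(rng.uniform(-1000, 1000), 2),
    }

rng = random.Random(12345)            # fixed seed -> same data every run
batch = [random_customer(rng) for _ in range(100)]

# reproducible: regenerating from the same seed yields identical data,
# so a failure found by random data can be replayed exactly
rng2 = random.Random(12345)
assert batch == [random_customer(rng2) for _ in range(100)]
```

Logging the seed alongside results is what turns a random failure into a reportable, repeatable bug.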
• But... James Bach & Patrick J. Schroeder paper:
– empirical studies found no significant difference in the defect detection
efficiency of pairwise test sets and same-size randomly selected test sets,
however...
– several factors need consideration in such comparisons
• And... Cem Kaner on HiVAT:
– “automated generation, execution and evaluation of arbitrarily many tests. The
individual tests are often weak, but taken together, they can expose problems
that individually-crafted tests will miss” – examples:
• inputs-focussed: parametric variation, combinations, fuzzing, hostile datastream
• exploiting oracle: function equivalence, constraint checks, inverse
operations, state models, diagnostic
• exploiting existing tests/tools: long-sequence regression, hi-volume
protocol, load-enhanced functional
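Kaner’s “function equivalence” oracle above can be sketched with a trusted reference implementation: drive arbitrarily many generated inputs through both and compare. Here Python’s built-in sum() stands in as the oracle, and the loop is a stand-in implementation under test:

```python
# Sketch: HiVAT via a function-equivalence oracle - many weak tests,
# each checked cheaply and exactly against a trusted reference.
import random

def sum_under_test(values):
    # stand-in for real code under test
    total = 0
    for v in values:
        total += v
    return total

rng = random.Random(0)                       # seeded, so failures replay
for _ in range(1000):                        # arbitrarily many tests...
    data = [rng.randint(-10**6, 10**6)
            for _ in range(rng.randint(0, 50))]
    assert sum_under_test(data) == sum(data) # ...with an exact oracle
```

Each individual test is weak, but together they can expose problems individually-crafted tests would miss.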
Part D (present & future): Cloud, Big Data, IoT/E (oh, and AI, still)
Beyond driving: now let’s fly!
• Test data for cloud systems ...
(Test) data in cloud systems
• In the olden days, the data was in one known place
• But in cloud computing:
– system specifications & physical implementations are abstracted away
from users, data replicated & stored in unknown locations
– resources virtualised, pooled & shared with unknown others
• Differing deployment models: private, community, public, hybrid
• Non/para-functional tests prioritised & complicated (eg
“elasticity”, service levels & portability) even more than plain
internet systems; huge data sizes, incl. auto-generated for mgmt
• From my own experience with Twitter etc:
– no single source of truth at a time; notifications ≠ web or mobile app
view; updates prioritised & cascaded, sequence unpredictable
– testing extends into live usage (eg “test on New Zealand first”)
• Particular considerations for data when migrating an in-house
system out to cloud (IaaS / PaaS / SaaS)
• Dark web not indexable by search engines (eg Facebook!)
• So, testing & test data more difficult?
Sources: Barrie Sosinsky, Cloud Computing “Bible”
Blokland, Mengerink & Pol – Testing Cloud Services
See Erratum
slide 68
Big data
• An extension of data warehouse → business intelligence concepts,
enabled by:
– vastness of data (some cloud, some not), Moore’s Law re processing power
– new data, eg GPS locations & biometrics from mobile devices & wearables
– tools which handle diverse, unstructured data (not just neat files / tables /
fields) – importance of multimedia & metadata
– convergences: Social, Mobile, Analytics & Cloud; Volume, Variety, Velocity
• Not just data for a system to create/add value: but value from data itself
• Exact → approximate; need not be perfect for these new purposes
• Away from rules & hypotheses, eg language translation by brute inference
• This extends the “bottom-up” emphasis I have been developing
• A key aim is to identify hitherto unknown (or at least unseen) patterns,
relationships & trends – again a testing challenge, because not
predictable, what are “expected results”?
• So contrived test data may be no use – need real or nothing?
• (And beware, not all correlations are causations – but users may still be
happy to use for decision-making)
Sources: Mayer-Schönberger & Cukier – Big Data
Minelli, Chambers & Dhiraj – Big Data, Big Analytics
When Data gets “big”, does it grow up
into something else?
Sources: above – Matthew Viel as in US Army CoP, via Wikipedia; below – Karim Vaes blog; below right – Bellinger, Castro & Mills at systems-thinking.org
[Diagrams: several Data → Information → Knowledge → Wisdom pyramids and variants. One has two axes but no distinction?; others label the steps respectively: discrete / linked / organised / applied; signals, know-nothing / symbols, senses / meaning, memory / experience, reflection, understanding / values, virtues, vision; useful, organised, structured / contextual, synthesised, learning / understanding, integrated, actionable + DECISION!]
• Other perspectives (respectively above): David McCandless (informationisbeautiful.net), Malcolm Pritchard; cademy.isf.edu.hk, fluks.dvrlists.com
Another of the many views available
Source(s): Avinash Kaushik (kaushik.net/avinash/great-analyst-skills-skepticism-wisdom), quoting David Somerville, based on a two-pane version by Hugh McLeod
No, it did not have to be a pyramid: plus
here are several extra resonances
• like Verification & Validation?
• T & E also quoted (reversed) elsewhere as Explicit & Tacit*
Source: Michael Ervick, via systemswiki.org
* Harry Collins after Michael Polanyi; quotation by “Omegapowers” on Wikipedia
Data Science
• A new term arising out of Big Data & Analytics? How much more than just statistics?
[Diagram: Data Science – modified after Steven Geringer]
• Data used to be quite scientific already?
– sets, categories & attributes
– types, eg strings, integers, floating-point
– models & schemas, names & representations, determinants & identifiers, redundancy & duplication, repeating groups
– flat, hierarchical, network, relational, object databases
– normalisation, primary & foreign keys, relational algebra & calculus
– distribution, federation, loose/tight coupling, commitment protocols
– data quality rules
• But now... (this is only one of several available alternatives)
• And blobs hide hypothesising, pattern recognition, judgement, prediction skills
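Those long-standing “scientific” properties of data are mechanically checkable. As a minimal sketch (table and column names are invented for illustration), here is the kind of referential-integrity check a data quality rule might encode: every foreign key in a child table must match a primary key in its parent.

```python
def orphaned_foreign_keys(child_rows, fk_field, parent_keys):
    """Return child rows whose foreign key has no matching parent key."""
    parent_set = set(parent_keys)
    return [row for row in child_rows if row[fk_field] not in parent_set]

customers = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]
orders = [
    {"order_id": 101, "customer_id": 1},
    {"order_id": 102, "customer_id": 2},
    {"order_id": 103, "customer_id": 9},  # violates the data quality rule
]

orphans = orphaned_foreign_keys(orders, "customer_id", (c["id"] for c in customers))
print(orphans)  # [{'order_id': 103, 'customer_id': 9}]
```

The same shape of check extends naturally to uniqueness of primary keys, domain rules on types, and the other properties in the list above.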
Internet of Things
Image credits: collaborative.com, pubnub.com
• Extends “Social, Mobile, Analytics & Cloud”
• Even more data – and more diverse
• Identifying & using signals amid “noise”
• So, new architectures suggested
• Maybe nature can help
• Yet more testing difficulty!
Francis daCosta: Rethinking the IoT
eg see Paul Gerrard’s articles
Entropy & information theory
• Cloud, Big Data & IoT are all bottom-up disrupters of the old top-down methods
• Are there any bottom-up theories which might help here?
After hyperphysics.phy-astr.gsu.edu; Ito & Sagawa, nature.com
[Diagram labels: “temperature” of gas; energies of individual molecules; BOLTZMANN etc; SHANNON etc]
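Shannon’s measure itself fits in a few lines. This illustrative sketch computes the entropy (in bits) of the empirical distribution of a sequence: zero when one outcome is certain, maximal when all outcomes are equally likely.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Entropy in bits of the empirical distribution of `values`."""
    counts = Counter(values)
    n = len(values)
    # Using log2(n/c) = -log2(c/n) keeps the result a clean positive zero
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("aaaa"))      # 0.0 – certainty carries no information
print(shannon_entropy("abab"))      # 1.0 – one fair coin's worth
print(shannon_entropy("abcdefgh"))  # 3.0 – eight equally likely symbols
```

The same formula underlies both Boltzmann’s statistical view of temperature and Shannon’s view of messages: macro-level quantities summarising micro-level variety.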
Information grows where energy flows
Image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF
Sources: Daniel Dennett “Darwin’s Dangerous Idea”
“cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc)
[Diagram: ladder of EVOLUTION & “EMERGENCE” – Mathematics; Physics (Quantum Theory end); Physics (General Relativity end); Physics (String Theories & rivals); Chemistry (inorganic); Chemistry (organic); Biology; Humans; Tools; Languages; Books; Information Technology; Artificial Intelligence; Geography; Geology; Astronomy; Neil Thompson: Value Flow ScoreCards; Daniel Dennett: platforms & cranes]
Evolution & punctuated equilibria
• Here’s another bottom-up theory
“Punctuated equilibria” idea originated by Niles Eldredge & Stephen Jay Gould
Images from www.wikipedia.org
[Diagrams (axes: Sophistication v Diversity; Number of species): “Gradual” Darwinism; Punctuated equilibria – “Explosion” in species, eg Cambrian; Spread into new niche, eg Mammals; Mass extinction, eg Dinosaurs; (equilibrium) phases between]
Punctuated equilibria in information technology?
[Timeline: Computers; 1GL; 2GL; 3GL; 4GL; Object Orientation; Internet, Mobile devices; Artificial Intelligence?!]
• Are we ready to test AI??
Artificial Intelligence
• Remember this?...............
• But there’s much more!
• I keep treating it as the future, but much is already here, or imminent sooner than you may think?
• Again there is the “oracle problem”:
– what are the “expected results”?
– how can we predict emergent things?
– who will determine whether good, or bad, or...?
[Diagram: Data Science (as earlier); legaltechnology.com]
More about Emergence: progress along order-chaos edge?
[Diagram labels: Physics; Social sciences; Chemistry; Biology]
• For best innovation & progress, need neither too much order nor too much chaos
• “Adjacent Possible”
Extrapolation from various sources, esp. Stuart Kauffman, “The Origins of Order”, “Investigations”
jurgenappelo.com
So, back to test data
[Diagram labels: Computers; Artificial Intelligence; Object Orientation; Internet, Mobile]
• Ross Ashby’s Law of Requisite Variety
• Make your test data “not too uniform, not too random”
• SMAC + IoT + AI will be an ecosystem?
• Which needs Data Science to manage it??
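One way to honour “not too uniform, not too random” in practice – a sketch under assumed requirements, not a prescription: seed the set with deterministic boundary values (structure), then pad it with reproducibly pseudo-random values (variety).

```python
import random

def make_test_values(lo, hi, size, seed=42):
    """Boundary values first, then reproducible random fill for variety."""
    boundaries = [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]  # not too random
    rng = random.Random(seed)                              # reproducible
    fill = [rng.randint(lo, hi) for _ in range(size - len(boundaries))]
    return boundaries + fill                               # not too uniform

values = make_test_values(0, 100, 12)
print(values[:5])  # [0, 1, 50, 99, 100] – the deterministic boundaries
```

With a fixed seed the set is identical on every run, which matters when a failing test has to be reproduced for debugging.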
Summary
• No, maybe we do indeed need rocket science!
[Summary diagram: DATA; INFORMATION; KNOWLEDGE; WISDOM; INSIGHT; DECISIONS; Social; Mobile; Analytics; Cloud; IoT; Artificial Intelligence; Emergence]
The key points – test data is (are):
• More fundamental than you probably think, because:
– test conditions & cases don’t “really” exist until triggered by data
– whether data is standing or running can affect design of tests
• More interesting, because:
– considerations (eg contrived, live-like) vary greatly at different levels in the V-model
– data was always a science (you may have missed that) in many respects
• Changing, through:
– agile & context-driven paradigms (both of which are still evolving)
– cloud, big data, Internet of Things and Artificial Intelligence
– these changes are arguably moving from a top-down approach (via test conditions & cases) to a more bottom-up / holistic worldview
Takeaway messages
• Despite its apparent obscurity & tedium, test data actually unifies a strand through key concepts of test techniques, top-down v bottom-up approaches, SDLC methods and even (arguably) “emergence”
• Whatever your context, think ahead – much more needs to be decided than the literature makes clear, and much of it is non-trivial
• Don’t think only test data; think test information, knowledge, wisdom – and insight & decisions!
• This embraces the distinctions between:
– tacit / explicit knowledge
– verification / validation (≈ checking / testing)
• The “future” is already here, in many respects – honest inquiry and research can cut through much of the hype – don’t get left behind
Main references
• Standards:
– IEEE 829-1998
– BS 7925-2
– ISO/IEC/IEEE 29119-2:2013 & 4:2015
• Industrial archaeology from my own past:
– EuroSTAR 1993: Organisation before Automation
– EuroSTAR 1999: Zen & the Art of Object-Oriented Risk Management
– Book 2002: Risk-Based E-Business Testing (Paul Gerrard lead author)
– STARWest 2007: Holistic Test Analysis & Design (with Mike Smith)
• Testing textbooks:
– Craig & Jaskiel: Systematic Software Testing (2002)
– Binder: Testing Object-Oriented Systems (2000)
– Weinberg: Perfect Software and Other Illusions about Testing (2008)
– Kaner, Falk & Nguyen: Testing Computer Software (2nd ed, 1999)
– Kaner, Bach & Pettichord: Lessons Learned in Software Testing (2002)
– Crispin & Gregory: Agile Testing (2009) & More Agile Testing (2015)
• Testing training courses:
– BBST Foundations, Bug Advocacy & Test Design (Kaner et al.)
– Rapid Software Testing (Bach & Bolton)
Main references (continued)
• Data / keyword / table driven test automation:
– Buwalda, Janssen & Pinkster: Integrated Test Design & Automation using the TestFrame method (2002)
• Methods:
– Structured Systems Analysis & Design Method (SSADM)
– Unified Modelling Language (UML)
– The Zachman framework (eg as in Hay: Data Model Patterns – a Metadata Map)
• Data generally:
– Kent: Data & Reality (1978 & 1998)
– Howe: Data Analysis for Database Design (1983, 1989 & 2001)
• Agile methods textbooks:
– Highsmith: Agile Software Development Ecosystems (2002)
– Boehm & Turner: Balancing Agility & Discipline (2004)
– Adzic: Bridging the Communication Gap (2009)
– Gärtner: ATDD by Example (2013)
– Appelo: Management 3.0 (2010) – maybe post-agile?
Main references (continued)
• Cloud, Big Data, Internet of Things:
– Sosinsky: Cloud Computing “Bible” (2011)
– Blokland, Mengerink & Pol: Testing Cloud Services (2013)
– Mayer-Schönberger & Cukier: Big Data (2013)
– Minelli, Chambers & Dhiraj: Big Data, Big Analytics (2013)
– daCosta: Rethinking the Internet of Things (2013)
• Entropy, emergence, Artificial Intelligence etc:
– Kauffman: The Origins of Order (1993) & Investigations (2000)
– Dennett: Darwin’s Dangerous Idea (1995)
– Taleb: Fooled by Randomness (2001) & The Black Swan (2007)
– Gleick: The Information (2011)
– Morowitz: The Emergence of Everything (2002)
– Birks & Mills: Grounded Theory (2011)
– Kurzweil: The Singularity is Near (2005)
• Websites:
– many (see individual credits annotated on slides)
Thanks for listening (and looking)!
Questions?
Contact information:
Neil Thompson @neilttweet
NeilT@TiSCL.com
linkedin.com/in/tiscl
Thompson information Systems Consulting Ltd
Answers to *DD quiz
(ie the four I fabricated)
• XDD: eXtremely Driven Development (ie micromanaged)
• LFD: Laissez Faire Development
• WDD: Weakly Driven Development
• NDD: Not Driven Development
Appendix: the most convincing *DD examples which I didn’t fabricate
• A(T)DD: Acceptance (Test) Driven Development
• B: Behaviour Driven Development
• C: Context Driven Design
• D: Domain Driven Design
• E: Example Driven Development
• F: Feature Driven Development
• G: Goal Driven Process
• H: Hypothesis Driven Development
• I: Idea Driven Development
• J: Jackson System Development
• K: Knowledge Driven Software
• M: Model Driven Development
• O: Object Driven Development
• P: Process Driven Development
• Q: Quality Driven Development
• R: Result Driven Development
• S: Security Driven Development
• T: Test Driven Development
• U: Usability Driven Development
• V: Value Driven Development
• Y: YOLO (You Only Live Once) Development
• Z: Zero Defects Development
Appendix: best *DD runner-up
Quantum Driven Development:
• works only on hardware that hasn't yet been invented
• works; oh no it doesn’t; oh now it does...
• works & doesn’t work at the same time
• uncertain whether or not it works
• there’s a probability function that...
• [That’s enough QDD: Ed.]
secretgeek.net
Erratum
• Slide 44: Sorry, Facebook is no longer “dark web”. I now see that it was, sort-of, only before 2007, and I read in a 2011 book that it still was, but this seems wrong and I didn’t test it carefully enough!
– Obviously much depends on specific privacy settings
– Maybe deep content is still not externally crawlable?
Feedback-focussed process improvement (2006)Feedback-focussed process improvement (2006)
Feedback-focussed process improvement (2006)Neil Thompson
 
Thinking tools - From top motors through s'ware proc improv't to context-driv...
Thinking tools - From top motors through s'ware proc improv't to context-driv...Thinking tools - From top motors through s'ware proc improv't to context-driv...
Thinking tools - From top motors through s'ware proc improv't to context-driv...Neil Thompson
 
Holistic Test Analysis & Design (2007)
Holistic Test Analysis & Design (2007)Holistic Test Analysis & Design (2007)
Holistic Test Analysis & Design (2007)Neil Thompson
 
Value Flow ScoreCards - For better strategies, coverage & processes (2008)
Value Flow ScoreCards - For better strategies, coverage & processes (2008)Value Flow ScoreCards - For better strategies, coverage & processes (2008)
Value Flow ScoreCards - For better strategies, coverage & processes (2008)Neil Thompson
 
Value Flow Science - Fitter lifecycles from lean balanced scorecards (2011)
Value Flow Science - Fitter lifecycles from lean balanced scorecards  (2011)Value Flow Science - Fitter lifecycles from lean balanced scorecards  (2011)
Value Flow Science - Fitter lifecycles from lean balanced scorecards (2011)Neil Thompson
 
What is Risk? - lightning talk for software testers (2011)
What is Risk? - lightning talk for software testers (2011)What is Risk? - lightning talk for software testers (2011)
What is Risk? - lightning talk for software testers (2011)Neil Thompson
 
The Science of Software Testing - Experiments, Evolution & Emergence (2011)
The Science of Software Testing - Experiments, Evolution & Emergence (2011)The Science of Software Testing - Experiments, Evolution & Emergence (2011)
The Science of Software Testing - Experiments, Evolution & Emergence (2011)Neil Thompson
 
Memes & Fitness Landscapes - analogies of testing with sci evol (2011)
Memes & Fitness Landscapes - analogies of testing with sci evol (2011)Memes & Fitness Landscapes - analogies of testing with sci evol (2011)
Memes & Fitness Landscapes - analogies of testing with sci evol (2011)Neil Thompson
 
Testing as Value Flow Mgmt - organise your toolbox (2012)
Testing as Value Flow Mgmt - organise your toolbox (2012)Testing as Value Flow Mgmt - organise your toolbox (2012)
Testing as Value Flow Mgmt - organise your toolbox (2012)Neil Thompson
 

Plus de Neil Thompson (17)

Six schools, three cultures of testing: future-proof by shifting left, down, ...
Six schools, three cultures of testing: future-proof by shifting left, down, ...Six schools, three cultures of testing: future-proof by shifting left, down, ...
Six schools, three cultures of testing: future-proof by shifting left, down, ...
 
From 'Fractal How' to Emergent Empowerment (2013 article)
From 'Fractal How' to Emergent Empowerment (2013 article)From 'Fractal How' to Emergent Empowerment (2013 article)
From 'Fractal How' to Emergent Empowerment (2013 article)
 
Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...
Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...
Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...
 
Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...
Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...
Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Eme...
 
Risk-Based Testing - Designing & managing the test process (2002)
Risk-Based Testing - Designing & managing the test process (2002)Risk-Based Testing - Designing & managing the test process (2002)
Risk-Based Testing - Designing & managing the test process (2002)
 
'Best Practices' & 'Context-Driven' - Building a bridge (2003)
'Best Practices' & 'Context-Driven' - Building a bridge (2003)'Best Practices' & 'Context-Driven' - Building a bridge (2003)
'Best Practices' & 'Context-Driven' - Building a bridge (2003)
 
Risk Mitigation Trees - Review test handovers with stakeholders (2004)
Risk Mitigation Trees - Review test handovers with stakeholders (2004)Risk Mitigation Trees - Review test handovers with stakeholders (2004)
Risk Mitigation Trees - Review test handovers with stakeholders (2004)
 
ROI at the bug factory - Goldratt & throughput (2004)
ROI at the bug factory - Goldratt & throughput (2004)ROI at the bug factory - Goldratt & throughput (2004)
ROI at the bug factory - Goldratt & throughput (2004)
 
Feedback-focussed process improvement (2006)
Feedback-focussed process improvement (2006)Feedback-focussed process improvement (2006)
Feedback-focussed process improvement (2006)
 
Thinking tools - From top motors through s'ware proc improv't to context-driv...
Thinking tools - From top motors through s'ware proc improv't to context-driv...Thinking tools - From top motors through s'ware proc improv't to context-driv...
Thinking tools - From top motors through s'ware proc improv't to context-driv...
 
Holistic Test Analysis & Design (2007)
Holistic Test Analysis & Design (2007)Holistic Test Analysis & Design (2007)
Holistic Test Analysis & Design (2007)
 
Value Flow ScoreCards - For better strategies, coverage & processes (2008)
Value Flow ScoreCards - For better strategies, coverage & processes (2008)Value Flow ScoreCards - For better strategies, coverage & processes (2008)
Value Flow ScoreCards - For better strategies, coverage & processes (2008)
 
Value Flow Science - Fitter lifecycles from lean balanced scorecards (2011)
Value Flow Science - Fitter lifecycles from lean balanced scorecards  (2011)Value Flow Science - Fitter lifecycles from lean balanced scorecards  (2011)
Value Flow Science - Fitter lifecycles from lean balanced scorecards (2011)
 
What is Risk? - lightning talk for software testers (2011)
What is Risk? - lightning talk for software testers (2011)What is Risk? - lightning talk for software testers (2011)
What is Risk? - lightning talk for software testers (2011)
 
The Science of Software Testing - Experiments, Evolution & Emergence (2011)
The Science of Software Testing - Experiments, Evolution & Emergence (2011)The Science of Software Testing - Experiments, Evolution & Emergence (2011)
The Science of Software Testing - Experiments, Evolution & Emergence (2011)
 
Memes & Fitness Landscapes - analogies of testing with sci evol (2011)
Memes & Fitness Landscapes - analogies of testing with sci evol (2011)Memes & Fitness Landscapes - analogies of testing with sci evol (2011)
Memes & Fitness Landscapes - analogies of testing with sci evol (2011)
 
Testing as Value Flow Mgmt - organise your toolbox (2012)
Testing as Value Flow Mgmt - organise your toolbox (2012)Testing as Value Flow Mgmt - organise your toolbox (2012)
Testing as Value Flow Mgmt - organise your toolbox (2012)
 

Dernier

Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
Advanced Computer Architecture – An Introduction
Advanced Computer Architecture – An IntroductionAdvanced Computer Architecture – An Introduction
Advanced Computer Architecture – An IntroductionDilum Bandara
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .Alan Dix
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteDianaGray10
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Mark Simos
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxLoriGlavin3
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek SchlawackFwdays
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024Stephanie Beckett
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxLoriGlavin3
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESmohitsingh558521
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxLoriGlavin3
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningLars Bell
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity PlanDatabarracks
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.Curtis Poe
 

Dernier (20)

Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
Advanced Computer Architecture – An Introduction
Advanced Computer Architecture – An IntroductionAdvanced Computer Architecture – An Introduction
Advanced Computer Architecture – An Introduction
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test Suite
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptx
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine Tuning
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity Plan
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.
 

Test Data, Information, Knowledge, Wisdom: past, present & future of standing, running, driving & flying (2016)

  • 6. ISO 29119 on test data (continued)
    • “Actual results”:
      – Definition: set of behaviours or conditions of a test item, or set of conditions of associated data or the test environment, observed as a result of test execution
      – Example: outputs to screen, outputs to hardware, changes to data, reports and communication messages sent
    • Overall processes: test data information builds through three of these in particular...
    Source: ISO/IEC/IEEE 29119-2
  • 7. ISO 29119 on test data (continued)
    • Test Planning (process, in Test Management) identifies strategy, test environment, test tool & test data needs:
      – Design Test Strategy (activity, which contributes to the Test Plan) includes:
        • “identifying” test data
        • Example: factors to consider include regulations on data confidentiality (which could require data masking or encryption), the volume of data required, and data clean-up upon completion
        • test data requirements could identify the origin of test data and state where specific test data is located, whether it has to be disguised for confidentiality reasons, and/or the role responsible for the test data
        • test input data and test output data may be identified as deliverables
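Slide 7 notes that data confidentiality regulations could require masking or encryption. As a minimal sketch of one masking approach (a salted-hash pseudonym plus partial redaction): the field names and the scheme here are illustrative assumptions, not anything ISO 29119 prescribes.

```python
import hashlib

def mask_customer(record, secret="test-env-salt"):
    """Return a copy of a customer record with direct identifiers masked.

    Names are replaced by a stable pseudonym derived from a salted hash,
    so the same customer masks to the same value across tables (joins keep
    working); the account number keeps only its last 4 digits.
    """
    pseudonym = hashlib.sha256((secret + record["name"]).encode()).hexdigest()[:8]
    return {
        "name": f"CUST-{pseudonym}",
        "account": "****" + record["account"][-4:],
        "balance": record["balance"],  # non-identifying, kept as-is
    }

live = {"name": "Ada Lovelace", "account": "12345678", "balance": 250.0}
masked = mask_customer(live)
assert masked == mask_customer(live)      # deterministic: masks agree across runs
assert masked["account"] == "****5678"
assert "Lovelace" not in masked["name"]
```

Determinism matters here because masked foreign keys must still match across extracted tables, which is one way to satisfy the "keeping referential integrity" concern raised later in the deck.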
  • 8. ISO 29119 on test data (continued)
    • Within Test Design & Implementation (process):
      – Derive Test Cases (activity):
        • preconditions include existing data (e.g. databases)
        • inputs are the data information used to drive test execution – may be specified by value or name, e.g. constant tables, transaction files, databases, files, terminal messages, memory-resident areas, and values passed by the operating system
      – Derive Test Procedures (activity) includes:
        • identifying any test data not already included in the Test Plan
        • note: although test data might not be finalized until the test procedures are complete, work on it could often start far earlier, even as early as when test conditions are agreed
  • 9. ISO 29119 on test data (continued)
    • Within Test Design & Implementation process (continued):
      – Test Data Requirements describe the properties of the test data needed to execute the test procedures:
        • e.g. simulated / anonymised production data, such as customer data and user account data
        • may be divided into elements reflecting the data structure of the test item, e.g. defined in a class diagram or an entity-relationship diagram
        • specific name and required values or ranges of values for each test data element
        • who is responsible, resetting needs, period needed, archiving / disposal
    • Test Environment Set-Up & Maintenance (process) produces an established, maintained & communicated test environment:
      – Establish Test Environment (activity) includes:
        • set up test data to support the testing (where appropriate)
      – Test Data Readiness Report documents:
        • status wrt Test Data Requirements, e.g. if & how the actual test data deviates from the requirements, e.g. in terms of values or volume
  • 10. My thoughts: Standing & Running data; CRUD build-up
    • Consider a new system under test: it creates new data (software checks validity wrt reference / standing data), building up standing data and then running data
    • Moving on to test an “in-use” system: select from / use all running test data, input directly or via tools / interfaces; Create, Read, Update, Delete (CRUD) operations act on the data and produce outputs
    • Now, what about test coverage?...
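The standing-vs-running distinction on slide 10 can be sketched in a few lines: standing (reference) data the system validates against, and running data built up by CRUD operations. The store, table and field names are illustrative assumptions, not from the slides.

```python
# Minimal sketch: standing/reference data vs running data under CRUD.
class OrderStore:
    def __init__(self, product_codes):
        self.standing = set(product_codes)  # standing/reference data
        self.running = {}                   # running data, keyed by order id
        self.next_id = 1

    def create(self, product):              # C: checks validity wrt standing data
        if product not in self.standing:
            raise ValueError(f"unknown product {product!r}")
        oid, self.next_id = self.next_id, self.next_id + 1
        self.running[oid] = {"product": product, "qty": 1}
        return oid

    def read(self, oid):                    # R
        return self.running[oid]

    def update(self, oid, qty):             # U
        self.running[oid]["qty"] = qty

    def delete(self, oid):                  # D
        del self.running[oid]

store = OrderStore({"A100", "B200"})
oid = store.create("A100")
store.update(oid, 5)
assert store.read(oid)["qty"] == 5
store.delete(oid)
assert store.running == {}
```

Coverage questions follow directly: each CRUD path needs running test data, and the Create path additionally needs standing data chosen to exercise both valid and invalid references.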
  • 11. A little industrial archaeology: 1993!
    • “Organisation before Automation”
  • 12. ... and 1999...
    • “Zen and the Art of Object-Oriented Risk Management”
    • Added the concept of input & output data spreads
  • 13. ... oh, and 2007!
    • “Holistic Test Analysis & Design” (with Mike Smith)
    • [Diagram: test items (modules, pairs / clusters of modules, threads, streams, service to stakeholders) mapped across Component, Integration, System and Acceptance levels, with test features (behavioural / structural), test basis references (module specs, technical design, functional & interface specs, functional & non-functional requirements), product risks addressed, and test conditions]
  • 14. ... (2007 part 2 of 2)
    • “Holistic Test Analysis & Design” (with Mike Smith)
    • [Diagram: for each test level – test conditions; test case design techniques (e.g. Use Cases, State Transitions, Boundary Value Analysis); test suite / test case objectives and identifiers; manual / automated verification & validation result-check methods; test data constraints / indications (e.g. copy of live data with timing important, ad-hoc data with unpredictable content, sanitised live data extracts, data separation between teams); whether test data is in scripts]
    • If you use an informal technique, state so here.
    • You may even invent new techniques!
  • 15. Now, a data-oriented view of test coverage: but relation to techniques?
    • [Diagram: input transactions & input data spread → processing transactions & stored data spread (standing + running data, CRUD) → output transactions & output data spread; annotated “BLACK-BOX techniques?” and “GLASS-BOX techniques?” (etc)]
    • However: glass-box techniques still need data to drive them!
  • 16. More about test data sources
    • Test data may be input directly by the tester or via tools / interfaces, e.g. messages, transactions, records, files/tables, whole databases
    • [Table adapted from Craig & Jaskiel 2002 (Table 6-2 and associated text) – “OTHER BOOKS ARE AVAILABLE”! Potential sources of test data (manually created, captured by tool, tool/utility generated, random, production) compared on characteristics & handling: volume, variety, validation (calibration), change and acquisition; ratings range from “controllable” / “too little” / “too much” for volume, and from “easy” to “very difficult” or “good” to “mediocre” for the rest]
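For the "tool/utility generated" and "random" sources on slide 16, a seeded generator gives controllable volume with repeatable content, which helps with the validation (calibration) and change characteristics. The field names and value pools below are illustrative assumptions.

```python
import random

def generate_customers(n, seed=42):
    """Tool/utility-generated test data: controllable volume, seeded so that
    repeated runs produce the same data set (useful for regression tests)."""
    rng = random.Random(seed)  # fixed seed => reproducible data
    first_names = ["Ann", "Bob", "Che", "Dee"]
    regions = ["NE", "NW", "SE", "SW"]
    return [
        {
            "id": i,
            "name": rng.choice(first_names),
            "region": rng.choice(regions),
            "credit_limit": rng.randrange(0, 10_001, 500),
        }
        for i in range(1, n + 1)
    ]

batch = generate_customers(1000)
assert len(batch) == 1000                    # volume is controllable
assert batch == generate_customers(1000)     # same seed, same data
assert all(0 <= c["credit_limit"] <= 10_000 for c in batch)
```

Note the trade-off the slide's table implies: generated data is easy to acquire in volume but its variety and calibration against live behaviour are only as good as the value pools chosen.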
  • 17. Test data and the V/W-model
    • Levels of testing vs levels of test data: Unit Testing – contrived; Integration Testing – contrived; System Testing – contrived then/and live-like (Functional & NF); Acceptance Testing – maybe live, else live-like; Pilot / progressive rollout – live
    • Levels of specification: Requirements; Functional & NF specifications; Technical spec, high-level design; Detailed designs
    • Levels of stakeholders: Business, Users, Business Analysts, Acceptance Testers; Architects, “independent” testers; Designers, integration testers; Developers, unit testers
    • Plus levels of review, and levels of integration + business processes
    • Remember: not only for waterfall or V-model SDLCs; rather, iterative / incremental go down & up through layers of stakeholders, specifications & system integrations
  • 18. The V-model and techniques
    • Techniques contrived for coverage vs techniques for being live-like?
    • GLASS-BOX → STRUCTURE-BASED
    • BLACK-BOX → BEHAVIOUR-BASED (sources: BS 7925-2; ISO/IEC/IEEE 29119-4): Cause-Effect Graphing; Combinatorial (All, Pairs, Choices); Classification Tree; Decision Table; Boundary Value Analysis; Equivalence Partitioning; Random; Scenarios; State Transitions
    • EXPERIENCE-BASED: Error Guessing
    • Domain testing (source: BBST Test Design): input & output; primary & secondary; filters & consequences; multiple variables
    • MAYBE EXPLORATORY, RISK-ORIENTED: tester-based (e.g. α, β, paired); coverage-based (e.g. functions, tours, [para-func] risks e.g. stress, usability); activity-based (e.g. use cases, all-pairs); evaluation-based (e.g. math oracle); desired-result (e.g. build verification)
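Two of the black-box techniques listed on slide 18, Equivalence Partitioning and Boundary Value Analysis, are directly about choosing test data values. A minimal sketch for a single numeric field (the 18..65 age range is an invented example):

```python
def bva_values(lo, hi, step=1):
    """Boundary Value Analysis for a valid range [lo, hi]: the values just
    outside, on, and just inside each boundary (cf. the '1.0 over / 0.1 over /
    on / 0.1 under' conditions on slide 14, with step=0.1)."""
    return [lo - step, lo, lo + step, hi - step, hi, hi + step]

def partition(value, lo, hi):
    """Equivalence Partitioning: classify an input into one of three classes,
    so one representative value per class suffices for coverage."""
    if value < lo:
        return "below"
    if value > hi:
        return "above"
    return "valid"

# An age field accepting 18..65: six boundary values, three partitions.
assert bva_values(18, 65) == [17, 18, 19, 64, 65, 66]
assert [partition(v, 18, 65) for v in (17, 40, 66)] == ["below", "valid", "above"]
```

The same data then drives glass-box coverage measurement, which is the slide's point that structure-based techniques still need data to exercise the code.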
  • 19. Test data for Unit Testing
    • Contrived data, driven through drivers & stubs
    • Data to drive all techniques in use; measure GLASS-BOX coverage (manually? / by instrumentation)
    • Functional, e.g. validity checks: intra-field & inter-field
    • Any Non-Functional wanted & feasible, e.g. local performance, usability
  • 20. Test data for Integration Testing
    • Contrived data; stubs & drivers progressively replaced by other units when ready
    • Some interfaces UNIDIRECTIONAL, some BIDIRECTIONAL
    • Functional, e.g. boundary conditions: null transfer, single-record, duplicates
    • Any Non-Functional wanted & feasible, e.g. local performance, security
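The interface boundary conditions slide 20 names (null transfer, single-record, duplicates) translate naturally into contrived interface files. A sketch that builds one file (here, a list of records) per condition; the record layout is an assumption for illustration.

```python
def interface_test_files(sample_record):
    """Contrive one interface 'file' per boundary condition from slide 20."""
    return {
        "null_transfer": [],                                  # empty transfer
        "single_record": [sample_record],                     # minimal non-empty
        "duplicates": [sample_record, dict(sample_record)],   # same key twice
    }

files = interface_test_files({"key": "K1", "amount": 10})
assert files["null_transfer"] == []
assert len(files["single_record"]) == 1
assert files["duplicates"][0] == files["duplicates"][1]
```

For a bidirectional interface, the same three conditions would be contrived in each direction, doubling the set.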
  • 21. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Test data for System Testing © Thompson information Systems Consulting Ltd 21 Contrived Acceptance Testing Integration Testing Live-like Input transactions Input data spread Processing transactions Stored data spread Output transactions Output data spread Standing data Running test data Running test data Outputs Eg: • performance (eg response times) • peak business volumes • usability • user access security Some FUNCTIONAL, some NON / PARA - FUNCTIONAL Eg: • stress • volumes over-peak • contention • anti-penetration security All (believed) contrivable Live-like Surprise?! System Testing
• 22. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Test data for Acceptance Testing etc © Thompson information Systems Consulting Ltd 22 Full live running Acceptance Testing System Testing Maybe live, else live-like Input transactions Input data spread Processing transactions Stored data spread Output transactions Output data spread Standing data Running test data Running test data Outputs NF acceptance criteria, eg: • performance • volume • security Some FUNCTIONAL, some NON / PARA - FUNCTIONAL Live-like Any surprises here won’t cause failures? Pilot / progressive rollout Follow up any issues Live Live But any surprises here may cause failures!
  • 23. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Test data often needs planning & acquiring earlier than you may think! © Thompson information Systems Consulting Ltd 23 • Data to check validity? (esp. inter-field) • Who writes stubs & drivers, and when? • How get enough data to test perf early? • How to select location, scope etc of pilot • Planning rollout sequence • Planning containment & fix of any failures • Deciding whether live and/or live-like • For a new system, how much live exists yet? • Rules, permissions & tools for obfuscation • Top-down / bottom-up affects data coord? • Still stubs & drivers, but also harnesses, probes, analysers? • Any live-like avail yet? How know like live? • Tools to replicate data to volumes, while keeping referential integrity Contrived Maybe live, else live-like Contrived then/and live-like Acceptance Testing System Testing Integration Testing Unit Testing Levels of: testing... ...test data Pilot / progressive rollout NF Func Live
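One of the planning questions above, "rules, permissions & tools for obfuscation", can be illustrated with a minimal sketch of deterministic masking. The field names and salt are invented, and real test data management tools do far more; the point is that hashing a live value the same way everywhere preserves referential integrity while hiding identities:

```python
import hashlib

def mask(value, salt="test-env-salt"):
    """Deterministic pseudonym: the same live value always maps to the
    same masked value, so foreign-key relationships survive masking."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()[:8]
    return "CUST-" + digest

# The same customer key masks identically wherever it appears
orders = [{"cust": "alice"}, {"cust": "bob"}, {"cust": "alice"}]
masked = [{"cust": mask(row["cust"])} for row in orders]
assert masked[0]["cust"] == masked[2]["cust"]  # referential integrity kept
assert "alice" not in masked[0]["cust"]        # identity hidden
```

A production scheme would also need rules for format-preserving fields (dates, postcodes) and agreed permissions for who may see unmasked data.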
  • 24. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 “Experience report”: general advice, and pitfalls / concerns • Start tests with empty data stores, then progressively add data • Standing data can/should be complete, but running (and transactional) data need only be a representative subset – until volume etc & acceptance testing • If different teams can own data, pre-agree codings & value ranges • If able to automate tests (usual caveats applying), use a data-driven framework (more later about targeted test data tools) • Keep backups, and/or use database checkpointing facilities, to be able to refresh back to known data states (but remember this is not live-like!) • Regression testing should occur at all levels, and needs stable data baselines © Thompson information Systems Consulting Ltd 24 • Pitfalls: – not having planned test data carefully / early enough – insufficiently rich, traceable, and/or embarrassingly silly test data values – difficulties with referential integrity – despite huge efforts, not getting permission to use live data (even obfuscated) – if actual live data is too large, subsetting is not trivial (eg integrity) – test data leaking into live!! • Concerns: – (from personal experience, also anecdotally) real projects we meet don’t have time/expertise to craft with techniques – at least, not very explicitly
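The advice above to use a data-driven framework can be sketched as follows: the table of inputs and expected outputs lives apart from the single generic procedure that executes it. The discount rule here is an invented toy system under test:

```python
def discount(age, spend):
    """Toy system under test: 10% discount for customers aged 65+
    who spend more than 100."""
    return 0.10 if age >= 65 and spend > 100 else 0.0

# The data table is separate from the automation code, so reviewers
# can inspect and extend the values without reading any logic
test_table = [
    # (age, spend, expected) - boundary-oriented rows
    (64, 150, 0.0),
    (65, 150, 0.10),
    (65, 100, 0.0),
    (65, 101, 0.10),
]

def run_table(table):
    """Single generic procedure: returns the rows that failed."""
    return [row for row in table if discount(row[0], row[1]) != row[2]]

assert run_table(test_table) == []  # every tabulated row passes
```

In a real framework the table would typically be an external file (CSV, spreadsheet) so that test data can be versioned and refreshed independently of the scripts.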
  • 25. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Summary of “old school” approach to test data © Thompson information Systems Consulting Ltd 25 Contrived Maybe live, else live-like Contrived then/and live-like Acceptance Testing System Testing Integration Testing Unit Testing Pilot / progressive rollout NF Func Live Standing data Running test data Running test data 5. SOME TOOL USE, EG FOR FUNC TEST AUTOMATION, REPLICATING DATA FOR PERF/VOLUME TESTS 1. POOR RELATION OF ARTEFACT FAMILY 2. (IN THEORY) DRIVEN BY TECHNIQUES: MAINLY BLACK-BOX 3. DIFFERENT STYLES/EMPHASES AT DIFFERENT LEVELS 4. MAY BE THOUGHT OF AS BUILDING CRUD USAGE... ...ACROSS NOT ONLY INPUTS, BUT ALSO PROCESSING & OUTPUTS (EG VIA DOMAIN TESTING) Outputs eg TestFrame
  • 26. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 (past & present): Data structures & Object Orientation Part B © Thompson information Systems Consulting Ltd 26
  • 27. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Test data in object-oriented methods • The famous 1191-page book: – no “test data” in index! (nor data, nor persistence – but maybe because data is/are “encapsulated”) – emphasis on automated testing, though “manual testing, of course, still plays a role” – structure of book is Models, Patterns & Tools: • applying combinational test models (decision/truth tables, Karnaugh-Veitch matrices, cause-effect graphs) to Unified Modelling Language (UML) diagrams (eg state transitions) • Patterns – “results-oriented” (*not* glass/black box) test design at method, class, component, subsystem, integration & system scopes • Tools – assertions, oracles & test harnesses © Thompson information Systems Consulting Ltd 27
• 28. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 © Thompson information Systems Consulting Ltd 28 Diagram examples: agilemodeling.com [etc] Use case BEHAVIOURAL STRUCTURAL Activity Sequence State Collaboration (now Communication) Class Object Component Deployment Method Class Component Subsystem Integration System • (more industrial archaeology!) A UML V-model • Since then, UML (now v2) has expanded to 14 diagram types, but anyway how many of these 9 do you typically see?
• 29. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 If functional / data / OO diagrams not provided: you could build your own • Note: parts of UML extend/develop concepts from older-style models, eg SSADM (Structured Systems Analysis and Design Method): – Use Cases built on Business Activity Model, BAM – Class diagrams built on Logical Data Model, LDM (entity relationships) – Activity diagrams on Data Flow Model, DFM – Interaction diagrams on Entity Life History, ELH (entity event modelling) • And (according to Beizer and many since) testers may build their own models – even potentially invent new ones © Thompson information Systems Consulting Ltd 29 Diagram examples: visionmatic.co.uk, umsl.edu, paulherber.co.uk, jacksonworkbench.co.uk
  • 30. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 And, if you have much time / a desire for selecting carefully... © Thompson information Systems Consulting Ltd 30 Version of Zachman framework from icmgworld.com See also modified expansion in David C. Hay, Data Model Patterns – a Metadata Map
  • 31. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 (the present): Agile & Context-Driven Part C © Thompson information Systems Consulting Ltd 31
  • 32. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Analogy with scientific experiment: hypothesis “all swans are white” • So far, we have been setting out conditions (→ cases) we want to test, then contriving/procuring test data to trigger those conditions © Thompson information Systems Consulting Ltd 32 Test Data Test Strategy Test Plan Test Conditions Test Cases Test Procedures/Scripts • This is like scientific hypothesis, then experiment to confirm/falsify: – test whiteness of swans (hmm: cygnets grey, adult birds may be dirty) – but only by going to Australia could early observers have found real, wild black swans • See also “Grounded Theory”
  • 33. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 From top-down to bottom-up: what if data triggers unplanned conditions? • “Old-school” methods don’t seem to consider this? Neither do OO & UML in themselves? • But agile can, and Context-Driven does... © Thompson information Systems Consulting Ltd 33 Test Data Test Strategy Test Plan Test Conditions Test Cases Test Procedures/Scripts Testing Test Data
• 34. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Agile on test data • The Agile/agile methods/”ecosystems” ? – Prototyping, Spiral, Evo, RAD, DSDM – USDP, RUP, EUP, AUP, EPF-OpenUP, Crystal, Scrum, XP, Lean, ASD, AM, ISD, Kanban, Scrumban • The “Drivens”: – A(T)DD, BDD, CDD, DDD, EDD, FDD, GDD, HDD, IDD, JSD, KDS, LFD, MDD, NDD, ODD, PDD, QDD, RDD, SDD, TDD, UDD, VDD, WDD, XDD, YDD, ZDD* • Scalable agile frameworks (SAFe, DAD, LeSS etc) ? • Lisa Crispin & Janet Gregory: – Agile Testing; More Agile Testing © Thompson information Systems Consulting Ltd 34 * I fabricated only four of these – can you guess which?
  • 35. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Driven by test data? © Thompson information Systems Consulting Ltd 35 Source: Gojko Adzic via neuri.co.uk eg ACCEPTANCE TESTS: As a <Role> I want <Feature> so that <Benefit> Source: Wikipedia [!] eg UNIT TESTS: Setup Execution Validation Cleanup ACCEPTANCE CRITERIA: Given <Initial context> when <Event occurs> then <ensure some Outcomes> Source: Aaron Kromer via github.com Source: Gojko Adzic via gojko.net: “a good acceptance test” SPECIFICATION: When <Executable example 1> and <Executable example 2> then <Expected behaviour> Source: The RSpec Book, David Chelimsky, Dan North etc ATDD SBE
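The Given <Initial context> / when <Event occurs> / then <ensure some Outcomes> pattern above can be made executable even without a dedicated BDD tool; a minimal sketch, with an invented Account class and amounts:

```python
class Account:
    """Invented domain object, just to make the criterion executable."""
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

def test_withdrawal_reduces_balance():
    # Given an account with an opening balance of 100
    account = Account(balance=100)
    # When 30 is withdrawn
    account.withdraw(30)
    # Then the balance reflects the withdrawal
    assert account.balance == 70

test_withdrawal_reduces_balance()
```

Frameworks such as Cucumber or RSpec add a business-readable layer over exactly this structure, which is what makes the test data in the Given clause visible to non-programmers.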
  • 36. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Crispin & Gregory on test data © Thompson information Systems Consulting Ltd 36 • These books were written partly because so many agile methods say (often deliberately) so little about testing • Within “Strategies for writing tests” – test genesis/design patterns include: – Build-Operate-Check, using multiple input data values – Data-Driven testing • Much on TDD, BDD & ATDD (mentioning Domain Specific Languages, Specification By Example etc) • Guest article by Jeff Morgan on test data management
• 37. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Beyond agile to DevOps © Thompson information Systems Consulting Ltd 37 • DevOps extends (“Shifts Right”) agile concepts into operations, ie live production • Multiple aspects, but especially needs more specialised tools, eg: • This includes test data generation & management tools, eg: continuousautomation.com techbeacon.com techarcis.com (formerly Grid Tools)
• 38. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Context-Driven on test data • The books: – actually there *are* more than one! • BBST courses (BBST is a registered trademark of Kaner, Fiedler & Associates, LLC): – Foundations – Bug Advocacy – Test Design • Rapid Software Testing course (James Bach & Michael Bolton) © Thompson information Systems Consulting Ltd 38
  • 39. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Context-Driven on test data: books • Jerry Weinberg: – rain gauge story – composition & decomposition fallacies • Kaner, Falk & Nguyen: – [static] testing of data structures & access • Kaner, Bach (James) & Pettichord: – 103 & 129 Use automated techniques to extend reach of test inputs, eg: • models, combinations, random, volume – 127 & 130 Data-driven automation separating generation & execution: • tabulate inputs & expected outputs • easier to understand, review & control © Thompson information Systems Consulting Ltd 39
• 40. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Context-Driven on test data: courses • BBST: – Foundations: • a typical context includes creating test data sets with well-understood attributes, to be used in several tests – Bug Advocacy: • may need to analyse & vary test data during the “Replicate, Isolate, Generalise, Externalise” reporting elements – Test Design: • significant emphasis on Domain Testing • Rapid Software Testing: – use diversified, risk-based strategy, eg: • “sample data” is one of the tours techniques – “easy input” oracles include: • populations of data which have distinguishable statistical properties • data which embeds data about itself • where output=input but state may have changed – if “repeating” tests, exploit variation to find more bugs, eg: • substitute different data • vary state of surrounding system(s) © Thompson information Systems Consulting Ltd 40 NB this illustration is from csestudyzone.blogspot.co.uk
  • 41. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Random testing & Hi-Volume Automation • Random testing: – counts as a named technique in many taxonomies – may be random / pseudo-random (advantages of no bias)... – ...wrt data values generated, selection of data from pre-populated tables, sequence of functions triggered etc – may be guided by heuristics (partial bias) – “monkey testing” does not do it justice, but difficult to specify oracles/expected results • But... James Bach & Patrick J. Schroeder paper: – empirical studies found no significant difference in the defect detection efficiency of pairwise test sets and same-size randomly selected test sets, however... – several factors need consideration in such comparisons • And... Cem Kaner on HiVAT: – “automated generation, execution and evaluation of arbitrarily many tests. The individual tests are often weak, but taken together, they can expose problems that individually-crafted tests will miss” – examples: • inputs-focussed: parametric variation, combinations, fuzzing, hostile datastream • exploiting oracle: function equivalence, constraint checks, inverse operations, state models, diagnostic • exploiting existing tests/tools: long-sequence regression, hi-volume protocol, load-enhanced functional © Thompson information Systems Consulting Ltd 41
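Kaner's HiVAT examples above include an "inverse operations" oracle; a minimal sketch, assuming a toy run-length encoder as the operation under test: generate many random inputs with a fixed seed (so any failure is reproducible), then check that decoding what was encoded returns the original.

```python
import random

def encode(s):
    """Toy run-length encoder: the operation under test."""
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append((s[i], j - i))
        i = j
    return out

def decode(pairs):
    """Inverse operation, used as the oracle."""
    return "".join(ch * n for ch, n in pairs)

rng = random.Random(42)            # fixed seed: failures are reproducible
for _ in range(10_000):            # high volume, individually weak tests
    s = "".join(rng.choice("ab") for _ in range(rng.randrange(20)))
    assert decode(encode(s)) == s  # round-trip oracle
```

As the slide notes, each individual random test is weak, but ten thousand of them probe input shapes (empty strings, long runs, alternations) that a hand-crafted set would likely miss.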
  • 42. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 (present & future): Cloud, Big Data, IoT/E (oh, and AI, still) Part D © Thompson information Systems Consulting Ltd 42
  • 43. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Beyond driving: now let’s fly! © Thompson information Systems Consulting Ltd 43 • Test data for cloud systems ...
  • 44. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 (Test) data in cloud systems • In the olden days, the data was in one known place • But in cloud computing: – system specifications & physical implementations are abstracted away from users, data replicated & stored in unknown locations – resources virtualised, pooled & shared with unknown others • Differing deployment models: private, community, public, hybrid • Non/para-functional tests prioritised & complicated (eg “elasticity”, service levels & portability) even more than plain internet systems; huge data sizes, incl. auto-generated for mgmt • From my own experience with Twitter etc: – no single source of truth at a time; notifications ≠ web or mobile app view; updates prioritised & cascaded, sequence unpredictable – testing extends into live usage (eg “test on New Zealand first”) • Particular considerations for data when migrating an in-house system out to cloud (IaaS / PaaS / SaaS) • Dark web not indexable by search engines (eg Facebook!) • So, testing & test data more difficult? © Thompson information Systems Consulting Ltd 44 Sources: Barrie Sosinsky, Cloud Computing “Bible” Blokland, Mengerink & Pol – Testing Cloud Services See Erratum slide 68
  • 45. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Big data • An extension of data warehouse → business intelligence concepts, enabled by: – vastness of data (some cloud, some not), Moore’s Law re processing power – new data, eg GPS locations & biometrics from mobile devices & wearables – tools which handle diverse, unstructured data (not just neat files / tables / fields) – importance of multimedia & metadata – convergences: Social, Mobile, Analytics & Cloud; Volume, Variety, Velocity • Not just data for a system to create/add value: but value from data itself • Exact → approximate; need not be perfect for these new purposes • Away from rules & hypotheses, eg language translation by brute inference • This extends the “bottom-up” emphasis I have been developing • A key aim is to identify hitherto unknown (or at least unseen) patterns, relationships & trends – again a testing challenge, because not predictable, what are “expected results”? • So contrived test data may be no use – need real or nothing? • (And beware, not all correlations are causations – but users may still be happy to use for decision-making) © Thompson information Systems Consulting Ltd 45 Sources: Mayer-Schönberger & Cukier – Big Data Minelli, Chambers & Dhiraj – Big Data, Big Analytics
  • 46. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 When Data gets “big”, does it grow up into something else? © Thompson information Systems Consulting Ltd 46 Sources: above – Matthew Viel as in US Army CoP, via Wikipedia below – Karim Vaes blog, below right – Bellinger, Castro & Mills at systems-thinking.org • (two axes but no distinction?) • other perspectives........ David McCandless, Malcolm Pritchard informationisbeautiful.net cademy.isf.edu.hk fluks.dvrlists.com (Respectively above)................................................... applied organised discrete linked values, virtues, vision experience, reflection, understanding meaning, memory symbols, senses signals, know-nothing useful, organised, structured contextual, synthesised, learning understanding, integrated, actionable + DECISION!
• 47. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Another of the many views available © Thompson information Systems Consulting Ltd 47 Source(s): Avinash Kaushik (kaushik.net/avinash/great-analyst-skills-skepticism-wisdom) quoting a diagram by David Somerville, based on a two-pane version by Hugh McLeod
  • 48. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 No, it did not have to be a pyramid: plus here are several extra resonances © Thompson information Systems Consulting Ltd 48 • like Verification & Validation? • T & E also quoted (reversed) elsewhere as Explicit & Tacit* Source: Michael Ervick, via systemswiki.org * Harry Collins after Michael Polanyi; quotation by “Omegapowers” on Wikipedia
  • 49. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Data Science • A new term arising out of Big Data & Analytics? How much more than just statistics? © Thompson information Systems Consulting Ltd 49 Data Science Modified after Steven Geringer: • Data used to be quite scientific already? – sets, categories & attributes – types, eg strings, integers, floating-point – models & schemas, names & representations, determinants & identifiers, redundancy & duplication, repeating groups – flat, hierarchical, network, relational, object databases – normalisation, primary & foreign keys, relational algebra & calculus – distribution, federation, loose/tight coupling, commitment protocols – data quality rules • But now... (this is only one of several available alternatives) • And blobs hide hypothesising, pattern recognition, judgement, prediction skills
• 50. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Internet of Things © Thompson information Systems Consulting Ltd 50 collaborative.com pubnub.com • Extends “Social, Mobile, Analytics & Cloud” • Even more data – and more diverse • Identifying & using signals amid “noise” • So, new architectures suggested • Maybe nature can help • Yet more testing difficulty! Francis daCosta: Rethinking the IoT eg see Paul Gerrard’s articles
• 51. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Entropy & information theory • Cloud, Big Data & IoT are all bottom-up disrupters of the old top-down methods • Are there any bottom-up theories which might help here? © Thompson information Systems Consulting Ltd 51 After hyperphysics.phy-astr.gsu.edu “temperature” of gas energies of individual molecules Ito & Sagawa, nature.com BOLTZMANN etc SHANNON etc
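Shannon's measure can be computed directly for a sample of test data values; a minimal sketch (the example values are my own): a constant data set has zero entropy and a maximally varied one has the most bits, which resonates with the advice elsewhere in this deck to make test data "not too uniform, not too random".

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Entropy in bits of the empirical distribution of `values`."""
    counts = Counter(values)
    n = len(values)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy(["A"] * 8))             # 0.0  (too uniform)
print(shannon_entropy(list("ABCDEFGH")))      # 3.0  (maximally varied)
print(shannon_entropy(["A", "A", "B", "B"]))  # 1.0
```

A tester could use such a measure as a quick check that a generated data set actually has the spread its design claimed.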
• 52. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Information grows where energy flows © Thompson information Systems Consulting Ltd 52 Image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF Sources: Daniel Dennett “Darwin’s Dangerous Idea” “cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc) Mathematics EVOLUTION & “EMERGENCE” Neil Thompson: Value Flow ScoreCards Daniel Dennett: platforms & cranes Physics (Quantum Theory end) Physics (General Relativity end) Chemistry (inorganic) Chemistry (organic) Biology Humans Tools Languages Books Information Technology Artificial Intelligence Physics (String Theories & rivals) Geography Geology Astronomy
• 53. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Evolution & punctuated equilibria • Here’s another bottom-up theory © Thompson information Systems Consulting Ltd 53 “Punctuated equilibria” idea originated by Niles Eldredge & Stephen Jay Gould Images from www.wikipedia.org Sophistication Diversity “Gradual” Darwinism Sophistication Diversity Punctuated equilibria “Explosion” in species, eg Cambrian Spread into new niche, eg Mammals Mass extinction, eg Dinosaurs (equilibrium) (equilibrium) (equilibrium) Sophistication Diversity Number of species
• 54. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 ©Thompson information Systems Consulting Ltd Punctuated equilibria in information technology? 54 Computers 1GL Object Orientation Internet, Mobile devices Artificial Intelligence?! 4GL 3GL 2GL • Are we ready to test AI??
  • 55. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Artificial Intelligence • Remember this?............... • But there’s much more! • I keep treating it as the future, but much is already here, or imminent sooner than you may think? • Again there is the “oracle problem”: – what are the “expected results” – how can we predict emergent things? – who will determine whether good, or bad, or...? © Thompson information Systems Consulting Ltd 55 Data Science legaltechnology.com
  • 56. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 ©Thompson information Systems Consulting Ltd More about Emergence: progress along order-chaos edge? 56 Physics Social sciences Chemistry Biology • For best innovation & progress, need neither too much order nor too much chaos • “Adjacent Possible” Extrapolation from various sources, esp. Stuart Kauffman, “The Origins of Order”, “Investigations” jurgenappelo.com
• 57. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 ©Thompson information Systems Consulting Ltd So, back to test data 57 Computers Artificial Intelligence Object Orientation Internet, Mobile • Ross Ashby’s Law of Requisite Variety • Make your test data “not too uniform, not too random” • SMAC + IoT + AI will be an ecosystem? • Which needs Data Science to manage it??
  • 58. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 © Thompson information Systems Consulting Ltd 58 • No, maybe we do indeed need rocket science! Summary Artificial Intelligence DECISIONS DATA INFORMATION KNOWLEDGE WISDOM INSIGHT IoT Cloud Mobile Social Analytics Emergence
  • 59. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 The key points – test data is (are): • More fundamental than you probably think, because: – test conditions & cases don’t “really” exist until triggered by data – whether data is standing or running can affect design of tests • More interesting, because: – considerations (eg contrived, live-like) vary greatly at different levels in the V-model – data was always a science (you may have missed that) in many respects • Changing, through: – agile & context-driven paradigms (both of which are still evolving) – cloud, big data, Internet of Things and Artificial Intelligence – these changes are arguably moving from a top-down approach (via test conditions & cases) to a more bottom-up / holistic worldview © Thompson information Systems Consulting Ltd 59
  • 60. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Takeaway messages • Despite its apparent obscurity & tedium, test data actually unifies a strand through key concepts of test techniques, top-down v bottom-up approaches, SDLC methods and even (arguably) “emergence” • Whatever your context, think ahead – much more needs to be decided than the literature makes clear, and much of it is non-trivial • Don’t think only test data, think test information, knowledge, wisdom – and insight & decisions! • This embraces the distinctions between: – tacit /explicit knowledge – verification / validation (≈ checking / testing) • The “future” is already here, in many respects – honest inquiry and research can cut through much of the hype – don’t get left behind © Thompson information Systems Consulting Ltd 60
  • 61. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Main references • Standards: – IEEE 829-1998 – BS 7925-2 – ISO/IEC/IEEE 29119-2:2013 & 4:2015 • Industrial archaeology from my own past: – EuroSTAR 1993: Organisation before Automation – EuroSTAR 1999: Zen & the Art of Object-Oriented Risk Management – Book 2002: Risk-Based E-Business Testing (Paul Gerrard lead author) – STARWest 2007: Holistic Test Analysis & Design (with Mike Smith) • Testing textbooks: – Craig & Jaskiel: Systematic Software Testing (2002) – Binder: Testing Object-Oriented Systems (2000) – Weinberg: Perfect Software and Other Illusions about Testing (2008) – Kaner, Falk & Nguyen: Testing Computer Software (2nd ed, 1999) – Kaner, Bach & Pettichord: Lessons Learned in Software Testing (2002) – Crispin & Gregory: Agile Testing (2009) & More Agile Testing (2015) • Testing training courses: – BBST Foundations, Bug Advocacy & Test Design (Kaner et al.) – Rapid Software Testing (Bach & Bolton) © Thompson information Systems Consulting Ltd 61
  • 62. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Main references (continued) • Data / keyword / table driven test automation: – Buwalda, Janssen & Pinkster: Integrated Test Design & Automation using the TestFrame method (2002) • Methods: – Structured Systems Analysis & Design Method (SSADM) – Unified Modelling Language (UML) – The Zachman framework (eg as in Hay: Data Model Patterns – a Metadata Map) • Data generally: – Kent: Data & Reality (1978 & 1998) – Howe: Data Analysis for Database Design (1983, 1989 & 2001) • Agile methods textbooks: – Highsmith: Agile Software Development Ecosystems (2002) – Boehm & Turner: Balancing Agility & Discipline (2004) – Adzic: Bridging the Communication Gap (2009) – Gärtner: ATDD by Example (2013) – Appelo: Management 3.0 (2010) – maybe post-agile? © Thompson information Systems Consulting Ltd 62
  • 63. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Main references (continued) • Cloud, Big Data, Internet of Things: – Sosinsky: Cloud Computing “Bible” (2011) – Blokland, Mengerink & Pol: Testing Cloud Services (2013) – Mayer-Schönberger & Cukier: Big Data (2013) – Minelli, Chambers & Dhiraj: Big Data, Big Analytics (2013) – daCosta: Rethinking the Internet of Things (2013) • Entropy, emergence, Artificial Intelligence etc: – Kauffman: The Emergence of Order (1993) & Investigations (2000) – Dennett: Darwin’s Dangerous Idea (1995) – Taleb: Fooled by Randomness (2001) & The Black Swan (2007) – Gleick: The Information (2011) – Morowitz: The Emergence of Everything (2002) – Birks & Mills: Grounded Theory (2011) – Kurzweil: The Singularity is Near (2005) • Websites: – many (see individual credits annotated on slides) © Thompson information Systems Consulting Ltd 63
  • 64. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Thanks for listening (and looking)! Neil Thompson @neilttweet NeilT@TiSCL.com linkedin.com/in/tiscl Thompson information Systems Consulting Ltd ©Thompson information Systems Consulting Ltd 64 Questions? Contact information:
  • 65. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Answers to *DD quiz (ie the four I fabricated) • XDD: eXtremely Driven Development (ie micromanaged) • LFD: Laissez Faire Development • WDD: Weakly Driven Development • NDD: Not Driven Development © Thompson information Systems Consulting Ltd 65
  • 66. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Appendix: the most convincing *DD examples which I didn’t fabricate • A(T)DD: Acceptance (Test) Driven Development • B: Behaviour Driven Development • C: Context Driven Design • D: Domain Driven Design • E: Example Driven Development • F: Feature Driven Development • G: Goal Driven Process • H: Hypothesis Driven Development • I: Idea Driven Development • J: Jackson System Development • K: Knowledge Driven Software • M: Model Driven Development © Thompson information Systems Consulting Ltd 66 • O: Object Driven Development • P: Process Driven Development • Q: Quality Driven Development • R: Result Driven Development • S: Security Driven Development • T: Test Driven Development • U: Usability Driven Development • V: Value Driven Development • Y: YOLO (You Only Live Once) Development • Z: Zero Defects Development
  • 67. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Appendix: best *DD runner-up Quantum Driven Development: • works only on hardware that hasn't yet been invented • works; oh no it doesn’t; oh now it does... • works & doesn’t work at the same time • uncertain whether or not it works • there’s a probability function that... • [That’s enough QDD: Ed.] © Thompson information Systems Consulting Ltd 67 secretgeek.net
  • 68. SIGiST Specialist Interest Group in Software Testing 15 Sep 2016 Erratum • Slide 44: Sorry, Facebook is no longer “dark web”. I now see that it was, sort-of, only before 2007 and I read in a 2011 book that it still was, but this seems wrong and I didn’t test it carefully enough! – Obviously much depends on specific privacy settings – Maybe deep content is still not externally crawlable? © Thompson information Systems Consulting Ltd 68