CloudTest® Strategy and Approach
Cloud Testing Production Applications
SOASTA CloudTest Methodology™ White Paper Series
Executive Summary

Online application performance is critical – no one would challenge this statement. Yet the "dirty little secret" in the web world is that applications are rarely performance tested. When performance testing is done, it's usually conducted in a lab with significant resources spent on software, infrastructure, and people. The results from this investment are not delivering the answers that leadership needs about web application performance. Simply put, a lab is not a production environment. Running a few hundred virtual users in a small lab does not mean that you can support twenty times as much load in the production environment. Yet this kind of extrapolation is being done all throughout the industry.

The limitations of using legacy client-server testing tools for web applications have delayed the introduction of load testing methodologies for production environments. Testing in production is a required component of any performance methodology that aims to achieve the highest confidence possible. This principle is at the core of the SOASTA CloudTest Methodology. Since 2007, SOASTA's methodology has grown to become the set of patterns that address challenges that arise in testing production environments. This white paper discusses the three most frequently raised topics about testing in production (security, test data, and potential live customer impact), and describes how the methodology helps to achieve high confidence testing in production.

SOASTA, Inc. 2010
Table of Contents

Executive Summary .......................................................... 1
1. Intended Audience ....................................................... 1
2. The Past and Present of Performance Engineering ......................... 1
3. Product and Methodology Working Together in Harmony ..................... 3
4. Select Topics From SOASTA's CloudTest Methodology ....................... 3
   4.1 Security ............................................................ 5
       4.1.1 Sensitive Data Applications ................................... 6
       4.1.2 Systems Monitoring Data ....................................... 8
       4.1.3 Test Data and Results Security ................................ 9
   4.2 Test Data in Production ............................................ 10
       4.2.1 Using Test Data in Production ................................ 10
       4.2.2 Coding for Testability ....................................... 13
   4.3 Potential Live Customer Impact ..................................... 13
5. Customer Case Studies – Testing in Production .......................... 14
   5.1.1 Banking and Financial Data Security .............................. 14
   5.1.2 Results Confidentiality .......................................... 15
   5.1.3 Testing With Live Customers ...................................... 15
About SOASTA, Inc. ........................................................ 17
1. Intended Audience

Key stakeholders in the application performance lifecycle will find this white paper valuable. For many companies, the target audience for this document is the C-level executive. Web sites are often the primary, if not the only, revenue channel for a company. Revenue generation aside, web sites are the most accessible consumer-facing outlet for most companies. Therefore, the brand impact from poor performance or an outage is far-reaching and usually garners visibility at the highest levels. This means that the CEO, CIO, CFO and CTO are the primary stakeholders for performance. Anyone involved in the performance of a web site will find this information valuable, especially those involved in the decision-making and operations of consumer-facing applications.
2. The Past and Present of Performance Engineering

In the late 1980s and early 1990s, client-server applications were being deployed everywhere. Fueled by the proliferation of desktop computers and advances in networking technologies, information was flowing across the wire in greater quantities than ever before. The need for testing client-server performance was quickly identified, which gave birth to products such as LoadRunner by Mercury Interactive (founded in 1989 and now part of HP through acquisition in 2006), and SilkPerformer by Segue Software (founded in 1988 and acquired by Borland in 2006, which was then acquired by Micro Focus in 2009). Along with new products, the discipline of the "performance engineer" emerged as a critical part of the software engineering team. In the late nineties the focus for companies began to shift towards online retail and software-as-a-service applications, moving further and further away from client-server technologies.
Fast forward to today. Having a web presence is the minimum bar of entry for almost any company in the world. The web is a critical source of revenue and the primary customer-facing outlet for advertising and brand management. Applications run in a web browser instead of being installed, and when desktop applications are installed, the software is no longer delivered on CD; it's downloaded. TV shows and movies are streamed over the web, and content is delivered to desktop computers, laptops, cell phones, gaming consoles, or one of many other flavors of desktop and mobile devices. Applications are increasingly event driven, needing to serve a larger than normal customer base for perhaps only a few weeks out of the year – for example, retail applications during the holiday season or tax preparation in tax season. Some applications may only need to support peak load for a few days, or even just a few hours, during events such as the Olympics, the World Cup, or the airing of a TV commercial.
In addition to the technology changes introducing complexity in modern web apps, we are also facing unpredictability in traffic patterns. Social media outlets such as Facebook, MySpace and Twitter can drive more traffic to a web site than it's prepared to handle.

Cloud Testing Production Applications | 9/27/10 | Version 1.0 | Page 1
Today's software development practices are agile. Builds happen multiple times per day, releases to customers happen frequently, and the rate of change is high. Without agile test practices and the right tools, a team is only doing agile development – they're not really being agile. All processes and tools need to be able to adapt to the fast-moving software development lifecycle to fully benefit an organization. Otherwise, quality and performance will suffer as agile passes them by.
Currently, when performance testing is done, it is almost always done in a lab environment. This lab is usually a small subset of the production environment. As such, tests are done on a small scale and the results are extrapolated to peak production numbers. This is problematic for a number of reasons, not the least of which is the fact that the lab is significantly different from production in terms of hardware, scale, configuration, and activity at any given time (batch jobs, shared services, etc.). Perhaps most critical is the fact that real users come from outside of the firewall, where latency is an important factor in the customer experience. Testing in a lab, while important for extracting some types of results, cannot answer questions about production performance or capacity with a high degree of accuracy and confidence.
The most common test tool in use today for web application testing was never designed for this purpose. From technology to licensing, the culture created by using the wrong tools for the job is holding organizations back from getting the right testing outcomes for their applications. HTTP was created as an add-on protocol for the client-server testing tool LoadRunner. The same methodology of testing with a few thousand users is still being used for web sites today. This is not sufficient for web sites with a significant presence. The licensing models for tools like LoadRunner come from a time when expensive Enterprise License Agreements were the standard. Today we live in a world of Software-as-a-Service and On-Demand.
To meet the challenges of increasing complexity and the real-world scale of online applications, a modern methodology with an accompanying product solution is required. This solution must offer speed to keep up with agile development and accelerating change rates, it must be able to scale up to the real-world traffic levels applications are seeing, and it must be able to do so in a cost-effective way. The highest confidence results will be achieved by using this approach on a production environment. Therefore, the final requirement is that the solution is delivered with the expertise and best practices that come from running tests against production websites, to ultimately deliver high confidence results.
3. Product and Methodology Working Together in Harmony
The methodology for high confidence production testing needs the right product for successful execution. Conversely, the best tool in the marketplace needs the appropriate methodology to go along with it. This harmony enables companies to use the solution to its fullest potential. The SOASTA solution includes both the product and the methodology. The focus of this white paper is on select topics from the methodology that enable production testing. However, it must be mentioned that in order to be successful at production testing, having the right software in place is critical. The CloudTest product is discussed in more detail in a separately published white paper titled "The SOASTA CloudTest Solution". A key takeaway from that work is that certain pieces of critical functionality are required in a testing tool to be successful at production testing.
CloudTest is a purpose-built production performance-testing tool. It has the ability to scale up to generate true concurrent user loads into the millions of users while still delivering analytics in real time. Throughout the next few sections there will be references to the CloudTest product as the enabling tool, and to features within it that enable production testing to take place with confidence. For a more in-depth understanding of how CloudTest excels over other testing tools on the market, and how it is in many cases the only way to accomplish production testing, please refer to the CloudTest Solution white paper.
4. Select Topics From SOASTA's CloudTest Methodology

The CloudTest Methodology defines the people, processes, and technologies needed for achieving a successful performance-engineering ecosystem. A core component of this methodology is testing in production. For years SOASTA has been enabling companies to overcome the organizational and technological hurdles that are encountered when doing production testing. The solutions to these problems have become the blueprints and patterns that should be implemented for successful testing. Every application is different, and every application has different testing needs.

The methodology SOASTA has developed is a full lifecycle system. It starts with strategy and planning, and then proceeds into implementation details. The third pillar is execution, where the focus is to produce as much value from testing as possible. Finally, a full iteration of the process ends with measurement of success and changes in the processes to improve the system, ultimately looping back to the beginning to repeat again. The following diagram shows the high-level methodology process.
The topics in the following sections summarize the detailed content that resides in the Strategy and Implementation phases. Solving the types of problems discussed here is required to move on to the execution phase. Part of the initial Definition phase includes creating a heat-map of the production application and architecture, and subsequently deciding what should be tested and what shouldn't. Not every business process must be tested in production. For the things that do require testing, the methodology speaks to how customers have accomplished this with most of the common pieces of functionality seen in today's web applications – things such as logins, payments, checkouts and so on.

When production testing is first being integrated into existing development lifecycles and testing practices, the same three questions usually surface:
1. Is testing in production secure?
2. What about test data in the production environment?
3. What about potential impact to live customers?
These excerpts from the methodology address these questions and explain the solutions used for successful execution by SOASTA customers.
4.1 Security

In general, production testing a web site or application is just as secure as it is for real user traffic. This general statement applies to applications that are already consumer facing, or to internal applications that are not classified in nature. These two clarifying statements about how traffic is handled usually provide a high level of assurance about what happens during testing:
1) CloudTest simulates traffic to a web site over the same communication channels as a browser: HTTP/HTTPS over ports 80 and 443, respectively. The traffic is encrypted just as it is to an end user's browser.

2) During a load test the CloudTest product only retains responses from an application in memory for a matter of milliseconds; there is no persistence of a response unless programmatically declared in a test. This means that none of the data that goes across the wire is saved.
Diagram: Lower left: tests are created and controlled via web browser from anywhere. Center: load is generated from one or more cloud environments around the world. Upper right: browser traffic is directed at the target application wherever it resides (private datacenter, cloud, etc.) over standard HTTP and HTTPS ports, just like real users.
The parameters of a test are defined and controlled by the engineer, so every action that virtual users perform is tightly managed. The sensitivity of the data going across the wire can also be handled with proper use of test data techniques and documented processes (discussed in the next section). Depending on the sensitivity of an application, there are measures required to ensure that security is maintained. SOASTA has years of experience testing secured and confidential applications. Testing these kinds of applications is possible with the proper methodology and process applied.
4.1.1 Sensitive Data Applications

This document is geared towards testing applications that are already accessible from the outside world, or that could be made accessible to the outside world safely for a period of time. There are some applications that are only accessed on secure networks. SOASTA has done testing for these types of applications through installation of a CloudTest Appliance on-site and then testing over the intranet.

Sensitive data applications fall into two categories, and there are different approaches to ensuring security and confidentiality for each type. The following solutions apply whether the load generation source is inside or outside the firewall.
Non-Disclosure Agreements

Non-Disclosure Agreements (NDAs) are commonly put in place between SOASTA, the customer and other involved parties, such as an outsourced development team. While this is common practice in the industry, and certainly common practice between SOASTA and its customers, it is worth a brief mention. This practice is important for all parties to maintain the agreed-upon level of confidentiality when working with secure or unreleased information.
Sensitive Data Sites

These are applications that expose things such as personally identifiable information (PII), legally protected information, classified data, or other types of records that should only be viewable by those with authority or clearance. Applications that fit this profile include customer service portals, e-commerce sites, banking and finance applications, access-controlled government sites, law enforcement and military applications. The use of fictitious test data is a common approach to allowing for testability on applications that would typically expose sensitive data to their users.
If these sites are being accessed via the outside world, the encryption and communication channels are just as secure as they are for real-world users. For some companies this doesn't alleviate the concern about what data is going across the wire during a test. It is possible to further tighten security of the test by having the virtual users access fictitious data and records. This extra level of security is gained by populating the database with test data.
This population can be done either programmatically in a test (e.g. a CloudTest scenario that adds thousands of user accounts in an automated fashion), or by the application engineering team through a back channel (such as direct insertion of user accounts through queries into the database). Data that is typically created in this fashion includes dummy user accounts, events, product SKUs, inventory counts, etc.
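The back-channel variant of this population step can be sketched in a few lines. The sketch below uses Python's built-in sqlite3 purely for illustration; the table name, columns, and sequential naming are invented for the example (section 4.2.1 discusses why production test accounts should use randomized rather than patterned usernames).

```python
import sqlite3

def populate_test_accounts(conn, count):
    """Insert fictitious accounts through a back channel so that load
    tests exercise real code paths without touching customer records."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS accounts "
        "(username TEXT PRIMARY KEY, email TEXT, is_test INTEGER)"
    )
    # Sequential names are used here only for brevity; see the discussion
    # of randomized usernames in section 4.2.1 before doing this at scale.
    rows = [
        (f"testuser{i:05d}", f"testuser{i:05d}@example.invalid", 1)
        for i in range(count)
    ]
    conn.executemany("INSERT INTO accounts VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
inserted = populate_test_accounts(conn, 1000)
```

Marking the rows with an `is_test` flag also makes the later cleanup step (see 'Test Data Cleanup') a straightforward delete.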
Regardless of the data that was transferred during a test, CloudTest discards message responses in a matter of milliseconds after they are received. When a response arrives from a request made to an application, it is kept in memory only long enough to analyze it and do any extraction or validation on it that might be required, and then it is discarded. There is no persistence of response data unless it is done programmatically. This is often assurance enough to avoid any additional steps in locking down the test data.
Pre-Release and Early Access Sites

These are applications that will soon be accessible by the general public but have not yet been released. They may or may not contain sensitive data such as social security numbers, credit cards, and things of that nature. The focus is on protecting the confidentiality of the information before it goes public, and restricting access so that only the test tool and clients with certain configurations can access the site for testing. Applications that fit this profile include new product releases, marketing campaigns, upcoming sporting events, concert tours, and press releases.
IP Address Restriction

Access to environments from the outside world during tests can be restricted by the IP address of the servers being used for load generation. A typical practice for companies is to put rules in place on the perimeter firewalls that allow access for the exact IP addresses, or ranges of IPs, that will be used during a test. This can be enabled right before and disabled right after a test. This practice limits access to the known IP addresses, and only during the test window. Using firewall and load balancer access rules that are based on host IP addresses will ensure that content cannot be accessed except from specified sources.
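The enforcement itself lives in the firewall or load balancer and is platform-specific, but the underlying check can be sketched generically. The ranges below are reserved documentation addresses standing in for real load-generator IPs.

```python
import ipaddress

# Illustrative allow-list: the exact IPs or CIDR ranges of the load
# generators, enabled just before the test window and removed after it.
ALLOWED_SOURCES = [
    ipaddress.ip_network("198.51.100.0/24"),  # example generator range
    ipaddress.ip_network("203.0.113.7/32"),   # single generator host
]

def is_allowed(source_ip: str) -> bool:
    """Return True if a request's source address falls inside one of
    the permitted load-generation ranges."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in ALLOWED_SOURCES)
```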
Cookies and Headers

Transaction tagging is when the test engineer modifies a request to contain flags that mark it as a test transaction. A recommended practice for access restriction is to configure a load balancer or web application to check for cookie or header values in a request and then redirect the user into a hidden site. The application engineering team can create a load balancer rule that checks for a complex cookie value, for example a string that's long and obfuscated. The team then gives this value to the test engineer to add to the relevant requests. If the load balancer detects this value, it sends the request to a different pool of servers that are responsible for serving a hidden version of the site to be tested. Transaction tagging in conjunction with IP address restriction is a secure method of hiding content. This same approach of 'tagging' transactions is used for halting processing of certain test requests, such as order placements in live environments (more on this later).
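The routing decision a load balancer rule makes can be sketched as follows; the cookie name and value here are invented for illustration, and a real deployment would express the same check in the load balancer's own rule language.

```python
import hmac
import secrets

# Invented names for illustration: the engineering team generates a long,
# obfuscated value and shares it only with the test engineer.
TEST_COOKIE_NAME = "xlt"
TEST_COOKIE_VALUE = secrets.token_hex(32)  # 64 hex characters

def select_pool(cookies: dict) -> str:
    """Route tagged test traffic to the hidden server pool and
    everything else to the live pool."""
    supplied = cookies.get(TEST_COOKIE_NAME, "")
    # Constant-time comparison avoids leaking the value via timing.
    if hmac.compare_digest(supplied, TEST_COOKIE_VALUE):
        return "hidden-pool"
    return "live-pool"
```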
4.1.2 Systems Monitoring Data

Another data stream that might be present during a running cloud test is the monitoring data (CPU, memory, etc.). Whether the data is coming from SOASTA's own monitoring agents or from third-party monitoring systems, it is always relayed through SOASTA's agent, called 'Conductor'. In a production test, Conductor pulls monitoring data from systems inside the firewall, and then uses 'push' technology to send the data to the CloudTest main instance. While this data is not generally considered sensitive, as it is only related to the health of systems under test, it can be encrypted for further security and transferred over HTTPS.
Diagram: CloudTest dashboards combine data from the test controller perspective with data from the servers inside the firewall. Connections originate from inside the firewall and the traffic is sent over port 80/443 in standard HTTP/HTTPS protocol.
SOASTA's Conductor agent can also run in a DMZ, a secured area between the servers being monitored and the outside world. This creates a secure relay point for the monitoring data to travel through. Of course, in the most secure environments, monitoring can be turned off so that this stream never leaves the datacenter. Unfortunately, by choosing this option, customers lose the ability to see monitoring data on the same timeline as the test data. Once the test is finished, monitoring data can be wiped, following properly documented procedures similar to those used for confidential test results.
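Stripped of any product specifics, the outbound-push pattern described above can be sketched generically. This is not Conductor's actual protocol: the metric names and endpoint URL are placeholders, and `os.getloadavg` is a Unix-only convenience used to have something real to sample.

```python
import json
import os
import time
import urllib.request

def collect_sample() -> dict:
    """Gather a minimal health sample from the host being monitored."""
    load1, load5, load15 = os.getloadavg()  # Unix-only, for illustration
    return {
        "timestamp": time.time(),
        "load_1m": load1,
        "load_5m": load5,
        "load_15m": load15,
    }

def push_sample(sample: dict, endpoint: str) -> None:
    """Push rather than be polled: the connection originates inside the
    firewall and travels outbound over standard HTTPS."""
    body = json.dumps(sample).encode("utf-8")
    req = urllib.request.Request(
        endpoint,  # placeholder, e.g. "https://controller.example/metrics"
        data=body,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # TLS is handled by the https URL scheme

sample = collect_sample()
```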
4.1.3 Test Data and Results Security

Test data, in the context of this paper, refers to data that resides in an environment (production, in this case) to enable testing. It can also refer to data that was created in the production environment as a result of testing, such as test orders, test accounts, and things of that nature.
"Test data" is also sometimes used loosely to refer to data collected during a test by the CloudTest load generators: things such as result reports, dashboards, monitoring data, etc. In some cases, added security measures for managing the results of a test are required. Documented practices can ensure the confidentiality and proper handling of the results from performance tests. This sometimes applies to companies whose performance and benchmarking are critical to their business, such as platform companies and product companies that would be negatively affected by the release of confidential performance information about their software.
4.2 Test Data in Production

Test data, as previously defined in this context, means data that enables testing or is created as a result of testing. The extent of the management effort for test data depends a great deal on the coverage of the test scenarios. A simple browse scenario may have absolutely no data requirements, just page click steps. A test case that logs in might need pre-populated accounts in the database. On the extreme end of the spectrum, some test cases may require placing orders, creating accounts, and sending requests off to third-party systems. The art of test case definition to get the right level of coverage is a separate topic, but the goal is to stress the right functionality of the application, not to create a suite of tests that do everything.
If at peak a web site does 6 orders per second, the order system itself might not need to be tested in production because the volume is so low. If it does need to be tested, there are best practices for enablement. Different strategies apply for different sites; there isn't a one-size-fits-all approach. The tools to get the job done all exist, but implementations can vary significantly depending on the maturity of the application, the real user traffic at any given time of day, and various other factors.
4.2.1 Using Test Data in Production

The notion of using fictitious test data in place of confidential data for security reasons was discussed in section 4.1.1. This section is about best practices for maintaining any type of test data in the production environment. Throughout the course of testing in production there comes a time when test data will inevitably end up in the production database, either to enable testing of certain scenarios or as a result of test scenarios generating records. If companies want to test anything other than a browsing scenario, and they usually do, this topic needs to be addressed.
User Accounts

Speaking generally across different categories of applications, the most common type of test data encountered in production environments is user accounts. Most applications have the notion of a login. It is very common to pre-populate a production environment with as many user accounts as are needed to accomplish regular performance testing.
This can be done either through direct insertion into the database or through a CloudTest test scenario written to do account creation. Best practices dictate that when test accounts are created, the usernames should be randomized and not follow any particular pattern. The reason for this is that 'username' is usually an indexed value in the database. Creating a large number of accounts that have a distinct pattern will cause many databases to change their query plan to optimize for looking up usernames with that pattern. An example of this would be creating 50,000 accounts named PERFTEST0 through PERFTEST49999. This would create a pattern in the database that might get optimized for lookups.
A better approach is to create usernames that are around 12 characters in length, with 3 segments that are randomized and assembled at creation time. Segments can be generated from letters and numbers, pulled from timestamps and the like, and then put together to create fairly distinct names. At creation time the usernames can be exported to a .csv file to be used in CloudTest. Sometimes, during the creation process, they are inserted into a separate table and then exported to .csv; whatever works best for the DBA and the database platform. The key is to make sure they are tracked at creation time, as significantly randomized usernames may be hard to pull out of the database after the fact.
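One way to sketch the segment scheme described above; the exact segment construction is an illustrative choice, not a prescribed format.

```python
import csv
import os
import secrets
import string
import tempfile
import time

ALPHABET = string.ascii_lowercase + string.digits

def make_username() -> str:
    """Assemble a 12-character username from three segments, two random
    and one timestamp-derived, so test accounts share no common prefix."""
    rand_a = "".join(secrets.choice(ALPHABET) for _ in range(4))
    stamp = format(time.time_ns() % 0x10000, "04x")  # 4 hex characters
    rand_b = "".join(secrets.choice(ALPHABET) for _ in range(4))
    return rand_a + stamp + rand_b

def export_usernames(path: str, count: int) -> list:
    """Track every name at creation time by exporting it to a .csv file
    that the load test can later feed to its virtual users."""
    names = [make_username() for _ in range(count)]
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["username"])
        writer.writerows([n] for n in names)
    return names

csv_path = os.path.join(tempfile.gettempdir(), "perftest_usernames.csv")
names = export_usernames(csv_path, 50)
```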
Once the test accounts are in the database, they should remain there to support ongoing testability. SOASTA has customers with hundreds of thousands of test accounts in the production database to support ongoing testing. A site that has 300,000 concurrent users logged in at any given time would statistically have a user base in the 50 to 100 million range. Therefore, the storage requirements for 300,000 test accounts are a fraction of what is being used for the entire population. Additionally, deleting that many records from a production data store may cause problems such as database fragmentation (see the section titled 'Test Data Cleanup' for information on how to address this).
Handling Other Types of Test Data

Data that needs to be read usually has requirements that are easy to meet. User accounts aside, things such as products, inventory counts, content articles, downloads, and things of that nature can be used in a test case with acceptable stress on the application, whether it's real or fictitious data (aforementioned security concerns aside).
Data that's being inserted, updated or deleted needs special attention. Many test data situations can be handled using techniques already documented. However, to stress critical functionality in an application, sometimes there doesn't need to be a database operation or a transaction that goes past a certain point. For example, in an e-commerce application it is common practice to have an order object move into a queue to be picked up and processed. If getting the order into the queue is the performance concern, and the processing operation is not in the testing heat map (because it's not a concern, or because it can't be tested for technology reasons), then ideally the processor needs to pick up the order and discard it. Techniques like this are called 'coding for testability'. In this example the queue processor needs to be modified to pick up the order off of the queue, recognize that it's a test order (possibly through previously discussed transaction tagging), and discard it. This technique is discussed in more detail in section 4.2.2.
Test Data Cleanup

Some types of data, such as user accounts and fictitious events or content, should stay in the production database to support ongoing testing. However, database performance does get worse as a table gets larger. Depending on the size of a test and the impact that the test data can have on database performance, this data may need to be deleted from production.
A common performance problem that databases encounter over time is called database fragmentation. This fragmentation is much like what occurs on a hard disk drive. Hard disks try to store files in a contiguous group of blocks and sectors. When you delete a file, it leaves a gap on the drive. If the next thing written to the drive is larger than the gap, part of it goes in the hole left by the previously deleted file and the rest goes in the next available spot. Reading this file will be slower because the drive head has to move across the drive to get all of it.
The same thing happens to databases, except that this is a non-trivial problem because the fragmentation is in memory. Most modern databases store their data in RAM and only access the disk when they need to. If you delete large chunks of data from a database, there will be gaps in the data and the individual pages that store data will become non-contiguous. A side effect of this is that SQL statements will take much longer to process because they are spanning large gaps of memory and disk pages when they execute.
Databases usually have ongoing nightly maintenance plans, and special care should be given to them, especially during the times surrounding bulk deletes. Steps should be taken to ensure that indexes are up to date (so that the database always knows where to find data without having to follow multiple data pointers), and also that the records exist in the proper order (defined by the primary key) with no extra space on data pages. Severe fragmentation can result in a single record being stored on an 8-kilobyte memory page instead of multiple records grouped together.

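To make the bulk-delete-then-maintain cycle concrete, here is a minimal sketch using SQLite, chosen only because it is self-contained; production databases have their own rebuild commands (for example `ALTER INDEX ... REBUILD` on SQL Server), and the table and column names here are invented for illustration:

```python
import sqlite3

# Autocommit connection so maintenance statements run outside a transaction
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, is_test INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(i, 1 if i % 2 else 0) for i in range(1, 10001)],
)

# Bulk-delete the test rows, leaving gaps on the data pages
conn.execute("DELETE FROM orders WHERE is_test = 1")

# Rebuild indexes and repack pages so records are contiguous again,
# in primary-key order with no dead space
conn.execute("REINDEX")
conn.execute("VACUUM")

remaining = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

The point of the sketch is the ordering: the index rebuild and page compaction happen immediately after the bulk delete, rather than waiting for the nightly plan to absorb the extra work.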
Something to keep in mind around the first few production tests is that a regularly scheduled maintenance plan might run for much longer the night after a production test and a data delete. This should be observed to ensure that the plan does not execute into the following day and peak traffic hours.

Cloud Testing Production Applications, 9/27/10, Version 1.0

4.2.2 Coding for Testability

One benefit of being able to control the entire payload of a request in CloudTest is that an engineer can put anything in the message needed to support testability. More specifically, they have the flexibility to create messages that wouldn’t normally show up through typical use of the application. One can programmatically set cookie values, modify headers, and change query strings or post data in an HTTP request. When this is done to mark a request, it is called ‘tagging’ a transaction.

This has many uses, ranging from access control to flagging requests so that they do not proceed past a certain point in an execution sequence. A typical scenario where this is useful is order placement on an e-commerce application. A tagged transaction will get as far as it needs to go to achieve adequate test coverage, possibly moving through alternate code paths. The goal of testing is to get the right amount of coverage on the architectural heat map identified in the strategy phase, so where the transaction needs to stop depends on the criticality of what needs to be stressed. Tagging a transaction and handling it properly can ensure that payments don’t get processed, orders don’t get fulfilled, and ultimately that thousands of test orders don’t end up getting shipped to an unfortunate engineer’s doorstep.

4.3 Potential Live Customer Impact

Testing in production can occur with or without live users on the environment. The majority of customers testing in production with the CloudTest Methodology are doing so on a live environment. It is possible, but not always feasible, to put up a maintenance or turn-away page and wait until all existing user sessions have finished. In practice this is rarely necessary, because the right tool and methodology working together can allow testing to take place alongside real users in all but the most extreme cases. To use one customer’s exact words, “The cost of not testing is far greater than the potential impact to live customers.” With the right approach, you can continue to generate revenue while testing at the same time.

It is also possible to segment a portion of the live environment during a low-traffic period and allow testing on that segment. Typically, a separate IP address is configured on a load balancer, and servers are moved out of the live pool and placed into the test pool. Sometimes configuration changes need to be made to the application servers in this cluster to point them at new databases and take them off of other shared components. This is a more costly approach because it requires extra hardware and the associated maintenance overhead. It’s also somewhat less reliable, because you start to deviate from the actual production configuration and you cannot test at true scale. It is still, however, a more realistic test than simply testing in a lab.

Three Requirements for Successful Live Testing

The first requirement for being able to do a large-scale test with live customers is having real-time analytics in your test tool. With up-to-the-second information about site performance, you can quickly tell if a site has started to experience poor performance or has become completely unresponsive.

Secondly, the product needs a good kill switch. Pressing stop or abort in a running CloudTest will stop the load immediately; bandwidth, concurrent connections, threads in use, and other typical pinch points will all drop back down to normal. With real-time analytics and the kill switch, you can stop a test as soon as you suspect customers may be impacted. If the test tool takes a long time to kill a test, this poses a serious risk. CloudTest is specifically designed to accomplish a near-immediate stop.

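Conceptually, a kill switch in a load generator can be sketched as a shared event that every virtual user checks before issuing its next request. This is only an illustration of the pattern, not CloudTest’s implementation:

```python
import threading
import time

kill_switch = threading.Event()
requests_sent = []  # stand-in for the requests actually issued

def virtual_user(user_id):
    """Generate load until the kill switch trips, then exit promptly."""
    while not kill_switch.is_set():
        requests_sent.append(user_id)  # stand-in for one HTTP request
        time.sleep(0.001)

threads = [threading.Thread(target=virtual_user, args=(i,)) for i in range(5)]
for t in threads:
    t.start()

time.sleep(0.05)   # let some load flow
kill_switch.set()  # operator presses stop: every worker exits on its next check
for t in threads:
    t.join()
```

Because each worker re-checks the event between requests, the time to full stop is bounded by one request cycle rather than by draining any queued work.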
Third, having good monitoring practices internally, preferably integrated with CloudTest, can prevent you from ever needing to abort a test because of live user impact. Being able to watch in real time as CPU usage climbs or as the number of connections on a load balancer or firewall approaches its maximum can help prevent those thresholds from ever being violated by routine testing.

5. Customer Case Studies – Testing in Production

5.1 Security Case Studies

5.1.1 Banking and Financial Data Security

A large financial software company with an online application that allows customers to see all of their financial accounts in one place uses the SOASTA solution to test in production. This is accomplished through the use of test data and alternate code paths. The customer uses the following test data:

• Test user accounts in the production database (10,000 total)
• Fictitious financial institutions
• Dummy transactional data

When a real user logs into this application, calls are made to their financial institutions and all of their transactions are pulled in. Test users have fictitious financial institutions set up in their profiles, so when they log in, those are loaded on the screen. Then, an asynchronous refresh of those accounts kicks off in the background. The code responsible for triggering this refresh recognizes that the financial institution is for testing purposes, and uses an alternate code path that ends the refresh, aborting any attempts to look up and contact an institution that doesn’t exist.

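The alternate code path amounts to a guard at the top of the refresh routine. The sketch below is hypothetical (the marker convention and function names are invented; the customer’s actual service is not public):

```python
TEST_INSTITUTION_PREFIX = "TEST-"  # hypothetical marker used in test profiles

def refresh_accounts(institutions, contact_institution):
    """Refresh each account, taking the alternate path for test institutions."""
    refreshed = []
    for inst in institutions:
        if inst.startswith(TEST_INSTITUTION_PREFIX):
            continue  # fictitious institution: end the refresh, never call out
        refreshed.append(contact_institution(inst))
    return refreshed

# A profile mixing a fictitious institution with a real-looking one
result = refresh_accounts(
    ["TEST-FirstBank", "Acme Credit Union"],
    contact_institution=lambda name: f"refreshed:{name}",
)
# → ["refreshed:Acme Credit Union"]
```

For a test user whose profile contains only fictitious institutions, the refresh simply ends without any outbound calls, which is exactly the behavior the case study describes.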
Using these mechanisms, the customer is able to test all of the major functionality of the application. This includes starting on the home page, logging in, loading transactions to the screen, building charts and graphs, and so on. No actual customer data is shown during testing, as the accounts and the transactions are fictitious. All tiers of the application are stressed except the asynchronous account refresh service that contacts third parties, because it was not deemed necessary for testing during the strategy phase of the methodology.

5.1.2 Results Confidentiality

A large provider of a highly scalable e-commerce platform uses SOASTA to conduct testing from the cloud. Scalability and performance are critical to their business, both for ensuring that their customers have the best platform to deploy on and for staying ahead of the competition. As such, the results of their tests are considered highly confidential. SOASTA has a special NDA in place with the customer that covers a few individually selected and named SOASTA engineers. Access to controlled test environments is allowed through IP address restriction. After a test is run, the results are exported and sent to the customer. Then the results are deleted from the CloudTest results repository in a documented process that includes customer observation of the post-test cleanup activities.

5.1.3 Testing With Live Customers

One of America’s largest publicly traded financial software companies uses SOASTA CloudTest to test regularly in production with live customers. During daylight hours, with tens of thousands of real users on the site, they will conduct tests running up into the hundreds of thousands of users. This is accomplished through the use of real-time data provided by CloudTest, as well as a mature in-house monitoring solution. With a good picture of the customer experience at any given time from network, application, and end-user perspectives, the team can tell with up-to-the-moment data whether customers are being impacted by a test. If a breaking point is hit, the test is immediately stopped with the CloudTest kill switch, and load ceases at once. Testing in production with live users for performance and failover is a routine approach used by some of the world’s largest companies; Google, for example, routinely fails over entire datacenters as part of monthly maintenance.

Conclusion

Today’s fast-moving online world demands that companies have the right performance testing solution to be successful. Creating a poor customer experience can be catastrophic, resulting in lost revenue, failed customer retention, and a viral spread of negative brand perception. The dominant players in the testing marketplace have been slow to respond and give companies what they need: high confidence that these problems won’t happen. That confidence comes through using the right testing approach and the right tools for the job.

SOASTA is the leader in cloud testing and at the forefront of modern performance testing. SOASTA has received numerous awards and accolades, including being listed as the Visionary Leader in the Gartner Magic Quadrant for test tools and being named a Wall Street Journal Top 50 Venture-Backed Company (out of 38,000 worldwide).

SOASTA delivers the highest confidence possible by conducting accurate and realistic testing in the right environment. The performance engineering team at SOASTA has deployed hundreds of thousands of cloud servers, run tens of thousands of hours of tests against production environments, and has run tests with millions of concurrent users. This team is uniquely qualified to do large-scale tests against production websites. All of this contributes to the value that SOASTA is delivering daily to leading online customers such as Chegg, Gilt Groupe, Hallmark, Intuit, Netflix, Mattel, MySpace, Symantec, and many others.

About SOASTA, Inc.

SOASTA’s mission is to ensure that today’s Web applications and services perform in a high-quality, scalable, and predictable manner. The company’s product, SOASTA CloudTest®, is available as an on-demand service in the Cloud or as an appliance (virtual and hardware), and enables developers to test and monitor their Web applications and services at an affordable price. SOASTA CloudTest® supports Load, Performance, Functional, and Web UI/Ajax testing. SOASTA is privately held and headquartered in Mountain View, California. For more information about SOASTA, please visit www.soasta.com