If your application is not fast, people leave. It's that simple.
We have developed a repeatable process for assessing your web application's performance and identifying bottlenecks. We can then fix the problems ourselves, or hand a detailed report to your team to implement the solutions.
The best part of this service is that you are not on the hook to us every time you update code and need a new assessment: we can train your people on the tools and processes we use while we perform the initial assessment.
The process we use has been finely tuned over time:
1. Identify areas under review. Are we reviewing the public web site? An internal corporate application? A customer portal? Are we reviewing the entire application, only the most critical areas, or a random selection of representative parts?
2. Planning:
   a. Identify key use cases – which ones are we testing, and which ones are we ignoring?
   b. Identify user profiles – similar to personas in the context of application design.
   c. Identify data workloads – we need to understand at what point the system starts to break down. How much data can a single page handle? 1 record? 10? 1,000? What is the maximum reasonable amount of data any given page might contain?
3. Create test scripts and data workloads.
4. Run test scripts and record data.
5. Analyze and report findings.
6. Compare results with expectations and with any earlier test iterations.
7. Modify the application code.
8. Return to step 4 after each round of software changes.
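The measure-record-compare loop in steps 4 through 6 can be sketched in a few lines. The following is a minimal illustration, not our actual tooling; the endpoint URL and iteration count are placeholders you would replace with your own test plan's values:

```python
import statistics
import time
import urllib.request

def time_request(url, timeout=10):
    """Time a single HTTP GET and return the elapsed milliseconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return (time.perf_counter() - start) * 1000.0

def run_test(url, iterations=10):
    """Step 4: run the request repeatedly and record every sample."""
    return [time_request(url) for _ in range(iterations)]

def summarize(samples):
    """Step 5: reduce raw samples to the numbers a report needs."""
    ordered = sorted(samples)
    return {
        "min_ms": ordered[0],
        "median_ms": statistics.median(ordered),
        "p95_ms": ordered[int(0.95 * (len(ordered) - 1))],
        "max_ms": ordered[-1],
    }
```

Steps 6 through 8 then amount to re-running the same test after each code change and comparing the new summary against the previous one.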
2. The Aspenware Accelerate team has worked through performance issues with many clients, using a repeatable service for improving application performance that is simple to implement, fits into existing processes, and can be implemented in a phased approach.
The following is an overview of the service process and a roadmap you can follow to establish a process of continuous improvement.
[Chart: Average Response Time in ms, log scale from 1 to 10,000,000]
3. Identify Key Use Cases, User Profiles, and Data Workloads
We start our research in an open dialogue with the relevant stakeholders to identify high-impact areas, and together we discover where to focus our efforts.
Once we have selected the key use cases, we determine what types of users figure prominently within them. It is important to identify each user's application role, the type of computer used to access the application, the bandwidth of the connection, and the geographic location.
We identify light, average, and heavy data workloads. The light workload contains just enough data to work with the application, which is useful for establishing a baseline. The average workload represents the amount of data a typical user has. The heavy workload represents the maximum reasonable amount of data users may have in the application.
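One way to make the three workload tiers concrete is to encode them as data-generation parameters. This sketch is purely illustrative; the actual record counts and record shapes would come out of the planning discussion with stakeholders:

```python
# Illustrative workload tiers; real record counts are set during planning.
WORKLOADS = {
    "light":   {"records": 1,      "purpose": "baseline: just enough data to exercise the page"},
    "average": {"records": 100,    "purpose": "data volume of a typical user"},
    "heavy":   {"records": 10_000, "purpose": "maximum reasonable data a user may accumulate"},
}

def seed_records(tier):
    """Generate placeholder records for the chosen workload tier."""
    count = WORKLOADS[tier]["records"]
    return [{"id": i, "payload": f"record-{i}"} for i in range(count)]
```

The same test script can then be run once per tier, so results can be compared across the light, average, and heavy cases.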
4. Create and Run Test Scripts
We then build detailed, repeatable test scripts based on the workloads and user profiles. We give each test a unique identifier, provide each test with a clear use case description, and document each step in the test script. We record Web 1.0 and Web 2.0 metrics using leading-edge performance testing tools. We gather a large amount of data, which allows us to do a comprehensive analysis.
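A test script with a unique identifier, a use case description, and documented steps might be represented like this. The field names and example values are illustrative, not any specific tool's format:

```python
from dataclasses import dataclass, field

@dataclass
class TestScript:
    test_id: str       # unique identifier, e.g. "TC-LOGIN-001"
    use_case: str      # plain-language description of the scenario
    user_profile: str  # which user profile drives this script
    workload: str      # "light", "average", or "heavy"
    steps: list = field(default_factory=list)  # documented, repeatable steps

# Hypothetical example of one documented test script.
login_test = TestScript(
    test_id="TC-LOGIN-001",
    use_case="Registered user signs in and lands on the dashboard",
    user_profile="Remote employee on a low-bandwidth connection",
    workload="average",
    steps=[
        "Open the sign-in page",
        "Submit valid credentials",
        "Wait for the dashboard to finish rendering",
    ],
)
```

Keeping scripts in a structured form like this is what makes re-running them after each code change repeatable rather than ad hoc.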
5. Analyze Test Results and Report Findings
We contrast our test results with comparable, high-performing
applications and industry benchmarks. Our analysis identifies
the root cause of any bottlenecks. We deliver a comprehensive
report with an action plan, competitive analysis, and ROI
calculations. We deliver our report to you in the form of a
dynamic web site that contains summarized results as well as
developer-level documentation. In fact, we include full
documentation of our data, our testing process, testing tool
walkthroughs, and more. Many of our customers have used this
information to enhance their processes and improve
company-wide application performance knowledge.
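Comparing a test run against a benchmark, or against the previous iteration, often comes down to a simple percentage-change calculation. A sketch, with made-up numbers:

```python
def percent_change(before_ms, after_ms):
    """Percentage change in response time; negative means the new run is faster."""
    return (after_ms - before_ms) / before_ms * 100.0

# Illustrative values only: median response time before and after a code change.
baseline_median_ms = 480.0
current_median_ms = 360.0

change = percent_change(baseline_median_ms, current_median_ms)
# change == -25.0, i.e. a 25% improvement over the prior iteration
```

The same calculation works against an industry benchmark figure in place of the prior iteration's number.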
6. Amazon found that a 0.1 second improvement results in a 1% increase in revenue.
7. Be fast. Make money.
For More Information Contact
Jeremiah Fellows, @JWFellows, J.Fellows@Aspenware.com
Rob Clark, @RClarkAspenware, R.Clark@Aspenware.com