1. T8 Test Techniques
10/6/16 11:15
Agile Strategies for Traditional Software Development Teams
Presented by: Melanie Drake, SAS
Brought to you by:
350 Corporate Way, Suite 400, Orange Park, FL 32073
888-268-8770 · 904-278-0524 - info@techwell.com - http://www.starwest.techwell.com/
2. Melanie Drake
A senior development tester for JMP, a division of SAS, Melanie Drake has twenty years of experience in software documentation, testing, and development. She currently tests a scripting language and display components, and leads installer validation for JMP®, statistical discovery software. In addition to her regular testing duties, Melanie is part of a cross-functional team that implements and maintains a system of software builds, automated testing, CI builds, data gathering, and reports. Her driving passion is using JMP to test JMP and to analyze the results.
3. Copyright © 2013, SAS Institute Inc. All rights reserved.
Incorporating Agile Testing Tools
Into More Traditional Software
Development Processes
4. BACKGROUND WHAT DO I DO?
• I test statistical software
• No, I am not a statistician
• Yes, I use statistics
5. THE BIG IDEA PRODUCT VS PROCESSES
Your processes serve
you and your product.
Your product does not serve
your processes.
6. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
1. Customer satisfaction by early and continuous
delivery of valuable software
3. Working software is delivered frequently (weeks
rather than months)
We keep to a regular schedule:
• A main release every 18 months
• Two maintenance releases:
• 4-5 months after the main release
• 5-6 months after the first maintenance – the fully localized release
7. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
2. Welcome changing requirements, even in late
development
• We freeze new (large) features about 4 months before
code freeze
• This prevents large numbers of very late features
• It doesn’t prevent all late features
8. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
4. Close, daily cooperation between business people
and developers
• Certainly ongoing
• Additionally, some of our developers have close
relationships directly with customers
9. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
5. Projects are built around motivated individuals,
who should be trusted
11. Best architectures, requirements, and designs
emerge from self-organizing teams
• Teams bubble up around projects depending on who has
the interest, expertise, and time
• Our managers are all working managers - they work a lot,
and they manage a little
10. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
6. Face-to-face conversation is the best form of
communication (co-location)
• Yes, though we have a handful of remote employees and
a small team of testers in Beijing
• Many problems have been solved in impromptu hallway
conversations
11. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
7. Working software is the
principal measure of
progress
12. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
8. Sustainable development, able to maintain a
constant pace
• Our company has a very strong commitment to work-life
balance
• Peaks and valleys in software development occur, but our
lives do not belong to the company or a project or a
release
13. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
9. Continuous attention to technical excellence and
good design
• We continually update and fix internals that customers will
never even see
• We have pulled features and moved them to the next
release if they were deemed unready
• For large-scale features and changes, we design first,
then code
14. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
10. Simplicity—the art of maximizing the amount of
work not done—is essential
• We have very low process. Write code. Test software.
Write docs.
• Most of our tracking is done in a bug tracking system and
we use a wiki to collect shared information
15. AGILE THOUGHTS ANALYZING THE TWELVE PRINCIPLES
12. Regularly, the team reflects on how to become
more effective, and adjusts accordingly
• We use a retrospective where everyone has a voice, and
we adjust and improve
16. AGILE THOUGHTS ONE LAST THOUGHT
Except for the goal to deliver software constantly
(1 and 3), very few of those principles pertain
only to agile software development
17. AGILE PROCESSES WHAT WE HAVE ADDED THROUGH THE YEARS
• Daily Builds
• Regression Test Framework – Runs Daily with Full Reports
• Daily Installers
• Machines to run the regression tests on every operating system we
support (usually 8-10)
• Database of all regression reports for historical tracking
• Tools so anyone can run daily changelist builds through time
• The retrospective (after each main release, every 18 months)
• Jenkins for builds and regression tests for each changelist
• Data mining on code repository, bugs database, and test results for
analysis
18. DAILY BUILDS JMP: THE EARLY DAYS
• The first version of JMP was released in 1989 with a handful of
developers – no testers and nothing was automated
• By 1996 (JMP 4), we ran on both Windows and Macintosh, and the primary tester started manually making her own daily builds for testing purposes
• Not long after that, making daily builds became official and was
moved to someone who came in at 6am, and he automated the
process – Automated Daily Builds were born!
For several years, that was what we had – automated daily builds, ready every morning. Everyone knew if someone had broken the build on any given morning, and broken builds were fixed quickly.
19. REGRESSION TESTING A HUGE LEAP FORWARD
• With JMP 7 (2005), our automated test framework debuted
• Although JMP has always been a GUI application (even in 1989!),
it also has a robust scripting language (JSL)
• The framework was written in JSL, and testers
started writing tests, based on both current
testing and already-reported bugs
• Since we had daily builds available, the test
framework was automated
• At that time, every morning the testers had a daily
build for Windows and Macintosh and a report
showing test failures on each build machine
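The core of such a framework can be sketched in a few lines. This Python sketch is illustrative only (the real framework is written in JSL, and these test names are made up): run each registered test, record pass/fail, and print the kind of morning report the slide describes.

```python
# Minimal sketch of an automated regression-test loop.
# Illustrative only: the real framework is written in JSL, and the
# test names here are invented.

def run_regression(tests):
    """tests maps test name -> zero-argument callable that raises
    AssertionError on failure."""
    results = {}
    for name, test in tests.items():
        try:
            test()
            results[name] = "pass"
        except AssertionError:
            results[name] = "fail"
    return results

def report(results):
    """Print a morning-report summary; return the list of failures."""
    failures = [name for name, status in results.items() if status == "fail"]
    print(f"{len(results)} tests run, {len(failures)} failed")
    for name in failures:
        print(f"  FAIL: {name}")
    return failures

def check_fit_model():
    assert 1 + 1 == 3, "a deliberately failing example test"

report(run_regression({
    "open_table": lambda: None,    # passes
    "fit_model": check_fit_model,  # fails
}))
```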
20. DAILY INSTALLERS THE NEXT STEPS
• Note that we had builds only – to run the latest, you had to install JMP, then copy the executable, the DLLs, the string files, and so on by hand
• The clamor was heard for daily installers
• Every day at slightly after midnight, builds are kicked off, and when
the builds are finished, installers are created
• Then, using a system of physical machines and VMs, the installers are copied to and run on a variety of machines – today, that’s 3 Windows versions (7, 8.1, and 10) and 5 Macintosh versions (10.8-10.12, which includes the developer’s version of Sierra, not yet released)
• Finally, the daily tests are run, which take 3-4 hours to complete
21. COLLECTING DATA TESTBOT
• After tests are run, the results are injected into a database
• A set of PHP web pages displays the results
• We have a lot of historical data
• Great for analyzing trends
• Useful for pinpointing when a particular failure started
• Emails are sent to testers with the current results 3 times a day
• We are currently running 2 entire sets – the current maintenance
release and the next major release
• They are run in sequence, since doubling our computing resources
was expensive – not to mention the few times we need 3 batches
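Once results live in a database, pinpointing when a failure started is a single query. A minimal sketch, assuming a simple (run_date, machine, test, status) schema in SQLite for illustration; the actual Testbot database and PHP front end are not detailed in the talk.

```python
# Sketch of storing daily test results and pinpointing when a failure
# started. The schema and data are assumed for illustration; the real
# Testbot system differs.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE results (run_date TEXT, machine TEXT, test TEXT, status TEXT)")
con.executemany("INSERT INTO results VALUES (?, ?, ?, ?)", [
    ("2016-09-28", "win10", "fit_model", "pass"),
    ("2016-09-29", "win10", "fit_model", "fail"),
    ("2016-09-30", "win10", "fit_model", "fail"),
])

def first_failure(con, test):
    """Earliest date on which the given test failed (None if never)."""
    row = con.execute(
        "SELECT MIN(run_date) FROM results WHERE test = ? AND status = 'fail'",
        (test,)).fetchone()
    return row[0]

print(first_failure(con, "fit_model"))  # 2016-09-29
```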
22. TESTBOT DAILY REPORT EXAMPLE
23. JMPRUNNER AND BLAME MORE TOOLS!
• As testers, we know that the more
information you can give a developer about a
bug, the faster it will get fixed
• With daily test results, we could at least get a
bug down to a particular day it started
• We wanted more, and so did the developers
• One of them developed a Windows tool and
got a server for it – it collects daily builds and
creates changelist builds – as long as you have the appropriate
version installed, you can run any of them – jmprunner! Soon
thereafter, another developer created a tool to do something similar
on Macintosh – he called it blame
• A tester can now pinpoint a changelist as the cause of a bug
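The changelist hunt these tools enable is essentially a binary search: given a known-good and a known-bad build, repeatedly test the middle changelist. A hypothetical Python sketch of that idea (function names are illustrative, not part of jmprunner or blame):

```python
# Bisecting changelist builds to find the one that introduced a bug –
# the idea behind the jmprunner/blame workflow. Names are illustrative.

def find_culprit(changelists, is_broken):
    """changelists is sorted; the first is known-good, the last is
    known-bad. is_broken(cl) runs that changelist's build and reports
    whether the bug reproduces. Returns the first bad changelist."""
    lo, hi = 0, len(changelists) - 1
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if is_broken(changelists[mid]):
            hi = mid  # bug already present at mid
        else:
            lo = mid  # still good at mid
    return changelists[hi]

# Usage: pretend the bug appeared at changelist 1042.
print(find_culprit(list(range(1000, 1100)), lambda cl: cl >= 1042))  # 1042
```

With ~100 candidate changelists, this takes about 7 build-and-test runs instead of 100, which is why keeping changelist builds around pays off.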
24. RETROSPECTIVES DOESN’T EVERYONE ENJOY REMINISCING?
We instituted a retrospective with JMP 8, held shortly after every main
version release (every 18 months!)
• Everyone can anonymously report on things that went well, things
that didn’t go well, and suggestions for improvement in any area
• 1-day meeting where these items are discussed, starting with the
good – we want to continue doing those
• The “didn’t work” items are taped on the walls, and everyone votes for
the problems we should address with stickers
• The top few “winners” get further discussion, and volunteers for small
groups to brainstorm solutions and work on getting them in place
Some improvements that have come out of retrospectives include Baseline tests, Jenkins, better ways to manage branching, a greatly expanded beta program, and better freeze-date management
25. BASELINE TESTS IF THESE DON’T PASS, WE’RE IN TROUBLE
• We were still operating on a daily basis
as far as tests went
• Developers wanted to find horrible
problems before pushing their code
• They could run the test framework, but it’s hard for them to know
which tests to run, and the whole thing takes hours in a release build
and about a day in a debug build
• So we developed a subset of tests, called “Baseline tests”
• They cover all the basic operations that ought never fail, and run in
about 5 minutes on a release build and about 10 on a debug build
• Now developers can run those tests before submitting code – it won’t
catch everything, but it catches the types of failures that cause
massive headaches in the full daily test run
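One way to carve out such a subset is to tag tests and select by tag. A minimal sketch with made-up test names and tags (the real suite's selection mechanism is not described in the talk):

```python
# Selecting a fast "baseline" subset from a large suite by tagging
# tests. Tags and test names are invented for illustration.

TESTS = {
    "open_table":  {"tags": {"baseline", "nightly"}, "run": lambda: None},
    "fit_model":   {"tags": {"baseline", "nightly"}, "run": lambda: None},
    "full_report": {"tags": {"nightly"},             "run": lambda: None},
}

def select(tests, tag):
    """Return only the tests carrying the given tag."""
    return {name: t for name, t in tests.items() if tag in t["tags"]}

print(sorted(select(TESTS, "baseline")))  # ['fit_model', 'open_table']
```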
26. JENKINS BECAUSE DEVELOPERS NEED A BUTLER
• Let’s face it – developers by long habit do not run baseline tests every
time they push code
• And so Jenkins was born:
• A system of VMs that make changelist builds to catch build errors
on-the-go
• Jenkins also runs the baseline tests against each changelist build
• If you break the build or cause a baseline test error, you get an email
immediately and are expected to fix it immediately
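The gate Jenkins implements here can be summarized as: build the changelist, run the baseline tests, and notify the committer on any failure. A hypothetical sketch of that logic (the build/run_baseline/notify hooks are stand-ins, not Jenkins APIs):

```python
# Outline of a per-changelist gate: build, run baseline tests, and
# notify on failure. All three hooks are hypothetical stand-ins.

def gate(changelist, build, run_baseline, notify):
    ok, log = build(changelist)
    if not ok:
        notify(changelist, "build broken:\n" + log)
        return False
    failures = run_baseline(changelist)
    if failures:
        notify(changelist, "baseline failures: " + ", ".join(failures))
        return False
    return True

# Usage with fake hooks: a changelist that breaks a baseline test.
msgs = []
gate(1042,
     build=lambda cl: (True, ""),
     run_baseline=lambda cl: ["open_table"],
     notify=lambda cl, msg: msgs.append(msg))
print(msgs)  # ['baseline failures: open_table']
```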
27. DATA MINING SOME EXAMPLE REPORTS
Code Activity for JMP 13.0
Timing Data
Bug Reports
Tech Support Data
28. SUMMARY WHAT DO WE GET FOR ALL THIS WORK?
Tool | LOE | ROI
Daily Builds | Medium | High
Daily Regression Test Suite | High -> Medium | Very High
Daily Installers | Medium | High
Suite of hardware and VMs to run tests on a variety of operating systems | Very High | High
Database to hold and present current and historical test results | Medium-High | High
Tools to easily run changelist builds | Medium | Very High
Retrospective | Low | Medium-High
Baseline Tests and Jenkins | Medium | High
Data Mining and Analysis | Medium | Very High
29. TEST SUITE MORE DETAILS – A LOT MORE DETAILS
• Test Suites never stop growing!
• Nearly 70,000 individual tests in over 2000 files
• 1500-2000 failures every day – many are old minor bugs that never
get fixed
• Added a system to filter out failures tied to open bugs, so we don’t have to figure out which failures are new and important and which are already known
• Then why keep the failing tests? (The bugs are still open, so there’s hope!)
• Refinements made since the framework first shipped:
• Daily emails to testers with results
• Database with all results through time, allowing many different views
of failure data
• Using our own software, we can run queries against the database
and then analyze the data
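Filtering known failures out of a daily report reduces, at its core, to a set difference between today's failures and the tests already tied to open bugs. An illustrative sketch with invented data:

```python
# Separating new failures from known ones: subtract the tests whose
# failures are already tied to open bugs. Data is illustrative.

failures_today = {"fit_model", "open_table", "export_pdf"}
open_bug_tests = {"export_pdf"}  # failures already filed as bugs

new_failures = failures_today - open_bug_tests
print(sorted(new_failures))  # ['fit_model', 'open_table']
```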
30. TEST SUITE SIDE BENEFITS
• “Eat Your Own Dog Food”: We use JMP to test JMP
• Specific tests catch specific failures
• For architectural changes, individual tests and the entire test system
itself can break in unforeseen ways
• In the past few years, we’ve leveraged this:
• For large architectural changes, we can run the entire test suite as a
side project, and get all the same information without overwriting the
official results
• Comparing results shows problems
• Iterate between fixes and tests outside the official build
• Submit code when it’s ready
31. FUTURE WHO KNOWS WHERE THE FUTURE WILL TAKE US?
• Tracking crash reports
• Changing branch management
• ???
32. CONTACT ME
Melanie Drake
Senior Development Tester, JMP
Melanie.Drake@jmp.com