Extreme Programming
                                    Sven Rosvall
                                sven-e@lysator.liu.se




Extreme Programming (XP) is a rebellious new development methodology that has
received a lot of press recently. But is it as "extreme" as the name indicates? And what
makes it so "extreme"?
This talk gives a quick introduction to XP and what makes it special. XP has a
strong focus on human communication, simplicity and testing. There is plenty of
work for testers in an XP environment. We will see how testing can drive the whole
development process. Testers are now in charge, instead of being handed the end
result from developers during the hectic final stages of the project.




                                                                                       1
Values of XP
         •   Communication
         •   Simplicity
         •   Feedback
         •   Courage




•Communication between team members, management and customers.
•Do the simplest thing that could possibly work. Don’t do anything more. Do only
what you need today. Tomorrow’s complexity will be dealt with tomorrow.
•Feedback on the state of the system (testing). Feedback from customers. Feedback
on project progress.
•Courage to do radical, needed changes.
These values go hand in hand.
•Communication spreads courage to the whole team.
•It is easier to be courageous with a simple system.
•Feedback gives courage because you know how the system performs.




                                                                                    2
Why is XP popular?
         Waterfall model doesn’t work:
         • Relies on getting specifications right
         • Customer changes his mind
         • New technology
         • Late system testing
         • Poor visibility of progress




•It is impossible to get the specifications right before you have a complete system.
•Customers always change their minds. They might not know in advance what they
really need. Competitors may have added features.
•The world changes. New technology is introduced, and not only in the tools and
operating systems our project is based on. There may be smaller upgrades with
subtle changes that break or improve our system.
•System testing cannot start until we have a system; all pieces must be integrated.
Only at this point can customers get a feel for what it is like to work with the system.
•Progress reports say “Analysis Phase complete”. The customer says “Good, but what
does Analysis Phase really mean?” Think of the Emperor’s new clothes.


A lot has changed since the Waterfall model first saw the light. Many
methodologies have improved things but not enough.




                                                                                      3
What is XP?
         •   Customer focused
         •   Iterative, frequent releases
         •   Test intensive (Testing first)
         •   Lightweight software development
             methodology




•The customer pays us. Give him respect and keep him happy. Keep him informed
on progress. Show early releases. Let customer participate in planning and design.
•XP is the first truly iterative development methodology I have seen that does not
compromise quality. A system is delivered early and frequently to give visibility
into progress and what is developed.
•Testing is a crucial practice in XP. Automated as much as possible.
       •Test cases define what is to be developed and communicate functionality
       better than specifications, both to developers and to customers. It is easier
       to verify that a test case passes than that a requirement in a specification
       document is satisfied.
       •Testing drives development. No development task is complete without
       passing its tests.
       •Testing assures that no code change breaks anything else.
       •Testing can start early because we have a system ready from the start.
•XP is a lightweight methodology in that thick documentation is avoided and
replaced by other means. XP is ideal for 5-15 people in a project. Bigger projects
can often be broken down into manageable sub-projects.
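The test-first workflow sketched in these notes can be shown in miniature. The `Account` class and its tests below are hypothetical, not from the talk; the point is only the order of work: the tests exist first, and the task is done when they pass.

```python
# Hypothetical example of test-first development: the tests below were
# conceptually written before the Account class they exercise.

class Account:
    """The simplest thing that could possibly work for the current tests."""
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

# The tests define the feature; no task is complete until they pass.
def test_deposit_increases_balance():
    account = Account()
    account.deposit(100)
    assert account.balance == 100

def test_deposit_rejects_non_positive():
    account = Account()
    try:
        account.deposit(0)
    except ValueError:
        return
    assert False, "expected ValueError for a non-positive deposit"

test_deposit_increases_balance()
test_deposit_rejects_non_positive()
```

Because the tests run on every change, any later refactoring of `Account` is immediately checked against the behaviour the tests pin down.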




                                                                                     4
When not to do XP
                  •   Company culture
                  •   Expensive integration/testing
                  •   Big projects
                  •   Uninterested Customer
                  •   Required up-front design




XP requires communication. There may be “heroes” who want to be seen as indispensable experts and
are afraid of sharing their knowledge.
Some companies value their workers by how many hours they work, not by what they do. There are also
many managers who think meetings are a waste of useful time and want people to work instead of talking
to each other.
Integration and regression testing can be very expensive. Not every developer will have access to his own
mainframe or moon-lander. But centralised builds on a dedicated test machine, with test reports made
available to all team members, can often be an acceptable replacement. Simulators at various scales
can also do a great job.
If integration or regression testing takes too long, it takes too long to get feedback on recent changes.
This can sometimes be solved by breaking the system down into manageable pieces that can be integrated
and tested on their own. Regular large-scale integration and testing will catch any problems at that level
reasonably fast.
If Quality Assurance takes two months before a release, this slows down the feedback loop badly.
Instead, QA should trust the automatic testing being performed.
Big projects cause problems with communication. This can be solved by dividing the project into
smaller pieces: one group delivers a module to another, which acts as its customer.
The customer may not be interested in giving feedback and prioritising the development. He may just
want something delivered and installed. You will have problems with this kind of customer in waterfall
projects too.
There may be a requirement for up-front design, perhaps because of tradition, or because management
or customers want to see that the developers understand the problems in the project.
There are still useful practices in XP that can be used in any project.


                                                                                                             5
XP Practices

         •   Onsite Customer
         •   Planning Game
         •   Metaphor
         •   Simple Design
         •   Short Releases
         •   Testing
         •   Refactoring
         •   40h Week
         •   Pair Programming
         •   Coding Standards
         •   Collective Ownership
         •   Continuous Integration




•Customer on site to ask for clarifications, planning, testing, etc.
•Everybody in the team uses the same Metaphor. (Mental Model)
•Short Releases so that new features can be tested quickly. Typically every 2-4
weeks.
•Testing can start immediately because we have a (small) system after the first
release. (I.e. 2-4 weeks.)
•Simple Design. Don’t do more than what is needed now. There is no need for complex
up-front design of things that might never be used. It is easier to change a simple
design than a complex design when the customer changes his mind.
•Refactoring is required when the Simple Design must be improved. We may have
gained experience on how to design after having tested the feature with the simple
design. The refactored design can be verified with the test suite, which has been
proven before.
•Pair Programming. Two brains working together are much more effective than two
individuals. They inspire and review each other at all times. Tests are also designed in pairs.
•Collective Ownership. Don’t rely on an individual to help you out. Make all the changes
you need for your feature yourself (in pairs), in any module.
•Continuous Integration. Integrate your new feature and test it now.
Testing is a development tool, even drives development.




                                                                                        6
XP Practices

         •   Onsite Customer
         •   Planning Game
         •   Metaphor
         •   Simple Design
         •   Short Releases
         •   Testing
         •   Refactoring
         •   40h Week
         •   Pair Programming
         •   Coding Standards
         •   Collective Ownership
         •   Continuous Integration




All practices work together:
•We can plan short releases because the customer is with us defining what is needed
now, has seen a working system, and agrees on the system metaphor.
•Anyone on the team can add code to refactor or add a new feature because we have
a common coding standard, simple design, work in pairs and have a test suite.
•We can refactor with confidence because we have a test suite to verify that we
haven’t broken anything.
•We can test frequently because we continually integrate small increments.
•We stick to a simple design because we only need to implement what new test
cases tell us to implement.




                                                                                      7
XP vs. Waterfall Activities

               Req.   Analysis  Impl.  Integr.  Review  Test  Delivery

Customer        X        X                        X      X       X
Planning        X        X       X
Metaphor        X        X       X                X      X
Simple Design            X       X
Refactoring              X       X
Cont. Int.                       X       X               X       X
Testing                  X       X       X        X      X       X
Short Rel.      X                                        X       X
Coll. Own.               X       X                X
Pair Prog.               X       X                X
Code Std                         X                X




This table shows a cross-relation between XP and Waterfall models. It shows which
XP activities are supporting which Waterfall activities.
We see here that all Waterfall activities are well covered, even though XP does
them in a different way.
Note that testing is an essential activity throughout XP.




                                                                                         8
User Stories
         • Describes One Feature
         • Communication between Customer and
           Developer
         • Basis for Planning
         • Basis for Test Cases




User Stories are very important in XP.


•User stories replace system and function specifications.
•When requirements are changed, this is reflected in a small number of user stories.
•US are simple enough to be understood by customers and developers.
•Each US is estimated and assigned to an iteration and a responsible developer.
•Test cases are derived from US.


US are organised on small cards so that they can be shuffled around during
planning. This is simpler than using project planning software such as Microsoft Project.
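As an illustration of how story cards feed the planning game, here is a minimal sketch. The `UserStory` record, the story titles, and the greedy selection rule are all assumptions made for the example, not part of XP's definition.

```python
from dataclasses import dataclass

# Hypothetical story cards: title, developer estimate, customer priority.
@dataclass
class UserStory:
    title: str
    estimate_days: int
    priority: int  # 1 = most valuable to the customer

def plan_iteration(stories, capacity_days):
    """Sketch of the planning game: take the customer's highest-priority
    stories until the iteration's capacity is used up."""
    chosen, remaining = [], capacity_days
    for story in sorted(stories, key=lambda s: s.priority):
        if story.estimate_days <= remaining:
            chosen.append(story)
            remaining -= story.estimate_days
    return chosen

stories = [
    UserStory("Show account balance", 3, 1),
    UserStory("List recent transactions", 5, 2),
    UserStory("Transfer between accounts", 8, 3),
]
plan = plan_iteration(stories, capacity_days=10)
assert [s.title for s in plan] == ["Show account balance",
                                   "List recent transactions"]
```

Stories that do not fit (here, the 8-day transfer story) simply go back in the pile for the next iteration's replanning.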




                                                                                       9
Iterations
          • Prioritisation of User Stories
          • Implement a set of User Stories
          • Release




An iteration typically lasts 2-4 weeks. Customers and developers decide together
which user stories are included in each iteration, and the plan is revised at the
beginning of every iteration.
An iteration encompasses all activities of a project cycle. We can do this because
we have a simple design and automated testing. We integrate and test continuously,
so we know the system works. Packaging is also automated to speed up the iteration.
Integration and testing are not listed separately on this slide as they are ongoing
activities.




                                                                                     10
Testing
          • Write tests before implementation
          • Test execution as soon as first feature
            implemented
          • All tests are re-run before accepting new
            features
          • Automating tests




Test cases are written before development starts on a User Story, to keep the test
cases as independent as possible from knowledge of the design.
Test cases exist both at system level and at unit level.
Tests are automated as much as possible, as they are executed very frequently. They
are run not only at the end of each iteration before release, but also before each
implemented feature is accepted as working and not breaking any other feature.
Tests are used to drive development. They are the metric that says the developer
has done his job, and management uses this metric to see progress.
Developers are also helped by tests. They know clearly when their job is done. No
fuss. There is no risk of developers getting stuck fine-tuning code that fulfils no
purpose, as there are no test cases for such work. The developer is confident that his
new feature does not break anything, because this is proven by testing, and feels
happy because he can see his achievements clearly.
The customer is co-responsible for tests. Any bug report that comes in is seen as a
failure to define a test case. There are no bugs (failed test cases) in XP projects,
only missing test cases.
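"Only missing test cases" translates into a concrete habit: a bug report is first turned into a failing test, and only then is the code fixed, so the test suite grows with every report. A tiny sketch, with a hypothetical `format_amount` function:

```python
# Hypothetical bug report: amounts were shown without two decimals.
# Step 1 was to capture the report as a test; step 2 was the fix below.

def format_amount(value):
    # The fix: always render exactly two decimal places.
    return f"{value:.2f}"

# Regression test derived from the bug report; it stays in the suite
# so the bug can never silently return.
def test_amount_always_has_two_decimals():
    assert format_amount(5) == "5.00"
    assert format_amount(3.1) == "3.10"

test_amount_always_has_two_decimals()
```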




                                                                                          11
Testers Role
         •   Better test writer than developers
         •   Better test writer than customers
         •   Knows test tools
         •   Test automation
         •   Test execution




•Developers sometimes have difficulty distancing themselves from their system
and seeing things from a user’s perspective.
•Customers sometimes don’t know all the error conditions that should be
considered.
•Testers know test tools that neither developers nor customers have the resources to learn.
•Automation is time consuming, using test tools and frameworks.
•Testers execute manual tests that cannot be automated.
•Exploratory testing etc. finds bugs/missing test cases and misunderstood
requirements.




                                                                                  12
Example Project




Online banking is a service many of us know. It is a kind of application that evolves
over time, perfect for dividing into iterations and releasing often to customers.
The Metaphor here is a set of accounts that the user can work with. The paged
nature of the web is also part of the metaphor.
We need a “dummy customer” to act on behalf of all anonymous users.
First iteration: show this page. This page shows the features that have the most value
to the customer. (Login is often seen as the first feature, but it has no real value.) Get
feedback on look and feel: the correct logo and colours, a logically ordered menu of
commands, information that is easy to view on screen.
Early feedback may have suggested stripes in the transaction list to make it easier to
follow the lines.
We test against a snapshot database, because we don’t have database support yet.
Note that this first iteration is not a publicly available release. We need a working
database and login etc. to have a satisfactory solution.
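Testing against a snapshot database can be sketched as follows. `SnapshotStore` and `render_transaction_list` are invented names for the example; the idea is just that page logic depends only on a small store interface, which fixed snapshot data can satisfy before real database support exists.

```python
# Hypothetical stand-in for the database we don't have yet.
class SnapshotStore:
    def __init__(self, rows):
        self._rows = rows

    def transactions(self, account):
        return [r for r in self._rows if r["account"] == account]

# Page logic under test; it only sees the store interface, so the
# snapshot can later be swapped for a real database unchanged.
def render_transaction_list(store, account):
    return [f"{r['date']} {r['amount']:>8.2f}"
            for r in store.transactions(account)]

snapshot = SnapshotStore([
    {"account": "A1", "date": "2003-01-02", "amount": -42.50},
    {"account": "A1", "date": "2003-01-05", "amount": 1200.00},
    {"account": "B7", "date": "2003-01-03", "amount": 10.00},
])

assert render_transaction_list(snapshot, "A1") == [
    "2003-01-02   -42.50",
    "2003-01-05  1200.00",
]
```

The same tests keep working once a real store replaces the snapshot, which is what lets testing start in the very first iteration.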




                                                                                          13
Deploying XP in QA C++
          QA C++, A static analysis tool for C++
                   source code.
          • Core analyser team 3-5 people.
          • Had been running for 2 years when XP was
            adopted.
          • Introduced XP during following 4 years.
          • 400 KLOC C++ code.



The core analyser was easy to define tests for. Most test cases consisted of source
code that the tool should parse and the expected warning messages.
No dedicated tester. Instead a tiered automated test suite was built. The full suite
was run every night (8 hours). A limited set of the test suite had to be run before
code was allowed to be checked in.
The test suite consisted of:
•1000 selected code samples to demonstrate new features.
•800 code samples from bug reports.
•A compiler test suite with 14,000 test cases was bought and integrated into the test
suite.
•A number of customer projects we managed to get our hands on.
•The source code for QA C++ itself.
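The tiered arrangement described above can be sketched as a configuration. The tier names and test-group names here are illustrative, not the actual QA C++ setup:

```python
# Illustrative tiers: a quick gate before check-in, the full suite nightly.
SUITES = {
    "pre_checkin": ["selected_samples"],
    "nightly": ["selected_samples", "bug_report_samples",
                "compiler_suite", "customer_projects", "self_parse"],
}

def select_tests(tier):
    """Return the test groups to run for a given tier."""
    if tier not in SUITES:
        raise ValueError(f"unknown tier: {tier}")
    return SUITES[tier]

# The pre-checkin gate is a strict subset of the nightly run, so nothing
# checked in can escape the full suite for more than one night.
assert set(select_tests("pre_checkin")) <= set(select_tests("nightly"))
```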




                                                                                       14
Lessons Learned from QA C++
          - Too small team with very specialised roles
            made pair programming difficult.
          - Difficult to make management co-operate
            and find a customer.
          + Core parser was rewritten four times.
          + Never late production releases.
          - Slow feedback.



Building a C++ parser is a daunting task. Adding analysis on top not only requires an
excellent understanding of the C++ language, but also an understanding of how
the tool is used and what needs customers have.
We had done sporadic snapshot releases before. These took too much time, as
some files were often missing from a release. We improved the integration build
scripts to be able to deliver complete releases frequently.
Finding a customer was impossible. Customers saw QA C++ as just another tool in
their toolbox; no customer saw it as a business-critical tool. Later we had a resource
in the sales support team dedicated to QA C++ who acted as a proxy customer. This
was a huge improvement, as this person knew very well what the customer wanted.


A good test suite was already in place, but had to be improved to make it easy to run
and view the results.
Thanks to the automated test suite we dared to rewrite major parts of the tool when
the design had to be enhanced to cope with new features. The test suite was
comprehensive enough to tell us if the new code was behaving as well as the old
code did.
We had an excellent track record of releasing on time, with very few bugs.
Although we managed to send out a new release every month, the people on the
beta list were too busy to install new versions this often. Our Japanese distributor
wanted 6 months to make sure that the new version was as good as the previous
one, that all their customisations worked properly, and to translate it.

                                                                                        15
Adopted Practices in QA C++

         •   Onsite Customer
         •   Planning Game
         •   Metaphor
         •   Simple Design
         •   Short Releases
         •   Testing
         •   Refactoring
         •   40h Week
         •   Pair Programming
         •   Coding Standards
         •   Collective Ownership
         •   Continuous Integration




We did not have a customer to ask for details and priorities from the start. Instead
we indicated to management what the next versions were going to include. Later we
had a resource in the sales support team dedicated to QA C++ who acted as a proxy
customer.
We knew what to do: the C++ standard (700 pages) was our ultimate requirement.
It was easy to create user stories and test cases from it. However, customers
required support for various compiler extensions.
Planning suffered from a lack of understanding of what features were needed in the
market. We wanted to build a proper C++ parser, but most customers were using
Visual C++.
Pair programming was difficult as the team was small and very specialised. No one
had the capacity to know more than one area of the tool. We did occasional pair
programming to try it, always with positive results: the resulting code had
outstanding quality. But it was exhausting to get introduced to someone else’s
problem domain while maintaining your own.
Coding style consistency suffered due to the lack of pair programming. We used our
own tool to enforce some coding guidelines, but you can never cover everything in a
coding standard.
It was easy to introduce testing as a test suite was already in place.
Everyone in the team thought XP was cool.



                                                                                       16
Agile Development
         • A collection of lightweight methodologies
         • Patterns for design and organisation
             – Agile Software Development
               Alistair Cockburn, ISBN 0-201-66969-9
             – http://www.agilealliance.org




“Agile methods are people-oriented rather than process-oriented. They rely on
people’s expertise, competency and direct collaboration rather than on rigorous,
document-centric processes to produce high-quality software.”
“Travel light. Just keep the models needed. The less models, the less to update.”
XP is one of many agile methodologies.




                                                                                    17
Resources
• Books:
  – Extreme Programming Explained: Embrace Change
    Kent Beck, ISBN 0-201-61641-6
  – Test-Driven Development, By Example
    Kent Beck, ISBN 0-321-14653-0
  – Testing Extreme Programming
    Lisa Crispin, ISBN 0-321-11355-1




                                                    18
Resources
• Web links:
  – http://www.extremeprogramming.org
  – http://www.xprogramming.com
  – http://www.stickyminds.com/sitewide.asp?sid=2301486&sqry=%2AJ%28MIXED%29%2AR%28createdate%29%2AK%28simplesite%29%2AF%28Lisa+Crispin%29%2A&sidx=6&sopp=10&ObjectId=5045&Function=DETAILBROWSE&ObjectType=MAGAZINE




                                                                       19

Whittaker How To Break Software Security - SoftTest IrelandWhittaker How To Break Software Security - SoftTest Ireland
Whittaker How To Break Software Security - SoftTest Ireland
 
David Parnas - Documentation Based Software Testing - SoftTest Ireland
David Parnas - Documentation Based Software Testing - SoftTest IrelandDavid Parnas - Documentation Based Software Testing - SoftTest Ireland
David Parnas - Documentation Based Software Testing - SoftTest Ireland
 
James Lyndsay - Testing in an agile environment
James Lyndsay - Testing in an agile environmentJames Lyndsay - Testing in an agile environment
James Lyndsay - Testing in an agile environment
 
Neil Tompson - SoftTest Ireland
Neil Tompson - SoftTest IrelandNeil Tompson - SoftTest Ireland
Neil Tompson - SoftTest Ireland
 
Neil Thompson - Thinking tools: from top motors, through software process imp...
Neil Thompson - Thinking tools: from top motors, through software process imp...Neil Thompson - Thinking tools: from top motors, through software process imp...
Neil Thompson - Thinking tools: from top motors, through software process imp...
 
Tester's are doing it for themselves - Julie Gardiner - SoftTest Ireland
Tester's are doing it for themselves - Julie Gardiner - SoftTest IrelandTester's are doing it for themselves - Julie Gardiner - SoftTest Ireland
Tester's are doing it for themselves - Julie Gardiner - SoftTest Ireland
 
Test Automation: A Roadmap For Sucesss
Test Automation: A Roadmap For SucesssTest Automation: A Roadmap For Sucesss
Test Automation: A Roadmap For Sucesss
 

Sven Rosvall - Extreme Programming - SoftTest Ireland

  • 1. Extreme Programming
Sven Rosvall
sven-e@lysator.liu.se

Extreme Programming (XP) is a rebellious new development methodology that has got a lot of press recently. But is it as "extreme" as the name indicates? And what makes it so "extreme"?
This talk aims to give a quick introduction to XP and what is special about it. XP has a strong focus on human communication, simplicity and testing. There is plenty of work for testers in an XP environment. We will see how testing can drive the whole development process. Testers are now in charge instead of being handed the end result from developers during the hectic final stages of the project.
  • 2. Values of XP
• Communication
• Simplicity
• Feedback
• Courage

•Communication between team members, management and customers.
•Do the simplest thing that could possibly work. Don't do anything more. Do only what you need today. Tomorrow's complexity will be dealt with tomorrow.
•Feedback on the state of the system (testing). Feedback from customers. Feedback on project progress.
•Courage to make radical, needed changes.
These values go hand in hand:
•Communication spreads courage to the whole team.
•It is easier to be courageous with a simple system.
•Feedback gives courage because you know how the system performs.
  • 3. Why is XP popular?
Waterfall model doesn't work:
• Relies on getting specifications right
• Customer changes his mind
• New technology
• Late system testing
• Poor visibility of progress

•It is impossible to get specifications right before you have a complete system.
•Customers always change their mind. They might not know in advance what they really need. Competitors may have added features.
•The world changes. New technology is introduced. It is not only the new tools and operating systems our project is based on; there may be smaller upgrades with subtle changes that may break or improve our system.
•System testing cannot start until we have a system. All pieces must be integrated. Customers can only get a feel for how it is to work with the system at this point.
•Progress reports say "Analysis Phase complete". The customer says "Good, but what does Analysis Phase really mean?" Think of the Emperor's new clothes.
A lot has changed since the Waterfall model first saw the light. Many methodologies have improved things, but not enough.
  • 4. What is XP?
• Customer focused
• Iterative, frequent releases
• Test intensive (testing first)
• Lightweight software development methodology

•The customer pays us. Give him respect and keep him happy. Keep him informed on progress. Show early releases. Let the customer participate in planning and design.
•XP is the first truly iterative development methodology I have seen that does not compromise quality. A system is delivered early and frequently to give visibility into progress and what is developed.
•Testing is a crucial practice in XP, automated as much as possible.
•Test cases define what is to be developed and communicate functionality better than specifications, both to developers and to customers. It is easier to verify that a test case is satisfied than a requirement in a specification document.
•Testing drives development. No development task is complete without passing its tests.
•Testing assures that no code change breaks anything else.
•Testing can start early because we have a system ready from the start.
•XP is a lightweight methodology in that thick documentation is avoided and replaced by other means. XP is ideal for 5-15 people in a project. Bigger projects can often be broken down into manageable sub-projects.
  • 5. When not to do XP
• Company culture
• Expensive integration/testing
• Big projects
• Uninterested customer
• Required up-front design

XP requires communication. There may be "heroes" who want to be seen as indispensable experts and are afraid of sharing their knowledge. Some companies value their workers by how many hours they work, not by what they do. There are also many managers who think meetings are a waste of useful time and want people to work instead of talking to each other.
Integration and regression testing can be very expensive. Not every developer will have access to his own mainframe or moon-lander. But centralised builds on a dedicated test machine, with test reports made available to all team members, can often be an acceptable replacement. Simulators at various scales can also do a great job.
If integration or regression testing takes too long, it takes too long to get feedback on recent changes. This can sometimes be solved by breaking the system down into manageable pieces that can be integrated and tested on their own. Regular large-scale integration and testing will catch any problems at this level reasonably fast. If Quality Assurance takes two months before releasing, this slows down the feedback loop badly. Instead, QA should trust the automatic testing being performed.
Big projects cause problems with communication. This can be solved by dividing the project into smaller pieces. One group delivers a module to another, who act like customers.
The customer may not be interested in giving feedback and prioritising the development. He may just want something delivered and installed. You will have problems with this kind of customer in waterfall projects too.
There may exist a requirement for up-front design, maybe because of tradition, maybe because management or customers want to see that the developers understand the problems in the project.
There are still useful practices in XP that can be used in any project.
  • 6. XP Practices
Onsite Customer, Planning Game, Simple Design, Metaphor, Short Releases, 40h Week, Refactoring, Testing, Pair Programming, Coding Standards, Continuous Integration, Collective Ownership

•Customer on site to ask for clarifications, planning, testing, etc.
•Everybody in the team uses the same Metaphor (mental model).
•Short Releases so that new features can be tested quickly. Typically every 2-4 weeks.
•Testing can start immediately because we have a (small) system after the first release (i.e. 2-4 weeks).
•Simple Design. Don't do more than what is needed now. There is no need for up-front complex design of things that might never be used. It is easier to change a simple design than a complex design when the customer changes his mind.
•Refactoring is required when the Simple Design must be improved. We may have gained experience on how to design after having tested the feature with the simple design. The refactored design can be verified with the test suite, which has been proven before.
•Pair Programming. Two brains work much more efficiently than two individuals. They inspire and review each other at all times. Tests are also designed in pairs.
•Collective Ownership. Don't rely on individuals to help you out. Do all changes you need for your feature yourself (in pairs), in any module.
•Continuous Integration. Integrate your new feature and test it now. Testing is a development tool; it even drives development.
  • 7. XP Practices
Onsite Customer, Planning Game, Simple Design, Metaphor, Short Releases, 40h Week, Refactoring, Testing, Pair Programming, Coding Standards, Continuous Integration, Collective Ownership

All practices work together:
•We can plan short releases because the customer is with us defining what is needed now, the customer has seen a working system, and he agrees on the system metaphor.
•Anyone on the team can add code to refactor or add a new feature because we have a common coding standard and a simple design, work in pairs, and have a test suite.
•We can refactor with confidence because we have a test suite to verify that we haven't broken anything.
•We can test frequently because we continually integrate small increments.
•We stick to a simple design because we only need to implement what new test cases tell us to implement.
  • 8. XP vs. Waterfall
Waterfall activities: Req. | Analysis | Impl. | Integr. | Review | Test | Delivery
Customer: X X X X X
Planning: X X X
Metaphor: X X X X X
Simple Design: X X
Refactoring: X X
Cont. Int.: X X X X
Testing: X X X X X X
Short Rel.: X X X
Coll. Own.: X X X
Pair Prog.: X X X
Code Std: X X

This table shows a cross-relation between the XP and Waterfall models: which XP practices support which Waterfall activities. We see here that all Waterfall activities are well covered, even though XP does them in a different way. Note that testing is an essential activity throughout XP.
  • 9. User Stories
• Describes one feature
• Communication between customer and developer
• Basis for planning
• Basis for test cases

User Stories are very important in XP.
•User stories replace system and function specifications.
•When requirements are changed, this is reflected in a small number of user stories.
•User stories are simple enough to be understood by customers and developers.
•Each user story is estimated and assigned to an iteration and a responsible developer.
•Test cases are derived from user stories.
User stories are organised on small cards so that they can be shuffled around during planning. This is simpler than using project planning software such as Microsoft Project.
  • 10. Iterations
• Prioritisation of user stories
• Implement a set of user stories
• Release

An iteration typically lasts 2-4 weeks. Customers and developers decide together which user stories are included in each iteration. Replanning happens at the beginning of every iteration.
An iteration encompasses all activities of a project cycle. We can do this because we have a simple design and automated testing. We are integrating and testing continuously, so we know the system works. Packaging will also be automated to speed up the iteration. Integration and testing are not listed separately on this slide, as they are ongoing activities.
  • 11. Testing
• Write tests before implementation
• Test execution as soon as the first feature is implemented
• All tests are re-run before accepting new features
• Automating tests

Test cases are written before development starts for a user story, so that the test cases are as independent as possible from knowledge of the design. Test cases exist both at system level and at unit level.
Tests are automated as much as possible, because tests are executed very frequently. They are not only run at the end of each iteration before release, but also before each implemented feature is accepted as working and not breaking any other feature.
Tests are used to drive development. They are the metric that says the developer has done his job. This metric is used by management to see progress. Developers are also helped by tests: they know clearly when their job is done. No fuss. There is no risk of developers getting stuck fine-tuning code that fulfils no purpose, as there are no test cases for such work. The developer is confident that his new feature does not break anything, as this is proven by testing. The developer feels happy because he can see his achievements clearly.
The customer is co-responsible for tests. Any bug report coming in is seen as a failure to define a test case. There are no bugs (failed test cases) in XP projects, only missing test cases.
  • 12. The Tester's Role
• Better test writer than developers
• Better test writer than customers
• Knows test tools
• Test automation
• Test execution

•Developers sometimes have difficulties distancing themselves from their system and seeing things from a user's perspective.
•Customers sometimes don't know all the error conditions that should be considered.
•Testers know test tools that neither developers nor customers have the resources to learn.
•Automation is time consuming, using test tools and frameworks.
•Testers execute manual tests that cannot be automated.
•Exploratory testing, etc., finds any bugs/missing test cases and misunderstood requirements.
  • 13. Example Project
Online banking is a service many of us know. It is a kind of application that evolves over time, perfect for dividing into iterations and releasing often to customers.
The Metaphor here is a set of accounts that the user can work with. The paged nature of the web is also part of the metaphor. We need a "dummy customer" to act on behalf of all anonymous users.
First iteration: show this page. This page shows the features that have the most value to the customer. (Login is often seen as the first feature, but it has no real value.) Get feedback on look and feel: correct logo and colours, menu of commands logically ordered, information easy to view on screen. Early feedback may have suggested stripes in the transaction list to make the lines easier to follow.
Testing is done against a snapshot database, because we don't have database support yet.
Note that this first iteration is not a publicly available release. We need a working database, login, etc. to have a satisfactory solution.
  • 14. Deploying XP in QA C++
QA C++, a static analysis tool for C++ source code.
• Core analyser team of 3-5 people.
• Had been running for 2 years when XP was adopted.
• Introduced XP during the following 4 years.
• 400 KLOC of C++ code.

The core analyser was easy to define tests for. Most test cases consisted of source code that the tool should parse, plus the expected warning messages.
There was no dedicated tester. Instead, a tiered automated test suite was built. The full suite was run every night (8 hours). A limited set of the test suite had to be run before code was allowed to be checked in.
The test suite consisted of:
•1000 selected code samples to demonstrate new features.
•800 code samples from bug reports.
•A compiler test suite with 14 000 test cases, which was bought and integrated into the suite.
•A number of customer projects we managed to get our hands on.
•The source code for QA C++ itself.
  • 15. Lessons Learned from QA C++
- Too small a team with very specialised roles made pair programming difficult.
- Difficult to make management co-operate and find a customer.
+ Core parser was rewritten four times.
+ Never late production releases.
- Slow feedback.

Building a C++ parser is a daunting task. Adding analysis to this not only requires excellent understanding of the C++ language, but also understanding of how the tool is used and what needs customers have.
We had done sporadic snapshot releases before. These were taking too much time, as some files were missing in the release. We improved the integration build scripts to be able to deliver complete releases frequently.
Finding a customer was impossible. Customers saw QA C++ as just another tool in their toolbox; no customer saw QA C++ as a business-critical tool. Later we had a resource in the sales support team dedicated to QA C++ who acted as a proxy customer. This was a huge improvement, as this person knew very well what the customer wanted.
A good test suite was already in place, but it had to be improved to make it easy to run and to view the results. Thanks to the automated test suite, we dared to rewrite major parts of the tool when the design had to be enhanced to cope with new features. The test suite was comprehensive enough to tell us whether the new code behaved as well as the old code did.
We had an excellent track record of releasing on time, with very few bugs. Although we managed to send out a new release every month, the people on the beta list were too busy to install new versions this often. Our Japanese distributor wanted 6 months to make sure that the new version was as good as the previous one, that all their customisations worked properly, and to translate it.
  • 16. Adopted Practices in QA C++
Onsite Customer, Planning Game, Simple Design, Metaphor, Short Releases, 40h Week, Refactoring, Testing, Pair Programming, Coding Standards, Continuous Integration, Collective Ownership

We did not have a customer to ask for details and priorities from the start. Instead, we indicated to management what the next versions were going to include. Later we had a resource in the sales support team dedicated to QA C++ who acted as a proxy customer.
We knew what to do; the C++ standard (700 pages) was our ultimate requirement. It was easy to create user stories and test cases from it. However, customers required support for various compiler extensions. Planning suffered from a lack of understanding of what features were needed in the market. We wanted to build a proper C++ parser, but most customers were using Visual C++.
Pair programming was difficult, as the team was small and very specialised. No-one had the capacity to know more than one area of the tool. We did occasional pair programming to try it. This always had positive results: the resulting code had outstanding quality. But it was exhausting to get introduced into someone else's problem domain while maintaining your own.
Coding style consistency suffered due to the lack of pair programming. We used our own tool to enforce some coding guidelines. You can never cover everything in a coding standard.
It was easy to introduce testing, as a test suite was already in place. Everyone in the team thought XP was cool.
  • 17. Agile Development
• A collection of lightweight methodologies
• Patterns for design and organisation
– Agile Software Development, Alistair Cockburn, ISBN 0-201-66969-9
– http://www.agilealliance.org

"Agile methods are people-oriented rather than process-oriented. They rely on people's expertise, competency and direct collaboration rather than on rigorous, document-centric processes to produce high-quality software."
"Travel light. Just keep the models needed. The less models, the less to update."
XP is one of many agile methodologies.
  • 18. Resources
• Books:
– Extreme Programming Explained: Embrace Change, Kent Beck, ISBN 0-201-61641-6
– Test-Driven Development: By Example, Kent Beck, ISBN 0-321-14653-0
– Testing Extreme Programming, Lisa Crispin, ISBN 0-321-11355-1
  • 19. Resources
• Web links:
– http://www.extremeprogramming.org
– http://www.xprogramming.com
– http://www.stickyminds.com/sitewide.asp?sid=2301486&sqry=%2AJ%28MIXED%29%2AR%28createdate%29%2AK%28simplesite%29%2AF%28Lisa+Crispin%29%2A&sidx=6&sopp=10&ObjectId=5045&Function=DETAILBROWSE&ObjectType=MAGAZINE