Unit 9: Web Application Testing

 Testing is the activity conducted to evaluate the quality of a
    product and to improve it by finding errors.




Testing Terminology
 An error is “the difference between a computed, observed, or
  measured value or condition and the true, specified, or
  theoretically correct value or condition” (IEEE standard 610.12-
  1990).
 This “true, specified, or theoretically correct value or condition”
  comes from
    A well-defined requirements model, if available and complete
    An incomplete set of fuzzy and contradictory goals, concerns,
      and expectations of the stakeholders
 A test is a set of test cases for a specific object under test: the
  whole Web application, components of a Web application, a
  system that runs a Web application, etc.
 A single test case describes a set of inputs, execution conditions,
  and expected results, which are used to test a specific aspect of the
  object under test
Testing [and] Quality
 Testing should address compliance not only with functional
  requirements but also with quality requirements, i.e., the kinds of
  quality characteristics expected by stakeholders.
 ISO/IEC 9126-1 [Software] Quality Model: functionality, reliability,
  usability, efficiency, maintainability, and portability.
Goals of Testing
 The main goal of testing is to find errors, not to prove their
  absence
 A test run is successful if errors are detected. Otherwise, it is
    unsuccessful and “a waste of time”.
 Testing should adopt a risk-based approach:
       Test first and with the greatest effort those critical parts of an
        application where the most dangerous errors are still
        undetected
       A further aim of testing is to bring risks to light, not simply to
        demonstrate conformance to stated requirements.
 Test as early as possible in the project: errors introduced in early
  development phases are harder to localize and more expensive to fix
  in later phases.
Test Levels (1/2)
 Unit tests
       Test the smallest testable units (classes, Web pages, etc.)
        independently of one another.
       Performed by the developer during implementation.

 Integration tests
       Evaluate the interaction between distinct and separately tested
        units once they have been integrated.
       Performed by a tester, a developer, or both jointly.

 System tests
       Test the complete, integrated system.
       Typically performed by a specialized test team.




Test Levels (2/2)
 Acceptance tests
         Evaluate the system with the client in a “realistic”
          environment, i.e., with real conditions and real data.
 Beta tests
       Let friendly users work with early versions of a product to get
        early feedback.
       Beta tests are unsystematic tests which rely on the number and
        “malevolence” of potential users.




Fitting Testing in the Development Process




 Planning: Defines the quality goals, the general testing strategy, the test
  plans for all test levels, the metrics and measuring methods, and the test
  environment.
 Preparing: Involves selecting the testing techniques and tools and
  specifying the test cases (including the test data).
 Performing: Prepares the test infrastructure, runs the test cases, and then
  documents and evaluates the results.
 Reporting: Summarizes the test results and produces the test reports.
Web Testing: A Road Map
 The road map spans test types from user-facing to technology-facing
  concerns: Content Testing, Interface Testing, Usability Testing,
  Navigation Testing, Component Testing, Configuration Testing,
  Performance Testing, and Security Testing.
Usability
 Usability is a quality attribute that assesses how easy user
  interfaces are to use. The term also refers to methods for improving
  ease of use during the design process.
 Usability is defined by five quality components:
         Learnability: How easy is it for users to accomplish basic tasks
          the first time they encounter the design?
         Efficiency: Once users have learned the design, how quickly
          can they perform tasks?
         Memorability: When users return to the design after a period
          of not using it, how easily can they reestablish proficiency?
         Errors: How many errors do users make, how severe are these
          errors, and how easily can they recover from the errors?
         Satisfaction: How pleasant is it to use the design?
Why Usability Matters*
 62% of web shoppers gave up looking for an item. (Zona
    study)
 50% of web sales are lost because visitors can’t easily find
    content. (Gartner Group)
 40% of repeat visitors do not return due to a negative
    experience. (Zona study)
 85% of visitors abandon a new site due to poor design.
    (cPulse)
 Only 51% of sites complied with simple web usability
    principles. (Forrester study of 20 major sites)

(*) data from www.usabilitynet.org/management/c_cost.htm

Why People Fail
Usability problems, weighted by how frequently they caused users to fail a task [NL06]:
 Search
 Findability (IA, category names, navigation, links)
 Page design (readability, layout, graphics, amateur look, scrolling)
 Information (content, product info, corporate info, prices)
 Task support (workflow, privacy, forms, comparison, inflexibility)
 Fancy design (multimedia, Back button, PDF/printing, new windows, sound)
 Other (bugs, presence on the Web, ads, new site, metaphors)
Top Ten (Usability) Mistakes in Web Design

1.    Bad search
2.    PDF files for online reading
3.    Not changing the color of visited links
4.    Non-scannable text
5.    Fixed font size
6.    Page titles with low search engine visibility
7.    Anything that looks like an advertisement
8.    Violating design conventions
9.    Opening new browser windows
10. Not answering users' questions

Assessing Usability
 Two major types of assessment methods:
         Usability evaluations:
               Evaluators and no users
                Techniques: surveys/questionnaires, observational
                 evaluations, guideline-based reviews, cognitive
                 walkthroughs, expert reviews, heuristic evaluations
         Usability tests: focus on users working with the product
 Usability testing is the only way to know if the Web site
    actually has problems that keep people from having a
    successful and satisfying experience.




Usability Testing
 Usability testing is a methodology that employs potential
    users to evaluate the degree to which a website/software
    meets predefined usability criteria.
 Basic Process:
      1. Watch Customers
      2. They Perform Tasks
      3. Note Their Problems
      4. Make Recommendations
      5. Iterate




Measures of Usability
 Effectiveness (Ability to successfully accomplish tasks)
       Percentage of goals/tasks achieved (success rate)
       Number of errors

 Efficiency (Ability to accomplish tasks with speed and ease)
       Time to complete a task
       Frequency of requests for help
       Number of times facilitator provides assistance
       Number of times user gives up




Measures of Usability (cont.)
 Satisfaction (Pleasing to users)
         Positive and negative ratings on a satisfaction scale
         Percent of favorable comments to unfavorable comments
         Number of good vs. bad features recalled after test
         Number of users who would use the system again
         Number of times users express dissatisfaction or frustration
 Learnability (Ability to learn how to use site and remember it)
       Ratio of successes to failures
       Number of features that can be recalled after the test




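These measures are straightforward to compute mechanically once test sessions are logged. Below is a minimal sketch, not part of the original slides, that derives the effectiveness, efficiency, and satisfaction numbers above from per-task session records; the record fields and the 1-5 rating scale are illustrative assumptions.

# Illustrative computation of the usability measures above from logged
# session data. The record layout (one dict per task attempt) is an
# assumption for this sketch, not a prescribed format.

from statistics import mean

sessions = [
    {"participant": "P1", "task": "book_cabin", "completed": True,
     "seconds": 210, "errors": 1, "help_requests": 0, "rating": 4},
    {"participant": "P2", "task": "book_cabin", "completed": False,
     "seconds": 420, "errors": 3, "help_requests": 2, "rating": 2},
    {"participant": "P3", "task": "book_cabin", "completed": True,
     "seconds": 180, "errors": 0, "help_requests": 0, "rating": 5},
]

# Effectiveness: success rate and total error count.
success_rate = mean(1 if s["completed"] else 0 for s in sessions)
total_errors = sum(s["errors"] for s in sessions)

# Efficiency: mean time per task and frequency of requests for help.
mean_time = mean(s["seconds"] for s in sessions)
help_rate = mean(s["help_requests"] for s in sessions)

# Satisfaction: mean rating on the (assumed) 1-5 scale.
mean_rating = mean(s["rating"] for s in sessions)

print(f"Success rate: {success_rate:.0%}, errors: {total_errors}")
print(f"Mean task time: {mean_time:.0f} s, help requests/participant: {help_rate:.1f}")
print(f"Mean satisfaction: {mean_rating:.1f}/5")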
Usability Testing Roles
 Facilitator:
       Oversees the entire test process.
       Plans, tests, and reports.

 Participant:
       An actual or potential customer.
       Internal stand-ins for users (marketing, designers) should be
        avoided.

 Observer (optional):
       Records events as they occur.
       Limits interaction with the customer.
       Contributes to the report.




Usability Testing Process
Step 1: Planning The Usability Test
         Define what to test
         Define which customers should be tested
         Define what tasks should be tested
         Write usability scenarios and tasks
         Select participants
Step 2: Conducting The Usability Test
       Conduct a test
       Collect data

Step 3: Analyzing and Reporting The Usability Test
       Compile results
       Make recommendations

People – Context – Activities
Step 1: Planning The Usability Test
         Define what to test
               → Activities (Use Cases)
         Define which customers (user profiles) should be tested
               → People (Actors)
         Provide a background for the activities to test
               → Context




Usability Scenarios and Tasks
 Provide the participant with motivation and context to make
    the situation more realistic
 Include several tasks:
       Make the first task simple
       Give a goal, without describing steps

 Set some success criteria, examples:
       N% of test participants will be able to complete x% of tasks in
        the time allotted.
       Participants will be able to complete x% of tasks with no more
        than one error per task.
       N% of test participants will rate the system as highly usable on
        a scale of x to y.


Example of Scenario with Tasks
 Context:
          You want to book a sailing on Royal Caribbean International for
           next June with your church group. The group is called “Saint
           Francis Summer 2010”. The group is selling out fast, so you
           want to book a cabin that is close to an elevator, because
           your leg hurts from a recent injury.
 Tasks to perform:
      1.    Open your browser
      2.    Click the link labeled “Royal Caribbean”
      3.    Tell me the available cabins in the “Saint Francis Summer
            2010” group
      4.    Tell me a cabin number closest to an elevator
      5.    Book the cabin that best suits your needs

Selecting Participants
 Recruit participants
       In-house
       Recruitment firms, databases, conferences

 Match participants with user profiles
 Numbers: how many participants, plus floaters (stand-by backups)

 Schedule test sessions

 Incentives:
       Gift checks ($100 per session)
       Food or gift cards




How Many Test Participants Are Required?
 The number of usability problems found in a usability test
  with n participants is:

                              N(1 - (1 - L)^n)

       N : total number of usability problems in the design
       L : the proportion of usability problems discovered while testing
        a single participant.

 [Chart: share of problems found vs. number of participants, plotted
  for L = 31%]
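The formula is easy to check numerically. This small sketch, added here for illustration, tabulates the predicted share of problems found using Nielsen's average of L = 31%; it reproduces the figures the next slide relies on: about 85% of problems after 5 participants, and nearly 100% around 15.

# Share of usability problems found with n participants, assuming each
# participant independently reveals a fraction L of the problems.
L = 0.31  # Nielsen's average proportion found per participant

for n in range(1, 16):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} participants -> {found:5.1%} of problems found")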
How Many Test Participants Are Required?

 It seems that you need to test with at least 15 participants to
  discover all the usability problems.
 However, it is better to perform 3 tests with 5 participants each
  than to perform one test with 15 participants:
         After the first test with 5 participants has found 85% of the
          usability problems, you will want to fix them in a redesign.
         After creating the new design, you need to test again.
         The second test with 5 users will discover most of the
          remaining 15% of the original usability problems that were not
          found in the first test (and some new ones).
         The new test will be able to uncover structural usability
          problems that were obscured in initial studies as users were
          stumped by surface-level usability problems.
         Fix the new problems, and test …
Usability Labs … Not Necessary
The testing room contains office furniture, video tape equipment, a
microphone, and a computer with appropriate software. The observer side
contains a powerful computer to collect and analyze the usability data.
A one-way mirror separates the rooms.
Test Side-by-Side




Conducting Tests: Facilitator’s Role
 Start with an easy task to build confidence
 Sit beside the person not behind the glass

 Use “think-out-loud” protocol

 Give participants time to think it through

 Offer appropriate encouragement

 Guide participants rather than answering their questions (be an enabler)
 Don’t act knowledgeable (treat them as the experts)

 Don’t get too involved in data collection

 Don’t jump to conclusions

 Don’t solve their problems immediately

Collecting Data
 Performance
       Objective (what actually happened)
       Usually Quantitative
               Time to complete a task
               Time to recover from an error
               Number of errors
               Percentage of tasks completed successfully
               Number of clicks
               Pathway information
 Preference
       Subjective (what participants say and think)
       Usually Qualitative
               Preference of versions
               Suggestions and comments
               Ratings or rankings (can be quantitative)

Report Findings and Recommendations
 Make the report usable for its readers
 Include quantitative data (success rates, times, etc.)

 Avoid words like “few, many, several”. Include counts

 Use quotes

 Use screenshots

 Mention positive findings
 Do not use participant names, use P1, P2, P3, etc.

 Include recommendations

 Make it short



Component Testing
 Focuses on a set of tests that attempt to uncover errors in
    WebApp functions
 Conventional black-box and white-box test case design
    methods can be used at each architectural layer
    (presentation, domain, data access)
 Form data can be exploited systematically to find errors:
         Missing/incomplete data
         Type conversion problems
         Value boundary violations
         Fake data
         Etc.
 Database testing is often an integral part of the component-
    testing regime
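To make the form-data idea concrete, here is a black-box unit-test sketch that systematically feeds missing, badly typed, and boundary values to a server-side validator. The validator validate_age and its 18-120 contract are hypothetical, invented for this example.

# Black-box test cases for a hypothetical form-field validator that
# should accept integer ages in the range 18..120 and reject anything else.
import unittest

def validate_age(raw):
    """Assumed component under test: returns an int or raises an error."""
    age = int(raw)               # type-conversion errors surface here
    if not 18 <= age <= 120:
        raise ValueError("out of range")
    return age

class FormDataTests(unittest.TestCase):
    def test_value_boundaries(self):
        self.assertEqual(validate_age("18"), 18)    # lower bound
        self.assertEqual(validate_age("120"), 120)  # upper bound
        for bad in ("17", "121"):                   # just outside bounds
            with self.assertRaises(ValueError):
                validate_age(bad)

    def test_missing_and_fake_data(self):
        # Missing, non-numeric, non-integer, and out-of-domain inputs.
        for bad in ("", None, "abc", "12.5", "-1"):
            with self.assertRaises((ValueError, TypeError)):
                validate_age(bad)

if __name__ == "__main__":
    unittest.main()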
Configuration Testing: Server-Side Issues
 Is the WebApp fully compatible with the server OS?
 Are system files, directories, and related system data created correctly
  when the WebApp is operational?
 Do system security measures (e.g., firewalls or encryption) allow the
  WebApp to execute and service users without interference or
  performance degradation?
 Has the WebApp been tested with the distributed server configuration (if
  one exists) that has been chosen?
 Is the WebApp properly integrated with database software? Is the
  WebApp sensitive to different versions of database software?
 Do server-side WebApp scripts execute properly?
 Have system administrator errors been examined for their effect on
  WebApp operations?
 If proxy servers are used, have differences in their configuration been
  addressed with on-site testing?
Configuration Testing: Client-Side Issues

 Hardware—CPU, memory, storage and printing devices
 Operating systems—Linux, Macintosh OS, Microsoft
    Windows, a mobile-based OS
 Browser software—Internet Explorer, Mozilla/Netscape,
    Opera, Safari, and others
 User interface components—Active X, Java applets and
    others
 Plug-ins—QuickTime, RealPlayer, and many others

 Connectivity—cable, DSL, regular modem, T1




Security Testing
 Designed to probe vulnerabilities of:
       the client-side environment,
       the network communications that occur as data are passed
        from client to server and back again,
       and the server-side environment.

 On the client side, vulnerabilities can often be traced to pre-existing
  bugs in browsers, e-mail programs, or communication software.
 On the network infrastructure and on the server side (both at host
  level and at WebApp level): review the DSBW Unit on WebApp Security.
Performance Testing: Main Questions
 Does the server response time degrade to a point where it is
  noticeable and unacceptable?
 At what point (in terms of users, transactions, or data loading) does
  performance become unacceptable?
 What system components are responsible for performance
  degradation?
 What is the average response time for users under a variety of
  loading conditions?
 Does performance degradation have an impact on system security?
 Is WebApp reliability or accuracy affected as the load on the
  system grows?
 What happens when loads that are greater than maximum server
  capacity are applied?


Performance Testing: Load Tests
 A load test verifies whether or not the system meets the
    required response times and the required throughput.
 Steps:
      1.   Determine load profiles (what access types, how many visits per day,
           at what peak times, how many visits per session, how many
           transactions per session, etc.) and the transaction mix (which
           functions shall be executed with which percentage).
      2.   Determine the target values for response times and throughput (in
           normal operation and at peak times, for simple or complex accesses,
           with minimum, maximum, and average values).
      3.   Run the tests, generating the workload with the transaction mix
           defined in the load profile, and measure the response times and the
           throughput.
      4.   Evaluate the results and identify potential bottlenecks.


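A minimal sketch of steps 1-3, assuming a placeholder target URL, transaction mix, and user count: a pool of concurrent virtual users replays the mix while response times and throughput are measured. Real load tests would normally use a dedicated tool, but the structure is the same.

# Minimal load test: USERS concurrent virtual users issue requests drawn
# from a transaction mix; response times and throughput are measured.
import random
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

BASE = "http://localhost:8080"                # placeholder target
MIX = [("/", 0.7), ("/search?q=test", 0.3)]   # (path, share of requests)
USERS, REQUESTS_PER_USER = 10, 20             # assumed load profile

def pick_path():
    # Draw a path according to the transaction mix.
    r, acc = random.random(), 0.0
    for path, share in MIX:
        acc += share
        if r <= acc:
            return path
    return MIX[-1][0]

def virtual_user(_):
    times = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(BASE + pick_path(), timeout=10) as resp:
            resp.read()
        times.append(time.perf_counter() - start)
    return times

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=USERS) as pool:
    all_times = [t for ts in pool.map(virtual_user, range(USERS)) for t in ts]
elapsed = time.perf_counter() - start

print(f"avg response: {sum(all_times) / len(all_times):.3f} s")
print(f"throughput:   {len(all_times) / elapsed:.1f} req/s")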
Performance Testing: Stress Tests
 A stress test verifies whether or not the system reacts in a
    controlled way in “stress situations”, which are simulated by
    applying extreme conditions, such as unrealistic overload, or
    heavily fluctuating load.
 The test is aimed at answering the questions:
       Does the server degrade ‘gently’ or does it shut down as
        capacity is exceeded?
       Does server software generate “server not available”
        messages? More generally, are users aware that they cannot
        reach the server?
       Are transactions lost as capacity is exceeded?
       Is data integrity affected as capacity is exceeded?




Performance Testing: Stress Tests (cont.)

       Under what load conditions does the server environment fail? How
        does failure manifest itself? Are automated notifications sent to
        technical support staff at the server site?
       If the system does fail, how long will it take to come back on-
        line?
       Are certain WebApp functions (e.g., compute intensive
        functionality, data streaming capabilities) discontinued as
        capacity reaches the 80 or 90% level?




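To probe these questions, a stress test can ramp the load step by step and watch where errors first appear and how response times degrade. The sketch below extends the load-test sketch above: concurrency increases each round against an assumed target URL, and the error rate is recorded at each level.

# Stress test sketch: ramp up concurrency and record error rate and
# response time at each level to see whether the system degrades "gently".
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

BASE = "http://localhost:8080/"        # placeholder target

def one_request(_):
    # Return (elapsed seconds, error flag) for a single request.
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(BASE, timeout=5) as resp:
            resp.read()
        return time.perf_counter() - start, False
    except Exception:
        return time.perf_counter() - start, True

for users in (10, 50, 100, 200, 400):  # assumed ramp schedule
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(one_request, range(users * 5)))
    times = [t for t, err in results if not err]
    errors = sum(1 for _, err in results if err)
    avg = sum(times) / len(times) if times else float("nan")
    print(f"{users:4d} users: avg {avg:.3f} s, errors {errors}/{len(results)}")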
Performance Testing: Interpreting Graphics
 Load: the number of requests that arrive at the system per time unit.
 Throughput: the number of requests served per time unit.
 SLA: Service Level Agreement.

 [Chart: throughput vs. load, annotated with the SLA]
Test Automation
 Automation can significantly increase the efficiency of testing and
  enables new types of tests, increasing both the scope (e.g.,
  different test objects and quality characteristics) and the depth of
  testing (e.g., large amounts and combinations of input data).
 Test automation brings the following benefits:
     Running automated regression tests on new versions of a
       WebApp makes it possible to detect defects caused by side effects
       on unchanged functionality.
     Various test methods and techniques would be difficult or
       impossible to perform manually; for example, load and stress
       testing require simulating a large number of concurrent users.
     Automation makes it possible to run more tests in less time and,
       thus, to run the tests more often, leading to greater confidence
       in the system under test.
 Web Site Test Tools: http://www.softwareqatest.com/qatweb1.html
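As a small illustration of the automated-regression idea, the sketch below re-checks a set of key pages unattended and exits nonzero on any failure, so it can run on every new version (e.g., from a CI job). The URLs and expected substrings are placeholders; real suites would typically drive a browser with a dedicated tool.

# Automated regression smoke test: re-runnable, unattended check that key
# pages still respond and still contain expected content after a release.
import sys
import urllib.request

# Placeholder URL -> expected-substring pairs for the WebApp under test.
CHECKS = {
    "http://localhost:8080/":       "Welcome",
    "http://localhost:8080/login":  "<form",
    "http://localhost:8080/search": "Search",
}

failures = 0
for url, expected in CHECKS.items():
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.status
            body = resp.read().decode("utf-8", errors="replace")
        ok = status == 200 and expected in body
    except Exception:
        ok = False
    print(("PASS" if ok else "FAIL"), url)
    failures += 0 if ok else 1

sys.exit(1 if failures else 0)  # nonzero exit flags regressions to CI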
References
 R. S. Pressman, D. Lowe: Web Engineering: A Practitioner's
  Approach. McGraw-Hill, 2008. Chapter 15.
 G. Kappel et al.: Web Engineering. John Wiley & Sons,
  2006. Chapter 7.
 [NL06] J. Nielsen, H. Loranger: Prioritizing Web Usability.
  New Riders Publishing, 2006.
 www.useit.com (Jakob Nielsen)
 www.usability.gov

Contenu connexe

Tendances

6 Ways to Measure the ROI of Automated Testing
6 Ways to Measure the ROI of Automated Testing6 Ways to Measure the ROI of Automated Testing
6 Ways to Measure the ROI of Automated TestingSmartBear
 
What is Performance Testing?
What is Performance Testing?What is Performance Testing?
What is Performance Testing?QA InfoTech
 
What is Web Testing?
What is Web Testing?   What is Web Testing?
What is Web Testing? QA InfoTech
 
Agile testing - Testing From Day 1
Agile testing - Testing From Day 1Agile testing - Testing From Day 1
Agile testing - Testing From Day 1Kaizenko
 
발표자료 1인qa로살아남는6가지방법
발표자료 1인qa로살아남는6가지방법발표자료 1인qa로살아남는6가지방법
발표자료 1인qa로살아남는6가지방법SangIn Choung
 
2019 Testim Webinar: Automation Test Strategy and Design for Agile Teams
2019 Testim Webinar: Automation Test Strategy and Design for Agile Teams2019 Testim Webinar: Automation Test Strategy and Design for Agile Teams
2019 Testim Webinar: Automation Test Strategy and Design for Agile TeamsTristanLombard1
 
Non-Functional testing
Non-Functional testingNon-Functional testing
Non-Functional testingKanoah
 
Top ten software testing tools
Top ten software testing toolsTop ten software testing tools
Top ten software testing toolsJanBask Training
 
Manual testing concepts course 1
Manual testing concepts course 1Manual testing concepts course 1
Manual testing concepts course 1Raghu Kiran
 
Introduction to performance testing
Introduction to performance testingIntroduction to performance testing
Introduction to performance testingTharinda Liyanage
 
TESTING STRATEGY.ppt
TESTING STRATEGY.pptTESTING STRATEGY.ppt
TESTING STRATEGY.pptFawazHussain4
 
Performance testing presentation
Performance testing presentationPerformance testing presentation
Performance testing presentationBelatrix Software
 
Testing types functional and nonfunctional - Kati Holasz
Testing types   functional and nonfunctional - Kati HolaszTesting types   functional and nonfunctional - Kati Holasz
Testing types functional and nonfunctional - Kati HolaszHolasz Kati
 
Role Of Qa And Testing In Agile 1225221397167302 8
Role Of Qa And Testing In Agile 1225221397167302 8Role Of Qa And Testing In Agile 1225221397167302 8
Role Of Qa And Testing In Agile 1225221397167302 8a34sharm
 
Test Automation Framework Designs
Test Automation Framework DesignsTest Automation Framework Designs
Test Automation Framework DesignsSauce Labs
 

Tendances (20)

6 Ways to Measure the ROI of Automated Testing
6 Ways to Measure the ROI of Automated Testing6 Ways to Measure the ROI of Automated Testing
6 Ways to Measure the ROI of Automated Testing
 
What is Performance Testing?
What is Performance Testing?What is Performance Testing?
What is Performance Testing?
 
What is Web Testing?
What is Web Testing?   What is Web Testing?
What is Web Testing?
 
Agile testing - Testing From Day 1
Agile testing - Testing From Day 1Agile testing - Testing From Day 1
Agile testing - Testing From Day 1
 
Manual testing
Manual testingManual testing
Manual testing
 
발표자료 1인qa로살아남는6가지방법
발표자료 1인qa로살아남는6가지방법발표자료 1인qa로살아남는6가지방법
발표자료 1인qa로살아남는6가지방법
 
2019 Testim Webinar: Automation Test Strategy and Design for Agile Teams
2019 Testim Webinar: Automation Test Strategy and Design for Agile Teams2019 Testim Webinar: Automation Test Strategy and Design for Agile Teams
2019 Testim Webinar: Automation Test Strategy and Design for Agile Teams
 
SonarQube Overview
SonarQube OverviewSonarQube Overview
SonarQube Overview
 
Agile testing
Agile testingAgile testing
Agile testing
 
Non-Functional testing
Non-Functional testingNon-Functional testing
Non-Functional testing
 
Top ten software testing tools
Top ten software testing toolsTop ten software testing tools
Top ten software testing tools
 
Manual testing concepts course 1
Manual testing concepts course 1Manual testing concepts course 1
Manual testing concepts course 1
 
Introduction to performance testing
Introduction to performance testingIntroduction to performance testing
Introduction to performance testing
 
Software Testing
Software TestingSoftware Testing
Software Testing
 
TESTING STRATEGY.ppt
TESTING STRATEGY.pptTESTING STRATEGY.ppt
TESTING STRATEGY.ppt
 
Performance testing presentation
Performance testing presentationPerformance testing presentation
Performance testing presentation
 
Testing types functional and nonfunctional - Kati Holasz
Testing types   functional and nonfunctional - Kati HolaszTesting types   functional and nonfunctional - Kati Holasz
Testing types functional and nonfunctional - Kati Holasz
 
Testing Services
Testing ServicesTesting Services
Testing Services
 
Role Of Qa And Testing In Agile 1225221397167302 8
Role Of Qa And Testing In Agile 1225221397167302 8Role Of Qa And Testing In Agile 1225221397167302 8
Role Of Qa And Testing In Agile 1225221397167302 8
 
Test Automation Framework Designs
Test Automation Framework DesignsTest Automation Framework Designs
Test Automation Framework Designs
 

En vedette

Testing Web Applications
Testing Web ApplicationsTesting Web Applications
Testing Web ApplicationsSeth McLaughlin
 
Business Process Reengineering Presentation
Business Process Reengineering PresentationBusiness Process Reengineering Presentation
Business Process Reengineering PresentationHira Anwer Khan
 
Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_tomheck
 
Moneran Kingdom
Moneran KingdomMoneran Kingdom
Moneran Kingdomiiiapdst
 
TESTING Checklist
TESTING Checklist TESTING Checklist
TESTING Checklist Febin Chacko
 
Training & development dhanu
Training & development dhanuTraining & development dhanu
Training & development dhanuDhanu P G Naik
 
Open Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java DevelopersOpen Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java Developerscboecking
 
Final Report Business Process Reengineering
Final Report Business Process ReengineeringFinal Report Business Process Reengineering
Final Report Business Process ReengineeringHira Anwer Khan
 
Mobile testing
Mobile testingMobile testing
Mobile testingAlex Hung
 
browser compatibility testing
browser compatibility testingbrowser compatibility testing
browser compatibility testingLakshmi Nandoor
 
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Sauce Labs
 
7 1-1 soap-developers_guide
7 1-1 soap-developers_guide7 1-1 soap-developers_guide
7 1-1 soap-developers_guideNugroho Hermanto
 
Web Application Software Testing
Web Application Software TestingWeb Application Software Testing
Web Application Software TestingAndrew Kandels
 
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Tom Eston
 

En vedette (20)

Testing Web Applications
Testing Web ApplicationsTesting Web Applications
Testing Web Applications
 
Testing web application
Testing web applicationTesting web application
Testing web application
 
Business Process Reengineering Presentation
Business Process Reengineering PresentationBusiness Process Reengineering Presentation
Business Process Reengineering Presentation
 
Group3
Group3Group3
Group3
 
Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_
 
Moneran Kingdom
Moneran KingdomMoneran Kingdom
Moneran Kingdom
 
Unit 06: The Web Application Extension for UML
Unit 06: The Web Application Extension for UMLUnit 06: The Web Application Extension for UML
Unit 06: The Web Application Extension for UML
 
TESTING Checklist
TESTING Checklist TESTING Checklist
TESTING Checklist
 
Unit 05: Physical Architecture Design
Unit 05: Physical Architecture DesignUnit 05: Physical Architecture Design
Unit 05: Physical Architecture Design
 
A perspective on web testing.ppt
A perspective on web testing.pptA perspective on web testing.ppt
A perspective on web testing.ppt
 
Training & development dhanu
Training & development dhanuTraining & development dhanu
Training & development dhanu
 
Open Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java DevelopersOpen Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java Developers
 
Final Report Business Process Reengineering
Final Report Business Process ReengineeringFinal Report Business Process Reengineering
Final Report Business Process Reengineering
 
Mobile testing
Mobile testingMobile testing
Mobile testing
 
Web testing
Web testingWeb testing
Web testing
 
browser compatibility testing
browser compatibility testingbrowser compatibility testing
browser compatibility testing
 
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
 
7 1-1 soap-developers_guide
7 1-1 soap-developers_guide7 1-1 soap-developers_guide
7 1-1 soap-developers_guide
 
Web Application Software Testing
Web Application Software TestingWeb Application Software Testing
Web Application Software Testing
 
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
 

Similaire à Unit 09: Web Application Testing

Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Carles Farré
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Visiontechmeetup
 
[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web TestingCarles Farré
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMijseajournal
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMijseajournal
 
Standards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentStandards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentSameer Chavan
 
MD Tareque Automation
MD Tareque AutomationMD Tareque Automation
MD Tareque AutomationMD Tareque
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Softwaredinasharawi
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Softwaredinasharawi
 
Usabilitydraft
UsabilitydraftUsabilitydraft
UsabilitydraftKimGriggs
 
Richa Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani
 
Aditya Vad_Resume
Aditya Vad_ResumeAditya Vad_Resume
Aditya Vad_ResumeAditya Vad
 

Similaire à Unit 09: Web Application Testing (20)

Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Vision
 
Unit03: Process and Business Models
Unit03: Process and Business ModelsUnit03: Process and Business Models
Unit03: Process and Business Models
 
[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing
 
QA_Resume
QA_ResumeQA_Resume
QA_Resume
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
 
Standards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentStandards Based Approach to User Interface Development
Standards Based Approach to User Interface Development
 
MD Tareque Automation
MD Tareque AutomationMD Tareque Automation
MD Tareque Automation
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Software
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Software
 
Raji_QA
Raji_QARaji_QA
Raji_QA
 
Usabilitydraft
UsabilitydraftUsabilitydraft
Usabilitydraft
 
Kasi Viswanath
Kasi ViswanathKasi Viswanath
Kasi Viswanath
 
Richa Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani-QA Consultant
Richa Rani-QA Consultant
 
Aditya Vad_Resume
Aditya Vad_ResumeAditya Vad_Resume
Aditya Vad_Resume
 
QA_Resume
QA_ResumeQA_Resume
QA_Resume
 
Ooad
OoadOoad
Ooad
 
PARVATHY INDIRA
PARVATHY INDIRAPARVATHY INDIRA
PARVATHY INDIRA
 
195
195195
195
 

Plus de DSBW 2011/2002 - Carles Farré - Barcelona Tech (9)

Unit 08: Security for Web Applications
Unit 08: Security for Web ApplicationsUnit 08: Security for Web Applications
Unit 08: Security for Web Applications
 
Unit 07: Design Patterns and Frameworks (3/3)
Unit 07: Design Patterns and Frameworks (3/3)Unit 07: Design Patterns and Frameworks (3/3)
Unit 07: Design Patterns and Frameworks (3/3)
 
Unit 07: Design Patterns and Frameworks (2/3)
Unit 07: Design Patterns and Frameworks (2/3)Unit 07: Design Patterns and Frameworks (2/3)
Unit 07: Design Patterns and Frameworks (2/3)
 
Unit 07: Design Patterns and Frameworks (1/3)
Unit 07: Design Patterns and Frameworks (1/3)Unit 07: Design Patterns and Frameworks (1/3)
Unit 07: Design Patterns and Frameworks (1/3)
 
Unit 04: From Requirements to the UX Model
Unit 04: From Requirements to the UX ModelUnit 04: From Requirements to the UX Model
Unit 04: From Requirements to the UX Model
 
Unit 02: Web Technologies (2/2)
Unit 02: Web Technologies (2/2)Unit 02: Web Technologies (2/2)
Unit 02: Web Technologies (2/2)
 
Unit 02: Web Technologies (1/2)
Unit 02: Web Technologies (1/2)Unit 02: Web Technologies (1/2)
Unit 02: Web Technologies (1/2)
 
Unit 01 - Introduction
Unit 01 - IntroductionUnit 01 - Introduction
Unit 01 - Introduction
 
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
 

Dernier

What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?Antenna Manufacturer Coco
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUK Journal
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)wesley chun
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxKatpro Technologies
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Scriptwesley chun
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024The Digital Insurer
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfEnterprise Knowledge
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Drew Madelung
 

Dernier (20)

What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Script
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 

Unit 09: Web Application Testing

  • 1. Unit 9: Web Application Testing  Testing is the activity conducted to evaluate the quality of a product and to improve it by finding errors. Testing dsbw 2011/2012 q1 1
  • 2. Testing Terminology  An error is “the difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition” (IEEE standard 610.12- 1990).  This “true, specified, or theoretically correct value or condition” comes from  A well-defined requirements model, if available and complete  An incomplete set of fuzzy and contradictory goals, concerns, and expectations of the stakeholders  A test is a set of test cases for a specific object under test: the whole Web application, components of a Web application, a system that runs a Web application, etc.  A single test case describes a set of inputs, execution conditions, and expected results, which are used to test a specific aspect of the object under test dsbw 2011/2012 q1 2
  • 3. Testing [and] Quality  Testing should address compliance not only to functional requirements but also to quality requirements, i.e., the kinds of quality characteristics expected by stakeholders.  ISO/IEC 9126-1 [Software] Quality Model: dsbw 2011/2012 q1 3
  • 4. Goals of Testing  The main goal of testing is to find errors but not to prove their absence  A test run is successful if errors are detected. Otherwise, it is unsuccessful and “a waste of time”.  Testing should adopt a risk-based approach:  Test first and with the greatest effort those critical parts of an application where the most dangerous errors are still undetected  A further aim of testing is to bring risks to light, not simply to demonstrate conformance to stated requirements.  Test as early as possible at the beginning of a project: errors happened in early development phases are harder to localize and more expensive to fix in later phases. dsbw 2011/2012 q1 4
  • 5. Test Levels (1/2)  Unit tests  Test the smallest testable units (classes, Web pages, etc.) independently of one another.  Performed by the developer during implementation.  Integration tests  Evaluate the interaction between distinct and separately tested units once they have been integrated.  Performed by a tester, a developer, or both jointly.  System tests  Test the complete, integrated system.  Typically performed by a specialized test team. dsbw 2011/2012 q1 5
  • 6. Test Levels (2/2)  Acceptance tests  Evaluate the system with the client in an “realistic” environment, i.e. with real conditions and real data.  Beta tests  Let friendly users work with early versions of a product to get early feedback.  Beta tests are unsystematic tests which rely on the number and “malevolence” of potential users. dsbw 2011/2012 q1 6
  • 7. Fitting Testing in the Development Process  Planning: Defines the quality goals, the general testing strategy, the test plans for all test levels, the metrics and measuring methods, and the test environment.  Preparing: Involves selecting the testing techniques and tools and specifying the test cases (including the test data).  Performing: Prepares the test infrastructure, runs the test cases, and then documents and evaluates the results.  Reporting: Summarizes the test results and produces the test reports. dsbw 2011/2012 q1 7
  • 8. Web Testing: A Road Map Content Interface Testing Testing Usability Testing user Navigation Testing Component Testing Configuration Testing Performance Security technology Testing Testing dsbw 2011/2012 q1 8
  • 9. Usability  Usability is a quality attribute that assesses how easy user interfaces are to use. Also refers to methods for improving ease-of-use during the design process.  Usability is defined by five quality components:  Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?  Efficiency: Once users have learned the design, how quickly can they perform tasks?  Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?  Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?  Satisfaction: How pleasant is it to use the design? dsbw 2011/2012 q1 9
  • 10. Why Usability matters*  62% of web shoppers gave up looking for an item. (Zona study)  50% of web sales are lost because visitors can’t easily find content. (Gartner Group)  40% of repeat visitors do not return due to a negative experience. (Zona study)  85% of visitors abandon a new site due to poor design. (cPulse)  Only 51% of sites complied with simple web usability principles. (Forrester study of 20 major sites) (*) data from www.usabilitynet.org/management/c_cost.htm dsbw 2011/2012 q1 10
  • 11. Why people fail Search Findability (IA, Category names, Navigation, Links) Page design (Readability, Layout, Graphics, Amateur, Scrolling) Information (Content, Product info, Corporate info, Prices) Task support (Workflow, Privacy, Forms, Comparison, Inflexible) Fancy design (Multimedia, Back button, PDF/Printing, New window, Sound) Other (Bugs, Presence on Web, Ads, New site, Usability problems weighted by how frequently they Metaphors) caused users to fail a task [NL06] dsbw 2011/2012 q1 11
  • 12. Top Ten (Usability) Mistakes in Web Design 1. Bad search 2. Pdf files for online reading 3. Not changing the color of visited links 4. Non-scannable text 5. Fixed font size 6. Page titles with low search engine visibility 7. Anything that looks like an advertisement 8. Violating design conventions 9. Opening new browser windows 10. Not answering users' questions dsbw 2011/2012 q1 12
  • 13. Assessing Usability  Two major types of assessing methods:  Usability evaluations:  Evaluators and no users  Techniques: surveys/questionnaires, observational evaluations, guideline based reviews, cognitive walkthroughs, expert reviews, heuristic evaluations  Usability tests: focus on users working with the product  Usability testing is the only way to know if the Web site actually has problems that keep people from having a successful and satisfying experience. dsbw 2011/2012 q1 13
  • 14. Usability Testing  Usability testing is a methodology that employs potential users to evaluate the degree to which a website/software meets predefined usability criteria.  Basic Process: 1. Watch Customers 2. They Perform Tasks 3. Note Their Problems 4. Make Recommendations 5. Iterate dsbw 2011/2012 q1 14
  • 15. Measures of Usability  Effectiveness (Ability to successfully accomplish tasks)  Percentage of goals/tasks achieved (success rate)  Number of errors  Efficiency (Ability to accomplish tasks with speed and ease)  Time to complete a task  Frequency of requests for help  Number of times facilitator provides assistance  Number of times user gives up dsbw 2011/2012 q1 15
Measures of Usability (cont.)

 Satisfaction (pleasing to users)
    Positive and negative ratings on a satisfaction scale
    Ratio of favorable to unfavorable comments
    Number of good vs. bad features recalled after the test
    Number of users who would use the system again
    Number of times users express dissatisfaction or frustration
 Learnability (ability to learn how to use the site and remember it)
    Ratio of successes to failures
    Number of features that can be recalled after the test
Usability Testing Roles

 Facilitator:
    Oversees the entire test process
    Plans, tests, and reports
 Participant:
    An actual or potential customer
    Internal stand-ins for “representative users” (marketing, designers) are avoided
 Observer (optional):
    Records events as they occur
    Limits interaction with the participant
    Contributes to the report
Usability Testing Process

Step 1: Planning the usability test
 Define what to test
 Define which customers should be tested
 Define what tasks should be tested
 Write usability scenarios and tasks
 Select participants

Step 2: Conducting the usability test
 Conduct a test
 Collect data

Step 3: Analyzing and reporting the usability test
 Compile results
 Make recommendations
People – Context – Activities

Step 1: Planning the usability test
 Define what to test → Activities (use cases)
 Define which customers (user profiles) to test → People (actors)
 Provide a background for the activities to test → Context
Usability Scenarios and Tasks

 Provide the participant with motivation and context to make the situation more realistic
 Include several tasks:
    Make the first task simple
    Give a goal, without describing steps
 Set some success criteria, for example:
    N% of test participants will be able to complete x% of tasks in the time allotted.
    Participants will be able to complete x% of tasks with no more than one error per task.
    N% of test participants will rate the system as highly usable on a scale of x to y.
Example of a Scenario with Tasks

 Context:
    You want to book a sailing on Royal Caribbean International for next June with your church group. The group is called “Saint Francis Summer 2010”. The group is selling out fast, so you want to book a cabin, one close to an elevator because your leg hurts from a recent injury.
 Tasks to perform:
    1. Open your browser
    2. Click the link labeled “Royal Caribbean”
    3. Tell me the available cabins in the “Saint Francis Summer 2010” group
    4. Tell me the cabin number closest to an elevator
    5. Book the cabin that best suits your needs
Selecting Participants

 Recruit participants:
    In-house, recruitment firms, databases, conferences
 Match participants with user profiles
 Numbers: of participants, of floaters
 Schedule test sessions
 Incentives:
    Gift checks ($100 per session)
    Food or gift cards
How Many Test Participants Are Required?

 The number of usability problems found in a usability test with n participants is:

    N (1 − (1 − L)^n)

    N: total number of usability problems in the design
    L: the proportion of usability problems discovered while testing a single participant

[Graph: problems found as a function of the number of participants, plotted for L = 31%]
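To make the curve concrete, here is a minimal sketch of the formula above, expressed as the fraction of the N total problems found; the value L = 31% comes from the slide, everything else is illustrative:

```python
# Expected share of usability problems found when testing with n participants,
# i.e. the formula N * (1 - (1 - L)^n) above, divided by N.

def problems_found(n: int, L: float = 0.31) -> float:
    """Fraction of the N total problems expected after n participants."""
    return 1.0 - (1.0 - L) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} participants -> {problems_found(n):.0%} of problems found")

# With L = 0.31, five participants already uncover roughly 85% of the
# problems, which motivates the iterated 3x5 strategy on the next slide.
```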
How Many Test Participants Are Required? (cont.)

 It seems that you need to test with at least 15 participants to discover all the usability problems.
 However, it is better to perform 3 tests with 5 participants each than one test with 15 participants:
    After the first test with 5 participants has found 85% of the usability problems, you will want to fix them in a redesign.
    After creating the new design, you need to test again.
    The second test with 5 users will discover most of the remaining 15% of the original usability problems that were not found in the first test (and some new ones).
    The new test will also be able to uncover structural usability problems that were obscured in the initial studies, when users were stumped by surface-level usability problems.
    Fix the new problems, and test again…
Usability Labs … Not Necessary

 The testing room contains office furniture, video tape equipment, a microphone, and a computer with appropriate software.
 The observer side contains a powerful computer to collect and analyze the usability data.
 A one-way mirror separates the rooms.
Conducting Tests: Facilitator’s Role

 Start with an easy task to build confidence
 Sit beside the person, not behind the glass
 Use a “think-out-loud” protocol
 Give participants time to think it through
 Offer appropriate encouragement
 Lead participants, don’t answer questions (be an enabler)
 Don’t act knowledgeable (treat them as the experts)
 Don’t get too involved in data collection
 Don’t jump to conclusions
 Don’t solve their problems immediately
Collecting Data

 Performance
    Objective (what actually happened)
    Usually quantitative:
       Time to complete a task
       Time to recover from an error
       Number of errors
       Percentage of tasks completed successfully
       Number of clicks
       Pathway information
 Preference
    Subjective (what participants say/thought)
    Usually qualitative:
       Preference between versions
       Suggestions and comments
       Ratings or rankings (can be quantitative)
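As an illustration only, a small sketch of how the quantitative performance measures above might be aggregated from per-participant task records; the record fields and sample values are assumptions, not part of the slides:

```python
# Aggregate quantitative usability measures from per-participant task records.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskRecord:
    participant: str   # anonymized id, e.g. "P1" (never a real name)
    completed: bool    # was the task finished successfully?
    seconds: float     # time to complete (or to give up)
    errors: int        # number of errors observed

records = [
    TaskRecord("P1", True, 95.0, 1),
    TaskRecord("P2", False, 180.0, 4),
    TaskRecord("P3", True, 120.0, 0),
]

success_rate = sum(r.completed for r in records) / len(records)
avg_time = mean(r.seconds for r in records if r.completed)
total_errors = sum(r.errors for r in records)

print(f"Success rate: {success_rate:.0%}")        # percentage of tasks completed
print(f"Avg. completion time: {avg_time:.0f} s")  # over successful attempts only
print(f"Errors observed: {total_errors}")
```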
Report Findings and Recommendations

 Make the report usable for your users
 Include quantitative data (success rates, times, etc.)
 Avoid words like “few”, “many”, “several”; include counts
 Use quotes
 Use screenshots
 Mention positive findings
 Do not use participant names; use P1, P2, P3, etc.
 Include recommendations
 Make it short
Component Testing

 Focuses on a set of tests that attempt to uncover errors in WebApp functions
 Conventional black-box and white-box test case design methods can be used at each architectural layer (presentation, domain, data access)
 Form data can be exploited systematically to find errors (see the sketch below):
    Missing/incomplete data
    Type conversion problems
    Value boundary violations
    Fake data
    Etc.
 Database testing is often an integral part of the component-testing regime
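A minimal sketch of such systematic black-box form-data tests using Python's unittest; validate_age is a hypothetical stand-in for real domain-layer code, so its API is an assumption and only the input classes matter:

```python
import unittest

def validate_age(raw: str) -> int:
    """Toy validator standing in for real form-handling code."""
    value = int(raw)              # raises ValueError on non-numeric input
    if not 0 <= value <= 120:     # value boundary check
        raise ValueError("age out of range")
    return value

class FormDataTests(unittest.TestCase):
    def test_missing_data(self):
        self.assertRaises(ValueError, validate_age, "")      # missing/incomplete
    def test_type_conversion(self):
        self.assertRaises(ValueError, validate_age, "abc")   # not a number
    def test_boundaries(self):
        self.assertEqual(validate_age("0"), 0)               # lower bound
        self.assertEqual(validate_age("120"), 120)           # upper bound
        self.assertRaises(ValueError, validate_age, "121")   # just above
    def test_fake_data(self):
        self.assertRaises(ValueError, validate_age, "-1")    # implausible value

if __name__ == "__main__":
    unittest.main()
```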
Configuration Testing: Server-Side Issues

 Is the WebApp fully compatible with the server OS?
 Are system files, directories, and related system data created correctly when the WebApp is operational?
 Do system security measures (e.g., firewalls or encryption) allow the WebApp to execute and service users without interference or performance degradation?
 Has the WebApp been tested with the distributed server configuration (if one exists) that has been chosen?
 Is the WebApp properly integrated with database software? Is the WebApp sensitive to different versions of database software?
 Do server-side WebApp scripts execute properly?
 Have system administrator errors been examined for their effect on WebApp operations?
 If proxy servers are used, have differences in their configuration been addressed with on-site testing?
Configuration Testing: Client-Side Issues

 Hardware: CPU, memory, storage, and printing devices
 Operating systems: Linux, Macintosh OS, Microsoft Windows, a mobile-based OS
 Browser software: Internet Explorer, Mozilla/Netscape, Opera, Safari, and others
 User interface components: ActiveX, Java applets, and others
 Plug-ins: QuickTime, RealPlayer, and many others
 Connectivity: cable, DSL, regular modem, T1
Security Testing

 Designed to probe vulnerabilities of:
    the client-side environment,
    the network communications that occur as data are passed from client to server and back again,
    and the server-side environment.
 On the client side, vulnerabilities can often be traced to pre-existing bugs in browsers, e-mail programs, or communication software.
 On the network infrastructure.
 On the server side: at host level and at WebApp level.
 Review the DSBW Unit on WebApp Security.
Performance Testing: Main Questions

 Does the server response time degrade to a point where it is noticeable and unacceptable?
 At what point (in terms of users, transactions, or data loading) does performance become unacceptable?
 What system components are responsible for performance degradation?
 What is the average response time for users under a variety of loading conditions?
 Does performance degradation have an impact on system security?
 Is WebApp reliability or accuracy affected as the load on the system grows?
 What happens when loads greater than maximum server capacity are applied?
Performance Testing: Load Tests

 A load test verifies whether or not the system meets the required response times and the required throughput.
 Steps (a minimal scripted sketch of step 3 follows this list):
    1. Determine load profiles (what access types, how many visits per day, at what peak times, how many visits per session, how many transactions per session, etc.) and the transaction mix (which functions shall be executed with which percentage).
    2. Determine the target values for response times and throughput (in normal operation and at peak times, for simple or complex accesses, with minimum, maximum, and average values).
    3. Run the tests, generating the workload with the transaction mix defined in the load profile, and measure the response times and the throughput.
    4. Evaluate the results and identify potential bottlenecks.
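A hedged sketch of step 3 in Python, using only the standard library; the URL, user count, and request count are placeholders rather than a real load profile or transaction mix:

```python
# Fire concurrent requests at a target URL; report response times and throughput.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/"   # assumption: a local test deployment
USERS, REQUESTS_PER_USER = 10, 20

def one_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=USERS) as pool:   # USERS concurrent "users"
    times = list(pool.map(one_request, range(USERS * REQUESTS_PER_USER)))
elapsed = time.perf_counter() - t0

times.sort()
print(f"requests: {len(times)}, throughput: {len(times)/elapsed:.1f} req/s")
print(f"response time avg: {sum(times)/len(times)*1000:.0f} ms, "
      f"p95: {times[int(0.95*len(times))]*1000:.0f} ms")
```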
Performance Testing: Stress Tests

 A stress test verifies whether or not the system reacts in a controlled way in “stress situations”, which are simulated by applying extreme conditions such as unrealistic overload or heavily fluctuating load.
 The test is aimed at answering these questions:
    Does the server degrade “gently”, or does it shut down as capacity is exceeded?
    Does server software generate “server not available” messages? More generally, are users aware that they cannot reach the server?
    Are transactions lost as capacity is exceeded?
    Is data integrity affected as capacity is exceeded?
Performance Testing: Stress Tests (cont.)

 Under what load conditions does the server environment fail? How does failure manifest itself? Are automated notifications sent to technical support staff at the server site?
 If the system does fail, how long will it take to come back on-line?
 Are certain WebApp functions (e.g., compute-intensive functionality, data streaming capabilities) discontinued as capacity reaches the 80 or 90% level?
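A stress-style variant of the previous load sketch, again with placeholder URL and numbers: ramp up concurrency until requests start failing, to observe whether the server degrades gently or abruptly.

```python
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/"   # placeholder target

def ok(_):
    try:
        with urllib.request.urlopen(URL, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False             # timeouts and refused connections count as failures

# Increase the number of concurrent workers step by step (a simple ramp).
for workers in (10, 50, 100, 200, 400):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(ok, range(workers * 5)))
    rate = sum(results) / len(results)
    print(f"{workers:4d} concurrent workers -> {rate:.0%} requests succeeded")
```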
Performance Testing: Interpreting Graphics

[Graph: throughput plotted against load, not reproduced here]
 Load: the number of requests that arrive at the system per time unit
 Throughput: the number of requests served per time unit
 SLA: Service Level Agreement
Test Automation

 Automation can significantly increase the efficiency of testing, and it enables new types of tests that also increase the scope (e.g., different test objects and quality characteristics) and depth of testing (e.g., large amounts and combinations of input data).
 Test automation brings the following benefits:
    Running automated regression tests on new versions of a WebApp allows defects caused by side-effects on unchanged functionality to be detected (see the sketch below).
    Various test methods and techniques would be difficult or impossible to perform manually. For example, load and stress testing requires simulating a large number of concurrent users.
    Automation allows more tests to be run in less time and, thus, the tests to be run more often, leading to greater confidence in the system under test.
 Web Site Test Tools: http://www.softwareqatest.com/qatweb1.html
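A small automated regression sketch at the HTTP level: re-run the same checks against every new build so side-effects on unchanged pages are caught early. The base URL, paths, and expected fragments are placeholders for a real WebApp.

```python
import unittest
import urllib.request

BASE = "http://localhost:8080"   # assumption: the build under test

class RegressionSmokeTests(unittest.TestCase):
    # (path, fragment expected somewhere in the response body)
    PAGES = {
        "/": b"Welcome",
        "/login": b"<form",
        "/search?q=test": b"results",
    }

    def test_pages_still_serve_expected_content(self):
        for path, fragment in self.PAGES.items():
            with urllib.request.urlopen(BASE + path, timeout=10) as resp:
                body = resp.read()
                self.assertEqual(resp.status, 200, path)  # page still reachable
                self.assertIn(fragment, body, path)       # content unchanged

if __name__ == "__main__":
    unittest.main()
```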
References

 R. S. Pressman, D. Lowe: Web Engineering: A Practitioner’s Approach. McGraw-Hill, 2008. Chapter 15.
 G. Kappel et al.: Web Engineering. John Wiley & Sons, 2006. Chapter 7.
 [NL06] J. Nielsen, H. Loranger: Prioritizing Web Usability. New Riders Publishing, 2006.
 www.useit.com (Jakob Nielsen)
 www.usability.gov