Unit 9: Web Application Testing

 Testing is the activity conducted to evaluate the quality of a
    product and to improve it by finding errors.








Testing Terminology
 An error is “the difference between a computed, observed, or
  measured value or condition and the true, specified, or
  theoretically correct value or condition” (IEEE Standard 610.12-1990).
 This “true, specified, or theoretically correct value or condition”
  comes from
    A well-defined requirements model, if available and complete
    Or, failing that, an incomplete set of fuzzy and contradictory
      goals, concerns, and expectations of the stakeholders
 A test is a set of test cases for a specific object under test: the
  whole Web application, components of a Web application, a
  system that runs a Web application, etc.
 A single test case describes a set of inputs, execution conditions,
  and expected results, which are used to test a specific aspect of the
  object under test
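
A minimal sketch of this anatomy in Python's unittest, with a hypothetical
ShoppingCart as the object under test (inputs, execution conditions via
setUp, and expected results via assertions):

    import unittest

    class ShoppingCart:
        """Hypothetical object under test."""
        def __init__(self):
            self.items = []
        def add(self, name, price):
            if price < 0:
                raise ValueError("price must be non-negative")
            self.items.append((name, price))
        def total(self):
            return sum(price for _, price in self.items)

    class CartTotalTestCase(unittest.TestCase):
        def setUp(self):
            # Execution condition: a fresh, empty cart for every test case.
            self.cart = ShoppingCart()

        def test_total_of_two_items(self):
            # Inputs.
            self.cart.add("book", 10.0)
            self.cart.add("pen", 2.5)
            # Expected result.
            self.assertEqual(self.cart.total(), 12.5)

        def test_negative_price_is_rejected(self):
            # Another aspect under test: invalid input must raise an error.
            with self.assertRaises(ValueError):
                self.cart.add("book", -1.0)

    if __name__ == "__main__":
        unittest.main()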
Testing [and] Quality

 Testing should address compliance not only with functional
    requirements but also with quality requirements, i.e., the kinds of
    quality characteristics expected by stakeholders.
 ISO/IEC 9126-1 [Software] Quality Model:
   [Figure: the six ISO/IEC 9126-1 quality characteristics (functionality,
   reliability, usability, efficiency, maintainability, and portability),
   each refined into sub-characteristics]
Goals of Testing
 The main goal of testing is to find errors, not to prove their
    absence.
 A test run is successful if it detects errors; otherwise, it is
    unsuccessful and “a waste of time”.
 Testing should adopt a risk-based approach:
       Test first and with the greatest effort those critical parts of an
        application where the most dangerous errors are still
        undetected
       A further aim of testing is to bring risks to light, not simply to
        demonstrate conformance to stated requirements.
 Test as early as possible in a project: errors introduced in early
    development phases are harder to localize and more expensive to fix
    in later phases.
Test Levels (1/2)
 Unit tests
       Test the smallest testable units (classes, Web pages, etc.)
        independently of one another.
       Performed by the developer during implementation.

 Integration tests
       Evaluate the interaction between distinct and separately tested
        units once they have been integrated.
       Performed by a tester, a developer, or both jointly.

 System tests
       Test the complete, integrated system.
       Typically performed by a specialized test team.




Test Levels (2/2)
 Acceptance tests
          Evaluate the system with the client in a “realistic”
           environment, i.e., with real conditions and real data.
 Beta tests
       Let friendly users work with early versions of a product to get
        early feedback.
       Beta tests are unsystematic tests which rely on the number and
        “malevolence” of potential users.




Fitting Testing in the Development Process
[Diagram: the test process phases Planning → Preparing → Performing →
Reporting]
 Planning: Defines the quality goals, the general testing strategy, the test
  plans for all test levels, the metrics and measuring methods, and the test
  environment.
 Preparing: Involves selecting the testing techniques and tools and
  specifying the test cases (including the test data).
 Performing: Prepares the test infrastructure, runs the test cases, and then
  documents and evaluates the results.
 Reporting: Summarizes the test results and produces the test reports.
Web Testing: A Road Map
[Diagram: the types of Web testing arranged along an axis from
user-oriented to technology-oriented: Content Testing, Interface Testing,
Usability Testing, Navigation Testing, Component Testing, Configuration
Testing, Performance Testing, and Security Testing]
Usability
 Usability is a quality attribute that assesses how easy user
    interfaces are to use. The term also refers to methods for improving
    ease of use during the design process.
 Usability is defined by five quality components:
         Learnability: How easy is it for users to accomplish basic tasks
          the first time they encounter the design?
         Efficiency: Once users have learned the design, how quickly
          can they perform tasks?
         Memorability: When users return to the design after a period
          of not using it, how easily can they reestablish proficiency?
         Errors: How many errors do users make, how severe are these
          errors, and how easily can they recover from the errors?
         Satisfaction: How pleasant is it to use the design?
Why Usability matters*
 62% of web shoppers gave up looking for an item. (Zona
    study)
 50% of web sales are lost because visitors can’t easily find
    content. (Gartner Group)
 40% of repeat visitors do not return due to a negative
    experience. (Zona study)
 85% of visitors abandon a new site due to poor design.
    (cPulse)
 Only 51% of sites complied with simple web usability
    principles. (Forrester study of 20 major sites)

(*) data from www.usabilitynet.org/management/c_cost.htm

Why People Fail
[Chart: usability problems weighted by how frequently they caused users to
fail a task [NL06]. Categories: Search; Findability (IA, category names,
navigation, links); Page design (readability, layout, graphics, amateur
look, scrolling); Information (content, product info, corporate info,
prices); Task support (workflow, privacy, forms, comparison,
inflexibility); Fancy design (multimedia, Back button, PDF/printing, new
windows, sound); Other (bugs, presence on the Web, ads, new site,
metaphors)]
Top Ten (Usability) Mistakes in Web Design

1.    Bad search
2.    PDF files for online reading
3.    Not changing the color of visited links
4.    Non-scannable text
5.    Fixed font size
6.    Page titles with low search engine visibility
7.    Anything that looks like an advertisement
8.    Violating design conventions
9.    Opening new browser windows
10. Not answering users' questions

Assessing Usability
 Two major types of assessment methods:
          Usability evaluations:
                Involve evaluators but no users
                Techniques: surveys/questionnaires, observational
                 evaluations, guideline-based reviews, cognitive
                 walkthroughs, expert reviews, heuristic evaluations
         Usability tests: focus on users working with the product
 Usability testing is the only way to know if the Web site
    actually has problems that keep people from having a
    successful and satisfying experience.




Usability Testing
 Usability testing is a methodology that employs potential
    users to evaluate the degree to which a website/software
    meets predefined usability criteria.
 Basic Process:
      1. Watch Customers
      2. They Perform Tasks
      3. Note Their Problems
      4. Make Recommendations
      5. Iterate




Measures of Usability
 Effectiveness (Ability to successfully accomplish tasks)
       Percentage of goals/tasks achieved (success rate)
       Number of errors

 Efficiency (Ability to accomplish tasks with speed and ease)
       Time to complete a task
       Frequency of requests for help
       Number of times facilitator provides assistance
       Number of times user gives up
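
Most of these measures can be computed directly from session logs. A small
sketch, assuming each record holds a hypothetical task outcome, error
count, and completion time:

    # Hypothetical records: (participant, task_completed, errors, seconds).
    sessions = [
        ("P1", True, 0, 95.0),
        ("P2", True, 2, 180.0),
        ("P3", False, 4, 300.0),
        ("P4", True, 1, 120.0),
        ("P5", False, 3, 240.0),
    ]

    success_rate = sum(s[1] for s in sessions) / len(sessions)
    total_errors = sum(s[2] for s in sessions)
    times_of_completed = [s[3] for s in sessions if s[1]]
    avg_time = sum(times_of_completed) / len(times_of_completed)

    print(f"Success rate: {success_rate:.0%}")              # 60%
    print(f"Total errors: {total_errors}")                  # 10
    print(f"Avg time (completed tasks): {avg_time:.0f} s")  # 132 s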




Measures of Usability
 Satisfaction (Pleasing to users)
         Positive and negative ratings on a satisfaction scale
         Percent of favorable comments to unfavorable comments
         Number of good vs. bad features recalled after test
         Number of users who would use the system again
         Number of times users express dissatisfaction or frustration
 Learnability (Ability to learn how to use site and remember it)
       Ratio of successes to failures
       Number of features that can be recalled after the test




Usability Testing Roles
 Facilitator:
       Oversees the entire test process
       Plans, tests, and reports

 Participant:
       An actual or potential customer
       Surrogate users (e.g., marketing staff or designers) should be
        avoided

 Observer (optional):
       Records events as they occur
       Limits interaction with the participant
       Contributes to the report




Usability Testing Process
Step 1: Planning The Usability Test
         Define what to test
         Define which customers should be tested
         Define what tasks should be tested
         Write usability scenarios and tasks
         Select participants
Step 2: Conducting The Usability Test
       Conduct a test
       Collect data

Step 3: Analyzing and Reporting The Usability Test
       Compile results
       Make recommendations

People – Context – Activities
Step 1: Planning The Usability Test
         Define what to test
               → Activities (Use Cases)
        Define which customers (user profiles) should be tested
               → People (Actors)
         Provide a background for the activities to test
               → Context




Usability Scenarios and Tasks
 Provide the participant with motivation and context to make
    the situation more realistic
 Include several tasks:
       Make the first task simple
       Give a goal, without describing steps

 Set success criteria, for example:
       N% of test participants will be able to complete x% of tasks in
        the time allotted.
       Participants will be able to complete x% of tasks with no more
        than one error per task.
       N% of test participants will rate the system as highly usable on
        a scale of x to x.


Example of Scenario with Tasks
 Context:
         You want to book a sailing on Royal Caribbean International for
          next June with your church group. The group is called “Saint
          Francis Summer 2010”. The group is selling out fast, so you
          want to book a cabin that is close to an elevator, because
          your leg hurts from a recent injury.
 Tasks to perform:
      1.    Open your browser
      2.    Click the link labeled “Royal Caribbean”
      3.    Tell me the available cabins in the “Saint Francis Summer
            2010” group
      4.    Tell me a cabin number closest to an elevator
5.    Book the cabin that best suits your needs

Selecting Participants
 Recruit participants:
       In-house
       Recruitment firms, databases, conferences

 Match participants with user profiles
 Decide numbers: participants and floaters (backup participants)

 Schedule test sessions

 Incentives:
       Gift checks ($100 per session)
       Food or gift cards




How Many Test Participants Are Required?

 The number of usability problems found in a usability test
    with n participants is:
                              N(1 − (1 − L)^n)
       N : total number of usability problems in the design
       L : the proportion of usability problems discovered while testing
        a single participant

   [Chart: percentage of problems found vs. number of participants,
   plotted for L = 31%]
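
A quick numerical check of the fraction of problems found, 1 − (1 − L)^n,
using Nielsen's empirical estimate L = 0.31; up to rounding, this
reproduces the 85% figure for five participants quoted on the next slide:

    L = 0.31  # problems found per participant (Nielsen's estimate)

    for n in (1, 3, 5, 10, 15):
        found = 1 - (1 - L) ** n
        print(f"n = {n:2d}: {found:.0%} of problems found")
    # n =  1: 31%
    # n =  3: 67%
    # n =  5: 84%
    # n = 10: 98%
    # n = 15: 100% (99.6%, rounded)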




How Many Test Participants Are Required?

 It seems that you need to test with at least 15 participants to
    discover all the usability problems.
 However, it is better to perform 3 tests with 5 participants each
    than one test with 15 participants:
         After the first test with 5 participants has found 85% of the
          usability problems, you will want to fix them in a redesign.
         After creating the new design, you need to test again.
         The second test with 5 users will discover most of the
          remaining 15% of the original usability problems that were not
          found in the first test (and some new ones).
         The new test will also be able to uncover structural usability
          problems that were obscured in initial studies because users
          were stumped by surface-level usability problems.
         Fix the new problems, and test again …
Usability Labs … Not Necessary
[Photos: a typical usability lab. The testing room contains office
furniture, videotape equipment, a microphone, and a computer with
appropriate software. The observer side contains a powerful computer to
collect and analyze the usability data. A one-way mirror separates the
rooms.]

Test Side-by-Side
[Photo: facilitator and participant sitting side by side during a test
session]
Conducting Tests: Facilitator’s Role
 Start with an easy task to build confidence
 Sit beside the person not behind the glass

 Use “think-out-loud” protocol

 Give participants time to think it through

 Offer appropriate encouragement

 Guide participants, but don’t answer their questions (be an enabler)
 Don’t act knowledgeable (treat participants as the experts)

 Don’t get too involved in data collection

 Don’t jump to conclusions

 Don’t solve their problems immediately

Collecting Data
 Performance
       Objective (what actually happened)
       Usually Quantitative
               Time to complete a task
               Time to recover from an error
               Number of errors
               Percentage of tasks completed successfully
               Number of clicks
               Pathway information
 Preference
       Subjective (what participants say/thought)
       Usually Qualitative
               Preference of versions
               Suggestions and comments
               Ratings or rankings (can be quantitative)

Report findings and recommendations
 Make the report usable for your audience
 Include quantitative data (success rates, times, etc.)

 Avoid words like “few, many, several”. Include counts

 Use quotes

 Use screenshots

 Mention positive findings
 Do not use participant names, use P1, P2, P3, etc.

 Include recommendations

 Make it short



Component Testing
 Focuses on a set of tests that attempt to uncover errors in
    WebApp functions
 Conventional black-box and white-box test-case design
    methods can be used at each architectural layer
    (presentation, domain, data access)
 Form data can be varied systematically to find errors (see the
    sketch below):
         Missing/incomplete data
         Type conversion problems
         Value boundary violations
         Fake data
         Etc.
 Database testing is often an integral part of the component-
    testing regime
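
A sketch of such systematic form-data variation at the domain layer, with a
hypothetical validate_registration function standing in for the component
under test:

    import unittest

    def validate_registration(form):
        """Hypothetical domain-layer validator for a registration form."""
        errors = []
        if not form.get("name"):
            errors.append("name is required")
        try:
            age = int(form.get("age", ""))
            if not 0 < age < 130:
                errors.append("age out of range")
        except ValueError:
            errors.append("age must be a number")
        return errors

    class FormDataTests(unittest.TestCase):
        def test_systematic_form_variations(self):
            cases = [
                ({"name": "Ada", "age": "30"}, 0),   # valid baseline
                ({"age": "30"}, 1),                  # missing/incomplete data
                ({"name": "Ada", "age": "abc"}, 1),  # type conversion problem
                ({"name": "Ada", "age": "500"}, 1),  # value boundary violation
                ({"name": "Ada", "age": "-1"}, 1),   # fake/implausible data
            ]
            for form, expected in cases:
                with self.subTest(form=form):
                    self.assertEqual(len(validate_registration(form)), expected)

    if __name__ == "__main__":
        unittest.main()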
Configuration Testing: Server-Side Issues
 Is the WebApp fully compatible with the server OS?
 Are system files, directories, and related system data created correctly
    when the WebApp is operational?
   Do system security measures (e.g., firewalls or encryption) allow the
    WebApp to execute and service users without interference or
    performance degradation?
   Has the WebApp been tested with the distributed server configuration (if
    one exists) that has been chosen?
   Is the WebApp properly integrated with database software? Is the
    WebApp sensitive to different versions of database software?
   Do server-side WebApp scripts execute properly?
   Have system administrator errors been examined for their effect on
    WebApp operations?
   If proxy servers are used, have differences in their configuration been
    addressed with on-site testing?

Configuration Testing: Client-Side Issues

 Hardware—CPU, memory, storage and printing devices
 Operating systems—Linux, Macintosh OS, Microsoft
    Windows, a mobile-based OS
 Browser software—Internet Explorer, Mozilla/Netscape,
    Opera, Safari, and others
 User interface components—Active X, Java applets and
    others
 Plug-ins—QuickTime, RealPlayer, and many others

 Connectivity—cable, DSL, regular modem, T1
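
Enumerating the client-side configuration matrix makes its combinatorial
cost explicit. A sketch (the dimension values are illustrative, not a
recommended coverage set):

    from itertools import product

    operating_systems = ["Linux", "macOS", "Windows"]
    browsers = ["Internet Explorer", "Firefox", "Opera", "Safari"]
    connectivity = ["DSL", "cable", "modem"]

    matrix = list(product(operating_systems, browsers, connectivity))
    print(f"{len(matrix)} configurations to cover")  # 3 * 4 * 3 = 36

    for os_name, browser, link in matrix[:3]:
        print(f"test on {os_name} / {browser} / {link}")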




Security Testing
 Designed to probe vulnerabilities
       of the client-side environment,
       the network communications that occur as data are passed
        from client to server and back again,
       and the server-side environment

 On the client side, vulnerabilities can often be traced to
    pre-existing bugs in browsers, e-mail programs, or communication
    software.
 On the network infrastructure, they concern the communication
    channels between client and server.
 On the server side, vulnerabilities exist
       At host level
       At WebApp level
    (Review the DSBW Unit on WebApp Security.)
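
Some basic server-side checks can be automated. A sketch using the
third-party requests library that verifies redirection to HTTPS and the
presence of common security headers (the URL and the header list are
illustrative; this complements, and does not replace, a real vulnerability
assessment):

    import requests  # third-party: pip install requests

    def basic_security_probe(url):
        resp = requests.get(url, timeout=10)
        findings = []
        # The final URL after redirects should use HTTPS.
        if not resp.url.startswith("https://"):
            findings.append("no redirect to HTTPS")
        # Headers commonly expected on hardened responses (illustrative).
        for header in ("Strict-Transport-Security", "X-Content-Type-Options",
                       "Content-Security-Policy"):
            if header not in resp.headers:
                findings.append(f"missing header: {header}")
        return findings

    if __name__ == "__main__":
        for finding in basic_security_probe("http://www.example.org/"):
            print("WARN:", finding)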


Performance Testing: Main Questions
 Does the server response time degrade to a point where it is
    noticeable and unacceptable?
   At what point (in terms of users, transactions or data loading) does
    performance become unacceptable?
   What system components are responsible for performance
    degradation?
   What is the average response time for users under a variety of
    loading conditions?
   Does performance degradation have an impact on system security?
   Is WebApp reliability or accuracy affected as the load on the
    system grows?
   What happens when loads that are greater than maximum server
    capacity are applied?


Performance Testing: Load Tests
 A load test verifies whether or not the system meets the
    required response times and the required throughput.
 Steps:
      1.   Determine load profiles (what access types, how many visits per day,
           at what peak times, how many visits per session, how many
           transactions per session, etc.) and the transaction mix (which
           functions shall be executed with which percentage).
      2.   Determine the target values for response times and throughput (in
           normal operation and at peak times, for simple or complex accesses,
           with minimum, maximum, and average values).
      3.   Run the tests, generating the workload with the transaction mix
           defined in the load profile, and measure the response times and the
           throughput.
     4.   Evaluate the results and identify potential bottlenecks.
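
A minimal load-generation sketch following these steps, simulating
concurrent virtual users and measuring response times against a target
(the URL and target values are assumptions; a real load test would
normally use a dedicated tool such as JMeter):

    import time
    import statistics
    from concurrent.futures import ThreadPoolExecutor
    import requests  # third-party: pip install requests

    URL = "http://www.example.org/"  # object under test (assumption)
    USERS = 20                       # concurrent virtual users
    REQUESTS_PER_USER = 10
    TARGET_RESPONSE_S = 1.0          # target from step 2 (assumption)

    def user_session(_):
        times = []
        for _ in range(REQUESTS_PER_USER):
            start = time.perf_counter()
            requests.get(URL, timeout=30)
            times.append(time.perf_counter() - start)
        return times

    with ThreadPoolExecutor(max_workers=USERS) as pool:
        all_times = [t for ts in pool.map(user_session, range(USERS))
                     for t in ts]

    print(f"avg response time: {statistics.mean(all_times):.3f} s")
    print(f"max response time: {max(all_times):.3f} s")
    print(f"meets target:      {statistics.mean(all_times) <= TARGET_RESPONSE_S}")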


Performance Testing: Stress Tests
 A stress test verifies whether or not the system reacts in a
    controlled way in “stress situations”, which are simulated by
    applying extreme conditions, such as unrealistic overload, or
    heavily fluctuating load.
 The test is aimed at answering the questions:
       Does the server degrade ‘gently’ or does it shut down as
        capacity is exceeded?
       Does server software generate “server not available”
        messages? More generally, are users aware that they cannot
        reach the server?
       Are transactions lost as capacity is exceeded?
       Is data integrity affected as capacity is exceeded?




Performance Testing: Stress Tests (cont.)

      Under what load conditions does the server environment fail? How
       does failure manifest itself? Are automated notifications sent to
       technical support staff at the server site?
       If the system does fail, how long will it take to come back on-
        line?
       Are certain WebApp functions (e.g., compute intensive
        functionality, data streaming capabilities) discontinued as
        capacity reaches the 80 or 90% level?
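
Extending the load-test sketch from the load-test slide, a stress test can
ramp concurrency step by step until failures appear, then inspect whether
degradation is graceful (same assumed URL as before):

    from concurrent.futures import ThreadPoolExecutor
    import requests  # third-party: pip install requests

    URL = "http://www.example.org/"  # object under test (assumption)

    def try_request(_):
        try:
            requests.get(URL, timeout=5)
            return True
        except requests.RequestException:
            return False

    # Ramp the number of concurrent users until failures appear.
    for users in (10, 50, 100, 200, 400):
        with ThreadPoolExecutor(max_workers=users) as pool:
            results = list(pool.map(try_request, range(users)))
        failed = results.count(False)
        print(f"{users:4d} users -> {failed} failed requests")
        if failed:
            print("capacity exceeded: check for graceful degradation, "
                  "lost transactions, and data integrity")
            break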




Performance Testing: Interpreting Graphics
[Chart: throughput plotted against load, annotated with the SLA]
 Load: the number of requests that arrive at the system per time unit
 Throughput: the number of requests served per time unit
 SLA: Service Level Agreement
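
A sketch relating these quantities, computing offered load and achieved
throughput per second from a hypothetical request log and comparing the
two:

    from collections import Counter

    # Hypothetical request log: (arrival_second, served_within_sla).
    log = [(0, True), (0, True), (1, True), (1, False), (1, True),
           (2, False), (2, False), (2, True)]

    load = Counter(sec for sec, _ in log)           # requests arriving per second
    served = Counter(sec for sec, ok in log if ok)  # requests served within the SLA

    for sec in sorted(load):
        print(f"t={sec}s  load={load[sec]}  throughput within SLA={served[sec]}")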




Test Automation
 Automation can significantly increase the efficiency of testing and
  enables new types of tests, increasing both the scope (e.g.,
  different test objects and quality characteristics) and the depth of
  testing (e.g., large amounts and combinations of input data).
 Test automation brings the following benefits:
    Running automated regression tests on new versions of a
      WebApp makes it possible to detect defects introduced as side
      effects in unchanged functionality.
    Various test methods and techniques would be difficult or
      impossible to perform manually. For example, load and stress
      testing requires simulating a large number of concurrent users.
    Automation allows more tests to be run in less time and, thus,
      more often, leading to greater confidence in the system under
      test.
 Web Site Test Tools: http://www.softwareqatest.com/qatweb1.html
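
A sketch of an automated regression check with the third-party Selenium
WebDriver library (the page URL, element names, and expected title describe
a hypothetical WebApp):

    import unittest
    from selenium import webdriver  # third-party: pip install selenium
    from selenium.webdriver.common.by import By

    class LoginRegressionTest(unittest.TestCase):
        def setUp(self):
            self.driver = webdriver.Firefox()  # any installed WebDriver works

        def tearDown(self):
            self.driver.quit()

        def test_login_still_works(self):
            # Hypothetical WebApp: URL, field names, and title are assumptions.
            self.driver.get("http://localhost:8080/login")
            self.driver.find_element(By.NAME, "username").send_keys("demo")
            self.driver.find_element(By.NAME, "password").send_keys("secret")
            self.driver.find_element(By.NAME, "submit").click()
            self.assertIn("Welcome", self.driver.title)

    if __name__ == "__main__":
        unittest.main()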
References
 R. S. Pressman, D. Lowe: Web Engineering: A Practitioner’s
    Approach. McGraw-Hill, 2008. Chapter 15.
 G. Kappel et al.: Web Engineering. John Wiley & Sons, 2006.
    Chapter 7.
 [NL06] J. Nielsen, H. Loranger: Prioritizing Web Usability. New
    Riders Publishing, 2006.
 www.useit.com (Jakob Nielsen)
 www.usability.gov




dsbw 2011/2012 q1                                              40

Contenu connexe

Tendances

Web Application Testing
Web Application TestingWeb Application Testing
Web Application TestingRicha Goel
 
Problem solving agents
Problem solving agentsProblem solving agents
Problem solving agentsMegha Sharma
 
Artificial Intelligence: Knowledge Acquisition
Artificial Intelligence: Knowledge AcquisitionArtificial Intelligence: Knowledge Acquisition
Artificial Intelligence: Knowledge AcquisitionThe Integral Worm
 
state modeling In UML
state modeling In UMLstate modeling In UML
state modeling In UMLKumar
 
Intelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCE
Intelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCEIntelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCE
Intelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCEKhushboo Pal
 
Artificial Intelligence Notes Unit 1
Artificial Intelligence Notes Unit 1 Artificial Intelligence Notes Unit 1
Artificial Intelligence Notes Unit 1 DigiGurukul
 
Control Strategies in AI
Control Strategies in AI Control Strategies in AI
Control Strategies in AI Bharat Bhushan
 
Performance Testing
Performance TestingPerformance Testing
Performance TestingSelin Gungor
 
What Is Accessibility Testing?
What Is Accessibility Testing?What Is Accessibility Testing?
What Is Accessibility Testing?QA InfoTech
 
Artificial Intelligence for Automated Software Testing
Artificial Intelligence for Automated Software TestingArtificial Intelligence for Automated Software Testing
Artificial Intelligence for Automated Software TestingLionel Briand
 
5 black box and grey box testing
5   black box and grey box testing5   black box and grey box testing
5 black box and grey box testingYisal Khan
 
Introduction to natural language processing (NLP)
Introduction to natural language processing (NLP)Introduction to natural language processing (NLP)
Introduction to natural language processing (NLP)Alia Hamwi
 
Artificial Intelligence Searching Techniques
Artificial Intelligence Searching TechniquesArtificial Intelligence Searching Techniques
Artificial Intelligence Searching TechniquesDr. C.V. Suresh Babu
 
object oriented methodologies
object oriented methodologiesobject oriented methodologies
object oriented methodologiesAmith Tiwari
 
I.BEST FIRST SEARCH IN AI
I.BEST FIRST SEARCH IN AII.BEST FIRST SEARCH IN AI
I.BEST FIRST SEARCH IN AIvikas dhakane
 
evaluation techniques in HCI
evaluation techniques in HCIevaluation techniques in HCI
evaluation techniques in HCIsawsan slii
 
natural language processing help at myassignmenthelp.net
natural language processing  help at myassignmenthelp.netnatural language processing  help at myassignmenthelp.net
natural language processing help at myassignmenthelp.netwww.myassignmenthelp.net
 

Tendances (20)

Web Application Testing
Web Application TestingWeb Application Testing
Web Application Testing
 
Problem solving agents
Problem solving agentsProblem solving agents
Problem solving agents
 
Chapter1(hci)
Chapter1(hci)Chapter1(hci)
Chapter1(hci)
 
Artificial Intelligence: Knowledge Acquisition
Artificial Intelligence: Knowledge AcquisitionArtificial Intelligence: Knowledge Acquisition
Artificial Intelligence: Knowledge Acquisition
 
state modeling In UML
state modeling In UMLstate modeling In UML
state modeling In UML
 
Intelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCE
Intelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCEIntelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCE
Intelligent Agent PPT ON SLIDESHARE IN ARTIFICIAL INTELLIGENCE
 
Artificial Intelligence Notes Unit 1
Artificial Intelligence Notes Unit 1 Artificial Intelligence Notes Unit 1
Artificial Intelligence Notes Unit 1
 
search strategies in artificial intelligence
search strategies in artificial intelligencesearch strategies in artificial intelligence
search strategies in artificial intelligence
 
Control Strategies in AI
Control Strategies in AI Control Strategies in AI
Control Strategies in AI
 
Performance Testing
Performance TestingPerformance Testing
Performance Testing
 
What Is Accessibility Testing?
What Is Accessibility Testing?What Is Accessibility Testing?
What Is Accessibility Testing?
 
Artificial Intelligence for Automated Software Testing
Artificial Intelligence for Automated Software TestingArtificial Intelligence for Automated Software Testing
Artificial Intelligence for Automated Software Testing
 
5 black box and grey box testing
5   black box and grey box testing5   black box and grey box testing
5 black box and grey box testing
 
Introduction to natural language processing (NLP)
Introduction to natural language processing (NLP)Introduction to natural language processing (NLP)
Introduction to natural language processing (NLP)
 
Artificial Intelligence Searching Techniques
Artificial Intelligence Searching TechniquesArtificial Intelligence Searching Techniques
Artificial Intelligence Searching Techniques
 
Object Oriented Design
Object Oriented DesignObject Oriented Design
Object Oriented Design
 
object oriented methodologies
object oriented methodologiesobject oriented methodologies
object oriented methodologies
 
I.BEST FIRST SEARCH IN AI
I.BEST FIRST SEARCH IN AII.BEST FIRST SEARCH IN AI
I.BEST FIRST SEARCH IN AI
 
evaluation techniques in HCI
evaluation techniques in HCIevaluation techniques in HCI
evaluation techniques in HCI
 
natural language processing help at myassignmenthelp.net
natural language processing  help at myassignmenthelp.netnatural language processing  help at myassignmenthelp.net
natural language processing help at myassignmenthelp.net
 

En vedette

Testing Web Applications
Testing Web ApplicationsTesting Web Applications
Testing Web ApplicationsSeth McLaughlin
 
Business Process Reengineering Presentation
Business Process Reengineering PresentationBusiness Process Reengineering Presentation
Business Process Reengineering PresentationHira Anwer Khan
 
Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_tomheck
 
Moneran Kingdom
Moneran KingdomMoneran Kingdom
Moneran Kingdomiiiapdst
 
TESTING Checklist
TESTING Checklist TESTING Checklist
TESTING Checklist Febin Chacko
 
Training & development dhanu
Training & development dhanuTraining & development dhanu
Training & development dhanuDhanu P G Naik
 
Open Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java DevelopersOpen Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java Developerscboecking
 
Final Report Business Process Reengineering
Final Report Business Process ReengineeringFinal Report Business Process Reengineering
Final Report Business Process ReengineeringHira Anwer Khan
 
Mobile testing
Mobile testingMobile testing
Mobile testingAlex Hung
 
browser compatibility testing
browser compatibility testingbrowser compatibility testing
browser compatibility testingLakshmi Nandoor
 
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Sauce Labs
 
7 1-1 soap-developers_guide
7 1-1 soap-developers_guide7 1-1 soap-developers_guide
7 1-1 soap-developers_guideNugroho Hermanto
 
Web Application Software Testing
Web Application Software TestingWeb Application Software Testing
Web Application Software TestingAndrew Kandels
 
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Tom Eston
 

En vedette (20)

Testing Web Applications
Testing Web ApplicationsTesting Web Applications
Testing Web Applications
 
Testing web application
Testing web applicationTesting web application
Testing web application
 
Business Process Reengineering Presentation
Business Process Reengineering PresentationBusiness Process Reengineering Presentation
Business Process Reengineering Presentation
 
Group3
Group3Group3
Group3
 
Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_Nyc 7 qualities_of_the_leader_as_coach_
Nyc 7 qualities_of_the_leader_as_coach_
 
Moneran Kingdom
Moneran KingdomMoneran Kingdom
Moneran Kingdom
 
Unit 06: The Web Application Extension for UML
Unit 06: The Web Application Extension for UMLUnit 06: The Web Application Extension for UML
Unit 06: The Web Application Extension for UML
 
TESTING Checklist
TESTING Checklist TESTING Checklist
TESTING Checklist
 
Unit 05: Physical Architecture Design
Unit 05: Physical Architecture DesignUnit 05: Physical Architecture Design
Unit 05: Physical Architecture Design
 
A perspective on web testing.ppt
A perspective on web testing.pptA perspective on web testing.ppt
A perspective on web testing.ppt
 
Training & development dhanu
Training & development dhanuTraining & development dhanu
Training & development dhanu
 
Open Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java DevelopersOpen Source ERP Technologies for Java Developers
Open Source ERP Technologies for Java Developers
 
Final Report Business Process Reengineering
Final Report Business Process ReengineeringFinal Report Business Process Reengineering
Final Report Business Process Reengineering
 
Mobile testing
Mobile testingMobile testing
Mobile testing
 
Web testing
Web testingWeb testing
Web testing
 
browser compatibility testing
browser compatibility testingbrowser compatibility testing
browser compatibility testing
 
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
Compatibility Testing of Your Web Apps - Tips and Tricks for Debugging Locall...
 
7 1-1 soap-developers_guide
7 1-1 soap-developers_guide7 1-1 soap-developers_guide
7 1-1 soap-developers_guide
 
Web Application Software Testing
Web Application Software TestingWeb Application Software Testing
Web Application Software Testing
 
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
Don't Drop the SOAP: Real World Web Service Testing for Web Hackers
 

Similaire à Unit 09: Web Application Testing

Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Carles Farré
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Visiontechmeetup
 
[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web TestingCarles Farré
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMijseajournal
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMijseajournal
 
Standards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentStandards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentSameer Chavan
 
MD Tareque Automation
MD Tareque AutomationMD Tareque Automation
MD Tareque AutomationMD Tareque
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Softwaredinasharawi
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Softwaredinasharawi
 
Usabilitydraft
UsabilitydraftUsabilitydraft
UsabilitydraftKimGriggs
 
Richa Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani
 
Aditya Vad_Resume
Aditya Vad_ResumeAditya Vad_Resume
Aditya Vad_ResumeAditya Vad
 

Similaire à Unit 09: Web Application Testing (20)

Web Usability (Slideshare Version)
Web Usability (Slideshare Version)Web Usability (Slideshare Version)
Web Usability (Slideshare Version)
 
User Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User VisionUser Testing talk by Chris Rourke of User Vision
User Testing talk by Chris Rourke of User Vision
 
Unit03: Process and Business Models
Unit03: Process and Business ModelsUnit03: Process and Business Models
Unit03: Process and Business Models
 
[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing[DSBW Spring 2009] Unit 09: Web Testing
[DSBW Spring 2009] Unit 09: Web Testing
 
QA_Resume
QA_ResumeQA_Resume
QA_Resume
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
 
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEMA RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
A RELIABLE AND AN EFFICIENT WEB TESTING SYSTEM
 
Standards Based Approach to User Interface Development
Standards Based Approach to User Interface DevelopmentStandards Based Approach to User Interface Development
Standards Based Approach to User Interface Development
 
MD Tareque Automation
MD Tareque AutomationMD Tareque Automation
MD Tareque Automation
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Software
 
Helpdesk Software
Helpdesk SoftwareHelpdesk Software
Helpdesk Software
 
Raji_QA
Raji_QARaji_QA
Raji_QA
 
Usabilitydraft
UsabilitydraftUsabilitydraft
Usabilitydraft
 
Kasi Viswanath
Kasi ViswanathKasi Viswanath
Kasi Viswanath
 
Richa Rani-QA Consultant
Richa Rani-QA ConsultantRicha Rani-QA Consultant
Richa Rani-QA Consultant
 
Aditya Vad_Resume
Aditya Vad_ResumeAditya Vad_Resume
Aditya Vad_Resume
 
QA_Resume
QA_ResumeQA_Resume
QA_Resume
 
Ooad
OoadOoad
Ooad
 
PARVATHY INDIRA
PARVATHY INDIRAPARVATHY INDIRA
PARVATHY INDIRA
 
195
195195
195
 

Plus de DSBW 2011/2002 - Carles Farré - Barcelona Tech (9)

Unit 08: Security for Web Applications
Unit 08: Security for Web ApplicationsUnit 08: Security for Web Applications
Unit 08: Security for Web Applications
 
Unit 07: Design Patterns and Frameworks (3/3)
Unit 07: Design Patterns and Frameworks (3/3)Unit 07: Design Patterns and Frameworks (3/3)
Unit 07: Design Patterns and Frameworks (3/3)
 
Unit 07: Design Patterns and Frameworks (2/3)
Unit 07: Design Patterns and Frameworks (2/3)Unit 07: Design Patterns and Frameworks (2/3)
Unit 07: Design Patterns and Frameworks (2/3)
 
Unit 07: Design Patterns and Frameworks (1/3)
Unit 07: Design Patterns and Frameworks (1/3)Unit 07: Design Patterns and Frameworks (1/3)
Unit 07: Design Patterns and Frameworks (1/3)
 
Unit 04: From Requirements to the UX Model
Unit 04: From Requirements to the UX ModelUnit 04: From Requirements to the UX Model
Unit 04: From Requirements to the UX Model
 
Unit 02: Web Technologies (2/2)
Unit 02: Web Technologies (2/2)Unit 02: Web Technologies (2/2)
Unit 02: Web Technologies (2/2)
 
Unit 02: Web Technologies (1/2)
Unit 02: Web Technologies (1/2)Unit 02: Web Technologies (1/2)
Unit 02: Web Technologies (1/2)
 
Unit 01 - Introduction
Unit 01 - IntroductionUnit 01 - Introduction
Unit 01 - Introduction
 
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
Unit 10: XML and Beyond (Sematic Web, Web Services, ...)
 

Dernier

GenAI and AI GCC State of AI_Object Automation Inc
GenAI and AI GCC State of AI_Object Automation IncGenAI and AI GCC State of AI_Object Automation Inc
GenAI and AI GCC State of AI_Object Automation IncObject Automation
 
Computer 10: Lesson 10 - Online Crimes and Hazards
Computer 10: Lesson 10 - Online Crimes and HazardsComputer 10: Lesson 10 - Online Crimes and Hazards
Computer 10: Lesson 10 - Online Crimes and HazardsSeth Reyes
 
Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1DianaGray10
 
Cloud Revolution: Exploring the New Wave of Serverless Spatial Data
Cloud Revolution: Exploring the New Wave of Serverless Spatial DataCloud Revolution: Exploring the New Wave of Serverless Spatial Data
Cloud Revolution: Exploring the New Wave of Serverless Spatial DataSafe Software
 
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just MinutesAI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just MinutesMd Hossain Ali
 
IaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdf
IaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdfIaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdf
IaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdfDaniel Santiago Silva Capera
 
Nanopower In Semiconductor Industry.pdf
Nanopower  In Semiconductor Industry.pdfNanopower  In Semiconductor Industry.pdf
Nanopower In Semiconductor Industry.pdfPedro Manuel
 
Comparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and IstioComparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and IstioChristian Posta
 
Videogame localization & technology_ how to enhance the power of translation.pdf
Videogame localization & technology_ how to enhance the power of translation.pdfVideogame localization & technology_ how to enhance the power of translation.pdf
Videogame localization & technology_ how to enhance the power of translation.pdfinfogdgmi
 
Salesforce Miami User Group Event - 1st Quarter 2024
Salesforce Miami User Group Event - 1st Quarter 2024Salesforce Miami User Group Event - 1st Quarter 2024
Salesforce Miami User Group Event - 1st Quarter 2024SkyPlanner
 
20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf
20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf
20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdfJamie (Taka) Wang
 
Designing A Time bound resource download URL
Designing A Time bound resource download URLDesigning A Time bound resource download URL
Designing A Time bound resource download URLRuncy Oommen
 
Do we need a new standard for visualizing the invisible?
Do we need a new standard for visualizing the invisible?Do we need a new standard for visualizing the invisible?
Do we need a new standard for visualizing the invisible?SANGHEE SHIN
 
PicPay - GenAI Finance Assistant - ChatGPT for Customer Service
PicPay - GenAI Finance Assistant - ChatGPT for Customer ServicePicPay - GenAI Finance Assistant - ChatGPT for Customer Service
PicPay - GenAI Finance Assistant - ChatGPT for Customer ServiceRenan Moreira de Oliveira
 
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1DianaGray10
 
Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Adtran
 
OpenShift Commons Paris - Choose Your Own Observability Adventure
OpenShift Commons Paris - Choose Your Own Observability AdventureOpenShift Commons Paris - Choose Your Own Observability Adventure
OpenShift Commons Paris - Choose Your Own Observability AdventureEric D. Schabell
 
COMPUTER 10: Lesson 7 - File Storage and Online Collaboration
COMPUTER 10: Lesson 7 - File Storage and Online CollaborationCOMPUTER 10: Lesson 7 - File Storage and Online Collaboration
COMPUTER 10: Lesson 7 - File Storage and Online Collaborationbruanjhuli
 
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPA
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPAAnypoint Code Builder , Google Pub sub connector and MuleSoft RPA
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPAshyamraj55
 
UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6DianaGray10
 

Dernier (20)

GenAI and AI GCC State of AI_Object Automation Inc
GenAI and AI GCC State of AI_Object Automation IncGenAI and AI GCC State of AI_Object Automation Inc
GenAI and AI GCC State of AI_Object Automation Inc
 
Computer 10: Lesson 10 - Online Crimes and Hazards
Computer 10: Lesson 10 - Online Crimes and HazardsComputer 10: Lesson 10 - Online Crimes and Hazards
Computer 10: Lesson 10 - Online Crimes and Hazards
 
Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1
 
Cloud Revolution: Exploring the New Wave of Serverless Spatial Data
Cloud Revolution: Exploring the New Wave of Serverless Spatial DataCloud Revolution: Exploring the New Wave of Serverless Spatial Data
Cloud Revolution: Exploring the New Wave of Serverless Spatial Data
 
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just MinutesAI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
 
IaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdf
IaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdfIaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdf
IaC & GitOps in a Nutshell - a FridayInANuthshell Episode.pdf
 
Nanopower In Semiconductor Industry.pdf
Nanopower  In Semiconductor Industry.pdfNanopower  In Semiconductor Industry.pdf
Nanopower In Semiconductor Industry.pdf
 
Comparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and IstioComparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and Istio
 
Videogame localization & technology_ how to enhance the power of translation.pdf
Videogame localization & technology_ how to enhance the power of translation.pdfVideogame localization & technology_ how to enhance the power of translation.pdf
Videogame localization & technology_ how to enhance the power of translation.pdf
 
Salesforce Miami User Group Event - 1st Quarter 2024
Salesforce Miami User Group Event - 1st Quarter 2024Salesforce Miami User Group Event - 1st Quarter 2024
Salesforce Miami User Group Event - 1st Quarter 2024
 
20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf
20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf
20200723_insight_release_plan_v6.pdf20200723_insight_release_plan_v6.pdf
 
Designing A Time bound resource download URL
Designing A Time bound resource download URLDesigning A Time bound resource download URL
Designing A Time bound resource download URL
 
Do we need a new standard for visualizing the invisible?
Do we need a new standard for visualizing the invisible?Do we need a new standard for visualizing the invisible?
Do we need a new standard for visualizing the invisible?
 
PicPay - GenAI Finance Assistant - ChatGPT for Customer Service
PicPay - GenAI Finance Assistant - ChatGPT for Customer ServicePicPay - GenAI Finance Assistant - ChatGPT for Customer Service
PicPay - GenAI Finance Assistant - ChatGPT for Customer Service
 
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
 
Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™
 
OpenShift Commons Paris - Choose Your Own Observability Adventure
OpenShift Commons Paris - Choose Your Own Observability AdventureOpenShift Commons Paris - Choose Your Own Observability Adventure
OpenShift Commons Paris - Choose Your Own Observability Adventure
 
COMPUTER 10: Lesson 7 - File Storage and Online Collaboration
COMPUTER 10: Lesson 7 - File Storage and Online CollaborationCOMPUTER 10: Lesson 7 - File Storage and Online Collaboration
COMPUTER 10: Lesson 7 - File Storage and Online Collaboration
 
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPA
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPAAnypoint Code Builder , Google Pub sub connector and MuleSoft RPA
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPA
 
UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6
 

Unit 09: Web Application Testing

  • 1. Unit 9: Web Application Testing  Testing is the activity conducted to evaluate the quality of a product and to improve it by finding errors. Testing dsbw 2011/2012 q1 1
  • 2. Testing Terminology  An error is “the difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition” (IEEE standard 610.12- 1990).  This “true, specified, or theoretically correct value or condition” comes from  A well-defined requirements model, if available and complete  An incomplete set of fuzzy and contradictory goals, concerns, and expectations of the stakeholders  A test is a set of test cases for a specific object under test: the whole Web application, components of a Web application, a system that runs a Web application, etc.  A single test case describes a set of inputs, execution conditions, and expected results, which are used to test a specific aspect of the object under test dsbw 2011/2012 q1 2
  • 3. Testing [and] Quality  Testing should address compliance not only to functional requirements but also to quality requirements, i.e., the kinds of quality characteristics expected by stakeholders.  ISO/IEC 9126-1 [Software] Quality Model: dsbw 2011/2012 q1 3
  • 4. Goals of Testing  The main goal of testing is to find errors but not to prove their absence  A test run is successful if errors are detected. Otherwise, it is unsuccessful and “a waste of time”.  Testing should adopt a risk-based approach:  Test first and with the greatest effort those critical parts of an application where the most dangerous errors are still undetected  A further aim of testing is to bring risks to light, not simply to demonstrate conformance to stated requirements.  Test as early as possible at the beginning of a project: errors happened in early development phases are harder to localize and more expensive to fix in later phases. dsbw 2011/2012 q1 4
  • 5. Test Levels (1/2)  Unit tests  Test the smallest testable units (classes, Web pages, etc.) independently of one another.  Performed by the developer during implementation.  Integration tests  Evaluate the interaction between distinct and separately tested units once they have been integrated.  Performed by a tester, a developer, or both jointly.  System tests  Test the complete, integrated system.  Typically performed by a specialized test team. dsbw 2011/2012 q1 5
  • 6. Test Levels (2/2)  Acceptance tests  Evaluate the system with the client in an “realistic” environment, i.e. with real conditions and real data.  Beta tests  Let friendly users work with early versions of a product to get early feedback.  Beta tests are unsystematic tests which rely on the number and “malevolence” of potential users. dsbw 2011/2012 q1 6
  • 7. Fitting Testing in the Development Process  Planning: Defines the quality goals, the general testing strategy, the test plans for all test levels, the metrics and measuring methods, and the test environment.  Preparing: Involves selecting the testing techniques and tools and specifying the test cases (including the test data).  Performing: Prepares the test infrastructure, runs the test cases, and then documents and evaluates the results.  Reporting: Summarizes the test results and produces the test reports. dsbw 2011/2012 q1 7
  • 8. Web Testing: A Road Map Content Interface Testing Testing Usability Testing user Navigation Testing Component Testing Configuration Testing Performance Security technology Testing Testing dsbw 2011/2012 q1 8
  • 9. Usability  Usability is a quality attribute that assesses how easy user interfaces are to use. Also refers to methods for improving ease-of-use during the design process.  Usability is defined by five quality components:  Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?  Efficiency: Once users have learned the design, how quickly can they perform tasks?  Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?  Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?  Satisfaction: How pleasant is it to use the design? dsbw 2011/2012 q1 9
  • 10. Why Usability matters*  62% of web shoppers gave up looking for an item. (Zona study)  50% of web sales are lost because visitors can’t easily find content. (Gartner Group)  40% of repeat visitors do not return due to a negative experience. (Zona study)  85% of visitors abandon a new site due to poor design. (cPulse)  Only 51% of sites complied with simple web usability principles. (Forrester study of 20 major sites) (*) data from www.usabilitynet.org/management/c_cost.htm dsbw 2011/2012 q1 10
  • 11. Why people fail Search Findability (IA, Category names, Navigation, Links) Page design (Readability, Layout, Graphics, Amateur, Scrolling) Information (Content, Product info, Corporate info, Prices) Task support (Workflow, Privacy, Forms, Comparison, Inflexible) Fancy design (Multimedia, Back button, PDF/Printing, New window, Sound) Other (Bugs, Presence on Web, Ads, New site, Usability problems weighted by how frequently they Metaphors) caused users to fail a task [NL06] dsbw 2011/2012 q1 11
  • 12. Top Ten (Usability) Mistakes in Web Design 1. Bad search 2. Pdf files for online reading 3. Not changing the color of visited links 4. Non-scannable text 5. Fixed font size 6. Page titles with low search engine visibility 7. Anything that looks like an advertisement 8. Violating design conventions 9. Opening new browser windows 10. Not answering users' questions dsbw 2011/2012 q1 12
  • 13. Assessing Usability  Two major types of assessing methods:  Usability evaluations:  Evaluators and no users  Techniques: surveys/questionnaires, observational evaluations, guideline based reviews, cognitive walkthroughs, expert reviews, heuristic evaluations  Usability tests: focus on users working with the product  Usability testing is the only way to know if the Web site actually has problems that keep people from having a successful and satisfying experience. dsbw 2011/2012 q1 13
  • 14. Usability Testing  Usability testing is a methodology that employs potential users to evaluate the degree to which a website/software meets predefined usability criteria.  Basic Process: 1. Watch Customers 2. They Perform Tasks 3. Note Their Problems 4. Make Recommendations 5. Iterate dsbw 2011/2012 q1 14
Measures of Usability

 Effectiveness (ability to successfully accomplish tasks)
    Percentage of goals/tasks achieved (success rate)
    Number of errors
 Efficiency (ability to accomplish tasks with speed and ease)
    Time to complete a task
    Frequency of requests for help
    Number of times the facilitator provides assistance
    Number of times the user gives up
dsbw 2011/2012 q1 15
Measures of Usability

 Satisfaction (pleasing to users)
    Positive and negative ratings on a satisfaction scale
    Ratio of favorable to unfavorable comments
    Number of good vs. bad features recalled after the test
    Number of users who would use the system again
    Number of times users express dissatisfaction or frustration
 Learnability (ability to learn how to use the site and remember it)
    Ratio of successes to failures
    Number of features that can be recalled after the test
dsbw 2011/2012 q1 16
Usability Testing Roles

 Facilitator:
    Oversees the entire test process
    Plans, tests, and reports
 Participant:
    An actual or potential customer
    Stand-ins for real users (e.g., marketing staff or designers) should be avoided
 Observer (optional):
    Records events as they occur
    Limits interaction with the participant
    Contributes to the report
dsbw 2011/2012 q1 17
Usability Testing Process

 Step 1: Planning the usability test
    Define what to test
    Define which customers should be tested
    Define what tasks should be tested
    Write usability scenarios and tasks
    Select participants
 Step 2: Conducting the usability test
    Conduct the test
    Collect data
 Step 3: Analyzing and reporting the usability test
    Compile results
    Make recommendations
dsbw 2011/2012 q1 18
People – Context – Activities

 Step 1: Planning the usability test
    Define what to test → Activities (Use Cases)
    Define which customers (user profiles) should be tested → People (Actors)
    Provide a background for the activities to test → Context
dsbw 2011/2012 q1 19
Usability Scenarios and Tasks

 Provide the participant with motivation and context to make the situation more realistic
 Include several tasks:
    Make the first task simple
    Give a goal, without describing steps
 Set some success criteria, for example:
    N% of test participants will be able to complete x% of tasks in the time allotted.
    Participants will be able to complete x% of tasks with no more than one error per task.
    N% of test participants will rate the system as highly usable on a scale of x to y.
dsbw 2011/2012 q1 20
Example of Scenario with Tasks

 Context:
    You want to book a sailing on Royal Caribbean International for next June with your church group. The group is called “Saint Francis Summer 2010”. The group is selling out fast, so you want to book a cabin that is close to an elevator because your leg hurts from a recent injury.
 Tasks to perform:
    1. Open your browser
    2. Click the link labeled “Royal Caribbean”
    3. Tell me the available cabins in the “Saint Francis Summer 2010” group
    4. Tell me a cabin number closest to an elevator
    5. Book the cabin that best suits your needs
dsbw 2011/2012 q1 21
Selecting Participants

 Recruit participants:
    In-house
    Via recruitment firms, databases, conferences
 Match participants with user profiles
 Numbers: of participants, of floaters
 Schedule test sessions
 Incentives:
    Gift checks (e.g., $100 per session)
    Food or gift cards
dsbw 2011/2012 q1 22
How Many Test Participants Are Required?

 The number of usability problems found in a usability test with n participants is N(1 − (1 − L)^n), where:
    N: total number of usability problems in the design
    L: the proportion of usability problems discovered while testing a single participant
 [Graph: proportion of problems found as a function of the number of participants, plotted for L = 31%.]
dsbw 2011/2012 q1 23
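To get a feel for the formula, it helps to evaluate it for a few values of n. The minimal sketch below normalizes N to 1 and uses L = 31%, the average value cited on the slide; the function name and the chosen values of n are illustrative.

```python
# Proportion of usability problems found by a test with n participants,
# i.e. N(1 - (1 - L)^n) with N normalized to 1.
def problems_found(n: int, L: float = 0.31) -> float:
    return 1 - (1 - L) ** n

for n in (1, 5, 10, 15):
    print(f"{n:2d} participants -> {problems_found(n):.1%} of problems found")

# With L = 0.31 this prints roughly 31%, 84%, 98% and 99.6%, which is
# the basis for the "5 participants per test round" advice on the next slide.
```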
How Many Test Participants Are Required?

 It seems that you need to test with at least 15 participants to discover all the usability problems.
 However, it is better to perform 3 tests with 5 participants each than one test with 15 participants:
    After the first test with 5 participants has found about 85% of the usability problems, you will want to fix them in a redesign.
    After creating the new design, you need to test again.
    The second test with 5 users will discover most of the remaining 15% of the original usability problems that were not found in the first test (and some new ones).
    The new test will also be able to uncover structural usability problems that were obscured in the initial study because users were stumped by surface-level usability problems.
    Fix the new problems, and test again…
dsbw 2011/2012 q1 24
Usability Labs … Not Necessary

 The testing room contains office furniture, video tape equipment, a microphone and a computer with appropriate software.
 The observer side contains a powerful computer to collect the usability data and analyze it. A one-way mirror separates the rooms.
dsbw 2011/2012 q1 25
Conducting Tests: Facilitator’s Role

 Start with an easy task to build confidence
 Sit beside the person, not behind the glass
 Use the “think-out-loud” protocol
 Give participants time to think it through
 Offer appropriate encouragement
 Lead participants, don’t answer questions (be an enabler)
 Don’t act knowledgeable (treat them as the experts)
 Don’t get too involved in data collection
 Don’t jump to conclusions
 Don’t solve their problems immediately
dsbw 2011/2012 q1 27
Collecting Data

 Performance
    Objective (what actually happened)
    Usually quantitative:
       Time to complete a task
       Time to recover from an error
       Number of errors
       Percentage of tasks completed successfully
       Number of clicks
       Pathway information
 Preference
    Subjective (what participants say/thought)
    Usually qualitative:
       Preference among versions
       Suggestions and comments
       Ratings or rankings (can be quantitative)
dsbw 2011/2012 q1 28
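As a rough illustration of how the quantitative performance measures above can be compiled, here is a minimal sketch; the TaskResult record and the sample numbers are hypothetical, not taken from any real test session.

```python
from dataclasses import dataclass

# Hypothetical record of one participant's attempt at one task;
# the field names are illustrative, not from the slides.
@dataclass
class TaskResult:
    completed: bool
    seconds: float
    errors: int

results = [
    TaskResult(True, 95.0, 1),
    TaskResult(True, 140.0, 0),
    TaskResult(False, 300.0, 4),
    TaskResult(True, 110.0, 2),
    TaskResult(True, 85.0, 0),
]

# Effectiveness: percentage of tasks completed successfully.
success_rate = sum(r.completed for r in results) / len(results)
# Efficiency: average completion time, counting successful attempts only.
avg_time = (sum(r.seconds for r in results if r.completed)
            / sum(r.completed for r in results))
total_errors = sum(r.errors for r in results)

print(f"Success rate: {success_rate:.0%}")        # 80%
print(f"Avg. time (successes): {avg_time:.0f} s") # 108 s
print(f"Total errors: {total_errors}")            # 7
```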
Report findings and recommendations

 Make the report usable for your users
 Include quantitative data (success rates, times, etc.)
 Avoid words like “few”, “many”, “several”: include counts
 Use quotes
 Use screenshots
 Mention positive findings
 Do not use participant names; use P1, P2, P3, etc.
 Include recommendations
 Make it short
dsbw 2011/2012 q1 29
Component Testing

 Focuses on a set of tests that attempt to uncover errors in WebApp functions
 Conventional black-box and white-box test case design methods can be used at each architectural layer (presentation, domain, data access)
 Form data can be exploited systematically to find errors (see the sketch below):
    Missing/incomplete data
    Type conversion problems
    Value boundary violations
    Fake data
    Etc.
 Database testing is often an integral part of the component-testing regime
dsbw 2011/2012 q1 30
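A minimal sketch of what such form-data tests can look like, written pytest-style; the validate_quantity function and its rules are hypothetical stand-ins for a real form handler.

```python
import pytest

# Hypothetical validator for an "order quantity" form field.
def validate_quantity(raw: str) -> int:
    value = int(raw)  # type conversion: raises ValueError for "", "abc", ...
    if not 1 <= value <= 99:
        raise ValueError("quantity out of range")
    return value

# Black-box cases derived from the error classes listed above:
# missing data, type conversion problems, value boundary violations.
@pytest.mark.parametrize("raw", ["", "abc", "0", "100", "-5"])
def test_invalid_quantity_is_rejected(raw):
    with pytest.raises(ValueError):
        validate_quantity(raw)

@pytest.mark.parametrize("raw,expected", [("1", 1), ("99", 99), ("50", 50)])
def test_boundary_and_nominal_values(raw, expected):
    assert validate_quantity(raw) == expected
```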
Configuration Testing: Server-Side Issues

 Is the WebApp fully compatible with the server OS?
 Are system files, directories, and related system data created correctly when the WebApp is operational?
 Do system security measures (e.g., firewalls or encryption) allow the WebApp to execute and service users without interference or performance degradation?
 Has the WebApp been tested with the distributed server configuration (if one exists) that has been chosen?
 Is the WebApp properly integrated with database software? Is the WebApp sensitive to different versions of database software?
 Do server-side WebApp scripts execute properly?
 Have system administrator errors been examined for their effect on WebApp operations?
 If proxy servers are used, have differences in their configuration been addressed with on-site testing?
dsbw 2011/2012 q1 31
Configuration Testing: Client-Side Issues

 Hardware: CPU, memory, storage and printing devices
 Operating systems: Linux, Macintosh OS, Microsoft Windows, a mobile-based OS
 Browser software: Internet Explorer, Mozilla/Netscape, Opera, Safari, and others
 User interface components: ActiveX, Java applets and others
 Plug-ins: QuickTime, RealPlayer, and many others
 Connectivity: cable, DSL, regular modem, T1
dsbw 2011/2012 q1 32
Security Testing

 Designed to probe vulnerabilities of:
    the client-side environment,
    the network communications that occur as data are passed from client to server and back again,
    and the server-side environment
 On the client side, vulnerabilities can often be traced to pre-existing bugs in browsers, e-mail programs, or communication software.
 On the network infrastructure
 On the server side:
    At host level
    At WebApp level
 Review the DSBW Unit on WebApp Security
dsbw 2011/2012 q1 33
Performance Testing: Main Questions

 Does the server response time degrade to a point where it is noticeable and unacceptable?
 At what point (in terms of users, transactions or data loading) does performance become unacceptable?
 What system components are responsible for performance degradation?
 What is the average response time for users under a variety of loading conditions?
 Does performance degradation have an impact on system security?
 Is WebApp reliability or accuracy affected as the load on the system grows?
 What happens when loads that are greater than maximum server capacity are applied?
dsbw 2011/2012 q1 34
Performance Testing: Load Tests

 A load test verifies whether or not the system meets the required response times and the required throughput.
 Steps (a minimal scripted example follows below):
    1. Determine the load profiles (what access types, how many visits per day, at what peak times, how many visits per session, how many transactions per session, etc.) and the transaction mix (which functions shall be executed with which percentage).
    2. Determine the target values for response times and throughput (in normal operation and at peak times, for simple or complex accesses, with minimum, maximum, and average values).
    3. Run the tests, generating the workload with the transaction mix defined in the load profile, and measure the response times and the throughput.
    4. Evaluate the results and identify potential bottlenecks.
dsbw 2011/2012 q1 35
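One way to script such a workload is with a load-testing tool. The sketch below uses Locust (one tool among many, not prescribed by the slides), with hypothetical endpoints and a 3:1 transaction mix standing in for a real load profile.

```python
# locustfile.py -- a minimal load-test sketch using Locust.
from locust import HttpUser, task, between

class ShopVisitor(HttpUser):
    # Simulated think time between requests, per virtual user.
    wait_time = between(1, 5)

    @task(3)  # weight 3: browsing dominates the transaction mix
    def browse_catalog(self):
        self.client.get("/catalog")  # hypothetical endpoint

    @task(1)  # weight 1: searches are rarer
    def search(self):
        self.client.get("/search", params={"q": "book"})
```

Run it with, e.g., `locust -f locustfile.py --host https://shop.example --users 100 --spawn-rate 10`, and compare the measured response times and throughput against the target values from step 2.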
Performance Testing: Stress Tests

 A stress test verifies whether or not the system reacts in a controlled way in “stress situations”, which are simulated by applying extreme conditions, such as unrealistic overload or heavily fluctuating load.
 The test is aimed at answering these questions:
    Does the server degrade ‘gently’ or does it shut down as capacity is exceeded?
    Does server software generate “server not available” messages? More generally, are users aware that they cannot reach the server?
    Are transactions lost as capacity is exceeded?
    Is data integrity affected as capacity is exceeded?
dsbw 2011/2012 q1 36
Performance Testing: Stress Tests (cont.)

 Under what load conditions does the server environment fail? How does failure manifest itself? Are automated notifications sent to technical support staff at the server site?
 If the system does fail, how long will it take to come back on-line?
 Are certain WebApp functions (e.g., compute-intensive functionality, data streaming capabilities) discontinued as capacity reaches the 80 or 90% level?
dsbw 2011/2012 q1 37
Performance Testing: Interpreting Graphics

 Load: the number of requests that arrive at the system per time unit
 Throughput: the number of requests served per time unit
 SLA: Service Level Agreement
 [Graphs: throughput and response time plotted against increasing load (not reproduced).]
dsbw 2011/2012 q1 38
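To connect these terms, a toy model (an assumption of this edition, not taken from the slides) can mimic the typical shape of such graphs: throughput tracks load until server capacity is reached, while response time grows sharply as utilization approaches 100% and crosses the SLA limit well before the capacity ceiling.

```python
# Toy saturation model: throughput saturates at capacity, and response
# time grows with a 1/(1 - utilization) factor. All numbers are invented.
CAPACITY = 200.0   # max requests/s the server can serve
BASE_MS = 50.0     # response time under negligible load
SLA_MS = 500.0     # agreed response-time limit

for load in (50, 100, 150, 180, 190, 199):
    utilization = load / CAPACITY
    throughput = min(load, CAPACITY)
    response_ms = BASE_MS / (1 - utilization)
    flag = " <-- SLA violated" if response_ms > SLA_MS else ""
    print(f"load={load:3d} req/s  throughput={throughput:5.1f} req/s  "
          f"response={response_ms:7.1f} ms{flag}")
```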
Test Automation

 Automation can significantly increase the efficiency of testing, and it enables new types of tests that also increase the scope (e.g., different test objects and quality characteristics) and depth of testing (e.g., large amounts and combinations of input data).
 Test automation brings the following benefits:
    Running automated regression tests on new versions of a WebApp detects defects caused by side effects on unchanged functionality (see the sketch below).
    Various test methods and techniques would be difficult or impossible to perform manually. For example, load and stress testing requires simulating a large number of concurrent users.
    Automation allows more tests to be run in less time and, thus, the tests to be run more often, leading to greater confidence in the system under test.
 Web Site Test Tools: http://www.softwareqatest.com/qatweb1.html
dsbw 2011/2012 q1 39
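A regression test of this kind is often scripted against the browser; the sketch below uses Selenium WebDriver, one of the tool families catalogued at the URL above. The base URL, the field name "q", and the expected page title are all hypothetical.

```python
import unittest
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://shop.example"  # hypothetical application under test

class SearchRegressionTest(unittest.TestCase):
    def setUp(self):
        # Requires a locally installed Chrome and chromedriver.
        self.driver = webdriver.Chrome()

    def test_search_returns_results_page(self):
        self.driver.get(BASE_URL)
        box = self.driver.find_element(By.NAME, "q")  # assumed field name
        box.send_keys("usability")
        box.submit()
        # Assumed convention: the results page title mentions "results".
        self.assertIn("results", self.driver.title.lower())

    def tearDown(self):
        self.driver.quit()

if __name__ == "__main__":
    unittest.main()
```

Rerunning this suite on every new version of the WebApp is exactly the regression safety net described above: a change that silently breaks the search form fails the test instead of reaching users.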
References

 R. S. Pressman, D. Lowe: Web Engineering: A Practitioner’s Approach. McGraw-Hill, 2008. Chapter 15.
 G. Kappel et al.: Web Engineering. John Wiley & Sons, 2006. Chapter 7.
 [NL06] J. Nielsen, H. Loranger: Prioritizing Web Usability. New Riders Publishing, 2006.
 www.useit.com (Jakob Nielsen)
 www.usability.gov
dsbw 2011/2012 q1 40