Manual, Visual, and Automated Testing For Web Apps

Manual, Visual, and Automated Testing For Web Apps

  1. Manual, Visual, and Automated Testing For Web Apps. Join Ministry of Testing and SmartBear as we explore how to conduct complete, end-to-end testing on web applications. TUESDAY, JUNE 20TH, 6:00 PM EST. Learn more about CrossBrowserTesting.
  2. Today’s Agenda • Our Testing Playbook • Challenges, plan, data, environments. • Browser Testing • There are so many of them. • The Non-Automated Part • The tools and tactics to be a successful manual tester. • The Automated Part • What, how, when, and why we automate testing.
  3. Web Testing Challenges
  4. Questions For Web Testers • Who will test? Dev team, dedicated QA team, project managers? • What type of testing? Functional, performance, load, accessibility. • When will you test? Each commit, every deployment, nightly, weekly. • What are we testing? GUI, API, end-to-end.
  5. Our Test’s Data • No Data: zero, null, blank • Valid Data: valid data strings • Invalid Data: wrong language, incorrect input • Stressed Data: out of range, injections. Example: driver.find_element_by_name("email").send_keys("daniel@crossbrowsertesting.com") driver.find_element_by_name("password").clear() driver.find_element_by_name("password").send_keys("WrongPassWord")
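The four data classes on this slide can be captured in a small table that a test loop iterates over. A minimal sketch in Python, using the Selenium 3-style locator from the slide; the `LOGIN_TEST_DATA` values and the helper names are illustrative assumptions, not taken from the deck:

```python
# Test-data classes from the slide: no data, valid, invalid, and stressed.
LOGIN_TEST_DATA = {
    "no_data": ["", None],                              # zero, null, blank
    "valid": ["daniel@crossbrowsertesting.com"],        # valid data strings
    "invalid": ["not-an-email", "daniel@"],             # incorrect input
    "stressed": ["a" * 10_000, "' OR '1'='1"],          # out of range, injection
}

def expected_to_succeed(category):
    """Only the valid class should log in; every other class must be rejected."""
    return category == "valid"

def run_login_suite(driver):
    """Feed each data class into the login form (a Selenium driver is assumed)."""
    for category, values in LOGIN_TEST_DATA.items():
        for value in values:
            field = driver.find_element_by_name("email")
            field.clear()
            field.send_keys(value or "")
            # ...submit the form, then compare the result
            # against expected_to_succeed(category).
```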
  6. State • Subscription Status • Email Verified • Custom Pricing • Referrer. States: Trial → Subscribed → Sub-Cancelled → Unsubscribed.
  7. Cross Browser Testing. “Bob, looks like it works on my machine. Not sure what the problem is.”
  8. Our Test Environments • Operating Systems • Devices • Browsers • Breakpoints: 1366 x 768 | 1920 x 1080 | 1600 x 900 | 1280 x 1024 | 1440 x 900
  9. Grab Info From Analytics
  10. The “Spreadsheet”: a coverage matrix of the “Switch Plan” test case per state (Trial, Subscribed, Cancelled) against each environment: Windows 10 and 7 (Chrome, Firefox, IE), macOS 10.12 and 10.11 (Safari, Chrome), and iPhone 7 Plus, 7, and 6S (Safari), with an X marking each combination to run.
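A matrix like this can be generated rather than hand-maintained. A small sketch that enumerates the state x environment combinations; the environment list mirrors the slide, and the tuple layout is an illustrative choice:

```python
from itertools import product

STATES = ["Trial", "Subscribed", "Cancelled"]
ENVIRONMENTS = [
    ("Windows 10", "Chrome"), ("Windows 10", "Firefox"), ("Windows 10", "IE"),
    ("Windows 7", "Chrome"), ("Windows 7", "Firefox"), ("Windows 7", "IE"),
    ("macOS 10.12", "Safari"), ("macOS 10.12", "Chrome"),
    ("macOS 10.11", "Safari"), ("macOS 10.11", "Chrome"),
    ("iPhone 7 Plus", "Safari"), ("iPhone 7", "Safari"), ("iPhone 6S", "Safari"),
]

# Every cell of the "Switch Plan" spreadsheet: one (state, OS, browser) tuple.
matrix = [(state, os_name, browser)
          for state, (os_name, browser) in product(STATES, ENVIRONMENTS)]
```

With 3 states and 13 environments, that is 39 cells to cover, which is exactly the kind of multiplying factor that pushes teams toward parallel, cloud-based execution.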
  11. Consider Cloud Testing Tools • No Infrastructure Needed • Test More Browsers • Test In Parallel
  12. Non-Automated Testing. How to go about an effective manual testing strategy.
  13. Starting Out • Exploratory Testing: maximum test execution. Test baby, test! • Write Test Cases: be descriptive and thorough. • Smoke Testing: MVT, Minimum Viable Testing. • Manual Testing: someone’s gotta do it.
  14. Exploratory Testing
  15. An Example Test Case. Title: Logging into CrossBrowserTesting. Description: A registered user should be able to log in successfully to our application. Conditions/State: the user is “Subscribed”. Steps: 1. Visit CrossBrowserTesting.com 2. Enter email into the "email" field 3. Enter a password into the "password" field 4. Click “Log In”. Expected Result: the user lands on the “Live Testing” page.
  16. Tools For Manual Web Testing • Link Checkers: ScreamingFrog & MonkeyTest.It • HTTP Traffic: Fiddler, Chrome DevTools, Firebug • Performance Testing: WebPageTest.org, AlertSite, Google PageSpeed, Optimizilla, JMeter
  17. Visual Smoke Testing
  18. Automated Testing. Let’s go faster, get more coverage, and make our lives easier.
  19. Why Automate Testing? • Improve our test coverage • Decrease deployment times • Save time and resources
  20. What Do We Automate? • When the business flow is critical • Repetitive by nature • Regression tests • Data-driven testing • Longer tests • Tests reused as performance tests
  21. Basic Anatomy Of A Selenium Test
  22. Questions? Name, Position, Email, Phone Number. Running It In The Cloud. Learn more about CrossBrowserTesting.

Editor's notes

  • The challenges of web testing and how the CBT team approaches our own testing.
  • JavaScript renders differently. It’s like a surprise in every different browser.

    Mobile Devices and Screen Sizes.

    Websites deploy lightning quick. Your marketing team has never heard of CI/CD, but that's effectively how they operate.

    Not the state of Massachusetts. User state is one of the most difficult aspects we deal with when testing our web application. Over time, there are just a lot of different variables in our table.
  • Web testing is often done by a unique team, made up of many different players and stakeholders. While Dev, Product, and Marketing can share responsibility, who is accountable?

    What type of testing are we going to do. Functional, Performance, Load Testing

    When will you test

    What are we testing
  • Test data is actually the input given to a software program. It represents data that affects or is affected by the execution of the specific module. Some data may be used for positive testing, typically to verify that a given set of input to a given function produces an expected result. Other data may be used for negative testing to test the ability of the program to handle unusual, extreme, exceptional, or unexpected input.

    Positive Data – works as expected
    No Data – none, null, default
    Valid Data – data that works, but the outcome is wrong
    Invalid Data – data that does not work
    Stressed Data – data that doesn't work, and the outcome is wonky

    Test data can be generated by:
    Mass copy of data from production to the testing environment
    Mass copy of test data from legacy client systems
    Automated test data generation tools
  • State is the single hardest piece of added complexity when it comes to testing our web applications.

    Not only does it add a multiplying factor to our test cases for each state, it is also heavily relied upon by other systems: our marketing and sales cadence flows completely off our states.
  • Why do we do cross browser testing?
    - No longer a question of if it works, but if it works on Windows 7 running Chrome 46, or X browser on X device.
    We want the confidence it brings knowing our application works for every customer
    Necessary for responsive testing.
  • Take weekly, monthly, and quarterly snapshots and see how they compare. Oftentimes, taking too long a look distorts your true browser habits.

    The top 10 browsers get us ~45% of coverage.

    Good, but not great. And from a time and resource perspective, manageable only with automated testing.

  • We don’t do exploratory testing every release, every day, or probably even every week.

    But it is a critical piece of software testing, as you can either wait until a Customer tells you it’s broken – or go digging yourself. Repeating test cases, whether it be manual or automated is actually not a great way to find new bugs!

    Record the session, focus on testing, then rewatch to document bugs and write your test cases.