Qual Quiz2

  1. Testing Techniques Categories
    • general testing
    • functional testing
    • non-functional testing
    • volume testing
  2. Testing Technique 1: General Testing
    • Positive Testing: using valid data as input.
    • Does the system do what the requirements state when given valid input?
    • Negative Testing: using invalid data as input.
    • Ensures the application does not do things it should not. Can be vague, open-ended and expensive.
    • Error guessing: sample guesses include blank input, null input, extra blanks, divide by zero, non-numeric data.
    • Automated Software Testing: each test case defines prerequisites, input and expected results.
    • Compare actual results to expected results.
    • Useful when a test must be repeated, e.g. across multiple platforms. (A minimal sketch of these techniques follows this list.)
    • Black Box: more later
    • White Box: more later
    • Grey Box: more later
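    • A minimal sketch, in Python, of how positive testing, negative testing and error guessing become automated test cases that compare actual results to expected results. The parse_age function and the pytest runner are assumptions added here for illustration only.

        import pytest

        def parse_age(text):
            # Hypothetical unit under test: convert a text field to an age in years.
            if text is None or text.strip() == "":
                raise ValueError("age is required")
            if not text.strip().isdigit():
                raise ValueError("age must be a whole number")
            age = int(text)
            if age == 0 or age > 150:
                raise ValueError("age out of range")
            return age

        def test_positive_valid_age():
            # Positive test: valid input, compare actual result to expected result.
            assert parse_age("42") == 42

        def test_negative_inputs_from_error_guessing():
            # Negative tests built from error guesses: null, blank, extra blanks, non-numeric.
            for bad in [None, "", "   ", "abc", "-1", "3.5"]:
                with pytest.raises(ValueError):
                    parse_age(bad)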
  3. Black Box
    • The concept is a box you can’t see into.
    • Also called behavioural testing or specification-based testing.
    • Testing an app through the user interface, that is, how does the user see it?
    • What the software does, not how it does it.
    • Where quality intersects with business goals, ensuring those goals are being met.
    • Focuses on functionality without looking at the internals.
    • All levels of testing: unit testing through to acceptance testing.
    • Done by tester during system test.
    • Done by user during user acceptance test.
    • Can be manual or automated but the idea is that you are looking at the product from the outside.
    • A black box tester advocates for the user.
    • Tester is often not aware of how the application is constructed.
    • Especially useful for: legacy code where the source is lost.
    • Compiled DLLs.
    • Commercial software with no source code available.
  4. White Box
    • Concept is a transparent box where you can see inside and focus on the internals.
    • Also called clear box, structural or structure-based testing.
    • You're ensuring that you're building the product you set out to build and that all the internals fit together.
    • Where quality and development overlap.
    • Validating everything done under the hood.
    • Doesn’t care about UI (user interface) or UX (user experience).
    • Test cases examine every IF and every CASE (see the branch-coverage sketch after this list).
    • At the static level, white box testing is code inspections.
    • At the dynamic level, testing code by running it: unit testing and integration testing.
    • All levels of testing.
    • Done by developers, or by testers working closely with developers.
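    • A branch-coverage sketch in Python: shipping_cost is a hypothetical function invented here so that each IF branch can be exercised by one white box test case (pytest assumed as the runner).

        import pytest

        def shipping_cost(weight_kg, express):
            # Hypothetical unit under test; every IF/ELIF/ELSE branch is a test target.
            if weight_kg <= 0:
                raise ValueError("weight must be positive")
            if weight_kg <= 1:
                base = 5.0
            elif weight_kg <= 10:
                base = 12.0
            else:
                base = 30.0
            return base * 2 if express else base

        def test_every_branch_is_exercised():
            with pytest.raises(ValueError):
                shipping_cost(0, express=False)               # invalid-weight branch
            assert shipping_cost(0.5, express=False) == 5.0   # weight <= 1 branch
            assert shipping_cost(5, express=False) == 12.0    # weight <= 10 branch
            assert shipping_cost(20, express=False) == 30.0   # weight > 10 branch
            assert shipping_cost(5, express=True) == 24.0     # express branch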
  5. Grey Box
    • Concept is a semitransparent box that allows for a better understanding of the test cases and what they mean.
    • The tester looks at the app from the outside in but has some knowledge of the internals.
    • Test cases are designed using internal data structures and algorithms. However, the testing itself is done at the black box level.
    • A combination of black box and white box testing.
    • It’s a bit of a grey area as to what exactly is categorized in grey box testing.
  6. Testing Technique 2: Functional Testing
    • confirm that an application meets its functional requirements, including:
    • Validation: the functions exist and work.
    • Suitability: does it perform in an appropriate manner? Do you really need a 35-character email password?
    • Compliance: relates to industry/government compliance and laws.
    • Accuracy: does the software produce accurate results?
    • Equivalence Partitioning
    • Boundary Analysis Testing (see the sketch after this list for both techniques)
    • Loop Boundary Testing
    • Coverage Testing: verify that every line of code has been tested (all paths).
    • Static Testing
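    • A short sketch of equivalence partitioning and boundary value analysis in Python. The password_ok rule (valid length 8 to 64 characters) is an assumption made up for the example; pytest is the assumed runner.

        def password_ok(pw):
            # Hypothetical unit under test: passwords of 8 to 64 characters are valid.
            return 8 <= len(pw) <= 64

        def test_equivalence_partitions():
            # One representative value from each partition: too short, valid, too long.
            assert password_ok("a" * 3) is False
            assert password_ok("a" * 20) is True
            assert password_ok("a" * 100) is False

        def test_boundary_values():
            # Boundary analysis: just below, on, and just above each boundary.
            assert password_ok("a" * 7) is False
            assert password_ok("a" * 8) is True
            assert password_ok("a" * 64) is True
            assert password_ok("a" * 65) is False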
  7. Testing Technique 3: Non-Functional Testing
    • confirm that an application meets its non-functional requirements. These are aspects of the software that may not be related to a specific function or user action.
    • Done after functional testing.
    • Often subjective, rated on a scale (1 – 5). For example: easy/average/hard; slow/normal/fast.
    • Best done by development team at system test phase.
    • Usability test: how easy is it for the user? Novices want help, experienced want shortcuts. Use focus groups.
    • Configuration test: are the hardware, OS, network and database installed and working properly?
    • Compatibility test: ensure the application does not cause problems for other applications. Ensure application interfaces well with other applications.
    • Load test: how the system behaves at normal and at peak conditions.
    • Performance test: speed and time based (see the timing sketch after this list).
    • Reliability test: does it work all the time in every way without corruption or data loss?
    • Security test: access to data works and restrictions are in place. Hacking attempts are carried out. 
    • Stress test: can the system handle more than the normal number of transactions at the same time?
    • Fault recovery: if plug is pulled, will it restart okay? If fire, can it be rebuilt elsewhere?
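    • A tiny timing sketch in Python of the load/performance idea. Real load and performance testing uses dedicated tools; here lookup_patient and the 1 ms budget are assumptions for illustration only.

        import time

        PATIENTS = {i: {"id": i, "name": f"patient-{i}"} for i in range(10_000)}

        def lookup_patient(record_id):
            # Hypothetical operation standing in for a real request.
            return PATIENTS[record_id]

        def test_average_response_time_under_normal_load():
            # Drive many requests and check the average response time against a budget.
            n = 5_000
            start = time.perf_counter()
            for i in range(n):
                lookup_patient(i)
            average = (time.perf_counter() - start) / n
            assert average < 0.001   # assumed requirement: 1 ms per lookup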
  8. Testing Technique 4: Volume Testing
    • Test the system with a large number of transactions to mirror the expected real world (see the sketch after this list).
    • Watch for counter, memory and file overflows.
    • Best done at system test. Often left until user acceptance test.
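    • A minimal volume-test sketch in Python: push a large batch of transactions through an in-memory SQLite table and confirm the counts still add up. The table layout and the 100,000-row volume are assumptions for illustration.

        import sqlite3

        def test_large_transaction_volume():
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, amount REAL)")
            n = 100_000   # assumed "expected real world" volume
            conn.executemany(
                "INSERT INTO txn (amount) VALUES (?)",
                ((float(i),) for i in range(n)),
            )
            conn.commit()
            (count,) = conn.execute("SELECT COUNT(*) FROM txn").fetchone()
            assert count == n   # nothing lost or truncated under volume
            conn.close()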
  9. Testing Phases map
    [Image: map of the testing phases]
  10. Testing Phases Summary
    • Tests can be divided into different phases.
    • Each phase is meant to cover a different aspect of the software to be tested.
    • The objective is to cover all aspects yet avoid redundancy.
    • Phases: unit testing, integration testing, system testing and acceptance testing.
    • Regression testing must also be considered.
  11. Unit Testing
    • Also called component testing or module testing.
    • It is isolated testing. The goal is to verify one component (module, class …) works independently and is reliable (see the sketch after this list).
    • The programmer who coded the unit usually designs the tests.
    • Defects are normally found, debugged and fixed by the developer.
    • Should also include static testing (code review).
    • Unit test plan is followed. More on this later.
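    • A minimal unit test sketch in Python: one small component (a hypothetical Invoice class invented for the example) is tested in isolation, with no database, network or other modules involved; pytest is the assumed runner.

        import pytest

        class Invoice:
            # Hypothetical component under test.
            def __init__(self):
                self.lines = []

            def add_line(self, description, quantity, unit_price):
                if quantity <= 0 or unit_price < 0:
                    raise ValueError("invalid line item")
                self.lines.append((description, quantity, unit_price))

            def total(self):
                return sum(q * p for _, q, p in self.lines)

        def test_total_of_added_lines():
            inv = Invoice()
            inv.add_line("bandages", 2, 3.50)
            inv.add_line("gauze", 1, 4.00)
            assert inv.total() == 11.00

        def test_rejects_invalid_line_item():
            with pytest.raises(ValueError):
                Invoice().add_line("bandages", 0, 3.50)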
  12. Integration Testing
    • Can be called link testing or interoperability testing.
    • Test that all modules work together as a whole.
    • Ensures the output of one module is valid for the next module.
    • Module definition: group of functions that provide a well-defined service and communicate with other modules through a defined interface.
    • Examples: a method in a class.
    • An individual program in a large system: payroll, accounts receivable, accounts payable.
    • An HTML page on a web site.
    • Generally performed by development team.
    • Can be a good time for users to get an early view and to verify test results.
  13. Integration Test Plan
    • Test plan should be created at module design time.
    • Test cases concentrate on demonstrating the interfaces and interactions between the modules.
    • Requires that the components have already been unit tested.
  14. Integration Testing Test Cases
    • Review design documents that show module interaction (org charts, structure charts, UML interaction diagrams).
    • For each call, create a test case to ensure:
    • Each returned value is correct: every possible value or every possible error.
    • The correct module is actually called (see the mock-based sketch after this list).
    • All parameters are correct (valid and invalid).
    • No side effects: one module destroying another module’s resources (memory, data set).
    • Reliability: call is repeatable.
    • Create iterative combinations that test all pairs of interactions, then more complex ones. Example: A calls B, B calls A, A calls A then B.
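    • A mock-based sketch in Python of an integration-style test case: it checks that the correct module is called, that the parameters passed are correct, and that the returned value comes back unchanged. IntakeService and save_patient are hypothetical names; unittest.mock is from the standard library.

        from unittest.mock import Mock

        class IntakeService:
            # Hypothetical calling module; the records repository is the called module.
            def __init__(self, records_repo):
                self.records_repo = records_repo

            def register(self, name, email):
                return self.records_repo.save_patient(name=name, email=email)

        def test_intake_calls_records_with_correct_parameters():
            repo = Mock()
            repo.save_patient.return_value = 42   # stubbed return value from the callee

            service = IntakeService(repo)
            assert service.register("Ann Anna", "ann@example.com") == 42

            # Correct module called, exactly once, with correct parameters.
            repo.save_patient.assert_called_once_with(name="Ann Anna", email="ann@example.com")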
  15. System Testing
    • Verifies the entire system meets requirements.
    • Test the application running with other applications.
    • Verify the new system does not adversely affect other systems.
    • Usually done by the test team.
    • Exercises all functions from end to end.
    • Includes installation testing, multiple environments and stress/volume tests.
  16. Acceptance Testing
    • Also called UAT: user acceptance testing.
    • Confirm the application meets the business requirements. Final test level to validate the system.
    • Can happen when the client asks to see and test the software themselves.
    • UAT test plan is followed. More on this later.
    • Repeat of system testing end to end.
    • Performed by black box testers and users.
    • Alpha testing is done on the developer’s side by bringing in select clients.
    • Beta testing is done by allowing selected clients to run software in their own environments.
  17. Regression Testing
    • Verify that any changes made to fix a problem did not break something else.
    • Helpful to know that legacy code will not be negatively affected.
    • Done by re-using old test scripts (see the sketch after this list).
    • Often where automated test tools are used.
    • Also required when other factors change: OS, database version, hardware, compiler.
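    • A regression-suite sketch in Python: old test cases are kept as data and re-run (here with pytest.mark.parametrize) after every change, so earlier behaviour is confirmed to still hold. The discount function and its cases are assumptions for illustration.

        import pytest

        def discount(order_total):
            # Hypothetical unit that has been fixed/changed over time.
            if order_total >= 100:
                return round(order_total * 0.10, 2)
            return 0.0

        # Old test scripts preserved as data; the whole table is re-run on every change.
        REGRESSION_CASES = [
            (50.00, 0.0),
            (99.99, 0.0),
            (100.00, 10.00),
            (250.00, 25.00),
        ]

        @pytest.mark.parametrize("order_total, expected", REGRESSION_CASES)
        def test_regression_suite(order_total, expected):
            assert discount(order_total) == expected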
  18. When To Stop Testing
    • The problem is that testing is costly.
    • It’s nice to say test and re-test fully but time and money run out.
    • High risk requirements are a priority and are done first.
    • After that, the goal can be to achieve code coverage: every line of code is executed at least once.
    • One theory is to stop when the number of defects being found approaches 0.
  19. Defect Tracking
    • Testing finds bugs but does not fix them. Developers fix them.
    • Communication and documentation are key.
    • Test management software records id, severity level and status (open, fixed). The software provides metrics (see the sketch after this list).
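    • A sketch in Python of the kind of record a defect tracking tool keeps (id, severity level, status) and a simple metric derived from it. The Defect dataclass is an illustration, not any particular tool's data model.

        from dataclasses import dataclass

        @dataclass
        class Defect:
            defect_id: int
            severity: str          # e.g. "critical", "major", "minor"
            status: str = "open"   # "open" or "fixed"

        defects = [
            Defect(1, "critical"),
            Defect(2, "minor", status="fixed"),
            Defect(3, "major"),
        ]

        # Metric: how many defects remain open.
        open_count = sum(1 for d in defects if d.status == "open")
        print(f"{open_count} of {len(defects)} defects still open")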
  20. A software test plan...
    • describes and guides the testing process.
    • typically created at design time.
    • It includes:
    • Definition of the testing environment including hardware and software.
    • Entry and exit requirements.
    • Constraints.
    • Tested items: unique identifiers for each software product involved.
    • Requirements traceability: each user requirement is mapped to one or more tests.
    • Test recording procedures: how test results are to be recorded and audited.
    • Testing schedule: who and when.
    • Roles and responsibilities.
    • Budget.
    • Resources: quantity of staff, facilities, hardware, network and training.
  21. A test script is...
    • the document that describes the tests to be performed and the execution order of those tests.
    • It can be a manual test script or an automation test script.
    • A test script includes valid and invalid data.
  22. Test Conditions
  23. Test Cases
    • ensure business requirements are met.
    • prepared using detailed design specs.
    • defines an exact scenario and states the expected outcome (very specific).
    • can be 'hand made', or
    • can use live data, though there are security and corruption concerns (a template of live data can be extracted, modified and used).
  24. Sample Test Scripts (UAT Test Plan)
    Test Script 1: Patients

    Test Condition: New patient intake...
    Test Case 1
    Test Case 2
    • Test Condition: New patient intake data successfully added to clinic records.
    • Test Case 1:
        Requirement Being Tested: New Patient Intake
        Test ID: 0007770
        Tester Assigned (name and role): Ann Anna - UAT Tester
        Date Completed: April 8, 2017
        Test Environment: iPad Air 2
        Entry Requirements/Pre-Conditions: Table must be created. Admin staff need read/write permissions.
        Input Value: Add a new patient filling in all fields.
        Expected Results: Verify all fields added accurately. File folder label printed.
    • Test Case 2:
        Requirement Being Tested: New Patient Email
        Test ID: 0007775
        Tester Assigned (name and role): Ann Anna - UAT Tester
        Date Completed: April 8, 2017
        Test Environment: iPhone 7
        Entry Requirements/Pre-Conditions: Table must be created. Admin staff need read/write permissions. New patient added to table.
        Input Values: Dummy email address.
        Expected Results: Email received by dummy test account.
  25. Sample Test Scripts (UAT Test Plan)
    Test Script 1: Patients
         Test Condition?
    Test Script 2: Physicians
         Test Condition?
    • Test Script 1: Patients
    • Test Condition: New patient intake data successfully added to clinic records. Client notified.

    • Test Script 2: Physicians
    • Test Condition: Produce, verify and submit MOHLTC bill.
  26. Sample Test Scripts (Unit Test Plan)
    Test Script 1: Patients
        Test Condition?
    Test Script 2: Medications
        Test Condition?
    • Test Script 1: Patients
    • Test Condition: receive and validate new address then store in table.

    • Test Script 2: Medications
    • Test Condition: ensure doctor’s recommendation is correct before dispensing medication.