Author: Neftaly Malatjie

  • 114066 LG 1.37 Integration Testing Steps

      • Integration tests will be created and performed by designated members of each team. Each individual is responsible for preparing all test cases, procedures and data, for conducting and documenting the tests, and for specifying any additional tools and facilities required for the integration testing of their tasks. The procedure for integration testing is as follows:

        • Review all relevant design documentation and attend all design overviews/walkthroughs.
        • Create an integration test plan.
        • Where possible, create scripts to automate the execution of the test case.
        • Arrange to have Integration Test Plans reviewed by Development for technical accuracy. The Test Plans may need to be updated after these reviews to incorporate changes suggested by the Developers.
        • Conduct the test as specified in the test cases.
        • Identify any problems encountered, or any cases where the actual results do not agree with the defined expected results, and complete a Problem Report (see the TBU Problem Report and System User Guide for the procedure for handling problem reports). Update the Test Plan execution status in the tracking document (see the appendix entitled Matrices, Logs and Indices).

        Once all problems have been resolved, re-run the necessary tests.
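The steps above can be sketched as a small test harness: it conducts each test case, compares actual against expected results, raises a problem-report entry on any disagreement, and updates the execution-status tracking. This is a minimal illustration only; the case IDs, result values, and record structure are hypothetical, not part of the procedure text.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    description: str
    run: callable      # function that conducts the test and returns the actual result
    expected: object   # defined expected result from the Test Plan

@dataclass
class TestRun:
    problem_reports: list = field(default_factory=list)
    execution_status: dict = field(default_factory=dict)

    def execute(self, cases):
        for case in cases:
            actual = case.run()
            if actual == case.expected:
                self.execution_status[case.case_id] = "PASS"
            else:
                # Actual result disagrees with the defined expected result:
                # complete a problem report and record the failure.
                self.problem_reports.append({
                    "case": case.case_id,
                    "expected": case.expected,
                    "actual": actual,
                })
                self.execution_status[case.case_id] = "FAIL"

# Hypothetical test cases exercising an interface between two components
cases = [
    TestCase("IT-001", "order total passed to billing", lambda: 100, 100),
    TestCase("IT-002", "status code returned to caller", lambda: 500, 200),
]
run = TestRun()
run.execute(cases)
print(run.execution_status)      # {'IT-001': 'PASS', 'IT-002': 'FAIL'}
print(len(run.problem_reports))  # 1
```

In practice the `run` callables would invoke the real interfaces under test, and the problem-report entries would feed the TBU problem-handling procedure described above.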

  • 114066 LG 1.36 Integration Test Design Guidelines

      • The guidelines to be followed during the creation of Integration Test Plans are:

        • The number of new units or tasks to be tested by one Test Plan should not exceed five.
        • To minimize the number of test cases, combine test cases into one if they test the same interface point.
        • Use test cases which already exist.  Portions of available Unit Test Plans or System Test Plans can be used where applicable.
        • When testing few or minor changes to an existing subsystem, structure the Test Plan such that the bulk of it will be regression testing using an existing Integration or System Test plan.  New test cases can then be added to cover the detailed testing of changes.
        • Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together.  Where possible, arrange the test cases in the order they will be performed.
        • Whenever possible, design test plans to run independently of other test plans.
        • Design test cases and test data which reveal errors in the interaction between the software components. (Check the various response codes to calls to external interfaces.)

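The last guideline, checking response codes at external interfaces, can be sketched as a self-contained test. The interface stub, request IDs, and response codes below are illustrative assumptions, used so the example runs without a real external system.

```python
def call_external_interface(stub_responses, request_id):
    """Stand-in for a call across a component boundary.
    In a real Integration Test Plan this would be the actual interface;
    here a stub returns canned response codes so the example is self-contained."""
    return stub_responses.get(request_id, 404)

def test_response_codes():
    stub = {"valid-order": 200, "malformed-order": 400}
    # One test case covering several response codes at the same interface
    # point, as the guideline on combining test cases suggests.
    assert call_external_interface(stub, "valid-order") == 200
    assert call_external_interface(stub, "malformed-order") == 400
    assert call_external_interface(stub, "unknown-order") == 404

test_response_codes()
print("all response-code checks passed")
```

Grouping the response-code checks into one test case keeps the Test Plan small while still exercising each outcome at the interface point.
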
  • 114066 LG 1.35 Documents Required

      • The following documents provide information required to create the Integration Test Plan and are recommended reading before starting the planning phase.

        • System Blueprint
        • High Level Design overview from the developers
        • Detailed Design
        • Entity Relationship Diagrams
        • System Requirements Report
        • Change Requests


  • 114066 LG 1.52 INTRODUCTION

        • At the end of the testing process, test documents must be prepared. Test documentation is the complete suite of artifacts that describe test planning, test design, test execution, test results and conclusions drawn from the testing activity. As testing activities typically consume 30% to 50% of project effort, testing represents a project within a project. Testing activities must therefore be fully documented to support resource allocation, monitoring and control. This page identifies the types of documents you need to set up and run your test program and summarises their content.

  • 114066 LG 1.51 SESSION 4: COLLECT AND RECORD DATA FOR TESTING

        • On completion of this section you will be able to collect and record data from testing the networked IT systems.

        • The recording ensures that the required data was produced. 
        • The recording ensures that the data was correctly collected in line with the documented test scenarios. 
        • The recording ensures that the data are sufficient to meet the purpose of the test. 
        • The recording identifies any problems with the collection of data so that appropriate action can be taken.
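The four checks above can be sketched as a simple recording routine: it stores the collected data, verifies it against the documented test scenario, and flags any collection problems for action. The scenario field names and sample values are hypothetical, chosen only to make the sketch runnable.

```python
def record_test_data(scenario_fields, collected):
    """Record collected test data and check it against the documented
    test scenario, flagging any collection problems.  Field names are
    illustrative, not taken from a real scenario document."""
    record = {"data": collected, "problems": []}
    missing = [f for f in scenario_fields if f not in collected]
    if missing:
        # Data are insufficient for the purpose of the test: flag for action.
        record["problems"].append(f"missing fields: {missing}")
    return record

# Hypothetical scenario: a network throughput test must capture these fields
scenario = ["timestamp", "host", "throughput_mbps"]
ok = record_test_data(
    scenario,
    {"timestamp": "2024-01-01T10:00", "host": "srv1", "throughput_mbps": 94.2},
)
bad = record_test_data(
    scenario,
    {"timestamp": "2024-01-01T10:05", "host": "srv2"},
)
print(ok["problems"])   # []
print(bad["problems"])  # ["missing fields: ['throughput_mbps']"]
```

A record with an empty `problems` list shows the data were produced, correctly collected, and sufficient; a non-empty list identifies a collection problem requiring action.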