Author: Neftaly Malatjie

  • 114066 LG 1.42 Documents Required

      • The following documents provide information required to create the System Test Plan and are recommended reading before starting the planning phase.

        • High Level Design overview
        • Problem Report Analysis Report
        • Database Design Report
        • FRAM
        • Requirements Reports
        • Change Requests
        • Appropriate 3rd Party Interface Specifications
  • 114066 LG 1.41 System Testing Procedure

      • The goal of System Testing is to ensure that the system performs according to the functional requirements specified by the client.

        A system test covers the testing of functions within the system.  System testing is performed once integration testing has been completed.  System Testing procedures consist of:

        • Creating Test Plans (see the sketch below)
        • Creating test data
        • Conducting tests according to the System Test Plan
        • Reporting and reviewing the results of the test

        Features to be tested during System Testing are:

        • Functional Requirements
        • Depending on the project, any regression tests deemed necessary
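
        A System Test Plan entry essentially ties a functional requirement to the steps the tester performs and the results that prove the requirement. The sketch below is a minimal illustration only, assuming Python is available as a scripting tool; the field names (test_id, requirement_id, steps, expected_results) and the example values are hypothetical and not prescribed by this guide.

            from dataclasses import dataclass, field
            from typing import List

            @dataclass
            class SystemTestCase:
                """One entry in a System Test Plan, traced back to a functional requirement."""
                test_id: str                # e.g. "ST-001" (hypothetical numbering scheme)
                requirement_id: str         # the functional requirement this case proves
                purpose: str                # what the test is meant to demonstrate
                steps: List[str] = field(default_factory=list)             # actions the tester performs
                expected_results: List[str] = field(default_factory=list)  # only outcomes visible to the tester
                regression: bool = False    # mark any regression tests deemed necessary for the project

            # Example entry: every statement in the purpose should be provable in expected_results.
            case = SystemTestCase(
                test_id="ST-001",
                requirement_id="FR-12",
                purpose="Verify that a new customer record can be captured and saved.",
                steps=["Open the customer capture screen",
                       "Enter the mandatory fields",
                       "Save the record"],
                expected_results=["A confirmation message is displayed",
                                  "The record appears in the customer list"],
            )

        Keeping the purpose, steps and expected results together in one record makes it easy to check that each statement in the purpose is proven by an expected result before the plan is written up.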


  • 114066 LG 1.40 Conduct Integration Test

      • Integration testing will be performed according to the following:

        • Integration tests will be run according to the Integration Test Plans by the Test Team Leader or Test Team Member.
        • Actual results of the test runs are presented by printing documentation (reports, file dumps) or by demonstration (screen, panel displays).
        • If any of the actual results do not agree with expected results, the person performing the test will complete a Problem Report (PR).
        • After the necessary action has been taken to resolve the problem, the test run will be performed again from the beginning of the test step. The Test Plan may need to be updated, depending on the results of the test.
        • Update the Tracking document at least once a day. As a test is completed, either successfully or unsuccessfully, the tester should update the Tracking document, and the tester’s initials are to be updated each time a different person performs the test. If a test step is completed without any problem reports, the test step is considered “closed”. However, if a problem is raised after running a test step, the tester indicates this in the Tracking document, recording both the number of problem reports raised by the test step and the associated PR numbers (see the tracking document template in the appendix entitled Matrices, Logs and Indices, and the sketch after this list). The Integration Test Team meets frequently to discuss the testing activities and possible conflicts, and to review Problem Reports.
        • The Integration Test Manager meets frequently with the Development Team Leaders to review Problem Reports, negotiate priorities for code fixes, and discuss support issues.
        • When an error is found, do not spend a lot of time trying to debug the problem. Instead, raise a Problem Report providing as much detail as possible so that the person or persons resolving the problem will know what to look for. Whenever possible, dump screens, logs, or tables to files or paper and forward a copy to whomever the problem report is assigned. This will help everyone to get the problem reports answered as efficiently as possible.
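
        As a hedged illustration of the tracking update described above (this is not the template from the appendix), the sketch below assumes a CSV file is used as a stand-in for the Excel tracking document, with hypothetical columns for the test step, tester’s initials, status and PR numbers.

            import csv
            from pathlib import Path

            TRACKING_FILE = Path("integration_tracking.csv")   # hypothetical stand-in for the spreadsheet
            FIELDS = ["test_step", "tester_initials", "status", "pr_count", "pr_numbers"]

            def update_tracking(test_step, tester_initials, passed, pr_numbers):
                """Record the outcome of one test step: closed if no PRs were raised, otherwise open."""
                rows = []
                if TRACKING_FILE.exists():
                    with TRACKING_FILE.open(newline="") as f:
                        rows = [r for r in csv.DictReader(f) if r["test_step"] != test_step]
                rows.append({
                    "test_step": test_step,
                    "tester_initials": tester_initials,   # updated each time a different person runs the step
                    "status": "closed" if passed and not pr_numbers else "open",
                    "pr_count": str(len(pr_numbers)),
                    "pr_numbers": ";".join(pr_numbers),
                })
                with TRACKING_FILE.open("w", newline="") as f:
                    writer = csv.DictWriter(f, fieldnames=FIELDS)
                    writer.writeheader()
                    writer.writerows(rows)

            # Example: test step 3.2 failed and raised two problem reports.
            update_tracking("3.2", "AB", passed=False, pr_numbers=["PR-101", "PR-102"])
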
  • 114066 LG 1.39 Integration Test

      • The procedure for creating the Integration Test Plan is set out in 114066 LG 1.38 Create Integration Test Plan below; once the plans exist, the tests are run as described in 114066 LG 1.40 Conduct Integration Test above.

  • 114066 LG 1.38 Create Integration Test Plan

      • This section provides a guide for creating an Integration Test Plan.  Skeletons for the test plan, test case and results summary are available in Word for Windows.  By using this template and the defined style codes, a table of contents can be generated and then used to create the tracking document.

        •  Identify subsystem interface points:
        • The Design Reports identify subsystem interface points. This should provide a high-level view of which subsystems are changing and what, if any, new subsystems will be created to bring the system in line with requirements.
        • A review of the Detailed Designs is conducted to determine which units (and therefore, which subsystems) are changing.
        • For new subsystems, or major changes to existing subsystems, the interface points must be identified by using the Detailed Designs; these contain IPCs, tables/files accessed and process descriptions which will help the tester to identify critical interface points.
        • If the subsystem in question is not new and will not require major changes, this points to the need for regression testing of existing interface points to confirm that the subsystem functions as it did before any changes were implemented.
        •  Divide the interface points into logical groupings (test plans).  Draw the IPC diagram illustrating the interface points.
        •  Create test cases to test each interface:
        • Enter a purpose for each test case. Identify the conditions being tested.  Ensure that each statement in the purpose is proven in the Expected Results.
        • Using the Detailed Design Report, identify the processes within the subsystems that are the actual interfaces. These could be messages passed between processes or data written by one process and read by another.  List these processes under Interface Components Tested.  If the interface is by file, identify the tables being read, written or updated and list them in the File/Table Interface Points section.
        •  List the steps to be followed in order to accomplish the purpose of the test:
        • List the sub-tests that identify the interfaces being tested in each test case.
        • Below each sub-test heading, list the steps required to accomplish the test.
        • In test cases for interactive functions, describe the actions to be performed by the tester followed by the result expected from the action.
        • For non-interactive tests, list the steps to be performed. This usually involves running a command file, but may also consist of listing the steps required to use an emulator or other test tool.
        • Expected results statements must describe only that which is visible to the tester. Processing which cannot be proven is not to be included.
        •  Create test data where applicable.
        •  Establish the expected results for each test case:
        • The Expected Results section describes the outcome of an event that was triggered by a step in a test plan. For example, it may be expected that after an IPC is sent from one process and successfully received by another, a database change is made.  In this instance, the Expected Results section would describe how the database should look (i.e., the changes to a file/table caused by the IPC).  Once all the test cases in a test plan are defined, update the Interface Points Tested and File/Table Interface Points sections of the test plan introduction page.  It is not necessary to list every software component being used in the test cases, only the specific ones being tested by the test cases (that is, do not repeat software components tested fully by a previous test plan, unless the software component is being used to re-configure the system).
        •  Test Setup Notes:  Identify special instructions for the test case.
        • List any requirements for the test cases in the Test Setup Notes section. For example, it might be stated that the tester will need to back up the data files used in the test case so that they can be restored before running subsequent test cases. Where possible, create scripts to automate the execution of each test case (see the sketch at the end of this section).  The name of this script should be listed in the Notes section of the test case.
        • Develop procedures (scripts) to execute and evaluate each test plan (e.g., produce SQLCI reports to list the contents of tables).
        • Identify command files that will back up or restore the database to the state it was in at the start or completion of each test plan, and list these command file names in the Notes section of the test case.
        •  Create a Test Case Tracking document.
        • After completing all the Integration Test Plans, create a Tracking document (see the appendix entitled Matrices, Logs and Indices) using a spreadsheet such as Microsoft Excel.
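
        The execution and evaluation steps above (back up the data, run the test, dump the affected tables, compare them with the Expected Results) can be scripted. In the target environment this would normally be a command file together with SQLCI reports; the sketch below is an illustration only, using Python with the built-in sqlite3 module as a stand-in, and the file names, table name and expected rows are hypothetical.

            import shutil
            import sqlite3
            from pathlib import Path

            DB_FILE = Path("testdata.db")             # hypothetical test database used by the test case
            BACKUP_FILE = Path("testdata.backup.db")  # restored between test cases (see Test Setup Notes)

            # Hypothetical setup so the sketch is self-contained: the table the test step should change.
            with sqlite3.connect(DB_FILE) as conn:
                conn.execute("CREATE TABLE IF NOT EXISTS customer (id INTEGER, name TEXT, status TEXT)")
                conn.execute("DELETE FROM customer")
                conn.execute("INSERT INTO customer VALUES (1, 'Acme Ltd', 'ACTIVE')")

            def backup_database():
                """Back up the data files so they can be restored for subsequent test cases."""
                shutil.copyfile(DB_FILE, BACKUP_FILE)

            def restore_database():
                shutil.copyfile(BACKUP_FILE, DB_FILE)

            def dump_table(table):
                """Stand-in for an SQLCI report: list the current contents of a table."""
                with sqlite3.connect(DB_FILE) as conn:
                    return conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()

            def evaluate(table, expected_rows):
                """Compare the table contents produced by the test step with the Expected Results."""
                actual = dump_table(table)
                if actual != expected_rows:
                    print("Mismatch - raise a Problem Report and attach this dump:")
                    print("expected:", expected_rows)
                    print("actual:  ", actual)
                    return False
                return True

            # One test case: back up, perform the test actions, evaluate, then restore for the next case.
            backup_database()
            # ... run the command file / send the IPC described by the test step ...
            passed = evaluate("customer", expected_rows=[(1, "Acme Ltd", "ACTIVE")])
            restore_database()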