114066 LG 1.38 Create Integration Test Plan

Email: info@saypro.online Call/WhatsApp: + 27 84 313 7407

SayPro is a Global Solutions Provider working with Individuals, Governments, Corporate Businesses, Municipalities, and International Institutions. SayPro works across various Industries and Sectors, providing a wide range of solutions.

    • This section provides a guide for creating an Integration Test Plan.  Skeletons for the test plan, test case and results summary are available in Word for Windows.  By using this template and the style codes defined, a table of contents can be generated and used to create the tracking document.

      •  Identify subsystem interface points:
      • The Design Reports identify subsystem interface points. They should provide a high-level view of which subsystems are changing and what, if any, new subsystems will be created to bring the system in line with requirements.
      • Review the Detailed Designs to determine which units (and therefore, which subsystems) are changing.
      • For new subsystems, or major changes to existing subsystems, identify the interface points by using the Detailed Designs; these contain IPCs, Tables/Files accessed and process descriptions that will help the tester identify critical interface points.
      • If the subsystem in question is not new and will not require major changes, this points to the need for regression testing of existing interface points to verify that the subsystem functions as it did before any changes were implemented.
      •  Divide the interface points into logical groupings (test plans).  Draw the IPC diagram illustrating the interface points.
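The grouping step above can be sketched in code. The following is a minimal Python sketch, not part of the source guide; the subsystem names, interface mechanisms and the group-by-sender rule are all illustrative assumptions:

```python
# Hypothetical sketch: grouping identified interface points into candidate
# test plans. Subsystem names and mechanisms below are examples only.
from collections import defaultdict

# Each interface point: (sending subsystem, receiving subsystem, mechanism).
interface_points = [
    ("OrderEntry", "Billing", "IPC message"),
    ("OrderEntry", "Inventory", "shared table ORDERS"),
    ("Billing", "Ledger", "IPC message"),
    ("Inventory", "Ledger", "shared file STOCK.DAT"),
]

def group_by_sender(points):
    """Group interface points into candidate test plans by sending subsystem."""
    plans = defaultdict(list)
    for sender, receiver, mechanism in points:
        plans[sender].append((receiver, mechanism))
    return dict(plans)

plans = group_by_sender(interface_points)
for sender, targets in plans.items():
    print(f"Test plan for {sender}: {targets}")
```

Any grouping rule could be substituted (by receiver, by business function, etc.); the point is simply that each resulting group becomes one test plan, and the pairs in each group are the interface points its IPC diagram must show.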
      •  Create test cases to test each interface:
      • Enter a purpose for each test case. Identify the conditions being tested.  Ensure that each statement in the purpose is proven in the Expected Results.
      • Using the Detailed Design Report, identify the processes within the subsystems that are the actual interfaces. These could be messages passed between processes or data written by one process and read by another.  List these processes under Interface Components Tested.  If the interface is by file, identify the tables being read, written or updated and list them in the File/Table Interface Points section.
      •  List the steps to be followed in order to accomplish the purpose of the test:
      • List the sub-test that identifies the interfaces being tested in each test case.
      • Below each sub-test heading, list the steps required to accomplish the test.
      • In test cases for interactive functions, describe the actions to be performed by the tester followed by the result expected from the action.
      • For non-interactive tests, list the steps to be performed. This usually involves running a command file, but may also consist of listing the steps required to use an emulator or other test tool.
      • Expected results statements must describe only that which is visible to the tester. Processing that cannot be proven must not be included.
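The test-case structure described in the steps above can be sketched as a simple record. This is an illustrative Python sketch, not the guide's own template; every field name and example value is an assumption:

```python
# Hypothetical sketch of a test-case record: purpose, interface components,
# file/table points, and steps paired with tester-visible expected results.
from dataclasses import dataclass, field

@dataclass
class Step:
    action: str           # what the tester does
    expected_result: str  # only what is visible to the tester

@dataclass
class TestCase:
    case_id: str
    purpose: str
    interface_components: list = field(default_factory=list)
    file_table_points: list = field(default_factory=list)
    steps: list = field(default_factory=list)

tc = TestCase(
    case_id="IT-01",  # example identifier, not from the source
    purpose="Verify the order IPC updates the ORDERS table.",
    interface_components=["OrderEntry.send_order", "Billing.receive_order"],
    file_table_points=["ORDERS (written)"],
    steps=[
        Step("Submit a new order from the OrderEntry screen.",
             "Confirmation message is displayed."),
        Step("List the ORDERS table with a query tool.",
             "A new row exists for the submitted order."),
    ],
)
print(tc.case_id, len(tc.steps))
```

Note that each `Step.expected_result` states only what the tester can observe, matching the rule above that unprovable processing is excluded.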
      •  Create test data where applicable.
      •  Establish the expected results for each test case:
      • The Expected Results section describes the outcome of an event that was triggered by a step in a test plan. For example, it may be expected that after an IPC is sent from one process and successfully received by another, a database change is made.  In this instance, the Expected Results section would describe how the database should look (i.e., the changes to a file/table caused by the IPC).  Once all the test cases in a test plan are defined, update the Interface Points Tested and File/Table Interface Points sections of the test plan introduction page.  It is not necessary to list every software component being used in the test cases, only the specific ones being tested by the test cases.  (That is, do not repeat software components tested fully by a previous test plan, unless the software component is being used for re-configuration of the system.)
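The IPC-causes-a-database-change example above can be illustrated in code. This is a hypothetical sketch using Python's built-in sqlite3 as a stand-in for the real DBMS; the table, process and values are invented for illustration:

```python
# Hypothetical sketch: an expected result is proven by inspecting the table
# that the receiving process changes. sqlite3 stands in for the real DBMS.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, status TEXT)")

def receive_order_ipc(order_id):
    """Simulated receiving process: the IPC's visible side effect is a row insert."""
    conn.execute("INSERT INTO orders VALUES (?, ?)", (order_id, "RECEIVED"))

# Step: the sending process delivers an order IPC (simulated here).
receive_order_ipc("A-100")

# Expected result: the orders table now holds one row with status RECEIVED.
rows = conn.execute("SELECT order_id, status FROM orders").fetchall()
print(rows)  # [('A-100', 'RECEIVED')]
```

The query at the end plays the role of the "how the database should look" description: the expected result is stated in terms of observable table contents, not internal processing.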
      •  Test Setup Notes:  Identify special instructions for the test case.
      • List any requirements for the test cases in the Test Setup Notes section. For example, it might be stated that the tester must back up the data files used in the test case so that they may be restored for running subsequent test cases. Where possible, create scripts to automate the execution of each test case.  The name of this script should be listed in the Notes section of the test case.
      • Develop procedures to execute (scripts) and evaluate each test plan (i.e., produce SQLCI reports to list the contents of tables).
      • Identify command files that will back up or restore the database to the state it was in at the start or completion of each test plan, and list these command file names in the Notes section of the test case.
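The backup/restore discipline above can be sketched as a small script. This is an illustrative Python sketch under assumed file layouts; a real environment would typically use the platform's own command files as the text describes:

```python
# Hypothetical sketch: back up the data files before a test case runs and
# restore them afterwards so subsequent test cases start from a known state.
import shutil
import tempfile
from pathlib import Path

def backup(data_dir, backup_dir):
    """Copy the data directory aside before the test case runs."""
    shutil.copytree(data_dir, backup_dir)

def restore(data_dir, backup_dir):
    """Discard test output and restore the pre-test data."""
    shutil.rmtree(data_dir)
    shutil.copytree(backup_dir, data_dir)

# Demonstration with throwaway directories and an invented data file.
root = Path(tempfile.mkdtemp())
data, bak = root / "data", root / "backup"
data.mkdir()
(data / "orders.dat").write_text("baseline")

backup(data, bak)
(data / "orders.dat").write_text("modified by the test run")
restore(data, bak)
print((data / "orders.dat").read_text())  # baseline
```

Per the guidance above, the name of whichever backup/restore script is actually used should be recorded in the Notes section of the test case.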
      •  Create a Test Case Tracking document.
      • After completing all the Integration Test Plans, create a Tracking document (see the appendix entitled Matrices, Logs and Indices) using a spreadsheet such as Microsoft Excel.
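One way to produce such a tracking document programmatically is to emit a CSV file that a spreadsheet such as Microsoft Excel can open. This Python sketch is an assumption about the matrix layout, since the appendix defining it is not reproduced here:

```python
# Hypothetical sketch: generating a test-case tracking matrix as CSV text.
# Column headings and test-case entries are illustrative assumptions.
import csv
import io

test_cases = [
    ("IT-01", "Order IPC updates ORDERS table", "Not run"),
    ("IT-02", "Billing-to-Ledger message", "Not run"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Test Case", "Description", "Status", "Date", "Tester"])
for case_id, description, status in test_cases:
    writer.writerow([case_id, description, status, "", ""])

tracking_csv = buf.getvalue()
print(tracking_csv)
```

Writing `tracking_csv` to a `.csv` file yields a sheet with one row per test case, which can then be maintained in the spreadsheet as results come in.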

       

  • Neftaly Malatjie | CEO | SayPro
  • Email: info@saypro.online
  • Call: + 27 84 313 7407
  • Website: www.saypro.online

