Author: Neftaly Malatjie

114066 LG 1.47 PERFORMANCE TESTING PROCEDURE

A performance test is planned and executed for all components for which performance requirements and targets have been agreed upon with the client. The complexity of the Performance Test is a function of both the number of test cases required and the difficulty of setting up and executing each test case.

114066 LG 1.46 Conduct System Test
- Verify that the System Test Environment has been created and that it is functional.
- Create any test data necessary for executing the system test plan scenarios.
- Execute the system test plan scenarios as assigned to each test team member.
- Create a problem report for deviations from the expected results documented in the system test plan scenario.
- Interact with support team to help resolve problem reports.
- Update the tracking report to reflect test step execution and completion.
- Depending on the project, interface with the client's testing prime to communicate the system test status and issues.
- Communicate the system test status and issues to management.
- Ensure execution of the system test plan as per acceptance criteria.
- Upon system test completion, refine system test plans for final publication.
- Produce Post Project System Test Reports.
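The problem-report and tracking-report steps above can be sketched as a minimal data model; all class and field names here are illustrative assumptions, not prescribed by this guide:

```python
from dataclasses import dataclass, field

@dataclass
class ProblemReport:
    # Raised when actual results deviate from the expected results
    # documented in a system test plan scenario (names are hypothetical).
    scenario: str
    step: int
    expected: str
    actual: str
    status: str = "open"  # open -> resolved

@dataclass
class TrackingReport:
    # Reflects test step execution and completion per scenario.
    steps_passed: dict = field(default_factory=dict)
    problems: list = field(default_factory=list)

    def record_step(self, scenario, step, expected, actual):
        if expected == actual:
            self.steps_passed.setdefault(scenario, []).append(step)
        else:
            self.problems.append(ProblemReport(scenario, step, expected, actual))

    def open_problems(self):
        return [p for p in self.problems if p.status == "open"]

tracker = TrackingReport()
tracker.record_step("Login scenario", 1, "user logged in", "user logged in")
tracker.record_step("Login scenario", 2, "menu displayed", "error 500")
print(len(tracker.open_problems()))  # the deviation is logged as a problem report
```

Deviations recorded this way feed directly into the status communicated to the client testing prime and to management.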
114066 LG 1.45 Create System Test Plan
- Obtain a copy of the Requirements Report, FRAM document, Database Design Report, and Detailed Design Report.
- Determine a table of contents for the system test plan and assign the individual test plan scenarios to testers.
- Inform QA and publishing of delivery dates for QA review and publishing. Inform QA of any special testing strategies which will be adopted.
- Review the above-mentioned documents for the test plan scenarios to be written.
- Schedule a testing overview with the analysis and/or development teams to gather the necessary information for writing the test plan scenario.
- Determine the test cases for the test plan scenario.
- Allocate the FRAM to the appropriate test case.
- Write the test plan scenario using the system test plan template.
- Submit a copy of the test plan scenario to the appropriate parties (a System Test peer and the Analysis and Development teams) for review. Depending on the project, the client may participate in a system testing role and may also review the test plan scenario prior to publication.
- Submit a copy of the allocated FRAM to test plan/test case to the FRAM officer. Obtain an updated FRAM document allocated to test plan/test case.
- Submit a copy of the test plan scenario which has been reviewed in a previous step to QA. Along with the test plan scenario, submit a copy of the FRAM which has been allocated to test plan/test case level.
- Upon QA review, make any updates to the test plan scenario which are deemed appropriate.
- Resubmit the test plan scenario to QA for final review.
- Submit the test plan scenario to publishing.
- Create the System Test Tracking Report once all test plan scenarios have been reviewed by QA.
- Submit the System Test Tracking Report to Publishing.
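The FRAM allocation steps above amount to maintaining a requirement-to-test-case mapping. A small sketch follows; the requirement and test case identifiers are hypothetical:

```python
# Allocate FRAM requirements to test cases, then invert the mapping to
# produce the allocation submitted to the FRAM officer. IDs are hypothetical.
fram_requirements = ["FR-001", "FR-002", "FR-003", "FR-004"]

allocation = {
    "TC-01": ["FR-001", "FR-002"],  # one test case may cover several requirements
    "TC-02": ["FR-003"],
    "TC-03": ["FR-004"],
}

# Invert for the FRAM document: requirement -> covering test case(s).
fram_to_test_case = {}
for test_case, reqs in allocation.items():
    for req in reqs:
        fram_to_test_case.setdefault(req, []).append(test_case)

for req in fram_requirements:
    print(req, "->", ", ".join(fram_to_test_case.get(req, ["UNALLOCATED"])))
```

Any requirement printed as UNALLOCATED would need a test case before the FRAM document is resubmitted.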
114066 LG 1.44 System Testing Steps
System Tests shall be run by the System Testing Team. A skeleton for the system test plan is available in Word for Windows. The procedure for system testing is as follows:
- Review all requirements and design documents.
- Attend system reviews presented by Development and Analysis Team members.
- Create and maintain a detailed System Test Project Plan.
- Divide the FRAM Requirements into logical groupings or scenarios. These scenarios should reflect the business or operational approach to the system.
- Define any necessary regression tests.
- Create a System Test Plan.
- Where possible, create scripts to automate the execution of a test case.
- Ensure the System Test Plan is reviewed by appropriate parties (Development and Quality Assurance).
- Verify that the System Test Environment has been created.
- Conduct the test as specified in the test cases.
- Identify any problems that are encountered or where the actual results do not agree with the defined expected results and complete a Problem Report.
- Record in the Tracking document the steps executed, relevant PRs, and test cases completed.
- Once all problems have been resolved, re-run the necessary tests.
- Update test plans after the testing is complete.
- Produce Post Project System Testing Reports.
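Where test case execution is automated by scripts, as the steps above suggest, a minimal runner compares each actual result against the expected result from the test plan. The test-case functions and expected values below are illustrative assumptions only:

```python
# Minimal automated runner: each test case is a function returning an
# actual result, compared against the expected result from the test plan.
def tc_create_record():
    return "record created"   # stand-in for driving the real system

def tc_delete_record():
    return "record deleted"

test_cases = [
    ("TC-01 create record", tc_create_record, "record created"),
    ("TC-02 delete record", tc_delete_record, "record deleted"),
]

failures = []
for name, run, expected in test_cases:
    actual = run()
    status = "PASS" if actual == expected else "FAIL"
    if status == "FAIL":
        failures.append(name)  # candidate for a Problem Report
    print(name, status)

print("failures:", len(failures))
```

Each FAIL entry becomes a Problem Report, and the pass/fail record feeds the Tracking document.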
114066 LG 1.43 System Test Design Guidelines
The following are recommended guidelines when designing system tests:
- Design test cases to ensure that all requirements identified in the Functional Requirements Analysis Matrix document are tested by one or more test cases.
- In order to minimize the number of test cases required, design test cases to establish the presence of several related requirements.
- Each logical test case should test related functionality.
- Use test cases that already exist wherever possible.
- Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together.
- Where possible, arrange the test cases in the order the function would be performed from a business perspective.
- Design test plans to run independently of other test plans.
- Identify a procedure to set up the database as required at the start of the test.
- Design test cases and test data that reveal errors in software.
- Design test data that ensures all conditions of the data edits are covered.
- Use live or representative data as much as possible in order to provide realistic functional tests. Any comments about setting up the test data are to be documented.
- Data for most reports should come from the data prepared for testing the interactive processes. It is acceptable to have the reports contain existing data from the database.
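The guideline on covering all conditions of a data edit is often met with boundary-value test data. A small sketch, assuming a hypothetical field edit that accepts ages 18 through 65:

```python
def age_edit(age):
    # Hypothetical data edit: accept ages 18 through 65 inclusive.
    return 18 <= age <= 65

# Boundary-value test data: a value on each side of every boundary,
# plus a nominal mid-range value.
test_data = [17, 18, 19, 42, 64, 65, 66]
results = {age: age_edit(age) for age in test_data}
print(results)
```

Choosing values on both sides of each boundary is what reveals the classic off-by-one errors in edit logic.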