GB/T 39788-2021: System and software engineering -- Performance testing method

This standard specifies the test process, the test requirement model, and the test types for the performance of systems and software. It applies to the analysis, design, and execution of performance testing for systems and software.
GB/T 39788-2021
GB
NATIONAL STANDARD OF THE
PEOPLE'S REPUBLIC OF CHINA
ICS 35.080
L 77
System and software engineering -
Performance testing method
ISSUED ON: MARCH 09, 2021
IMPLEMENTED ON: OCTOBER 01, 2021
Issued by: State Administration for Market Regulation;
Standardization Administration of PRC.
Table of Contents
Foreword
1 Scope
2 Normative references
3 Terms and definitions
4 Overview of performance testing
5 Performance test process
6 Performance test requirement model
7 Types of performance tests
Appendix A (Informative) Quality measures of performance efficiency
Appendix B (Informative) Mobile application performance test case
Appendix C (Informative) Application case of performance test of a large-scale information system
Appendix D (Informative) Test case of cloud application performance
Appendix E (Informative) Test case of embedded software performance

System and software engineering - Performance testing method
1 Scope
This standard specifies the test process, the test requirement model, and the test types for the performance of systems and software.
This standard applies to the analysis, design, and execution of performance testing for systems and software.
2 Normative references
The following documents are indispensable to the application of this document. For dated references, only the edition with the date indicated applies to this document; for undated references, the latest edition (including all amendments) applies to this document.
GB/T 25000.10-2016 Systems and software engineering - Systems and software quality requirements and evaluation (SQuaRE) - Part 10: System and software quality models
GB/T 25000.23-2019 Systems and software engineering - Systems and software quality requirements and evaluation (SQuaRE) - Part 23: Measurement of system and software product quality
GB/T 38634.1-2020 Systems and software engineering - Software testing - Part 1: Concepts and definitions
3 Terms and definitions
The terms and definitions defined in GB/T 38634.1-2020, as well as the following, apply to this document.
3.1
Load testing
Testing used to evaluate the performance of the system and software under an expected varying load, where the load usually ranges between the low, typical, and peak usage expectations.
Note: Load testing is a type of performance efficiency test.
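As an illustration of the varying load described above, here is a minimal Python sketch (all names and numbers are hypothetical, not from the standard) that steps a simulated load through low, typical, and peak levels and records the response time of a stubbed operation at each level:

```python
import time
import random
from statistics import mean

def business_operation() -> float:
    """Stub for the operation under test; returns elapsed seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.01, 0.05))  # placeholder for real work
    return time.perf_counter() - start

# Hypothetical load levels between the low, typical, and peak expectations.
LOAD_LEVELS = {"low": 10, "typical": 50, "peak": 100}

for level, requests in LOAD_LEVELS.items():
    samples = [business_operation() for _ in range(requests)]
    print(f"{level:8s} load: {requests:4d} requests, "
          f"mean response time {mean(samples) * 1000:.1f} ms")
```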
4 Overview of performance testing
Performance testing is used to evaluate the extent to which the system and software under test complete their designated functions within given time and other resource constraints; it is also called performance efficiency testing. For the performance efficiency quality characteristics and sub-characteristics of systems and software, see 4.3.2.2 of GB/T 25000.10-2016. For the performance efficiency quality measures of systems and software, see Appendix A; the descriptions and measurement functions of the quality measures are given in 8.3 of GB/T 25000.23-2019. When in use, the quality measures shall be tailored according to the actual needs of the system and software.
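As a tailoring illustration: one common time-behaviour measure in the GB/T 25000.23-2019 family is a mean response time, essentially the arithmetic mean of observed response times. A minimal sketch, simplified from that idea (the function name and sample values are hypothetical):

```python
from statistics import mean

def mean_response_time(response_times_s: list[float]) -> float:
    """Mean response time X = (1/n) * sum(t_i), with t_i in seconds.

    A simplified reading of a time-behaviour measure; tailor the
    sampling and thresholds to the actual needs of the system.
    """
    return mean(response_times_s)

print(mean_response_time([0.12, 0.15, 0.11, 0.20]))  # -> 0.145
```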
Refer to Appendix B for a mobile application performance test case, Appendix C for a large-scale information system performance test case, Appendix D for a cloud application performance test case, and Appendix E for an embedded software performance test case.

5 Performance test process
5.1 Overview
The performance test process includes four processes: performance test requirement analysis, performance test design and implementation, performance test execution, and performance test summary.
5.2 Performance test requirement analysis
The performance test requirement analysis includes the following activities:
a) Determine the admission criteria of the performance test. Execute it after the system architecture is determined or the smoke test is passed; the earlier testing is involved, the better.
b) Determine the performance requirements of the system and software to be tested. Performance requirements may come from requirements specified in documents such as contracts and requirement specifications, or from implicit requirements agreed upon for the business, data, expected users, and system behavior. The performance requirements should be determined according to the performance test requirement model.
- The system is unavailable;
- The server is down or the necessary services are stopped for uncertain reasons;
- The application has a blocking or serious defect while running;
- The required dependencies are not available.
2) Recovery criteria may include:
- The system and/or server are available, turned on, and running;
- Blocking and/or critical issues have been resolved;
- The functions of the application have been restored;
- Test the degree of recovery when the data processing cycle is not completed.
5.4 Execution of performance test
The execution process of the performance test includes the following activities:
a) Perform a readiness check before execution, to evaluate the environment and resources required for performance testing.
b) Execute test scripts manually or using test tools; monitor performance indicators during execution; record test results.
c) Performance testing usually needs to examine the comprehensive performance of the system and software under test over a period of time; it takes the average, maximum, or minimum value as the test result, as needed.
d) If the performance test terminates abnormally or does not meet the requirements or expectations, fill in the performance problem report form. The problem report shall include the source of the problem, the scenario configuration, the problem description, the problem level, etc.
e) Determine whether the executed test case passes. If the test fails, analyze the specific situation to determine whether it is caused by a performance bottleneck of the software itself or by the test environment.
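A minimal, tool-agnostic sketch of activities b) and c): it runs a stubbed test script under several concurrent virtual users, records each response time, and reports the average, maximum, and minimum as the test result. All names and numbers are hypothetical, not prescribed by the standard:

```python
import time
import random
import threading

results: list[float] = []
lock = threading.Lock()

def test_script(iterations: int) -> None:
    """Stubbed test script executed by one virtual user."""
    for _ in range(iterations):
        start = time.perf_counter()
        time.sleep(random.uniform(0.01, 0.05))  # placeholder for the real request
        elapsed = time.perf_counter() - start
        with lock:
            results.append(elapsed)  # record the test result

# Hypothetical scenario: 20 virtual users, 10 iterations each.
threads = [threading.Thread(target=test_script, args=(10,)) for _ in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"samples={len(results)}, "
      f"avg={sum(results) / len(results):.3f}s, "
      f"max={max(results):.3f}s, min={min(results):.3f}s")
```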
Figure 1 shows the execution framework of the performance test, which consists of five components: input, operating environment, system and software under test, ...

5.5 Summary of performance test

a) Organize the performance test results. The results of the performance test should take into account the comprehensive results under a variety of environmental factors; use mathematical and statistical methods, such as standard deviations or statistical models agreed upon with users, for comprehensive data analysis (see the sketch after this list).
Note 1: When analyzing data, abnormal data, such as data captured while the system is starting up or shutting down, should be deleted.
Note 2: Considerations for the response time, throughput rate, and resource utilization rate include the average, minimum, maximum, or standard deviation.
Note 3: For the number of concurrent requests, the largest number of concurrent requests should be analyzed.
b) Compile a software performance test report, which should include: test result analysis, and evaluation of and suggestions on software performance.
c) Compile performance problem reports, based on test records and performance problem report sheets. The performance problem report shall include the source of the problem, the scenario configuration, the problem description, the problem level, etc.
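A small sketch of the data analysis in a) and Note 1: it drops hypothetical warm-up and shutdown samples before computing the average and standard deviation (the sample values and trim counts are invented for illustration):

```python
from statistics import mean, stdev

# Hypothetical response-time samples (seconds); the first and last few
# were captured while the system was starting up or shutting down.
raw = [0.90, 0.40, 0.12, 0.14, 0.11, 0.13, 0.15, 0.12, 0.35, 0.80]

WARMUP, COOLDOWN = 2, 2          # assumed number of abnormal samples
clean = raw[WARMUP:-COOLDOWN]    # delete abnormal data (Note 1)

print(f"n={len(clean)}, mean={mean(clean):.3f}s, "
      f"min={min(clean):.3f}s, max={max(clean):.3f}s, "
      f"stdev={stdev(clean):.3f}s")
```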
6 Performance test requirement model
6.1 Overview
The performance test requirement model shall consider the following factors: environment, data, business processes, user distribution, request timing distribution, network status, etc.
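These factors can be read as the fields of a requirement model. A hypothetical Python sketch (the class, field names, and defaults are illustrative assumptions, not defined by the standard):

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceTestRequirementModel:
    """Hypothetical container for the factors named in 6.1."""
    environment: dict = field(default_factory=dict)          # servers, OS, middleware
    data: dict = field(default_factory=dict)                 # volumes, sources
    business_processes: list = field(default_factory=list)   # operations under test
    user_distribution: dict = field(default_factory=dict)    # role -> share of users
    request_timing: str = "poisson"                          # assumed arrival pattern
    network_status: str = "LAN"                              # assumed conditions

model = PerformanceTestRequirementModel(
    business_processes=["login", "publish_message"],
    user_distribution={"ordinary_user": 0.95, "administrator": 0.05},
)
print(model)
```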
6.2 Environmental requirements
For different quality requirements, the impact of the test environment on performance testing shall be considered. It is recommended to use the actual production environment of the system or software as the performance testing environment. When planning and designing the performance test environment, the following factors shall be considered:
a) Stability: the results of multiple rounds of testing under the same conditions shall be consistent, or within an acceptable error range;
b) Independence: to avoid distortion of test results, the test environment shall be kept isolated from other systems or software in use;
7.1.2 Export test coverage items
For load testing, the load volume requirement of each tested business is a test coverage item.
7.1.3 Export test case
The load test case is exported according to the following steps:
a) Determine the prerequisites:
1) Determine the pre-business conditions of the business to be tested, according to the actual situation of the business scenario;
2) Determine the combination of test cases that need to be run at the same time.
b) Design input data:
1) Determine the input data required for each operation;
2) Determine the source of the input data, such as historical data or data from similar systems.
c) Select user operations:
1) Determine user operations based on user usage scenarios;
2) Determine the number of users in normal/peak time;
3) Determine user activity trends;
4) Determine the think time.
d) Determine the expected result:
1) Determine the expected output of each business;
2) When applicable, determine the monitoring indicators of the system (such as response time, number of concurrent users, resource utilization, etc.);
3) Determine the pass/fail criteria for the load test; for example, when the response time or resource occupancy rate is greater than a certain threshold, the load test is regarded as failed (see the sketch below).
Table 1 shows examples of test coverage items and test cases for load testing in a certain scenario.
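A sketch of the pass/fail criterion in d) 3); the threshold values are hypothetical:

```python
# Hypothetical pass/fail thresholds for the load test.
MAX_RESPONSE_TIME_S = 3.0
MAX_CPU_OCCUPANCY = 0.85

def load_test_passed(response_time_s: float, cpu_occupancy: float) -> bool:
    """Fail when response time or resource occupancy exceeds its threshold."""
    return (response_time_s <= MAX_RESPONSE_TIME_S
            and cpu_occupancy <= MAX_CPU_OCCUPANCY)

print(load_test_passed(2.1, 0.70))  # True  -> pass
print(load_test_passed(4.5, 0.70))  # False -> fail
```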
b) Design input data:
1) Determine the input data required for each operation;
2) Determine the source of the input data, such as historical data or data from similar systems;
3) When designing input data, the following shall usually be considered:
- Provide an amount of information to be processed whose maximum load exceeds the expected load;
- Saturation testing of the data transmission capacity, which requires more data transmission than the designed capacity: writing and reading of memory, data transmission of external equipment, other subsystems and internal interfaces, etc.;
- The capacity of storage areas (such as buffer areas, table areas, databases) to exceed their rated size.
c) Select user operations:
1) Determine user operations based on user usage scenarios;
2) Determine the number of users in normal/peak time, usually using incremental load loading and peak-valley loading (high-and-low loading with sharp changes), as in the sketch below;
3) Determine user activity trends;
4) Determine the think time.
d) Determine the expected result:
1) Determine the expected output of each business;
2) When applicable, determine the monitoring indicators of the system (such as response time, number of concurrent users, resource utilization, etc.);
3) Determine the performance of the system and software under extreme conditions (when the expected peak value is exceeded or the available resources are less than the minimum requirements).
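A sketch of the incremental load loading mentioned in c) 2), stepping the number of simulated users upward until a failure threshold is crossed, in the spirit of the Table 2 example below. The response-time model and the threshold are invented for illustration:

```python
import random

def observed_response_time(users: int) -> float:
    """Stub: response time grows nonlinearly with load (hypothetical model)."""
    return 0.2 + (users / 100.0) ** 3 + random.uniform(0.0, 0.05)

LIMIT_S = 3.0  # assumed threshold beyond which the system counts as abnormal

for users in range(25, 176, 25):  # incremental load loading
    rt = observed_response_time(users)
    status = "abnormal" if rt > LIMIT_S else "ok"
    print(f"{users:3d} users -> {rt:5.2f} s ({status})")
    if status == "abnormal":
        break
```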
Table 2 shows an example of stress testing in a certain business scenario. The load is loaded incrementally. When the load (number of users) reaches 100, the system response time increases too fast but remains within the operating range; when the load (number of users) reaches 150, the system becomes abnormal.

7.3.3 Export test case
The peak test case is exported according to the following steps:
a) Determine the prerequisites:
1) Determine the pre-business conditions of the business to be tested, according to the actual situation of the business scenario;
2) Determine the combination of test cases that need to be run at the same time.
b) Design input data:
1) Determine the input data required for each operation;
2) Determine the source of the input data, such as historical data or data from similar systems.
c) Select user operations:
1) Determine user operations based on user usage scenarios;
2) Determine the user activity trend, including the initial number of users, the peak volume, the peak duration, and the load-drop step length;
3) Determine the think time.
d) Determine the expected result:
1) When applicable, determine the monitoring indicators of the system (such as response time, number of concurrent users, resource utilization, etc.);
2) Determine the expected output of the normal operation of each business;
3) Determine the expected output of the degraded operation of each business at the peak of the currently planned test;
4) Determine the expected output of each business when resuming operation after the peak of the currently planned test.
Table 3 shows examples of test coverage items and test cases for peak tests in a certain scenario.
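A sketch of the user activity trend in c) 2): the initial number of users, peak volume, peak duration, and load-drop step length are all hypothetical values:

```python
# Hypothetical peak-test load profile parameters (see c) 2)).
INITIAL_USERS = 20
PEAK_USERS = 200
PEAK_DURATION_STEPS = 3
DROP_STEP = 60  # load-drop step length

def peak_profile() -> list[int]:
    """Return the number of users at each step of the peak scenario."""
    profile = [INITIAL_USERS, PEAK_USERS // 2]        # ramp up
    profile += [PEAK_USERS] * PEAK_DURATION_STEPS     # hold the peak
    users = PEAK_USERS
    while users > INITIAL_USERS:                      # step the load back down
        users = max(INITIAL_USERS, users - DROP_STEP)
        profile.append(users)
    return profile

print(peak_profile())  # -> [20, 100, 200, 200, 200, 140, 80, 20]
```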
chapters; or it may be conducted in a separate benchmark test scenario.
The construction process of the scalability test model includes:
a) Determine the businesses that need scalability testing;
b) Determine the user role distribution of each business under test;
c) Determine the expansion requirements of each business under test;
d) Determine the method for generating scalability test cases.
7.4.2 Export test coverage items
For scalability testing, the scalability test requirement of each tested business is a test coverage item.
7.4.3 Export test case
The scalability test case is exported according to the following steps:
a) Determine the prerequisites:
1) Determine the pre-business conditions of the scalability test business, according to the actual situation of the business scenario;
2) Determine the environmental conditions for the benchmark test and the scalability test, according to the scalability test requirements, such as the expansion of the number of servers of the tested system, the expansion of server memory, etc.;
3) Determine the combination of test cases whose performance expansion needs to be compared.
b) Design input data:
1) Determine the input data required for each operation of the scalability test; it should be ensured that the input data of the benchmark test is consistent with that of the scalability test;
2) Determine the source of the input data of the scalability test, such as historical data or data from similar systems; it should be the same data source as used in the benchmark test, expanded on the basis of the benchmark test data;
3) The following needs to be considered when designing the input data of the scalability test:
- Business data expansion contents, which is compared with
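A sketch of the benchmark-versus-scaled comparison that these steps set up: throughput is measured in both environments (here stubbed with invented numbers) and a scaling-efficiency ratio is reported. All figures are hypothetical:

```python
# Hypothetical throughput results (transactions/second) for the same
# business and the same input data under the two environments.
benchmark = {"servers": 2, "throughput_tps": 480.0}
scaled = {"servers": 4, "throughput_tps": 860.0}

speedup = scaled["throughput_tps"] / benchmark["throughput_tps"]
resource_ratio = scaled["servers"] / benchmark["servers"]
efficiency = speedup / resource_ratio  # 1.0 would be perfectly linear scaling

print(f"speedup={speedup:.2f}x over {resource_ratio:.0f}x servers, "
      f"scaling efficiency={efficiency:.0%}")
```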
The construction process of the endurance test model includes:
a) Determine the user scale that meets the requirements of the performance indicators (time characteristics, resource characteristics, etc.);
b) Build a mixed business model of the software under test;
c) Determine the test execution time;
d) Determine other elements in the scenario (think time, aggregation strategy, etc.).
7.5.2 Export test coverage items
For endurance testing, the robustness requirement of the mixed business model of the software under test is a test coverage item.
7.5.3 Export test case
The endurance test case is exported according to the following steps:
a) Clarify the user scale:
1) Select the maximum number of users that meets the requirements of the software performance indicators as the user scale for the endurance test;
2) Clarify the group distribution, behavior trends, and interaction patterns of business-related users.
b) Build a business model:
1) Select business modules with high performance criticality to form multiple groups of mixed business models;
2) Determine the business processing ratio of each group of mixed business models, according to the actual situation of the business scenario;
3) Determine the execution order and preconditions of the businesses in each group of mixed business models.
c) Determine the execution time:
The execution time should be estimated based on the operating conditions of the software production environment, or based on the evaluation of software scalability.
Note: Usually 24 h, 3 × 24 h, or 7 × 24 h is chosen for execution.
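A sketch of a long-duration run per b) and c): it executes a stubbed mixed-business step repeatedly, in the configured processing ratio, until the chosen execution time elapses. The duration is shortened to seconds so the sketch is runnable; a real endurance run would use 24 h or more, and the business mix is invented:

```python
import time
import random

DURATION_S = 5.0  # stand-in for 24 h, 3 x 24 h, or 7 x 24 h

# Hypothetical mixed business model: business -> processing ratio.
MIX = {"login": 0.2, "publish_message": 0.7, "audit_message": 0.1}

def run_business(name: str) -> None:
    time.sleep(random.uniform(0.001, 0.005))  # placeholder for the real step

start = time.monotonic()
counts = {name: 0 for name in MIX}
while time.monotonic() - start < DURATION_S:
    name = random.choices(list(MIX), weights=list(MIX.values()))[0]
    run_business(name)
    counts[name] += 1  # observed business processing counts

print(counts)
```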
b) When the amount of data that needs to be stored and read is large, focus on indicators such as throughput and disk I/O.
7.6.3 Export test case
The volume test case is exported according to the following steps:
a) Determine the prerequisites:
Determine the pre-business conditions of the business to be tested, according to the actual situation of the business scenario.
b) Design input data:
Determine the source of the input data, such as historical data or data from similar systems.
c) Monitor the system:
1) Load the large-capacity data;
2) Determine user operations based on user usage scenarios;
3) Determine the number of users in normal/peak time;
4) Monitor indicators such as CPU, memory, disk, response time, and transaction success rate.
d) Determine the expected result:
1) Determine the expected output of each business;
2) When applicable, determine the monitoring indicators of the system (such as response time, number of concurrent users, resource utilization, etc.);
3) Determine the pass/fail criteria for the volume test; for example, as soon as a certain resource reaches its maximum usage state or an indicator exceeds the acceptable threshold, the volume test is deemed failed.
Table 6 shows examples of test coverage items and test cases for a volume test in a certain scenario.
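A sketch of the monitoring in c) 4) and the pass/fail rule in d) 3), assuming the third-party psutil package is available; the thresholds and sampling window are hypothetical:

```python
import psutil  # third-party package; assumed available

# Hypothetical acceptance thresholds for the volume test (see d) 3)).
MAX_CPU_PERCENT = 90.0
MAX_MEM_PERCENT = 85.0
MAX_DISK_PERCENT = 95.0

def sample_ok() -> bool:
    """One monitoring sample of CPU/memory/disk; fails if any threshold is hit."""
    cpu = psutil.cpu_percent(interval=1.0)
    mem = psutil.virtual_memory().percent
    disk = psutil.disk_usage("/").percent
    print(f"cpu={cpu:.0f}% mem={mem:.0f}% disk={disk:.0f}%")
    return (cpu <= MAX_CPU_PERCENT and mem <= MAX_MEM_PERCENT
            and disk <= MAX_DISK_PERCENT)

# Monitor for a short, hypothetical window while the large-capacity data loads.
passed = all(sample_ok() for _ in range(5))
print("volume test", "passed" if passed else "failed")
```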
Appendix B
(Informative)
Mobile application performance test case
B.1 System description
The system and software to be tested are mobile application software whose main function is to send messages online. It is divided into two parts: the mobile application terminal, whose operating environment is the Android platform (the Android version must be greater than 5.0); and the server-side application, which is responsible for message storage and access.
The system includes the following two roles:
a) Ordinary users;
b) System administrator.
The main functions are as follows:
a) System login: all users can perform this operation;
b) Publish messages: ordinary users can create messages and save or publish them;
c) Audit messages: system administrators can audit messages posted by users; they can pass the audit or cancel the release.
B.2 Performance requirements
The client performance requirements are as follows:
a) In the idle state, the maximum memory consumption, when the software is running, does not exceed 200 MB; meanwhile the memory shall be
cleaned up when the software exits;
b) There is no significant difference in standby power consumption, before and after installing the target application software;
c) The flow value of the application, which runs continuously for 2 hours in the background, does not exceed 20 MB. If it is greater than 20 MB, a
prompt shall be given.
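A sketch that encodes these three client-side requirements as checks; the thresholds come from the text above, while the function name and the measurement inputs are hypothetical stubs:

```python
# Client-side performance requirements from B.2 (thresholds from the text).
MAX_IDLE_MEMORY_MB = 200
MAX_BACKGROUND_TRAFFIC_MB_2H = 20

def check_client(idle_memory_mb: float,
                 standby_power_diff_significant: bool,
                 background_traffic_mb_2h: float) -> list[str]:
    """Return the list of violated requirements (empty means all pass)."""
    violations = []
    if idle_memory_mb > MAX_IDLE_MEMORY_MB:
        violations.append("a) idle memory exceeds 200 MB")
    if standby_power_diff_significant:
        violations.append("b) standby power consumption differs significantly")
    if background_traffic_mb_2h > MAX_BACKGROUND_TRAFFIC_MB_2H:
        violations.append("c) 2 h background traffic exceeds 20 MB (prompt required)")
    return violations

print(check_client(180.0, False, 25.0))  # hypothetical measurements
```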
The server-side performance requirements are as follows:
Appendix C
(Informative)
Application case of performance test of...
