
Performance Testing


1 Performance Testing
Performance testing measures the performance characteristics of an application. Its main
objective is to demonstrate that the system functions to specification, with acceptable response
times, while processing the required transaction volumes against a production-sized database. In
other words, a performance test must demonstrate that the system meets requirements for
transaction throughput and response times simultaneously. The main deliverables from such a
test, prior to execution, are automated test scripts and an infrastructure to be used to execute
automated tests for extended periods.
1.1 What is Performance Testing?
Performance testing of an application is essentially the process of understanding how the web application
and its operating environment respond at various user load levels. In general, we want to measure the
latency, throughput, and resource utilization of the web site while simulating attempts by virtual users to
access the site simultaneously. One of the main objectives of performance testing is to maintain a web site
with low latency, high throughput, and low utilization.
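
To make this concrete, the following is a minimal sketch of such a measurement in Python, using only the standard library. The target URL, user count, and request count are illustrative assumptions, not values from this document: virtual users fetch the URL concurrently, per-request latency is recorded, and throughput is derived from the completed-request count over the elapsed wall-clock time.

    import statistics
    import threading
    import time
    import urllib.request

    TARGET_URL = "http://localhost:8080/"  # hypothetical application under test
    VIRTUAL_USERS = 10                     # concurrent simulated users (assumption)
    REQUESTS_PER_USER = 20

    latencies = []
    lock = threading.Lock()

    def virtual_user():
        # Each virtual user issues a series of requests and records latency.
        for _ in range(REQUESTS_PER_USER):
            start = time.perf_counter()
            with urllib.request.urlopen(TARGET_URL) as response:
                response.read()
            elapsed = time.perf_counter() - start
            with lock:
                latencies.append(elapsed)

    wall_start = time.perf_counter()
    threads = [threading.Thread(target=virtual_user) for _ in range(VIRTUAL_USERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    wall = time.perf_counter() - wall_start

    print(f"completed requests:  {len(latencies)}")
    print(f"throughput:          {len(latencies) / wall:.1f} requests/second")
    print(f"latency mean/median: {statistics.mean(latencies):.3f}s / {statistics.median(latencies):.3f}s")

In practice a dedicated load-testing tool would be used, but the sketch shows the three quantities named above: latency (per request), throughput (requests per second), and, indirectly, utilization (observed by monitoring the server while the script runs).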
1.2 Why Performance Testing?
Performance problems are usually the result of contention for, or exhaustion of, some system resource.
When a system resource is exhausted, the system is unable to scale to higher levels of performance.
Maintaining optimum Web application performance is a top priority for application developers and
administrators.
Performance analysis is also carried out for various purposes, such as:
• During the design or redesign of a module or a part of the system, more than one
alternative may present itself. In such cases, evaluating the design alternatives is
the prime motivation for an analysis.
• Post-deployment realities create a need for tuning the existing system. A systematic approach
like performance analysis is essential to extract the maximum benefit from an existing system.
• Identification of bottlenecks in a system is largely a troubleshooting effort. It helps to
pinpoint the components to replace and to focus efforts on improving overall system response.
• As the user base grows, the cost of failure becomes increasingly unbearable. To
increase confidence and to provide advance warning of potential problems under
load conditions, analysis must be done to forecast performance under load.


Typically, to debug applications, developers execute them along different execution streams
(i.e., they exercise the application completely) in an attempt to find errors.
When looking for errors in the application, performance is a secondary issue to features;
however, it is still an issue.
1.3 Performance Testing Objectives
The objective of a performance test is to demonstrate that the system meets requirements for transaction
throughput and response times simultaneously.
This test infrastructure is an asset, and an expensive one too, so it pays to make as much use of it
as possible. Fortunately, the infrastructure is a test bed which can be re-used for other tests
with broader objectives. A comprehensive test strategy would define a test infrastructure that
enables all of these objectives to be met.
The performance testing goals are:
• End-to-end transaction response time measurements.
• Measure the performance of application server components under various loads.
• Measure the performance of database components under various loads.
• Monitor system resources under various loads (a monitoring sketch follows this list).
• Measure the network delay between the server and clients.
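
The resource-monitoring goal can be approached with a small sampling script run on the server while the load test executes elsewhere. This is a minimal sketch; it assumes the third-party psutil package, which is not mentioned in this document, and the duration and interval values are arbitrary:

    import time
    import psutil  # third-party package (pip install psutil); an assumption, not part of the source text

    def monitor_resources(duration_s, interval_s):
        # Periodically sample CPU and memory utilization while a load test runs.
        end = time.time() + duration_s
        while time.time() < end:
            cpu = psutil.cpu_percent(interval=interval_s)  # blocks for interval_s, then reports
            mem = psutil.virtual_memory().percent
            print(f"{time.strftime('%H:%M:%S')}  cpu={cpu:5.1f}%  mem={mem:5.1f}%")

    monitor_resources(duration_s=60, interval_s=5)

Logging these samples alongside response time measurements makes it possible to correlate slow transactions with resource exhaustion.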
1.4 Pre-Requisites for Performance Testing
We can identify five pre-requisites for a performance test. Not all of these need to be in place prior to
planning or preparing the test (although this might be helpful); rather, the list defines what is required
before a test can be executed.
First and foremost, the design specification or a separate performance requirements document should:
• Define specific performance goals for each feature that is instrumented.
• Base performance goals on customer requirements.
• Define specific customer scenarios.
Quantitative, relevant, measurable, realistic, achievable requirements
As a foundation for all tests, performance requirements should be agreed upon prior to the test. This
helps in determining whether or not the system meets the stated requirements. The following attributes
make a performance comparison meaningful; a sketch of checking measured response times against such
requirements follows the list.

• Quantitative - expressed in quantifiable terms such that when response times are
measured, a sensible comparison can be derived.
• Relevant - a response time must be relevant to a business process.
• Measurable - a response time should be defined such that it can be measured using a
tool or stopwatch and at reasonable cost.
• Realistic - response time requirements should be justifiable when compared with the
durations of the activities within the business process the system supports.
• Achievable - response times should take some account of the cost of achieving them.
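
The following is a minimal sketch of what "quantitative" and "measurable" can look like in practice: measured response times are compared against agreed per-transaction limits. The transaction names, thresholds, and sample timings are illustrative assumptions, not requirements from this document:

    # 95th-percentile response time limits per business transaction, in seconds (illustrative)
    requirements = {
        "login": 2.0,
        "search": 3.0,
        "checkout": 5.0,
    }

    # response time samples from a test run, in seconds (illustrative)
    measured = {
        "login": [1.2, 1.4, 1.9, 1.6],
        "search": [2.1, 2.4, 2.8, 2.5],
        "checkout": [4.0, 4.8, 4.5, 4.2],
    }

    def p95(samples):
        # 95th percentile via the nearest-rank method.
        ordered = sorted(samples)
        rank = max(1, round(0.95 * len(ordered)))
        return ordered[rank - 1]

    for name, limit in requirements.items():
        actual = p95(measured[name])
        verdict = "PASS" if actual <= limit else "FAIL"
        print(f"{name:10s} p95={actual:.2f}s  limit={limit:.2f}s  {verdict}")

Percentile limits are used here rather than averages because a mean can hide a long tail of slow transactions.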
Stable system
A test team attempting to construct a performance test of a system whose software is of poor quality
is unlikely to be successful. If the software crashes regularly, it will probably not withstand the
relatively minor stress of repeated use. Testers may not be able to record scripts in the first instance,
or may not be able to execute a test for a reasonable length of time before the software, middleware,
or operating system crashes.
Realistic test environment
The test environment should ideally be the production environment, or a close simulation, and be
dedicated to the performance test team for the duration of the test. Often this is not possible.
However, for the results of the test to be realistic, the test environment should be comparable to the
actual production environment. Even with an environment which is somewhat different from the
production environment, it should still be possible to interpret the results using a model of
the system to predict, with some confidence, the behavior of the target environment. A test
environment which bears no similarity to the actual production environment may be useful for
finding obscure errors in the code, but it is useless for a performance test.
1.5 Performance Requirements
Performance requirements normally comprise three components:
• Response time requirements
• Transaction volumes detailed in ‘Load Profiles’
• Database volumes
Response time requirements
When asked to specify performance requirements, users normally focus attention on
response times, and often wish to define requirements in terms of generic response times.
A single response time requirement for all transactions might be simple to define from the
user's point of view, but it is unreasonable. Some functions are critical and require short
response times, while others are less critical and their response time requirements can be less
stringent. For example, an interactive order-entry screen might need to respond within a couple
of seconds, whereas an overnight reporting function could reasonably take minutes.
Load profiles
The second component of performance requirements is a schedule of load profiles. A load
profile is the level of system loading expected to occur during a specific business scenario.
Business scenarios might cover situations in which the users' organization has different
levels of activity, or involve a varying mix of activities which must be supported by the system.
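
A load profile can be captured as simple structured data so that test scripts and reviewers share one definition. A minimal sketch, in which the scenario names, user counts, and transaction mixes are invented for illustration:

    load_profiles = {
        # scenario name: concurrent users and hourly transaction mix (all values illustrative)
        "normal_day": {"users": 200, "tx_per_hour": {"order_entry": 1500, "enquiry": 3000}},
        "month_end": {"users": 350, "tx_per_hour": {"order_entry": 2500, "enquiry": 2000, "reporting": 400}},
        "seasonal_peak": {"users": 500, "tx_per_hour": {"order_entry": 4000, "enquiry": 5000}},
    }

    for scenario, profile in load_profiles.items():
        total = sum(profile["tx_per_hour"].values())
        print(f"{scenario:14s} users={profile['users']:4d}  total transactions/hour={total}")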
Database volumes
Data volumes, which define the numbers of table rows that should be present in the database
after a specified period of live running, complete the load profile. Typically, the data volumes
estimated to exist after one year's use of the system are used, but two-year volumes or
greater might be used in some circumstances, depending on the business application.
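
Projected data volumes follow from such rates by simple arithmetic, as in this minimal sketch; the table names and daily row rates are illustrative assumptions:

    # rows added per day of live running, per table (illustrative)
    rows_per_day = {
        "orders": 2000,
        "order_lines": 9000,
        "audit_log": 15000,
    }

    YEARS = 1  # one year is the typical projection period noted above
    for table, daily in rows_per_day.items():
        projected = daily * 365 * YEARS
        print(f"{table:12s} ~{projected:,} rows after {YEARS} year(s)")

Seeding the test database to these volumes before the test run helps ensure that query plans and index behavior resemble those of the mature production system.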
