SOFTWARE TESTING ASSIGNMENT
Test Plan
Table of Contents
1. References
2. Introduction
3. Test Items
4. Software Risk Issues
5. Features to be tested
6. Features not to be tested
7. Test Strategies
8. Item Pass/Fail Criteria
9. Environmental Needs
10. Test Deliverables
Group members:
50600357
Tran Hoang Duy
50601095
Truong Quang Khai
50600939
Bui Phi Hung
50601490
Nguyễn Trường Minh
1. REFERENCES

No. | Name | Available Location
1 | Test plan template |
2 | www.google.com.vn |
3 | Lecture slides |
2. INTRODUCTION
[Topic introduction]
“BUILDING AN ONLINE DOCUMENT MANAGEMENT SYSTEM”
Manages Vietnamese documents.
Allows users to search documents by semantics, by keyword, or by a combination of both.
The system is built on JSP and Struts Framework 1.3.10.
Runs on Internet Explorer or Mozilla Firefox.
The system must return search results in under 10 seconds.
The system can distribute document access rights to users.
[Give an overview of the plan:
A summary of the requirements.
List what needs to be achieved (test objectives).
Detail why testing is needed.]
Summary of the features to be tested:
General Functions:
View company documents (TC: 4 man-days, Test: 2 man-days)
View department documents (TC: 2 man-days, Test: 1.5 man-days)
Grant privilege (TC: 1 man-day, Test: 0.5 man-days)
Manage personal documents (TC: 2 man-days, Test: 1 man-day)
Common Functions:
View individual profile (TC: 0.5 man-days, Test: 0.5 man-days)
Change password (TC: 0.5 man-days, Test: 0.5 man-days)
Share documents (TC: 5 man-days, Test: 3 man-days)
Upload one or many documents (TC: 2 man-days, Test: 1 man-day)
Search documents (TC: 2 man-days, Test: 1 man-day)
[Testing purpose]
What needs to be achieved, and why testing is needed:
Test all of the auxiliary tasks.
Estimate project performance.
3. TEST ITEMS
[List of Software Items to be tested, their versions and how they are handed over for testing]
A build of Project Version 1.0
The teacher delivered the project and its installers to our group as the software-testing assignment.
4. SOFTWARE RISK ISSUES
[List all software Risks. These risks are related to the testing process, other risks will be mentioned in
section 5.Features to be tested. Below are some common risks:
Lack of personnel resources when testing is to begin.
Lack of availability of required hardware, software, data or tools.
Late delivery of the software, hardware or tools.
Delays in training on the application and/or tools.
Changes to the original requirements or designs.
Complexities involved in testing the applications]
Lack of personnel resources:
We have only 2 testers, while the system has about 9 features that must be tested.
Lack of availability of required hardware, software, data or tools:
Hardware:
We have about 2 PCs.
Software:
OS: Unix, Windows XP, and Windows 7 running in VMware Workstation.
Web browsers: Internet Explorer 6, 7, 8 and Mozilla Firefox.
Data:
Sufficient test data is available.
Tools:
We lack tools for performance testing, usability testing, and security testing.
5. FEATURES TO BE TESTED
[List all features that will be tested under this test plan.
Identify risks for each feature by their likelihood and impact, and then determine the extent of testing.
Identify testing efforts for each type of test.]
Feature No | Feature Description | Technical Risk | Business Risk | Risk Priority | Extent of Testing | Estimated Testing Time (hours)
1 | View company documents | | | | |
2 | View department documents | | | | |
3 | Grant privilege | | | | |
4 | Manage personal documents | | | | |
5 | View individual profile | | | | |
6 | Change password | | | | |
7 | Share documents | | | | |
8 | Upload one or many documents | | | | |
9 | Search documents | | | | |
Total Estimated Testing Time:
6. FEATURES NOT TO BE TESTED
[List all features that will not be tested under this test plan]
Feature No | Feature Description | Technical Risk | Business Risk | Risk Priority | Extent of Testing | Estimated Testing Time (hours)
1 | Performance | | | | |
2 | Usability | | | | |
3 | Security | | | | |
7. TEST STRATEGIES
[The Test Strategy presents the recommended approach to testing the target-of-test. The previous
section, Features to be tested, described what will be tested; this section describes how the target-of-test will
be tested.
For each type of test, provide a description of the test and why it is being implemented and executed.
If a type of test will not be implemented and executed, indicate in a sentence stating the test will not be
implemented / executed and stating the justification, such as “This test will not be implemented / executed.
This test is not appropriate …”
The main considerations for the test strategy are the techniques to be used and the criterion for knowing
when the testing is completed.
In addition to the considerations provided for each test below, testing should only be executed using
known, controlled databases, in secured environments.
In addition, you need to describe:
Testing Tools/Aids
Constraints on testing
Support Required – Environment & Staffing
What metrics will be collected?
Which level is each metric to be collected at?
How is Configuration Management to be handled?
How many different configurations will be tested?
Hardware
Software
Combinations of HW, SW and other vendor packages
What levels of regression testing will be done and how much at each test level?
Will regression testing be based on severity of defects detected?
How will elements in the requirements and design that do not make sense or are untestable be
processed?]
7.1 Function Testing
[Function testing of the target-of-test should focus on any requirements for test that can be traced
directly to use cases (or business functions), and business rules. The goals of these tests are to
verify proper data acceptance, processing, and retrieval, and the appropriate implementation of the
business rules. This type of testing is based upon black box techniques, that is, verifying the
application (and its internal processes) by interacting with the application via the GUI and analyzing
the output (results). Identified below is an outline of the testing recommended for each application:]
Test Objective:
Ensure proper target-of-test functionality, including navigation, data entry, and
processing.
Security, performance, and retrieval will not be tested.
Technique:
Execute each use case, use case flow, or function, using valid and invalid data,
to verify the following:
The expected results occur when valid data is used.
The appropriate error / warning messages are displayed when invalid data is used.
Each business rule is properly applied.
Completion Criteria:
All planned tests have been executed.
All identified defects have been addressed.
Special Considerations:
[Identify / describe those items or issues (internal or external) that impact
the implementation and execution of function test]
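As an illustration of the black-box technique above (valid and invalid data against a function under test), a minimal sketch follows. `searchDocuments` is a hypothetical stand-in stub, not the project's actual Struts action; all names here are assumptions for illustration.

```java
import java.util.List;

public class SearchFunctionTest {

    // Hypothetical stand-in for the real search module; the actual
    // project would expose this through a Struts action instead.
    static List<String> searchDocuments(String keyword) {
        if (keyword == null || keyword.trim().isEmpty()) {
            throw new IllegalArgumentException("Keyword must not be empty");
        }
        return List.of("doc-about-" + keyword.trim());
    }

    public static void main(String[] args) {
        // Valid data: the expected results occur.
        if (searchDocuments("struts").isEmpty()) {
            throw new AssertionError("valid keyword should return results");
        }

        // Invalid data: the appropriate error is raised.
        boolean rejected = false;
        try {
            searchDocuments("   ");
        } catch (IllegalArgumentException e) {
            rejected = true;
        }
        if (!rejected) {
            throw new AssertionError("blank keyword should be rejected");
        }
        System.out.println("function tests passed");
    }
}
```

In a real run these checks would be driven from test cases traced to the use cases, one pair of valid/invalid inputs per business rule.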
7.2 Performance Testing
[Performance profiling is a performance test in which response times, transaction rates, and other time
sensitive requirements are measured and evaluated. The goal of Performance Profiling is to verify
performance requirements have been achieved. Performance profiling is implemented and executed to
profile and tune a target-of-test's performance behaviors as a function of conditions such as workload or
hardware configurations.
NOTE: Transactions below refer to “logical business transactions.” These transactions are defined as
specific use cases that an actor of the system is expected to perform using the target-of-test, such as add
or modify a given contract.]
Test Objective:
Ensure search responses return in under 10 seconds.
Technique:
Use Test Procedures developed for Function Cycle Testing.
Modify data files (to increase the number of transactions) or the scripts to
increase the number of iterations each transaction occurs.
Simulate many users accessing the system at the same time.
Completion Criteria:
Single Transaction / single user: Successful completion of the test scripts
without any failures and within the expected / required time allocation (per
transaction)
Multiple transactions / multiple users: Successful completion of the test scripts
without any failures and within acceptable time allocation.
Special Considerations:
The assignment requirements do not include a performance test.
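Even though the assignment does not require a performance test, the 10-second criterion above could be checked with a sketch like the following. `doSearch` is a placeholder simulating the real search call, and the user count is an illustrative assumption, not a figure from the plan.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SearchPerformanceTest {

    static final long LIMIT_MILLIS = 10_000; // the 10-second requirement

    // Placeholder for the real search call; sleeps to simulate work.
    static void doSearch(String keyword) throws InterruptedException {
        Thread.sleep(50);
    }

    // Runs `users` concurrent searches and reports whether every
    // individual response stayed within the time limit.
    static boolean allWithinLimit(int users) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Future<Long>> timings = new ArrayList<>();
        for (int i = 0; i < users; i++) {
            timings.add(pool.submit(() -> {
                long start = System.nanoTime();
                doSearch("vietnam");
                return (System.nanoTime() - start) / 1_000_000;
            }));
        }
        pool.shutdown();
        for (Future<Long> t : timings) {
            if (t.get() > LIMIT_MILLIS) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("10 users within limit: " + allWithinLimit(10));
    }
}
```

The same loop covers both completion criteria: a single user is the `users = 1` case, and the multi-user case scales the thread count.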
8. ITEM PASS/FAIL CRITERIA
[This section of the test plan describes the pass/fail criteria for each of the items described in Section 3, Test Items.
Typically, pass/fail criteria are expressed in terms of test cases passed and failed; number, type, severity
and location of bugs; usability, reliability, and/or stability.
Examples of pass/fail criteria include:
% of test cases passed
number, severity, and distribution of defects
test case coverage
successful conclusion of user test
completion of documentation
performance criteria.]
A test item passes when each of its test cases produces the expected result; any mismatch between the actual and expected result is recorded as a failure.
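For example, the "% of test cases passed" criterion from the list above could be computed and compared against a threshold. The 90% bar used below is an assumed example value, not one stated in the assignment.

```java
public class PassFailCriteria {

    // Returns the pass rate in percent for a set of test cases.
    static double passRate(int passed, int total) {
        if (total == 0) {
            throw new IllegalArgumentException("no test cases");
        }
        return 100.0 * passed / total;
    }

    public static void main(String[] args) {
        // Example: 45 of 50 test cases passed.
        double rate = passRate(45, 50);
        System.out.println("pass rate: " + rate + "%");
        // Assumed acceptance threshold of 90%.
        System.out.println("meets 90% threshold: " + (rate >= 90.0));
    }
}
```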
9. ENVIRONMENTAL NEEDS
[List all testing environments needed]
System Resources

Resource | Name / Type
Operating system | Windows XP, ..
Browsers | Firefox (all versions), IE 7
10. TEST DELIVERABLES
[List all documents that can be delivered, such as: Test Plan, Test Cases, Test Reports, etc.]
[List all test scripts that can be delivered]
No | Document Type | Assigned To
1 | Test Plan | Tran Hoang Duy (50600357), Nguyen Phi Hung (50600939), Nguyen Truong Minh (50601490)
2 | Installation | Tran Hoang Duy (50600357), Truong Quang Khai (50601095)
3 | Test Case | Design: Tran Hoang Duy (50600357); Execute: Truong Quang Khai (50601095)