I haven’t written a huge number of Detailed Test Plans for performance testing, so I am still interested in reading other people’s (non-confidential) documents or templates. If anyone out there has a template, I would love to have a look.

 

Published On: January 18, 2005

26 Comments

  1. munna December 27, 2005 at 11:38 pm - Reply

    hi,
    any kind of software test plan will be helpful for me. please send if you have one

    • emmanuel November 22, 2011 at 5:40 pm - Reply

      any kind of software test plan will be helpful for me. please send if you have one

      • Srinivas November 4, 2015 at 7:52 pm - Reply

        any kind of software test plan will be helpful for me. please send if you have one

  2. Prasad January 3, 2006 at 4:49 pm - Reply

    Please send some load docs

  3. veer November 30, 2006 at 7:04 am - Reply

    Hi, could you post a performance test plan template for reference? That would be of great help.

  4. Asawari July 1, 2007 at 1:35 pm - Reply

    Hi, please send me one test plan for all kinds of testing

  5. Asawari July 1, 2007 at 1:37 pm - Reply

    Hi, Please send me all kinds of test plans.

  6. Niranjan November 29, 2007 at 3:29 pm - Reply

    please send me all kinds of test plans for performance testing

  7. ravi February 13, 2008 at 9:13 pm - Reply

    please send any good test plans regarding performance

  8. Siddhartha March 13, 2008 at 10:54 pm - Reply

    [Project_name_here]

    Load/Performance Test Plan

    Version [Version_number]
    Author: [Your_name_here]

    [Your_Company_name]
    [Street_name_1]
    [Street_name_2]
    [City_Zip_Country]
    [Phone_number]
    [URL]

    Audit Trail:

    Date Version Name Comment

    Table of Contents
    REFERENCE DOCUMENTS
    1. SCOPE
    2. APPROACH
    3. LOAD TEST TYPES AND SCHEDULES
    4. PERFORMANCE/CAPABILITY GOALS
    5. LOAD TESTING PROCESS, STATUS REPORTING, FINAL REPORT
    6. BUG REPORTING AND REGRESSION INSTRUCTIONS
    7. TOOLS USED
    8. TRAINING NEEDS
    9. LOAD DESCRIPTIONS
    10. SYSTEM UNDER TEST ENVIRONMENT
    11. EXCLUSIONS
    12. TEST DELIVERABLES
    13. BUDGET/RESOURCE
    14. TEAM MEMBERS AND RESPONSIBILITIES
    15. LIST OF APPENDICES
    16. TEST PLAN APPROVAL
    APPENDIX 1 USER SCENARIO TEST SUITE
    APPENDIX 2 CONCURRENCY LOAD TESTING SUITE
    APPENDIX 3 DATA ELEMENT FROM LOAD TEST
    APPENDIX 4 TEST SCRIPTS – REQUIRES WEBLOAD OR TEXT EDITOR – IN JAVASCRIPT
    APPENDIX 5 ERROR OR WEB SERVER FAILURES
    APPENDIX 6 WEB MONITORING DATA

    Reference Documents

    Reference information used for the development of this plan, including:
    • Business requirements
    • Technical requirements
    • Test requirements
    • …and other dependencies

    1. Scope

    What does this document entail?
    What is being tested?
    What is the overall objective of this plan? For example:
    • To document test objectives, test requirements, test designs, test procedures, and other project management information
    • To solicit feedback and build consensus
    • To define development and testing deliverables
    • To secure commitment and resources for the test effort

    2. Approach

    The high-level description of the testing approach that enables us to cost-effectively meet the expectations stated in the Scope section.
    3. Load Test Types and Schedules

    Specify the test types (with definition for each) to run:
    • Acceptance test
    • Baseline test
    • 2B1 load test
    • Goal-reaching test
    • Spike test
    • Burstiness test
    • Stress test
    • Scalability test
    • Regression test
    • Benchmark test

    Be specific:
    • Specify what tests you will run
    • Estimate how many cycles of each test you will run
    • Schedule your tests ahead of time
    • Specify by what criteria you will consider the SUT to be ready-for-test
    • Forward thinking: Determine and communicate the planned tests and how the tests are scheduled
    4. Performance/Capability Goals

    Identify goals:
    • Percentage of requested static pages that must meet the acceptable response time?
    • Percentage of requested scripts that must meet the acceptable response time?
    • The baseline multiplier (2x, 4x, …) that the system must be capable of handling?
    • The spike ratio that the system must be capable of handling?
    • The peak ratio that the system must be capable of handling?
    • The burstiness ratio that the system must be capable of handling?
    • Tolerance ratio: Imposed load ± 25%?
    • Safety ratio: Imposed load x 2?
    • Spike ratio: Imposed load x 3?
    • Burstiness ratio: Imposed load x 5?
    • Increase the load by multiplying the load baseline by 1x, 2x, 3x, 4x, Nx gradually until unacceptable response time is reached.
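The ratio goals above can be turned into concrete load targets with a small calculation. The following Python sketch is illustrative only: the baseline figure and the exact multipliers are assumptions standing in for whatever values this plan's section 4 ends up specifying.

```python
# Sketch: deriving load-test target levels from a measured baseline.
# BASELINE_RPS and the multipliers are illustrative assumptions,
# not values taken from this plan.

BASELINE_RPS = 250  # assumed normal load, requests per second

def load_targets(baseline):
    """Return the load levels implied by the ratios above."""
    return {
        "tolerance_low":  baseline * 0.75,  # imposed load -25%
        "tolerance_high": baseline * 1.25,  # imposed load +25%
        "safety":         baseline * 2,     # safety ratio: 2x
        "spike":          baseline * 3,     # spike ratio: 3x
        "burstiness":     baseline * 5,     # burstiness ratio: 5x
    }

def ramp_schedule(baseline, max_multiplier=4):
    """Goal-reaching ramp: 1x, 2x, ... Nx of the baseline."""
    return [baseline * n for n in range(1, max_multiplier + 1)]

if __name__ == "__main__":
    print(load_targets(BASELINE_RPS))
    print(ramp_schedule(BASELINE_RPS))
```

In practice the ramp would stop at the first level where response times become unacceptable, which is the point the goal-reaching test is looking for.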

    Other questions to consider:
    • What is response time?
    • What is acceptable response time?
    • Which metrics should we collect?
    • What is the correlation between demand and increased load?
    • How do we determine which components are problematic?
    • How do we correlate financial implications?
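A goal of the form "P percent of requests must meet the acceptable response time" can be checked mechanically once response times are collected. A minimal Python sketch follows; the sample times and the 3-second threshold are made-up illustrations, not figures from this plan.

```python
# Sketch: checking a "P% of requests within T seconds" goal and a
# percentile metric over collected response times (sample data below
# is invented for illustration).

def pass_rate(samples, threshold):
    """Fraction of response times at or under the threshold."""
    if not samples:
        return 0.0
    return sum(1 for t in samples if t <= threshold) / len(samples)

def percentile(samples, p):
    """Nearest-rank percentile (p in 0..100) of the samples."""
    ordered = sorted(samples)
    k = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[k]

times = [0.8, 1.2, 1.1, 2.9, 3.4, 0.9, 1.7, 2.2, 5.0, 1.0]
print(f"{pass_rate(times, 3.0):.0%} within 3 s")  # share meeting goal
print(f"p90 = {percentile(times, 90)} s")          # 90th percentile
```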

    5. Load Testing Process, Status Reporting, Final Report

    Describe the testing and reporting procedures. For example:
    • The internal test team will execute all created scripts. These scripts will be generated and executed against the system at least three times, and again after subsequent hardware, software, or other fixes are introduced.

    • The test team will baseline the load as follows: the Load Test Team will test Nile.com with 1000 simultaneous clients/users and report back on the following metrics:
      • Response time for each transaction hitting the Web site
      • Any web or database server errors as reported in the data log
      • Round time
      • Failed Web transactions
    • Status reports will be sent to the Team Lead detailing:
      • Performance tests run
      • Performance metrics collected
      • Performance errors and status
      • Number of bugs entered
      • Status summary
      • Additional load testing, if needed
    • The final report will include summary bug counts, an overall performance assessment, and test project summary items.

    Additional Information to be provided by Development Team:
    1. Build Schedule
    2. Acceptance test criteria
    3. Deployment Plans

    6. Bug Reporting and Regression Instructions

    Describe the bug reporting process and the fix/change regression test procedures.

    7. Tools Used

    State the tool solutions for the project:
    • Load testing tools
    • Monitoring tools
    Tool Options:
    • Product vs. Application Service Provider (ASP)
    • Freeware
    • Lease or rent
    • Purchase
    • Build
    • Outsourcing (testing with virtual client licensing included)

    8. Training Needs

    Training programs to be provided to the team to enable successful planning and execution.

    9. Load Descriptions

    Server-based
    • Number of users and/or sessions
    • Average session time
    • Number of page views
    • Average page views per session
    • Peak period (e.g., 75% of traffic is from 11:00 AM-4:00 PM)
    • Number of hits
    • Average page size
    • Most requested pages
    • Average time spent on page
    • New users vs. returning users
    • Frequency of visits (e.g., 75% of users made one visit)
    • Demographics
    • Client information such as browser, browser version, JavaScript support, JavaScript enabled/disabled, and so on.

    User-based
    • Number of users
    • Session length
    • User activities and frequency of activities per session
    • Think/Read/Data-input time
    • Percentage by functional group
    • Percentage by human speed
    • Percentage by human patience (cancellation rates)
    • Percentage by domain expertise (speed)
    • Percentage by familiarity (speed)
    • Percentage by demographics (arrival rates)

    Other questions to consider:
    • What is the definition of “workload”?
    • How do we size the workload?
    • What is the expected workload?
    • What’s the mix ratio of static pages vs. code?
    • What is the definition of “increased load”?
    • What is future growth? Can it be quantified?
    • What is the definition of scalability?
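One common way to size a user-based workload from the figures above is Little's law: concurrent users = arrival rate x average session length. The Python sketch below uses hypothetical traffic figures (100,000 sessions/day, a five-hour peak window carrying 75% of traffic, 8-minute sessions); substitute the plan's own numbers.

```python
# Sketch: sizing a workload from a user-based load description using
# Little's law (N = lambda * W). All input figures are hypothetical.

def arrival_rate(daily_sessions, peak_fraction, peak_hours):
    """Sessions per second arriving during the peak window."""
    peak_sessions = daily_sessions * peak_fraction
    return peak_sessions / (peak_hours * 3600)

def concurrent_users(rate_per_sec, avg_session_sec):
    """Little's law: concurrent users = arrival rate * session length."""
    return rate_per_sec * avg_session_sec

# e.g. 100,000 sessions/day, 75% between 11:00 AM and 4:00 PM (5 h),
# average session of 8 minutes:
lam = arrival_rate(100_000, 0.75, 5)
print(f"peak arrival rate ~ {lam:.2f} sessions/s")
print(f"concurrent users  ~ {concurrent_users(lam, 8 * 60):.0f}")
```

The resulting concurrent-user count is the natural baseline to feed into the multiplier schedule in the Performance/Capability Goals section.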

    10. System Under Test Environment

    Specify mixes of system hardware, software, memory, network protocols, bandwidth, etc.
    • Network access variables: For example, 56K modem, 128K cable modem, T1, etc.
    • Demographic variables: For example, San Francisco, Los Angeles, Chicago, New York, Paris, London, etc.
    • ISP infrastructure variables: For example, first tier, second tier, etc.
    • Client baseline configurations
    • Computer variables
    • Browser variables
    • Server baseline configurations
    • Computer variables
    • System architecture variables and diagrams

    Other questions to consider asking:
    • What is the definition of “system”?
    • How many other users are using the same resources on the system under test (SUT)?
    • Are you testing the SUT in its complete, real-world environment (with load balances, replicated database, etc.)?
    • Is the SUT inside or outside the firewall?
    • Is the load coming from the inside or outside of the firewall?

    11. Exclusions

    Set clear expectations—State which goals will be outside of the scope of this testing. For example:
    • Content accuracy or appropriateness testing is out of the scope of this plan.
    • The integration of any major third party components (for example a search engine, credit card processor, or mapping component) with the site will be tested, though the scope of the project does not include in-depth functional testing of these components.
    • Internationalization
    • Compatibility Testing

    12. Test Deliverables

    • This test plan
    • Performance testing goals
    • Workload definitions
    • User scenario designs
    • Performance test designs
    • Test procedures
    • System baseline/System-under-test configurations
    • Metrics to collect
    • Tool evaluation and selection reports (first time, or as needed)
    • Test scripts/suites
    • Test run results
    • Analysis reports against the collected data
    • Performance related error reports (e.g., failed transactions)
    • Functional bug reports (e.g., data integrity problems)
    • Periodic status reports
    • Final report

    13. Budget/Resource
    Monetary requirements for equipment and people to complete the plan.

    14. Team Members and Responsibilities
    Project team members, their responsibilities and contact information.

    15. List of Appendices

    Specific test case, test design and test script information to be added as we go. Here are a few examples:
    • Real-World User-Level Test Suite
    • Concurrency Test Suite
    • Data Elements
    • Test Scripts
    • Error Reports
    • Web Monitoring Data
    16. Test Plan Approval

    Business Approval

    __________________________________________________ _____________
    [Name/Title] Date

    Testing Approval

    ___________________________________________________ _____________
    [Name/Title] Date

    Appendices
    Appendix 1 User Scenario Test Suite

    Appendix 2 Concurrency Load Testing Suite

    Appendix 3 Data Element from Load Test

    Appendix 4 Test Scripts – Requires WebLOAD or Text Editor – in JavaScript

    Appendix 5 Error or Web Server Failures

    Appendix 6 Web Monitoring Data

  9. Vijila June 20, 2008 at 3:34 am - Reply

    Hi Sidd, thanks for the test plan…

  10. SUNIL July 11, 2008 at 6:35 pm - Reply

    Not able to get the performance test plan

  11. PK August 27, 2008 at 4:23 am - Reply

    Hi Sidd,

    Can you please share a sample test plan in the format that you have specified?

  12. Rajeev September 12, 2008 at 4:29 pm - Reply

    I would like to know how to prepare a full-fledged performance test plan.
    Please give one prototype.

  13. Vivek June 2, 2009 at 6:48 pm - Reply

    Hi,

    Great to have come across the post about the performance/load test plan; it's really very helpful. I just prepared one a few days ago with SilkPerformer in focus.
    Could you please share any non-confidential load test plan doc for reference? That would help me a good deal.

    Thanks again
    Vivek

  14. Vivek July 1, 2009 at 9:43 pm - Reply

    Hi,

    We have to create a performance test plan for our project. Can you please share a performance test plan with us?

    Thanks
    Vivek

  15. Sankar December 17, 2009 at 4:08 pm - Reply

    Dear Siddhartha ,

    Can you please provide the test plan sample which you have created . That would be helpful for my project.Thanks in advance

  16. Luke January 21, 2010 at 11:32 am - Reply

    Thank you, Siddhartha. That helps.

  17. Sunny March 15, 2010 at 4:27 am - Reply

    Thanks, dear friend. It's a great help for all the people who are into performance testing.

  18. Anna January 12, 2011 at 2:20 am - Reply

    Sid,
    Thanks man, Good information regarding performance test plan

  19. Akbar April 17, 2012 at 8:44 pm - Reply

    Please have a look; it contains all that we need to build a test plan for performance testing…

    Before performance testing can be performed effectively, a detailed plan should be formulated that specifies how performance testing will proceed from a business perspective and technical perspective. At a minimum, a performance testing plan needs to address the following:

    •Overall approach
    •Dependencies and baseline assumptions
    •Pre-performance testing actions
    •Performance testing approach
    •Performance testing activities
    •In-scope business processes
    •Out-of-scope business processes
    •Performance testing scenarios
    •Performance test execution
    •Performance test metrics

    As in any testing plan, try to keep the amount of text to a minimum. Use tables and lists to articulate the information. This will reduce the incidence of miscommunication.

    Overall approach
    This section of the performance plan lays out the overall approach for this performance testing engagement in non-technical terms. The target audience is the management and the business. Example:

    “The performance testing approach will focus on the business processes supported by the new system implementation. Within the context of the performance testing engagement, we will:
    •Focus on mitigating the performance risks for this new implementation.
    •Make basic working assumptions on which parts of the implementation need to be performance-tested.
    •Reach consensus on these working assumptions and determine the appropriate level of performance and stress testing that shall be completed within this compressed time schedule.

    This is a living document; as more information is brought to light and as we reach consensus on the appropriate performance testing approach, this document will be updated.”

    Dependencies and baseline assumptions
    This section of the performance test plan articulates the dependencies (tasks that must be completed) and baseline assumptions (conditions testing believes to be true) that must be met before effective performance testing can proceed. Example:

    “To proceed with any performance testing engagement the following basic requirements should be met:
    •Components to be performance tested shall be completely functional.
    •Components to be performance tested shall be housed in hardware/firmware components that are representative or scaleable to the intended production systems.
    •Data repositories shall be representative or scaleable to the intended production systems.
    •Performance objectives shall be agreed upon, including working assumptions and testing scenarios.
    •Performance testing tools and supporting technologies shall be installed and fully licensed.”


    Pre-performance testing actions
    This section of the performance test plan articulates pre-testing activities that could be performed before formal performance testing begins to ensure the system is ready. It's the equivalent of smoke testing in the functional testing space. Example:

    “Several pre-performance testing actions could be taken to mitigate any risks during performance testing:
    •Create “stubs” or “utilities” to push transactions through the QA environment, using projected peak loads.
    •Create “stubs” or “utilities” to replace business-to-business transactions that are not going to be tested or will undergo only limited performance testing. This would remove any dependencies on B2B transactions.
    •Create “stubs” or “utilities” to replace internal components that will not be available during performance testing. This would remove any dependencies on these components.
    •Implement appropriate performance monitors on all high-volume servers.”
    Performance testing approach
    This section of the performance plan expands on the overall approach, but this time the focus is on both the business and the technical approach. As an example:

    “The performance testing approach will focus on a logical view of the new system implementation. Within the context of the performance testing engagement, we will:
    •Focus on mitigating the performance risks for this new implementation.
    •Make basic working assumptions on which parts of the implementation need to be performance-tested.
    •Reach consensus on these working assumptions and determine the appropriate level of performance that shall be completed.
    •Use a tier 1 performance testing tool that can replicate the expected production volumes.
    •Use an environment that replicates the components (as they will exist in production) that will be performance-tested -– noting all exceptions.
    •Use both production and non-production (testing) monitors to measure the performance of the system during performance testing.”
    Performance testing activities
    This section of the performance test plan specifies the activities that will occur during performance testing. Example:

    “During performance testing the following activities shall occur:
    •Performance test shall create appropriate loads against the system following agreed-upon scenarios that include:
    •User actions (workflow)
    •Agreed-upon loads (transactions per minute)
    •Agreed-upon metrics (response times)

    •Manual testing and automated functional tests shall be conducted during performance testing to ensure that user activities are not impacted by the current load.

    •System monitors shall be used to observe the performance of all servers involved in the test to ensure they meet predefined performance requirements.

    •Post-implementation support teams shall be represented during performance testing to observe and support the performance testing efforts.”
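The load-creation activity described above can be sketched as a minimal driver: worker threads repeatedly run a scripted transaction and record response time and success. In this Python sketch the transaction is a stand-in stub, not a client of any real system; an actual run would call the system under test (e.g., through an HTTP client) and load the agreed-upon scenarios instead.

```python
# Sketch: N worker threads drive a scripted transaction and record
# (response time, success) pairs. transaction() is a simulated stub.
import threading
import time
import random

results = []              # collected (elapsed_seconds, ok) tuples
lock = threading.Lock()   # guards the shared results list

def transaction():
    """Stub for one scripted user action against the SUT."""
    time.sleep(random.uniform(0.001, 0.005))  # simulated server work
    return random.random() > 0.05             # ~5% simulated failures

def worker(iterations):
    for _ in range(iterations):
        start = time.perf_counter()
        ok = transaction()
        elapsed = time.perf_counter() - start
        with lock:
            results.append((elapsed, ok))

threads = [threading.Thread(target=worker, args=(20,)) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

failures = sum(1 for _, ok in results if not ok)
print(f"{len(results)} transactions, {failures} failed")
```

The recorded pairs feed directly into the response-time and failure-rate metrics that the agreed-upon acceptance criteria are stated against.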

    In-scope business processes
    This section of the performance test plan speaks to which aspects of the system are deemed to be in-scope (measured). Example:

    “The following business processes are considered in-scope for the purposes of performance testing:
    •User registration
    •Logon/access
    •Users browsing content
    •Article sales & fulfillment
    •Billing
    Business process list formed in consultation with: Business Analysts, Marketing Analyst, Infrastructure, and Business Owner.”

    Out-of-scope business processes
    This section of the performance testing plan speaks to which aspects of the system are deemed to be out-of-scope (not measured). Example:

    “Business processes that are considered out-of-scope for the purposes of testing are as follows:
    •Credit check
    •Assumption: Credit check link shall be hosted by a third party — therefore no significant performance impact.
    •All other business functionality not previously listed as in-scope or out-of-scope
    •Assumption: Any business activity not mentioned in the in-scope or out-of-scope sections of this document does not present a significant performance risk to the business.”
    Formulating performance testing scenarios
    The existence of this section within the body of the performance testing plan depends on the maturity of the organization within the performance testing space. If the organization has little or no experience in this space, then include this section within the plan otherwise include it as an appendix. Example:

    “Formulation of performance testing scenarios requires significant inputs from IT and the business:
    •Business scenario
    •The business scenario starts as a simple textual description of the business workflow being performance-tested.
    •The business scenario expands to a sequence of specific steps with well-defined data requirements.
    •The business scenario is complete once IT determines what (if any) additional data requirements are required because of the behavior of the application/servers (i.e. caching).

    •Expected throughput (peak)
    •The expected throughput begins with the business stating how many users are expected to be performing this activity during peak and non-peak hours.
    •The expected throughput expands to a sequence of distinguishable transactions that may (or may not) be discernable to the end user.
    •The expected throughput is completed once IT determines what (if any) additional factors could impact the load (e.g., load-balancing).

    •Acceptance performance criteria (acceptable response times under various loads)
    •Acceptance performance criteria are stated by the business in terms of acceptable response times under light, normal, and heavy system load, where system load means day-in-the-life activity. These could be simulated by other performance scenarios.
    •The performance testing team then restates the acceptance criteria in terms of measurable system events. These criteria are then presented to the business for acceptance.
    •The acceptance criteria are completed once IT determines how to monitor system performance during the performance test. This will include metrics from the performance testing team.

    •Data requirements (scenario and implementation specific)
    •The business specifies the critical data elements that would influence the end-user experience.
    •IT expands these data requirements to include factors that might not be visible to the end user, such as caching.
    •The performance testing team working with IT and the business creates the necessary data stores to support performance testing.”
    Performance test execution
    Once again the existence of this section of the performance test plan is dependent upon the maturity of the organization within the performance testing space. If the organization has significant performance testing experience, then this section can become a supporting appendix. Example:

    “Performance testing usually follows a linear path of events:
    •Define performance-testing scenarios.
    •Define day-in-the-life loads based on the defined scenarios.
    •Execute performance tests as standalone tests to detect issues within a particular business workflow.
    •Execute performance scenarios as a “package” to simulate day-in-the-life activities that are measured against performance success criteria.
    •Report performance testing results.
    •Tune the system.
    •Repeat testing as required.”
    Performance test metrics
    The performance test metrics need to track against acceptance performance criteria formulated as part of the performance testing scenarios. If the organization has the foresight to articulate these as performance requirements, then a performance requirements section should be published within the context of the performance test plan. The most basic performance test metrics consist of measuring response time and transaction failure rate against a given performance load — as articulated in the performance test scenario. These metrics are then compared to the performance requirements to determine if the system is meeting the business need.
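Comparing the collected metrics against the acceptance criteria can be as mechanical as a per-metric lookup. A Python sketch follows; the requirement names and limits are hypothetical illustrations of "response time and failure rate against a given load", not figures from any real plan.

```python
# Sketch: comparing measured performance metrics against acceptance
# criteria. Metric names and limits below are hypothetical.

requirements = {
    "p90_response_s": 3.0,   # max 90th-percentile response time
    "failure_rate":   0.01,  # max fraction of failed transactions
}

measured = {
    "p90_response_s": 2.4,
    "failure_rate":   0.02,
}

def evaluate(measured, requirements):
    """Return {metric: True/False}; True when the requirement is met."""
    return {name: measured[name] <= limit
            for name, limit in requirements.items()}

verdict = evaluate(measured, requirements)
print(verdict)  # failure_rate exceeds its limit in this example
```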

    Closing comments
    This article can speak only to generalities; the specifics depend on the system and situations being tested. As a closing note, non-performance testing activities are often referred to as performance testing — what I like to call warm-and-fuzzy performance testing. If you are not replicating expected production loads, then you are not doing performance testing.

    • mitesh May 17, 2012 at 5:53 am - Reply

      Hi, thank you for sharing the test plan. Do you have it with all these details in brief? If yes, can you please send it to me? I really need it for my project. Appreciated.

  20. santoshpavan December 7, 2012 at 2:16 pm - Reply

    please send me a load test plan to my mail ID

  21. Kamran Khan May 12, 2014 at 3:52 pm - Reply

    Thanks everyone for flooding this page.
    Those asking for performance testing material might like to see my presentation, Load Testing using HP LoadRunner, here (among the top 1% on SlideShare):
    chromeis.com/loadrunner

    Sincerely,
