Performance Testing Overview

"


Performance Testing:

Overview
This paper introduces performance testing, describes how it is performed, and surveys tools and techniques relevant to it.

A few definitions and explanations: what is performance testing?
  • In general, performance testing is the process of checking whether an application behaves well under specified conditions (user load, transactions, speed, process and memory usage, threads, hits, throughput, and so on).
  • The objective of a performance test is to demonstrate that the system meets requirements such as a defined number of concurrent users, transaction rates, throughput, and response times.
  • Performance testing can determine the speed or effectiveness of a computer, network, software program, or device.
  • This process can involve quantitative tests done in a lab, such as measuring the response time or the number of MIPS (millions of instructions per second) at which a system functions. Qualitative attributes such as reliability, scalability, and interoperability may also be evaluated. Performance testing is often done in conjunction with stress testing.
  • Performance testing can verify that a system meets the specifications claimed by its manufacturer or vendor.

Performance test approach:
Let’s consider a small scenario.
Say, for example, we are searching Google for “GE” company info.
If all of us (say, 100 of us) search for the word “GE” on Google at the same time, will the Google server handle this load? (user load)
How many transactions (responses) does Google generate?
How many bytes (throughput) does Google generate at this point?
How many users got their results back, and how many did not? (transaction summary)
And so on.
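The questions above can be sketched in code. Below is a minimal, hypothetical Python load harness; note that the `search` function is a made-up stand-in that simulates server latency rather than a real HTTP call to Google, so the harness can run anywhere:

```python
import time
import random
from concurrent.futures import ThreadPoolExecutor

def search(query: str) -> bytes:
    """Hypothetical stand-in for a real HTTP search request."""
    time.sleep(random.uniform(0.01, 0.05))  # simulated server latency
    return f"results for {query}".encode()

def one_user(query: str) -> dict:
    """One virtual user: issue the request and record the outcome."""
    start = time.perf_counter()
    try:
        body = search(query)
        return {"ok": True, "seconds": time.perf_counter() - start, "bytes": len(body)}
    except Exception:
        return {"ok": False, "seconds": time.perf_counter() - start, "bytes": 0}

def run_load_test(users: int, query: str) -> dict:
    """Fire `users` concurrent requests and summarize the results."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(one_user, [query] * users))
    passed = sum(r["ok"] for r in results)
    return {
        "users": users,
        "passed": passed,                                    # transaction summary
        "failed": users - passed,
        "total_bytes": sum(r["bytes"] for r in results),     # raw input for throughput
        "avg_seconds": sum(r["seconds"] for r in results) / users,
    }

summary = run_load_test(users=100, query="GE")
print(summary)
```

Swapping the stand-in `search` for a real request function would turn this into a crude load generator for the scenario described above.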

Hope you have understood the scenario.
Now, let’s say we wanted to performance test it.
In earlier days, performance testing was the most critical and toughest test to perform.
Why was that? Consider these reasons:
  • Heavy manual effort was needed.
  • It was very risky.
  • It was time-consuming.
  • It was expensive.
  • Even then, it rarely produced reliable results.
  • It required heavy maintenance.
  • And there are many more reasons besides.

Consider the following diagrams (the images are not reproduced here; only their captions survive):
(A) Manual effort for performance testing
(B) Automation effort for performance testing
(C) Industry-leading automated scalability and performance testing process, in general

With the help of automation tools such as LoadRunner, JMeter, Silk Performer, and Empirix e-Load/RSW, we have overcome all of the above obstacles.
Performance testing has thus become fast, accurate, and comfortable for the applications being tested.
There are several ways to performance test an application, such as load testing, stress testing, and capacity testing, but for now let’s group all of these under the single term “performance test.”

Let me explain the performance test process using the LoadRunner tool (for better understanding):
         Step 1: Planning the test. We develop a clearly defined test plan and load-testing objectives to drive the test scenarios.
         Step 2: Creating Vusers. We create Vuser scripts (test scripts) that contain the tasks performed on the application; these tasks are measured as transactions.
         Step 3: Creating the scenario. A scenario is the combination of load test scripts, Vusers, performance conditions, servers, and so on.
      A scenario defines the events that occur during each testing session.
For example, a scenario defines and controls the number of users to emulate, the actions they perform, and the machines on which the virtual users run their emulations.
         Step 4: Running the scenario.
We emulate load on the server by instructing multiple Vusers to perform tasks simultaneously. Before the test, we set the scenario configuration and schedule. We can run the entire scenario, Vuser groups, or individual Vusers.
         Step 5: Monitoring the scenario.
We monitor scenario execution using LoadRunner’s online monitors: runtime, transaction, system resource, Web resource, Web server resource, Web application server resource, database server resource, network delay, streaming media resource, firewall server resource, ERP server resource, and Java performance monitors.
         Step 6: Analyzing test results. During scenario execution, LoadRunner records the performance of the application under different loads. We use LoadRunner’s graphs and reports to analyze the application’s performance.
Note: Vusers = virtual users, which act like real users (human beings).
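As a rough sketch of the idea in Step 2, where tasks are measured as transactions, here is a hypothetical Python analogue of LoadRunner's lr_start_transaction / lr_end_transaction pair. The `vuser_script` and its timed sleeps are made up for illustration and stand in for real requests:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

# Collected transaction timings, keyed by transaction name.
timings = defaultdict(list)

@contextmanager
def transaction(name: str):
    """Time a named block of work and record the elapsed seconds,
    loosely mirroring lr_start_transaction / lr_end_transaction."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[name].append(time.perf_counter() - start)

def vuser_script():
    """A hypothetical Vuser script: each step is measured as a transaction."""
    with transaction("login"):
        time.sleep(0.01)   # stand-in for the real login request
    with transaction("search"):
        time.sleep(0.02)   # stand-in for the real search request

for _ in range(5):         # emulate five iterations of one Vuser
    vuser_script()

for name, samples in timings.items():
    print(f"{name}: {len(samples)} runs, avg {sum(samples) / len(samples):.3f}s")
```

The recorded per-transaction timings are exactly what the analysis phase (Step 6) aggregates into response-time graphs and reports.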

Why Performance Test?
Speed – Does the application respond quickly enough for the intended users?
Scalability – Will the application handle the expected user load and beyond?
Stability – Is the application stable under expected and unexpected user loads?
Performance parameters – How do the number of transactions, throughput, process and memory usage, threads, and so on behave under load?
Confidence – Are you sure that users will have a positive experience on go-live day?


Performance Indicators
• Resource utilization: The percentage of time a resource (CPU, memory, I/O, peripheral, network) is busy.
• Throughput: The average rate of successful message delivery over a communication channel, usually measured in bits per second (bit/s or bps).
• Response time: The time elapsed between a request and its reply; a measure of how responsive an application or subsystem is to a client request.
• Database access rate: The number of times the database is accessed by the web application over a given interval of time.
• Scalability: The ability of an application to handle additional workload, without adversely affecting performance, by adding resources such as processor, memory, and storage capacity.
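A minimal sketch of how some of these indicators might be computed from raw measurements. The sample values are made up for illustration, and the percentile uses the simple nearest-rank method often quoted in load-test reports:

```python
import math

def throughput_bps(total_bits: int, seconds: float) -> float:
    """Average rate of successful delivery, in bits per second."""
    return total_bits / seconds

def avg_response_time(samples: list) -> float:
    """Mean elapsed time between request and reply."""
    return sum(samples) / len(samples)

def resource_utilization(busy_seconds: float, window_seconds: float) -> float:
    """Percentage of the observation window a resource was busy."""
    return 100.0 * busy_seconds / window_seconds

def percentile(samples: list, p: float) -> float:
    """p-th percentile (nearest-rank); reports often quote the 90th."""
    ranked = sorted(samples)
    k = max(0, math.ceil(p / 100.0 * len(ranked)) - 1)
    return ranked[k]

samples = [0.12, 0.15, 0.11, 0.31, 0.14]     # response times in seconds (made up)
print(avg_response_time(samples))            # ≈ 0.166
print(percentile(samples, 90))               # 0.31
print(throughput_bps(8_000_000, 10.0))       # 800000.0 bps
print(resource_utilization(4.5, 10.0))       # 45.0 %
```

In practice a tool such as LoadRunner or JMeter computes these aggregates for you; the functions above just make the arithmetic behind the definitions explicit.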

There is much more to say about performance testing, but this should be enough to convey what a performance test is, how it is performed, the main types of performance tests, and the performance parameters that can be measured.
Any suggestions or questions are most welcome. Thank you.


"


Performance Testing:

Overview
This paper introduces Performance testing and describes how to perform Performance testing, and tools and techniques relevant to Performance testing.

Couple of definitions, explanations: what is Performance Testing?
  • In general, Performance testing is the process of testing the application to check if the application behaves well under the specified conditions (User loads, transactions, speed, process, memory, threads, hits, throughputs and so...)
  • The objective of a performance test is to demonstrate that the system meets requirements such as defined number of user loads, transactions, and throughput and response times as part of system/application performance.
  • Performance testing can be done for determining the speed or effectiveness of a computer, network, software program or device.
  • This process can involve quantitative tests done in a lab, such as measuring the response time or the number of MIPS (millions of instructions per second) at which a system functions. Qualitative attributes such as reliability, scalability and interoperability may also be evaluated. Performance testing is often done in conjunction with stress testing.
  • Performance testing can verify that a system meets the specifications claimed by its manufacturer or vendor.

Performance test approach:
Let’s consider a small scenario here.
Let’s say, for example…we are in search of “GE” company info on the Google.
If, all of us (let’s say 100 members of us) search the word “GE” at a time in Google, will the Google Server handle this load? (user load)
How many transactions (responses) are generated by the Google?
How many number of bytes (throughput) is generated by the Google at this point?
How many users got the results back and how many did not (transaction summary)
And so on….

Hope, you have understood the scenario.
Let’s say we wanted to performance test the above scenario;
Well, in olden days the performance testing was the most critical and toughest test to be performed
Why it was so?
Let’s look at these reasons.
ü  Manual effort is essentially needed.
ü  It was very risky.
ü  Time consuming
ü  Money consuming
ü  No appropriate results though
ü  Heavy maintenance
ü  And there are so many reasons to talk about as such.

Consider the following diagrams.
(A)              Manual effort for Performance Testing
(B)               Automation effort for Performance testing
(C)               Industry-leading automated scalability and performance testing process, in general



Manual effort for Performance Testing       
                                   
                          



Automation effort for Performance testing


            


Industry-leading automated scalability and performance testing process, in general



With the help of the automation tools, such as LoadRunner, JMeter, Silk Performer ,  Empirix e-Load/RSW …
So with such tools, we have overcome all of the above obstacles and now
And thus Performance test has become so easy, so fast, so accurate and so comfortable for the applications to be performance tested.
There are a set of ways, how we can performance test, like Load Test, Stress Test, Performance Test, Capacity test, and so..
But at this moment, let’s accumulate all of these tests and say “Performance test”’ at over all, for an application in general passion.

Let me explain this Performance test process considering LOADRUNNER tool (for better understanding):
         Step 1: Planning the test. Here, we develop a clearly defined test plan to ensure the test scenarios; we develop load-testing objectives. 
         Step 2: Creating Vusers. Here, we create Vuser scripts (Test scripts) that contain tasks performed on the application, and these tasks are measured as transactions. 
         Step 3: Creating the scenario. A scenario is the combination of Load test scripts, Vusers, Performance conditions, Servers and so…
      A scenario defines the events that occur during each testing session.
For example, a scenario defines and controls the number of users to emulate, the actions to be performed, and the servers/machines on which the virtual users run their emulations
         Step 4: Running the scenario.
We emulate load on the server by instructing multiple Vusers to perform tasks simultaneously. Before the testing, we set the scenario configuration and scheduling. We can run the entire scenario, Vuser groups, or individual Vusers. 
         Step 5: Monitoring the scenario.
We monitor scenario execution using the LoadRunner online runtime, transaction, system resource, Web resource, Web server resource, Web application server resource, database server resource, network delay, streaming media resource, firewall server resource, ERP server resource, and Java performance monitors. 
         Step 6: Analyzing test results. During scenario execution, LoadRunner records the performance of the application under different loads. We use LoadRunner’s graphs and reports to analyze the application’s
Note: VUsers=Virtual users who act as similar as real users (human beings)

Why Performance Test?
Speed - Does the application respond quickly enough for the intended users?
Scalability – Will the application handle the expected user load and beyond?
Stability – Is the application stable under expected and unexpected user loads?
Performance Parameters: Are the no# transactions, throughputs, process, memory, threads and so..
Confidence – Are you sure that users will have a positive experience on go-live day?


Performance Indicators
• Resource utilization: The percentage of time a resource (CPU, Memory, I/O, Peripheral, Network) is busy
• Throughput: Throughput is the average rate of successful message delivery over a communication channel.
The throughput is usually measured in bits per second (bit/s or bps)
• Response time: The time elapsed between a request and its reply.
It is a measure of how responsive an application or subsystem is to a client request.
• Database access rates: The number of times database is accessed by web application over a given interval of time
• Scalability: The ability of an application to handle additional workload, without adversely affecting performance, by adding resources such as processor, memory, and storage capacity

There is a lot to talk about this PERFORMANCE testing, but as of now, I feel this could be enough to make you understand what’s performance test is and how it is being performed and what are all the types and what are all the different performance parameters that can be tested as part of performance test.
Any suggestions, questions please you are most welcome. Thank you.


0 comments:

Post a Comment

Sunday

Performance Testing Overview



Performance Testing:

Overview
This paper introduces Performance testing and describes how to perform Performance testing, and tools and techniques relevant to Performance testing.

Couple of definitions, explanations: what is Performance Testing?
  • In general, Performance testing is the process of testing the application to check if the application behaves well under the specified conditions (User loads, transactions, speed, process, memory, threads, hits, throughputs and so...)
  • The objective of a performance test is to demonstrate that the system meets requirements such as defined number of user loads, transactions, and throughput and response times as part of system/application performance.
  • Performance testing can be done for determining the speed or effectiveness of a computer, network, software program or device.
  • This process can involve quantitative tests done in a lab, such as measuring the response time or the number of MIPS (millions of instructions per second) at which a system functions. Qualitative attributes such as reliability, scalability and interoperability may also be evaluated. Performance testing is often done in conjunction with stress testing.
  • Performance testing can verify that a system meets the specifications claimed by its manufacturer or vendor.

Performance test approach:
Let’s consider a small scenario here.
Let’s say, for example…we are in search of “GE” company info on the Google.
If, all of us (let’s say 100 members of us) search the word “GE” at a time in Google, will the Google Server handle this load? (user load)
How many transactions (responses) are generated by the Google?
How many number of bytes (throughput) is generated by the Google at this point?
How many users got the results back and how many did not (transaction summary)
And so on….

Hope, you have understood the scenario.
Let’s say we wanted to performance test the above scenario;
Well, in olden days the performance testing was the most critical and toughest test to be performed
Why it was so?
Let’s look at these reasons.
ü  Manual effort is essentially needed.
ü  It was very risky.
ü  Time consuming
ü  Money consuming
ü  No appropriate results though
ü  Heavy maintenance
ü  And there are so many reasons to talk about as such.

Consider the following diagrams.
(A)              Manual effort for Performance Testing
(B)               Automation effort for Performance testing
(C)               Industry-leading automated scalability and performance testing process, in general



Manual effort for Performance Testing       
                                   
                          



Automation effort for Performance testing


            


Industry-leading automated scalability and performance testing process, in general



With the help of the automation tools, such as LoadRunner, JMeter, Silk Performer ,  Empirix e-Load/RSW …
So with such tools, we have overcome all of the above obstacles and now
And thus Performance test has become so easy, so fast, so accurate and so comfortable for the applications to be performance tested.
There are a set of ways, how we can performance test, like Load Test, Stress Test, Performance Test, Capacity test, and so..
But at this moment, let’s accumulate all of these tests and say “Performance test”’ at over all, for an application in general passion.

Let me explain this Performance test process considering LOADRUNNER tool (for better understanding):
         Step 1: Planning the test. Here, we develop a clearly defined test plan to ensure the test scenarios; we develop load-testing objectives. 
         Step 2: Creating Vusers. Here, we create Vuser scripts (Test scripts) that contain tasks performed on the application, and these tasks are measured as transactions. 
         Step 3: Creating the scenario. A scenario is the combination of Load test scripts, Vusers, Performance conditions, Servers and so…
      A scenario defines the events that occur during each testing session.
For example, a scenario defines and controls the number of users to emulate, the actions to be performed, and the servers/machines on which the virtual users run their emulations
         Step 4: Running the scenario.
We emulate load on the server by instructing multiple Vusers to perform tasks simultaneously. Before the testing, we set the scenario configuration and scheduling. We can run the entire scenario, Vuser groups, or individual Vusers. 
         Step 5: Monitoring the scenario.
We monitor scenario execution using the LoadRunner online runtime, transaction, system resource, Web resource, Web server resource, Web application server resource, database server resource, network delay, streaming media resource, firewall server resource, ERP server resource, and Java performance monitors. 
         Step 6: Analyzing test results. During scenario execution, LoadRunner records the performance of the application under different loads. We use LoadRunner’s graphs and reports to analyze the application’s
Note: VUsers=Virtual users who act as similar as real users (human beings)

Why Performance Test?
Speed - Does the application respond quickly enough for the intended users?
Scalability – Will the application handle the expected user load and beyond?
Stability – Is the application stable under expected and unexpected user loads?
Performance Parameters: Are the no# transactions, throughputs, process, memory, threads and so..
Confidence – Are you sure that users will have a positive experience on go-live day?


Performance Indicators
• Resource utilization: The percentage of time a resource (CPU, Memory, I/O, Peripheral, Network) is busy
• Throughput: Throughput is the average rate of successful message delivery over a communication channel.
The throughput is usually measured in bits per second (bit/s or bps)
• Response time: The time elapsed between a request and its reply.
It is a measure of how responsive an application or subsystem is to a client request.
• Database access rates: The number of times database is accessed by web application over a given interval of time
• Scalability: The ability of an application to handle additional workload, without adversely affecting performance, by adding resources such as processor, memory, and storage capacity

There is a lot to talk about this PERFORMANCE testing, but as of now, I feel this could be enough to make you understand what’s performance test is and how it is being performed and what are all the types and what are all the different performance parameters that can be tested as part of performance test.
Any suggestions, questions please you are most welcome. Thank you.


No comments:

Post a Comment