JMeter Overview

Agenda
What is Performance Testing
Why Do Performance Testing
Types of Performance Testing
Overview of JMeter
Prerequisites for JMeter
Terminologies Used in JMeter
Creating Test Plans
Recording the Test Plans
Executing the Test Plans
Results and Report Preparation
HTTP Request - Response Codes
Questions

What Is Performance Testing?
Performance testing determines how fast a system performs under a particular workload, validating quality attributes of the application under test (AUT) such as scalability, reliability, and resource usage. It explores several system qualities that can be simplified to:
Speed - does the system respond quickly enough?
Capacity - is the infrastructure sized satisfactorily?
Scalability - can the system grow to handle future volumes?
Stability - does the system behave correctly under load?

Why Do Performance Testing?
Functional testing may not detect defects that only appear under load. Performance testing answers questions such as:
Does the application respond quickly enough for the intended users?
Will the application handle the expected user load and beyond?
Will the application handle the number of transactions required by the business?
Is the application stable under expected and unexpected user loads?
It also helps you to:
Obtain an accurate picture of end-to-end system performance before going live.
Learn how well the system can handle future growth.
Decide whether the application requires a performance improvement or a hardware upgrade prior to release.
Avoid revenue losses or damaged brand credibility due to scalability or stability issues.
Avoid customer dissatisfaction with application response time.
Analyze the behavior of the application at various load levels.
Identify bottlenecks in the application.

Overview of JMeter
Apache JMeter is open-source software: a 100% pure Java desktop application designed to load-test functional behavior and measure performance. It was originally designed for testing web applications but has since expanded to other test functions.

What a Performance Tester Must Know
What is the type of communication between the client and the server?
Which requirements to monitor: transaction response time and/or server utilization (SLA)?
What is the AUT architecture, and how much of the application runs on the client side?
Are client-side activities involved anywhere in the application?
What is the authentication mechanism?
Does the application allow multiple logins by a single user?
How does the application maintain a session?
Environment specifications, including software and hardware.
How many users (volumetrics) is the client expecting?
How does the load balancer distribute the load?
What parameters are sent to the server?

Key Types of Performance Testing
Performance test: a technical investigation done to determine or validate the response time, speed, scalability, and stability characteristics of the product under test. It focuses on determining whether users of the system will be satisfied with the performance characteristics of the application.
It identifies mismatches between performance-related expectations and reality, and supports tuning, capacity planning, and optimization efforts.
Purpose: to determine or validate speed, scalability, and/or stability.

Load test: a load test enables you to measure response times, throughput rates, and resource-utilization levels, and to identify your application's breaking point, assuming that the breaking point occurs below the peak load condition. Performance objectives are often specified in a service level agreement (SLA).
Purpose: to verify application behavior under normal and peak load conditions. A load test:
Determines the sufficiency of a hardware environment.
Evaluates the adequacy of a load balancer.
Detects concurrency issues.
Determines how many users the application can handle.
Detects functionality errors under load.

Stress test: stress testing enables you to identify your application's weak points and shows how the application behaves under extreme load conditions.
Purpose: to determine or validate an application's behavior when it is pushed beyond normal or peak load conditions. A stress test:
Determines the side effects of common hardware or supporting-application failures.
Determines whether data can be corrupted by overstressing the system.
Helps determine which kinds of failures are most valuable to plan for.
Challenges: because stress tests are unrealistic by design, some stakeholders may dismiss the results, and it is often difficult to know how much stress is worth applying.

Capacity test: capacity testing helps you identify a scaling strategy, i.e. whether you should scale up or scale out. Capacity testing is conducted in conjunction with capacity planning, which you use to plan for future growth, such as an increased user base or an increased volume of data.
For example, to accommodate future loads, you need to know how many additional resources (such as processor capacity, memory, disk capacity, or network bandwidth) are necessary to support future usage levels.

Soak Test / Endurance Test
Soak testing involves running a system under a significant load for an extended period of time, to discover how the system behaves under sustained use. For example, a system may behave exactly as expected when tested for 1 hour; however, when tested for 3 hours, problems such as memory leaks can cause it to fail or behave erratically.

Prerequisites of JMeter
1. The latest version of Java should be installed.
2. In the system properties, the following environment variables should be set:
JAVA_HOME = "installed Java path"
JDK_HOME = "installed Java path"
CATALINA_HOME = "installed path of the Tomcat server"
3. After the Java installation completes, JMeter can be opened by double-clicking "jmeter.bat" in the installation's bin directory, for example "D:\mahesh\Softwares\jakarta-jmeter-2.4\bin".

Terminologies Used in JMeter
Number of Threads: the number of virtual users.
Thread Group: the element where the number of threads, the ramp-up period, and the loop count are specified for a test run.
Ramp-Up Period: the time JMeter takes to start all of the threads. If we set a 10-second ramp-up period for 5 threads, JMeter will take 10 seconds to start those 5 threads (one new thread every 2 seconds). Setting the value to 0 starts all threads at once.
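The ramp-up arithmetic above can be sketched in a few lines of Python. This is a hypothetical helper for reasoning about thread start times, not part of JMeter itself; it assumes JMeter's even spacing of thread starts across the ramp-up window.

```python
def thread_start_offsets(num_threads: int, ramp_up_seconds: float) -> list[float]:
    """Seconds after test start at which each virtual user (thread) begins.

    Mirrors the slide's example: a ramp-up of 0 starts every thread at once;
    otherwise threads are spaced evenly, ramp_up / num_threads apart.
    """
    if num_threads <= 0:
        return []
    if ramp_up_seconds <= 0:
        return [0.0] * num_threads
    step = ramp_up_seconds / num_threads
    return [i * step for i in range(num_threads)]

# The slide's example: 5 threads over a 10-second ramp-up.
print(thread_start_offsets(5, 10))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```

With 5 threads and a 10-second ramp-up, a new thread starts every 2 seconds, so all 5 are running by the end of the 10-second window.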
Loop Count: specifies how many times the test is to be repeated.
HTTP Request Defaults: the element where the host name or IP address of the server under test, its port number, and the protocol name (HTTP) are saved. Please refer to the figure below.
Throughput: the number of requests handled in a particular time period; normally reported in requests/minute.
Sampler: the element where all recorded requests are saved. Please refer to the marked areas in the figure below.
Listeners: the elements where results are shown. Here we use only three types of listeners.
Graph Results Listener: using this listener, we can read the throughput value. The throughput from this listener is normally in requests/minute, which we have to convert to requests/second. Please refer to the figure below.

Creating a Test Plan
Open JMeter by clicking on the batch file. JMeter initially starts with a Test Plan and a WorkBench (see the figure below).
Steps to create the Test Plan in JMeter:
a. Add a Thread Group by right-clicking on the Test Plan.
b. Right-click on the Thread Group to add Logic Controllers, Listeners, Pre-Processors, and Post-Processors.

HTTP Proxy Server
This is the element where we click the Start button to make JMeter record the scenario. Here we set the port number for our proxy server (normally 8080) through which requests and responses pass. We can also specify URL patterns to include or exclude while recording. Example URL patterns:
.* - all URLs
.*\.png - PNG images
.*\.gif - GIF images
.*\.jpg - JPEG images
.*\.php
.*\.jsp
.*\.html
.*\.htm
.*\.js
Generally, the .png pattern is used under 'URL Patterns to Exclude',
since PNG files are stored in the browser's temporary internet files: when the page is reloaded, requests for these files are not sent again and they are loaded from the cache. HTML patterns are used under 'URL Patterns to Include'. Please refer to the figure below.
Changes to be made in the WorkBench (JMeter):
Right-click on the WorkBench.
Add an HTTP Proxy Server to the WorkBench.
Enter the same port number in the HTTP Proxy Server as in the proxy settings of the IE browser.
Select the Target Controller for storing the HTTP requests.
Enter the URL Patterns to Include and/or the URL Patterns to Exclude (e.g. .*\.jsp, .*\.png).
Click on the Start button; recording starts from here.
Launch the application URL in IE.
Note: the HTTP Proxy Server should be kept under the WorkBench, as shown in the figure above. The WorkBench, and hence the HTTP Proxy Server, is not saved with the test plan.

Recording the Test Plan
Before recording, some proxy settings are required in the browser.
How to change the proxy settings in the IE browser:
a. Open the IE browser.
b. Click on the Tools menu > Internet Options > Connections > LAN Settings.
c. Change the proxy server address to "localhost" and the port to 8080 or 9090.
d. Click on the OK button.
Please see the figure below for details.
Note: the proxy server settings are different in every browser.

Configuring the Test Plan
Before executing the recorded scripts, we need to ensure that the scripts were recorded properly.
Configuring the recorded Test Plan:
Check whether any unwanted requests were recorded or added in the Test Plan.
Add variables wherever required by using User Parameters, CSV Data Set Config, and User Defined Variables.
Save the Test Plan after recording completes.
Settings required in the recorded Test Plan:
Enter the No.
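The include/exclude filtering described above can be sketched in Python. This is an illustrative model of the recorder's behavior, not JMeter code; the pattern lists are examples taken from the slides, and the exact precedence rules are an assumption (excludes win over includes; with no include patterns, everything not excluded is recorded).

```python
import re

# Example URL patterns from the HTTP Proxy Server slide. These are
# Java-style regexes, but such simple ones behave the same under Python's re.
EXCLUDE = [r".*\.png", r".*\.gif", r".*\.jpg"]
INCLUDE = [r".*\.html", r".*\.jsp"]

def should_record(url: str) -> bool:
    """Sketch of the recorder's include/exclude filtering for one URL."""
    if any(re.fullmatch(p, url) for p in EXCLUDE):
        return False  # excluded patterns (e.g. cached images) are never recorded
    # With no include patterns, record everything that was not excluded.
    return not INCLUDE or any(re.fullmatch(p, url) for p in INCLUDE)

print(should_record("http://example.com/index.html"))  # True
print(should_record("http://example.com/logo.png"))    # False
```

This shows why image patterns go under 'URL Patterns to Exclude': a page reload serves them from the browser cache, so recording them would only add noise to the test plan.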
of Threads (virtual users).
Enter the Ramp-Up period (in seconds) if required.
Enter the Loop Count if required.
Enter the start time and end time by adding a Scheduler if required.

Execute/Run the Test Plan
Make sure that the Test Plan is ready for execution.
Once the Test Plan is ready, go to the Run menu and click on Start.
Watch the Listeners to check whether the requests are processed successfully by the server.
After the execution completes, check that all the user data was applied for the appropriate users.
Save the test results.

Results - Report Preparation
Here we mainly track two values for the results: throughput and response time.
Throughput: using the Graph Results listener we get the throughput value, i.e. requests per minute.
Response Time: using the Aggregate Graph listener, read the response time value in the 90% Line column. Response time is shown in milliseconds (ms).

HTTP Request - Response Codes
Various HTTP request-response codes can be returned by the application. Below are some of the common ones: the first group are success codes for the request, the 4xx codes are client-side errors, and 500 is a server-side error.
200 - OK
201 - Created
202 - Accepted
400 - Bad Request
401 - Unauthorized
403 - Forbidden
404 - Not Found
410 - Gone
500 - Internal Server Error

Questions
If you have any doubts regarding JMeter, Google is your best friend for finding solutions. You can learn JMeter through the following website: http://jakarta.apache.org/jmeter/
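The report-preparation steps above involve two small pieces of arithmetic that are easy to get wrong by hand: converting the Graph Results throughput from requests/minute to requests/second, and grouping status codes by severity. A minimal Python sketch (hypothetical helpers, not part of JMeter):

```python
def to_requests_per_second(throughput_per_minute: float) -> float:
    """Convert Graph Results throughput (requests/minute) to requests/second."""
    return throughput_per_minute / 60.0

def severity(status: int) -> str:
    """Group a status code the way the slide's colour coding does."""
    if 200 <= status < 300:
        return "success"       # e.g. 200, 201, 202
    if 400 <= status < 500:
        return "client error"  # e.g. 400, 401, 403, 404, 410
    if 500 <= status < 600:
        return "server error"  # e.g. 500
    return "other"

print(to_requests_per_second(300))  # 5.0
print(severity(404))                # client error
```

For example, a Graph Results throughput of 300 requests/minute corresponds to 5 requests/second in the report.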