
Performance Testing

JMeter Tutorial for Beginners: Load Testing from Scratch

QA Knowledge Hub·2026-04-10·8 min read

JMeter is Apache's open-source load testing tool. It simulates hundreds or thousands of virtual users hitting your application simultaneously and measures how it responds. If you have never done performance testing, this is the right starting point.

What JMeter Tests

JMeter answers questions like:

  • "Can our checkout handle 500 simultaneous users?"
  • "What is the average response time under normal load?"
  • "At what point does the system start failing?"
  • "Does performance degrade after 2 hours of sustained traffic?"

These are questions unit and functional tests cannot answer. They require a tool that generates concurrent load.

Installation

JMeter requires Java. Install JDK 17+ first (from adoptium.net).

Download JMeter:

  1. Go to jmeter.apache.org/download_jmeter.cgi
  2. Download the .zip (Windows/Mac) or .tgz (Linux) archive
  3. Extract to a folder (e.g., C:\jmeter or /opt/jmeter)

Start JMeter:

  • Windows: Double-click bin/jmeter.bat
  • Mac/Linux: Run bin/jmeter.sh

The JMeter GUI opens. For running actual tests, you will use the CLI — but the GUI is essential for building test plans.

JMeter Core Concepts

Test Plan

The root container for everything in your test. All other elements sit inside it.

Thread Group

Defines your virtual users:

  • Number of Threads (users): How many concurrent users
  • Ramp-Up Period: How long to gradually bring all users to full load (seconds)
  • Loop Count: How many times each user runs through the test
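
The ramp-up arithmetic is worth making explicit: JMeter starts one new thread roughly every ramp-up ÷ threads seconds. A quick sketch in shell, using the example values from later in this guide:

```shell
#!/usr/bin/env bash
# Example values: 10 users brought on over a 5-second ramp-up.
users=10
ramp_up=5

# JMeter starts one new thread roughly every (ramp_up / users) seconds.
interval=$(awk -v r="$ramp_up" -v u="$users" 'BEGIN { printf "%.1f", r / u }')
echo "One new user every ${interval}s"
# prints: One new user every 0.5s
```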

Samplers

The actual requests. The HTTP Request Sampler makes HTTP/HTTPS calls. This is the core of every web performance test.

Listeners

Components that collect and display results: Summary Report, View Results Tree, Response Time Graph.

Assertions

Validation rules — verify the response contains expected data or status codes.

Config Elements

Reusable configuration: HTTP Header Manager (add headers), HTTP Cookie Manager, CSV Data Set Config (load test data from a file).

Your First Load Test

We will load test the JSONPlaceholder API.

Step 1: Create Thread Group

  1. Right-click Test Plan → Add → Threads (Users) → Thread Group
  2. Set:
    • Number of Threads: 10
    • Ramp-Up Period: 5 (bring 10 users on over 5 seconds)
    • Loop Count: 3 (each user runs the test 3 times = 30 total requests)

Step 2: Add HTTP Request Sampler

  1. Right-click Thread Group → Add → Sampler → HTTP Request
  2. Set:
    • Protocol: https
    • Server Name: jsonplaceholder.typicode.com
    • Method: GET
    • Path: /users
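
Before running the full plan, it can help to sanity-check the sampler outside JMeter. The Protocol, Server Name, and Path fields combine into a single URL, which you can hit with curl (the curl line is commented out since it needs network access):

```shell
#!/usr/bin/env bash
# The sampler's Protocol, Server Name, and Path map directly onto one URL.
protocol="https"
server="jsonplaceholder.typicode.com"
path="/users"

url="${protocol}://${server}${path}"
echo "GET ${url}"

# Uncomment to actually send the request (requires network access):
# curl -s -X GET "$url" -H "Accept: application/json"
```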

Step 3: Add HTTP Header Manager

  1. Right-click Thread Group → Add → Config Element → HTTP Header Manager
  2. Add header: Accept = application/json (for requests with a body, such as POST, you would add Content-Type instead)

Step 4: Add Response Assertion

  1. Right-click the HTTP Request → Add → Assertions → Response Assertion
  2. Apply to: Main sample only
  3. Field to test: Response Code
  4. Pattern: 200

Step 5: Add Summary Report Listener

  1. Right-click Thread Group → Add → Listener → Summary Report
  2. Also add: View Results Tree (for debugging individual requests)

Step 6: Run the Test

Click the green Play button. Switch to the Summary Report tab to watch results populate in real time.

Understanding the Summary Report

| Column | Meaning |
|---|---|
| # Samples | Total number of requests sent |
| Average | Mean response time in milliseconds |
| Min | Fastest response time |
| Max | Slowest response time |
| Std Dev | How much response times vary (lower = more consistent) |
| Error % | Percentage of requests that failed |
| Throughput | Requests per second the server handled |
| Avg Bytes | Average response size |

What good looks like:

  • Error %: 0% (no failures)
  • Average response: under 500ms for most APIs
  • Throughput: consistent across the test duration

Red flags:

  • Error % above 0% — the server is failing under load
  • Average response time increasing over time — memory leak or resource exhaustion
  • Max response time much higher than average — inconsistent performance (spiking)
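
These red-flag checks can also be automated against the raw results log. A sketch that computes Error % from a .jtl CSV with awk; the sample data below is made up, and the columns are trimmed to the ones we need (a real JMeter .jtl has more columns, so adjust the field index accordingly):

```shell
#!/usr/bin/env bash
# Tiny simplified sample of a JMeter CSV results log:
# timeStamp, elapsed, label, responseCode, success.
cat > sample.jtl <<'EOF'
timeStamp,elapsed,label,responseCode,success
1700000000000,120,HTTP Request,200,true
1700000000100,135,HTTP Request,200,true
1700000000200,980,HTTP Request,500,false
1700000000300,110,HTTP Request,200,true
EOF

# Error % = failed samples / total samples * 100 (skip the header row).
error_pct=$(awk -F, 'NR > 1 { total++; if ($5 == "false") failed++ }
                     END { printf "%.1f", failed / total * 100 }' sample.jtl)
echo "Error %: ${error_pct}"
# prints: Error %: 25.0
```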

A More Realistic Test Scenario

Real load tests simulate multiple user journeys, not just one endpoint.

Adding Multiple HTTP Requests

Add a second HTTP Request sampler for /posts inside the same Thread Group:

  1. Right-click Thread Group → Add → Sampler → HTTP Request
  2. Server: jsonplaceholder.typicode.com, Path: /posts/1

Using a CSV for Test Data

Instead of hardcoding user IDs, load them from a file.

Create user_ids.csv:

userId
1
2
3
4
5
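
You can create this file from the shell in one step:

```shell
#!/usr/bin/env bash
# Generate user_ids.csv: a header row plus the IDs 1-5.
{
  echo "userId"
  seq 1 5
} > user_ids.csv

cat user_ids.csv
```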

Add CSV Data Set Config:

  1. Right-click Thread Group → Add → Config Element → CSV Data Set Config
  2. Filename: path to your CSV
  3. Variable Names: userId
  4. Ignore first line: True (the file has a header row; without this, "userId" itself would be used as a value)
  5. Recycle on EOF: True
  6. Stop thread on EOF: False

Update your HTTP Request path to /users/${userId}.

Now each virtual user gets a different user ID from the CSV.

Running JMeter from the Command Line (CI Mode)

The GUI is for building tests. For actual load testing runs — especially in CI — use the command line:

# Run test plan, save results to CSV, generate HTML report
jmeter -n -t test-plan.jmx -l results.jtl -e -o report/

# Options:
# -n = non-GUI (headless) mode
# -t = test plan file
# -l = log results to file
# -e = generate HTML report after the test
# -o = output folder for the HTML report (must be empty or not yet exist)

The HTML report contains response time graphs, percentile charts, and throughput over time — everything you need for a performance test report.

Parameterising Thread Count (for Different Scenarios)

Pass user count at runtime without editing the test plan:

# Run with 50 users and 30-second ramp-up
jmeter -n -t test-plan.jmx -l results.jtl \
  -Jusers=50 \
  -JrampUp=30

In the Thread Group, use ${__P(users,10)} for the thread count and ${__P(rampUp,5)} for the ramp-up period. The second argument is the default used when no -J value is passed.
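
With those property defaults in place, a small wrapper script can run the same plan under different scenarios. This sketch only prints the commands (a dry run), since the plan file name and property names here are assumptions from this guide:

```shell
#!/usr/bin/env bash
# Build (but don't execute) one jmeter command per scenario.
# Scenario format: "name:users:rampUp" — values are illustrative.
scenarios="baseline:10:0 load:500:60"

for s in $scenarios; do
  name=${s%%:*};  rest=${s#*:}
  users=${rest%%:*}; ramp=${rest#*:}
  cmd="jmeter -n -t test-plan.jmx -l ${name}.jtl -Jusers=${users} -JrampUp=${ramp}"
  echo "$cmd"
done
```

Remove the echo (or pipe the output to a shell) once you are happy with the generated commands.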

Common Load Testing Scenarios

Scenario 1: Baseline Performance Test

Goal: Establish normal performance metrics before any changes.

Setup:

  • 10 concurrent users
  • 0-second ramp-up (all at once)
  • Run for 5 minutes

Record: Average response time, 95th percentile, throughput.

Scenario 2: Load Test to Requirements

Goal: Verify the system meets a stated SLA (e.g., "handles 500 users with <2 second response time").

Setup:

  • 500 concurrent users
  • 60-second ramp-up
  • Run for 10 minutes

Pass criteria: Error % = 0%, Average response < 2000ms.

Scenario 3: Stress Test — Finding the Breaking Point

Goal: Find at what load the system starts failing.

Setup:

  • Start with 100 users
  • Add 100 users every 5 minutes
  • Keep increasing until error rate climbs above 5%

Record the user count at which the system degrades. This is your capacity limit.
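
The stepping schedule above can be expressed as a simple loop. This sketch prints each step rather than invoking JMeter (the plan file name and the 500-user ceiling are illustrative):

```shell
#!/usr/bin/env bash
# Stress-test steps: start at 100 users, add 100 per step,
# up to an illustrative ceiling of 500.
start=100; step=100; max=500

users=$start
while [ "$users" -le "$max" ]; do
  echo "Step: run 5 minutes at ${users} users"
  # e.g. jmeter -n -t test-plan.jmx -l "stress-${users}.jtl" -Jusers="$users"
  users=$((users + step))
done
```

In practice you would check the Error % of each step's results file and stop once it climbs above 5%.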

Scenario 4: Soak Test — Endurance Under Load

Goal: Check for memory leaks and resource exhaustion over time.

Setup:

  • 200 concurrent users (normal expected load)
  • Run for 2–4 hours

What to watch: Does response time gradually increase? Does throughput gradually decrease? If yes, there is a resource leak (memory, database connections, file handles).

Performance Test KPIs

When reporting performance test results, use these key metrics:

| Metric | Target (typical) | How to Measure |
|---|---|---|
| Average Response Time | < 500ms (API), < 2s (UI pages) | Summary Report → Average |
| 95th Percentile (P95) | < 2000ms | Summary Report / Aggregate Report |
| Error Rate | 0% under normal load | Summary Report → Error % |
| Throughput | As per load requirement | Summary Report → Throughput |
| CPU Usage | < 70% at peak load | Server monitoring |
| Memory Usage | Stable (no upward trend) | Server monitoring |

P95 means 95% of requests completed within that time. It is more useful than average because it shows what most users actually experience.
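
Percentiles are easy to compute from raw response times. A sketch of the nearest-rank method using sort and awk on made-up numbers (JMeter's Aggregate Report calculates this for you):

```shell
#!/usr/bin/env bash
# 20 made-up response times in ms, including one slow 5-second outlier.
times="110 120 105 130 115 125 140 135 150 145 160 155 170 165 180 175 190 185 200 5000"

# Nearest-rank P95: sort ascending, take the value at rank ceil(0.95 * n).
p95=$(printf '%s\n' $times | sort -n | awk '
  { v[NR] = $1 }
  END { idx = int(0.95 * NR + 0.999999); print v[idx] }')
echo "P95: ${p95}ms"
# prints: P95: 200ms
```

Note how the 5000ms outlier drags the average up to roughly 390ms, while the P95 of 200ms describes what 95% of users actually saw.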

JMeter Interview Questions

Q1: What is the difference between load testing and stress testing?

Load testing verifies performance under expected user volumes — can it handle our normal peak traffic? Stress testing pushes beyond normal limits to find the breaking point — at what user count does it fail?

Q2: What is a ramp-up period and why is it important?

Ramp-up gradually increases virtual users to the target count over a defined time. Without ramp-up, all users hit the server simultaneously — an unrealistic spike. Ramp-up simulates organic traffic growth.

Q3: What is the 90th percentile (P90)?

90% of requests completed within this response time. Better than average for understanding user experience — a 500ms average could hide occasional 5-second responses that affect 10% of users. P90 shows the experience of most users more honestly.

Q4: What would you do if you see a steady increase in response time during a soak test?

This indicates resource exhaustion — likely a memory leak, database connection pool growing, or thread accumulation. I would capture heap dumps, check connection pool size, and review server monitoring charts for the specific resource that is growing.

Summary

JMeter is the standard tool for load testing in Java-heavy and general QA environments. The basics — Thread Group, HTTP Request, Summary Report — cover 80% of real-world performance tests.

Your next steps:

  1. Install JMeter and run the example test from this guide against JSONPlaceholder
  2. Generate the HTML report and review the charts
  3. Try a stress test by increasing users until you see error rates climb
  4. Add JMeter to your resume skills once you have run 2–3 test scenarios

Performance testing is a specialisation that relatively few QA engineers develop. Learning it makes you significantly more valuable than a tester who only does functional testing.

