What Is Latency?
Latency is the delay between sending a request and receiving a response. It is independent of download speed: a 1 Gbps connection can have terrible latency, and a 10 Mbps connection can have excellent latency.
Under 20 ms — Excellent
Indistinguishable from local. Ideal for competitive gaming, live trading, and remote surgery.
20–50 ms — Good
Suitable for all online activities including competitive gaming and HD video calls.
50–100 ms — Acceptable
Fine for casual gaming, browsing, and standard video calls.
Above 150 ms — Noticeable
Perceptible delay in gaming and voice calls. Above 300 ms, conversations become difficult.
Latency vs ping — are they the same thing?
In everyday internet usage, latency and ping are used interchangeably and mean essentially the same thing: the round-trip time (RTT) in milliseconds for a small packet to travel from your device to a server and back. Technically, latency refers to the one-way delay, while ping (or RTT) refers to the full round trip — but consumer tools including Speedtest.now report round-trip time and label it as both latency and ping.
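When ICMP ping is unavailable, RTT can be approximated by timing a TCP handshake, which costs exactly one round trip. A minimal Python sketch (the host and port in the example are placeholders, not a real test endpoint):

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Estimate round-trip time by timing a TCP three-way handshake."""
    start = time.perf_counter()
    # create_connection() returns once the SYN / SYN-ACK / ACK exchange
    # completes, which takes one full round trip to the server
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

# Example (hypothetical host): tcp_rtt_ms("speedtest.example", 443)
```

This measures connection setup only, so it slightly overstates raw network RTT on loaded servers, but it tracks ping closely enough for comparisons.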
The distinction that actually matters is latency vs jitter. Latency is the average delay. Jitter is how much that delay varies from packet to packet. A connection with 40 ms latency and 2 ms jitter is far more usable than one with 15 ms latency and 40 ms jitter. Learn about jitter →
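Jitter can be computed from a series of RTT samples. The sketch below uses the mean absolute difference between consecutive samples, a common simplification of the RFC 3550 estimator; this is an assumption, not necessarily how any particular test tool computes it:

```python
import statistics

def summarize_latency(samples_ms: list[float]) -> dict[str, float]:
    """Summarise a series of RTT samples in milliseconds.

    Jitter here is the mean absolute difference between consecutive
    samples, a simplified form of the RFC 3550 interarrival jitter.
    """
    deltas = [abs(b - a) for a, b in zip(samples_ms, samples_ms[1:])]
    return {
        "avg_latency": statistics.mean(samples_ms),
        "jitter": statistics.mean(deltas) if deltas else 0.0,
    }

# A connection bouncing between 40 ms and 42 ms:
# summarize_latency([40.0, 42.0, 40.0, 42.0])
# → {"avg_latency": 41.0, "jitter": 2.0}
```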
What latency measures — inside the network
When you send a request to a web server, the data travels through multiple stages. Each adds a small delay:
| Stage | Typical delay | What drives it |
|---|---|---|
| Wi-Fi radio transmission | 2–15 ms | Channel contention, signal strength, interference |
| Router processing | 0.5–5 ms | Router CPU speed, NAT table size, firmware |
| ISP network traversal | 1–20 ms | Number of hops, congestion, routing quality |
| Physical distance (propagation) | ~0.5 ms per 100 km | Speed of light in fibre (~200,000 km/s) |
| Server processing | 0.1–10 ms | Server load, query complexity |
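Summing mid-range values from each stage shows where a typical round trip's time goes. The numbers below are illustrative assumptions picked from the table's ranges, not measurements:

```python
# Illustrative one-way delays per stage (assumed mid-range values)
stage_delays_ms = {
    "wifi_radio": 8.0,    # within the 2–15 ms range
    "router": 2.0,        # within the 0.5–5 ms range
    "isp_network": 10.0,  # within the 1–20 ms range
    "propagation": 0.5,   # ~100 km to a nearby test server
    "server": 1.0,        # within the 0.1–10 ms range
}

one_way_ms = sum(stage_delays_ms.values())  # 21.5 ms
rtt_ms = 2 * one_way_ms                     # request + response: 43 ms
```

Note that Wi-Fi and the ISP path dominate; this is why switching to Ethernet or a better route moves the needle far more than a faster router CPU.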
The physics floor — why latency can't go below a certain point
Data travels through fibre optic cables at roughly two-thirds the speed of light, approximately 200,000 km per second. This means there's an absolute minimum latency based purely on geographic distance. London and New York are roughly 5,600 km apart, so you cannot send a packet through fibre and receive a reply in under ~56 ms no matter how fast your connection is, and real cable routes are longer than the straight-line path. No amount of infrastructure upgrade can overcome this physics limit.
This is why latency to a nearby server (50 km away) can be 1–3 ms, while latency to a server on the other side of the planet is 150–300 ms even on a perfect network. Speedtest.now selects the geographically nearest server to minimise this factor when measuring your baseline latency.
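The physics floor is a one-line formula: distance divided by signal speed, doubled for the return trip. A sketch, assuming the ~200,000 km/s fibre speed from above and a cable that follows the straight-line path (real routes are longer, so measured latency always exceeds this floor):

```python
def propagation_floor_ms(distance_km: float, speed_km_s: float = 200_000) -> float:
    """Minimum round-trip time imposed by signal propagation alone.

    Assumes ~2/3 c in fibre and a great-circle cable path; real-world
    routing, queueing, and processing all add to this floor.
    """
    return 2 * distance_km / speed_km_s * 1000.0

# London–New York, roughly 5,600 km:
# propagation_floor_ms(5600) → 56.0 ms minimum round trip
# A nearby server 50 km away:
# propagation_floor_ms(50)   → 0.5 ms (the rest of a 1–3 ms ping
#                               is Wi-Fi, router, and server time)
```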
Latency vs bandwidth — why they're not the same
Bandwidth (download/upload speed) is how much data can pass through the pipe per second. Latency is how long it takes for a packet to make one trip through that pipe. They are independent measurements.
A garden hose can have high bandwidth (a large diameter) but still take a long time for water to reach the end. A fibre optic strand the width of a hair carries data at roughly two-thirds the speed of light, but if the server is 10,000 km away, the one-way trip still takes 50 ms.
The practical implication: for video streaming and downloads, bandwidth matters most. For online gaming, video calls, and any interactive application, latency matters most. You can stream 4K Netflix on a 100 ms connection without issue. You will struggle to play competitive FPS games on a 100 ms connection regardless of how fast your download speed is.
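The split between latency-bound and bandwidth-bound workloads falls out of a simple lower-bound model: one round trip to request the data, plus the time to push the bytes through the pipe. A sketch that ignores TCP slow start, handshakes, and protocol overhead (a deliberate simplification):

```python
def transfer_time_ms(size_bytes: int, bandwidth_mbps: float, rtt_ms: float) -> float:
    """Lower-bound fetch time: one RTT to request the payload, plus
    serialisation time at the link's bandwidth. Ignores TCP slow start
    and protocol overhead, so real transfers take longer.
    """
    serialisation_ms = size_bytes * 8 / (bandwidth_mbps * 1_000_000) * 1000.0
    return rtt_ms + serialisation_ms

# A 10 kB web request on a 1 Gbps link with 100 ms RTT:
#   transfer_time_ms(10_000, 1000, 100) → 100.08 ms (latency-bound)
# A 1 GB download on the same link:
#   transfer_time_ms(1_000_000_000, 1000, 100) → 8100 ms (bandwidth-bound)
```

For the small request, 99.9% of the wait is latency; for the large download, bandwidth dominates. That is the whole streaming-vs-gaming distinction in one formula.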
What affects latency on your home network
Wi-Fi vs Ethernet
Switching from Wi-Fi to Ethernet typically reduces latency by 5–20 ms and dramatically reduces jitter. Wired connections have deterministic latency; wireless connections have variable latency due to radio channel contention. Ethernet vs Wi-Fi latency comparison →
Buffer bloat
Buffer bloat occurs when your router or modem has an oversized queue buffer. When the connection is under load (e.g. someone downloading a large file), packets queue in this buffer and wait. This can add 50–200 ms of latency to all other traffic while a download is happening. What is buffer bloat and how to fix it →
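Buffer bloat shows up as the gap between idle latency and latency while the line is saturated. A grading sketch in that spirit; the letter thresholds below are illustrative assumptions loosely modelled on popular buffer bloat tests, not a standard scale:

```python
def bufferbloat_grade(idle_rtt_ms: float, loaded_rtt_ms: float) -> str:
    """Grade the latency added while the connection is under load.

    Thresholds are illustrative, not an official scale: measure RTT
    idle, then again during a sustained download, and compare.
    """
    added = loaded_rtt_ms - idle_rtt_ms
    if added < 5:
        return "A"   # negligible queueing under load
    if added < 30:
        return "B"   # mild buffer bloat
    if added < 60:
        return "C"   # noticeable in calls and games
    return "D"       # severe: consider smart queue management (SQM)

# 20 ms idle but 150 ms during a large download:
# bufferbloat_grade(20, 150) → "D"
```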
Router hardware age
Older consumer routers process each packet through their CPU. A router with a slow CPU adds measurable processing delay to every packet, especially under load. A router that was adequate for a 100 Mbps plan may add noticeable latency on a 500 Mbps plan because its CPU is overwhelmed.
ISP and routing path
Your ISP's routing decisions affect how many hops your data takes to reach its destination. Poor routing — where your ISP sends traffic on a longer path than necessary — can add 20–50 ms of avoidable latency. This is most visible when gaming on foreign servers or connecting to services hosted in geographically distant data centres.
How to measure and improve your latency
Run the Latency Test to see your current round-trip time to the nearest test server. For a detailed view including minimum, maximum, and P95 latency, use the Ping Test.
To improve latency: 8 step-by-step fixes to lower your latency →