The Case For Latency, Part 1

Feb 6, 2020 | IT Infrastructure

I recently had a conversation with Andrew Rustad about the impact of network latency on application performance. Andrew is a solution engineer with Intellisys, widely regarded as the industry’s leading Technology Services Distributor of telecommunications, network, and cloud services.

Latency is often treated as the common enemy of network engineers and software developers. It is one of the first causes they point to when remote applications fail. After all, everyone knows that a circuit must have X latency or less, otherwise the application won’t work. Right? It seems to make sense, but is latency really the determining factor in application performance? Andrew provided some insight into the issue and what can be done to alleviate it.

Bandwidth vs Latency
He first discussed the difference between bandwidth and latency. “All too often, I hear these used interchangeably, but they are not the same,” he said. He went on to explain that bandwidth refers to the maximum capacity of a connection, while latency represents how long it takes data packets to reach their destination. Since the internet is often called the information super-highway, he added, a highway is a good way to explain the difference.

“Imagine a highway with two lanes connecting two cities 100 miles apart with a speed limit of 100mph. The maximum number of cars that can simultaneously drive down the highway is two cars – this represents the bandwidth. Because the speed limit is 100mph, it will take those two cars exactly 1 hour to drive from one city to the next. That one hour represents the latency for this highway (or circuit)”.

Impact of Physics on Latency
He then explained why he added a speed limit to the metaphor: it represents a real-world speed limit – the speed of light. No matter the type of connection, we have no way to send packets faster than the speed of light, which creates a minimum latency threshold that we simply cannot break. In fact, you can approximate latency by knowing that light travels at approximately 186 miles per millisecond (186 miles/ms). It’s also worth noting that this is the speed of light inside a vacuum – meaning there are no other factors that could hinder the transmission of light.

Using the example above and assuming a fiber circuit (which sends packets in the form of light), you would solve the formula 100 miles = 186 miles/ms × X ms. This tells you it would take roughly 0.54 ms for a packet to travel 100 miles. As a real-world example, a single packet traveling the roughly 2,400 miles between New York and Los Angeles would take approximately 13 ms in a vacuum.
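To make that arithmetic easy to repeat for any distance, here is a minimal sketch in Python (my own illustration, not something from Andrew) that applies the same 186 miles/ms approximation:

# Rough estimate of the best-case (vacuum) propagation latency for a given
# distance, using the article's approximation of ~186 miles per millisecond.

SPEED_OF_LIGHT_MILES_PER_MS = 186  # approximate speed of light in a vacuum

def min_one_way_latency_ms(distance_miles: float) -> float:
    """Theoretical minimum one-way latency; real circuits will always be slower."""
    return distance_miles / SPEED_OF_LIGHT_MILES_PER_MS

print(min_one_way_latency_ms(100))   # ~0.54 ms for the 100-mile highway example
print(min_one_way_latency_ms(2400))  # ~12.9 ms for New York to Los Angeles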

Any telecom veteran will see that 13 ms figure and think, “I’ve never seen a 13 ms link between New York and Los Angeles.” They would not be wrong. This is because several other factors are in play:

1. 2,400 miles is “as the crow flies,” the shortest possible distance between New York and Los Angeles. Essentially, it’s as if you pulled a fiber cable in a straight shot across the US. Here’s the problem – circuits simply are not laid out that way. The circuits we use depend on the infrastructure laid out by the carriers, and traffic will typically hop from one datacenter and switching office to the next across the US until it reaches its destination. It will never be a straight shot.
2. There are other forms of latency beyond those imposed by the laws of physics: RAM latency, CPU latency, disk latency, and more. Routers, switches, and servers all have their own latency, and the common thread is that each one creates some type of bottleneck that results in a delay. While each of these delays may only amount to a few milliseconds, they add up and can create noticeable increases in overall latency, as the rough sketch below illustrates.
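For illustration only, here is a small sketch with made-up numbers (not measurements from Andrew or any carrier) showing how those component delays stack on top of the physics-imposed minimum:

# Hypothetical per-component delays (illustrative values, not measurements)
# stacked on top of the theoretical New York-to-Los Angeles propagation time.

propagation_ms = 12.9  # theoretical vacuum minimum for ~2,400 miles

hypothetical_extra_delays_ms = {
    "non-straight fiber route (extra miles)": 6.0,
    "router and switch forwarding across many hops": 3.5,
    "server-side CPU, RAM, and disk latency": 2.0,
}

total_ms = propagation_ms + sum(hypothetical_extra_delays_ms.values())
print(f"Estimated one-way latency: {total_ms:.1f} ms")  # well above the ~13 ms floor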

Asking the Right Questions
After a polite apology for the physics lesson ;-), Andrew explained that it is important to understand the relationship between latency and performance. He has seen cases where latency-related performance issues are diagnosed as bandwidth-related issues, and vice versa. Oftentimes, companies request more bandwidth in an attempt to fix what is really a latency problem. He pulled out this cartoon to put an image around the situation.

As you can see, widening the road doesn’t improve the time it takes for a car to drive from one city to another. It merely means more cars can make the trip in the same amount of time. Bringing it back to your network, it’s important to ask the right questions during a discovery session to pinpoint what you are trying to accomplish with your telecom solution and why your applications may not be performing well. Sometimes the culprit is bandwidth; sometimes it’s latency. Andrew promised to elaborate on additional factors that can affect performance in the near future.
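To put a number on the cartoon’s point, here is a tiny model (my own sketch, reusing the article’s 100-mile, 100-mph highway) showing that adding lanes changes throughput, not trip time:

# Adding lanes (bandwidth) increases how many cars are on the road at once,
# but does nothing to the time any single car needs for the trip (latency).

def highway(lanes: int, miles: float = 100.0, mph: float = 100.0):
    trip_hours = miles / mph  # "latency": travel time for one car
    cars_at_once = lanes      # "bandwidth": simultaneous cars, per the article's simplification
    return trip_hours, cars_at_once

print(highway(lanes=2))  # (1.0, 2) -> original two-lane highway: a 1-hour trip
print(highway(lanes=4))  # (1.0, 4) -> widened road: still a 1-hour trip, just more cars at once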

 

TeleData Select offers a whole range of services that will help you meet your business goals, starting with a complimentary review of your current telecommunication bills to identify errors and find opportunities for savings, better service, and more functionality. We also offer project management for new service implementations and infrastructure installs, including fiber and low-voltage cabling. Call us at 404-257-1502 to discuss your current telecom service solution and what you would like to get out of it, or send us a note via This Link to start a no-obligation discussion of your specific business technology needs.

 

John Hagan is President of TeleData Select, a telecommunications consultancy located in Atlanta, GA, serving customers throughout the US and overseas. His company provides voice, data, mobile, and cloud solutions for both large and small businesses. Contact John or his team for a complimentary telecom audit to make sure you’re getting the best value for your communication services. The savings you receive could offset the cost of purchasing a new hosted or premise-based solution.
