Delay, drag, lag — these are all terms used to describe latency, or a noticeable hesitation in streaming video. While everyone prefers to avoid high latency, not everyone understands exactly what latency is, what causes it or what AV system specifications to request in order to reduce latency as much as possible.

Here, we'll talk about what latency is, how it commonly arises in AV over IP systems, and options to reduce it so video comes through crystal-clear and perfectly synched.

 

Latency: What Is It?

 

Depending on the context, latency in computer networking can stem from any number of causes. The result, however, is typically the same: Users experience a delay between input and output when using a computer, AV system or other networked device, according to S. Kyle Davis at HARMAN International.

Delay may occur at any of several points. Zender cofounder Patrick Debois describes the process in three steps:

  • A video signal is converted into a digital format by software called an encoder. Depending on the number and type of outputs the video will travel to, it may be converted into multiple digital formats.
  • The digital signal is transmitted over a data line to the playback device or devices.
  • The digital signal is converted back to a video signal, then displayed on the playback device or devices.
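As a rough illustration of how these three stages add up, here is a minimal Python sketch that treats end-to-end (“glass-to-glass”) latency as the sum of per-stage delays. The stage names and millisecond figures are hypothetical placeholders, not measurements from any particular encoder, network or display.

```python
# Minimal sketch: end-to-end latency as the sum of per-stage delays.
# Every figure below is an assumption for illustration, not vendor data.
stage_delays_ms = {
    "encode": 20.0,              # convert the video signal to a digital format
    "network_transit": 5.0,      # switching, routing and propagation
    "decode_and_display": 15.0,  # convert back to video and refresh the screen
}

total_ms = sum(stage_delays_ms.values())
print(f"Estimated glass-to-glass latency: {total_ms:.0f} ms")
# The lower this total, the less likely viewers are to notice any delay.
```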

For AV purposes, the biggest concern is the delay between when an input device sends an audio or video signal and when the output device displays it. When the delay is too great, audio and video may not be synched. Viewers watching a live event on screens may see what happens a moment after they hear the audience react. Video conferences can turn into “an awkward jumble of overlapping audio” that no longer matches what's on screen, digital marketing consultant Annelise Fasnacht writes at Solutionz.

Depending on the number and type of output devices involved, the type of encoding/decoding they require, the distance between devices and other factors, delay between video input and playback can be considerable. Tech writer Kerry Flynn analyzed the different lag times for broadcasts of the 2016 U.S. presidential debates on various broadcast and digital channels, and found that they could differ by up to 55 seconds!

While an AVoIP system can reduce this delay to mere milliseconds, it cannot eliminate latency entirely. As Davis explains, “it's physically impossible for electrical signals to instantaneously be converted to data, transmitted over a data line, and then converted back into electrical signal at the other end.” With the right combination of tools, however, streaming AV latency can be reduced to levels imperceptible to human senses.

 

Bandwidth vs. Latency: Why It Matters in AVoIP

 

For many years, streaming video was difficult or impossible due to high latency. At first, a lack of bandwidth played a significant role; today, however, bandwidth is often sufficient for audiovisual streaming, yet latency still occurs.

As a result, it's important to understand the difference between latency and bandwidth. Bradley Mitchell, a software engineer and program manager at Intel, describes the difference in terms of a pipe.

Think of a network connection as a pipe through which information flows. Bandwidth refers to the overall size of the pipe and provides a sense of the pipe's maximum flow, usually expressed in megabits per second (Mbps) or gigabits per second (Gbps).

Latency, meanwhile, expresses how quickly or slowly information is actually flowing through the pipe. While a too-narrow pipe (or too little bandwidth) can cause latency, so can other factors. For instance, a bottleneck in the pipe can cause the stream of information to slow, even if the pipe's overall size is otherwise sufficient for the amount of information flow, explains Mitchell at Lifewire.

While many internet service providers advertise their offerings in terms of speed, the speed to which they refer is typically bandwidth, according to Lynn Phillips at HighSpeedInternet.com. High latency can cause users to experience the network as being slow even if they have plenty of bandwidth available.
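To make the distinction concrete, here is a small Python sketch, using assumed and purely illustrative figures, of how a high-bandwidth connection over a long or congested path can still deliver a video frame later than a slower link with very little delay.

```python
# Illustrative only: bandwidth governs how quickly bits drain into the pipe;
# latency also includes fixed delays the path adds along the way.

def frame_delivery_ms(frame_size_bits: float, bandwidth_bps: float,
                      fixed_delay_ms: float) -> float:
    """Serialization time (set by bandwidth) plus fixed network delay
    (propagation, switching and queueing)."""
    serialization_ms = frame_size_bits / bandwidth_bps * 1000
    return serialization_ms + fixed_delay_ms

frame_bits = 2_000_000  # assumed size of one compressed video frame (~2 Mb)

# Big pipe, long or congested path: roughly 82 ms
print(frame_delivery_ms(frame_bits, bandwidth_bps=1e9, fixed_delay_ms=80))
# Smaller pipe, short local path: roughly 22 ms
print(frame_delivery_ms(frame_bits, bandwidth_bps=100e6, fixed_delay_ms=2))
```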

For organizations considering an AV over IP installation, both bandwidth and latency are concerns — particularly when the AV installation will work with the same computer network already used for business or education. Including both an AV integrator and members of the IT team in the planning process can help ensure that both bandwidth and latency concerns are addressed. 

Defeating High Latency in AVoIP Setups

 

An ideal AV over IP system will reduce latency to the point that no human viewer can detect it — effectively rendering it invisible. To get there, however, integrators and IT staff may need to work together to identify and address the causes of latency in a particular system.

Here are several common causes of unacceptably high latency in AVoIP.

 

Inadequate Bandwidth

 

Because bandwidth and latency are related, ensuring adequate bandwidth exists is an essential first step in designing an AV over IP installation, especially if a significant amount of data is already being handled by the network the system will use.

David Meyer, director of technical content at the international trade association CEDIA, notes in his article at Hidden Wires that many buildings are equipped with 1 Gb Ethernet — but uncompressed video streams may demand much higher amounts of bandwidth, from 3.6 Gbps for a 1080p60 stream to 7 Gbps or more for 4K30.

Two options exist for addressing the bandwidth problem. The first is to increase the available bandwidth to 10 Gbps; the second is to compress the video signal. Since traditional CAT5e won't support 10 Gb Ethernet and CAT6 will do so only for limited distances, compression has become standard in many AV over IP systems.
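A quick back-of-the-envelope calculation shows where figures of this magnitude come from. The sketch below assumes 10-bit color and counts active pixels only, so it lands near, rather than exactly on, the numbers Meyer cites; real link rates also depend on chroma subsampling and blanking.

```python
# Back-of-the-envelope uncompressed video bandwidth (active pixels only).
# Assumes 30 bits per pixel (10 bits per color channel).

def raw_bandwidth_gbps(width: int, height: int, fps: int,
                       bits_per_pixel: int = 30) -> float:
    return width * height * fps * bits_per_pixel / 1e9

print(raw_bandwidth_gbps(1920, 1080, 60))  # ~3.7 Gbps for 1080p60
print(raw_bandwidth_gbps(3840, 2160, 30))  # ~7.5 Gbps for 4K30

# Neither stream fits a 1 Gb Ethernet link, so the options are a 10 Gbps
# network or compressing the signal by a ratio of several to one.
```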

 

Using Software Alone

 

While compressing audiovisual signals is often essential for mitigating latency problems in AV over IP, relying on compression software alone may cause the very problem it seeks to solve, according to a Silex Inside white paper by Jean-Marie Cloquet and Gregory Baudet.

“Due to the high bandwidth of the video signal and the latency constraint, it is essential to use dedicated hardware processing from the video input to the IP network,” they write. “Purely software-based solutions will inevitably increase the latency because of the memory transfers and the CPU load.”

The correct combination of hardware and software can reduce buffering time in order to reduce latency, compress video to allow more information to be sent over the same network without compromising speed, and even encrypt video transmissions on the fly so that encryption does not cause perceptible latency, according to Cloquet and Baudet.

Some companies are experimenting with zero-compression transceivers, which promise zero-loss, crystal-clear video with no perceptible latency. Aurora received a Best in Show award at InfoComm 2015 for its IPX series transceivers, which are designed to work with 10 Gbps networks.

 

The Distance Factor

 

According to Davis, latency is most concerning when it's high enough to be perceptible to the user. Placing the user and the input device in the same space increases the likelihood that the user will notice the delay, since the user can perceive both the input and the output at the same time.

Yet distance can increase latency, too, because “packets don't travel through routers instantly,” notes Chris Hoffman at How-To Geek. “Each router a packet has to travel through introduces a delay of a few milliseconds, which can add up if the packet has to travel through many routers to reach the other side of the world.” In addition, some types of internet connections, such as satellite, have high latency even in the best conditions.
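As a rough sketch of how distance and hop count add up, the example below combines propagation delay in fiber (roughly 200 km per millisecond) with an assumed per-router processing delay; both figures are illustrative rather than measured.

```python
# Illustrative one-way network delay: propagation through fiber plus an
# assumed fixed processing delay at each router along the path.

FIBER_KM_PER_MS = 200.0  # light covers roughly 200 km per millisecond in fiber

def one_way_delay_ms(distance_km: float, router_hops: int,
                     per_hop_ms: float = 1.0) -> float:
    propagation_ms = distance_km / FIBER_KM_PER_MS
    return propagation_ms + router_hops * per_hop_ms

print(one_way_delay_ms(5, router_hops=2))        # same campus: ~2 ms
print(one_way_delay_ms(10_000, router_hops=15))  # intercontinental: ~65 ms
```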

 

 

 

Considerations for Users, Integrators and IT

 

High latency causes frustration and can even make an AV over IP project ultimately unusable — particularly if its value relies on a lack of perceivable delay, such as in videoconferencing or streaming live video.

Achieving low latency requires integrators, IT staff and users to consider a number of factors, says media solution consultant and Eyevinn Technology partner Magnus Svensson, adding that “this will be worth the effort when it comes to user experience. And there are ways to achieve acceptable latency, balancing the different techniques, without compromising cost and reliability.”

Images by: Mathew Schwartz, rawpixel, Maria Svecova