---
layout: default
title: Latency and Latency-Compensation
---

Latency

When speaking about synchronization, there is no way around also mentioning latency. Latency is the reaction time of a system to a certain stimulus. There are many factors that contribute to the total latency of a given system. In order to achieve exact time synchronization, all sources of latency need to be taken into account and compensated for.


Figure 1: Latency chain. The numbers are an example for a typical PC. With professional gear and an optimized system the total roundtrip latency is usually lower. The important point is that latency is always additive and a sum of many independent factors.

There is not much that can be done about the first two, other than using headphones or sitting near the loudspeaker, and buying quality gear.

Processing latency is usually divided into capture latency (the time it takes for the digitized audio to be available for digital processing, usually one audio period) and playback latency (the time it takes for the processed audio to be delivered to the output, again usually one audio period).

This division, however, is an implementation detail of little practical interest. What really matters is the combination of both, called processing roundtrip latency: the time necessary for a certain audio event to be captured, processed and played back.

It is important to note that processing latency in JACK is a matter of choice: it can be lowered within limits imposed only by the hardware and the audio driver. But the lower it is, the more likely the system will fail to meet its processing deadline, and the dreaded xrun will make its appearance more often, leaving its merry trail of clicks, pops and crackles.

The digital I/O latency is usually negligible for integrated or PCI audio devices but for USB or FireWire interfaces the bus clocking and buffering can add some milliseconds.

The JACK Audio Connection Kit offers a few parameters to configure the latency, but the settings are constrained by the hardware (audio device, CPU and bus speed). Lower latencies increase the load on the system, because it has to process the audio in smaller chunks that arrive much more frequently. If the system cannot keep up, an xrun (short for buffer over-run or under-run) occurs, which usually results in audible clicks or dropouts.
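With jackd's ALSA backend, for example, the buffering latency follows directly from the period size (-p), the number of periods (-n) and the sample rate (-r). The command and numbers below are only an illustration of that relationship, not recommended settings:

```
# Example: 256 frames per period, 2 periods per buffer, 48 kHz sample rate.
# Nominal playback buffering: 256 * 2 / 48000 ≈ 10.7 ms.
# Capture adds roughly one more period (≈ 5.3 ms) to the roundtrip.
jackd -d alsa -d hw:0 -r 48000 -p 256 -n 2
```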

Low latency is not always a feature you want to have. It comes with a couple of drawbacks, the most prominent of which is increased power consumption: because the CPU needs to process many small chunks of audio data, it is constantly active and cannot enter power-saving mode. Furthermore, if more than one application (sound processor) is involved in processing the sound, the operating system has to perform a context switch to run each of them in every audio cycle, which results in a much higher system load and an increased chance of xruns.

Reliable low latency (≤ 10 ms) on GNU/Linux can usually only be achieved by running a realtime kernel.
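Besides the kernel, the user running jackd also needs permission to use realtime scheduling and to lock memory. On many distributions this is granted to members of the audio group with a limits file roughly like the following sketch (the file path and group name vary between distributions):

```
# /etc/security/limits.d/audio.conf (illustrative; adapt to your distribution)
@audio   -   rtprio    95
@audio   -   memlock   unlimited
```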

Yet there are only a few situations where very low latency is really important, because they require a very quick response from the computer. Examples that quickly come to mind are playing virtual instruments live, software monitoring while recording, and applying effects to a live signal.

In many other cases, such as playback, recording, overdubbing, mixing and mastering, latency is not important: it can be relatively large and easily be compensated for.

To explain the last statement: during mixing or mastering you don't care whether it takes 10 ms or 100 ms between the moment you press the play button and the sound coming from the speakers. The same is true when recording.

During tracking, however, it is important that the sound that is currently being played back is internally aligned with the sound that is being recorded.

This is where latency compensation comes into play. There are two ways to compensate for latency in a DAW: read-ahead, where the DAW starts playing a bit early, so that by the time the sound reaches the speakers it is exactly aligned with the timecode of the material being recorded; and write-behind, where, since the playback latency is known, the incoming audio is delayed by the same amount to line things up again.

As you may see, the second approach has various implementation issues regarding timecode and transport synchronization. Ardour uses internal read-ahead to compensate for latency. The time displayed in the Ardour clock corresponds to the audio signal that you hear on the speakers (and not to where Ardour reads files from disk).

NB. This is also one of the reasons why many projects start at timecode 01:00:00:00: when compensating for output latency, the DAW needs to read data from before the start of the session, so that the audio arrives at the output exactly when the timecode hits 01:00:00:00. Ardour3 handles the case of 00:00:00:00 properly, but not all systems, software or hardware that you may interoperate with behave the same.

Latency compensation and clock sync

To achieve sample accurate timecode synchronization, the latency introduced by the audio-setup needs to be known and compensated for.

In order to compensate for latency, JACK or JACK applications need to know exactly how long a certain signal has to be read ahead or delayed:


Figure 2: Jack Latency Compensation. This figure outlines the jack latency API. -- excerpt from http://jackaudio.org/files/jack-latency.png

In Figure 2, clients A and B need to be able to answer the following two questions:

  1. How long has it been since the data they read from their input ports arrived at the edge of the JACK graph (their capture latency)?
  2. How long will it take for the data they write to their output ports to reach the edge of the JACK graph (their playback latency)?

JACK includes an API that allows applications to determine the answers to the above questions. However, JACK cannot know about the additional latency that is introduced by the computer architecture, the operating system and the soundcard. These values, indicated by -I and -O in Figure 2, vary from system to system, but are generally constant. On a general purpose computer system, the only way to accurately learn about the total latency is to measure it.
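Once these extra latencies have been measured (see the next section), they are passed to jackd's backend in frames via -I and -O. A hypothetical example, with placeholder numbers rather than values to copy:

```
# Tell the ALSA backend about 231 frames of extra input latency and
# 527 frames of extra output latency, as measured on one particular
# system; your values will differ.
jackd -d alsa -d hw:0 -r 48000 -p 256 -n 2 -I 231 -O 527
```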

Calibrating JACK latency

Linux DSP guru Fons Adriaensen wrote a tool called jack_delay to accurately measure the roundtrip latency of a closed loop audio chain, with sub-sample accuracy. JACK itself includes a variant of this tool called jack_iodelay.

jack_iodelay allows you to measure the total latency of the system, subtracts the known latency of JACK itself, and suggests values for the -I and -O options of jackd's audio backend.

jack_[io]delay works by emitting some rather annoying tones, capturing them again after a round trip through the whole chain, and measuring the phase difference, from which it can estimate the time taken with great accuracy. This is not a theoretical estimation: jack_delay is a measuring tool that provides very accurate answers.

You can close the loop in a number of ways; the simplest is a patch cable from an analog output of your audio interface back into one of its inputs.

Once you have closed the loop you have to:

  1. Launch jackd with the configuration you want to test.
  2. Launch jack_iodelay (or jack_delay).
  3. Make the appropriate connections between your JACK ports so the loop is closed (see the example below).
  4. Adjust the playback and capture levels in your mixer.
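
A calibration session on the command line might look like the following sketch. The backend options, device name and port names are assumptions that may differ on your system; use jack_lsp to list the actual port names:

```
# 1. Start JACK with the settings you want to test (example values).
jackd -d alsa -d hw:0 -r 48000 -p 256 -n 2 &

# 2. Start the measurement tool.
jack_iodelay &

# 3. Close the loop in software: route the tool's output to the physical
#    output that is cabled back into the physical input.
jack_connect jack_delay:out system:playback_1
jack_connect system:capture_1 jack_delay:in

# jack_iodelay now continuously prints the measured roundtrip latency
# and, once it detects its signal, the extra latency in frames to use
# for the backend's -I and -O options.
```

Adjust the levels until jack_iodelay reports a stable value, note the suggested -I and -O values, then restart jackd with them.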