The LatencyMeter app measures the real-world Round Trip Latency (RTL) of a given iOS audio system*. It works by playing short test tones while simultaneously recording (listening for) the tones returning to the input. This technique requires patching a loopback cable between the audio system's input and output.
(* "iOS audio system" here refers to the combination of an iPhone/iPad paired with a particular external audio interface)
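The play-and-listen technique can be sketched as locating the known test pulse in the recorded input: the lag at which the recording best matches the pulse is the round trip delay. This is a minimal illustrative sketch, not the app's actual detection algorithm, and the pulse/noise figures are made up for the simulation:

```python
import math
import random

def measure_rtl(pulse, recording, sample_rate):
    # Slide the known pulse across the recording and pick the lag with the
    # highest correlation; that lag is the round trip delay in frames.
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(recording) - len(pulse) + 1):
        score = sum(p * recording[lag + i] for i, p in enumerate(pulse))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag, best_lag / sample_rate * 1000.0  # (frames, milliseconds)

# Simulated loopback: a 1.5 ms, 1 kHz sine pulse delayed by a known amount,
# buried in a little background noise (hypothetical values).
sr = 48_000
pulse = [math.sin(2 * math.pi * 1000 * n / sr) for n in range(int(0.0015 * sr))]
true_delay = 512  # frames
rng = random.Random(0)
recording = [0.01 * rng.gauss(0, 1) for _ in range(4096)]
for i, p in enumerate(pulse):
    recording[true_delay + i] += p

frames, ms = measure_rtl(pulse, recording, sr)
```

In this simulation the recovered lag lands on (or within a frame of) the known 512-frame delay; a real measurement would of course run against live audio hardware.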
WARNING: Always turn off/mute any monitor speakers connected to the audio interface, as the LatencyMeter app plays loud sounds during testing.
The LatencyMeter app lets you select different buffer sizes and sample rates to determine the actual RTL times for a particular setup. Simply repeat the last 3 steps as needed.
Depending on the levels set in the app and/or on the audio interface, and on the quality of the interface, results may vary slightly, usually within 0.2 milliseconds.
The minimum latency that can be reliably measured is 1.5 ms, which is the length of the test pulse. Anything below that value may be the result of audio bleed between input and output.
Ensure the patch cable connects input 1 to output 1 of the audio interface. Set the app's output volume slider to maximum, and/or increase the input level on the interface.
Turn the volume down, and/or check any "Line/Mic" switches or "Gain" knobs on the audio interface.
"Estimated RTL" is the theoretical minimum Round Trip Latency for a given sample rate, IO buffer size, audio interface, and device. It is calculated as twice the IO buffer duration plus the sum of the reported hardware input and output latencies. In reality, factors such as safety buffers in the audio interface and/or at the OS level increase this number, so "Measured RTL" shows the actual round trip latency. For convenience, the difference between the two is shown in both milliseconds and frames.
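The estimate described above reduces to simple arithmetic. A minimal sketch, using hypothetical hardware-latency figures (on iOS these would normally be read from the system, e.g. AVAudioSession's reported input/output latencies):

```python
def estimated_rtl_ms(sample_rate, buffer_frames, hw_input_ms, hw_output_ms):
    # Twice the IO buffer duration plus the reported hardware I/O latencies.
    buffer_ms = buffer_frames / sample_rate * 1000.0
    return 2 * buffer_ms + hw_input_ms + hw_output_ms

# Hypothetical setup: 48 kHz, 256-frame buffer, made-up hardware latencies.
sr, buf = 48_000, 256
estimated = estimated_rtl_ms(sr, buf, hw_input_ms=0.6, hw_output_ms=0.7)

# A measured RTL will be larger; this figure is purely illustrative.
measured = 14.0
diff_ms = measured - estimated
diff_frames = diff_ms / 1000.0 * sr  # same difference expressed in frames
```

Here the estimate comes out around 11.97 ms, so the roughly 2 ms gap to the illustrative measured value corresponds to just under 100 frames at 48 kHz.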
See previous answer