What Buffer Size Should I Use on My Audio Interface?

On some audio interfaces, the buffer size can be set in the device's control panel. For example, on an RME Fireface UC:


Different driver versions may ship with different default buffer sizes. If you purchased your interface from Listen, the buffer size used to calibrate the latency settings is stated in the included spreadsheet.

The appropriate buffer size depends on many factors: the CPU, RAM, connection type, interface in use, and number of simultaneous channels can all affect what buffer size is needed. Increasing the buffer size can help with audio dropouts, crackling, and other performance issues, but it also increases the latency between the input and the output.
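The buffer's contribution to latency follows directly from the sample rate: each buffer of N samples takes N divided by the sample rate to play out. A minimal sketch of that arithmetic (actual round-trip latency also includes driver and converter overhead not modeled here):

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Latency contributed by one buffer of audio, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

# Doubling the buffer size doubles its contribution to latency:
print(buffer_latency_ms(256, 48000))   # 256-sample buffer at 48 kHz
print(buffer_latency_ms(1024, 48000))  # 1024-sample buffer at 48 kHz
```

This is why a larger buffer that cures dropouts always comes at the cost of a longer input-to-output delay.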

Using AutoDelay in a SoundCheck analysis step will correct for changes in latency, including devices with unstable latency. However, if you are using an ASIO or DAQmx device, the latency value in System Hardware can be calibrated so that the measured delay is consistently zero.

To Correct the Latency of a Device in SoundCheck:

1. In Setup > Hardware, set the latency value to zero.

2. Open the sequence "Self Test" in the SoundCheck install folder > Sequences > Calibration. Follow the instructions and run the sequence. The final results window will display a Sound card Delay value.

3. Enter this number into Setup > Hardware > Latency.

4. Re-run Self Test. The Sound card Delay value should now be zero.
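The procedure above works because the configured latency value is subtracted from the delay the sequence measures. A minimal sketch of that arithmetic, with an assumed device latency (the function name is illustrative, not SoundCheck's API):

```python
def reported_delay_ms(device_latency_ms: float, configured_latency_ms: float) -> float:
    """Delay reported by Self Test: the device's actual latency minus
    the compensation value entered in Setup > Hardware > Latency."""
    return device_latency_ms - configured_latency_ms

device_latency = 12.5  # hypothetical device latency, ms

# Step 1-2: with the latency set to zero, Self Test reports the full delay.
measured = reported_delay_ms(device_latency, 0.0)

# Step 3-4: enter the measured value; the reported delay is now zero.
assert reported_delay_ms(device_latency, measured) == 0.0
```

Note that this only holds while the device's latency stays constant; any change to the buffer size shifts the actual latency and invalidates the stored value.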


If the buffer size is changed, the latency value must be recalculated. To avoid recalibrating every time you switch buffer sizes, you can create multiple hardware channels in System Hardware, each with its own latency value.
