Since a call to AudioWorkletProcessor.process is synchronous according to https://developer.mozilla.org/en-US/docs/Web/API/AudioWorkletProcessor/process, what happens when an execution takes too long, for example more than 1 second? Would some audio samples be skipped in the next call? Or would samples be queued somewhere? I could not find any documentation on this.
A realtime AudioContext is usually tied to a certain physical audio output device. Therefore the currentTime of an AudioContext is driven by the hardware clock of that audio output device.
If the AudioContext somehow fails to deliver the samples in time, the result will be silence. Typically that can be heard as a click, since there is suddenly a little bit of silence where it shouldn't be. The samples that took too long to render will then be used for the next render quantum, if they happen to be ready by then.
There is a difference, though, in how browsers advance the currentTime in case some samples couldn't be delivered in time. Firefox seems not to count any missed samples, whereas Chrome advances the time by counting all samples that the audio hardware spat out, including those that couldn't be rendered in time.
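This difference can be observed with a little sketch (assuming audioContext is a running realtime AudioContext) which compares the progression of the context clock with the performance clock. If missed samples don't get counted the drift grows with every glitch; if they do get counted it stays roughly constant.

const contextTimeAtStart = audioContext.currentTime;
const performanceTimeAtStart = performance.now() / 1000;

setInterval(() => {
    const elapsedContextTime = audioContext.currentTime - contextTimeAtStart;
    const elapsedPerformanceTime = performance.now() / 1000 - performanceTimeAtStart;

    // The drift is the difference between both elapsed times in seconds.
    console.log('drift in seconds', elapsedPerformanceTime - elapsedContextTime);
}, 1000);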
To test this I created an AudioWorkletProcessor that can be paused or blocked from within the main thread.
class BlockableProcessor extends AudioWorkletProcessor {
    constructor () {
        super();

        this.int32Array = null;
        // The main thread sends an Int32Array backed by a SharedArrayBuffer.
        this.port.onmessage = ({ data }) => this.int32Array = data;
    }

    process() {
        if (this.int32Array !== null) {
            // Busy-wait for as long as the first element is set to 0.
            while (Atomics.load(this.int32Array, 0) === 0) {
                Atomics.store(this.int32Array, 1, currentTime);
            }

            // Report the currentTime back to the main thread. Atomics only
            // works with integer typed arrays, therefore the value gets
            // truncated to full seconds.
            Atomics.store(this.int32Array, 1, currentTime);
        }

        return true;
    }
}

registerProcessor('blockable-processor', BlockableProcessor);
This BlockableProcessor expects to receive an Int32Array backed by a SharedArrayBuffer which it uses to check whether it should block the process() function or not. It also uses that array to communicate the currentTime back to the main thread. (Keep in mind that a SharedArrayBuffer is only available in cross-origin isolated contexts in current browsers.)

It can be used like this, assuming audioContext is a running realtime AudioContext:
// This assumes the code of the BlockableProcessor is hosted in a separate
// file called blockable-processor.js. The actual name is of course up to you.
await audioContext.audioWorklet.addModule('blockable-processor.js');

const audioWorkletNode = new AudioWorkletNode(
    audioContext,
    'blockable-processor'
);

const sharedArrayBuffer = new SharedArrayBuffer(8);
const int32Array = new Int32Array(sharedArrayBuffer);

// The first element controls the blocking behavior and the second element
// transports the currentTime.
Atomics.store(int32Array, 0, 1);
Atomics.store(int32Array, 1, 0);

audioWorkletNode.port.postMessage(int32Array);
audioWorkletNode.connect(audioContext.destination);
It can be blocked by setting the first element of the SharedArrayBuffer to 0.
Atomics.store(int32Array, 0, 0);
And likewise it can be unblocked by setting it back to 1.
Atomics.store(int32Array, 0, 1);
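Putting both together, a quick way to provoke a glitch is to block the processor for a while and then to release it again. (The duration of two seconds is just an arbitrary example.)

// Block the processor and unblock it again after roughly two seconds.
Atomics.store(int32Array, 0, 0);
setTimeout(() => Atomics.store(int32Array, 0, 1), 2000);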
It's also possible to read the currentTime written from within the AudioWorkletProcessor on the main thread:
Atomics.load(int32Array, 1);
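To watch both clocks side by side while blocking and unblocking the processor, they can for example be logged periodically (again using the int32Array and audioContext from above):

// Log the (truncated) time reported by the processor next to the
// currentTime of the AudioContext itself.
setInterval(() => {
    console.log(
        'worklet time',
        Atomics.load(int32Array, 1),
        'context time',
        audioContext.currentTime
    );
}, 1000);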
Using this setup I was able to see that Chrome (v95) and Firefox (v93) both stop the time progression on the main thread and in the worklet when the processor is blocked. When it gets unblocked Firefox continues where it stopped, whereas Chrome jumps ahead to where it would have been if everything had gone as expected.