
Connect an audio player UI to an AudioContext.destination

There are various ReactJS components that provide a UI to control audio playback. Most of these assume you will provide audio file paths. How can I instead tell the UI to control the audio playback of an AudioContext.destination node?

The AudioContext will have various sources and intermediate nodes. I want the UI component to give the user information (current time position; volume status) and controls (play/pause; volume; mute) that apply to the AudioContext as a whole.

Unfortunately there is no simple way to map the transport functions of an HTMLAudioElement onto an AudioContext, but there are of course some similarities.

I don't use any React in the following examples, but it should hopefully be fairly simple to wrap the code snippets in a component that can be consumed by your frontend framework of choice.

Let's say you have an instance of an AudioContext.

const audioContext = new AudioContext();

In this case the audioContext is only used to play a simple continuous sine wave by using an OscillatorNode.

const oscillatorNode = new OscillatorNode(audioContext);

oscillatorNode.start();
oscillatorNode.connect(audioContext.destination);

The oscillatorNode could of course be stopped by calling oscillatorNode.stop(), but this would render the oscillatorNode useless, as it can't be started again. You would also have to do this for every OscillatorNode if there is more than one.
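
As a quick illustration of that limitation, trying to restart a stopped OscillatorNode throws an error:

oscillatorNode.stop();
oscillatorNode.start(); // throws an InvalidStateError because start() can only be called once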

But there is a way to pause the whole AudioContext by suspending it.

audioContext.suspend();

This will return a promise that resolves when the AudioContext is paused. To get the AudioContext running again you can use its resume() method.

audioContext.resume();

Just like the suspend() method, resume() returns a promise which resolves when the context is running again.

In addition to that, an AudioContext also has a state property which can be used to find out whether the audioContext is 'running', 'suspended' or 'closed'.
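
As a rough sketch of how these pieces could be combined, here is a plain play/pause toggle. The button element and its id are just assumptions for the example:

// A minimal play/pause toggle, assuming a <button id="play-pause"> exists on the page.
const playPauseButton = document.getElementById('play-pause');

playPauseButton.addEventListener('click', async () => {
    if (audioContext.state === 'running') {
        await audioContext.suspend();
        playPauseButton.textContent = 'play';
    } else if (audioContext.state === 'suspended') {
        await audioContext.resume();
        playPauseButton.textContent = 'pause';
    }
    // A 'closed' AudioContext can't be resumed anymore; it would have to be recreated.
});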

Controlling the volume of the whole audioContext is a bit trickier. Every AudioContext has a destination, which is the AudioNode that everything has to be connected to. But the destination does not allow you to modify the volume. I think the easiest way to get this functionality is to use an additional GainNode as a proxy.

const destinationGainNode = new GainNode(audioContext);

destinationGainNode.connect(audioContext.destination);

Then you have to make sure to connect everything to the destinationGainNode instead. In the case of the oscillatorNode introduced above, that would look like this:

oscillatorNode.connect(destinationGainNode);

With that proxy in place you can control the volume by using the gain AudioParam of the destinationGainNode. To mute the signal, call ...

destinationGainNode.gain.value = 0;

... and to unmute it again just call ...

destinationGainNode.gain.value = 1;
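
Putting both together, a volume slider and a mute button could be wired up roughly like this. The input and button elements, their ids and the lastVolume variable are assumptions for the example:

// A minimal volume/mute sketch, assuming an <input type="range" id="volume" min="0" max="1" step="0.01">
// and a <button id="mute"> exist on the page.
const volumeSlider = document.getElementById('volume');
const muteButton = document.getElementById('mute');

let lastVolume = 1;

volumeSlider.addEventListener('input', () => {
    lastVolume = Number(volumeSlider.value);
    destinationGainNode.gain.value = lastVolume;
});

muteButton.addEventListener('click', () => {
    const isMuted = destinationGainNode.gain.value === 0;

    destinationGainNode.gain.value = isMuted ? lastVolume : 0;
    muteButton.textContent = isMuted ? 'mute' : 'unmute';
});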

I hope this helps to create a React Component to control an AudioContext.

Please note that all examples use the latest syntax of the Web Audio API, which is not yet available in Edge and Safari. To get the examples working in these browsers, a polyfill is needed. I do of course recommend standardized-audio-context, as I am the author of that package. :-)
