
How to listen for when audioContext.destination does not play sound in the Web Audio API

I'm using Angular's $timeout function to call tick() every 512 ms in order to play the audio data sitting in my queue. I'm doing this to play a live audio stream. Sometimes there are cuts in the sound, and I need to maintain a delta of one second between emitting and receiving the sound. So whenever a cut happens, I want to delete from my queue an amount of audio data corresponding to the duration of that cut.

Do you know if there is a way to listen for those cuts on audioContext.destination, like:

audioContext.destination.oncuts = function(duration) {
    audioQueue.read(duration);
};
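As far as I can tell no such event exists, so the best idea I have is to track it myself: remember when the last scheduled buffer should end and, on the next tick, treat any time the context clock has moved past that point as a cut. The onCut and noteBufferScheduled names below are purely hypothetical, just to illustrate what I mean:

// Hypothetical sketch: lastScheduledEnd is advanced each time a buffer is
// scheduled; if the context clock has already passed it, the output was
// silent for that long and we treat it as a cut.
var lastScheduledEnd = 0;

function noteBufferScheduled(durationInSeconds) {
    var now = audioContext.currentTime;
    if (lastScheduledEnd > 0 && now > lastScheduledEnd) {
        onCut(now - lastScheduledEnd);           // length of the silence, in seconds
    }
    lastScheduledEnd = Math.max(lastScheduledEnd, now) + durationInSeconds;
}

function onCut(durationInSeconds) {
    // Drop the same amount of queued audio so the stream catches up again.
    audioQueue.read(Math.floor(durationInSeconds * sampleRate));
}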

Here are my tick and audioQueue functions:

var tick = function() {
    $scope.soundclock = Date.now();
    // Re-arm the timer first, then try to play one block of queued audio.
    $timeout(tick, $scope.tickInterval);
    if(startStream && isFocused) {
        if(isChrome === true || isOpera === true || isIE === true || isFirefox === true) {
            if(audioQueue.length()>=size) {
                // Copy one block of queued samples into a fresh buffer and play it immediately.
                float32 = audioQueue.read(size);
                source = audioContext.createBufferSource();
                audioBuffer = audioContext.createBuffer(1, size, sampleRate);
                data = audioBuffer.getChannelData(0);
                for(var i=0; i<size; i++) {
                    data[i] = float32[i];
                }
                source.buffer = audioBuffer;
                source.connect(audioContext.destination);
                source.start(0);
            }
        }
        if(isSafari === true) {
            if(audioQueue.length()>=size) {
                // Same as above, but Safari's older Web Audio implementation uses noteOn().
                float32 = audioQueue.read(size);
                source = audioContext.createBufferSource();
                audioBuffer = audioContext.createBuffer(1, size, sampleRate);
                data = audioBuffer.getChannelData(0);
                for(var j=0; j<size; j++) {
                    data[j] = float32[j];
                }
                source.buffer = audioBuffer;
                source.connect(audioContext.destination);
                source.noteOn(0);
            }
        }
    }
};

var audioQueue = {
    buffer: new Float32Array(0),

    // Append newly received samples to the end of the queue.
    write: function(newAudio){
        currentQLength = this.buffer.length;
        newBuffer = new Float32Array(currentQLength+newAudio.length);
        d = Date.now() - date;   // time since the previous write, kept for debugging
        console.log('Queued '+newBuffer.length+' samples. ');
        date = Date.now();
        newBuffer.set(this.buffer, 0);
        newBuffer.set(newAudio, currentQLength);
        this.buffer = newBuffer;
    },

    // Remove and return the first nSamples samples from the queue.
    read: function(nSamples){
        samplesToPlay = this.buffer.subarray(0, nSamples);
        this.buffer = this.buffer.subarray(nSamples, this.buffer.length);
        console.log('Queue at '+this.buffer.length+' samples. ');
        return samplesToPlay;
    },

    // Number of samples currently queued.
    length: function(){
        return this.buffer.length;
    }
};
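For context, the queue is filled from the incoming stream roughly like this (the WebSocket handler below is only illustrative, the actual transport does not matter here):

// Illustrative only: each incoming chunk of raw Float32 PCM is appended to
// the queue; tick() later reads fixed-size blocks out of it.
socket.binaryType = 'arraybuffer';   // so event.data is an ArrayBuffer of raw samples
socket.onmessage = function(event) {
    audioQueue.write(new Float32Array(event.data));
};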

You need to stop relying on JavaScript timers (which are, for audio purposes, horribly inaccurate) and schedule your ticks ahead of time instead. Check out http://www.html5rocks.com/en/tutorials/audio/scheduling/ , an article I wrote a while ago about exactly this kind of scheduling.
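A minimal sketch of that look-ahead pattern, reusing the audioQueue, size and sampleRate from your question (the scheduleAheadTime and lookahead values are just starting points to tune):

var nextPlayTime = 0;          // context time at which the next buffer should start
var scheduleAheadTime = 0.2;   // how far ahead of the context clock to schedule, in seconds
var lookahead = 25;            // how often the scheduler wakes up, in milliseconds

function scheduler() {
    // Schedule every queued block whose start time falls inside the look-ahead window.
    while (audioQueue.length() >= size &&
           nextPlayTime < audioContext.currentTime + scheduleAheadTime) {
        var audioBuffer = audioContext.createBuffer(1, size, sampleRate);
        audioBuffer.getChannelData(0).set(audioQueue.read(size));

        var source = audioContext.createBufferSource();
        source.buffer = audioBuffer;
        source.connect(audioContext.destination);

        // If we fell behind (a cut happened), restart from "now" instead of the past.
        nextPlayTime = Math.max(nextPlayTime, audioContext.currentTime);
        source.start(nextPlayTime);          // older Safari: source.noteOn(nextPlayTime)
        nextPlayTime += size / sampleRate;   // the next block starts right after this one
    }
    setTimeout(scheduler, lookahead);
}
scheduler();

The timer here only decides when to look at the queue; the exact start time of every buffer comes from audioContext.currentTime, so jitter in setTimeout no longer produces gaps as long as the queue stays ahead. It also gives you the cut measurement you asked about: whenever audioContext.currentTime has passed nextPlayTime, that difference is the length of the gap.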
