AudioWorklet: Set output to Float32Array to stream live audio

I have audio data streaming from the server to the client. It starts as a Node.js Buffer (which is a Uint8Array) and is then sent to the AudioWorkletProcessor via port.postMessage(), where it is converted into a Float32Array and stored in this.data. I have spent hours trying to set the output to the audio data contained in the Float32Array. Logging the Float32Array before processing shows accurate data, but logging it inside process() shows that it does not change when a new message is posted. This is probably a gap in my low-level audio-programming knowledge.

When data arrives in the client, the following function is called:

  process = (data) => {
        this.node.port.postMessage(data)
  }

As an aside (and you can let me know), maybe I should be using parameter descriptors instead of postMessage? Anyway, here's my AudioWorkletProcessor:

class BypassProcessor extends AudioWorkletProcessor {

  constructor() {
    super();
    this.isPlaying = true;
    this.port.onmessage = this.onmessage.bind(this)
  }

  static get parameterDescriptors() {
    return [{ // Maybe we should use parameters. This is not utilized at present.
      name: 'stream',
      defaultValue: 0.707
    }];
  }

  convertBlock = (incomingData) => { // incoming data is a Uint8Array
    let i, l = incomingData.length;
    let outputData = new Float32Array(incomingData.length);
    for (i = 0; i < l; i++) {
      outputData[i] = (incomingData[i] - 128) / 128.0;
    }
    return outputData;
  }

  onmessage(event) {
    const { data } = event;
    let ui8 = new Uint8Array(data);
    this.data = this.convertBlock(ui8)
  }

  process(inputs, outputs) {
    const input = inputs[0];
    const output = outputs[0];
    if (this.data) {
      for (let channel = 0; channel < output.length; ++channel) {
        const inputChannel = input[channel]
        const outputChannel = output[channel]
        for (let i = 0; i < inputChannel.length; ++i) {
          outputChannel[i] = this.data[i]
        }
      }
    }

    return true;

  }
}

registerProcessor('bypass-processor', BypassProcessor);

How can I simply set the output of the AudioWorkletProcessor to the data coming through?

asked Aug 21, 2019 at 1:16 by heyaphra
  • Hi, I have the same problem, did you get it? – vmontanheiro, Oct 1, 2019 at 14:56
  • @vmontanheiro No, unfortunately. I am able to do it without the AudioWorklet, however. – heyaphra, Oct 1, 2019 at 16:53
  • I got it! Are you getting your data from an RTP packet coming from the server? And are you posting the message to your processor using SharedArrayBuffer? Don't lose hope, haha. – vmontanheiro, Oct 1, 2019 at 17:05
  • Omg! If there's any way I could pick your brain on the SharedArrayBuffer pattern, do get in touch! Currently I just have a pretty vanilla pattern with packets being sent from the server. – heyaphra, Oct 1, 2019 at 18:23
  • SharedArrayBuffer is the preferred way because it doesn't generate additional garbage, but I think your problem is the same as mine. My data length is 160 bytes, but the AudioWorkletProcessor processes only 128 frames at a time, so I'm managing it with a RingBuffer (FIFO) to return a buffer of the correct size. When I finish I'll share it with you. – vmontanheiro, Oct 1, 2019 at 19:22

1 Answer

An AudioWorkletProcessor's process() callback only handles 128 frames at a time, so if your incoming packets are a different size you need to manage your own buffering, typically with a FIFO. I resolved this using a RingBuffer (FIFO) implemented in WebAssembly; in my case I was receiving packets of 160 bytes.
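
To make the idea concrete before showing my actual code, here is a minimal sketch of the same FIFO approach in plain JavaScript, without WebAssembly. It assumes the main thread already posts Float32Array chunks of decoded mono samples; the names are illustrative, not from my project.

class BufferedStreamProcessor extends AudioWorkletProcessor {
  constructor() {
    super();
    this.queue = [];      // Float32Array chunks waiting to be played
    this.readIndex = 0;   // read position inside queue[0]
    this.port.onmessage = (event) => {
      this.queue.push(event.data); // assumes a Float32Array was posted
    };
  }

  process(inputs, outputs) {
    const channel = outputs[0][0]; // mono output, 128 frames per call
    let written = 0;
    while (written < channel.length && this.queue.length > 0) {
      const chunk = this.queue[0];
      const toCopy = Math.min(chunk.length - this.readIndex, channel.length - written);
      channel.set(chunk.subarray(this.readIndex, this.readIndex + toCopy), written);
      written += toCopy;
      this.readIndex += toCopy;
      if (this.readIndex === chunk.length) { // chunk fully consumed
        this.queue.shift();
        this.readIndex = 0;
      }
    }
    // Frames we could not fill are left at 0 (silence) until more data arrives.
    return true;
  }
}

registerProcessor('buffered-stream-processor', BufferedStreamProcessor);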

Here is my AudioWorkletProcessor implementation:

import Module from './buffer-kernel.wasmodule.js';
import { HeapAudioBuffer, RingBuffer, ALAW_TO_LINEAR } from './audio-helper.js';

class SpeakerWorkletProcessor extends AudioWorkletProcessor {
  constructor(options) {
    super();
    this.payload = null;
    this.bufferSize = options.processorOptions.bufferSize; // Getting buffer size from options
    this.channelCount = options.processorOptions.channelCount;
    this.inputRingBuffer = new RingBuffer(this.bufferSize, this.channelCount);
    this.outputRingBuffer = new RingBuffer(this.bufferSize, this.channelCount);
    this.heapInputBuffer = new HeapAudioBuffer(Module, this.bufferSize, this.channelCount);
    this.heapOutputBuffer = new HeapAudioBuffer(Module, this.bufferSize, this.channelCount);
    this.kernel = new Module.VariableBufferKernel(this.bufferSize);
    this.port.onmessage = this.onmessage.bind(this);
  }

  alawToLinear(incomingData) {
    const outputData = new Float32Array(incomingData.length);
    for (let i = 0; i < incomingData.length; i++) {
      outputData[i] = (ALAW_TO_LINEAR[incomingData[i]] * 1.0) / 32768;
    }
    return outputData;
  }

  onmessage(event) {
    const { data } = event;
    if (data) {
      this.payload = this.alawToLinear(new Uint8Array(data)); // Data from my socket listener; in my case converting A-law PCM to linear
    } else {
      this.payload = null;
    }
  }

  process(inputs, outputs) {
    const output = outputs[0];
    if (this.payload) {
      this.inputRingBuffer.push([this.payload]); // Pushing data from my Socket

      if (this.inputRingBuffer.framesAvailable >= this.bufferSize) { // once enough frames have accumulated, pull a full buffer and process it
        this.inputRingBuffer.pull(this.heapInputBuffer.getChannelData());
        this.kernel.process(
          this.heapInputBuffer.getHeapAddress(),
          this.heapOutputBuffer.getHeapAddress(),
          this.channelCount,
        );
        this.outputRingBuffer.push(this.heapOutputBuffer.getChannelData());
      }
      this.outputRingBuffer.pull(output); // Retrieving data from the FIFO and writing it to the output
    }
    return true;
  }
}

registerProcessor(`speaker-worklet-processor`, SpeakerWorkletProcessor);
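
The ALAW_TO_LINEAR table comes from audio-helper.js, which isn't shown here. If you need to build such a table yourself, a sketch based on the classic G.711 reference decoder would look roughly like this; it is an illustration, not the actual helper from my project.

// Sketch of a 256-entry A-law -> 16-bit linear PCM lookup table, following the
// classic G.711 reference decoder. Illustrative only, not the audio-helper.js code.
function buildAlawToLinearTable() {
  const table = new Int16Array(256);
  for (let aVal = 0; aVal < 256; aVal++) {
    const v = aVal ^ 0x55;             // undo the even-bit inversion applied by A-law
    let t = (v & 0x0f) << 4;           // mantissa
    const seg = (v & 0x70) >> 4;       // segment (exponent)
    if (seg === 0) t += 8;
    else if (seg === 1) t += 0x108;
    else t = (t + 0x108) << (seg - 1);
    table[aVal] = (v & 0x80) ? t : -t; // sign bit set means a positive sample
  }
  return table;
}

const ALAW_TO_LINEAR = buildAlawToLinearTable();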

Here is how the AudioContext and AudioWorkletNode are set up:

this.audioContext = new AudioContext({
  latencyHint: 'interactive',
  sampleRate: this.sampleRate,
  sinkId: audioinput || "default"
});

this.audioBuffer = this.audioContext.createBuffer(1, this.audioSize, this.sampleRate);
this.audioSource = this.audioContext.createBufferSource();
this.audioSource.buffer = this.audioBuffer;
this.audioSource.loop = true;
this.audioContext.audioWorklet
  .addModule('workers/speaker-worklet-processor.js')
  .then(() => {
    this.speakerWorklet = new AudioWorkletNode(
      this.audioContext,
      'speaker-worklet-processor',
      {
        channelCount: 1,
        processorOptions: {
          bufferSize: 160, // The size of my incoming packets; this tells the RingBuffer what size I need
          channelCount: 1,
        },
      },
    );
    this.audioSource.connect(this.speakerWorklet).connect(this.audioContext.destination);
  }).catch((err) => {
    console.log("Receiver ", err);
  });

Here is how I receive the data from the socket and send it to the AudioWorklet:

  protected onMessage(e: any): void { //My Socket message listener
    const { data:serverData } = e;
    const socketId = e.socketId;
    if (this.audioWalking && this.ws && !this.ws.isPaused() && this.ws.info.socketId === socketId) {
      const buffer = arrayBufferToBuffer(serverData);
      const rtp = RTPParser.parseRtpPacket(buffer);
      const sharedPayload = new Uint8Array(new SharedArrayBuffer(rtp.payload.length)); //sharing javascript buffer memory between main thread and worklet thread
      sharedPayload.set(rtp.payload, 0);
      this.speakerWorklet.port.postMessage(sharedPayload); //Sending data to worklet
    }
  } 
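
If you can't meet the cross-origin isolation requirements for SharedArrayBuffer, a simpler variant is to copy the payload into a regular buffer and transfer it to the worklet; this allocates per packet but avoids an extra structured-clone copy. Here is a sketch, assuming the same speakerWorklet node:

// Sketch without SharedArrayBuffer: copy the RTP payload and transfer the
// underlying ArrayBuffer to the worklet (ownership moves, no clone copy).
function sendPayloadToWorklet(speakerWorklet, payload /* Uint8Array */) {
  const copy = new Uint8Array(payload.length);
  copy.set(payload);
  speakerWorklet.port.postMessage(copy.buffer, [copy.buffer]);
}

The worklet's onmessage above already handles this case, since new Uint8Array(data) works for a plain ArrayBuffer as well as for shared memory.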

To help others, I put the important pieces of this solution on GitHub:

  • audio-worklet-processor-wasm-example

I followed this example, which fully explains how the RingBuffer works:

  • wasm-ring-buffer
