Read result of Promise - Buffer from decoded audio
I am trying to get an audio file from an assets folder and play the decoded audio in the browser. I want to use the Web Audio API so I can apply visuals based on the audio data.
Currently my code is:
let audioContext = new (window.AudioContext || window.webkitAudioContext)();
let masterGainNode = audioContext.createGain();
let songBuffer = null;
let path = "../assets/sampleTrackForWeb.mp3";

function fetchSong() {
  fetch(path)
    .then((response) => response.arrayBuffer())
    .then((arrayBuffer) =>
      audioContext.decodeAudioData(
        arrayBuffer,
        (audioBuffer) => {
          console.log(audioBuffer); // the audio buffer is here and ready to go!
          songBuffer = audioBuffer;
        },
        (error) => console.error(error)
      )
    );
}

fetchSong();
console.log(songBuffer); // null???!!!
I am following almost exactly from the MDN docs on how to do this. Any help is appreciated! :)
EDIT: Posting the MDN docs on how they do it
var source;

function getData() {
  source = audioCtx.createBufferSource();
  var request = new XMLHttpRequest();

  request.open('GET', 'viper.ogg', true);
  request.responseType = 'arraybuffer';

  request.onload = function() {
    var audioData = request.response;

    audioCtx.decodeAudioData(audioData, function(buffer) {
        source.buffer = buffer;
        source.connect(audioCtx.destination);
        source.loop = true;
      },
      function(e) { console.log("Error with decoding audio data " + e.err); });
  };

  request.send();
}
- Just like asking someone to go buy some matches and then, immediately afterwards, asking - where are the matches? Please read: developer.mozilla.org/en-US/docs/Learn/JavaScript/Asynchronous/… and developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/… – Roko C. Buljan Commented Mar 18, 2021 at 1:31
- @RokoC.Buljan The Web Audio API doesn't support the promise-based syntax for decodeAudioData on Safari, which is what I'm using. – Jacob Commented Mar 18, 2021 at 1:34
- Fetch song! - Gimme buffer! - ... - > song fetched. "Why is buffer null?" Are you saying fetch does not rely on Promises? developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/… – Roko C. Buljan Commented Mar 18, 2021 at 1:41
- OK, I get that fetch is asynchronous; that's why I have the .then() to await the response. But the buffer is there in the callback. Why is the audioBuffer not being stored into the songBuffer memory slot? – Jacob Commented Mar 18, 2021 at 1:48
- For the same reason this won't work: let a; setTimeout(() => (a = "Albatros"), 1000); console.log(a); // why is a not an Albatros
- If you place console.log(songBuffer); inside the thenable callback, what do you get in the console? The ArrayBuffer, right? That means, if you want to await it - then await it. – Roko C. Buljan Commented Mar 18, 2021 at 1:52
1 Answer
The issue: you're synchronously expecting a result ahead of time. Just like doing:
let A;
setTimeout(() => (A = "Albatros"), 1000);
console.log(A); // undefined
// ...Why is A not an Albatros?
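The asynchronous value only exists once the callback runs. As a quick illustrative sketch (not part of the original answer), here is the same analogy with the value delivered through a Promise, so the consumer can actually wait for it:

// Same analogy, but the value arrives through a Promise:
const albatros = new Promise((resolve) =>
  setTimeout(() => resolve("Albatros"), 1000)
);

albatros.then((A) => console.log(A)); // "Albatros" (logged after ~1s)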
Promise
Learn about Promises to dive deeper into asynchronicity:
- Promise, Promise.then
- Async, Await
- Fetch (also returns a Promise)
From MDN decodeAudioData
The decodeAudioData() method of the BaseAudioContext Interface is used to asynchronously decode audio file data contained in an ArrayBuffer. In this case the ArrayBuffer is loaded from XMLHttpRequest and FileReader. The decoded AudioBuffer is resampled to the AudioContext's sampling rate, then passed to a callback or promise.
So let's explore how to pass it to a callback or promise:
Promise.then()
You could chain your Promises with .then(). Since .then() returns a Promise, that allows you to promisify the returned result of decodeAudioData:
const audioContext = new (window.AudioContext || window.webkitAudioContext)();

const fetchSong = (path) =>
  fetch(path)
    .then((res) => res.arrayBuffer())
    .then((arrayBuffer) => audioContext.decodeAudioData(arrayBuffer));
Async Await
or by using the sugary async/await syntax:
const audioContext = new (window.AudioContext || window.webkitAudioContext)();

const fetchSong = async (path) => {
  const response = await fetch(path);
  const arrayBuffer = await response.arrayBuffer();
  return audioContext.decodeAudioData(arrayBuffer);
};
Both the above examples return a Promise, and can therefore be used like:
const songDataPromise = fetchSong("test.mp3"); // Promise {<pending>}

songDataPromise.then((audioBuffer) => {
  console.log(audioBuffer); // AudioBuffer {}
  console.log(audioBuffer.getChannelData(0)); // Float32Array []
});
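The original goal was to play the track and drive visuals from the audio data. Below is a minimal hedged sketch of one way to continue once the buffer resolves, reusing audioContext and fetchSong from above; the play helper, the analyser wiring, and the draw loop are illustrative assumptions, not part of the original answer:

const masterGainNode = audioContext.createGain(); // from the question's setup
const analyser = audioContext.createAnalyser();   // exposes frequency data for visuals
masterGainNode.connect(analyser);
analyser.connect(audioContext.destination);

function play(audioBuffer) {
  // AudioBufferSourceNodes are one-shot: create a fresh one per playback.
  const source = audioContext.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(masterGainNode);
  source.start();
}

// Note: browsers may require a user gesture before audio can start
// (e.g. calling audioContext.resume() from a click handler).
fetchSong("test.mp3").then((audioBuffer) => {
  play(audioBuffer);
  const bins = new Uint8Array(analyser.frequencyBinCount);
  (function draw() {
    analyser.getByteFrequencyData(bins); // current spectrum snapshot
    // ...render `bins` to a canvas here...
    requestAnimationFrame(draw);
  })();
});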
Callback
To resolve it all in time, just chain another .then() and pass its result to a callback function:
const audioContext = new (window.AudioContext || window.webkitAudioContext)();

const fetchSong = (path, cb) =>
  fetch(path)
    .then((res) => res.arrayBuffer())
    .then((arrayBuffer) => audioContext.decodeAudioData(arrayBuffer))
    .then(cb); // Resolve with callback

fetchSong("test.mp3", (audioBuffer) => {
  console.log(audioBuffer); // AudioBuffer {}
  console.log(audioBuffer.getChannelData(0)); // Float32Array []
});
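One caveat from the comments: some older Safari versions only implement the callback signature of decodeAudioData, so the single-argument promise form used above may not work there. A minimal sketch of a hand-rolled promisified wrapper, assuming the callback signature is available (the Compat names are illustrative, not a real API):

// Older WebKit only supports the callback form of decodeAudioData,
// so we promisify it by hand.
const decodeAudioDataCompat = (arrayBuffer) =>
  new Promise((resolve, reject) => {
    audioContext.decodeAudioData(arrayBuffer, resolve, reject);
  });

// Drop-in replacement in the chains above:
const fetchSongCompat = (path) =>
  fetch(path)
    .then((res) => res.arrayBuffer())
    .then(decodeAudioDataCompat); // resolves with the decoded AudioBuffer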