The Web Audio API is one of the more cutting-edge technologies JavaScript has to offer. (It isn't even supported in IE11! But both Chrome and Firefox now support it without prefixes.) It allows direct low-level manipulation of audio data, down to the individual sample.
While the `.wav` file format is supported by all browsers (it can be decoded via `AudioContext.decodeAudioData`, which normally supports the same formats as the `<audio>` tag), many other formats, such as MP3/AAC/Ogg, are not, due to various patent problems. Because audio codecs, unlike video codecs, require very little power to decode, decoding has been done in pure JavaScript before (jsMAD, etc.). I aim to write an audio decoder in pure JavaScript too, so playing a `.wav` file is the first stepping stone. (audiocogs/etc. is Not Invented Here™.)
tl;dr Here’s the code. Read on for details.
https://gist.github.com/innocenat/5c5a48365930b691c863
The implementation is pretty straightforward. First, it fetches the file via `XMLHttpRequest` as an array buffer. All `.wav` files use the RIFF container format, so it looks for the RIFF header and format specifier. It then reads the stream's metadata from the `"fmt "` chunk, reads the actual sample data, and converts it to a `Float32Array` to feed into the audio context API.
Now, what is the problem with this implementation? The entire content is stored in memory, twice! First inside the `ArrayBuffer` from `XMLHttpRequest`, and again inside the `AudioBuffer`. This seriously wastes memory: both Firefox and Chrome use roughly 300 MB of memory when reading a 63 MB file. Normally a file of this length (it was 6:16 minutes long) isn't meant to be played via an `AudioBuffer`, but streamed from an `<audio>` element via `MediaElementSource`. The problem is that we don't want to use the browser's audio decoder! (Which is why we are writing the decoder ourselves.)
In a normal situation though (i.e. not for a `.wav` file), for an mp3/aac/etc. file decoded in JavaScript, we currently have to encode the decoded output as a `.wav` file, convert it to a Blob, and register it with the browser via `URL.createObjectURL`. Then we can use the blob URL in an `<audio>` element to stream the content into the Web Audio API.
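A rough sketch of that workaround, assuming a hypothetical JS decoder has already produced a `Float32Array` of mono samples (the encoder below is mine, not code from the gist):

```javascript
// Wrap decoded PCM samples in a minimal mono 16-bit .wav container,
// so the browser can stream it through a blob URL and <audio>.
function encodeWav(samples, sampleRate) {
  const dataSize = samples.length * 2;             // 16-bit PCM
  const buf = new ArrayBuffer(44 + dataSize);
  const v = new DataView(buf);
  const w = (off, s) => { for (let i = 0; i < s.length; i++) v.setUint8(off + i, s.charCodeAt(i)); };

  w(0, 'RIFF'); v.setUint32(4, 36 + dataSize, true); w(8, 'WAVE');
  w(12, 'fmt '); v.setUint32(16, 16, true);
  v.setUint16(20, 1, true);                        // PCM
  v.setUint16(22, 1, true);                        // mono
  v.setUint32(24, sampleRate, true);
  v.setUint32(28, sampleRate * 2, true);           // byte rate
  v.setUint16(32, 2, true);                        // block align
  v.setUint16(34, 16, true);                       // bit depth
  w(36, 'data'); v.setUint32(40, dataSize, true);
  for (let i = 0; i < samples.length; i++) {
    // clamp to [-1, 1] and scale to signed 16-bit
    const s = Math.max(-1, Math.min(1, samples[i]));
    v.setInt16(44 + i * 2, s < 0 ? s * 32768 : s * 32767, true);
  }
  return buf;
}

// In the browser (not runnable outside one), the buffer would then
// be streamed into the Web Audio API:
//   const url = URL.createObjectURL(new Blob([buf], { type: 'audio/wav' }));
//   const audio = new Audio(url);
//   ctx.createMediaElementSource(audio).connect(ctx.destination);
//   audio.play();
```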
In an ideal situation, we would be able to write the file via the FileSystem API (only supported in Chrome right now) and then stream the file from disk to the decoder, as most media players do.
I have yet to see how the Aurora.js audio engine works, though. This is my initial approach to the problem. We will see.