The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. An audio context controls both the creation of the nodes it contains and the execution of the audio processing, or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context.
Constructor

AudioContext()
    Creates and returns a new AudioContext object.

Properties

Also inherits properties from its parent interface, BaseAudioContext.
AudioContext.baseLatency Read only
    Returns the number of seconds of processing latency incurred by the AudioContext passing the audio from the AudioDestinationNode to the audio subsystem.
AudioContext.outputLatency Read only
    Returns an estimation of the output latency of the current audio context, in seconds.
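Both latency figures can be read directly from a running context. A minimal sketch (support varies by browser; see the compatibility tables below):

var audioCtx = new AudioContext();

// Processing latency between the AudioDestinationNode and the audio subsystem, in seconds.
console.log('baseLatency: ' + audioCtx.baseLatency);

// Estimated output latency of the context, in seconds (not supported in all browsers).
console.log('outputLatency: ' + audioCtx.outputLatency);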
Methods

Also inherits methods from its parent interface, BaseAudioContext.

AudioContext.close()
    Closes the audio context, releasing any system audio resources that it uses.
AudioContext.createMediaElementSource()
    Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from <video> or <audio> elements.
AudioContext.createMediaStreamSource()
    Creates a MediaStreamAudioSourceNode associated with a MediaStream representing an audio stream which may come from the local computer microphone or other sources.
AudioContext.createMediaStreamDestination()
    Creates a MediaStreamAudioDestinationNode associated with a MediaStream representing an audio stream which may be stored in a local file or sent to another computer.
AudioContext.createMediaStreamTrackSource()
    Creates a MediaStreamTrackAudioSourceNode associated with a MediaStream representing a media stream track.
AudioContext.getOutputTimestamp()
    Returns a new AudioTimestamp object containing two correlated audio stream position values for the context.
AudioContext.suspend()
    Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
AudioContext.resume()
    Resumes the progression of time in an audio context that has previously been suspended.
Note: The resume() method is still available; it is now defined on the BaseAudioContext interface (see BaseAudioContext.resume()) and thus can be accessed by both the AudioContext and OfflineAudioContext interfaces.
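suspend() and resume() both return a Promise that resolves once the state change has taken effect. A minimal sketch that toggles a context from a click handler, assuming a hypothetical button element with the id "toggle":

var audioCtx = new AudioContext();
var toggleButton = document.getElementById('toggle'); // hypothetical button element

toggleButton.onclick = function() {
  if (audioCtx.state === 'running') {
    // Halt audio hardware access and reduce CPU/battery usage.
    audioCtx.suspend().then(function() {
      console.log('Context suspended');
    });
  } else if (audioCtx.state === 'suspended') {
    // Restart the progression of time in the context.
    audioCtx.resume().then(function() {
      console.log('Context resumed');
    });
  }
};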
Examples

Basic audio context declaration:

var audioCtx = new AudioContext();

Cross browser variant:

var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
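The media-related factory methods follow the same pattern. A sketch of createMediaElementSource(), assuming a hypothetical <audio> element with the id "player", routed through a GainNode to the context's destination:

var audioCtx = new AudioContext();
var audioElement = document.getElementById('player'); // hypothetical <audio> element

// Wrap the element in a source node, then route it through a gain node to the speakers.
var sourceNode = audioCtx.createMediaElementSource(audioElement);
var gainNode = audioCtx.createGain();

sourceNode.connect(gainNode);
gainNode.connect(audioCtx.destination);

gainNode.gain.value = 0.5; // halve the volume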
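A sketch of createMediaStreamSource(), wrapping microphone input obtained with navigator.mediaDevices.getUserMedia(); the routing shown here is only illustrative (monitoring a live microphone through the speakers can cause feedback):

var audioCtx = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: true }).then(function(stream) {
  // Wrap the microphone stream in a source node.
  var micSource = audioCtx.createMediaStreamSource(stream);

  // Route it through a gain node so the level can be adjusted before output.
  var gainNode = audioCtx.createGain();
  gainNode.gain.value = 0.8;

  micSource.connect(gainNode);
  gainNode.connect(audioCtx.destination);
}).catch(function(err) {
  console.error('Could not access the microphone: ' + err);
});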
Specifications

Specification | Status | Comment
---|---|---
Web Audio API: The definition of 'AudioContext' in that specification. | Working Draft |
Browser compatibility

Desktop

Feature | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari
---|---|---|---|---|---|---
Basic support | 35 | Yes | 25 | No | 22 | 6
AudioContext() constructor | 55 | Yes | 25 | No | 42 | Yes
baseLatency | 58 | ? | No | No | 45 | No
outputLatency | No | ? | No | No | No | No
close | 42 | ? | 40 | No | Yes | ?
createMediaElementSource | 14 | 12 | 25 | No | 15 | 6
createMediaStreamSource | 14 | 12 | 25 | No | 15 | 6
createMediaStreamDestination | 14 | Yes | 25 | No | 15 | 6
createMediaStreamTrackSource | ? | ? | No | No | ? | No
getOutputTimestamp | 57 | ? | No | No | 44 | No
suspend | 43 | ? | 40 | No | Yes | ?
Mobile

Feature | Android webview | Chrome for Android | Edge Mobile | Firefox for Android | Opera for Android | iOS Safari | Samsung Internet
---|---|---|---|---|---|---|---
Basic support | Yes | 35 | Yes | 26 | 22 | ? | Yes
AudioContext() constructor | 55 | 55 | ? | 25 | 42 | ? | 6.0
baseLatency | 58 | 58 | ? | No | 45 | No | No
outputLatency | No | No | ? | No | No | ? | No
close | 43 | 43 | ? | 40 | Yes | ? | 4.0
createMediaElementSource | Yes | 18 | Yes | 26 | 15 | ? | Yes
createMediaStreamSource | Yes | 18 | Yes | 26 | 15 | ? | Yes
createMediaStreamDestination | Yes | 18 | Yes | 26 | 15 | ? | Yes
createMediaStreamTrackSource | ? | ? | ? | No | ? | No | ?
getOutputTimestamp | 57 | 57 | ? | No | 44 | No | 7.0
suspend | 43 | 43 | ? | 40 | Yes | ? | 4.0
© 2005–2018 Mozilla Developer Network and individual contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext