OscillatorNode

Baseline: Widely available

This feature is well established and works across many devices and browser versions. It has been available across browsers since July 2015.

The OscillatorNode interface represents a periodic waveform, such as a sine wave. It is an AudioScheduledSourceNode audio-processing module that generates a given waveform at a specified frequency; in effect, a constant tone.

Inheritance: EventTarget → AudioNode → AudioScheduledSourceNode → OscillatorNode
Number of inputs: 0
Number of outputs: 1
Channel count mode: max
Channel count: 2 (not used in the default count mode)
Channel interpretation: speakers

Constructor

OscillatorNode()

Creates a new instance of an OscillatorNode object, optionally providing an object specifying default values for the node's properties. As an alternative, you can use the BaseAudioContext.createOscillator() factory method; see Creating an AudioNode.
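
For instance, both of the following create a square-wave oscillator at 880 Hz; this is a minimal sketch, and the variable names are arbitrary:

js
const audioCtx = new AudioContext();

// Using the constructor, passing the options up front…
const osc1 = new OscillatorNode(audioCtx, {
  type: "square",
  frequency: 880,
});

// …or using the factory method and setting properties afterwards.
const osc2 = audioCtx.createOscillator();
osc2.type = "square";
osc2.frequency.value = 880;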

Instance properties

Also inherits properties from its parent, AudioScheduledSourceNode.

OscillatorNode.frequency

An a-rate AudioParam representing the frequency of oscillation in hertz (though the AudioParam returned is read-only, the value it represents is not). The default value is 440 Hz (a standard middle-A note).
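
Because frequency is an a-rate AudioParam, it can be automated over time as well as assigned directly. A minimal sketch, assuming an existing AudioContext named audioCtx, that sweeps the pitch from 440 Hz up to 880 Hz over two seconds:

js
const osc = audioCtx.createOscillator();
osc.connect(audioCtx.destination);

// Schedule a smooth sweep from A4 (440 Hz) up one octave to 880 Hz.
osc.frequency.setValueAtTime(440, audioCtx.currentTime);
osc.frequency.linearRampToValueAtTime(880, audioCtx.currentTime + 2);

osc.start();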

OscillatorNode.detune

An a-rate AudioParam representing detuning of oscillation in cents (though the AudioParam returned is read-only, the value it represents is not). The default value is 0.
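
A detune of +100 cents raises the pitch by one semitone, and +1200 cents by a full octave. A minimal sketch, again assuming an audioCtx context:

js
const osc = audioCtx.createOscillator();
osc.frequency.value = 440; // A4
osc.detune.value = 100; // one semitone up
osc.connect(audioCtx.destination);
osc.start();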

OscillatorNode.type

A string which specifies the shape of the waveform to play; this can be one of a number of standard values, or "custom" to use a PeriodicWave to describe a custom waveform. Different waveforms produce different tones. Standard values are "sine", "square", "sawtooth", "triangle" and "custom". The default is "sine".

Instance methods

Also inherits methods from its parent, AudioScheduledSourceNode.

OscillatorNode.setPeriodicWave()

Sets a PeriodicWave which describes a periodic waveform to be used instead of one of the standard waveforms; calling this sets the type to custom.
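
As an illustrative sketch (the harmonic values are arbitrary, and an audioCtx context is assumed), the following builds a PeriodicWave from a couple of harmonics and applies it, which switches the oscillator's type to "custom":

js
const osc = audioCtx.createOscillator();

// real holds the cosine terms, imag the sine terms; index 0 is the DC offset.
const real = new Float32Array([0, 0, 1]);
const imag = new Float32Array([0, 1, 0]);
const wave = audioCtx.createPeriodicWave(real, imag);

osc.setPeriodicWave(wave); // osc.type is now "custom"
osc.connect(audioCtx.destination);
osc.start();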

AudioScheduledSourceNode.start()

Specifies the exact time to start playing the tone.

AudioScheduledSourceNode.stop()

Specifies the time to stop playing the tone.
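
Both methods take an optional time, in seconds on the context's timeline; omitting it means "as soon as possible". A minimal sketch of the methods above, assuming an audioCtx context, that plays a tone for one second starting half a second from now:

js
const osc = audioCtx.createOscillator();
osc.connect(audioCtx.destination);

const now = audioCtx.currentTime;
osc.start(now + 0.5); // begin half a second from now
osc.stop(now + 1.5); // stop one second after it started

Like any AudioScheduledSourceNode, an oscillator can be started and stopped only once; to play the tone again, create a new node.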

Events

Also inherits events from its parent, AudioScheduledSourceNode.

Examples

Using an OscillatorNode

The following example shows basic usage of an AudioContext to create an oscillator node and to start playing a tone on it. For an applied example, check out our Violent Theremin demo (see app.js for relevant code).

js
// create web audio api context
const audioCtx = new AudioContext();

// create Oscillator node
const oscillator = audioCtx.createOscillator();

oscillator.type = "square";
oscillator.frequency.setValueAtTime(440, audioCtx.currentTime); // value in hertz
oscillator.connect(audioCtx.destination);
oscillator.start();
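
Note that browsers may create the AudioContext in a "suspended" state until the user interacts with the page; resuming the context (for example, from a click handler) ensures the tone is actually audible:

js
// Browsers may suspend a context created without a user gesture;
// resume() returns a Promise that resolves once playback can proceed.
if (audioCtx.state === "suspended") {
  audioCtx.resume();
}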

Different oscillator node types

The four built-in oscillator types are sine, square, triangle, and sawtooth; they describe the shape of the waveform the oscillator generates. Fun fact: these are the defaults on most synths because they are waveforms that are easy to generate electronically. This example visualizes the waveforms of the different types at different frequencies.

html
<div class="controls">
  <label for="type-select">
    Oscillator type
    <select id="type-select">
      <option>sine</option>
      <option>square</option>
      <option>triangle</option>
      <option>sawtooth</option>
    </select>
  </label>

  <label for="freq-range">
    Frequency
    <input
      type="range"
      min="100"
      max="800"
      step="10"
      value="250"
      id="freq-range" />
  </label>
  <button data-playing="init" id="play-button">Play</button>
</div>

<canvas id="wave-graph"></canvas>

The code is in two parts: in the first part, we set up the audio graph (the oscillator, a gain node used for muting, and an analyser) and wire up the controls.

js
const typeSelect = document.getElementById("type-select");
const frequencyControl = document.getElementById("freq-range");
const playButton = document.getElementById("play-button");

const audioCtx = new AudioContext();
const osc = new OscillatorNode(audioCtx, {
  type: typeSelect.value,
  frequency: frequencyControl.valueAsNumber,
});
// Rather than creating a new oscillator for every start and stop
// which you would do in an audio application, we are just going
// to mute/un-mute for demo purposes - this means we need a gain node
const gain = new GainNode(audioCtx);
const analyser = new AnalyserNode(audioCtx, {
  fftSize: 1024,
  smoothingTimeConstant: 0.8,
});
osc.connect(gain).connect(analyser).connect(audioCtx.destination);

typeSelect.addEventListener("change", () => {
  osc.type = typeSelect.value;
});

frequencyControl.addEventListener("input", () => {
  osc.frequency.value = frequencyControl.valueAsNumber;
});

playButton.addEventListener("click", () => {
  if (audioCtx.state === "suspended") {
    audioCtx.resume();
  }

  if (playButton.dataset.playing === "init") {
    osc.start(audioCtx.currentTime);
    playButton.dataset.playing = "true";
    playButton.innerText = "Pause";
  } else if (playButton.dataset.playing === "false") {
    gain.gain.linearRampToValueAtTime(1, audioCtx.currentTime + 0.2);
    playButton.dataset.playing = "true";
    playButton.innerText = "Pause";
  } else if (playButton.dataset.playing === "true") {
    gain.gain.linearRampToValueAtTime(0.0001, audioCtx.currentTime + 0.2);
    playButton.dataset.playing = "false";
    playButton.innerText = "Play";
  }
});

In the second part, we draw the waveform on a canvas, using the AnalyserNode we created above.

js
const dpr = window.devicePixelRatio;
const w = 500 * dpr;
const h = 300 * dpr;
const canvasEl = document.getElementById("wave-graph");
canvasEl.width = w;
canvasEl.height = h;
const canvasCtx = canvasEl.getContext("2d");

const bufferLength = analyser.frequencyBinCount;
const dataArray = new Uint8Array(bufferLength);
analyser.getByteTimeDomainData(dataArray);

// draw an oscilloscope of the current oscillator
function draw() {
  analyser.getByteTimeDomainData(dataArray);

  canvasCtx.fillStyle = "white";
  canvasCtx.fillRect(0, 0, w, h);

  canvasCtx.lineWidth = 4.0;
  canvasCtx.strokeStyle = "black";
  canvasCtx.beginPath();

  const sliceWidth = (w * 1.0) / bufferLength;
  let x = 0;

  for (let i = 0; i < bufferLength; i++) {
    const v = dataArray[i] / 128.0;
    const y = (v * h) / 2;
    if (i === 0) {
      canvasCtx.moveTo(x, y);
    } else {
      canvasCtx.lineTo(x, y);
    }
    x += sliceWidth;
  }

  canvasCtx.lineTo(w, h / 2);
  canvasCtx.stroke();

  requestAnimationFrame(draw);
}

draw();

Warning: This example makes a noise!

Specifications

Specification: Web Audio API (# OscillatorNode)

Browser compatibility

See also