
music-tempo's Introduction

Description

JavaScript library for finding the tempo (BPM) of a song and for beat tracking. It uses the "Beatroot" algorithm by Simon Dixon.

Example App

Docs

Installation

In a browser

<script src="music-tempo.min.js"></script>

Using npm:

$ npm i --save music-tempo

Usage

Pass the MusicTempo constructor a buffer containing non-interleaved IEEE 754 32-bit linear PCM data with a nominal range of -1 to +1, that is, a 32-bit floating-point buffer with each sample between -1.0 and 1.0. This is the format used by the AudioBuffer interface of the Web Audio API. The object returned by the constructor contains the properties tempo (the tempo value in beats per minute) and beats (an array of beat times in seconds).
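For illustration, here is a minimal sketch of a buffer in the expected format: one second of a 440 Hz sine wave at 44100 Hz, with every sample in [-1, 1]. (A pure tone has no beats; a real song buffer would come from decodeAudioData as in the examples below.)

```javascript
// One second of non-interleaved 32-bit float PCM at 44100 Hz.
// Each sample lies in [-1, 1], which is what MusicTempo expects.
var sampleRate = 44100;
var audioData = new Float32Array(sampleRate);
for (var i = 0; i < audioData.length; i++) {
  audioData[i] = Math.sin(2 * Math.PI * 440 * i / sampleRate);
}
// var mt = new MusicTempo(audioData); // mt.tempo in BPM, mt.beats in seconds
```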

Browser

var context = new AudioContext({ sampleRate: 44100 });
var fileInput = document.getElementById("fileInput");

fileInput.onchange = function () {
  var files = fileInput.files;

  if (files.length === 0) return;
  var reader = new FileReader();

  reader.onload = function (fileEvent) {
    context.decodeAudioData(fileEvent.target.result, calcTempo);
  };

  reader.readAsArrayBuffer(files[0]);
};

var calcTempo = function (buffer) {
  var audioData = [];
  // Take the average of the two channels
  if (buffer.numberOfChannels === 2) {
    var channel1Data = buffer.getChannelData(0);
    var channel2Data = buffer.getChannelData(1);
    var length = channel1Data.length;
    for (var i = 0; i < length; i++) {
      audioData[i] = (channel1Data[i] + channel2Data[i]) / 2;
    }
  } else {
    audioData = buffer.getChannelData(0);
  }
  var mt = new MusicTempo(audioData);

  console.log(mt.tempo);
  console.log(mt.beats);
};
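The beats array pairs naturally with the tempo: consecutive beat times should be roughly 60 / mt.tempo seconds apart. A small sketch of working with it (the beat times below are made up for illustration; beatIntervals is a hypothetical helper, not part of the library):

```javascript
// Derive inter-beat intervals (in seconds) from a list of beat times.
function beatIntervals(beats) {
  var intervals = [];
  for (var i = 1; i < beats.length; i++) {
    intervals.push(beats[i] - beats[i - 1]);
  }
  return intervals;
}

// Hypothetical beat times for a 120 BPM track (one beat every 0.5 s):
var intervals = beatIntervals([0.5, 1.0, 1.5, 2.0]);
// intervals is [0.5, 0.5, 0.5]
```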

Node.js

In a Node.js environment, the node-web-audio-api library can be used:

var AudioContext = require("web-audio-api").AudioContext;
var MusicTempo = require("music-tempo");
var fs = require("fs");

var calcTempo = function (buffer) {
  var audioData = [];
  // Take the average of the two channels
  if (buffer.numberOfChannels === 2) {
    var channel1Data = buffer.getChannelData(0);
    var channel2Data = buffer.getChannelData(1);
    var length = channel1Data.length;
    for (var i = 0; i < length; i++) {
      audioData[i] = (channel1Data[i] + channel2Data[i]) / 2;
    }
  } else {
    audioData = buffer.getChannelData(0);
  }
  var mt = new MusicTempo(audioData);

  console.log(mt.tempo);
  console.log(mt.beats);
};

var data = fs.readFileSync("songname.mp3");

var context = new AudioContext();
context.decodeAudioData(data, calcTempo);

Optional parameters

You can pass an object with parameters as the second argument to the constructor:

var p = { expiryTime: 30, maxBeatInterval: 1.5 };
var mt = new MusicTempo(audioData, p);

The most useful parameters are maxBeatInterval/minBeatInterval and expiryTime. The first two set the minimum and maximum BPM. The default value of maxBeatInterval is 1, which means the minimum BPM is 60 (60 / 1 = 60). The default value of minBeatInterval is 0.3, which means the maximum BPM is 200 (60 / 0.3 = 200). Be careful: the higher the maximum BPM, the higher the probability of 2x-BPM errors (e.g. if the maximum BPM is 210 and the real tempo of a song is 102 BPM, you may end up with 204 BPM). expiryTime can be raised if an audio file has periods of silence or near-silence that cause beat tracking to fail. Other parameters are listed in the documentation.
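Since the interval parameters are expressed in seconds per beat, a desired BPM range can be converted with interval = 60 / BPM; note the minimum BPM fixes maxBeatInterval and the maximum BPM fixes minBeatInterval. A minimal sketch (the helper bpmRangeToParams is hypothetical, not part of the library):

```javascript
// Convert a desired BPM range into MusicTempo interval parameters.
// Intervals are seconds per beat, so the *minimum* BPM determines the
// *maximum* beat interval and vice versa.
function bpmRangeToParams(minBpm, maxBpm) {
  return {
    maxBeatInterval: 60 / minBpm, // longest allowed gap between beats
    minBeatInterval: 60 / maxBpm  // shortest allowed gap between beats
  };
}

var params = bpmRangeToParams(50, 150);
// params.maxBeatInterval is 1.2, params.minBeatInterval is 0.4
// var mt = new MusicTempo(audioData, params);
```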

Other

Tests

Requires mocha and chai

$ npm test

Documentation

Requires esdoc

$ esdoc

Build

Requires gulp and babel. Other dependencies can be found in package.json.

$ gulp build

License

MIT License

music-tempo's People

Contributors

killercrush


music-tempo's Issues

[Suggestion] Expose the AudioBuffer duration

For now I'm simply adding the duration from the audiobuffer, but I think it would be an interesting property to have as part of MusicTempo.

audioContext.decodeAudioData(fileEvent.target.result)
  .then(audioBuffer => resolve({
    ...new MusicTempo(audioBuffer.getChannelData(0)),
    duration: audioBuffer.duration,
  }))
  .catch(reject)

Tone.js and music-tempo: bpm detection mismatch

Hi,
I am creating a web app using Tone.js and music-tempo to get the bpm of the song. Tone is creating an audioContext that gets the sample rate from my laptop (48000Hz) and music-tempo is calculating a tempo that is always 10 bpm less than what it should be (e.g. a song at 120bpm, I'm getting a result of 110bpm). I suspect that it could be because in the examples provided the audioContext is set to 44100 instead of 48000Hz. Any help would be much appreciated.

Different (inaccurate) results on mobile vs desktop.

I made a beat-aware media player using music-tempo: https://phrasier.leftium.com/

It works great on desktop, however the results are not quite right on mobile browsers. The tempo should be 119 BPM, but mobile browsers result in 107 BPM. The actual beat timings are off, too. Both mobile and desktop agree on the total number of beats, though.

I think music-tempo does a deterministic analysis of the audio data, so I'm not sure how mobile and desktop results could diverge. Does music-tempo change the number of tests or cut tests short depending on the performance of the system?

This doesn't work (BPM is incorrect)

I tried using your test page to get the BPM of the song "Say Something" by A Big World (feat. Christina Aguilera). Your test page shows that song has a BPM of 131. That BPM is way, way, too high for that song. That song actually has a BPM of about 50.

RangeError: Invalid string length

Hi, I'm getting the following error for a few tracks.
re[_k] = hammWindow[_k] * audioData[i];
^

RangeError: Invalid string length
at Float32Array.join (native)
at Float32Array.toString (native)

in **/music-tempo/dist/node/OnsetDetection.js:116:49

can you please let me know how to fix it? thanks!

Differences between music-tempo and Beatroot

Hello,

I hope you don't mind answering a nerdy question.

In the paper linked in the readme it appears that Beatroot uses mean-absolute-peaks for onset detection, whereas I see music-tempo is using spectral flux.

I wondered whether there are any other developments on Dixon's algorithm at play in this library?

Thanks so much,
Joel

Bad beats

This is probably similar to #5 in that the bottom line is that I got the same number of beats, and even roughly at similar places, but the end result sounded horrible. (To debug it I recreated parts of your test page code with the ticker thing.)

But in this case I finally figured that something is off with web-audio-api on node: once I repeated things in a browser, it finally worked as it should. For reference, I'm attaching the file with the beats for both.

Also, the code I used with node -- which didn't work (adding sampleRate makes no difference) is:

const fs = require("fs");
const MusicTempo = require("music-tempo");
const AudioContext = require("web-audio-api").AudioContext;

const calcTempo = async file => {
  const data = fs.readFileSync(file);
  const buff = await new Promise((res,rej) => (new AudioContext()).decodeAudioData(data, res, rej));
  const buf0 = buff.getChannelData(0);
  const buf1 = buff.getChannelData(buff.numberOfChannels > 1 ? 1 : 0);
  const audioData = buf0.map((n0, i) => (n0 + buf1[i])/2);
  const mt = new MusicTempo(audioData);
  console.log(mt.tempo);
  console.log(mt.beats);
}

and in a browser:

fetch($player.src)
  .then(x => x.arrayBuffer())
  .then(x => (new AudioContext()).decodeAudioData(x))
  .then(x => {
    const c1 = x.getChannelData(0), c2 = x.getChannelData(1), 
          c = c1.map((x,i) => (x+c2[i])/2),
          mt = new MusicTempo(c);
    console.log(mt.beats);
  });

Maybe there's some problem with w-a-a, but I followed your readme, so adding a warning might be good...
