
JSMpeg – MPEG1 Video & MP2 Audio Decoder in JavaScript

JSMpeg is a Video Player written in JavaScript. It consists of an MPEG-TS demuxer, MPEG1 video & MP2 audio decoders, WebGL & Canvas2D renderers and WebAudio sound output. JSMpeg can load static videos via Ajax and allows low latency streaming (~50ms) via WebSockets.

JSMpeg can decode 720p Video at 30fps on an iPhone 5S, works in any modern browser (Chrome, Firefox, Safari, Edge) and comes in at just 20kb gzipped.

Using it can be as simple as this:

<script src="jsmpeg.min.js"></script>
<div class="jsmpeg" data-url="video.ts"></div>

Some more info and demos: jsmpeg.com

Usage

A JSMpeg video player can either be created in HTML using the CSS class jsmpeg for the container:

<div class="jsmpeg" data-url="<url>"></div>

or by directly calling the JSMpeg.Player() constructor in JavaScript:

var player = new JSMpeg.Player(url [, options]);

Note that using the HTML element (internally JSMpeg.VideoElement) provides some features on top of JSMpeg.Player, namely an SVG pause/play button and the ability to "unlock" audio on iOS devices.

The url argument accepts a URL to an MPEG .ts file or a WebSocket server (ws://...).

The options argument supports the following properties:

  • canvas – the HTML Canvas element to use for video rendering. If none is given, the renderer will create its own Canvas element.
  • loop – whether to loop the video (static files only). Default true.
  • autoplay - whether to start playing immediately (static files only). Default false.
  • audio - whether to decode audio. Default true.
  • video - whether to decode video. Default true.
  • poster – URL to an image to use as the poster to show before the video plays.
  • pauseWhenHidden – whether to pause playback when the tab is inactive. Default true. Note that browsers usually throttle JS in inactive tabs anyway.
  • disableGl - whether to disable WebGL and always use the Canvas2D renderer. Default false.
  • disableWebAssembly - whether to disable WebAssembly and always use JavaScript decoders. Default false.
  • preserveDrawingBuffer – whether the WebGL context is created with preserveDrawingBuffer - necessary for "screenshots" via canvas.toDataURL(). Default false.
  • progressive - whether to load data in chunks (static files only). When enabled, playback can begin before the whole source has been completely loaded. Default true.
  • throttled - when using progressive, whether to defer loading chunks when they're not needed for playback yet. Default true.
  • chunkSize - when using progressive, the chunk size in bytes to load at a time. Default 1024*1024 (1mb).
  • decodeFirstFrame - whether to decode and display the first frame of the video. Useful to set up the Canvas size and use the frame as the "poster" image. This has no effect when using autoplay or streaming sources. Default true.
  • maxAudioLag – when streaming, the maximum enqueued audio length in seconds.
  • videoBufferSize – when streaming, size in bytes for the video decode buffer. Default 512*1024 (512kb). You may have to increase this for very high bitrates.
  • audioBufferSize – when streaming, size in bytes for the audio decode buffer. Default 128*1024 (128kb). You may have to increase this for very high bitrates.
  • onVideoDecode(decoder, time) – A callback that is called after each decoded and rendered video frame
  • onAudioDecode(decoder, time) – A callback that is called after each decoded audio frame
  • onPlay(player) – A callback that is called whenever playback starts
  • onPause(player) – A callback that is called whenever playback paused (e.g. when .pause() is called or the source has ended)
  • onEnded(player) – A callback that is called when playback has reached the end of the source (only called when loop is false).
  • onStalled(player) – A callback that is called whenever there's not enough data for playback
  • onSourceEstablished(source) – A callback that is called when source has first received data
  • onSourceCompleted(source) – A callback that is called when the source has received all data
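
As a quick reference, the defaults listed above can be collected into one object. This is a hypothetical helper for illustration only (JSMPEG_DEFAULTS and withDefaults are not part of JSMpeg's API); the values mirror the option list above:

```javascript
// Hypothetical helper, not part of JSMpeg: merge user options over the
// documented defaults (values copied from the option list above).
const JSMPEG_DEFAULTS = {
	loop: true, autoplay: false, audio: true, video: true,
	pauseWhenHidden: true, disableGl: false, disableWebAssembly: false,
	preserveDrawingBuffer: false, progressive: true, throttled: true,
	chunkSize: 1024 * 1024, decodeFirstFrame: true,
	videoBufferSize: 512 * 1024, audioBufferSize: 128 * 1024
};

function withDefaults(options) {
	// Later arguments win, so user-supplied options override the defaults.
	return Object.assign({}, JSMPEG_DEFAULTS, options || {});
}
```

For example, withDefaults({loop: false}) yields an object with loop set to false and every other documented default unchanged.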

All options except canvas can also be used with the HTML element through data- attributes. E.g. to specify looping and autoplay in JavaScript:

var player = new JSMpeg.Player('video.ts', {loop: true, autoplay: true});

or HTML

<div class="jsmpeg" data-url="video.ts" 
	data-loop="true" data-autoplay="true"></div>

Note that camelCased options have to be hyphenated when used as data attributes. E.g. decodeFirstFrame: true becomes data-decode-first-frame="true" for the HTML element.
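
The camelCase-to-hyphen mapping follows the standard HTML data-attribute convention and can be sketched as a one-liner. This is a hypothetical helper for illustration, not something JSMpeg exports:

```javascript
// Hypothetical helper, not part of JSMpeg's API: maps a camelCased
// option name to its data- attribute form.
function toDataAttribute(option) {
	return 'data-' + option.replace(/[A-Z]/g, c => '-' + c.toLowerCase());
}

toDataAttribute('decodeFirstFrame'); // 'data-decode-first-frame'
```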

JSMpeg.Player API

A JSMpeg.Player instance supports the following methods and properties:

  • .play() – start playback
  • .pause() – pause playback
  • .stop() – stop playback and seek to the beginning
  • .nextFrame() – advance playback by one video frame. This does not decode audio. Returns true on success, false when there's not enough data.
  • .volume – get or set the audio volume (0-1)
  • .currentTime – get or set the current playback position in seconds
  • .paused – read only, whether playback is paused
  • .destroy() – stops playback, disconnects the source and cleans up WebGL and WebAudio state. The player can not be used afterwards. If the player created the canvas element it is removed from the document.

Encoding Video/Audio for JSMpeg

JSMpeg only supports playback of MPEG-TS containers with the MPEG1 Video Codec and the MP2 Audio Codec. The Video Decoder does not handle B-Frames correctly (though no modern encoder seems to use these by default anyway) and the width of the video has to be a multiple of 2.
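
If your input dimensions aren't guaranteed to satisfy the even-width constraint, round the width down before encoding. A minimal sketch in plain Python (just arithmetic, unrelated to JSMpeg's own code; the helper name is invented for illustration):

```python
def legal_size(width, height):
    """Clamp the width to a multiple of 2, as JSMpeg's decoder requires."""
    return f"{width - width % 2}x{height}"

legal_size(961, 540)  # '960x540', usable as ffmpeg's -s argument
```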

You can encode a suitable video using ffmpeg like this:

ffmpeg -i in.mp4 -f mpegts -codec:v mpeg1video -codec:a mp2 -b 0 out.ts

You can also control the video size (-s), framerate (-r), video bitrate (-b:v), audio bitrate (-b:a), number of audio channels (-ac), sampling rate (-ar) and much more. Please refer to the ffmpeg documentation for the details.

Comprehensive example:

ffmpeg -i in.mp4 -f mpegts \
	-codec:v mpeg1video -s 960x540 -b:v 1500k -r 30 -bf 0 \
	-codec:a mp2 -ar 44100 -ac 1 -b:a 128k \
	out.ts

Performance Considerations

While JSMpeg can handle 720p video at 30fps even on an iPhone 5S, keep in mind that MPEG1 is not as efficient as modern codecs. MPEG1 needs quite a bit of bandwidth for HD video. 720p begins to look okay-ish at 2 Mbit/s (that's 250 kB/s). Also, the higher the bitrate, the more work JavaScript has to do to decode it.
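
The bit/byte arithmetic above is easy to get wrong, so here is a quick sanity check (plain arithmetic, nothing JSMpeg-specific):

```python
def kbytes_per_second(mbit_per_s):
    """Convert a bitrate in Mbit/s to kB/s (1 Mbit/s = 125 kB/s)."""
    return mbit_per_s * 1000 / 8

kbytes_per_second(2)  # 250.0 kB/s -- the 720p figure quoted above
```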

This should not be a problem for static files, or if you're only streaming within your local WiFi. If you don't need to support mobile devices, 1080p at 10 Mbit/s works just fine (if your encoder can keep up). For everything else I would advise you to use 540p (960x540) at 2 Mbit/s max.

Here is a performance comparison with multiple resolutions and features en-/disabled. Test this on your target devices to get a feel for what you can get away with.

https://jsmpeg.com/perf.html

Streaming via WebSockets

JSMpeg can connect to a WebSocket server that sends out binary MPEG-TS data. When streaming, JSMpeg tries to keep latency as low as possible - it immediately decodes everything it has, ignoring video and audio timestamps altogether. To keep everything in sync (and latency low), audio data should be interleaved between video frames very frequently (-muxdelay in ffmpeg).

A separate, buffered streaming mode, where JSMpeg pre-loads a few seconds of data and presents everything with exact timing and audio/video sync is conceivable, but currently not implemented.

The internal buffers for video and audio are fairly small (512kb and 128kb respectively) and JSMpeg will discard old (even unplayed) data to make room for newly arriving data without much fuss. This could introduce decoding artifacts during network congestion, but ensures that latency is kept at a minimum. If necessary, you can increase videoBufferSize and audioBufferSize through the options.
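
The discard behaviour can be pictured with a toy model. This is illustration only: JSMpeg's real buffers are flat byte arrays, and the DecodeBuffer class below is invented for the sketch:

```python
from collections import deque

class DecodeBuffer:
    """Toy model of a bounded decode buffer that drops the oldest data."""
    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.chunks = deque()
        self.size = 0

    def write(self, chunk):
        self.chunks.append(chunk)
        self.size += len(chunk)
        # Evict the oldest chunks -- even unplayed ones -- to cap latency.
        while self.size > self.max_bytes and len(self.chunks) > 1:
            self.size -= len(self.chunks.popleft())
```

Writing two 8-byte chunks into a 10-byte buffer leaves only the newer chunk; the older one is dropped, mirroring how a congested stream sheds stale data instead of accumulating delay.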

JSMpeg comes with a tiny WebSocket "relay", written in Node.js. This server accepts an MPEG-TS source over HTTP and serves it via WebSocket to all connecting Browsers. The incoming HTTP stream can be generated using ffmpeg, gstreamer or by other means.

The split between the source and the WebSocket relay is necessary, because ffmpeg doesn't speak the WebSocket protocol. However, this split also allows you to install the WebSocket relay on a public server and share your stream on the Internet (typically NAT in your router prevents the public Internet from connecting into your local network).

In short, it works like this:

  1. run the websocket-relay.js
  2. run ffmpeg, send output to the relay's HTTP port
  3. connect JSMpeg in the browser to the relay's Websocket port

Example Setup for Streaming: Raspberry Pi Live Webcam

For this example, ffmpeg and the WebSocket relay run on the same system. This allows you to view the stream in your local network, but not on the public internet.

This example assumes that your webcam is compatible with Video4Linux2 and appears as /dev/video0 in the filesystem. Most USB webcams support the UVC standard and should work just fine. The onboard Raspberry Camera can be made available as V4L2 device by loading a kernel module: sudo modprobe bcm2835-v4l2.

  1. Install ffmpeg (See How to install ffmpeg on Debian / Raspbian). Using ffmpeg, we can capture the webcam video & audio and encode it into MPEG1/MP2.

  2. Install Node.js and npm (See Installing Node.js on Debian and Ubuntu based Linux distributions for newer versions). The Websocket relay is written in Node.js

  3. Install http-server. We will use this to serve the static files (view-stream.html, jsmpeg.min.js), so that we can view the website with the video in our browser. Any other webserver would work as well (nginx, apache, etc.): sudo npm -g install http-server

  4. Install git and clone this repository (or just download it as ZIP and unpack)

sudo apt-get install git
git clone https://github.com/phoboslab/jsmpeg.git
  5. Change into the jsmpeg/ directory: cd jsmpeg/

  6. Install the Node.js Websocket library: npm install ws

  7. Start the Websocket relay. Provide a password and a port for the incoming HTTP video stream and a Websocket port that we can connect to in the browser: node websocket-relay.js supersecret 8081 8082

  8. In a new terminal window (still in the jsmpeg/ directory), start the http-server so we can serve view-stream.html to the browser: http-server

  9. Open the streaming website in your browser. The http-server will tell you the IP (usually 192.168.[...]) and port (usually 8080) it's running on: http://192.168.[...]:8080/view-stream.html

  10. In a third terminal window, start ffmpeg to capture the webcam video and send it to the Websocket relay. Provide the password and port (from step 7) in the destination URL:

ffmpeg \
	-f v4l2 \
		-framerate 25 -video_size 640x480 -i /dev/video0 \
	-f mpegts \
		-codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 \
	http://localhost:8081/supersecret

You should now see a live webcam image in your browser.

If ffmpeg failed to open the input video, it's likely that your webcam does not support the given resolution, format or framerate. To get a list of compatible modes run:

ffmpeg -f v4l2 -list_formats all -i /dev/video0

To add the webcam audio, just call ffmpeg with two separate inputs.

ffmpeg \
	-f v4l2 \
		-framerate 25 -video_size 640x480 -i /dev/video0 \
	-f alsa \
		-ar 44100 -c 2 -i hw:0 \
	-f mpegts \
		-codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 \
		-codec:a mp2 -b:a 128k \
		-muxdelay 0.001 \
	http://localhost:8081/supersecret

Note the muxdelay argument. This should reduce lag, but doesn't always work when streaming video and audio - see remarks below.

Some remarks about ffmpeg muxing and latency

Adding an audio stream to the MPEG-TS can sometimes introduce considerable latency. I especially found this to be a problem on Linux using ALSA and V4L2 (using AVFoundation on macOS worked just fine). However, there is a simple workaround: just run two instances of ffmpeg in parallel. One for audio, one for video. Send both outputs to the same Websocket relay. Thanks to the simplicity of the MPEG-TS format, proper "muxing" of the two streams happens automatically in the relay.

ffmpeg \
	-f v4l2 \
		-framerate 25 -video_size 640x480 -i /dev/video0 \
	-f mpegts \
		-codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 \
		-muxdelay 0.001 \
	http://localhost:8081/supersecret

# In a second terminal
ffmpeg \
	-f alsa \
		-ar 44100 -c 2 -i hw:0 \
	-f mpegts \
		-codec:a mp2 -b:a 128k \
		-muxdelay 0.001 \
	http://localhost:8081/supersecret

In my tests, USB webcams introduce about 180ms of latency, and there seems to be nothing we can do about it. The Raspberry Pi, however, has a camera module that provides lower latency video capture.

To capture webcam input on Windows or macOS using ffmpeg, see the ffmpeg Capture/Webcam Wiki.

JSMpeg Architecture and Internals

This library was built in a fairly modular fashion while keeping overhead at a minimum. Implementing new Demuxers, Decoders, Outputs (Renderers, Audio Devices) or Sources should be possible without changing any other parts. However, you would still need to subclass the JSMpeg.Player in order to use any new modules.

Have a look at the jsmpeg.js source for an overview of how the modules interconnect and what APIs they should provide. I also wrote a blog post about some of JSMpeg's internals: Decode It Like It's 1999.

Using parts of the library without creating a full player should also be fairly straightforward. E.g. you can create a stand-alone instance of the JSMpeg.Decoder.MPEG1Video class, .connect() a renderer, .write() some data to it and .decode() a frame, without touching JSMpeg's other parts.

Previous Version

The JSMpeg version currently living in this repo is a complete rewrite of the original jsmpeg library that was just able to decode raw mpeg1video. If you're looking for the old version, see the v0.2 tag.


jsmpeg's Issues

Noise reduction

Has anybody experienced the below issue (on movement)? What would the possible causes be?

  • input source - Stream #0:0: Video: h264 (Baseline), yuv420p(tv), 640x480 [SAR 1:1 DAR 4:3], 7.67 tbr, 1k tbn, 60 tbc
  • poor bandwidth
  • encoding params

Note: The noise gets fixed by itself ~1 sec after I stop moving. If I try to use -b 0 it gets much much worse and looks very pixelated.

interlace

Command used is pretty raw:

ffmpeg -i [input] -q:v 0.6 -s 320x240 -r 20 -f mpeg1video [output HTTP server]

Multiple viewers degrade performance for everyone

I am pretty new to the NodeJS world, but I've noticed that when multiple clients connect to a broadcast endpoint, it starts to work slower. It seems like node.js is wasting a lot of time on lines 37-44 in stream-server.js.

Is there such a thing as a "parallel for" for Node, and is there room for threading improvements at all when it comes to NodeJS? I am fine with digging into this performance issue, I just want to make sure it is possible.

High CPU usage under imac 5K

Hi,
thank you for your great work. I would like to know if this is some kind of bug or something I have done wrong (most probable).
I have a video, taken from a Raspberry Pi 2, in this way:
raspivid -t 0 -w 1280 -h 720 -o - | nc -k -l 2222
then from my iMac I run ffmpeg:
ffmpeg -an -f h264 -vcodec h264 -i tcp://raspberry_ip:2222 -f mpeg1video -s 430x240 -pass 1 -coder 0 -bf 0 -wpredp 0 -an http://127.0.0.1:8083/secret/430/240

And I am able to see the video in an HTML page (using node-webkit) with canvas and WebSocket. The problem is CPU usage: it is 20% most of the time, but at some moments it goes up to 120-130% (making my app almost frozen).
I am trying to use WebGL, but it looks like I am not able to "activate" WebGL under node-webkit...

Is there a way to obtain lower CPU usage? Some "tweak" of jsmpeg?

Cache function

How can I add cache function, which can cache each frame state (webgl )?

duration

Hi,

please tell me how I can get video duration?

Regards,
Sergey

How to manipulate the size of the output?

Hi, thank you for the great library 👍
But one big question: How can I change the size of the output video (video canvas)?
I looked into jsmpg.js and I think that the canvas always applies the video width (line 515/516).

Thanks in advance,
Tilman

Does work in iOS?

I have tried several demos on several iOS devices (iPhone 5, iPad 1), tested in Safari and Chrome, and in all cases the image does not show up; it only shows "loading...".
It's as if the canvas does not get painted, because I can see on my own server that the WebSockets are connecting fine and the video is streaming (at least for a while).

Any ideas?
TIA

Skewed/delayed video blocks error

I'm trying to stream live video from a Raspberry Pi using JSMPEG, but the following input video is giving me the following broken output:

Input sample and output sample: (images attached to the original issue)

I'm using the example files/code unmodified, and all the correct ports/configs/etc. The socket itself seems to be working since the video does connect and stream; it's just indiscernible because of the weird blocks/codec issue pictured above.

Any suggestions on how I can fix this?

Here's the code I'm calling to stream:

raspivid -t 0 -w 360 -h 240 -o - | avconv -i - -f mpeg1video http://127.0.0.1:8082/raspberry/320/240/

Consider using mediump precision in the WebGL fragment shaders

I've just reported this Chrome 39 issue, but it occurred to me that shader compilation could fail on mobile devices with GPUs that don't support highp precision, which is an optional part of the OpenGL ES 2.0 spec.

I tested using mediump precision on the Big Buck Bunny demo and saw no noticeable drop in performance or quality. Ilmari Heikkinen of Google Developer Relations suggests that using highp can even negatively impact performance. MDN's WebGL best practices article also suggests avoiding highp.

Alternatively, we could check highp support with getShaderPrecisionSupport and gracefully degrade to mediump otherwise.

Can't work on Mac OS X 10.10

I use

ffmpeg -f avfoundation -video_device_index 0 -i "default" -f mpeg1video -b 800k -r 30 http://localhost:8082/111111/640/480/

( install ffmpeg via brew)

The view in the browser (Chrome): (screenshot attached to the original issue)

Streaming to IE 11

According to the blog post, the library should also play on IE10, but it doesn't seem to work even on IE11; Chrome and Firefox work perfectly. Am I missing something, or does the library not support IE anymore?

can't play in ios7

hi, sorry, my English is poor. I tested it on iOS 7; however, it can't play. I also tested it in the Safari browser on Windows, and it still can't play. Help me.

video playing slow down

Hi,
on mobile devices, the video playing duration is ~10% longer than the real video duration. So I am trying to sync audio with video, but the video slows down and does not play synchronously with the audio.
I tested on an iPod 4g, Motorola Moto G (Android 4.4.3), Samsung S3, Nexus 5 and some more devices.

Please tell me how to fix this issue.

memory leak?

first of all - this is totally insane (in a positive way).
I was able to set up the streaming server-client in no time, thanks for that.

The only problem I'm facing is that the browser (Chrome) has a problem recovering the stream after coming back from another tab.
The framerate drops to - let's say - 0, and it takes a while to recover.
While the tab has focus, the memory usage rises to around 2.5MB and drops every now and then to a base value, to build up again.
But out of focus, the memory usage rises much higher and stays at a higher level even after a drop:

(screenshot attached to the original issue)

Having the tab out of focus for a long time (~1 minute in my case) makes it completely unresponsive...
Is there something I can do to avoid this (except keeping the tab in focus :)

Enable Audio in jsmpeg?

Hey @phoboslab, I was wondering if audio is enabled in your awesome library? Is it possible to play it from the file without causing any video playback decoding issues, or must it be a separate audio layer played/synced with an HTML5 audio element?

Also, what do you think about audio scrubbing? Possible?

Audio ws streaming

Not sure if the right place to post this question but will give it a shot.

Do you think it's possible to stream audio live from an RTMP source using the same toolbox - FFMPEG/WS ?

What encoding should FFMPEG use?
How about the client-side decoding, how much can HTML5 <audio> API help?

Stream blur

Hey, 2 questions here:

  1. Is there an efficient way to blur the incoming images to be drawn on the canvas? Currently I'm doing this, but it's very CPU expensive:

player = new jsmpeg(client, {
    canvas: canvas,
    ondecodeframe: function() {
        // from http://www.quasimondo.com/BoxBlurForCanvas/FastBlurDemo.html
        boxBlurCanvasRGB("my-canvas", 0, 0, width, height, 18, 1);
    }
});

  2. Is there a way to control the incoming frame rate? As in, only draw one out of 5 incoming images?

Loading video stream via some event, NOT on page load doesn't work

I've been trying to have the video stream start only after a user clicks a button (or some other event), but I am finding that if jsmpeg is not invoked on page load, it won't start.

For example, you can do a simple test...this works fine (as in the example):

var player = new jsmpeg(client, {canvas:canvas});

But if you delay it in any way, like below, it will not start.

var player;
setTimeout(function() {
	player = new jsmpeg(client, {canvas: canvas});
}, 1);

Is there a way to have it start on demand and NOT just when the page loads?

Thanks!

Separate Canvas from Decoding

Separating the Canvas/Rendering from decoding could allow us to run the decoding in a web worker and use transferable objects for communication.

Channels support

Hi, thumbs up for the great work. I've been using an "artificial" solution before (base64 images sent through ws) and I can tell it doesn't achieve the same framerate as this. This solution is really easy to use; however, I want to use the stream server for multiple "broadcasters", e.g. to publish multiple streams to routes like below:

http://127.0.0.:80/<channel>/<secret>/<width>/<height>

I have already implemented this in my fork, is it fine with you if I submit a PR?

how to reduce latency?

Hi,

First, this is an amazing piece of work. I've been working on streaming video from a raspberry pi to a broadcast server and this is the best (and easiest) solution I've found yet. Thanks for making this available!

This is sort of a general question... when you built this, was your priority to stream every frame or to reduce latency? I generally get a latency of 1 second or less which is excellent considering the stream is going from the Pi to my AWS server and back down to a browser. But sometimes the delay becomes >15 seconds which i'm guessing is from network traffic. Is there any change I could make to always stream the most recent frame? Maybe there's an FFmpeg option or a tweak to the node code?

Any guidance would be much appreciated!
Thanks

Stream to Kodi Player

Hi Dominic,

Apologies if this is the wrong place to ask this but I was wondering something. I use the "Instant Webcam" app (which is simple and works a treat) on my iPhone and manage to stream to Firefox without any problems. I was wondering if there is a way to stream the video to my Kodi player? I have tried various options but nothing has worked so far. I'm guessing that since the app is using jsmpeg and that Kodi isn't a browser, then that's where the problem lies.

Any ideas?

Thanks and again, apologies if this is the wrong place to ask.
IB

Understanding...

Dear developer!

On your blog, the mpeg file is accessed directly, not over a WebSocket. So why does a WebSocket server need to run on the provider's server if it's not used?

Possible for jsmpeg to record from Camera?

Hey,
I was wondering if it is possible to capture video from a camera and save it as mpg using the jsmpeg library? (Not streaming.)
The current state of solutions is RecordRTC, which uses Whammy, but maybe it is possible to encode using jsmpeg?

License

What license is this under?

Edit: (Please include a LICENSE file in the repository)

Benchmark

Hi,

I was just wondering how the benchmark works.

In the console log i see:

Average time per frame: 1.8477333330035133 ms
jsmpg.js:473 Average time per frame: 1.6367083332928207 ms
jsmpg.js:473 Average time per frame: 1.360716666385997 ms
jsmpg.js:473 Average time per frame: 1.4053333334838196 ms
jsmpg.js:473 Average time per frame: 1.3767333335029737 ms
jsmpg.js:473 Average time per frame: 1.2611666671242954 ms
jsmpg.js:473 Average time per frame: 1.4227166670025326 ms
jsmpg.js:473 Average time per frame: 1.420450000659912 ms

Is it better with large or small values?

Regards

Anders

Licence

Just wondering what licence this code is released under?

Seeking / Scrubbing

I'm attempting to add a seek function to this. I've been attempting to manipulate the buffer index to seek, but haven't had much luck. Not sure I'm fully wrapping my head around the frame decoding aspect of the BitArray. Any chance you'll be adding a seek function? Preferably to a given time in the video?

recording from the websocket stream

Hello. I've got a simple node.js script to connect to the stream server and write the video data to a file. I cannot get ffmpeg to understand the file contents nor get vlc to play the video. Can you suggest how I might archive the stream? The startRecording() method is not an option because this is server-side (aka not a browser).

Thanks

Support MPEG4/h264

Hello,

Thanks for this great lib. Is the MPEG4/h264 support planned ?

Thanks

WebAudio API Sync

I'm trying to create a way of syncing to a separate audio file running in Web Audio API. Thinking I could use the timebase of the audio to manually progress the buffer index. Any tips on pulling this off? Note: I'm using the AJAX / HTTP download version, not streaming.

Add snapshot function to player api

Currently it is possible to start and stop recording a video from the live websocket transmission. How hard would it be to add a snapshot function so one could grab a single image frame instead of a video file ? With some simple guidance I could gladly send a PR.

Question on usage for streaming from a webcam

Can JSMPEG work for streaming my webcam if I spun up a remote VPS server (i.e. Digital Ocean)?

I'm a little confused about it's application and I assume that the example that you give is for doing it on a local machine.

I guess my question is: How would I send the webcam connection /dev/video0 or /dev/video1 to the remote server? Is there anything different I would need to do from your tutorial: http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets ?

About your App Webcam

Hi @phoboslab, I have two questions about the webcam app:
1. Which HTTP server framework do you use in Webcam?
2. If you wrote the HTTP server module yourself, could you open-source it?

thanks

Streaming video from IP camera?

Hey, I'm new to this and sorry if my question is stupid, but can you tell me whether I can use jsmpeg to stream video from my IP cam? Do I need a server, or what? Can you give me the steps that I need to follow?

Certain mpeg artifacts

The library works great. The lack of latency is insane.

I do get certain artifacts when rendering the mpeg stream, like this:

(screenshot attached to the original issue)

Any idea on how I can get rid of those?
My avconv/ffmpeg options are very simple:

avconv -f x11grab -r 25 -s 1280x720 -i :0.0 -f mpeg1video -b 3000k pipe:1

And yes, it happens at any resolution, at any bitrate. And also when transcoding other video files, not just x11grab.

Btw: this might just be a proof-of-concept piece of code, but when latency really matters I don't see an alternative.

I've tried setting up MediaSource yesterday, and got it to work eventually. But, as your blog post said, it only starts playing 3-5 seconds after receiving the first frame.

Get current frame canvas

Hey @phoboslab

I was wondering if it's possible to get the current frame from the canvas without doing nextFrame()?
I want to grab the current frame, but every time I want to get it I have to do something like this:

video_player.seekToFrame(timeline_frame - 1, true);
frame_img = video_player.nextFrame();
frame_img.toDataURL('image/webp', 0.7);

I have to go back one frame, do nextFrame() and then get the current frame.
Is it possible to add a getCurrentFrame()-like function?

thanks!

Cool library, but I like jpg (smaller). Any ideas on what library to use, or can I take some cool parts from this one?

I'm working on a project where I store live TV frames and run some vision processing on them, stored as jpg because of space concerns (we are talking millions to billions of daily TV frames).
I saw the demo of this library and it was relatively smooth. I was thinking of adapting this library to play jpg instead of mpeg, which would involve stripping all the mpeg decoding stuff and just leaving the buffer and some GL stuff (I'm assuming). Does anyone have any idea of an existing library that just handles jpg WebSocket streaming?

Thanks

Support MPEG2?

Could this library decode MPEG2 streaming video? I need better quality, maybe 720p.

Last frame missing

While decoding the raw mpeg1 video on the client side, the last frame is always missing. Actually, it's buffered by the JavaScript code and only displayed when you send a new frame from the server. Is this how the algorithm works, or is there some kind of issue with the code that can be resolved?

Rendering should be frame rate independent

Hey, wonderful library. I just wanted to raise the fact that the rendering is not frame rate independent when the device renders slower than the video frame rate,
e.g. on low-performing mobile devices (tested on an iPhone 4).

What happens: the video renders in slow motion.
What I would expect: frames that have passed should be dropped; this could be done fairly easily (I think) with delta tracking and adjusting the buffer index.

I will have a bash at getting this working; I'm creating this issue in case anyone has any other ideas.
Thanks

MPEG2 support?

Hello,

Is it possible to implement MPEG2 support? I do not know how different MPEG2 is from MPEG1, however. It might be an entirely different project.

Thanks for your great work!

video has wrong scaling on iphone6

I have the jsmpeg video playing in a canvas whose CSS width/height is set to 100% (of the containing box).
jsmpeg then automatically sets the dimensions of the video (1280x720 in this case) into the
width/height attributes of the canvas.

BUT:
on iPhones the video is somehow scaled up! I only see a detail of the actual video.
A little fix was setting the width/height attributes of the canvas again to those values,
BUT
that does not work on the iPhone 6!

Any suggestions are well appreciated!
thanks.
