ustreamer's Introduction

µStreamer

µStreamer is a lightweight and very quick server to stream MJPEG video from any V4L2 device over the network. All modern browsers support this video format natively, as do most video players such as mplayer, VLC, etc. µStreamer is part of the PiKVM project, designed to stream VGA and HDMI screencast hardware data with the highest possible resolution and FPS.

µStreamer is very similar to mjpg-streamer with its input_uvc.so and output_http.so plugins; however, there are some major differences. The key ones are:

| Feature | µStreamer | mjpg-streamer |
| --- | --- | --- |
| Multithreaded JPEG encoding | ✔ | ✘ |
| Hardware image encoding on Raspberry Pi | ✔ | ✘ |
| Behavior when the device is disconnected while streaming | ✔ Shows a black screen with NO SIGNAL on it until reconnected | ✘ Stops the streaming ¹ |
| DV-timings support - the ability to change resolution on the fly by source signal | ✔ | ☹ Partially yes ¹ |
| Option to skip frames when streaming static images by HTTP to save the traffic | ✔ ² | ✘ |
| Streaming via UNIX domain socket | ✔ | ✘ |
| Systemd socket activation | ✔ | ✘ |
| Debug logs without recompiling, performance statistics log, access to HTTP streaming parameters | ✔ | ✘ |
| Option to serve files with a built-in HTTP server | ✔ | ☹ Regular files only |
| Signaling about the stream state on GPIO using libgpiod | ✔ | ✘ |
| Access to webcam controls (focus, servos) and settings such as brightness via HTTP | ✘ | ✔ |
| Compatibility with mjpg-streamer's API | ✔ | :) |

Footnotes:

  • 1 Long before µStreamer, I made a patch to add DV-timings support to mjpg-streamer and to keep it from hanging up on device disconnection. Alas, the patch is far from perfect and I can't guarantee it will work every time - mjpg-streamer's source code is very complicated and its structure is hard to understand. With this in mind, along with needing multithreading and JPEG hardware acceleration in the future, I decided to make my own stream server from scratch instead of supporting legacy code.

  • 2 This feature allows cutting down outgoing traffic several-fold when streaming HDMI, but it increases CPU usage a little bit. The idea is that HDMI is a fully digital interface and each captured frame can be byte-for-byte identical to the previous one. There's no need to stream the same image over the net several times a second. With the --drop-same-frames=20 option enabled, µStreamer will drop all the matching frames (with a limit of 20 in a row). Each new frame is compared with the previous one first by length, then using memcmp().


TL;DR

If you're going to live-stream from your backyard webcam and need to control it, use mjpg-streamer. If you need a high-quality image with high FPS - µStreamer for the win.


Installation

Building

You need to download µStreamer onto your system and build it from source.

Preconditions

You'll need make, gcc, libevent with pthreads support, libjpeg9/libjpeg-turbo and libbsd (only for Linux).

  • Arch: sudo pacman -S libevent libjpeg-turbo libutil-linux libbsd.
  • Raspberry OS Bullseye: sudo apt install libevent-dev libjpeg62-turbo libbsd-dev. Add libgpiod-dev for WITH_GPIO=1 and libsystemd-dev for WITH_SYSTEMD=1 and libasound2-dev libspeex-dev libspeexdsp-dev libopus-dev for WITH_JANUS=1.
  • Raspberry OS Bookworm: same as previous, but replace libjpeg62-turbo with libjpeg62-turbo-dev.
  • Debian/Ubuntu: sudo apt install build-essential libevent-dev libjpeg-dev libbsd-dev.
  • Alpine: sudo apk add libevent-dev libbsd-dev libjpeg-turbo-dev musl-dev. Build with WITH_PTHREAD_NP=0.

To enable GPIO support, install libgpiod and pass the option WITH_GPIO=1. If the compiler reports a missing function pthread_get_name_np() (or similar), add the option WITH_PTHREAD_NP=0 (it's enabled by default). For a similar error with setproctitle(), add the option WITH_SETPROCTITLE=0.
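
For example, a build with GPIO and systemd support enabled (a hedged example; it assumes the corresponding -dev packages listed above are installed) could look like this:

$ make WITH_GPIO=1 WITH_SYSTEMD=1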

Make

The most convenient process is to clone the µStreamer Git repository onto your system. If you don't have Git installed and don't want to install it either, you can download and unzip the sources from GitHub using wget https://github.com/pikvm/ustreamer/archive/refs/heads/master.zip.

$ git clone --depth=1 https://github.com/pikvm/ustreamer
$ cd ustreamer
$ make
$ ./ustreamer --help

Update

Assuming you have a µStreamer clone as discussed above, you can update it as follows.

$ cd ustreamer
$ git pull
$ make clean
$ make

Usage

For M2M hardware encoding on the Raspberry Pi, you need at least kernel 5.15.32. OpenMAX and MMAL support on older kernels is deprecated and has been removed.

Without arguments, ustreamer will try to open /dev/video0 with 640x480 resolution and start streaming on http://127.0.0.1:8080. You can override this behavior using parameters --device, --host and --port. For example, to stream to the world, run:

# ./ustreamer --device=/dev/video1 --host=0.0.0.0 --port=80

❗ Please note that since µStreamer v2.0, cross-domain requests have been disabled by default for security reasons. To enable the old behavior, use the option --allow-origin=\*.
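
For example, to combine the public binding shown above with the old cross-domain behavior, one could run (a sketch using only the options mentioned in this section):

# ./ustreamer --device=/dev/video1 --host=0.0.0.0 --port=80 --allow-origin=\*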

The recommended way of running µStreamer with a TC358743-based capture device on a Raspberry Pi:

$ ./ustreamer \
    --format=uyvy \ # Device input format
    --encoder=m2m-image \ # Hardware encoding on V4L2 M2M driver
    --workers=3 \ # Workers number
    --persistent \ # Suppress repetitive signal source errors (for example when HDMI cable was disconnected)
    --dv-timings \ # Use DV-timings
    --drop-same-frames=30 # Save the traffic

❗ Please note that to use --drop-same-frames for different browsers you need to use some specific URL /stream parameters (see URL / for details).

You can always view the full list of options with ustreamer --help.


Docker (Raspberry Pi 4 HDMI)

Preparations

Add the following lines to /boot/firmware/usercfg.txt:

gpu_mem=128
dtoverlay=tc358743

Check the size of the CMA:

$ dmesg | grep cma-reserved
[    0.000000] Memory: 7700524K/8244224K available (11772K kernel code, 1278K rwdata, 4320K rodata, 4096K init, 1077K bss, 281556K reserved, 262144K cma-reserved)

If it is smaller than 128M, add the following to /boot/firmware/cmdline.txt:

cma=128M

Save changes and reboot.
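
After the reboot, you can confirm that the capture device is visible before starting the container; a quick check using the v4l-utils package described in the tips section below:

$ v4l2-ctl --list-devices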

Launch

Start container:

$ docker run --device /dev/video0:/dev/video0 -e EDID=1 -p 8080:8080 pikvm/ustreamer:latest

Then access the web interface at port 8080 (e.g. http://raspberrypi.local:8080).

Custom config

$ docker run --rm pikvm/ustreamer:latest \
    --format=uyvy \
    --workers=3 \
    --persistent \
    --dv-timings \
    --drop-same-frames=30

EDID

Add -e EDID=1 to set the HDMI EDID before starting ustreamer. Use it together with -e EDID_HEX=xx to specify custom EDID data.
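
A hedged example combining both variables (the EDID hex string below is only a placeholder for your own data):

$ docker run --device /dev/video0:/dev/video0 -e EDID=1 -e EDID_HEX=<your-edid-hex> -p 8080:8080 pikvm/ustreamer:latest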


Raspberry Pi Camera Example

Example usage for the Raspberry Pi v3 camera (requires libcamerify, which is provided by the libcamera-tools and libcamera-v4l2 packages on Raspbian; install both):

$ sudo modprobe bcm2835-v4l2
$ libcamerify ./ustreamer --host :: --encoder=m2m-image

For the v2 camera you can use the same trick with libcamerify, but enable legacy camera mode in raspi-config.

Example usage for the Raspberry Pi v1 camera:

$ sudo modprobe bcm2835-v4l2
$ ./ustreamer --host :: -m jpeg --device-timeout=5 --buffers=3 -r 2592x1944

❗ Please note that newer camera models have a different maximum resolution. You can see the supported resolutions in the PiCamera documentation.

❗ If you get a poor framerate, it could be that the camera is switched to photo mode, which produces a low framerate (but a higher quality picture). This is because bcm2835-v4l2 switches to photo mode at resolutions higher than 1280x720. To work around this, pass the max_video_width and max_video_height module parameters like so:

$ modprobe bcm2835-v4l2 max_video_width=2592 max_video_height=1944
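
To make these parameters persistent across reboots, one standard option (plain modprobe configuration, not specific to µStreamer) is to put them into a file under /etc/modprobe.d, for example:

# /etc/modprobe.d/bcm2835-v4l2.conf (hypothetical filename)
options bcm2835-v4l2 max_video_width=2592 max_video_height=1944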

Integrations

Janus

µStreamer supports bandwidth-efficient streaming using H.264 compression and the Janus WebRTC server. See the Janus integration guide for full details.

Nginx

When uStreamer is behind an Nginx proxy, Nginx's buffering behavior introduces latency into the video stream. It's possible to disable this buffering to eliminate the additional latency:

location /stream {
    postpone_output 0;
    proxy_buffering off;
    proxy_ignore_headers X-Accel-Buffering;
    proxy_pass http://ustreamer;
}
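
The proxy_pass directive above refers to an upstream named ustreamer, which is not shown; a minimal sketch, assuming µStreamer is listening on its default address and port:

upstream ustreamer {
    server 127.0.0.1:8080;
}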

Tips & tricks for v4l2

The v4l2 utilities provide tools to manage USB webcam settings and query device information. Scripts can be used to make adjustments and can be run manually or via cron, for example to change the exposure settings at certain times of day (a sketch of such a script follows the list below). The package is available in all Linux distributions and is usually called v4l-utils.

  • List of available video devices: v4l2-ctl --list-devices.
  • List available control settings: v4l2-ctl -d /dev/video0 --list-ctrls.
  • List available video formats: v4l2-ctl -d /dev/video0 --list-formats-ext.
  • Read the current setting: v4l2-ctl -d /dev/video0 --get-ctrl=exposure_auto.
  • Change the setting value: v4l2-ctl -d /dev/video0 --set-ctrl=exposure_auto=1.

Here you can find more examples. Documentation is available in man v4l2-ctl.
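
As an illustration of the cron idea mentioned above, here is a minimal sketch of such a script; the control names and values are examples only and differ between cameras, so check them with --list-ctrls first:

#!/bin/sh
# evening-exposure.sh (hypothetical name): called from cron, e.g.
#   0 20 * * * /usr/local/bin/evening-exposure.sh
# Switch to manual exposure and dim the picture for the night.
v4l2-ctl -d /dev/video0 --set-ctrl=exposure_auto=1
v4l2-ctl -d /dev/video0 --set-ctrl=brightness=96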


See also


License

Copyright (C) 2018-2024 by Maxim Devaev [email protected]

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see https://www.gnu.org/licenses/.

ustreamer's People

Contributors

aggienick02, amiablepointers, b-rad15, binilj04, chraac, drachenkaetzchen, emaste, fallingsnow, ffontaine, fphammerle, goblinqueen, hugs, jotaen4tinypilot, jpalus, jtrmal, kkkon, lennie, marcelstoer, mdevaev, mkuf, mtlynch, netbr, pascalhonegger, randolf, reedy, russdill, tallman5, theacodes, thomergil, tomaszduda23

ustreamer's Issues

NDI Output Support

Would be great to include NDI output support for ultra low latency video output.

Stream stops showing an image / capturing after a while when connected to a video-capture dongle

I'm using one of the video-capture dongles (similar to the one used/shipped by pikvm tinypilot) to capture the HDMI output of a device that is running Ubuntu Server, but after a while the ustreamer /stream endpoint shows a blank/black screen and never shows anything anymore. It only starts working again after I re-plug the HDMI input from the video-capture dongle.

NB the linux console turns off the screen after a while, and it also turns it on when I press a keyboard key, but that did not make the video streaming work again, hence, this cry for help issue.

Is the dongle in "sleep" mode? Maybe there is something that needs to be done to "wake" it up again from the V4L2 side?

Have you/@mtlynch seen this with the dongles shipped with pikvm?

How does the --static work?

I have added --static and don't see any reference to it in the logging, or any use of the directory.

nohup ./ustreamer --process-name-prefix ustreamer --log-level 3 --device /dev/video2 --device-timeout=8 --format jpeg --resolution 880x497 --host=0.0.0.0 --port=3080 --static=/var/www/html/ustreamer/ >video2.log &
-- INFO  [76378.461      main] -- Installing SIGINT handler ...
-- INFO  [76378.461      main] -- Installing SIGTERM handler ...
-- INFO  [76378.461      main] -- Ignoring SIGPIPE ...
-- DEBUG [76378.461      main] -- Increasing picture 0x0x7d1b70 buffer: 0 -> 13845 (+13845)
-- INFO  [76378.461      main] -- Using internal blank placeholder
-- DEBUG [76378.461      main] -- Increasing picture 0x0x7d14a8 buffer: 0 -> 13845 (+13845)
-- DEBUG [76378.461      main] -- Binding HTTP to [0.0.0.0]:3080 ...
-- INFO  [76378.462      main] -- Listening HTTP on [0.0.0.0]:3080
-- INFO  [76378.462    stream] -- Using V4L2 device: /dev/video2
-- INFO  [76378.462    stream] -- Using desired FPS: 0
-- DEBUG [76378.462    stream] -- _stream_init_loop: stream->proc->stop=0
================================================================================
-- INFO  [76378.463      http] -- Starting HTTP eventloop ...
-- DEBUG [76378.485      http] -- Refreshing HTTP exposed (BLANK) ...
-- PERF  [76378.485      http] -- HTTP: Dropped same frame (BLANK) number 0
-- DEBUG [76378.501      http] -- Refreshing HTTP exposed (BLANK) ...
-- PERF  [76378.501      http] -- HTTP: Dropped same frame (BLANK) number 1
-- DEBUG [76378.510      http] -- Refreshing HTTP exposed (BLANK) ...
-- PERF  [76378.511      http] -- HTTP: Dropped same frame (BLANK) number 2

Stream crashing

Hi,

I had the stream working all day, and tonight it crashed. I tried to stop and start it again and now it still crashes (in a loop):

================================================================================
-- INFO  [124679.232 tid=1019] -- Device fd=8 opened
-- INFO  [124679.232 tid=1019] -- Using input channel: 0
-- INFO  [124679.232 tid=1019] -- Using TV standard: DEFAULT
-- INFO  [124679.291 tid=1019] -- Using resolution: 1280x720
-- INFO  [124679.291 tid=1019] -- Using pixelformat: JPEG
-- INFO  [124679.348 tid=1019] -- Using HW FPS: 30 -> 60 (coerced)
-- INFO  [124679.358 tid=1019] -- Requested 5 HW buffers, got 5
-- INFO  [124679.395 tid=1019] -- Capturing started
-- INFO  [124679.395 tid=1019] -- Switching to HW encoder because the input format is (M)JPEG
-- ERROR [124679.395 tid=1019] -- Can't query HW encoder params and set quality (unsupported)
-- INFO  [124679.395 tid=1019] -- Using JPEG quality: encoder default
-- INFO  [124679.395 tid=1019] -- Creating pool with 1 workers ...
-- INFO  [124679.396 tid=1019] -- Capturing ...
-- ERROR [124680.397 tid=1019] -- Mainloop select() timeout
-- INFO  [124680.397 tid=1019] -- Destroying workers pool ...
-- INFO  [124680.409 tid=1019] -- Capturing stopped
-- INFO  [124680.415 tid=1019] -- Device fd=8 closed

Also some logs from kern.log :

Sep  6 20:46:52 pi1 kernel: [124048.471629] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:46:53 pi1 kernel: [124049.472968] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:46:54 pi1 kernel: [124050.474253] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:46:55 pi1 kernel: [124051.475526] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:46:56 pi1 kernel: [124052.476797] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:46:57 pi1 kernel: [124053.478214] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:46:58 pi1 kernel: [124054.479344] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:46:59 pi1 kernel: [124055.480746] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:47:00 pi1 kernel: [124056.482118] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).
Sep  6 20:47:01 pi1 kernel: [124057.483476] uvcvideo: Failed to set UVC probe control : -71 (exp. 26).

The command I used :
ustreamer --device=/dev/video0 -r 1280x720 -f 30 -m JPEG --host 0.0.0.0 --port 8888 2>&1 | tee /home/pi/scripts/ustreamer-live.log &

I'm pretty sure it will work again if I reboot the RPi, but do you have a way to "reset" the camera or something without rebooting?

Official Docker images

Given the popularity of Docker, it would be really nice if you could simply run ustreamer with a one-line docker command without installing anything. Here are a few points that would have to be addressed in order to achieve that goal:

  • Create Dockerfile / .dockerignore
  • Ensure Dockerfile supports multiple architectures (amd64, armv7, ...)
  • Automatically build and tag new versions, for example using GitHub Actions and buildx
  • Add tagged build with Raspberry Pi OMX / GPIO support
  • Update documentation

OMX encoder unkillably hangs process after subsequent restarts of ustreamer

I'm experiencing a weird error when I use --encoder=omx. ustreamer hangs and only produces a 'NO SIGNAL' screen. It doesn't stop on CTRL-C; I can only kill it using SIGKILL. Using the CPU encoder works, sort of (see my other ticket soon). The error only comes up after killing ustreamer after the first run; only subsequent starts trigger it, and the software encoder keeps working meanwhile. Resetting the USB capture device somehow solves the issue and ustreamer can run again. The capture device itself works, verified using the following simple streaming command:

ffmpeg -f v4l2 -r 30 -video_size 1280x720 -i /dev/video0 -c:v mjpeg -qmin 1 -q:v 8 -f nut tcp://10.11.12.101:1234

On the other end, mpv plays the stream fine.

Setup: Pi 4 4 GB, kernel 5.4.51-v7l+ #1327, Raspbian 10.4

Command line:

./ustreamer --device=/dev/video0 --format=YUYV --workers=4 --persistent --drop-same-frames=30 --host=0.0.0.0 --port=8080 --resolution=1280x720 --desired-fps=30 --encoder=omx

Version: 1.20, directly from git

Log of an unsuccessful run: https://pastebin.com/b97G4yTM

New install - no video: ERROR: Mainloop select() timeout

Greetings,

I created and installed a fresh Raspbian image, and followed your build instructions (note -- I also needed to install git).

Build seemed to go well. I built with WITH_OMX=1.

I've used this same hardware successfully with mjpg-streamer as well as fswebcam and motion. It is an ELP-cam. Bus 001 Device 004: ID 05a3:9520 ARC International Camera

I can reach the web page successfully, but there is no image. As shown in the issue title, Mainloop select() timeout is repeated.

I've created a gist with debug info here: https://gist.github.com/JohnOCFII/a2649d01e1e66027367e772361b1a3e5

I imagine I'm missing something simple.

Thanks for your help.

John

Installs an unstripped executable

====> Running Q/A tests (stage-qa)
Warning: 'bin/ustreamer' is not stripped consider trying INSTALL_TARGET=install-strip or using ${STRIP_CMD}

Please strip it!

error: field has incomplete type 'struct in_addr'

In file included from src/http/server.c:35:
/usr/include/netinet/ip.h:71:17: error: field has incomplete type 'struct in_addr'
        struct  in_addr ip_src,ip_dst;  /* source and dest address */
                        ^
/usr/include/netinet/ip.h:71:9: note: forward declaration of 'struct in_addr'
        struct  in_addr ip_src,ip_dst;  /* source and dest address */
                ^
/usr/include/netinet/ip.h:71:24: error: field has incomplete type 'struct in_addr'
        struct  in_addr ip_src,ip_dst;  /* source and dest address */
                               ^

OS: FreeBSD 12.1
clang-9

ustreamer process name in top should be ustreamer instead of main

The ustreamer process appear as main in top:

Tasks: 119 total,   1 running, 118 sleeping,   0 stopped,   0 zombie
%Cpu(s):  0.2 us,  0.3 sy,  0.0 ni, 99.5 id,  0.1 wa,  0.0 hi,  0.0 si,  0.0 st
MiB Mem :   3827.9 total,   2928.2 free,    112.4 used,    787.3 buff/cache
MiB Swap:    100.0 total,    100.0 free,      0.0 used.   3575.5 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM     TIME+ COMMAND                                                                                         
 4064 root      20   0  139032   5840   1060 S  14.6   0.1   0:03.87 main                                                                                            

I think it should appear as ustreamer to better identify it.

Please note that in ps it correctly shows as ustreamer:

ps axw | grep ustreamer
 4064 pts/0    Sl+    0:22 ustreamer: ustreamer --device=/dev/video0 --persistent --dv-timings --format=uyvy --encoder=omx --workers=3 --quality=85 --desired-fps=30 --drop-same-frames=30 --process-name-prefix=ustreamer --slowdown --host=0.0.0.0 --port=80

Streaming a V4L2 virtual device

Is there a way to stream using a V4L2 virtual device? At the moment I can only find 5 supported formats.

Here is what I am actually doing:
ffmpeg -i /dev/video0 -vf "format=yuv420p, crop=657:450:16:13" -f v4l2 /dev/video1
but the format seems to be a problem for uStreamer.

How to video-capture a device that changes resolutions at runtime without blurred images?

I'm using one of the video-capture dongles (similar to the one used/shipped by pikvm tinypilot) and set the capture resolution with ustreamer --resolution, but that resolution is not really the resolution the BIOS/OS ends up using, and as a result, the image I see in the ustreamer /stream endpoint is blurred.

When I match the ustream resolution with the OS resolution (as returned by fbset or xrandr), it works fine at the OS level, but not when it enters the BIOS, which uses yet another resolution... so I either have a blurred image at the BIOS or at the OS, which is a sub-optimal experience.

Do you know how to fix this? Maybe somehow query V4L2 (or the video-capture dongle) to learn the current resolution of the device and switch to it at runtime? Or force the video-capture dongle to always return a single supported resolution?

Are you/@mtlynch seeing this with the dongles shipped with pikvm tinypilot?

Second capture card causes: "Unable to start capturing: No space left on device"

I have two same capture cards and if I try to stream simultaneously from both cards I get this error:
-- ERROR [247.685 stream] -- Unable to start capturing: No space left on device

My code:
./ustreamer --device=/dev/video1 --host=0.0.0.0 --port=12346 -r 1920x1080 -m JPEG -f 30

And second one:
./ustreamer --device=/dev/video3 --host=0.0.0.0 --port=12345 -r 1920x1080 -m JPEG -f 30

SD card is 256GB so there should be enough space.

Tnx!

Wrong JPEG library version: library is 80, caller expects 62

I just installed ustreamer today on raspberry pi to address issues with mjpg-streamer and multiple cameras. Unfortunately, I get the error of the wrong jpeg library.

/ustreamer --device /dev/video2 --host=0.0.0.0 --port=3080
-- INFO  [7874.091      main] -- Installing SIGINT handler ...
-- INFO  [7874.092      main] -- Installing SIGTERM handler ...
-- INFO  [7874.092      main] -- Ignoring SIGPIPE ...
-- INFO  [7874.093      main] -- Using internal blank placeholder
-- INFO  [7874.094      main] -- Listening HTTP on [0.0.0.0]:3080
-- INFO  [7874.095    stream] -- Using V4L2 device: /dev/video2
-- INFO  [7874.095    stream] -- Using desired FPS: 0
================================================================================
-- INFO  [7874.095      http] -- Starting HTTP eventloop ...
-- INFO  [7874.306    stream] -- Device fd=8 opened
-- INFO  [7874.306    stream] -- Using input channel: 0
-- INFO  [7874.306    stream] -- Using TV standard: DEFAULT
-- INFO  [7874.314    stream] -- Using resolution: 640x480
-- INFO  [7874.314    stream] -- Using pixelformat: YUYV
-- INFO  [7874.321    stream] -- Using HW FPS: 0 -> 30 (coerced)
-- INFO  [7874.324    stream] -- Requested 5 HW buffers, got 5
-- INFO  [7874.330    stream] -- Capturing started
-- INFO  [7874.331    stream] -- Using JPEG quality: 80%
-- INFO  [7874.331    stream] -- Creating pool with 4 workers ...
-- INFO  [7874.331    stream] -- Capturing ...
Wrong JPEG library version: library is 80, caller expects 62

Ambiguous handling of MJPG source format

Hello!
I am one of probably many folks who have found uStreamer via your kind assistance with the mjpg-streamer select() timeout bug: jacksonliam/mjpg-streamer#182

In converting one of my camera servers to uStreamer for testing, I encountered some ambiguities with the way uStreamer handles MJPG sources (-m JPEG). Some may be bugs, and some may just be status output that could be clarified, but either way they cause some confusion for those migrating from mjpg-streamer.

In particular:

  1. uStreamer attempts to utilize the HW encoder regardless of the command line encoder parameters.
    if the "-c CPU" parameter is used, we still see the following output:
    -- INFO [60956.549 tid=3677] -- Switching to HW encoder because the input format is (M)JPEG
    -- ERROR [60956.552 tid=3677] -- Can't query HW encoder params and set quality (unsupported)
    The error indicates HW encoding is unsupported, but the user is unable to avoid this error, even by specifying that CPU encoding should be used, as uStreamer tries to use hardware encoding regardless.

  2. The JPEG quality parameter is either not functioning, or is silently ignored for MJPG sources.
    When using an MJPG source, specifying a JPG compression quality level with the "-q" parameter appears to have no effect. For example, "-q 100" and "-q 10" both produce identical output in the stream. The following output is also displayed at runtime, regardless of the "-q" value:
    -- INFO [60956.553 tid=3677] -- Using JPEG quality: encoder default
    Perhaps this is an indication that MJPG frames from the camera source are simply being passed straight-through without re-encoding? If so, I would suggest that the user is notified of this... and/or perhaps a future feature suggestion is the ability to honor this parameter for re-encoding of MJPG to MJPG, for those of us who could benefit from a quality change for bandwidth reasons.

  3. Slightly unrelated: In the --help output, the --encoder section lists the available options as:
    'Available: CPU, HW; default: CPU.',
    however your sample usage in README.md uses the line:
    '--encoder=omx \ # Hardware encoding with OpenMAX'.
    When specifying that hardware encoding should be used, is the correct option to use "HW" or "OMX"? Are they both accepted, and/or is there any difference between them?

Thanks in advance, I'm looking forward to seeing what uStreamer can do!

Feature Suggestion: Stream Fallback Source

I'll just throw it in here for discussion: Wouldn't it be a good benefit to have a fallback option for the streamer: if the USB device is disconnected or doesn't stream data anymore, detect the source disconnection/stream stall and stream either a JPG as a fallback "source" or loop an mjpg video instead...

Instant segfault after upgrading to 1.19

I'm seeing a segfault instantly after loading uStreamer after upgrading to ustreamer v1.19 from 1.1.8:

$ sudo ./ustreamer --log-level 3
-- INFO  [145.254      main] -- Installing SIGINT handler ...
-- INFO  [145.254      main] -- Installing SIGTERM handler ...
-- INFO  [145.254      main] -- Ignoring SIGPIPE ...
-- DEBUG [145.254      main] -- Increasing picture 0x0x3efe78 buffer: 0 -> 13845 (+13845)
-- INFO  [145.254      main] -- Using internal blank placeholder
-- DEBUG [145.254      main] -- Increasing picture 0x0x3ef380 buffer: 0 -> 13845 (+13845)
-- DEBUG [145.254      main] -- Binding HTTP to [127.0.0.1]:8080 ...
-- INFO  [145.255      main] -- Listening HTTP on [127.0.0.1]:8080
-- INFO  [145.255    stream] -- Using V4L2 device: /dev/video0
-- INFO  [145.255    stream] -- Using desired FPS: 4
-- DEBUG [145.255    stream] -- _stream_init_loop: stream->proc->stop=0
================================================================================
-- INFO  [145.255      http] -- Starting HTTP eventloop ...
-- DEBUG [145.387      http] -- Refreshing HTTP exposed (BLANK) ...
-- PERF  [145.387      http] -- HTTP: Dropped same frame (BLANK) number 0
-- INFO  [145.492    stream] -- Device fd=8 opened
-- DEBUG [145.492    stream] -- Calling ioctl(VIDIOC_QUERYCAP) ...
-- INFO  [145.492    stream] -- Using input channel: 0
-- INFO  [145.492    stream] -- Using TV standard: DEFAULT
-- DEBUG [145.492    stream] -- Calling ioctl(VIDIOC_S_FMT) ...
-- INFO  [145.493    stream] -- Using resolution: 640x480
-- INFO  [145.493    stream] -- Using pixelformat: YUYV
-- DEBUG [145.493    stream] -- Calling ioctl(VIDIOC_G_PARM) ...
-- INFO  [145.494    stream] -- Using HW FPS: 0 -> 30 (coerced)
-- INFO  [145.494    stream] -- Using IO method: MMAP
-- DEBUG [145.494    stream] -- Calling ioctl(VIDIOC_REQBUFS) for V4L2_MEMORY_MMAP ...
-- INFO  [145.499    stream] -- Requested 5 HW buffers, got 5
-- DEBUG [145.499    stream] -- Allocating HW buffers ...
-- DEBUG [145.499    stream] -- Calling ioctl(VIDIOC_QUERYBUF) for device buffer 0 ...
-- DEBUG [145.499    stream] -- Mapping device buffer 0 ...
-- DEBUG [145.500    stream] -- Calling ioctl(VIDIOC_QUERYBUF) for device buffer 1 ...
-- DEBUG [145.500    stream] -- Mapping device buffer 1 ...
-- DEBUG [145.501    stream] -- Calling ioctl(VIDIOC_QUERYBUF) for device buffer 2 ...
-- DEBUG [145.501    stream] -- Mapping device buffer 2 ...
-- DEBUG [145.501    stream] -- Calling ioctl(VIDIOC_QUERYBUF) for device buffer 3 ...
-- DEBUG [145.501    stream] -- Mapping device buffer 3 ...
-- DEBUG [145.502    stream] -- Calling ioctl(VIDIOC_QUERYBUF) for device buffer 4 ...
-- DEBUG [145.502    stream] -- Mapping device buffer 4 ...
-- DEBUG [145.502    stream] -- Calling ioctl(VIDIOC_QBUF) for buffer 0 ...
-- DEBUG [145.502    stream] -- Calling ioctl(VIDIOC_QBUF) for buffer 1 ...
-- DEBUG [145.502    stream] -- Calling ioctl(VIDIOC_QBUF) for buffer 2 ...
-- DEBUG [145.502    stream] -- Calling ioctl(VIDIOC_QBUF) for buffer 3 ...
-- DEBUG [145.502    stream] -- Calling ioctl(VIDIOC_QBUF) for buffer 4 ...
-- DEBUG [145.503    stream] -- Allocating picture buffers ...
-- DEBUG [145.503    stream] -- Pre-allocating picture buffer 0 sized 1228800 bytes...
-- DEBUG [145.503    stream] -- Increasing picture 0x0xb5a00768 buffer: 0 -> 1228800 (+1228800)
-- DEBUG [145.503    stream] -- Pre-allocating picture buffer 1 sized 1228800 bytes...
-- DEBUG [145.503    stream] -- Increasing picture 0x0xb5a007a0 buffer: 0 -> 1228800 (+1228800)
-- DEBUG [145.503    stream] -- Pre-allocating picture buffer 2 sized 1228800 bytes...
-- DEBUG [145.503    stream] -- Increasing picture 0x0xb5a007d8 buffer: 0 -> 1228800 (+1228800)
-- DEBUG [145.503    stream] -- Pre-allocating picture buffer 3 sized 1228800 bytes...
-- DEBUG [145.503    stream] -- Increasing picture 0x0xb5a00810 buffer: 0 -> 1228800 (+1228800)
-- DEBUG [145.503    stream] -- Pre-allocating picture buffer 4 sized 1228800 bytes...
-- DEBUG [145.503    stream] -- Increasing picture 0x0xb5a00848 buffer: 0 -> 1228800 (+1228800)
-- DEBUG [145.503    stream] -- Device fd=8 initialized
-- DEBUG [145.503    stream] -- Calling ioctl(VIDIOC_STREAMON) ...
-- INFO  [145.511    stream] -- Capturing started
Segmentation fault

I tried getting a backtrace with gdb but it isn't very useful:

(gdb) backtrace
#0  0x0002261c in encoder_prepare ()
#1  0x00030b94 in stream_loop ()
#2  0x00000000 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Is there a way to compile with debug symbols?

Environment

init.d and systemctl scripts

Has anyone put together scripts to start, restart, and stop the service so ustreamer could start at boot or easily managed?

Can't watch video streamed

Maybe the problem is me, but I can't figure out how to watch the streamed video.

I am streaming /dev/video0, which is an easyCAP dongle to capture analog video. Everything seems to be fine in the µStreamer console output, but I can't watch the stream in VLC, in ffplay, or in an HTML page with a video tag whose source is the address of the server.
I am trying this on my home LAN, so there shouldn't be problems related to the network.
What am I doing wrong?

vcos_semaphore_wait_timeout() and RTC adjusting

I noticed that when I use OMX encoding, uStreamer switches to CPU encoding on any OMX encoding failure:

Nov 12 15:00:35 tinypilot ustreamer[576]: -- INFO  [18.174    stream] -- Creating pool with 3 workers ...
Nov 12 15:00:35 tinypilot ustreamer[576]: -- INFO  [18.174    stream] -- Capturing ...
Nov 12 15:01:07 tinypilot ustreamer[576]: -- ERROR [36.695  worker-2] -- Can't wait VCOS semaphore: EAGAIN (timeout)
Nov 12 15:01:07 tinypilot ustreamer[576]: -- INFO  [36.695  worker-2] -- Error while compressing buffer, falling back to CPU
Nov 12 15:01:07 tinypilot ustreamer[576]: -- INFO  [36.717    stream] -- Destroying workers pool ...

Relevant code is here:

https://github.com/pikvm/ustreamer/blob/d9b91a1d5f9f7cad8aaf031e6ef5cda59c52a431/src/encoder.c#L218L239

The permanent switch to CPU surprised me, since EAGAIN seems like only a transient error. I tried patching the code to make the CPU encoding temporary (just for the current buffer), and it seems to work as expected.

Would you be open to making uStreamer wait longer before deciding to switch permanently to CPU-only encoding?

Proposed solutions

A. Only temporarily fall back to CPU

The simplest solution is to make the fallbacks always temporary. I put together a quick implementation of this solution:

https://github.com/pikvm/ustreamer/compare/master...mtlynch:only-fallback-temporarily?expand=1

B. Fall back permanently after N consecutive failed attempts

A more advanced version of (A) would be to maintain a count of successful buffer encodings and only fail over to CPU if there have been N consecutive failed encodings with OMX.

C. Fall back temporarily after EAGAIN, permanently after EINVAL and other errors

The other way to come at it would be to distinguish between fatal and non-fatal errors from the OMX encoder. We could treat EAGAIN as non-fatal (fall back to CPU just for this buffer) and errors like EINVAL as fatal (permanently switch over to CPU).

Video capture not supported by the device

Mjpg streamer is failing with a select timeout at random times and it's such a pain, because the camera is controlling a laser cutter. I tried ustreamer, and the output is as follows:

sudo ./ustreamer --host=0.0.0.0
-- INFO [6670.969 main] -- Installing SIGINT handler ...
-- INFO [6670.969 main] -- Installing SIGTERM handler ...
-- INFO [6670.969 main] -- Ignoring SIGPIPE ...
-- INFO [6670.969 main] -- Using internal blank placeholder
-- INFO [6670.970 main] -- Listening HTTP on [0.0.0.0]:8080
-- INFO [6670.970 stream] -- Using V4L2 device: /dev/video0
-- INFO [6670.970 stream] -- Using desired FPS: 0

-- INFO [6670.970 stream] -- Device fd=8 opened
-- ERROR [6670.970 stream] -- Video capture not supported by the device
-- INFO [6670.970 stream] -- Device fd=8 closed
-- INFO [6670.970 stream] -- Sleeping 1 seconds before new stream init ...
-- INFO [6670.971 http] -- Starting HTTP eventloop ...

Mjpg streamer starts fine with this:

mjpg_streamer -i "input_uvc.so -y -d /dev/video0 -r 640x480" -o "output_http.so -w ./www"
MJPG Streamer Version: git rev: 58e952383cbe973641a3ce6c6a738bafc1605337
i: Using V4L2 device.: /dev/video0
i: Desired Resolution: 640 x 480
i: Frames Per Second.: -1
i: Format............: YUYV
i: JPEG Quality......: 80
i: TV-Norm...........: DEFAULT
UVCIOC_CTRL_ADD - Error at Pan (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Tilt (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Pan Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Pan/tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Focus (absolute): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Pan (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Tilt (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Pan Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Pan/tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Focus (absolute): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at LED1 Mode: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at LED1 Frequency: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Disable video processing: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Raw bits per pixel: Inappropriate ioctl for device (25)
o: www-folder-path......: ./www/
o: HTTP TCP port........: 8080
o: HTTP Listen Address..: (null)
o: username:password....: disabled
o: commands.............: enabled

Hoping for help :)

Video stuttering problems

When I use ustreamer, the stream on the client is unbearably stuttering. It causes buffering in mpv, and generally the stream playback stops after every 5-6 seconds for a few seconds, then continues. The playback is showing the same problems in mpv and in browsers as well.

Setup: Pi 4 4 GB, kernel 5.4.51-v7l+ #1327, Raspbian 10.4

Command line:

./ustreamer --device=/dev/video0 --format=YUYV --workers=4 --persistent --drop-same-frames=30 --host=0.0.0.0 --port=8080 --resolution=1280x720 --desired-fps=30 --encoder=omx

Version: 1.20, directly from git

I can provide logs if necessary. Using ffmpeg with the same capture device provides perfectly smooth streams, no choppiness or buffering (see ffmpeg command line in #23).

Chrome memory leak when leaving the /stream endpoint open

When running in Chrome Version 84.0.4147.125 (Official Build) (64-bit) (Ubuntu 20.04 Desktop) the memory footprint of the /stream endpoint is always increasing. Here's a screenshot of it using 2.5GB of memory:


I'm running ustream as:

./ustreamer \
    --format jpeg \
    --resolution 1280x720 \
    --desired-fps 15 \
    --drop-same-frames 5 \
    --slowdown

It's working fine in Firefox 79. Never uses more than 500KB of memory.

Is there a way to rotate the image?

I've just stumbled upon this project from an issue within mjpg-streamer and am really liking the experience so far. I used the available docker image from beholder-rpa and was running in no time.
I was just wondering if there's a way to flop / rotate the image, as my camera is mounted upside down. Manually calling v4l2-ctl --set-ctrl vertical_flip=1 or v4l2-ctl --set-ctrl rotate=180 does fix my issue, but it would be nice if this was available as a handy command line argument. I was previously using the --hf / --vf flag from mjpg-streamer.

Unable to set resolution

This is running on Ubuntu 20.04 with the only issue being I cannot resize the resolution. The capture device I am using supports up to 1080p 30fps output. Do you have any suggestions I could try to have this function? Thank you!

Command line arguments used are as follows.

sudo ./ustreamer \
	--host=0.0.0.0 \
	--port=80 \
	--resolution 1280x720 \
	-f 30 \
	-q 100 \
        --workers=3 \ 
        --persistent \ 
        --drop-same-frames=30 \
	--device-timeout 5

Console log is as follows

-- INFO  [5079.799    stream] -- Device fd=8 opened
-- INFO  [5079.799    stream] -- Using input channel: 0
-- INFO  [5079.799    stream] -- Using TV standard: DEFAULT
-- INFO  [5079.818    stream] -- Using resolution: 1280x720
-- INFO  [5079.818    stream] -- Using pixelformat: YUYV
-- INFO  [5079.836    stream] -- Using HW FPS: 30 -> 10 (coerced)
-- INFO  [5079.836    stream] -- Using IO method: MMAP
-- INFO  [5079.837    stream] -- Requested 3 HW buffers, got 3
-- INFO  [5079.844    stream] -- Capturing started
-- INFO  [5079.844    stream] -- Using JPEG quality: 100%
-- INFO  [5079.844    stream] -- Creating pool with 3 workers ...
-- INFO  [5079.844    stream] -- Capturing ...
-- ERROR [5080.845    stream] -- Mainloop select() timeout
-- INFO  [5080.845    stream] -- Destroying workers pool ...
-- INFO  [5081.023    stream] -- Capturing stopped
-- INFO  [5081.024    stream] -- Device fd=8 closed

Command line arguments that work

sudo ./ustreamer \
	--host=0.0.0.0 \
	--port=80 \
	-f 30 \
	-q 100 \
        --workers=3 \ 
        --persistent \ 
        --drop-same-frames=30 \
	--device-timeout 5 \
	--debug

ustreamer won't start - Unable to set pixelformat=YUYV, resolution=640x480: Input/output error - constant looping.

So if I just start the pi, I can issue "./ustreamer --host=0.0.0.0" and it starts streaming at 640x480 res and everything is fine.
I don't know if I am stopping it correctly or not, but I hit CTRL-C to stop the stream, and after that, I am unable to restart the stream unless I reboot.

I keep getting this error: Unable to set pixelformat=YUYV, resolution=640x480: Input/output error.

I feel there is something that is not getting stopped or reset or something, but I don't have any idea what it might be.
Any info would be appreciated.

CentOS 7 make ustreamer error "'V4L2_EVENT_SOURCE_CHANGE' undeclared (first use in this function)"

Linux bogon 3.10.0-957.el7.x86_64 #1 SMP Thu Nov 8 23:39:32 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
[root@bogon ustreamer]# make
-- CC src/device.c
src/device.c: In function ‘device_consume_event’:
src/device.c:366:9: error: ‘V4L2_EVENT_SOURCE_CHANGE’ undeclared (first use in this function)
    case V4L2_EVENT_SOURCE_CHANGE:
         ^
src/device.c:366:9: note: each undeclared identifier is reported only once for each function it appears in
src/device.c: In function ‘_device_open_dv_timings’:
src/device.c:431:14: error: ‘V4L2_EVENT_SOURCE_CHANGE’ undeclared (first use in this function)
   sub.type = V4L2_EVENT_SOURCE_CHANGE;
              ^
make: *** [build/src/device.o] Error 1

Assert with v4l2loopback

Does anyone have any idea how to use uStreamer (patched if needed) for low-latency streaming of desktop (whole or at least a given window) into mjpeg?

Strict Access-Control-Allow-Origin by default

Apologies for anything I misunderstand as I've only recently learned a bit about this header. I noticed it is set to "*" in both _http_callback_snapshot and _http_callback_stream_write, and from what I understand it won't allow any bad cross-site stuff since PREPROCESS_REQUEST protects both code areas and authorization cookies are not sent cross-domain with CORS.

But just to try to understand what the purpose of it was in server.c, I commented both lines out and rebuilt, and everything seemed to still work just fine.

Was curious if there are cases where the header is needed or helpful?

Usage with Raspberry Pi camera?

Hi,

Any chance this can be made to work with the Raspberry Pi's camera? (I am using the V1.0). I have the bcm2835-v4l2 kernel modules installed (so there is a video0 device). Yet:

# ustreamer --host 0.0.0.0 --port 1234 -b 3 -f 60
-- INFO  [198.908   tid=230] -- Installing SIGINT handler ...
-- INFO  [198.909   tid=230] -- Installing SIGTERM handler ...
-- INFO  [198.909   tid=230] -- Ignoring SIGPIPE ...
-- INFO  [198.909   tid=230] -- Using internal blank placeholder
-- INFO  [198.914   tid=230] -- Listening HTTP on [0.0.0.0]:1234
-- INFO  [198.914   tid=233] -- Starting HTTP eventloop ...
-- INFO  [198.915   tid=232] -- Using V4L2 device: /dev/video0
-- INFO  [198.915   tid=232] -- Using desired FPS: 60
================================================================================
-- INFO  [198.915   tid=232] -- Device fd=8 opened
-- INFO  [198.916   tid=232] -- Using input channel: 0
-- INFO  [198.916   tid=232] -- Using TV standard: DEFAULT
-- INFO  [198.918   tid=232] -- Using resolution: 640x480
-- INFO  [198.918   tid=232] -- Using pixelformat: YUYV
-- INFO  [198.919   tid=232] -- Using HW FPS: 60
-- INFO  [198.919   tid=232] -- Using IO method: MMAP
-- INFO  [198.922   tid=232] -- Requested 3 HW buffers, got 3
-- ERROR [198.922   tid=232] -- Can't VIDIOC_QUERYBUF: Not a tty
-- INFO  [198.924   tid=232] -- Device fd=8 closed
-- INFO  [198.924   tid=232] -- Sleeping 1 seconds before new stream init ...
-- INFO  [199.590   tid=230] -- ===== Stopping by SIGINT =====
-- INFO  [199.591   tid=233] -- HTTP eventloop stopped
-- INFO  [199.925   tid=230] -- Bye-bye

This happens for every combination of frame rates, buffer counts I tried, and also with -I USERPTR. Any ideas on what I am missing, or is this simply not supported?

build fails on FreeBSD; SYS_gettid is Linux-specific

Attempting to build on FreeBSD I get:

-- CC src/device.c
In file included from src/device.c:43:
In file included from src/logging.h:37:
src/threading.h:105:67: error: use of undeclared identifier 'SYS_gettid'
  ...assert(snprintf(name, MAX_THREAD_NAME, "tid=%d", (pid_t)syscall(SYS_gett...
                                                                     ^

Worked around by replacing with thr_self(2):

diff --git a/src/threading.h b/src/threading.h
index 5cc759d..448b751 100644
--- a/src/threading.h
+++ b/src/threading.h
@@ -34,6 +34,7 @@
 #      if defined(__FreeBSD__) || defined(__OpenBSD__) || defined(__DragonFly__)
 #              include <pthread_np.h>
 #              include <sys/param.h>
+#              include <sys/thr.h>
 #      endif
 #endif
 
@@ -102,7 +103,10 @@ INLINE void thread_get_name(char *name) { // Always required for logging
 #      endif
        if (retval < 0) {
 #endif
-               assert(snprintf(name, MAX_THREAD_NAME, "tid=%d", (pid_t)syscall(SYS_gettid)) > 0);
+               long tid;
+
+               thr_self(&tid);
+               assert(snprintf(name, MAX_THREAD_NAME, "tid=%ld", tid) > 0);
 #ifdef WITH_PTHREAD_NP
        }
 #endif

If desired I'm happy to clean this up with more #ifdefery

Feature Request: Alternate options for "NO SIGNAL" condition

Hello!
Would it be possible to add some alternate options for the "blank" JPG configuration?
A few ideas:

  1. Allow the internal blank "NO SIGNAL" image to be generated at a resolution matching the regular stream output... ie, the same as what was specified with the "-r" parameter. Some stream recipients really struggle with resolution changes mid-stream, so having the "NO SIGNAL" page match in resolution would avoid tearing, buffer glitches, and other artifacts. (This might actually be nice as the default option, unless there was a benefit to using 640x480 as the default?)

  2. Add an option to hold (and re-send) the last valid image frame prior to the disconnect, instead of showing the "NO SIGNAL" image. This might alternately have a "NO SIGNAL" text overlay, which may be advantageous to have be in small text.
    The use case for this mode, would be for systems where a downstream consumer is doing motion analysis on the stream. In the case of a system which experiences frequent disconnects, such as the nuisance "select() timeout" condition, each camera disconnect and its full screen "NO SIGNAL" page will generate cascading nuisance motion triggers in the downstream system. Holding and retransmitting the last valid image would allow these intermittent video streams to bridge the camera disconnect gaps without triggering motion actions in the downstream systems. (And likewise, having a "NO SIGNAL" overlay in SMALL text on the frozen image may be sufficient to allow downstream viewers to be aware of this condition, yet not change so much of the video frame that it triggers a motion event.)

Thanks in advance

ov2710 error

Hello, i can't start the ov2710 camera. What am I doing wrong ? Thank you.

lsusb
ID 0c45:6368 Microdia

./ustreamer --device /dev/video0 -r 1920x1080 -f 30 -m JPEG

-- INFO  [45121.867      main] -- Installing SIGINT handler ...
-- INFO  [45121.867      main] -- Installing SIGTERM handler ...
-- INFO  [45121.867      main] -- Ignoring SIGPIPE ...
-- INFO  [45121.867      main] -- Using internal blank placeholder
-- INFO  [45121.868      main] -- Listening HTTP on [127.0.0.1]:8080
-- INFO  [45121.868    stream] -- Using V4L2 device: /dev/video0
-- INFO  [45121.868    stream] -- Using desired FPS: 30
================================================================================
-- INFO  [45121.868      http] -- Starting HTTP eventloop ...
-- INFO  [45121.955    stream] -- Device fd=8 opened
-- INFO  [45121.955    stream] -- Using input channel: 0
-- INFO  [45121.955    stream] -- Using TV standard: DEFAULT
-- INFO  [45122.151    stream] -- Using resolution: 1920x1080
-- INFO  [45122.151    stream] -- Using pixelformat: JPEG
-- INFO  [45122.175    stream] -- Using HW FPS: 30 -> 10000000 (coerced)
-- INFO  [45122.235    stream] -- Requested 3 HW buffers, got 3
-- ERROR [45122.236    stream] -- Can't map device buffer 0: No such device
-- INFO  [45122.381    stream] -- Device fd=8 closed
-- INFO  [45122.381    stream] -- Sleeping 1 seconds before new stream init ...
-- INFO  [45123.206      main] -- ===== Stopping by SIGINT =====
-- INFO  [45123.206      http] -- HTTP eventloop stopped
-- INFO  [45123.382      main] -- Bye-bye

v4l2-ctl --list-formats-ext

ioctl: VIDIOC_ENUM_FMT
        Index: 0
        Type: Video Capture
        Pixel Format: 'MJPG' (compressed)
        Name: MJPEG
                Size: Discrete 1920x1080
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 352x288
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 432x240
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 320x184
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 176x144
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 160x120mm
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 320x240
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 640x360
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 640x480
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 800x600
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 960x720
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 1024x768
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 1280x720
                        Interval: Discrete 0.033s (31.000 fps)
                Size: 1280x960
                        Interval: Discrete 0.03s (30.000 fps)

        Index: 1
        Type: Video Capture
        Pixel Format: 'YUYV'
        Name: YUV 4:2:2
                Size: Discrete 1920x1080
                        Interval: Discrete 0.200s (5.000 fps)
                Size: Discrete 352x288
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 432x240
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 320x184
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 176x144
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 160x120mm
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 320x240
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 640x360
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 640x480
                        Interval: Discrete 0.03s (30.000 fps)
                Size: Discrete 800x600
                        Interval: Discrete 0.045s (21.000 fps)
                Size: Discrete 960x720
                        Interval: Discrete 0.093s (11.000 fps)
                Size: Discrete 1024x768
                        Interval: Discrete 0.075s (13.000 fps)
                Size: Discrete 1280x720
                        Interval: Discrete 0.113s (9.000 fps)
                Size: 1280x960
                        Interval: Discrete 0.167s (6.000 fps)

Dependencies are unclear

Trying to build this, I spent a lot of time researching errors and trying to figure out what the following line meant:

You'll need make, gcc, libevent with pthreads support, libjpeg8/libjpeg-turbo, libuuid and libbsd (only for Linux).

make and gcc were pretty obvious, but the rest were a significant challenge to untangle - every one of them had a different package name from what was written there, and each had their own naming style to understand (e.g. "uuid-dev" instead of "libuuid-dev", and two different packages for libevent).

I ended up fixing it, and the following line is the result of a good chunk of research time:
sudo apt install libevent-dev libevent-pthreads-2.1-6-dev libjpeg8-dev uuid-dev libbsd-dev

I might end up adding a patch to support GRBG format cameras (an academic exercise I'm undertaking with a 20-year-old webcam and a Pi Zero, that hit a dead-end with a "select() timeout" error with mjpg-streamer), so if I figure that out, I'll try and remember to patch the doc as well in a separate pull. I'm just not terribly well versed in Git, soo... ;)

Building on Ubuntu 20.04

Not sure how best to integrate this into the readme... But on Ubuntu 20.04 (x86_64) there are package differences compared to Debian itself (as always)

# apt install build-essential libevent-dev libjpeg62-dev uuid-dev libbsd-dev make gcc libjpeg8 libjpeg-turbo8 libuuid1 libbsd0
Reading package lists... Done
Building dependency tree       
Reading state information... Done
libbsd0 is already the newest version (0.10.0-1).
libbsd0 set to manually installed.
libuuid1 is already the newest version (2.34-0.1ubuntu9.1).
libuuid1 set to manually installed.
The following additional packages will be installed:
  binutils binutils-common binutils-x86-64-linux-gnu cpp cpp-9 dpkg-dev fakeroot g++ g++-9 gcc-9 gcc-9-base libalgorithm-diff-perl libalgorithm-diff-xs-perl
  libalgorithm-merge-perl libasan5 libatomic1 libbinutils libc-dev-bin libc6-dev libcc1-0 libcrypt-dev libctf-nobfd0 libctf0 libdpkg-perl libevent-core-2.1-7 libevent-extra-2.1-7
  libevent-openssl-2.1-7 libevent-pthreads-2.1-7 libfakeroot libfile-fcntllock-perl libgcc-9-dev libgomp1 libisl22 libitm1 libjpeg62 liblsan0 libmpc3 libquadmath0 libstdc++-9-dev
  libtsan0 libubsan1 linux-libc-dev manpages-dev
Suggested packages:
  binutils-doc cpp-doc gcc-9-locales debian-keyring g++-multilib g++-9-multilib gcc-9-doc gcc-multilib autoconf automake libtool flex bison gdb gcc-doc gcc-9-multilib glibc-doc
  bzr libstdc++-9-doc make-doc
The following NEW packages will be installed:
  binutils binutils-common binutils-x86-64-linux-gnu build-essential cpp cpp-9 dpkg-dev fakeroot g++ g++-9 gcc gcc-9 gcc-9-base libalgorithm-diff-perl libalgorithm-diff-xs-perl
  libalgorithm-merge-perl libasan5 libatomic1 libbinutils libbsd-dev libc-dev-bin libc6-dev libcc1-0 libcrypt-dev libctf-nobfd0 libctf0 libdpkg-perl libevent-core-2.1-7
  libevent-dev libevent-extra-2.1-7 libevent-openssl-2.1-7 libevent-pthreads-2.1-7 libfakeroot libfile-fcntllock-perl libgcc-9-dev libgomp1 libisl22 libitm1 libjpeg-turbo8
  libjpeg62 libjpeg62-dev libjpeg8 liblsan0 libmpc3 libquadmath0 libstdc++-9-dev libtsan0 libubsan1 linux-libc-dev make manpages-dev uuid-dev
0 upgraded, 52 newly installed, 0 to remove and 3 not upgraded.
Need to get 40.9 MB of archives.
After this operation, 179 MB of additional disk space will be used.
$ make
-- CC src/device.c
-- CC src/encoder.c
-- CC src/encoders/cpu/encoder.c
-- CC src/encoders/hw/encoder.c
-- CC src/http/base64.c
-- CC src/http/blank.c
-- CC src/http/mime.c
-- CC src/http/path.c
-- CC src/http/server.c
-- CC src/http/static.c
-- CC src/http/unix.c
-- CC src/http/uri.c
-- CC src/logging.c
-- CC src/main.c
-- CC src/options.c
-- CC src/picture.c
-- CC src/stream.c
-- LD ustreamer
===== Build complete =====
== CC      = cc
== LIBS    = -lm -ljpeg -pthread -levent -levent_pthreads -luuid -lbsd
== CFLAGS  = -O3 -c -std=c11 -Wall

And then for GPIO support, libgpiod-dev for the package (which would install libgpiod2)

reedy@ubuntu64-build:~/ustreamer$ WITH_GPIO=1 make
-- CC src/gpio/gpio.c
-- LD ustreamer
===== Build complete =====
== CC      = cc
== LIBS    = -lm -ljpeg -pthread -levent -levent_pthreads -luuid -lgpiod -lbsd
== CFLAGS  = -O3 -c -std=c11 -Wall -Wextra -D_GNU_SOURCE -DWITH_GPIO -DWITH_PTHREAD_NP -DWITH_SETPROCTITLE
== LDFLAGS = 

orangepi + USB-to-HDMI (ID 534d:2109) ERROR: Can't set input channel

For the Orange Pi Zero:

v4l2-ctl --all
Driver Info (not using libv4l2):
        Driver name   : uvcvideo
        Card type     : USB Video: USB Video
        Bus info      : usb-1c1b000.usb-1
        Driver version: 4.19.67
        Capabilities  : 0x84A00001
                Video Capture
                Streaming
                Extended Pix Format
                Device Capabilities
        Device Caps   : 0x04200001
                Video Capture
                Streaming
                Extended Pix Format
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
        Width/Height      : 1920/1080
        Pixel Format      : 'MJPG'
        Field             : None
        Bytes per Line    : 0
        Size Image        : 4147200
        Colorspace        : sRGB
        Transfer Function : Default
        YCbCr/HSV Encoding: Default
        Quantization      : Default
        Flags             : 
Crop Capability Video Capture:
        Bounds      : Left 0, Top 0, Width 1920, Height 1080
        Default     : Left 0, Top 0, Width 1920, Height 1080
        Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 1920, Height 1080
Selection: crop_bounds, Left 0, Top 0, Width 1920, Height 1080
Streaming Parameters Video Capture:
        Capabilities     : timeperframe
        Frames per second: 30.000 (30/1)
        Read buffers     : 0
                     brightness (int)    : min=-128 max=127 step=1 default=-11 value=-11
                       contrast (int)    : min=0 max=255 step=1 default=148 value=148
                     saturation (int)    : min=0 max=255 step=1 default=180 value=180
                            hue (int)    : min=-128 max=127 step=1 default=0 value=0
-- INFO  [33494.066    stream] -- Device fd=8 opened
-- INFO  [33494.066    stream] -- Using input channel: 0
-- ERROR [33494.066    stream] -- Can't set input channel
-- INFO  [33494.066    stream] -- Device fd=8 closed
-- INFO  [33494.066    stream] -- Sleeping 1 seconds before new stream init ...

For the Orange Pi R1:

v4l2-ctl --all -d /dev/video1
Driver Info:
        Driver name      : uvcvideo
        Card type        : USB Video: USB Video
        Bus info         : usb-1c1d000.usb-1.3
        Driver version   : 5.9.11
        Capabilities     : 0x84a00001
                Video Capture
                Metadata Capture
                Streaming
                Extended Pix Format
                Device Capabilities
        Device Caps      : 0x04200001
                Video Capture
                Streaming
                Extended Pix Format
Media Driver Info:
        Driver name      : uvcvideo
        Model            : USB Video: USB Video
        Serial           : 
        Bus info         : usb-1c1d000.usb-1.3
        Media version    : 5.9.11
        Hardware revision: 0x00002100 (8448)
        Driver version   : 5.9.11
Interface Info:
        ID               : 0x03000002
        Type             : V4L Video
Entity Info:
        ID               : 0x00000001 (1)
        Name             : USB Video: USB Video
        Function         : V4L2 I/O
        Flags         : default
        Pad 0x01000007   : 0: Sink
          Link 0x0200000d: from remote pad 0x100000a of entity 'Processing 2': Data, Enabled, Immutable
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
        Width/Height      : 1920/1080
        Pixel Format      : 'YUYV' (YUYV 4:2:2)
        Field             : None
        Bytes per Line    : 3840
        Size Image        : 4147200
        Colorspace        : sRGB
        Transfer Function : Default (maps to sRGB)
        YCbCr/HSV Encoding: Default (maps to ITU-R 601)
        Quantization      : Default (maps to Limited Range)
        Flags             : 
Crop Capability Video Capture:
        Bounds      : Left 0, Top 0, Width 1920, Height 1080
        Default     : Left 0, Top 0, Width 1920, Height 1080
        Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 1920, Height 1080, Flags: 
Selection: crop_bounds, Left 0, Top 0, Width 1920, Height 1080, Flags: 
Streaming Parameters Video Capture:
        Capabilities     : timeperframe
        Frames per second: 5.000 (5/1)
        Read buffers     : 0
                     brightness 0x00980900 (int)    : min=-128 max=127 step=1 default=-11 value=-11
                       contrast 0x00980901 (int)    : min=0 max=255 step=1 default=148 value=148
                     saturation 0x00980902 (int)    : min=0 max=255 step=1 default=180 value=180
                            hue 0x00980903 (int)    : min=-128 max=127 step=1 default=0 value=0
-- INFO  [33494.066    stream] -- Device fd=8 opened
-- INFO  [33494.066    stream] -- Using input channel: 0
-- ERROR [33494.066    stream] -- Can't set input channel
-- INFO  [33494.066    stream] -- Device fd=8 closed
-- INFO  [33494.066    stream] -- Sleeping 1 seconds before new stream init ...

Any help would be appreciated!
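For anyone debugging this: the "Can't set input channel" message presumably corresponds to the VIDIOC_S_INPUT ioctl failing on these capture sticks. A standalone probe like the sketch below (my own minimal test, not µStreamer code; the /dev/video0 path is an assumption) can show whether the driver enumerates any inputs at all and whether selecting input 0 is accepted:

// probe_input.c - minimal V4L2 input probe (assumes /dev/video0; pass another path as argv[1]).
// Build: cc -o probe_input probe_input.c
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(int argc, char **argv) {
    const char *path = (argc > 1 ? argv[1] : "/dev/video0");
    int fd = open(path, O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    // Enumerate all inputs the driver reports.
    struct v4l2_input in;
    memset(&in, 0, sizeof(in));
    for (in.index = 0; ioctl(fd, VIDIOC_ENUMINPUT, &in) == 0; ++in.index) {
        printf("input %u: %s\n", in.index, (const char *)in.name);
    }

    // Try to select input 0, which is what the stream log above attempts.
    int index = 0;
    if (ioctl(fd, VIDIOC_S_INPUT, &index) < 0) {
        fprintf(stderr, "VIDIOC_S_INPUT failed: %s\n", strerror(errno));
    } else {
        printf("input 0 selected OK\n");
    }

    close(fd);
    return 0;
}

If the probe reproduces the failure, the problem is in the driver's input handling rather than in µStreamer itself.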

Feature Suggestion: MJPG-Source Support

I was looking for mjpg-streamer alternatives and found µStreamer. I've played with it a little now and like it so far, but without being able to use the camera's MJPG source stream, my maximum FPS is capped by the camera itself at 5 fps; only the MJPG stream delivers FHD@30fps. I'd really love to try it again if you get around to teaching it to simply forward the MJPG stream coming from the device. Building went without issues and it runs smoothly here. Thanks for sharing.
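For reference, here is roughly what "forwarding the MJPG stream" means at the V4L2 level: the device has to be opened with the compressed MJPEG pixel format instead of a raw one, and the captured buffers could then be passed through without re-encoding. A minimal, hypothetical format-negotiation sketch (the device path and resolution are assumptions, and this is not µStreamer code):

// Request MJPEG directly from the camera via VIDIOC_S_FMT (sketch; assumes /dev/video0 at 1920x1080).
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void) {
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 1920;
    fmt.fmt.pix.height = 1080;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_MJPEG; // compressed stream straight from the cam
    fmt.fmt.pix.field = V4L2_FIELD_NONE;

    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
        perror("VIDIOC_S_FMT");
    } else if (fmt.fmt.pix.pixelformat != V4L2_PIX_FMT_MJPEG) {
        // The driver may silently substitute another format; report its FourCC.
        fprintf(stderr, "driver refused MJPEG, gave %c%c%c%c\n",
            fmt.fmt.pix.pixelformat & 0xff,
            (fmt.fmt.pix.pixelformat >> 8) & 0xff,
            (fmt.fmt.pix.pixelformat >> 16) & 0xff,
            (fmt.fmt.pix.pixelformat >> 24) & 0xff);
    } else {
        printf("MJPEG accepted at %ux%u\n", fmt.fmt.pix.width, fmt.fmt.pix.height);
    }

    close(fd);
    return 0;
}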

Can't resume stream after unplugging HDMI

Hello,

I am on a Raspberry Pi 4 with the B101 board, streaming HDMI video (from either a camera or laptops).
Devices get plugged in and out on a regular basis (a large number of different devices), but I never want the stream to shut down, which µStreamer should help me with.

I have installed the latest Raspbian version as of this message and compiled µStreamer.

My run command is

./ustreamer --host=192.168.1.166 --port=15666 --format=uyvy --encoder=omx --workers=3

It works fine and I have my video stream; unplugging properly shows the NO SIGNAL black frame.
So far so good.

But the problem is that until I reboot the Pi entirely, I can never get the stream to work again.
I have no idea what to look for or what I may have to do to set up the B101 properly - the information about it is very confusing.

Another thing: using --dv-timings always returns the following error:

Can't subscribe to V4L2_EVENT_SOURCE_CHANGE: Invalid argument

Thanks!

Log

-- INFO  [1276.896      http] -- Starting HTTP eventloop ...
-- INFO  [1276.896    stream] -- Device fd=8 opened
-- INFO  [1276.896    stream] -- Using input channel: 0
-- INFO  [1276.896    stream] -- Using TV standard: DEFAULT
-- INFO  [1276.897    stream] -- Using resolution: 640x480
-- INFO  [1276.897    stream] -- Using pixelformat: UYVY
-- INFO  [1276.898    stream] -- Using HW FPS: 0 -> 90 (coerced)
-- INFO  [1276.898    stream] -- Using IO method: MMAP
-- INFO  [1276.903    stream] -- Requested 5 HW buffers, got 5
-- INFO  [1279.432    stream] -- Capturing started
-- INFO  [1279.432    stream] -- Initializing BCM ...
-- INFO  [1279.434    stream] -- Initializing OMX ...
-- INFO  [1279.435    stream] -- Initializing OMX encoder ...
-- INFO  [1279.438    stream] -- Initializing OMX encoder ...
-- INFO  [1279.440    stream] -- Initializing OMX encoder ...
-- INFO  [1279.459    stream] -- Using JPEG quality: 80%
-- INFO  [1279.459    stream] -- Creating pool with 3 workers ...
-- INFO  [1279.459    stream] -- Capturing ...
-- INFO  [1287.157      http] -- HTTP: Registered client: [192.168.1.91]:41038, id=d67c24f9-5117-43c6-933c-06f9e5e70658; clients now: 1
-- ERROR [1296.167    stream] -- Mainloop select() timeout
-- INFO  [1296.167    stream] -- Destroying workers pool ...
-- INFO  [1296.177      http] -- HTTP: Changed picture to BLANK
-- INFO  [1299.217    stream] -- Capturing stopped
-- INFO  [1299.219    stream] -- Device fd=8 closed
================================================================================
-- INFO  [1299.219    stream] -- Device fd=8 opened
-- INFO  [1299.219    stream] -- Using input channel: 0
-- INFO  [1299.219    stream] -- Using TV standard: DEFAULT
-- ERROR [1305.297    stream] -- Unable to set pixelformat=UYVY, resolution=640x480: Invalid argument
-- INFO  [1305.297    stream] -- Device fd=8 closed
-- INFO  [1305.297    stream] -- Sleeping 1 seconds before new stream init ...
================================================================================
-- INFO  [1306.297    stream] -- Device fd=8 opened
-- INFO  [1306.298    stream] -- Using input channel: 0
-- INFO  [1306.298    stream] -- Using TV standard: DEFAULT
-- INFO  [1306.298    stream] -- Using resolution: 640x480
-- INFO  [1306.298    stream] -- Using pixelformat: UYVY
-- INFO  [1306.299    stream] -- Using HW FPS: 0 -> 90 (coerced)
-- INFO  [1306.299    stream] -- Using IO method: MMAP
-- INFO  [1306.304    stream] -- Requested 5 HW buffers, got 5
-- ERROR [1312.738    stream] -- Unable to start capturing: Invalid argument
-- INFO  [1312.740    stream] -- Device fd=8 closed
-- INFO  [1312.740    stream] -- Sleeping 1 seconds before new stream init ...
================================================================================
-- INFO  [1313.740    stream] -- Device fd=8 opened
-- INFO  [1316.817    stream] -- Using input channel: 0
-- INFO  [1316.817    stream] -- Using TV standard: DEFAULT
-- ERROR [1319.655    stream] -- Unable to set pixelformat=UYVY, resolution=640x480: Invalid argument
-- INFO  [1319.655    stream] -- Device fd=8 closed
-- INFO  [1319.655    stream] -- Sleeping 1 seconds before new stream init ...
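Regarding the --dv-timings error: that option depends on subscribing to V4L2_EVENT_SOURCE_CHANGE so the resolution can be tracked when the HDMI source changes, and "Invalid argument" from that call usually means the driver doesn't support the event. A stripped-down illustration of the subscription (a sketch with an assumed device path, not the actual µStreamer code):

// Subscribe to source-change events (sketch); EINVAL here typically means the driver doesn't support them.
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void) {
    int fd = open("/dev/video0", O_RDWR); // assumed device path
    if (fd < 0) { perror("open"); return 1; }

    struct v4l2_event_subscription sub;
    memset(&sub, 0, sizeof(sub));
    sub.type = V4L2_EVENT_SOURCE_CHANGE;

    if (ioctl(fd, VIDIOC_SUBSCRIBE_EVENT, &sub) < 0) {
        // --dv-timings relies on this event; without it, resolution changes can't be tracked on the fly.
        fprintf(stderr, "VIDIOC_SUBSCRIBE_EVENT failed: %s\n", strerror(errno));
    } else {
        printf("source-change events supported\n");
    }

    close(fd);
    return 0;
}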

Make license (more) permissive

Hi there, I'm thinking of using this in a non-GPL software bundle (containing mostly open source, but also some proprietary stuff - altogether there are hundreds of different licenses), and there will be some licensing clashes.

Would you consider changing the license of µStreamer to a permissive license (MIT, BSD, or similar), or at least to the LGPL? (The LGPL is not a win, but it has the potential to solve quite a few of the problematic cases.)

Build warnings: format specifies type 'unsigned long long' but the argument has type 'uint64_t', etc

clang-8 prints these:

src/device.c:393:31: warning: format specifies type 'unsigned long long' but the argument has type 'uint64_t' (aka 'unsigned long') [-Wformat]
                        dv.bt.width, dv.bt.height, dv.bt.pixelclock);
                                                   ^~~~~~~~~~~~~~~~
src/logging.h:121:48: note: expanded from macro 'LOG_INFO'
                LOG_PRINTF(COLOR_GREEN, "INFO ", "", _msg, ##__VA_ARGS__); \
                                                     ~~~~    ^~~~~~~~~~~
src/logging.h:106:63: note: expanded from macro 'LOG_PRINTF'
                LOG_PRINTF_NOLOCK(_label_color, _label, _msg_color, _msg, ##__VA_ARGS__); \
                                                                    ~~~~    ^~~~~~~~~~~
src/logging.h:95:34: note: expanded from macro 'LOG_PRINTF_NOLOCK'
                                get_now_monotonic(), _buf, ##__VA_ARGS__); \
                                                             ^~~~~~~~~~~
src/device.c:393:31: warning: format specifies type 'unsigned long long' but the argument has type 'uint64_t' (aka 'unsigned long') [-Wformat]
                        dv.bt.width, dv.bt.height, dv.bt.pixelclock);
                                                   ^~~~~~~~~~~~~~~~
src/logging.h:121:48: note: expanded from macro 'LOG_INFO'
                LOG_PRINTF(COLOR_GREEN, "INFO ", "", _msg, ##__VA_ARGS__); \
                                                     ~~~~    ^~~~~~~~~~~
src/logging.h:106:63: note: expanded from macro 'LOG_PRINTF'
                LOG_PRINTF_NOLOCK(_label_color, _label, _msg_color, _msg, ##__VA_ARGS__); \
                                                                    ~~~~    ^~~~~~~~~~~
src/logging.h:98:34: note: expanded from macro 'LOG_PRINTF_NOLOCK'
                                get_now_monotonic(), _buf, ##__VA_ARGS__); \
                                                             ^~~~~~~~~~~
2 warnings and 1 error generated.

http://beefy9.nyi.freebsd.org/data/113amd64-default/522982/logs/ustreamer-1.10.log
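The usual portable fix for this class of warning is to print uint64_t values through the PRIu64 macro from <inttypes.h> instead of a hard-coded %llu, since uint64_t is unsigned long on LP64 platforms but unsigned long long elsewhere. A tiny illustration of the idiom (not a patch against src/device.c, just the pattern):

// Printing a uint64_t portably: PRIu64 expands to the right conversion specifier for the platform.
#include <stdio.h>
#include <inttypes.h>

int main(void) {
    uint64_t pixelclock = 148500000; // example value, like a DV-timings pixel clock
    // printf("clock=%llu\n", pixelclock);      // warns on LP64: uint64_t is unsigned long there
    printf("clock=%" PRIu64 "\n", pixelclock);  // portable across 32-/64-bit targets
    return 0;
}

Casting the argument to (unsigned long long) and keeping %llu works too; PRIu64 just avoids the cast.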
