
Comments (21)

raymanfx commented on July 23, 2024

Some good news - I was able to implement MMAP output streaming today (tested using v4l2loopback). My WIP stuff is on the next branch if you want to check it out.

from libv4l-rs.

FallingSnow commented on July 23, 2024

Not sure if I'm doing this correctly, but the output is Invalid Argument.

Source:

use std::io::Read;

use v4l::{Timestamp, buffer::{Buffer, Flags, Metadata}};
use v4l::io::stream::Output;
use v4l::output::device::Device as OutputDevice;
use v4l::prelude::{DeviceExt, MmapStream};

// use nix::sys::ioctl;

const DECODE_DEVICE: &str = "/dev/video10";
const VIDEO_PATH: &str = "/home/alarm/short.h264";

fn main() -> std::io::Result<()> {
    let mut dev = OutputDevice::with_path(DECODE_DEVICE)
        .expect(&format!("Unable to open decode device {}", DECODE_DEVICE));

    let capabilities = dev.query_caps().expect("could not get capabilities");

    dbg!(capabilities);

    let formats = dev.enum_formats()?;
    println!("Number of formats: {}", formats.len());
    for fmt in formats {
        print!("{}", fmt);
    }

    let mut stream = MmapStream::with_buffers(&mut dev, 1).expect("Failed to create buffer stream");

    let mut file = std::fs::File::open(VIDEO_PATH)?;
    let mut contents = vec![];
    file.read_to_end(&mut contents)?;

    let metadata = Metadata {
        bytesused: contents.len() as u32,
        flags: Flags::empty(),
        field: 0,
        timestamp: Timestamp::new(0, 0),
        sequence: 0
    };

    let buffer = Buffer {
        planes: vec![contents.as_ref()],
        meta: metadata
    };

    Output::next(&mut stream, buffer)?;

    Ok(())
}

Output:

[alarm@alarmpi ~]$ ./framebuffer_decode/target/debug/framebuffer_decode 
[src/main.rs:19] capabilities = Capabilities {
    driver: "bcm2835-codec",
    card: "bcm2835-codec-decode",
    bus: "platform:bcm2835-codec",
    version: (
        5,
        4,
        75,
    ),
    capabilities: VIDEO_M2M_MPLANE | EXT_PIX_FORMAT | STREAMING,
}
Number of formats: 2
index       : 0
type:       : 10
flags:      : COMPRESSED
description : H.264
fourcc      : H264
index       : 1
type:       : 10
flags:      : COMPRESSED
description : Motion-JPEG
fourcc      : MJPG
thread 'main' panicked at 'Failed to create buffer stream: Os { code: 22, kind: InvalidInput, message: "Invalid argument" }', src/main.rs:27:60
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

FallingSnow commented on July 23, 2024

I should also point out that this may only be true on Raspberry Pi 4. That is what I'm currently testing on.

raymanfx commented on July 23, 2024

Hi, I'm very interested in getting multi-planar format support going. Unfortunately I don't have a camera device which supports/needs it. What device are you testing this with? I'm pretty sure standard USB webcams will work using the single-planar (aka packed) v4l2 layer even on the RPi 4, no?

From your first comment I read that you need multi-planar support for the output device, is that correct? Maybe we can tackle it for input and output at the same time even.

Apart from the format change, I can envision some more edits to be required, e.g. a new Buffer struct to represent planar buffers. Perhaps using a data structure like Vec<&[u8]>.
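A minimal sketch of what such a planar buffer type might look like - hypothetical names, not the crate's actual API:

```rust
/// Hypothetical multi-planar buffer: one byte slice per memory plane.
/// A packed (single-planar) buffer is just the special case of one plane.
struct PlanarBuffer<'a> {
    planes: Vec<&'a [u8]>,
}

impl<'a> PlanarBuffer<'a> {
    /// Total payload size across all planes.
    fn bytes_used(&self) -> usize {
        self.planes.iter().map(|p| p.len()).sum()
    }
}

fn main() {
    let y = [0u8; 16]; // e.g. a luminance (Y) plane
    let uv = [0u8; 8]; // e.g. an interleaved chrominance (UV) plane
    let buf = PlanarBuffer { planes: vec![&y[..], &uv[..]] };
    println!("{} planes, {} bytes", buf.planes.len(), buf.bytes_used());
}
```

The nice property of `Vec<&[u8]>` is that single-planar code paths collapse into the one-element case, so both APIs could share a buffer type.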

FallingSnow commented on July 23, 2024

I'm using /dev/video10 on a Raspberry Pi 4 (kernel: 5.4) in an attempt to decode H.264. It seems this device only supports multi-planar buffers. With the V4L2_BUF_TYPE_VIDEO_OUTPUT buffer type you get Invalid Argument errors.

Using v4l2-ctl I still haven't been able to get an H.264 stream decoded; I keep running into DECODER STOP messages.

However, it seems I've taken a few steps back. I tried to install aarch64 Arch Linux (kernel: 4.8.9) and now I don't get any v4l2 devices at all.

Anyway,

From your first comment I read that you need multi-planar support for the output device, is that correct? Maybe we can tackle it for input and output at the same time even.

Yes. If I can help I'll try.

Apart from the format change, I can envision some more edits to be required, e.g. a new Buffer struct to represent planar buffers. Perhaps using a data structure like Vec<&[u8]>.

I'm very new to this space and I'm not really even sure what multi-planar is. If I understand correctly it's buffers for a device that needs a discontiguous buffer? (going off https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/planar-apis.html#planar-apis)

raymanfx commented on July 23, 2024

I'm using /dev/video10 on a Raspberry Pi 4 (kernel: 5.4) in an attempt to decode H.264. It seems this device only supports multi-planar buffers. With the V4L2_BUF_TYPE_VIDEO_OUTPUT buffer type you get Invalid Argument errors.

Oh I see. So you're not working with actual video I/O devices, but trying to use the hardware memory-to-memory codec on the RPi (which happens to be implemented as a v4l2 module).

I'm very new to this space and I'm not really even sure what multi-planar is. If I understand correctly it's buffers for a device that needs a discontiguous buffer? (going off https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/planar-apis.html#planar-apis)

Yes, that's about right. Decoding a video frame will usually give you "raw" frames, probably some YUV format instead of the more familiar RGB formats. YUV data captures luminance and chrominance. While YUV pixels can be packed, video codecs usually prefer them to be laid out in memory planes, e.g. one plane for the luminance (Y) and one for the chrominance components (UV).
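As a concrete illustration (assuming the common NV12 two-plane 4:2:0 layout; this is generic arithmetic, not tied to this crate's API), the plane sizes work out like this:

```rust
// Illustration only: plane sizes for an NV12 (4:2:0, two-plane) frame.
// The Y plane holds one byte per pixel; the UV plane holds interleaved
// U and V samples at quarter resolution each, i.e. half the Y plane's size.
fn nv12_plane_sizes(width: usize, height: usize) -> (usize, usize) {
    let y = width * height; // luminance plane
    let uv = y / 2;         // interleaved chrominance plane
    (y, uv)
}

fn main() {
    let (y, uv) = nv12_plane_sizes(1920, 1080);
    println!("Y: {} bytes, UV: {} bytes", y, uv);
}
```

Because the two planes may live at unrelated addresses, a multi-planar buffer needs one base pointer (and length) per plane rather than a single contiguous region.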

raymanfx commented on July 23, 2024

FYI I'm planning to tackle this tomorrow. Are you available for testing this weekend?

FallingSnow commented on July 23, 2024

Yeah, I should be free sometime this weekend to test. Sorry, I meant to reply sooner. I went out to buy a C7000; I'm going to try setting up virtual desktops, encoding their screens, and decoding on Pis.

I got the Pi back online with its v4l2 devices.

I'd also note that the Pi seems to fail some v4l2-compliance tests.

$ sudo v4l2-compliance -d /dev/video10  --stream-from output.h264 --streaming

v4l2-compliance SHA: not available, 32 bits, 32-bit time_t

Compliance test for bcm2835-codec device /dev/video10:

Driver Info:
	Driver name      : bcm2835-codec
	Card type        : bcm2835-codec-decode
	Bus info         : platform:bcm2835-codec
	Driver version   : 5.4.51
	Capabilities     : 0x84204000
		Video Memory-to-Memory Multiplanar
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps      : 0x04204000
		Video Memory-to-Memory Multiplanar
		Streaming
		Extended Pix Format
Media Driver Info:
	Driver name      : bcm2835-codec
	Model            : bcm2835-codec
	Serial           : 0000
	Bus info         : platform:bcm2835-codec
	Media version    : 5.4.51
	Hardware revision: 0x00000001 (1)
	Driver version   : 5.4.51
Interface Info:
	ID               : 0x0300000c
	Type             : V4L Video
Entity Info:
	ID               : 0x00000001 (1)
	Name             : bcm2835-codec-decode-source
	Function         : V4L2 I/O
	Pad 0x01000002   : 0: Source
	  Link 0x02000008: to remote pad 0x1000004 of entity 'bcm2835-codec-decode-proc': Data, Enabled, Immutable

Required ioctls:
	test MC information (see 'Media Driver Info' above): OK
		fail: v4l2-compliance.cpp(694): found_bit
	test VIDIOC_QUERYCAP: FAIL

Allow for multiple opens:
	test second /dev/video10 open: OK
		fail: v4l2-compliance.cpp(694): found_bit
	test VIDIOC_QUERYCAP: FAIL
	test VIDIOC_G/S_PRIORITY: OK
	test for unlimited opens: OK

	test invalid ioctls: OK
Debug ioctls:
	test VIDIOC_DBG_G/S_REGISTER: OK (Not Supported)
	test VIDIOC_LOG_STATUS: OK (Not Supported)

Input ioctls:
	test VIDIOC_G/S_TUNER/ENUM_FREQ_BANDS: OK (Not Supported)
	test VIDIOC_G/S_FREQUENCY: OK (Not Supported)
	test VIDIOC_S_HW_FREQ_SEEK: OK (Not Supported)
	test VIDIOC_ENUMAUDIO: OK (Not Supported)
	test VIDIOC_G/S/ENUMINPUT: OK (Not Supported)
	test VIDIOC_G/S_AUDIO: OK (Not Supported)
	Inputs: 0 Audio Inputs: 0 Tuners: 0

Output ioctls:
	test VIDIOC_G/S_MODULATOR: OK (Not Supported)
	test VIDIOC_G/S_FREQUENCY: OK (Not Supported)
	test VIDIOC_ENUMAUDOUT: OK (Not Supported)
	test VIDIOC_G/S/ENUMOUTPUT: OK (Not Supported)
	test VIDIOC_G/S_AUDOUT: OK (Not Supported)
	Outputs: 0 Audio Outputs: 0 Modulators: 0

Input/Output configuration ioctls:
	test VIDIOC_ENUM/G/S/QUERY_STD: OK (Not Supported)
	test VIDIOC_ENUM/G/S/QUERY_DV_TIMINGS: OK (Not Supported)
	test VIDIOC_DV_TIMINGS_CAP: OK (Not Supported)
	test VIDIOC_G/S_EDID: OK (Not Supported)

Control ioctls:
	test VIDIOC_QUERY_EXT_CTRL/QUERYMENU: OK
	test VIDIOC_QUERYCTRL: OK
	test VIDIOC_G/S_CTRL: OK
	test VIDIOC_G/S/TRY_EXT_CTRLS: OK
	test VIDIOC_(UN)SUBSCRIBE_EVENT/DQEVENT: OK
	test VIDIOC_G/S_JPEGCOMP: OK (Not Supported)
	Standard Controls: 2 Private Controls: 0

Format ioctls:
	test VIDIOC_ENUM_FMT/FRAMESIZES/FRAMEINTERVALS: OK
	test VIDIOC_G/S_PARM: OK (Not Supported)
	test VIDIOC_G_FBUF: OK (Not Supported)
	test VIDIOC_G_FMT: OK
	test VIDIOC_TRY_FMT: OK
	test VIDIOC_S_FMT: OK
	test VIDIOC_G_SLICED_VBI_CAP: OK (Not Supported)
	test Cropping: OK (Not Supported)
	test Composing: OK
	test Scaling: OK (Not Supported)

Codec ioctls:
	test VIDIOC_(TRY_)ENCODER_CMD: OK (Not Supported)
	test VIDIOC_G_ENC_INDEX: OK (Not Supported)
		fail: v4l2-test-codecs.cpp(123): ret != 0
	test VIDIOC_(TRY_)DECODER_CMD: FAIL

Buffer ioctls:
	test VIDIOC_REQBUFS/CREATE_BUFS/QUERYBUF: OK
	test VIDIOC_EXPBUF: OK
	test Requests: OK (Not Supported)

Test input 0:

Streaming ioctls:
	test read/write: OK (Not Supported)
	test blocking wait: OK
		fail: v4l2-test-buffers.cpp(1353): ret == 0
	test MMAP (select): FAIL
		fail: v4l2-test-buffers.cpp(1353): ret == 0
	test MMAP (epoll): FAIL
	test USERPTR (select): OK (Not Supported)
	test DMABUF: Cannot test, specify --expbuf-device

Total for bcm2835-codec device /dev/video10: 51, Succeeded: 46, Failed: 5, Warnings: 0

Anyway, shoot me any tests you want me to run.

raymanfx commented on July 23, 2024

It looks like getting your use case working is going to be a lot more involved than I initially thought. I've started the work in the next branch. Basically, there are two (currently unsupported) features that are required:

  1. Streaming output (MMAP in particular) - right now, only plain write() is supported for output devices.
  2. Multi-planar buffer support
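For context on item 1, v4l2 streaming I/O is a queue/dequeue cycle (VIDIOC_QBUF / VIDIOC_DQBUF) over a fixed pool of mmapped buffers. A toy model of that ownership cycle, with no actual v4l2 calls (all names here are made up for illustration):

```rust
use std::collections::VecDeque;

// Toy model of v4l2 streaming I/O: a fixed pool of buffers cycles between
// the application (free) and the driver (queued). qbuf hands a buffer to
// the "driver"; dqbuf takes a finished one back for reuse.
struct ToyStream {
    queued: VecDeque<Vec<u8>>, // buffers currently owned by the "driver"
    free: Vec<Vec<u8>>,        // buffers the application may fill
}

impl ToyStream {
    fn new(count: usize, size: usize) -> Self {
        ToyStream { queued: VecDeque::new(), free: vec![vec![0; size]; count] }
    }

    /// QBUF: fill a free buffer with payload and queue it to the driver.
    fn qbuf(&mut self, payload: &[u8]) -> bool {
        match self.free.pop() {
            Some(mut buf) => {
                buf[..payload.len()].copy_from_slice(payload);
                self.queued.push_back(buf);
                true
            }
            None => false, // pool exhausted: must dqbuf first
        }
    }

    /// DQBUF: reclaim the oldest queued buffer once the driver is done.
    fn dqbuf(&mut self) -> bool {
        match self.queued.pop_front() {
            Some(buf) => {
                self.free.push(buf);
                true
            }
            None => false,
        }
    }
}

fn main() {
    let mut s = ToyStream::new(2, 8);
    assert!(s.qbuf(b"frame1"));
    assert!(s.qbuf(b"frame2"));
    assert!(!s.qbuf(b"frame3")); // only 2 buffers in the pool
    assert!(s.dqbuf());          // reclaim one...
    assert!(s.qbuf(b"frame3"));  // ...and it can be reused
}
```

The real MMAP implementation additionally maps the driver-allocated buffers into the process with mmap() and blocks in DQBUF until the hardware has consumed a buffer; the ownership dance is the same.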

FallingSnow commented on July 23, 2024

Those two sound about right. I found some more interesting information at https://www.raspberrypi.org/forums/viewtopic.php?f=68&t=281296&p=1737303&hilit=v4l2+%2Fdev%2Fvideo10#p1737303 that might be useful.

I'm a beginner in rust but if there's anything I can do to help I'd love to.

raymanfx commented on July 23, 2024

I just pushed some preliminary multi-planar buffer support to the next branch. Could you please change this: https://github.com/raymanfx/libv4l-rs/blob/next/src/output/device.rs#L157 to return VideoOutputMplane and see if that makes it work for you? I'm still thinking about how to best incorporate multi-plane support into the API.

EDIT: the way to do this for your local project is to clone this repo (next branch) and point your project to its directory for dependency resolution in Cargo.toml.
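For reference, such a path override in the consuming project's Cargo.toml looks roughly like this (the path is a placeholder for wherever the clone lives):

```toml
[dependencies]
# local clone of libv4l-rs, checked out on the `next` branch
v4l = { path = "../libv4l-rs" }
```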

FallingSnow commented on July 23, 2024

Cool. I'll give it a test in the next couple hours.

FallingSnow commented on July 23, 2024

Looks golden.

Source:

use std::io::Read;

use v4l::prelude::DeviceExt;
use v4l::output::device::Device as OutputDevice;

// use nix::sys::ioctl;

const DECODE_DEVICE: &str = "/dev/video10";

fn main() -> std::io::Result<()> {
    let mut file = std::fs::File::open("/home/alarm/short.h264")?;
    let mut contents = vec![];
    file.read_to_end(&mut contents)?;

    let dev = OutputDevice::with_path(DECODE_DEVICE)
        .expect(&format!("Unable to open decode device {}", DECODE_DEVICE));

    let capabilities = dev.query_caps().expect("could not get capabilities");

    dbg!(capabilities);

    let formats = dev.enum_formats()?;
    println!("Number of formats: {}", formats.len());
    for fmt in formats {
        print!("{}", fmt);
    }

    Ok(())
}

Output:

[alarm@alarmpi framebuffer_decode]$ ./target/debug/framebuffer_decode 
[src/main.rs:21] capabilities = Capabilities {
    driver: "bcm2835-codec",
    card: "bcm2835-codec-decode",
    bus: "platform:bcm2835-codec",
    version: (
        5,
        4,
        75,
    ),
    capabilities: VIDEO_M2M_MPLANE | EXT_PIX_FORMAT | STREAMING,
}
Number of formats: 2
index       : 0
type:       : 10
flags:      : COMPRESSED
description : H.264
fourcc      : H264
index       : 1
type:       : 10
flags:      : COMPRESSED
description : Motion-JPEG
fourcc      : MJPG

FallingSnow commented on July 23, 2024

Also, I hit up the raspberry pi forums about the compliance issue: https://www.raspberrypi.org/forums/viewtopic.php?f=67&t=291227&p=1760878#p1760878

Some interesting input on v4l2-ctl/v4l2-compliance in there.

raymanfx commented on July 23, 2024

Sounds good. Now we need to check whether the actual streaming I/O works with the new code. Did you try writing some H.264 or JPEG frames to that device (by creating a MmapStream and using the io::stream::Output trait)?

raymanfx commented on July 23, 2024

Okay, so this will require some more debugging. Can you add some logs to this function: https://github.com/raymanfx/libv4l-rs/blob/next/src/io/mmap/arena.rs#L54 and see where exactly it fails?

I just looked at some Raspberry Pi 4 bundles today, but the 4GB models are still rather expensive, I don't think I can justify getting one just for this project. Anything less than 4GB is not worth paying for though.

Apart from that, I'm still wondering how the H.264 decoder works here - you're basically just passing the entire file buffer in one go. From what I read in the link you provided (https://www.raspberrypi.org/forums/viewtopic.php?f=67&t=291227&p=1760878#p1760878), I think you're supposed to pass framed data - so you'd need to split the H.264 bytestream into frames before passing them to the decoder? Then again, they also say it should work with unframed data, disregarding a latency increase.
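For reference, splitting an Annex B H.264 bytestream into NAL units is a matter of scanning for the 00 00 01 / 00 00 00 01 start codes. A minimal sketch (not a full parser, not part of this crate, and assuming well-formed input):

```rust
// Sketch: split an H.264 Annex B bytestream on its start codes
// (00 00 01 or 00 00 00 01) so each NAL unit could be queued separately.
fn split_nal_units(data: &[u8]) -> Vec<&[u8]> {
    // Record the payload offset right after each 3-byte start code.
    let mut starts = Vec::new();
    let mut i = 0;
    while i + 3 <= data.len() {
        if data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1 {
            starts.push(i + 3);
            i += 3;
        } else {
            i += 1;
        }
    }
    starts
        .iter()
        .enumerate()
        .map(|(n, &s)| {
            // A unit runs until the next start code (excluding its zero
            // prefix, which may be 2 or 3 bytes long) or until end of data.
            let end = starts.get(n + 1).map_or(data.len(), |&next| {
                if next >= 4 && data[next - 4] == 0 { next - 4 } else { next - 3 }
            });
            &data[s..end]
        })
        .collect()
}

fn main() {
    // Two tiny fake NAL units behind 4-byte and 3-byte start codes.
    let stream = [0u8, 0, 0, 1, 0x67, 0xAA, 0, 0, 1, 0x68, 0xBB];
    let nals = split_nal_units(&stream);
    assert_eq!(nals.len(), 2);
    assert_eq!(nals[0], &[0x67u8, 0xAA][..]);
    assert_eq!(nals[1], &[0x68u8, 0xBB][..]);
}
```

In practice each resulting unit would become one output buffer payload, which is presumably what the decoder expects by "framed data".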

FallingSnow commented on July 23, 2024

I'll add some logs to that function and see where it's failing.

Yeah, from what the forum post explains, the decoder should work on unframed data (with performance degradation). But honestly I don't really know if passing in the file buffer is the right way to do it either.

Can I just give you ssh access to my Pi? I leave it on 24/7 anyway. If that works, just email me at ayrton AT sparling DOT us.

raymanfx commented on July 23, 2024

Oh .. I see it now. The H.264 decoder device on the Raspberry Pi 4 requires using the VIDEO_M2M_MPLANE API, but this crate right now only implements capture and output (not memory-to-memory codecs). That's going to involve more work. I think I want to restructure some things first before implementing M2M streaming.

EDIT: not entirely true - while your decoder supports M2M, you don't actually want to use it here, since you're feeding data from an external source. I'll continue with the refactor for now to ease implementing future usecases such as M2M.

patrickelectric commented on July 23, 2024

Hey @raymanfx, what is the status on this?

raymanfx commented on July 23, 2024

Hi @patrickelectric, I prepared the multi-planar API support a while ago: https://github.com/raymanfx/libv4l-rs/tree/mplane but was not happy with the API at the time. Since it is not required for my projects, I did not finish the work. If you are interested in picking it up or collaborating, let me know.

wangxiaochuTHU commented on July 23, 2024

Hi @raymanfx, it looks like the mplane branch is already compatible with VIDEO_CAPTURE_MPLANE devices, judging by the data structures and capture procedures it defines. But you said you did not finish the work. Would you mind pointing out the unfinished parts or giving some guidance on how to complete them?

Hi @patrickelectric, I prepared the multi-planar API support a while ago: https://github.com/raymanfx/libv4l-rs/tree/mplane but was not happy with the API at the time. Since it is not required for my projects, I did not finish the work. If you are interested in picking it up or collaborating, let me know.
