Take this ffmpeg example, which encodes a UDP ingest to an MPEG-TS feed with all audio and video variants muxed in:
/root/bin/ffmpeg -loglevel debug -threads 4 -filter_complex_threads 4 -fflags +genpts+nobuffer \
-i "$channel_udp_stream?fifo_size=9000000" -filter_complex \
"[$video_stream_id]split=3[s0][s1][s2]; \
[s0]yadif[v0]; \
[s1]yadif[v1]; \
[s2]scale=w=1280:h=720:force_original_aspect_ratio=decrease:flags=lanczos,yadif[v2]" \
-map "[v0]" -map "[v1]" -map "[v2]" -map 0:a \
-bsf:a aac_adtstoasc \
-c:v h264_nvenc -a53cc 1 -preset slow -rc:v vbr_hq -pix_fmt yuv420p -profile:v main -level 4.0 \
-b:v:0 1500k -gpu:v:0 1 \
-b:v:1 1000k -gpu:v:1 2 \
-b:v:2 625k -gpu:v:2 3 \
-c:a aac -ar 48000 -b:a:0 256k -b:a:1 192k -b:a:2 128k \
-f mpegts \
"udp://${ip_address}:${port}"
The snippet above takes one ingest, uses a complex filter to create three video variants that are encoded with h264_nvenc, re-encodes the three audio streams already present in the input MPEG-TS feed, and muxes everything into a single MPEG-TS stream sent over UDP.
How can I do something similar with GStreamer, using the x264enc and faac elements and the appropriate muxer?
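For context, here is the rough, untested shape I have in mind with gst-launch-1.0. I am not sure this is correct: the variable names are placeholders, the tsdemux audio pad names (demux.audio_0101 etc.) depend on the actual PIDs in my stream, and I don't know whether the aacparse/mpegtsmux combination handles the AAC framing properly:

```shell
# Untested sketch. $channel_udp_host/$channel_udp_port, $ip_address/$port,
# and the demux.audio_XXXX pad names are placeholders that depend on my setup.
gst-launch-1.0 -v \
  udpsrc address=$channel_udp_host port=$channel_udp_port buffer-size=9000000 \
    caps="video/mpegts, systemstream=(boolean)true" ! \
  tsdemux name=demux \
  mpegtsmux name=mux ! udpsink host=$ip_address port=$port \
  \
  demux. ! queue ! h264parse ! avdec_h264 ! deinterlace ! tee name=t \
  t. ! queue ! x264enc bitrate=1500 speed-preset=slow ! h264parse ! mux. \
  t. ! queue ! x264enc bitrate=1000 speed-preset=slow ! h264parse ! mux. \
  t. ! queue ! videoscale method=lanczos ! video/x-raw,width=1280,height=720 ! \
       x264enc bitrate=625 speed-preset=slow ! h264parse ! mux. \
  \
  demux.audio_0101 ! queue ! aacparse ! avdec_aac ! audioconvert ! audioresample ! \
       faac bitrate=256000 ! aacparse ! mux. \
  demux.audio_0102 ! queue ! aacparse ! avdec_aac ! audioconvert ! audioresample ! \
       faac bitrate=192000 ! aacparse ! mux. \
  demux.audio_0103 ! queue ! aacparse ! avdec_aac ! audioconvert ! audioresample ! \
       faac bitrate=128000 ! aacparse ! mux.
```

One thing I noticed while drafting this: x264enc takes its bitrate in kbit/s while faac takes bits/s, and I've done the deinterlacing once before the tee rather than per-branch as in the ffmpeg filtergraph, which I believe is equivalent. I also don't see an obvious equivalent of ffmpeg's -a53cc closed-caption passthrough. Is this pipeline close, or is there a better way to wire the demuxer pads and muxer?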