FFmpeg provides the video pipeline that feeds our virtual camera device (created with v4l2loopback). It can transmit a video directly, or assemble still images into a video and transmit that.
Installed on Ubuntu 14.04 through a PPA (referenced at https://ffmpeg.org/):
sudo add-apt-repository ppa:mc3man/trusty-media
sudo apt-get update
sudo apt-get install ffmpeg
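Before ffmpeg can write to /dev/video0, the v4l2loopback kernel module has to be loaded. A minimal sketch, assuming the module is already installed; the device number and card label here are example values, not required ones:

```shell
# Load v4l2loopback and expose a single loopback device as /dev/video0
# (video_nr and card_label are example values)
sudo modprobe v4l2loopback devices=1 video_nr=0 card_label="VirtualCam"

# Confirm the virtual device exists
ls -l /dev/video0
```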
FFmpeg has a wide array of parameters that can be used to create or modify a video stream. The ones we use are listed below. A more comprehensive list can be found in the FFmpeg documentation.
Read video at original speed and deliver it to the v4l2 device:
ffmpeg -re -i input.mp4 -r 30 -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
Same as above, video scaled to 640×480:
ffmpeg -re -i input.mp4 -vf scale=640:480 -r 25 -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
Capture screen and direct it to v4l2 device:
ffmpeg -f x11grab -r 15 -s 1280x720 -i :0.0+0,0 -vcodec rawvideo -threads 0 -f v4l2 /dev/video0
Sequentially composes images into an MP4 video at 5 fps:
ffmpeg -framerate 5 -start_number 1 -i data%d.png -r 5 -pix_fmt yuv420p out.mp4
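The data%d.png pattern together with -start_number 1 expects a gap-free numbered sequence (data1.png, data2.png, ...). A quick sanity check of the naming before encoding, using dummy files matching the pattern above:

```shell
# Create a dummy numbered sequence matching the data%d.png pattern
for i in 1 2 3 4 5; do : > "data$i.png"; done

# List the frames in numeric order to confirm there are no gaps
ls data[0-9]*.png | sort -V
```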
Concatenate the videos listed in the "desc" file:
ffmpeg -f concat -i desc -codec copy output.mp4
Content of "desc" file:
file 'out1.mp4'
file 'out2.mp4'
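The concat demuxer reads one file '...' line per input. A sketch of generating the list file from the shell and then running the concat command above (filenames are the example ones):

```shell
# Write the list file expected by "-f concat -i desc"
printf "file 'out1.mp4'\nfile 'out2.mp4'\n" > desc
cat desc

# Then: ffmpeg -f concat -i desc -codec copy output.mp4
```

With -codec copy the inputs are not re-encoded, so all listed files must share the same codec and parameters.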
Grab our images (starting at index 1) and either transmit them directly to the v4l2 device or save them into an MP4 file:
ffmpeg -re -framerate 2 -start_number 1 -i data%d.png -vcodec rawvideo -r 15 -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
ffmpeg -re -framerate 2 -start_number 1 -i data%d.png -r 15 -pix_fmt yuv420p out.mp4
Same as above, but loops through the images until the -t duration is reached:
ffmpeg -loop 1 -framerate 2 -start_number 1 -i data%d.png -t 50 -r 15 -pix_fmt yuv420p out.mp4
Play a video over UDP, then grab it and overlay it:
ffmpeg -re -i AMGS.mp4 -vcodec copy -vbsf h264_mp4toannexb -an -f rawvideo udp://127.0.0.1:1234
ffmpeg -re -i ../Salvador.mp4 -i udp://127.0.0.1:1234 -filter_complex "overlay=0:0" -strict -2 -threads 0 -f v4l2 /dev/video0
Note: the overlay stops encoding if the UDP stream is stopped.
Play frames fed through a pipe; each overlay frame stays on screen for 5 s:
cat frag*.png | ffmpeg -re -y -i Salvador.mp4 -r 1/5 -f image2pipe -i - -filter_complex "overlay=0:0" -strict -2 -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
Create an MJPEG stream to be overlaid on the background video while the stream is still being generated:
ffmpeg -re -i Salvador.mp4 -vcodec mjpeg -q:v 1 -strict -2 stream.mjpeg
ffmpeg -re -y -debug 1 -i Salvador.mp4 -r 5 -i stream.mjpeg -filter_complex "overlay=0:0" -strict -2 -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
Stream MJPEG to a named pipe; the main background stream overlays the MJPEG stream as it is being generated:
ffmpeg -re -i Salvador.mp4 -vcodec mjpeg -q:v 1 -strict -2 -f mjpeg pipe:1 > anyPipe
ffmpeg -re -y -debug 1 -i Salvador.mp4 -i anyPipe -filter_complex "overlay=0:0" -strict -2 -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
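The redirection above only works if anyPipe already exists as a named pipe (FIFO); a plain file would break the streaming behaviour. A minimal sketch of creating it first, using the example pipe name from the commands above:

```shell
# Create the named pipe shared by the two ffmpeg processes
rm -f anyPipe
mkfifo anyPipe

# Writer (run in background): ffmpeg ... -f mjpeg pipe:1 > anyPipe &
# Reader:                     ffmpeg ... -i anyPipe ...
```

Note that writes to a FIFO block until a reader opens it, so the writer ffmpeg should be started in the background before the reader.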