The ongoing apocalypse-like situation has pushed video-on-demand and live streaming services beyond all predictions and expectations. While video on demand (e.g. watching the Resident Evil movie) eases the anxiety and claustrophobia brought on by #stayathome directives, live streaming helps us not to forget the faces of our family members, friends and coworkers.
In this article I will cover one way to capture live video from a camera or a computer's screen and show it in any modern Internet browser.
Use cases and implementation
There are quite a few live streaming implementations, including the most promising free, open-source WebRTC, but, fortunately for our poor brains, all of them share some basic concepts:
- capture live video from some source, e.g. a PC camera, a remote camera or the computer's screen (we'll use GStreamer for this)
- stream it to a dedicated Wowza-like server for additional post-processing (we'll create a simple media server ourselves using GStreamer and a WEB server)
- process and convert the video stream into a format "compatible" with browsers (again GStreamer)
- make the processed stream available to multiple browser-oriented viewers (we'll use a simple WEB server)
I have to admit the article's title is a bit misleading, because it is not actually possible to stream video to a browser using only GStreamer. But do not worry: this is not clickbait, just an SEO-optimized link title :)
Compatibility with browser
Q: Why do we need to pay attention to browser compatibility?
A: Modern browsers can show video encoded only with a limited set of video codecs and wrapped in a limited set of media containers. That is why we need to encode/pack our video source into a format/codec that can be understood and interpreted (unpacked/decoded) by the browser.
To achieve browser compatibility and overcome these technology constraints, we will post-process the live video stream and make it compatible with the HLS protocol: the live video stream will be continuously segmented into a series of MPEG-TS container files, with each segment containing only a small part of the live video.
GStreamer provides a CLI (command line interface) that allows us to build and execute pipelines. Although it is basically a prototyping tool, it will be enough for our purposes.
Capture computer's camera
To capture the most common live video source - the PC camera - and show it in an embedded player window, you can use the following command
gst-launch-1.0 v4l2src device="/dev/video0" ! videoconvert ! autovideosink
If you see some person on the screen now, you have confirmed that
- you were not meant to be a face model (that is always my first thought when I see my own pretty face on the screen)
- your camera is operational and we can continue with our exercise
Now you can extend the previous command with the clockoverlay element to add the current PC time in the upper left corner of the video
gst-launch-1.0 v4l2src device="/dev/video0" ! videoconvert ! clockoverlay ! autovideosink
Capture computer's desktop
The following command will do the basic job
gst-launch-1.0 ximagesrc ! videoconvert ! clockoverlay ! autovideosink
If this command slows down your computer, try the more optimized desktop-capture pipeline shown at the end of this article (it sets use-damage=0).
Capture some other video source
The v4l2src element captures video from the computer's camera and ximagesrc captures the desktop screen. To capture video from some other source, e.g. a remote RTSP-enabled camera, the following example command can be used
gst-launch-1.0 rtspsrc location="rtsp://freja.hiof.no:1935/rtplive/_definst_/hessdalen03.stream" ! \
  rtph264depay ! avdec_h264 ! clockoverlay ! autovideosink
This command will capture and show live video from remote camera located in Hessdalen, Norway
Am I the only one who wants to spend the rest of life there?
Before you continue...
- create a new directory and go into it
mkdir hlstest
cd hlstest
- execute the pwd command to check your current path. Let's suppose it is /home/gstreamer/hlstest
- execute ifconfig and note the local IP address of your PC. Let's suppose it is 192.168.0.11
Making video compatible with browser
So far we have utilized the autovideosink element to show video in an embedded GStreamer window. Now we are ready to prepare our video for the browser: we will replace autovideosink with elements that encode and wrap the video to be compliant with the HLS protocol.
gst-launch-1.0 v4l2src device="/dev/video0" ! videoconvert ! clockoverlay ! \
  x264enc tune=zerolatency ! mpegtsmux ! \
  hlssink playlist-root=http://192.168.0.11:8080 location=/home/gstreamer/hlstest/segment_%05d.ts target-duration=5 max-files=5
This is a good moment to explain the pipeline's elements in more detail
- v4l2src captures live video from the computer's camera
- videoconvert converts the raw video coming from the camera into a colorspace/format accepted by the next element
- clockoverlay adds a timestamp in (by default) the upper left corner of the video
- x264enc encodes the raw video using the H264 codec
- mpegtsmux wraps the H264-encoded video into an MPEG-TS container
- hlssink is responsible for the following:
  - accepts the MPEG-TS stream
  - creates file segments and saves them to the current directory (each file contains 5 seconds of live video, at most 5 files exist at a time, older files are automatically deleted)
  - creates the extended M3U descriptor file
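hlssink's rolling-window behaviour (keep at most max-files segments, delete the oldest) can be sketched in a few lines of plain Python. This is only an illustration of the idea, not GStreamer's actual implementation; the file naming mirrors the location=segment_%05d.ts pattern used above:

```python
import os
import tempfile

def write_segment(directory, index, data, max_files=5):
    """Write segment_%05d.ts and keep only the newest max_files segments,
    mimicking hlssink's max-files behaviour (illustration only)."""
    path = os.path.join(directory, "segment_%05d.ts" % index)
    with open(path, "wb") as f:
        f.write(data)
    segments = sorted(p for p in os.listdir(directory) if p.endswith(".ts"))
    # delete everything except the newest max_files segments
    for old in segments[:-max_files]:
        os.remove(os.path.join(directory, old))

with tempfile.TemporaryDirectory() as d:
    for i in range(8):
        write_segment(d, i, b"\x00")
    print(sorted(os.listdir(d)))  # only the five newest segments remain
```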
Extended M3U descriptor file format
While the previous command is running, you can explore the files inside the
/home/gstreamer/hlstest directory. Observe how the video segments are dynamically created, and check out the format of the playlist.m3u8 descriptor file
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:NO
#EXT-X-MEDIA-SEQUENCE:1007
#EXT-X-TARGETDURATION:5
#EXTINF:5.0238618850708008,
http://192.168.0.11:8080/segment_01007.ts
#EXTINF:5.0238628387451172,
http://192.168.0.11:8080/segment_01008.ts
#EXTINF:5.0238499641418457,
http://192.168.0.11:8080/segment_01009.ts
#EXTINF:5.0238618850708008,
http://192.168.0.11:8080/segment_01010.ts
#EXTINF:2.9319372177124023,
http://192.168.0.11:8080/segment_01011.ts
#EXT-X-ENDLIST
The M3U file contains information (URLs and time order) about the currently available segments, and it is continuously updated as new segments are added and old ones deleted.
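The descriptor format is simple enough to parse by hand. Here is a minimal Python sketch that extracts (duration, URL) pairs; it handles only the plain #EXTINF layout shown above, not the full HLS specification:

```python
def parse_m3u8(text):
    """Extract (duration_seconds, uri) pairs from a simple media playlist.
    Only handles the #EXTINF lines shown above, not the full HLS spec."""
    segments = []
    duration = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:5.02," -> take the number before the comma
            duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#") and duration is not None:
            segments.append((duration, line))
            duration = None
    return segments

playlist = """#EXTM3U
#EXT-X-TARGETDURATION:5
#EXTINF:5.02,
http://192.168.0.11:8080/segment_01007.ts
#EXTINF:2.93,
http://192.168.0.11:8080/segment_01008.ts
"""
print(parse_m3u8(playlist))
```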
Browser HTML page with video
Now that our pipeline is running and creating HLS segments representing parts of our live video, it takes only a small effort to create an HTML page that shows it. In the
/home/gstreamer/hlstest directory create an
index.html file with the following content
<!DOCTYPE html>
<html>
<head>
  <meta charset=utf-8 />
  <title>My live video</title>
  <link href="https://unpkg.com/video.js/dist/video-js.css" rel="stylesheet">
</head>
<body>
  <h1>My live video - simple HLS player</h1>
  <video-js id="video_id" class="vjs-default-skin" controls preload="auto" width="640" height="360">
    <source src="http://192.168.0.11:8080/playlist.m3u8" type="application/x-mpegURL">
  </video-js>
  <script src="https://unpkg.com/video.js/dist/video.js"></script>
  <script src="https://unpkg.com/@videojs/http-streaming/dist/videojs-http-streaming.js"></script>
  <script>
    var player = videojs('video_id');
  </script>
</body>
</html>
This very simple piece of code uses VideoJS, a JavaScript library that supports, among other things, HTTP streaming (HLS and DASH). A few notes:
- the <video-js> tag is a wrapper around the standard HTML5 <video> tag
- the <source> tag points at the M3U descriptor file at
http://192.168.0.11:8080/playlist.m3u8 to find out information about segments. The JavaScript is
  - periodically downloading the playlist.m3u8 file and checking the available video segments
  - then, knowing each segment's name and URL, downloading segment by segment and playing them on the browser's page
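The heart of that refresh logic, working out which segments appeared since the last playlist download, can be sketched like this (an illustration of the idea, not VideoJS's actual code):

```python
def new_segments(previous, current):
    """Return segment URIs present in `current` but not in `previous`,
    preserving playlist order (sketch of the player's refresh step)."""
    seen = set(previous)
    return [uri for uri in current if uri not in seen]

old = ["segment_01007.ts", "segment_01008.ts"]
new = ["segment_01008.ts", "segment_01009.ts", "segment_01010.ts"]
print(new_segments(old, new))  # the player would download these next
```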
Serve files with WEB server
All we have to do now is to serve all the files so that the browser can access them.
Open another console window, go to the
/home/gstreamer/hlstest directory and execute the command
for Python 2.X
python -m SimpleHTTPServer 8080
or, a little better, if Python 3.X is available
python3 -m http.server 8080
These commands use an existing Python module (part of Python's standard distribution) to create a simple WEB server (on port 8080) that enables the browser to access all files under the current directory.
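If you later want to tweak the server from code (e.g. change the served directory or port), the same standard-library machinery can be driven from a short Python script. A minimal sketch; serve_directory is just a hypothetical helper name:

```python
import os
import tempfile
import threading
import urllib.request
from functools import partial
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

def serve_directory(directory, port=0):
    """Serve `directory` over HTTP in a background thread.
    port=0 lets the OS pick a free port; see server.server_address for the real one."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    server = ThreadingHTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# usage sketch: serve a throwaway directory and fetch a file from it
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "index.html"), "w") as f:
        f.write("<h1>My live video</h1>")
    server = serve_directory(d)
    url = "http://127.0.0.1:%d/index.html" % server.server_address[1]
    body = urllib.request.urlopen(url).read().decode()
    server.shutdown()
    server.server_close()
print(body)
```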
Python's HTTP server is handy but far from perfect. For more complex scenarios involving multiple clients, use a more advanced and more reliable WEB server like Nginx.
View live video in browser
On any computer or smartphone inside the local network, just open a browser and type
http://192.168.0.11:8080/index.html. The result should be something like this
Should I mention that your face will appear instead of mine?
If my face appears at your side too, then something went terribly wrong :)
Please note that, for simplicity's sake, all the GStreamer pipelines above are basic. I would recommend the more optimized and configurable pipelines below instead.
PC camera
gst-launch-1.0 -v v4l2src device="/dev/video0" ! videoconvert ! clockoverlay ! \
  videoscale ! video/x-raw,width=640,height=360 ! x264enc bitrate=256 ! video/x-h264,profile=\"high\" ! \
  mpegtsmux ! hlssink playlist-root=http://192.168.0.11:8080 location=segment_%05d.ts target-duration=5 max-files=5
Remote RTSP camera
gst-launch-1.0 -v rtspsrc location="rtsp://freja.hiof.no:1935/rtplive/_definst_/hessdalen03.stream" ! \
  rtph264depay ! avdec_h264 ! clockoverlay ! videoconvert ! videoscale ! video/x-raw,width=640,height=360 ! \
  x264enc bitrate=512 ! video/x-h264,profile=\"high\" ! \
  mpegtsmux ! hlssink playlist-root=http://192.168.0.11:8080 location=segment_%05d.ts target-duration=5
Computer's desktop
gst-launch-1.0 ximagesrc use-damage=0 ! videoconvert ! clockoverlay ! \
  videoscale method=0 ! video/x-raw,width=1280,height=720 ! x264enc bitrate=2048 ! video/x-h264,profile=\"high\" ! \
  mpegtsmux ! hlssink playlist-root=http://192.168.0.11:8080 location=segment_%05d.ts target-duration=5 max-files=5
Happy streaming :)