Webcams? An elegant device for a more civilized age.

July 12, 2020 @09:20

I recently found myself overcome by a brief spate of boredom. A general haze of nostalgia accompanied it and got me thinking of the earlier days of the Internet. A number of mostly-forgotten fads came to mind: the web counter, the under construction gif, the "best viewed with" browser logo, and the venerable webcam. The webcam itself was an evolution of other, more practical network-connected cameras, and they have continued to progress further than I would have expected. A quick search of my archives revealed that I still had the very last frame captured from my webcam.

The last frame captured

These days, unless you happen to have a round-trip time measured in minutes, there is something almost gauche about the idea of a still-frame webcam. After all, the tiny computer in your pocket can play Full HD video without even waking up all of its CPU cores, so I set about creating a live stream of the bird feeder on my property, thereby bringing my webcam back to life.

Setting up a Camera

The first step in any webcam project is choosing an appropriate camera. For this particular application I needed an outdoor camera. For ease of installation I very much preferred one that could be powered over Ethernet (passive PoE, 802.3af, or 802.3at would all work) and that could ideally output an h264 video stream. After a bit of searching I bought an S3VC PoE 1080P IP Camera from Amazon.

The camera met all the specs I wanted and the reviews indicated that the configuration UI was accessible without a plugin (I have had several cheap IP cameras with completely useless configuration UIs that required some ancient Windows-only plugin). The configuration UI turned out to work just fine; however, the URLs for the RTSP endpoints given in the documentation and the reviews are wrong for the firmware version that shipped with my camera. The documented URL rtsp://camera/11 did not work. I found this page, which lists several options for the S3VC cameras, and rtsp://camera/cam1/mpeg4 produced the RTSP stream I was looking for.
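If you find yourself hunting for the right URL on one of these cameras, a quick way to check a candidate from a shell looks something like this (the hostname is a placeholder for the camera's address):

ffprobe rtsp://camera/cam1/mpeg4    # print the stream and codec details without playing anything
vlc rtsp://camera/cam1/mpeg4        # or just open the candidate URL directly in VLC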

I customized the video stream output from the camera to minimize bandwidth and verified that I could connect to the stream in VLC. With that step done it was time to get the data onto the Internet.

Camera Video Settings

The State of Live Streaming to the Web

There are two main ways to provide live video to web browsers at the moment. Apple came up with HTTP Live Streaming, also known as HLS, and the MPEG folks spun up several committees and industry forums to produce Dynamic Adaptive Streaming over HTTP, or MPEG-DASH, which is similar but different for... reasons. I set about testing native support across the browsers that I have access to and was rather surprised by what I found.

Browser/OS        Native DASH    Native HLS
Safari/macOS      No             Yes
Safari/iOS        No             Yes
Firefox/macOS     No             No
Firefox/Android   No             Yes

Further poking around caniuse revealed the rather shocking reality of browser support in the wild.

Data on support for the mpeg-dash feature across the major browsers from caniuse.com

Data on support for the http-live-streaming feature across the major browsers from caniuse.com

When putting things on the web I prefer to let the browser do as much of the work as possible, so with native support essentially nonexistent there is no real reason to bother with MPEG-DASH. I'll have to grudgingly lean on some JavaScript to support HLS on the non-Apple desktop browsers.

Creating an HLS encoder/slicer

So now that I know I need to support HLS, how do I create it from an RTSP stream? It looks like ffmpeg itself can act as an encoder and a slicer, but it seems to be meant as an offline encoder rather than a way to provide a live stream. Thankfully there is a third-party RTMP module for Nginx that supports HLS and can execute arbitrary programs at various points in the stream lifecycle, so it seemed like it would be fairly trivial to create an encoder. The next question was where to put it. Since the camera is at home I wanted to host the encoder there so the video wouldn't be traversing the VPN (and therefore my anemic cable Internet upload) to my web server unless someone is actually watching. That also meant deploying the encoder stack as a container so I could leverage the same proxy and firewall infrastructure that I use for other applications that I don't trust to be directly exposed to the Internet (like Grafana).

Building the container

At first I tried to use the official Nginx container; however, it doesn't seem to include the RTMP module, so I would have had to fork and locally build their image. Before doing that I checked whether Debian had the module available in the stock package repositories, and it does, so that is where I started. The container itself began as a copy of the Nginx proxy image that I use to front end any applications running on my container infrastructure.

I was able to put together a pretty simple container that renders the various configuration fragments from a JSON configuration file. You can find the source in my git repository.
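To give a rough idea of its shape, a Dockerfile following the Debian-package approach looks something like the sketch below. This is not the Dockerfile from the repository; the entrypoint.py name is a hypothetical stand-in for whatever renders the templates and launches Nginx.

FROM debian:buster-slim
# Debian packages the RTMP module separately as libnginx-mod-rtmp;
# ffmpeg does the transcoding and python3-jinja2 renders the templates.
RUN apt-get update \
 && apt-get install -y --no-install-recommends \
        nginx libnginx-mod-rtmp ffmpeg python3-jinja2 \
 && rm -rf /var/lib/apt/lists/*
COPY nginx.conf.tmpl rtmp.conf.tmpl entrypoint.py /opt/streamer/
# entrypoint.py (hypothetical name) renders the configuration from the
# JSON file and then runs nginx in the foreground.
ENTRYPOINT ["/opt/streamer/entrypoint.py"]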

Configuring ffmpeg for nginx-mod-rtmp

Of particular note is the rtmp.conf.tmpl file in the above git repository. The file is fairly well commented and the RTMP module documentation contains all the extra details. The part I want to call out is the exec_static section (which in this case contains some Jinja2 template directives), which causes Nginx to execute ffmpeg to pull the RTSP stream from the camera and create the two variants for HLS streaming (one for mobile and one for desktop, though the client will ultimately select which version to play back).

exec_static ffmpeg -hide_banner
    -i {{ stream['url'] }}
    -preset veryfast
    -tune zerolatency
    -c:v libx264 -profile:v main -b:v 448k
    -s 640x360 -f flv
    rtmp://localhost/live/{{ stream['name'] }}_360
    -preset veryfast
    -tune zerolatency
    -c:v libx264 -profile:v main -b:v 1024k
    -s 1280x720 -f flv
    rtmp://localhost/live/{{ stream['name'] }}_720;

Both streams use h264 as the video codec with the veryfast preset and zerolatency tune to provide rapid stream startup. The first stream is 360p and the second is 720p; these correspond to the hls_variant configurations in rtmp.conf.tmpl. You can chain even more output sections to provide additional variants if your use case requires it, as long as you also add matching hls_variant entries to the configuration. Nginx will then manage the master playlist for you.
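For context, the HLS side of the RTMP configuration that consumes those two streams looks roughly like the fragment below. The path, fragment length, and bandwidth figures are illustrative rather than copied from rtmp.conf.tmpl, but the _360/_720 suffixes have to match the stream names ffmpeg publishes to.

application live {
    live on;
    hls on;
    hls_path /var/lib/nginx/hls;    # where segments and playlists get written (illustrative)
    hls_fragment 3s;
    # The suffixes match the ..._360 and ..._720 streams above; the extra
    # attributes end up in the EXT-X-STREAM-INF lines of the master playlist.
    hls_variant _360 BANDWIDTH=500000 RESOLUTION=640x360;
    hls_variant _720 BANDWIDTH=1200000 RESOLUTION=1280x720;
}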

The really nice thing about this is that it is entirely self-contained: the transcoding is managed by Nginx, which restarts ffmpeg automatically if it crashes or loses connectivity to the camera, and Nginx also cleans up old segment files without the need for an external cron job or helper. The video traffic is all handled within the container, so I don't need to expose any ports from the container to the world. This means that multiple containers can be deployed and they should all do the right thing idempotently and independently.

While I was at it I also proxied the still image output from the camera to poster.jpg so I can have the <video> element show an actual still from the camera as it is loading the video stream. You can even set the URL to proxy to in the JSON configuration. On startup nginx.conf.tmpl will render that into a proxy stanza.

location /poster.jpg {
    add_header X-UG-Streamer-Id $hostname always;
    add_header Cache-Control "public";
    expires 1h;
    proxy_pass {{ poster_url }};
}

Birdcam frame grab
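Tying it together on the page, the markup ends up looking something like this. The manifest path is a placeholder for wherever the HLS playlist is served from; the birdcam-video id and the <source> child are what the player script in the next section expects to find.

<video id="birdcam-video" controls muted playsinline poster="/poster.jpg">
    <source src="/hls/birdcam.m3u8" type="application/vnd.apple.mpegurl">
</video>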

It always ends with JavaScript

With the encoder/slicer container running we have ourselves an HLS stream, but we still need to get a browser to actually play it. Of course, as I discovered before, just dropping it in a <video> tag is not enough in most non-mobile, non-Apple cases. There are a number of JavaScript player packages out there, but most of them aim to be able to do everything, and upon investigation many of them inherit their HLS playback functionality from a library called hls.js, so I decided to go directly to the source. I created a small wrapper that checks for native HLS support and, failing that, loads the light version of hls.js and starts the stream. This saves about half a megabyte of JavaScript in most cases, which is nice. The wrapper tries to handle various errors gracefully and to restart the stream if it is interrupted. Finally, it uses the Page Visibility API to stop the stream when the page is hidden and restart it automatically when it becomes visible again. The most recent version of the script is visible here, and the version that is current at the time of writing is included below.

/* birdcam.js (c) 2020 Matthew J. Ernisse <matt@going-flying.com>
 * All Rights Reserved.
 *
 * Redistribution and use in source and binary forms,
 * with or without modification, are permitted provided
 * that the following conditions are met:
 *
 *    * Redistributions of source code must retain the
 *      above copyright notice, this list of conditions
 *      and the following disclaimer.
 *    * Redistributions in binary form must reproduce
 *      the above copyright notice, this list of conditions
 *      and the following disclaimer in the documentation
 *      and/or other materials provided with the distribution.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
 * 'AS IS' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
 * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
 * FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
 * COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
 * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
 * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
 * OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
 * ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
 * TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
 * USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */
'use strict';

/*global Hls*/

class StreamPlayer {
    constructor() {
        this.dash = false;
        this.hls = false;
        this.hlsjs = false;
        this.hlsjsInstance = undefined;
        this.useragent = navigator.userAgent;
        this.video = document.getElementById('birdcam-video');

        if (! this.video) {
            console.log('Could not locate video, aborting.');
            return;
        }

        /* HTMLMediaElement Events:
         * abort - emitted when we remove the video on visibilityChange
         * error - I have never seen
         * loadeddata - I have never seen
         * loadstart - When data starts loading
         * waiting - See note below
         */
        this.video.addEventListener('error', (evt) => {
            console.log(evt);
            this.displayError();
        });

        this.video.addEventListener('loadstart', (evt) => {
            console.log(evt);
        });

        /* This is the last event we seem to get in the case of
         * an error, it doesn't seem to throw an error event for
         * some reason.  If srcElement.networkState == 3 then
         * we have no valid source.
         */
        this.video.addEventListener('waiting', (evt) => {
            if (evt.srcElement.networkState === 3) {
                this.displayError();
            }
        });

        if (this.video.canPlayType('application/dash+xml')) {
            this.dash = true;
        }

        if (this.video.canPlayType('application/vnd.apple.mpegurl')) {
            this.hls = true;
        }

        const sources = this.video.getElementsByTagName('source');
        this.hlsSrc = sources[0].src;

        if (! this.hls) {
            const script = document.createElement('script');
            script.src = '/js/hls/hls.light.js';
            script.type = 'text/javascript';
            document.head.appendChild(script);
            script.addEventListener('load', () => {
                this.hlsjsLoaded();
            });
        }

        // Pause the video when the page is not visible.
        document.addEventListener('visibilitychange', () => {
            if (document.hidden) {
                this.video.pause();
                console.log('hidden, pause.');
            } else {
                if (! this.hlsjs) {
                    this.video.load();
                } else {
                    this.hlsjsLoaded();
                }
                console.log('visible, playing.');
            }
        }, false);
    }

    displayError() {
        const failvid = '/videos/static.mp4';
        const msg = document.createElement('div');
        const mSpan = document.createElement('div');
        const src = document.createElement('source');

        while (this.video.firstChild) {
            this.video.firstChild.remove();
        }

        // Create error message
        mSpan.appendChild(document.createTextNode('Stream Error'));
        msg.appendChild(mSpan);
        msg.classList.add('birdcam-error');
        this.video.parentNode.appendChild(msg);

        // Swap video for static loop
        src.src = failvid;
        this.video.appendChild(src);
        this.video.setAttribute('loop', '');
        this.video.play();
    }

    hlsjsLoaded() {
        if (! Hls || !Hls.isSupported()) {
            this.displayError();
            return;
        }
        this.hlsjs = true;
        this.hlsjsInstance = new Hls();
        this.hlsjsInstance.loadSource(this.hlsSrc);
        this.hlsjsInstance.attachMedia(this.video);
        this.hlsjsInstance.on(Hls.Events.MANIFEST_PARSED, () => {
            this.video.play();
        });

        this.hlsjsInstance.on(Hls.Events.ERROR, () => {
            this.displayError();
        });
    }
}


if (document.readyState != 'complete') {
    document.addEventListener('readystatechange', event => {
        if (event.target.readyState === 'complete') {
            new StreamPlayer();
        }
    });
} else {
    new StreamPlayer();
}

After all of that I tested this on the various browsers I have access to and it seems to work well enough. If you are interested you can see the results here.

Conclusion

I think the most surprising part of the whole project was discovering that native browser support for streamed video is so poor. It seems like most of the large streaming sites have implemented their own players in JavaScript and the browser vendors have mostly left it that way. All of the back end transport stuff was extremely easy and I had a working HLS stream in a matter of minutes; the bulk of the effort was invested in getting playback to work. I do want to set up some telemetry to see what the general user experience is like, but first I need to figure out how to do that ethically.

Anyway, enjoy the birds and the return of the webcam.

Best viewed with Netscape Now! from textfiles.com
