Matthew Ernisse

Edited: October 18, 2017 @14:03

I have been meaning to play around with containers for a while, but for the life of me I have not found a real reason to. I feel like without a real use case, any attempt I'd make to learn anything useful would be a huge waste of time. There are a bunch of neat toys out there, from random ASCII art commands to a crazy script that 'emulates' some of the insane Hollywood-style computer screens, as well as base images for all manner of application stacks and frameworks, but all of those are easily installable using your favorite package manager.

None of this really made me care enough to install and learn anything about any of the container ecosystems. I do like the idea of containers as sandboxes but as a macOS user I have that built in for free, so I have no impetus there either.

Still, there is a lot of talk about containers in the development community, so I have been keeping an eye out for a use-case where I could justify investing time in them. Lately my primary development work has been creating various bespoke Flask applications. Flask ships with Werkzeug and a simple development server built in, so I typically just run the internal server, iterate on the code, and then commit to my git repository. Eventually Puppet comes along and does the heavy lifting to deploy the changes to production. This works really well and I can't really figure out a reason to shoehorn a container into the process.
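
For illustration, this is roughly the shape of that inner loop; a minimal sketch with a placeholder route, not one of the actual applications:

#!/usr/bin/env python
# Minimal Flask app served by the built-in Werkzeug development server.
# The route and message are placeholders, not part of any real project.
from flask import Flask

app = Flask(__name__)


@app.route('/')
def index():
        return 'Hello from the development server.'


if __name__ == '__main__':
        # Werkzeug's built-in server; fine for local iteration, never production.
        app.run(host='127.0.0.1', port=5000, debug=True)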

Docker on Aramaki

Turns out the excuse came from this web site. As I have written about before, this entire site is generated from a home-brew Python script. It takes the design from templates and the blog articles from markdown files, and it is triggered from a git post-receive hook on the web server. This lets me make a very fast web site that doesn't rely on any dynamic pages or API calls. The one drawback of this method lies in the differences between viewing pages over HTTP/HTTPS versus off the local filesystem. To test the site locally I was hand-editing some of the output to change some of the URLs from paths that work on the website to paths that work on the local filesystem. This was getting annoying and frankly is just the thing to replace with a very small shell script.

I initially thought about modifying the build script to use filesystem paths when building locally, but that would just add complexity and potential for breakage. I then thought about fooling around with the web server built into macOS, but I am generally loath to mess around with things in the bowels of the OS lest I do something that Apple breaks in an update. In the end I figured this might finally be a good excuse to pull together a Docker container running Apache that included the Python bits the site builder needed, and then, in true ex-sysadmin fashion, wrap it all up in a nice shell script.

This resulted in a pretty reasonable workflow.

  1. Update the working copy of the site.
  2. Run test.sh, which will:
    • build the Docker image,
    • copy the working copy into the image,
    • launch an instance of the image,
    • open a browser to the URL of the local Docker instance.
  3. Verify things are the way we want.
  4. Fix and GOTO 1, or continue.
  5. git add, commit, and push to the remote.
    • The git hook deploys to production.

Now, to be fair, there are probably easier ways to do this, including using a staging branch served from another domain name, a directory, or an internal VM. That would save me from building, launching, and cleaning up images. I could use my normal publishing workflow and scripts to simply do the right thing and then merge back to master when I'm ready to deploy the site to production.

But that doesn't give me an excuse to play with 🐳 Docker. 😁

Details

At the time of writing, these are the main pieces that make this workflow possible.

Dockerfile

FROM debian:latest
LABEL version="0.3.0" \
    vendor="Matthew Ernisse <matt@going-flying.com>" \
    description="Build and serve going-flying.com"

RUN apt-get update \
    && apt-get install -y \
    apache2 \
    python \
    python-pip \
    && rm -rf /var/lib/apt/lists/* \
    && mkdir -p /var/www/going-flying.com \
    && a2dissite 000-default

COPY docker/going-flying.conf /etc/apache2/sites-available
COPY . /var/www/going-flying.com

RUN a2ensite going-flying \
    && pip install \
    --requirement /var/www/going-flying.com/docker/requirements.txt \
    && /var/www/going-flying.com/build.py

EXPOSE 80
CMD ["/usr/sbin/apachectl", "-DFOREGROUND"]

This is pretty straightforward. I take the Debian base Docker image and install the bits I need to build and serve the site. I also have a very basic Apache configuration fragment that points the server at the location I will be copying the site files to (the same location as in production, so the build script doesn't have to care). I then simply copy the working copy of the site into the image and run build.py on it.

test.sh

#!/bin/sh
# test.sh (c) 2017 Matthew J. Ernisse <matt@going-flying.com>
# All Rights Reserved.
#
# Build and run a copy of the website inside a Docker container.

set -e

echo "going-flying.com test builder."

if ! which docker >/dev/null 2>&1; then
    echo "docker not found."
    exit 1
fi

if [ "$(uname -s)" != "Darwin" ]; then
    echo "Not running on macOS.  Exiting."
    exit 2
fi

cat << EOF

                    ##         .
              ## ## ##        ==
           ## ## ## ## ##    ===
       /"""""""""""""""""\___/ ===
      {                       /  ===-
       \______ O           __/
         \    \         __/
          \____\_______/

EOF

echo "Building image..."
_image=$(docker build --force-rm --squash . -t going-flying:latest | \
    awk '/^Successfully built [0-9a-f]+/ { print $3 }')

docker run --rm -d -p 8080:80 --name going-flying $_image > /dev/null

open "http://localhost:8080"

echo "Container running, Press [RETURN] to end."
read
echo "Stopping..."

docker stop going-flying > /dev/null
echo "OK."

This just does the docker build and docker run dance that gets a container running. It can probably be simplified even further, but it gets the job done. The biggest things were making sure that I wasn't leaving a pile of images and whatnot lying around, and not having to remember the different command line switches needed to make it all Just Work.

build.py

The only other change was a hook in build.py that changes the base URL of the site from the normal https://www.going-flying.com/ to http://localhost:8080/. It does this by simply detecting if it is running in a Docker container and changing an instance variable.

def is_docker():
        ''' Return True if we're running in docker.'''
        if not os.path.exists('/proc/self/cgroup'):
                return None

        with open('/proc/self/cgroup') as fd:
                line = fd.readline()
                while line:
                        if 'docker' in line:
                                return True

                        line = fd.readline()

        return None

[ ... later in main() ... ]

        if is_docker():
                BuildConfig.urlbase = "http://localhost:8080/"
                print ":whale:  container detected."

I was skeptical at first about whether this was going to be worth it, but after using it for a few site updates I honestly feel that this was easier than many of the alternatives, and in the end it let me go back to fixing a bunch of style and template bugs that had been on the TODO list for some time. I'd call that a result worth the effort. I look forward to finding more places where a container fits into my workflow. It might even turn into an excuse to run a private registry and start playing with some of the CI tools to run builds.

Errata

It turns out that Safari doesn't like to autoplay videos that are not in view when the page loads. I tried to slam together some JavaScript to 'fix' this, but your mileage may vary. If the videos aren't playing you should be able to right click on one of them, say 'Show Controls', and then hit play.

October 13, 2017 @19:10

I monitor the DSM version on my Synology NAS with my icinga2 instance, and sometimes alerts pop up while I'm not in a position to run the upgrade using the normal GUI process.

This is rare enough that I almost always find myself trying to remember how to do it via ssh(1) and after flailing around aimlessly for a while I ultimately figure it out. This time I figured I'd write it down so I can at least find it in the future.

Basically synoupgrade is what you want.

Synology DS214se

admin@nas01:/$ sudo synoupgrade --check
Password:
UPGRADE_CHECKNEWDSM
Available update: DSM 6.1.3-15152 Update 7, patch type: smallupdate, restart type: none, reboot type: now
admin@nas01:/$ sudo synoupgrade --download
UPGRADE_DOWNLOADDSM
New update has been downloaded
admin@nas01:/$ sudo synoupgrade --start
UPGRADE_STARTUPGRADE
Start DSM update...
Finish DSM update, reboot now!!

Broadcast message from root@nas01
    (unknown) at 19:00 ...

The system is going down for reboot NOW!

October 12, 2017 @22:00

iPad Impressions

I mentioned a few things in my first post that I thought might be better on the iPad than the iPhone.

iPad iOS 11 Screenshot

I like the new task switcher a lot, and I can see potential in the dock if you don't turn off all the iCloud features. I was wrong about the video stuff, though; that's still too small and garbage.

iPad iOS 11 Screenshot

I Still Think This Stuff is Ugly

iPhone iOS 11 Screenshot

The more I look at it, the less I like the huge text block at the top of the tab screen. It is such a huge waste of screen real estate, which feels antithetical to the entire point of designing a mobile UI.

Control Center Is Doing Radios WRONG

I ran into this last week as I was flying to Las Vegas for a work conference. I turn off WiFi and Bluetooth when I'm traveling for a number of reasons, but mostly battery life. It turns out that tapping the radio icons in Control Center does not actually turn off the radios; it merely disconnects you from the currently connected items, leaving the radios on, draining your battery, and broadcasting information out into the aether. You have to actually go into Settings to turn off the radios. Thankfully Airplane Mode seems to actually disable the radios, so my battery didn't get murdered on the flights, but the last thing you want to do is walk around a technical conference in Las Vegas with unneeded radios in your phone looking for something to connect to.

The long and the short of it is that those buttons should disable the radios not disconnect them.

(╯°□°）╯︵ ┻━┻

October 09, 2017 @15:40

There are a lot of reviews of iOS 11 out there already and, as is almost always the case, people are complaining that things changed. This is not that. Part of the reality of living with consumer-oriented technology is that things change. As a whole iOS 11 seems to be an improvement over previous versions, and in general I'm happy with it.

The Good

Security

Apple continues to take security seriously, now requiring passcode authentication after repeated failures as well as after a reboot. Requiring the phone to be unlocked before data can be pulled off of it is also an improvement. This is especially important given the continued pressure on privacy and security in the face of difficult times all across the world. This kind of behavior should be the default on all mobile devices.

Third Party Location Use Alert

iOS 11 Screenshot

In previous versions it seemed that Apple Maps was the only navigation program that would present you with the blue bar across the top of the screen, letting you know it was actively using your location (and providing a low-friction way to task switch back to navigation). Obviously you shouldn't be using your phone while driving 😔 and the DND While Driving mode is a nice feature (though I doubt anyone is really using it), but since the Podcast app is now broken (see below for more), if you find yourself changing music or doing some other totally reasonable thing while driving, it is nice to be able to just tap on the header to go back to your navigation app.

Phone Controls While Locked

This is seriously great. Not having to unlock the phone to toggle mute, or to switch audio outputs, or to end the call is a great thing. Even if you don't fanatically toggle mute while on conference calls, at the very least this has totally gotten rid of the 3 seconds of awkward silence after you say "goodbye" while both parties root around trying to get to the 'end call' button.

The Bad

Stupid Home Control Bug (still)

iOS 11 Screenshot

I think this has been around as long as the Home Control feature has been in iOS. I remember it on my iPad Mini 2. It seems that when you turn this off, it always ends up turning itself back on. Thankfully I don't have any HomeKit devices because frankly this is a security bug. I don't want someone to be able to see or interact with my home automation without being authenticated. That's just... shocking. It's like having a lock on your door that doesn't need a key. What is the point?

Podcast App Now Basically Useless

iOS 11 ScreenCast

I listen to podcasts while driving, mowing the lawn, working around the house, and sometimes while working at work. I am DEEP in the back catalog of most of the podcasts I listen to, so I don't want to have to stop what I'm doing and pull the phone out to navigate to and select the next episode. In previous versions I did not have to. For some reason the built-in Podcast app now stops after every episode, even if there are more unplayed episodes already downloaded to the device. This seems strange for Apple, as it adds friction to using the device.

App Search Completely Useless

iOS 11 ScreenCast

I don't like having a pile of apps on my home screen. This is similar to how I use my MacBook Pro. I toss everything in a folder and search for what I want when I need it. On the Mac clover+space works great. On the previous version of iOS this same workflow worked well. Swipe down, type two or three letters and the app you want is probably listed. Tap and launch.

In the interest of full disclosure, I have always had Siri and all the related Siri features turned off on all of my iOS devices, but in the past this worked great.

Now, however, you have to type the entire, exact name of the application for it to show up. This makes the workflow much more difficult and cumbersome. More friction for no reason.

iPhone Force Touch App Switching Gone

With all the focus on the iPad, multitasking on the iPhone lost a really handy feature. In iOS 10 you used to be able to force touch on the edge of the screen to get the task switcher. This required less movement and left your thumb already on the screen to switch or close apps, versus the double-click home gesture. Again, more friction for no reason.

The Ugly

New Apple Visual Style

iOS 5 Screenshot

I feel like this is the most subjective of the changes. While the UI has changed a LOT over the years and is largely an improvement, I can't help but be a little bothered by the waste of space that comes along with the new design language. The fairly ubiquitous search bar just off the top of the screen is nice, but the giant title of the app you just tapped on seems... a bit superfluous. Maybe this is less of an issue on the larger screen devices like the iPhone X and the iPads, but on a regular iPhone 7 you have a solid 1/4" of the display taken up by quite literally the name of the thing you just tapped on.

Why?

Native Video Controls Now TINY

iOS 11 Screenshot

I don't have a lot to say about this, but the native video player controls are now much smaller and harder to hit. The volume overlay is better but the rest have gotten markedly worse. Most of these aren't a huge issue but trying to hit the full screen or AirPlay buttons has become much more painful. Like I said about the visual style this may be less of an issue on the larger devices, but on the smaller screen standard devices it is pretty irritating.

Overriding Settings To Default On Upgrade

One of the uglier things that iOS 11 did upon upgrade was to turn on a whole bunch of features that I had explicitly turned off in iOS 10. Things like iCloud, iMessage, and Siri turned themselves on without warning. I declined the iCloud Drive feature during setup, but a bunch of the other features popped on and iCloud added itself as an 'Account' automatically. I imagine this was part of some preferences migration in the phone since things have clearly changed, but it seems like the case where the user had previously disabled all of this had not been tested, or possibly (worse) someone decided to ignore the user's wishes and turn this stuff back on. It seems like I caught it before it started syncing anything back to the Apple mothership, but it feels like a potential privacy leak. At the very least it required an audit of all of the options in Preferences to verify things weren't being uploaded to Apple, which was a waste of time and added friction to using the device.

Conclusion

I Miss Steve Jobs

Steve Jobs Headshot

You can say a lot about Steve Jobs. He certainly had a very storied career. For all the drawbacks of his fanatical and perhaps ego-driven attention to detail, one thing that was true under his stewardship of Apple was the focus on polish and on reducing user friction with technology. This version of iOS, while largely good, is somehow lacking the polish that I have otherwise come to expect. I don't know if it is the reality that he's gone and his curated plans for the products he shepherded into life have finally run out, or if it is the inexorable pressure of the market on Apple to continue to 'innovate' on a very aggressive 12 month cycle. Either way it is a stumble, and while all technology is created by humans, who are by their very nature fallible and under enormous pressure to remain the most valuable company in the world, I do hope that these things get polished away instead of becoming a canary in the coal mine for the future of the product.

You Should Still Upgrade

All of the nits I picked above aside, you should still upgrade. The security and stability updates alone are important enough to warrant keeping current.

Steve Jobs headshot used under license, see here for details, iOS 5 screenshot from Softpedia

September 30, 2017 @00:02

So I Heard You Like Videos

As a follow up to my Favorite Podcasts post, I figured I would talk a bit about my favorite Youtube channels. A while back I wrote a Flask app to take a bunch of different web services that I didn't feel like having accounts on and turn them into RSS feeds. In the case of Youtube I combine the RSS feeds of each channel into a single RSS feed that I subscribe to. This makes it a lot easier to keep up with the periodic deluge of videos without having to fool around with a bunch of bookmarks or having a Google account.
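
The combining trick is conceptually simple. Here is a minimal sketch of the idea, assuming the feedparser library and hypothetical channel IDs; it is an illustration, not the actual app:

import time
from email.utils import formatdate
from xml.sax.saxutils import escape

import feedparser
from flask import Flask, Response

app = Flask(__name__)

# Hypothetical channel IDs; YouTube publishes a per-channel Atom feed at
# this URL, which feedparser understands.
CHANNELS = ['UCxxxxxxxxxxxxxxxxxxxxxx', 'UCyyyyyyyyyyyyyyyyyyyyyy']
FEED_URL = 'https://www.youtube.com/feeds/videos.xml?channel_id={}'


@app.route('/youtube.rss')
def youtube():
        entries = []
        for channel in CHANNELS:
                entries.extend(feedparser.parse(FEED_URL.format(channel)).entries)

        # Newest first, across all channels.
        entries.sort(key=lambda e: e.published_parsed, reverse=True)

        items = ''.join(
                '<item><title>{}</title><link>{}</link>'
                '<pubDate>{}</pubDate></item>'.format(
                        escape(e.title),
                        escape(e.link),
                        formatdate(time.mktime(e.published_parsed)))
                for e in entries)

        rss = ('<?xml version="1.0" encoding="UTF-8"?>'
               '<rss version="2.0"><channel>'
               '<title>YouTube subscriptions</title>'
               '<link>http://localhost:5000/youtube.rss</link>'
               '<description>All channels, one feed</description>'
               '{}</channel></rss>'.format(items))
        return Response(rss, mimetype='application/rss+xml')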

Top 10 12 uh, bunch?

I am currently following 26 channels, according to my app, and writing about all of them frankly sounds exhausting. I imagine reading all of that would be pretty exhausting too, so here is a sampling of them in no particular order.

EEVBlog

The OG Youtube electronics channel. The lord and savior of the tear down. That Aussie Bloke. There isn't a whole lot to say about Dave's channel that has not been said elsewhere; if you have any interest in electronics at all and have somehow missed him, I'd highly suggest you go review his channel and website. From circuit design and PC board layout to gear reviews and random rants, there is a bit of everything. I can't heap enough praise on this guy.

AvE

I hope you have a vice handy... you will need it. AvE is hard to pin down. Home of the BoLTR and a follower of our Lord and Savior of the tear down, Chris is... a rare breed. Uncle Bumblefack seems to spend most of his time in his home shop showing the rest of us how much fun it can be to just chuck up some random scrap in your Boxford lathe and make chips. Not nearly as politically correct as This Old Tony, as professional as Abom79, or as beautiful as Clickspring, his channel is a slice of life with a slightly oiled-up bend to the left. Well worth the watch if you are in the market for some tools for your own shop, and you will likely stay for the dose of fooling around and laughs.

bigclivedotcom

I love Big Clive. He's kind of like Dave Jones from the EEVBlog, if Dave had happened to fall in with the carnival instead of designing circuit boards professionally before discovering Youtube. Clive tears things apart with gusto and builds random things out of LEDs and USB leads he got from the £1 shop. Sometimes he plays with high voltage, though not nearly to the level of mikeselectricstuff or tesla500, but lately he's been keeping it safer for those of us who like to follow along at home. Clive is a consummate professional Scot, living on the Isle of Man and doing slightly dodgy things for our viewing pleasure. Did I mention that I love this guy? For extra credit and a bit more of a digital electronics bent, see also Julian Illet.

Scott Manley

Another Scot, this one now living in the Bay Area. Probably best known for his Kerbal Space Program videos, of which he has HOURS. Scott takes his formal science background and uses it to do wonderful things in that game. I found him originally while looking around at the aforementioned KSP videos and then later ran across him again looking for Elite: Dangerous videos. I was hooked and went through a large portion of his back catalog. If you ever wondered how rockets work or why people keep talking about delta V when shooting things into space, you could do worse than dropping by Scott's channel.

Penny Arcade TV

I'll admit, I really only watch for the Acquisitions Incorporated stuff these days since PA: The Series ended (worth a watch if you have not seen it). If you enjoy D&D and somehow have not come across this show, then go. Go now. Seriously. I'll see you in something like 90 hours. Ok, done? Great, go over to WoTC's channel and watch Dice Camera Action.

A Dose of Buckley

The second Canadian on my list, and the only comedy/rant channel. Buckley is sort of my spirit animal. He's mostly known for his 'Worst Songs of...' videos, but he has several different music and society themed rant series. I am particularly fond of Scumbags of the Internet.

Leo Moracchioli/Frog Leap Studios

Leo makes the most metal covers on YouTube. I can't put what he does into words in a way that truly makes you understand how great he is. The production value of both the audio and the video is fantastic. This guy is just absolutely killing it. Do your ears a favor and go spend 10 or 20 hours watching his stuff.

Ok, I'm done. You have your mission should you choose to accept it.

September 06, 2017 @17:20

Introduction

I don't listen to a lot of podcasts these days; in fact, most of the time I listen to either Sirius XM or the music collection that I've curated into iTunes over the years. There are times when I'm in the mood for something different, though, and these are the podcasts that I have actually been listening to this year.

In the order that they show up on my iPhone:

Darkest Night

One of a number of spooky/horror/thriller podcasts that I reach for, this is a bunch of seemingly unrelated stories told through the memories of the dead. Really well written and produced, with an interesting cast and plenty of strange and spooky tales. The episodes are short and consumable, and easy to jump into and catch up on.

King Falls AM

This was the first of the "radio drama" style podcasts that I found. A small-town late-night radio show on a run-down AM station, with a cast of characters, supernatural beings, and implied aliens, it borders on paranormal and slice-of-life comedy. This is one of the few that I'll listen to as episodes come out. 📻

The Christmas episode is hilarious and worth listening to on its own.

Suck Squeeze Bang Blow

A self-described 'Three guys, a garage' show, this is hosted by a bunch of folks that I happen to know. They usually mention cars, but the beers and general stream-of-consciousness arguments tend to veer all over the place. Not quite news, not quite politics, not quite cars, it is a weekly tirade by a bunch of guys in a garage.

Troy Hunt's Weekly Update Podcast

Troy Hunt is an Aussie bloke and the guy behind Have I Been Pwned. He generally writes about security and privacy in our ever more connected world. This is a podcast outlining the things he's been up to during the week. Sometimes he does this from a jetski. 🏄

Welcome To Night Vale

Similar to King Falls AM but a bit more fraught with psychological trauma and black bag government conspiracies. I am slowly working my way through the back catalog, which is DEEP. If you end up liking King Falls AM, give this one a shot.

Downloadable Content

I like Penny Arcade, and backed the Kickstarter for their podcast. Listen in as the two creators of the comic sit in their office and talk about insane things until out the other end comes one of their world-famous .jpegs. Each episode stands on its own, so it is really easy to pick up and put down; ideal for driving in the car, except for when it sucks you in and you find yourself sitting in the parking lot giggling. 👾

Lave Radio: an Elite Dangerous podcast

Live from an orange sidewinder somewhere outside of Lave Station this is a podcast about the open-galaxy space sim Elite: Dangerous. I play the game now and then and this podcast fits well in the background. o7 cmdrs. 🍸

Above & Beyond: Group Therapy

I used to listen to A State of Trance by Armin Van Buuren, but the podcast feed is just not a good replacement for the radio show. This is the next best thing: a lovely feed filled with two-hour episodes of whatever music the boys of Above & Beyond are into that week. This is my go-to podcast for long drives, long walks, and mowing the lawn. It makes cutting nearly 2 acres really enjoyable. 🎧

I'd love to hear any suggestions of other podcasts to add to the stable, you can drop me a note at matt at going dash flying dot com

🍺

Edited: September 14, 2017 @22:03

It is funny. In this day and age of disposable everything, where people are more than happy to shell out money for things that don't actually exist, you might think that we've finally left nostalgia behind. There is no point in wishing for the past if it is all still there on some drive somewhere in the cloud.

I am finding that I have a form of nostalgia for old software. More elegant ways to transmit information, closer to the metal as it were. It's probably the same as a well-worn hand tool that has been replaced by a bulky power tool that does the job way faster but makes a big mess and a lot of noise, needs tremendous care and feeding, and periodically breaks down in spectacular ways, occasionally killing people. (I think I just accidentally took a shot at the web browser there...) 👾

But I digress. I started writing this because somewhere in my wandering I happened upon some more of archive.org's amazing work and, at the risk of falling off into another rambling tangent, I have to admit that a part of me envies those folks. Working to preserve the whirlwind of ephemera that is the Internet so that hopefully those that come after us will be able to see all the hideous mistakes we made on our free GeoCities pages back in the 1990s, play all our old text adventure games, and witness the unbridled hubris as we created what we thought would be an anarchic, academic utopia.

If you are still reading this and want to learn a little bit about some of the events that shaped the form and function of the network we take for granted today, or you just want to marvel at the ability to still use a dead file format brought back to life by an emulator written in a language that deserves to die...

The Hacker Crackdown by Bruce Sterling

☎

Edit

archive.org on an iPad

After fighting with the JavaScript emulation on archive.org, I decided to put together a package that essentially mimics what is running in the browser, but as a 'native' app. So if you would rather read the stack on your computer or want a starting point for a working Mac SE emulator, go grab sterling.tar.gz.

August 25, 2017 @17:30

Why new WiFi?

Back in May I closed on a house, leaving my old apartment of 10 years behind. The house was built in 1856 and, as you might expect, is built like a tank. This is lovely for many reasons, but it poses a bit of an impediment to having good WiFi.

As a bit of background, one of the things that I did during the nearly ten years that I worked for the local phone company's ISP arm was help build and deploy various WiFi installations. These ranged from single room, single access point coffee shops to small cities. We evaluated a number of vendors to standardize around when developing these solutions, looking at RF performance, number of concurrent clients, authentication and management infrastructure, robustness, and client roaming. Now, this was when 802.11g was brand new, so things have changed, but the lessons were well learned.

The Search

For the last 6 years or so I used a very nice access point from Ruckus Wireless. They have one of the nicest radio and antenna combinations on the market which let me cover my entire 1100 sqft apartment with one access point (and a fair bit of the parking lot... 😊) but they are a bit spendy and I couldn't justify buying 3 or 4 of them.

I also use MikroTik RouterBoard access points and routers for some smaller deployments but honestly I'm not a huge fan of their CAPsMAN WiFi management software and I don't know why but they don't seem to believe that standard PoE (802.3af) is a thing worth supporting.

Also on the list of brands that my supplier ISPSupplies stocks happened to be Ubiquiti. I had initially ruled them out because they also suffered from the lack of 802.3af, but I happened to see that they had just released some new access points, so I dug up a data sheet to see if they had finally seen the light and ditched passive PoE. Turns out they had, supporting 802.3af as well as the new 802.3at PoE+ standard. I was interested.

Features

  • Supports 3x3 MIMO 802.11ac up to 1.7Gbps
  • Dual radio (concurrent 5.8GHz and 2.4GHz operation)
  • Supports standard 802.3af/at PoE
  • Mountable indoor access points
  • Wall & pole mountable outdoor access points
  • Centralized management and monitoring (that can be run entirely on-prem)
  • Mesh connectivity options
  • Reasonably priced

Unifi Marketing Image

Why no how?

This isn't a tutorial on how to implement WiFi. There are many of those available online, and Troy Hunt made a rather nice one for Ubiquiti that is pretty close to what I ended up doing. He does a good job of going through the process, so feel free to go check that out if you want to know how to deploy this stuff. This is meant to be more of an explanation of my experience with the product. Once I decided to go with the UniFi system, I ordered the bits from my friendly supplier.

Bits I bought

  • 2x UAP-AC-PRO access points
  • 1x UAP-AC-MESH indoor/outdoor access point

Setup

I have a pretty complex network already so I didn't get the security gateway or any of their switches. The Cisco 3750 PoE switch that I have works just fine, and I very much like my OpenBSD router. I also don't trust the cloud very much so I chose to deploy the Linux version of the UniFi controller software. All in all it took me about 20 minutes to create a puppet manifest and deploy the software on a new VM. Taking ownership of the access points was a breeze and within 30 minutes I had the latest firmware on them and was ready to provision the network.

UniFi device list

Configuration of my SSIDs, VLANs, and RADIUS profiles (I use WPA2-Enterprise for my internal SSID and have a WPA2-PSK guest network on a separate VLAN) was simple and intuitive. I'd say that I had a working WiFi network within an hour and a half, including opening the boxes and putting the access points roughly where I wanted them.

UniFi Map

Results

This was a couple of months ago, and after living with the system for a while I can honestly say I'm extremely happy 😄. Installation, configuration, and firmware updates have been easy. All of the clients I have had on the network (Windows 10 laptop, macOS laptops, iPhone 7, Samsung Galaxy S6, BlackBerry Passport, BlackBerry Classic, iPad Mini 2, Kindle Fire, and Kindle PaperWhite) work great and, most importantly, roam between access points seamlessly. The previous Ruckus Wireless WiFi network performed really well in the last location, so unlike Troy I don't have glowing things to say about a huge performance boost...

Garage access point

but I can successfully cover about 1.75 acres with just 3 access points with no slowdowns or dropouts.

Garage AP statistics

UniFi Mobile App

View from the client location

View from test above

Conclusion

So, tl;dr: consumer grade router / access point combos are heaping piles of 💩 garbage. Don't use them; use something that was designed to be an access point. These Ubiquiti jobbies are pretty good, and I'd buy them again.

πŸ‘ πŸ’― 🍺

August 17, 2017 @13:40

There has been a lot of buzz about how quickly the web is moving towards HTTPS everywhere. For quite a while the EFF has had extensions for the popular browsers to enforce HTTPS Everywhere, and security bloggers like Troy Hunt have written a bunch about impending browser changes that are going to make life a lot harder for people with websites that do not support HTTPS.

I've been running HTTPS on ssl.ub3rgeek.net for a while now, since that site serves several applications (OwnCloud, tt-rss and wallabag for example) and I have good reason to want that to be secure, but I figured this was a good time to pull the trigger and put SSL on going-flying.com.

SSL Labs Test Result

The reality is that while I'm unlikely to get the 'insecure' warnings from the browser updates, SNI is thankfully pretty well supported these days, so pulling that trigger was pretty damn easy. 👍

In my case I buy DV certificates from my registrar (a rad French company called Gandi). Before people start screaming about LetsEncrypt: I may switch to those at some point, but frankly I don't really feel like they are "there yet". I use certificates for a lot of things that you don't see, including signing Apple MobileConfig bundles for use in deployment to my iOS devices. These certificates are still not trusted everywhere by default, and integrating the LetsEncrypt ecosystem into all those automated backend tools is... well, it's work I'm not getting paid for. 😂

🍺

April 14, 2017 @16:08

I have been going through my ~/TODO list recently, and I had been meaning to figure out why my Sonos indexing has been failing lately. I sync my iTunes Library from my Time Machine backups into a shared space on my NAS so other things can get to it without having to have my Mac on.

I tried to re-add the UNC path and it would consistently return error 900.

Google wasn't helpful at all on what error 900 actually meant.

So I cranked up debugging on Samba and this came up:

No protocol supported !

I had recently disabled SMB1 on my NAS but didn't realize that change coincided with my indexing failures.

So tl;dr, it looks like Sonos uses SMB1 to connect to your NAS, so make sure that you leave it enabled.

Dear Sonos... please use a newer version of SMB... SMB1 is terrible.

🍺 🔉

April 11, 2017 @20:08

I just wanted to quickly mention a change I ran into today while upgrading my OpenBSD routers to 6.1.

As a quick background, I use OpenIKED to terminate VPN connections from OpenBSD routers, iOS devices, macOS devices, and MikroTik RouterOS devices. The OpenBSD and RouterOS systems are site-to-site links with ipip(4) interfaces running on top of the IKEv2 tunnels. Routing is handled by the ospfd(8) and ospf6d(8) daemons provided by OpenBSD.

The tunnel to my RouterOS device stopped working today with a rather strange message:

Apr 11 11:49:12 bdr01 iked[60779]: ikev2_ike_auth_recv: unexpected auth method RSA_SIG, was expecting SIG

Searching around in the debug output of iked(8) there was some indication that the daemon could only use RFC 7427 signatures:

Apr 11 10:01:23 bdr01 iked[64964]: set_policy: could not find pubkey for /etc/iked/pubkeys/fqdn/bdr01.work.ub3rgeek.net

I checked RouterOS and it only has an rsa signature option for IKEv2 certificate-based authentication.

The fix?

Get the public key for the connection and put it where iked(8) expects it.

openssl rsa -in <private key> -pubout > <public key file>

This allowed the tunnel to come right up without any changes on the MikroTik end.

March 10, 2017 @20:00

Over the years I have had many different BlackBerry phones. I started with a 7100t, one of the first candybar-style BlackBerry devices, and just finished up a several-year relationship with a Passport.

I loved every minute of it.

I still think that RIM/BlackBerry had the best device for communication out there, but as they sunset the BlackBerry 10 operating system, there is no longer any reason to continue.

Yes, BlackBerry now makes Android software and TCL makes BlackBerry branded hardware but if you are going to switch away from a platform, you might as well evaluate all the options.

I chose an iPhone.

There are lots of reasons, and none of them are perfect, but at the end of the day it works for me, and that's what is important.

The tl;dr of it all is that I trust Apple more than I trust Google.

They are both huge multi-national corporations that don't really care about anything but driving shareholder value... but Google basically only makes money by selling out its users.

My Collection

  • BlackBerry 7100t
  • BlackBerry 8100 (Pearl)
  • BlackBerry Bold 9000
  • BlackBerry Bold 9700
  • BlackBerry Playbook
  • BlackBerry Bold 9900
  • BlackBerry Bold 9930 (Work)
  • BlackBerry Q10
  • BlackBerry Passport
  • BlackBerry Classic (Work)

I will miss you, you crazy Canadians.

My BlackBerry Collection

September 19, 2016 @16:00

I have actually been building the static content of the site from a python(1) script for a while, though until recently it ran from cron(8) and rebuilt all the pages every hour. This wasn't too bad since there were a few internal pages that also got rebuilt, including my graphing pages that are built from SNMP queries of various network gear.

So a little bit about the page generation. The script uses the Cheetah Template engine to assemble the files for each static page. There is some logic in each template to ensure the proper elements are included based on which page is being created.

ScreenShot of code.html

For example, code.html is made up of 4 files.

  1. header.html.tmpl - This is not visible; it is everything up to the closing head tag.
  2. nav.html.tmpl - This is the nav element, including the other page buttons. It is actually included on the index.html page too, but it hides itself since it knows it is not needed there.
  3. code.html.tmpl - The content of the page.
  4. footer.html.tmpl - The footer element and the closing body and html tags.

This lets me build a wide variety of content out of the same style. There are configuration provisions in build.py that allow me to add additional JavaScript and CSS links in header.html.tmpl if I need to. This is used by the network information page to include additional style and the JavaScript that allows for dynamic hiding of the lists.

        elif page == "network.html.tmpl":
            extras["custom_css"] = [
                '/css/lists-ok.css',
                '/css/network.css'
            ]
            extras["custom_js"] = [
                '/js/jquery.js',
                '/js/network.js'
            ]
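
To make that concrete, here is a rough sketch of how those extras might be handed to the Cheetah templates when a page is assembled. This is an assumption for illustration, not the actual build.py; the render_page function and file handling are simplified.

# Sketch only: assemble a page from the four template parts described
# above, passing the extras dict into Cheetah's searchList so that
# $custom_css and $custom_js are visible to the template logic.
from Cheetah.Template import Template

def render_page(page, extras):
        parts = [
                'header.html.tmpl',
                'nav.html.tmpl',
                page,
                'footer.html.tmpl'
        ]
        html = ''
        for tmpl in parts:
                html += str(Template(file=tmpl, searchList=[extras]))

        return html

# e.g. write out network.html using the extras shown above:
# open('network.html', 'w').write(render_page('network.html.tmpl', extras))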

The whole build process is fired off by the following post-receive hook in git.

#!/bin/sh
# going-flying.com post-receive hook
# (c) 2016 Matthew J. Ernisse <matt@going-flying.com>
# All Rights Reserved.
#
# Update the on-disk representation of my website when I push a new
# revision up to the git repository.

set -e

BUILD_DIR="/var/www/going-flying.com"
GIT_DIR=$(git rev-parse --git-dir 2>/dev/null)
REV=0

if [ -z "$GIT_DIR" ]; then
    echo >&2 "fatal: post-receive GIT_DIR not set"
    exit 1
fi

echo "updating $BUILD_DIR"
GIT_WORK_TREE=$BUILD_DIR git checkout -f

echo "building html from templates"
$BUILD_DIR/build.py

while read oldrev newrev refname; do
    REV="$newrev"
done

echo "optimizing JPGs."
find "$BUILD_DIR" -name \*.jpg -print0 | xargs -0 jpegoptim -qpst

echo "optimizing PNGs."
find "$BUILD_DIR" -name \*.png -print0 | xargs -0 pngcrush -reduce \
    -rem alla -q -dir "$BUILD_DIR"

echo "setting rev to $REV"
sed -e "s/GIT_REV/${REV}/" "$BUILD_DIR/index.html" > "$BUILD_DIR/index.html.new"
mv $BUILD_DIR/index.html.new $BUILD_DIR/index.html

echo "site deployed."

The result is that a git push looks like this:

Counting objects: 11, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (11/11), done.
Writing objects: 100% (11/11), 195.70 KiB | 0 bytes/s, done.
Total 11 (delta 2), reused 0 (delta 0)
remote: updating /var/www/going-flying.com
remote: building html from templates
remote: optimizing JPGs.
remote: optimizing PNGs.
remote: setting rev to 3ac149f570d379bf71ed78a7734042af2200591a
remote: site deployed.
To git@repo.ub3rgeek.net:going-flying.com.git
   197843c..3ac149f  master -> master

It works pretty well: it allows me to serve static files with a long Expires: header, and in the end the pages load reasonably fast.

First test using GTMetrix from San Jose

Result of GTMetrix Page test

Even if I test from Australia using PingDom...

Result of PingDom Page test

Next time we will talk about the gallery generator. In the mean time... 🍺

September 06, 2016 @16:00

I am hoping this will be the first of three or four posts detailing some of the technical bits under the covers of the new website. In this particular post I'll talk mostly about the design decisions that went into the whole infrastructure.

All of this works for me, and is based on my use-case. It is entirely possible that your application may be different and some of the decisions I made won't work for you, but at least you can hopefully understand the reasons behind it all.

So first, the givens

  • I want to host this myself, with as few external dependencies as possible.
  • My site is fairly small.
  • I have some images I'd like to have a home for.
  • Sometimes I write things, but not a lot.
  • I have some code splashed around that is fun to share.

What I chose to do

  • The entire site is made up of static pages served off disk.
  • All the HTML and CSS is hand-written.
  • The gallery generator is also pre-processed and served directly from disk.
  • This entire blog is rendered from the same templates as the base site, with markdown fragments containing the article content.
  • I trigger all of this off a post-receive hook in my git repository.

Why

This allowed me to hand-create a single HTML template that gets applied virtually everywhere (the gallery has bespoke templates for it). I was able to craft a responsive design with almost zero JavaScript (only the mobile interface for the gallery uses JavaScript (jQuery)), which makes me happy. The site looks reasonable across desktops, phones, and tablets. It doesn't expose any data to any third-party sites. It is fast to load and render. It takes almost no server resources to serve.

Most of the pieces (which I will go into in detail in the next few posts) have been around for a while, but it is how I'm putting them together that makes it so much easier to maintain. I collapsed a lot of the templates down to a single base template class and only customize the bits needed to make each page. I also went from triggering this all out of cron(8) on the hour to building it when a change is pushed to the server. This not only saves server resources by not rebuilding things when nothing has changed, but also makes it so that errors are noticed immediately (in the git push output instead of in a cron mail that I may ignore).

Hopefully this makes sense. Next time I'll start talking about the oldest part of the site -- the template builder.
