I have been running a FlightAware / FlightRadar24 ADS-B feeder for almost 4 years now. At its core it is an older Raspberry Pi B with an RTL-SDR stick running dump1090. These days it is mounted in my garage with the antenna on the roof. When I built it I stuffed a Honeywell HIH6130 temperature and humidity sensor in the enclosure. At the time the whole thing was mounted on a fence in my back yard, in full sun for much of the day, so I hooked the sensor up to Icinga to alert me if it ever got too hot or too wet inside.
Lately I've been investigating ways to get more information about my infrastructure as a whole into a central location. I have a lot of one-off, largely custom-built systems to collect and aggregate system status data. While this has worked for the last 12 years, it is most certainly starting to show its age. At the moment I'm working with a stack that includes collectd, InfluxDB, and Grafana. The latter two run as Docker containers, while the former is deployed by Puppet to all my physical and virtual hosts.
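For completeness, the glue between the two halves of that stack is collectd's network plugin on each host, pointed at InfluxDB's built-in collectd listener. A sketch of both ends; the hostname, port, and database name here are my assumptions, not anything specific to this setup:

```
# collectd.conf on each host (25826 is the collectd default port)
LoadPlugin network
<Plugin network>
	Server "influxdb.example.com" "25826"
</Plugin>

# influxdb.conf on the InfluxDB side (1.x style configuration)
[[collectd]]
  enabled = true
  bind-address = ":25826"
  database = "collectd"
  typesdb = "/usr/share/collectd/types.db"
```

InfluxDB needs the types.db file so it can decode collectd's binary protocol into measurements.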
I wanted to pull together some additional monitoring information from the ADS-B feeder to see just how far I can go with this setup. Luckily the dump1090 web interface works by reading JSON files written by the receiver daemon, so all the interesting statistics are available on disk to read.
I was able to pull together a quick Python script that loads the JSON and emits the statistics to collectd (which forwards them on to InfluxDB for Grafana to work with). I need to get the script into git somewhere, but for now here is the currently running copy.
#!/usr/bin/env python3
''' collectd-dump1090-fa.py (c) 2018 Matthew Ernisse <matt@going-flying.com>
All Rights Reserved.
Collect statistics from dump1090-fa and send to collectd. Uses the collectd
Exec plugin.
Redistribution and use in source and binary forms,
with or without modification, are permitted provided
that the following conditions are met:
* Redistributions of source code must retain the
above copyright notice, this list of conditions
and the following disclaimer.
* Redistributions in binary form must reproduce
the above copyright notice, this list of conditions
and the following disclaimer in the documentation
and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
'''
import json
import os
import socket
import time


def print_aircraft(stats):
    ''' Parse and emit information from the aircraft.json file. '''
    aircraft = len(stats.get('aircraft', []))
    messages = stats.get('messages')
    if messages is None:
        raise ValueError('JSON stats undefined')

    m = "PUTVAL \"{}/dump1090/counter-messages\" interval={} N:{}".format(
        hostname,
        interval,
        messages
    )
    print(m)

    m = "PUTVAL \"{}/dump1090/gauge-aircraft\" interval={} N:{}".format(
        hostname,
        interval,
        aircraft
    )
    print(m)


def print_stats(stats):
    ''' Parse and emit information from the stats.json file. '''
    counters = [
        'samples_processed',
        'samples_dropped',
    ]
    gauges = [
        'signal',
        'noise',
    ]

    values = stats.get('local')
    if not values or not isinstance(values, dict):
        raise ValueError('JSON stats undefined')

    for k in counters:
        value = values.get(k)
        # 'U' tells collectd the value is undefined; check against None
        # so a legitimate zero is not clobbered.
        if value is None:
            value = 'U'

        m = "PUTVAL \"{}/dump1090/counter-{}\" interval={} N:{}".format(
            hostname,
            k,
            interval,
            value
        )
        print(m)

    for k in gauges:
        value = values.get(k)
        if value is None:
            value = 'U'

        m = "PUTVAL \"{}/dump1090/gauge-{}\" interval={} N:{}".format(
            hostname,
            k,
            interval,
            value
        )
        print(m)


if __name__ == '__main__':
    interval = float(os.environ.get('COLLECTD_INTERVAL', 10))
    hostname = os.environ.get('COLLECTD_HOSTNAME', socket.getfqdn())

    while True:
        with open('/var/run/dump1090-fa/stats.json') as fd:
            stats = json.load(fd)

        stats = stats.get('total')
        print_stats(stats)

        with open('/var/run/dump1090-fa/aircraft.json') as fd:
            stats = json.load(fd)

        print_aircraft(stats)
        time.sleep(interval)
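Since the Exec plugin just reads PUTVAL lines from the script's stdout, it is easy to sanity-check the output by hand before handing it over to collectd. A rough check I might use; the regex is my own approximation of the plain-text protocol, not anything shipped with collectd:

```python
import re

# Loosely matches: PUTVAL "<host>/<plugin>/<type>" interval=<n> N:<value>
# 'U' is collectd's marker for an undefined value.
PUTVAL_RE = re.compile(
    r'^PUTVAL "(?P<host>[^/"]+)/(?P<plugin>[^/"]+)/(?P<type>[^"]+)" '
    r'interval=(?P<interval>[0-9.]+) N:(?P<value>U|-?[0-9.]+)$'
)

line = 'PUTVAL "feeder.example.com/dump1090/gauge-aircraft" interval=10.0 N:42'
m = PUTVAL_RE.match(line)
assert m is not None
print(m.group('plugin'), m.group('value'))
```

Piping the script's output through a check like this catches quoting or formatting mistakes before they show up as mysteriously missing metrics.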
I also wanted to pull in the temperature / humidity sensor readings. That ended up being a similarly easy task since I had already written a script for Icinga to use; a quick modification to emit the values in the format collectd wants and that data was flowing in too. I created an i2c user in the i2c group so the script can use the I2C interface on the Raspberry Pi.
The script currently looks like this.
#!/usr/bin/env python3
import os
import socket
import time

import smbus


def read_sensor():
    ''' Protocol guide:
    https://phanderson.com/arduino/I2CCommunications.pdf
    '''
    bus = smbus.SMBus(0)
    devid = 0x27

    # Writing the device id to the bus triggers a measurement request.
    bus.write_quick(devid)

    # Wait for the measurement; section 3.0 says it is usually
    # 36.65 ms but round up to 50 to be sure.
    time.sleep(0.05)

    # The response is 4 bytes.
    data = bus.read_i2c_block_data(devid, 0, 4)

    # Bits 8,7 of the first byte received are the status bits.
    # 00 - normal
    # 01 - stale data
    # 10 - device in command mode
    # 11 - diagnostic mode [ ignore all data ]
    health = (data[0] & 0xC0) >> 6

    # section 4.0
    humidity = (((data[0] & 0x3F) << 8) + data[1]) * 100.0 / 16383.0

    # section 5.0
    tempC = ((data[2] << 6) + ((data[3] & 0xFC) >> 2)) * 165.0 / 16383.0 - 40.0

    return (tempC, humidity)


if __name__ == '__main__':
    interval = float(os.environ.get('COLLECTD_INTERVAL', 10))
    hostname = os.environ.get('COLLECTD_HOSTNAME', socket.getfqdn())

    while True:
        retval = read_sensor()
        print("PUTVAL \"{}/hih6130/gauge-temperature\" interval={} N:{:.2f}".format(
            hostname,
            interval,
            retval[0]
        ))
        print("PUTVAL \"{}/hih6130/gauge-humidity\" interval={} N:{:.2f}".format(
            hostname,
            interval,
            retval[1]
        ))
        time.sleep(interval)
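The bit shifting in read_sensor() is easier to follow with a concrete frame in hand. Decoding a made-up 4-byte response by hand (the bytes here are illustrative, not captured from my sensor):

```python
# Hypothetical HIH6130 response: status 00 (normal data),
# humidity count 0x1FFF (about half scale), temperature count 0x2000.
data = [0x1F, 0xFF, 0x80, 0x00]

# Top two bits of byte 0 are the status field.
status = (data[0] & 0xC0) >> 6

# The remaining 14 bits of bytes 0-1 are the humidity count.
hum_count = ((data[0] & 0x3F) << 8) + data[1]

# Bytes 2-3 hold the 14-bit temperature count, left-justified,
# so the bottom two bits of byte 3 are discarded.
temp_count = (data[2] << 6) + ((data[3] & 0xFC) >> 2)

humidity = hum_count * 100.0 / 16383.0       # percent RH, full scale 2^14 - 1
tempC = temp_count * 165.0 / 16383.0 - 40.0  # degrees Celsius, -40 to 125 range

print(status, round(humidity, 1), round(tempC, 1))  # → 0 50.0 42.5
```

Half of the 14-bit humidity scale lands on 50% RH, which is a handy mental check that the masks and shifts are right.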
The collectd plugin configuration is pretty easy. The dump1090 files are readable by nogroup, so you can execute that script as nobody. As I said, I made an i2c user that is a member of the i2c group so the Python SMBus module can communicate with the sensor.
LoadPlugin exec
<Plugin exec>
Exec "nobody:nogroup" "/usr/bin/collectd-dump1090-fa.py"
Exec "i2c:i2c" "/usr/bin/collectd-hih6130.py"
</Plugin>
Once the statistics were flowing into InfluxDB, it was just a matter of putting together a dashboard in Grafana.
The JSON from Grafana for the dashboard is here, though it may require some tweaking to work for you.
So far I'm pretty happy with the way this all went together. I still have a bunch of network equipment that I'd like to bring over and a stack of ancient MRTG graphs to replace. Hopefully it will be a similarly simple experience.
🍻