JTK – Notes and Ramblings Things we have found …

12/26/2025

Yet another Pi Weather Radio update

Filed under: General — taing @ 6:01 pm

After an update to Raspberry Pi OS Trixie, and a great deal of hassle with sox refusing to handle mp3 format despite libsox-fmt-mp3 being installed, the Weather Radio scripts have been updated again.

The start.sh script is unaltered from before:

cd weatherRadio
rm -f one two three; mkfifo one two three   # named pipes feeding the three helpers below

./to-same.sh <one &    # EAS decode via multimon-ng/dsame
./to-lame.sh <two &    # mp3 encode and stream via ezstream
./to-udp.sh <three &   # raw audio to localhost UDP for local playback
rtl_fm -f 162550000 -s 22050 -p 14 | tee one two three | multimon-ng -t raw -a EAS /dev/stdin

The to-same.sh script is also unaltered:

multimon-ng -t raw -a EAS /dev/stdin | python3 ~/tg-dsame/dsame.py --mqtt your-mqtt-broker --json output.json --call /home/pi/weatherRadio/playRadio.sh --command boo

The to-lame.sh script was updated to once again use lame instead of sox for the mp3 conversion. While ezstream can handle other formats, mp3 works well from lame.

lame --bitwidth 16 --signed -s 22050 --lowpass 3500 --abr 64 --scale 8 -r -m m - - | ezstream -c ezstream.conf

The to-udp.sh script was greatly simplified, and sox was removed from it completely:

socat -u - udp4-sendto:localhost:5555

The playRadio.sh script was updated to handle playing a raw stream instead of mp3:

#!/bin/bash
# play the raw UDP stream from to-udp.sh for two minutes, then stop
nc -lu 5555 | play -t raw -r 22050 -b 16 -e signed -c 1 -v 7 - &
sleep 120
killall "play"
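
If nc is not available, the listening half of playRadio.sh can be approximated in a few lines of Python. This is a sketch only; the port and raw sample format are the ones used by the scripts above, and its output is meant to be piped into play with the same -t raw arguments:

import socket
import sys

# Listen on the same UDP port to-udp.sh sends to and pass the raw
# 22050 Hz / 16-bit signed / mono samples straight through to stdout.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5555))

while True:
    data, _addr = sock.recvfrom(4096)   # raw PCM chunks from socat
    sys.stdout.buffer.write(data)       # hand the samples straight to the player
    sys.stdout.buffer.flush()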

For completeness, the ezstream.conf file:

<?xml version="1.0" encoding="UTF-8"?>
<ezstream>

  <servers>
    <server>
      <protocol>http</protocol>
      <hostname>somemachine</hostname>
      <password>hackmeplease</password>
    </server>
  </servers>

  <streams>
    <stream>
      <mountpoint>/weather</mountpoint>
      <format>MP3</format>
      <stream_name>The Weather Radio</stream_name>
      <stream_url>yoururl</stream_url>
      <stream_genre>weather</stream_genre>
      <stream_description>NOAA Radio KHB59 (162.55MHz)</stream_description>
      <stream_quality>2.0</stream_quality>
      <stream_bitrate>32</stream_bitrate>
      <stream_samplerate>48000</stream_samplerate>
      <stream_channels>1</stream_channels>
    </stream>
  </streams>

  <intakes>
    <intake>
      <type>stdin</type>
      <stream_once>yes</stream_once>
    </intake>
  </intakes>
</ezstream>

Weatherflow Tempest MQTT

Filed under: General — taing @ 6:05 am

The Weatherflow Tempest unit has an API based on local broadcast UDP packets. There is also a websocket and a REST API.

This article will focus on using the UDP API with MQTT for use in Openhab. There are a number of resources online: a binding that was started in 2020, though the discussion stops in 2022, and a different binding from Bill Welliver that seems more complete.

Additionally, there are resources for Home Assistant. https://github.com/briis/hass-weatherflow2mqtt has been deprecated and the repository set to read-only. It has been replaced with https://github.com/briis/weatherflow_forecast. The newer integration uses a combination of the WeatherFlow APIs for a more complete setup and also works with Home Assistant discovery. There is also https://github.com/gualandd/WeatherFlow-Tempest-UDP using Node-RED.

All of the options mentioned above seem a bit too heavy for a simple sensor-to-MQTT path. Here is a bit of Python 3 code to listen on the UDP port and send to an MQTT broker.

import socket
import paho.mqtt.publish as mqtt
import json

# Define the UDP port to listen on
UDP_IP = "0.0.0.0"  # Listen on all available interfaces
UDP_PORT = 50222
mqtt_host = "your.mqtt.broker"

# Create a UDP socket
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Bind the socket to the specified address and port
sock.bind((UDP_IP, UDP_PORT))

print(f"Listening on {UDP_IP}:{UDP_PORT}...")

# Listen for incoming packets and print them
while True:
    data, addr = sock.recvfrom(1024)  # Buffer size is 1024 bytes
    js = json.loads(data)
#    print(json.dumps(js, indent=4))
    match js['type']:
        case "obs_st":
            mqtt.single('Tempest/obs',json.dumps(js['obs']), hostname=mqtt_host)
            print(f"OBS: {js['obs']}")
        case "rapid_wind":
            mqtt.single('Tempest/wind',json.dumps(js['ob']), hostname=mqtt_host)
            print(f"Wind: {js['ob']}")

This code could easily be expanded to handle more than the obs_st and rapid_wind messages; for now it’s enough. Be sure to update mqtt_host to point to your broker.
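
For example, lightning and rain-start events could be forwarded the same way by adding two more cases inside the match block. This is only a sketch; per the WeatherFlow UDP API these messages carry their data in the evt field, and the Tempest/strike and Tempest/rain topic names here are arbitrary:

        case "evt_strike":
            # lightning strike event: evt = [epoch, distance km, energy]
            mqtt.single('Tempest/strike', json.dumps(js['evt']), hostname=mqtt_host)
            print(f"Strike: {js['evt']}")
        case "evt_precip":
            # rain start event: evt = [epoch]
            mqtt.single('Tempest/rain', json.dumps(js['evt']), hostname=mqtt_host)
            print(f"Rain start: {js['evt']}")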

For sending the MQTT messages, the paho-mqtt library is used. The publish.single function can take additional parameters beyond those seen in the script (see the sketch after this list):

  • topic: (Required, string) The topic to publish the message to.
  • payload: (Required, string, bytes, or None) The actual message content.
  • qos: (Optional, int, default 0) The Quality of Service level (0, 1, or 2).
  • retain: (Optional, bool, default False) If set to True, the broker will retain the message as the last known good value for the topic.
  • hostname: (Optional, string, default 'localhost') The IP address or domain name of the MQTT broker.
  • port: (Optional, int, default 1883) The network port of the broker.
  • client_id: (Optional, string) A unique identifier for the client.
  • auth: (Optional, dict) A dictionary containing username and password for authentication (e.g., {'username': "myuser", 'password': "mypass"}).
  • tls: (Optional, dict or None) A dictionary for configuring TLS/SSL secure connections. 
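
For instance, if the broker requires a login, the publish call in the listener above could be wrapped up like this. A sketch only; the credentials and the qos/retain choices are placeholders, not anything the script above requires:

import json
import paho.mqtt.publish as mqtt

MQTT_HOST = "your.mqtt.broker"
# hypothetical credentials -- only needed if your broker requires a login
MQTT_AUTH = {'username': 'myuser', 'password': 'mypass'}

def publish_obs(obs):
    """Publish one obs_st reading (the js['obs'] array from the listener above)."""
    mqtt.single('Tempest/obs',
                json.dumps(obs),
                qos=1,              # broker must acknowledge delivery
                retain=True,        # keep the last reading for new subscribers
                hostname=MQTT_HOST,
                port=1883,
                auth=MQTT_AUTH)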

A great intro to using MQTT in Python shows installing paho-mqtt using pip3. For a Raspbian Buster system, apt install python3-paho-mqtt is simpler. Alternatively, a virtual environment could be created.

In Openhab, a Generic MQTT Thing pointing to your broker is ideal. For most of the sensor data, your channels will have the MQTT State Topic set to Tempest/obs. An Incoming Value Transformation using JSONPath will parse the individual readings out of the JSON array: JSONPATH:$[0][7], for example, will extract Air Temperature. Refer to the WeatherFlow UDP API for the array index of each value. The channels and linked items can be created using the appropriate units for proper conversions.
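
The same extraction is easy to verify outside of Openhab. Here is a short sketch, assuming the listener above is publishing to Tempest/obs on your broker, that subscribes for one message and indexes into the nested array exactly as JSONPATH:$[0][7] does:

import json
import paho.mqtt.subscribe as subscribe

MQTT_HOST = "your.mqtt.broker"   # same broker the listener publishes to

# Block until one message arrives on the obs topic
msg = subscribe.simple('Tempest/obs', hostname=MQTT_HOST)

obs = json.loads(msg.payload)    # obs_st payload is an array of arrays
air_temp_c = obs[0][7]           # index 7 of the first record is air temperature, in C
print(f"Air temperature: {air_temp_c} C")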

For Wind Direction, a Scale Profile can be created on the Item. This allows wind direction to be displayed as a compass point in addition to degrees, for example E (82 °). The degrees unit is set on the Channel. Create direction.scale in /etc/openhab/transform/ with this content (a quick Python sanity check follows the file):

[0..11]=N
[12..33]=NNE
[34..56]=NE
[57..79]=ENE
[80..102]=E
[103..125]=ESE
[126..147]=SE
[148..170]=SSE
[171..191]=S
[192..214]=SSW
[215..237]=SW
[238..260]=WSW
[261..283]=W
[284..306]=WNW
[307..329]=NW
[330..352]=NNW
[353..360]=N
NaN=Non-numeric state
format=%label% (%value%)
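
The same bucketing is easy to sanity-check in Python. A quick sketch using the standard 22.5° sectors (a few boundary degrees land one label off from the hand-rounded table above, but the idea is the same):

# Map wind direction in degrees to the same 16-point compass labels used above.
LABELS = ["N", "NNE", "NE", "ENE", "E", "ESE", "SE", "SSE",
          "S", "SSW", "SW", "WSW", "W", "WNW", "NW", "NNW"]

def compass(degrees: float) -> str:
    # 16 sectors of 22.5 degrees each, centered on their labels (N covers 348.75-11.25)
    return LABELS[int((degrees + 11.25) % 360 // 22.5)]

print(compass(82))   # -> E, matching the "E (82 °)" example above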

12/24/2025

TrueNAS and Dropbox, again

Filed under: General — taing @ 1:00 am

After updating TrueNAS, let’s try running the Dropbox client in a Docker container rather than installing it directly. Hopefully the container will survive upgrades better.

The janeczku/dropbox image is one of the most downloaded Docker images for the Dropbox client. Unfortunately it hasn’t seen an update in nine years; it uses the 64-bit version 11.4.21 of dropboxd from 2016.

You must set the environment variable DBOX_SKIP_UPDATE to prevent it from loading the newer version of Dropbox; the current version will not install in the container due to glibc version issues.

There are several things to set up in the TrueNAS app GUI to get things going. As mentioned in the container overview, you will want to set the two environment variables for user id and group id: DBOX_UID and DBOX_GID. You also need to configure storage; the simplest setup is a Host Path mapped to each of /dbox/Dropbox and /dbox/.dropbox.

Once it starts, the log (View Logs) will report the URL to follow to authorize the link, typically of the form https://www.dropbox.com/cli_link_nonce?nonce=5b59dd0928bc08b75a736935bd7d37cd. Paste the link into a browser and confirm the login.

There are a few places to get some status info. From the TrueNAS shell, docker commands will help you see a bit. docker ps -a will list the current containers. docker logs -f containername will tail the container logs. docker top containername will list the current processes for the container.

From the container shell, the dropbox command can display additional information. dropbox status will give a brief status. dropbox filestatus will show the current sync status of the files in the current directory. Additional help on commands can be found using dropbox help or online.

Note: the container shell is accessed from the TrueNAS Apps GUI; select the container and look for Workloads and Containers. There should be three icons for each container: Shell, Volume Mounts, and View Logs. In TrueNAS 25.04.2.6 there are bugs that prevent the View Logs function from working well in many cases.

Even after all of this, the janeczku/dropbox container didn’t seem to actually be syncing anything. It is unclear what all of the issues were, but the extreme age of the daemon is surely part of the problem.

There are several newer containers available. One is tiagovdaa/dropbox-docker, which is clearly descended from janeczku/dropbox; future testing will show whether it is more suitable. It is built on version 223.4.4909, while the current version from dropbox.com as of December 2025 is 238.4.6075. At the very least, 238.4.6075 does auto-install correctly inside the tiagovdaa/dropbox-docker container. It should be noted that the language for the container is Portuguese. There appears to be only one volume mount point for this container, dbox, which implies that .dropbox and Dropbox are both children of that mount point.

So for now, it’s back to the previous method of installing Dropbox headless and using the Gist to run it as a service, as discussed in an earlier post.

A few notes from that earlier post: the first command is su dropbox, and this is critical. The .dropbox-dist/dropboxd command will create the Dropbox folder in the home directory of the current user. You will also get the prompt to copy/paste the displayed URL to authenticate to Dropbox, as discussed above. When running as a service, the /etc/db/dropbox-cli commands will fail UNLESS you are running as the user you defined in dropbox-start.target. The Python script checks for the Dropbox pid file in ~/.dropbox.
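
That check can be reproduced in a few lines if you want to script around it. A sketch only, assuming the pid file is named ~/.dropbox/dropbox.pid:

import os

# Assumed location of the pid file the daemon writes under ~/.dropbox
PIDFILE = os.path.expanduser("~/.dropbox/dropbox.pid")

def dropbox_running() -> bool:
    try:
        with open(PIDFILE) as f:
            pid = int(f.read().strip())
    except (FileNotFoundError, ValueError):
        return False
    try:
        os.kill(pid, 0)    # signal 0 only tests whether the pid exists
        return True
    except OSError:
        return False

print("dropboxd running" if dropbox_running() else "dropboxd not running")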

For services, use systemctl status servicename or journalctl -u servicename for more info. Be aware that for the dropbox-start service created above, the dropbox-start.service process completes and exits, leaving behind a child process running dropbox.
