ArchiveSW - Display & Data Archive Storage

Version 1.7.15 released.
21 December 2018.

This is a stable release and has been tested and run for over a week with no restarts or errors. Runs under Linux 4.14.79-v7+ #1159 SMP Sun Nov 4 17:50:20 GMT 2018. Use the 2018-11-13-raspbian-stretch.img.

This package is written and tested to run on a Raspberry Pi 3 but may run on older Pi units and the Pi Zero. The application will listen on port 50222 for UDP data from the Smart Weather Station Hub. The data will be stored in a MariaDB (MySQL) database.
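
For reference, the observation tables follow this general shape (an illustrative sketch only, showing just the columns referenced later in this thread; the real tables hold the full set of observation fields):

CREATE TABLE AirObservation (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  hub_sn VARCHAR(16) NOT NULL,  -- Hub serial number, e.g. 'HB-00005194'
  datestamp DATETIME NOT NULL   -- observation time
  -- temperature, humidity, pressure and the other fields omitted here
);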

Installation

The installation takes less than 30 minutes on new hardware with an empty SD Card. The instructions are located at: https://fsoft.com/archivesw/

Wi-Fi Access Point

You may now turn your RPi into a Wi-Fi access point. This will allow you to connect the Hub to the RPi Wi-Fi. You may also power the Hub from the RPi and power the RPi from a battery backup source. This will keep your Smart Weather Station locally online when your main power fails. It also makes your station 100% portable. This is great for pilots, sailors, hang glider pilots and motor sports enthusiasts.

Main Applications

  1. Archive (archive.js) - The main application that captures and stores the data in SQL. It also writes activity to log files and writes events to files.
  2. Alert (alert.js) - This will send alerts via Pushover or Prowl.
  3. Server (server.js) - Edit the configuration file from a web page. Restart any of the applications as needed.
  4. Piio (piio.js) - Accept inputs for three closed contact switches and has five LED outputs for instant status.
  5. Panel (panel.js) - Allows you to use any device with a web browser to view your current data. This is initially designed to work with the RPi 7" touchscreen. However, it is HTML and may be adapted for any size screen.

Alerts

Hub reboot, Device reboot, Device offline / online, Battery voltage low, Battery replacement now.
Sensor failure after x consecutive occurrences.

ATTENTION:

WeatherFlow has many more products than the Smart Weather Station, and this application deals only with this one product. Therefore, I am renaming this application to ArchiveSW.

Updates will not affect current installs, but future installs will be placed in the folder "archivesw" or the folder of your choice.

Questions and comments are welcome.

If you have any issues you may post here or email at support@fsoft.com.

12 Likes

If you are interested in living off the grid but still having great weather station data . . . or . . . if you are interested in developing your own weather station graphics and custom reporting . . . or even if you just have unreliable Internet service, then WFArchiver is a great tool for you!

I have been using the beta versions and the v1.0.0 version of WFArchiver for some time now, and it has been great to have all of the excellent, detailed weather observation data produced by the WeatherFlow Smart Weather Station stored locally. Having it all in an SQL database is perfect for an old SQL guy like me. And there are so many good tools out there to let you work easily with your data.

To have all of what WFArchiver can do with WeatherFlow data on a Raspberry Pi 3B+ is amazing. The Raspberry Pi 3B+ costs about $35 USD, and I recommend getting the nice case, heatsink, and power supply with it, so your hardware cost is less than $60 USD plus the cost of the micro SD memory card of your choice (8GB minimum). I think 16GB will easily hold 5 to 8 years' worth of full-detail data.

So . . . try out WFArchiver. And I say thank you to WeatherFlow, @GaryFunk and this user community.

6 Likes

I added a new table to record Sensor Status on a daily basis.

[Screenshot: the new SensorStatus table]

This is the HubEvents table that tracks events that happen to the Hub. As you can see, the Hub rebooted and the UDP packets before and after the reboot are saved.

[Screenshot: the HubEvents table showing a Hub reboot]

2 Likes

@GaryFunk Don't know if you're paying attention to the UDI forum, but I'd like to see if wfarchiver can validate an issue someone is seeing with the node server. I'm tracking the time since a data packet was received from the hub, and if it's been 2X longer than the time I expect, I flag that as possibly missing data.

Have you created any queries to check for missing data? And if so, can you share?

1 Like

Bob (@bpaauwe),

Interesting idea. That's easy with Hub Status. I'll see what I can come up with. It should be rather simple.

Maybe a trigger that creates a HubDataGap (a new table) record if the latest HubStatus record's datestamp is more than 10 seconds later than the previous HubStatus record's datestamp for the same Hub serial_number? Just a thought . . .
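
Something along these lines might do it (an untested sketch; it assumes datestamp is a DATETIME and that the new HubDataGap table has serial_number, gap_start and gap_end columns):

DELIMITER //
CREATE TRIGGER HubStatusGap AFTER INSERT ON HubStatus
FOR EACH ROW
BEGIN
  DECLARE prev DATETIME;
  -- find the previous HubStatus record for this Hub
  SELECT MAX(datestamp) INTO prev FROM HubStatus
   WHERE serial_number = NEW.serial_number
     AND datestamp < NEW.datestamp;
  -- record a gap if the new record arrived more than 10 seconds late
  IF prev IS NOT NULL AND TIMESTAMPDIFF(SECOND, prev, NEW.datestamp) > 10 THEN
    INSERT INTO HubDataGap (serial_number, gap_start, gap_end)
    VALUES (NEW.serial_number, prev, NEW.datestamp);
  END IF;
END//
DELIMITER ;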

1 Like

I was thinking of something that would look at historical data for gaps. The hub status data has a sequence number; is there some way to check for gaps in that? For other data, look for large gaps between the timestamp values.

My SQL fu isn't all that good.

1 Like

@bpaauwe, @dan.gealt,

An easy way is a query such as this:

SELECT COUNT(id) FROM AirObservation WHERE hub_sn = 'HB-00005194' AND datestamp BETWEEN '2018-07-23-21:20:00' AND '2018-07-24-18:00:00'

It should return 60 for each hourly period queried.
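
And for the sequence-number idea, something along these lines should locate the gaps (a sketch; it assumes HubStatus keeps the seq field from the UDP packet in a column named seq, and keep in mind the sequence likely restarts when the Hub reboots):

SELECT a.seq + 1 AS first_missing
FROM HubStatus a
WHERE a.serial_number = 'HB-00005194'
  AND a.seq < (SELECT MAX(seq) FROM HubStatus WHERE serial_number = a.serial_number)
  AND NOT EXISTS (
    SELECT 1 FROM HubStatus b
    WHERE b.serial_number = a.serial_number
      AND b.seq = a.seq + 1
  );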

1 Like

I just ran this:
SELECT COUNT(id) FROM AirObservation WHERE hub_sn = 'HB-00003466' AND datestamp BETWEEN '2018-05-03-00:00:00' AND '2018-05-03-18:00:00'

and got 1078. So I dropped two packets.

That seems pretty easy. Thanks! Change it to go from midnight to midnight, and if the count is < 1440, data was lost. And you're only adding the hub_sn condition because you have two, right?

1 Like

Yes. With only one Hub it doesn't need to be in there.

I can write a script and add a table for you to test with. Save you some work.
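
In the meantime, the daily check would look something like this (assuming datestamp is a DATETIME; with the 'YYYY-MM-DD-HH:MM:SS' string format, LEFT(datestamp, 10) can stand in for DATE(datestamp)):

-- list any day that is short of the expected 1440 one-minute readings
-- (the current, partial day will show short as well)
SELECT DATE(datestamp) AS day, COUNT(id) AS readings
FROM AirObservation
WHERE hub_sn = 'HB-00003466'
GROUP BY DATE(datestamp)
HAVING COUNT(id) < 1440;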

Hi Gary,

I was just testing UDP transmission with no Internet connection and found one instance where a UDP packet appears to have been transmitted twice . . . which may cause a problem for your COUNT(id) query. Here is the AirObservation data:

[Screenshot: AirObservation rows showing a duplicated packet]

1 Like

Well, isn't that special!

I'm not sure how to handle that and prevent it from happening. If I create a unique index, that will stop any duplicates, but it will also prevent the bad ('AR-00000000') packets from being saved.

I shall have to ponder this issue.

The COUNT(id) query is easy to fix by using DISTINCT.
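
For example (assuming a duplicated packet carries the same datestamp as the original):

SELECT COUNT(DISTINCT datestamp) FROM AirObservation WHERE hub_sn = 'HB-00005194' AND datestamp BETWEEN '2018-07-23-21:20:00' AND '2018-07-24-18:00:00'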

Not sure if this is a regular occurrence or not. Didn't mean to throw a wrench in the works, but I just stumbled on the duplicate packet while testing something else.

1 Like

It's something that needs investigating. Did the Hub send it twice? Did the router create the duplicate packet? Did the Pi create the duplicate packet? Was it created by the DB or NodeJS or by my script?

1 Like

Regarding duplicate packets… I'm seeing them occasionally too, at least in my OpenHAB UDP parse rules. I had thought it might be a bug in my code or possibly a bug in the way OpenHAB is handling the data. It seemed to fix itself when I restarted OpenHAB, so I had attributed it to OpenHAB. Maybe there actually are duplicate packets being sent or generated somewhere along the line?

By the way, I did just install WFArchiver yesterday. Very cool program! Thanks Gary! I installed it just a few minutes before a firmware update was pushed to my hub so the archiver caught it.

1 Like

I'm happy it worked well for you. I have a few more ideas I'm working on. It's all about the data.

2 Likes

@todd.lorey, @dan.gealt,

I'm going to create an index on AirObservation and SkyObservation to prevent duplicate entries. If a duplicate happens, I will save the packet to a new table called Zdupe. Maybe this will help track the issue and point to where the duplicate packet is originating.
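
Roughly like this (a sketch; I am assuming an observation is uniquely identified by its hub_sn and datestamp):

CREATE UNIQUE INDEX uq_air_obs ON AirObservation (hub_sn, datestamp);
CREATE UNIQUE INDEX uq_sky_obs ON SkyObservation (hub_sn, datestamp);

CREATE TABLE Zdupe (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  datestamp DATETIME NOT NULL,  -- when the duplicate arrived
  packet TEXT NOT NULL          -- the raw UDP packet as received
);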

2 Likes

Excellent idea on the index and an even better idea on the Zdupe table! Making the DB nice and neat is good practice, but building diagnostic tools into it is the mark of a "test & measurement" kind of person.

Will you just keep an insert_datestamp, source_table and duplicated_record_id in the Zdupe table so you don't have to account for different tables (down the road) that you might treat the same way?
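
Something like this, so one table covers any source table down the road (just a sketch of the idea):

CREATE TABLE Zdupe (
  insert_datestamp DATETIME NOT NULL,         -- when the duplicate was caught
  source_table VARCHAR(32) NOT NULL,          -- e.g. 'AirObservation'
  duplicated_record_id INT UNSIGNED NOT NULL  -- id of the row it duplicates
);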

1 Like

I am experimenting.

I am starting with a fresh database and doing a lot of testing.

2 Likes