Should timestamps be interpreted as the beginning, midpoint, or end of the reporting interval?
For some background, I have a Tempest mounted ~10 ft above my roof line in a suburban neighborhood, and I’m pairing my weather measurements with a few barometric pressure sensors from Gulf Coast Data Concepts to analyze the internal pressure in my house relative to atmospheric pressure under a variety of weather and opening configurations (garage door open vs. closed, one window, two windows, etc.). I’m logging the internal pressure data at 5 Hz.

I’m using IFTTT to log Tempest data at 1-minute intervals, then using a regression model to adjust station pressure to an expected internal pressure, which accounts for altitude differences, minor calibration errors, etc. When I go to calculate a gage pressure, I’m not sure how I should be interpreting the Tempest timestamp when comparing against the higher-resolution internal measurements. Further, if I were to use WU or Datascope to grab the weather data archive rather than IFTTT, the data interval is even larger, and it’s still not clear whether the timestamp represents the midpoint, first point, or last point of the interval.
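For what it’s worth, here’s a rough sketch of how I’m lining up the two datasets while I wait on an answer. The file names and column names are just placeholders for my own exports, not anything official from WeatherFlow or GCDC. I average the 5 Hz internal data over the 1-minute window implied by each of the three timestamp interpretations and compare the resulting gage pressures, which at least shows how sensitive the result is to the assumption:

```python
import pandas as pd

# Placeholder file/column names for my own CSV exports (5 Hz internal
# pressure from the GCDC loggers, 1-minute station pressure via IFTTT).
internal = pd.read_csv("gcdc_pressure.csv", parse_dates=["timestamp"])
tempest = pd.read_csv("tempest_ifttt_log.csv", parse_dates=["timestamp"])

def internal_mean_for_interval(stamp, interpretation, width=pd.Timedelta(minutes=1)):
    """Mean internal pressure over the 1-minute window implied by one Tempest timestamp."""
    if interpretation == "start":      # timestamp marks the beginning of the interval
        lo, hi = stamp, stamp + width
    elif interpretation == "end":      # timestamp marks the end of the interval
        lo, hi = stamp - width, stamp
    else:                              # "mid": timestamp marks the interval midpoint
        lo, hi = stamp - width / 2, stamp + width / 2
    window = internal[(internal["timestamp"] >= lo) & (internal["timestamp"] < hi)]
    return window["pressure_pa"].mean()

# Compute a gage pressure under each interpretation, using the
# regression-adjusted station pressure column from my own model.
for interp in ("start", "mid", "end"):
    tempest[f"internal_mean_{interp}"] = tempest["timestamp"].apply(
        lambda t: internal_mean_for_interval(t, interp)
    )
    tempest[f"gage_{interp}"] = (
        tempest[f"internal_mean_{interp}"] - tempest["station_pressure_adj_pa"]
    )

# During a fast pressure transient (e.g. a storm front), the spread between
# these three columns shows how much the timestamp assumption matters.
print(tempest[["timestamp", "gage_start", "gage_mid", "gage_end"]].head())
```

In calm conditions the three columns barely differ, but during rapid pressure changes the choice of interpretation can shift the computed gage pressure noticeably, which is why I’d like to know the right answer rather than guess.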
Happy to have any thoughts or suggestions on this approach. I’ve tested it on a couple of small storms so far and it seems to be producing reasonable data. I realize it’s not perfect, but I’m interested in demonstrating a framework for using a relatively inexpensive sensor setup to do some worthwhile science.