I would like to raise a topic where I can't quite weigh the pros and cons: the choice between using local (station) pressures and pressures reduced to sea level.
What I know/understand/intuit is:
Local pressure is what the probe actually measures. So I use it in all computations (since the computation must be done 1/ at the altitude where the pressure is measured and 2/ with the temperature, humidity, etc. at that same location). I use it in data exchange too (always sending local pressures, and hoping that pressures coming from other systems are local pressures as well).
Sea-level pressure is a computation (an approximation, I should rather say). It's mainly used for display, because it's easier to compare against standard pressure (which is a sea-level value, 1013.25 hPa).
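To make it concrete, here is a minimal Python sketch of one common reduction, the barometric formula with the standard-atmosphere lapse rate (it ignores humidity, which is exactly why different systems can disagree). The function name and parameters are my own illustration, not anyone's actual API:

```python
def sea_level_pressure(p_station_hpa: float, altitude_m: float, temp_c: float) -> float:
    """Reduce station pressure to sea level with the common barometric
    formula. This is one approximation among several; it assumes the
    standard-atmosphere lapse rate and ignores humidity.

    p_station_hpa: pressure measured at the sensor, in hPa
    altitude_m:    sensor altitude above mean sea level, in metres
    temp_c:        air temperature at the sensor, in degrees Celsius
    """
    # 0.0065 K/m is the standard lapse rate; 5.257 ~= g*M / (R*L)
    lapse = 0.0065
    return p_station_hpa * (
        1 - (lapse * altitude_m) / (temp_c + lapse * altitude_m + 273.15)
    ) ** -5.257

# Example: 950.0 hPa measured at 520 m and 15 degC reduces to roughly 1010 hPa.
print(round(sea_level_pressure(950.0, 520.0, 15.0), 1))
```

Swap in a different formula (e.g. one that includes humidity) and you get a slightly different sea-level value from the same measurement, which is part of my point below.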
But why the hell do so many systems handle this differently?
I've seen smart apps displaying station pressure, software exporting only sea-level pressure, and so on.
Do you know if there's a "standard" for this? And how do you handle it yourselves?
Thanks in advance for your answers. My goal is not to say whether a particular piece of software is better or worse; I only want to compare your different approaches and the assumptions you have to make.