I think it would be a lot better if the time setting was used for the time after the last detected strike, not the last reported strike. If the time is set to, for example, 15 minutes and you are in the middle of a big storm, there is no point in sending a notification again after 15 minutes. If you start those 15 minutes after the last detected strike, there will be no notifications unless the new strike is closer than the last notified one.
Also, because you now fill in strikes from other sources, I get the strong feeling that distances have a pretty high resolution. Let’s assume it is 1 km. That means that for a single storm you can get up to 40 alarms (one for each kilometer). And with your current definition of frequency, you could possibly get all of that again 30 minutes later. That is a flood of notifications nobody is waiting for.
So yes, your first two bullet points are nice, but the third one could be a lot better, as it is basically the same as the current behaviour, only with a different timeout.
I propose to define a few distance ranges, like 0-5 km, 5-10 km, 10-20 km, and 20-40+ km. You send a notification if the new strike falls in a closer range than the previously reported strike, or if the given time (frequency) has passed since the last detected strike (not the last notified strike).
That way you get at most 4 notifications per storm.
Wouldn’t that be a lot better??
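To make the idea concrete, here is a rough Python sketch of what I mean. The `StrikeNotifier` class, the `bucket()` helper, and the exact range edges are just illustrative names I made up, not anything from the actual code:

```python
import time

RANGE_EDGES_KM = [5, 10, 20]  # four ranges: 0-5, 5-10, 10-20, 20-40+ km


def bucket(distance_km: float) -> int:
    """Map a strike distance to its range index (0 = closest range)."""
    for i, edge in enumerate(RANGE_EDGES_KM):
        if distance_km < edge:
            return i
    return len(RANGE_EDGES_KM)  # beyond the last edge: the 20-40+ km range


class StrikeNotifier:
    def __init__(self, frequency_s: float = 15 * 60):
        self.frequency_s = frequency_s
        self.last_detected_ts = None     # time of the last *detected* strike
        self.last_notified_bucket = None  # range of the last *notified* strike

    def on_strike(self, distance_km: float, now: float | None = None) -> bool:
        """Return True when this strike should trigger a notification."""
        now = now if now is not None else time.monotonic()
        new_bucket = bucket(distance_km)

        # The quiet period is measured from the last detected strike, so a
        # storm that keeps producing strikes never re-triggers on time alone.
        quiet = (self.last_detected_ts is None
                 or now - self.last_detected_ts >= self.frequency_s)
        closer = (self.last_notified_bucket is None
                  or new_bucket < self.last_notified_bucket)

        self.last_detected_ts = now
        if quiet or closer:
            self.last_notified_bucket = new_bucket
            return True
        return False
```

Because the quiet period is anchored to detected strikes, a continuous storm can only escalate through the four ranges, and the timer only rearms once the storm has actually gone silent.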
Admittedly, reducing the range (bullet point 2) and the frequency limits the number of notifications somewhat, but only at the expense of losing otherwise useful input. If you set the frequency to 3 hours and the range to 10 km, you might still get 10 notifications, but you won’t get a warning if a new storm approaches within the next 3 hours, nor will you get long-distance warnings, all just because you don’t want to be flooded with notifications (and 10 notifications is still a lot).