(2017-10-10, 17:49)dagnazza Wrote: Since the project has collected terabytes of data, would it be possible to "teach" a few algorithms to classify stroke properties, such as negative/positive polarity or cloud-to-cloud discharges of any kind, without getting into the physics of the process? I believe the patterns are all there. Does anybody know whether this approach has been tested? --andrei
The potential is there, but some of those parameters are very difficult to compute and involve very complex math, and all of it would have to be done on the server. Yes, such an approach has been, and is being, considered. One of the first things that has to happen is improving the quality of signals across the network, and that rests with the operators and with server processing. For example, a station sending too many invalid signals can overwhelm the patterns the server expects from it, which depend on the system, antenna types, and so on. A lot of stations do not have their station page configured with the correct antenna types, yet the server expects that information. Some server changes have been made that are unknown to us operators, and many of the 'test' algorithms may in fact already be in use on LightningMaps.org in the 'experimental' detections.
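To make the idea in andrei's question concrete, here is a minimal sketch, not anything the project actually runs on the server, of how a supervised classifier could label archived sferics from a few hand-picked waveform features. The feature set, the labels (+CG, -CG, IC), the sample rate, and the placeholder data are all my own assumptions for illustration.

```python
# Toy illustration only: extract a few features per digitized sferic and
# train a classifier on a hand-labelled set. Not the project's server code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def waveform_features(samples, sample_rate_hz):
    """Hypothetical feature vector for one digitized sferic."""
    peak_idx = np.argmax(np.abs(samples))
    peak = samples[peak_idx]
    polarity = np.sign(peak)                     # sign of the largest excursion
    rise = peak_idx / sample_rate_hz             # crude time-to-peak
    width = np.count_nonzero(np.abs(samples) > 0.5 * np.abs(peak)) / sample_rate_hz
    energy = np.sum(samples.astype(float) ** 2)  # total waveform energy
    return [polarity, rise, width, energy]

# Placeholder training data: in practice these would be archived waveforms
# with labels (e.g. +CG, -CG, IC) taken from some reference source.
rng = np.random.default_rng(0)
waveforms = rng.standard_normal((200, 512))      # 200 fake sferics, 512 samples
labels = rng.choice(["+CG", "-CG", "IC"], 200)   # fake labels

X = np.array([waveform_features(w, 500_000) for w in waveforms])
clf = RandomForestClassifier(n_estimators=100).fit(X, labels)
print(clf.predict(X[:5]))
```

The classifier itself is the easy part; the hard part, as described above, is getting clean, correctly configured signals and reliable labels out of the network in the first place.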
For example, the third channel on BLUE was added as a potential channel for the H components of horizontally polarized sferics. How much actual evaluation is currently done in that mode I do not know; a location such as mine, for example, is a very difficult place to install a horizontally polarized loop because of the noise. Just today I am experimenting with an antenna idea for my BLUE system.
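As a rough illustration only, and not the BLUE firmware or server logic, the sketch below shows what a third H-field channel could add: the two crossed vertical loops give a bearing for a vertically polarized sferic, while a horizontal loop would see the vertical H component that a horizontally polarized sferic produces. The channel names, the 0.3 ratio threshold, and the fake sample data are assumptions.

```python
# Toy sketch of bearing + polarization flag from three H-field channels.
import numpy as np

def bearing_and_polarization(h_ns, h_ew, h_vert, threshold=0.3):
    """h_ns, h_ew: crossed vertical loops; h_vert: hypothetical horizontal loop.
    Returns (bearing_deg, looks_horizontally_polarized)."""
    horiz = np.hypot(h_ns, h_ew)                 # horizontal H magnitude per sample
    i = np.argmax(horiz)                         # sample at the peak
    # Note: loops alone leave a 180-degree bearing ambiguity; the E-field
    # channel is what normally resolves it.
    bearing = np.degrees(np.arctan2(h_ew[i], h_ns[i])) % 360.0
    ratio = np.max(np.abs(h_vert)) / max(np.max(horiz), 1e-12)
    return bearing, ratio > threshold

# Fake 512-sample records just to show the call:
rng = np.random.default_rng(1)
ns, ew, vert = rng.standard_normal((3, 512))
print(bearing_and_polarization(ns, ew, vert))
```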
The data required is also sent by RED systems, but not by GREEN.
So you are correct: most of the stroke data that is needed is, or can be, sent by most RED and BLUE systems, and/or can be handled by controller firmware and server algorithms, provided the stations are set up properly.
The developers like to 'experiment' with such things during the northern-hemisphere slow season.
Cheers!
Mike