After attending the Infoshare conference this year, I had the feeling that I had already heard all the buzzwords presented there somewhere before. Big Data analysis can in fact be done with well-known statistical functions that I learned 15 years ago during my studies. Machine Learning – all of its algorithms were known back then as well; even my master's thesis was based on one of the clustering algorithms, the nearest mean, which is widely used in image recognition. I used it to recognize Polish voice commands for steering an armored vehicle on a virtual battlefield. The sound was first filtered by a set of low-pass filters, moments of silence were cut out, statements were divided into phonemes, a distinctive set of features was extracted from the chosen phonemes, and finally a classifier based on the clustering algorithm decided which class a given statement belonged to.
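The nearest-mean (nearest-centroid) idea from the thesis can be sketched in a few lines. This is a minimal illustration with made-up two-dimensional points; in the speech pipeline the vectors would be the features extracted from phonemes:

```python
import numpy as np

def fit_centroids(features, labels):
    """Compute the mean feature vector (centroid) of each class."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy example: two well-separated clusters standing in for phoneme features.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.9]])
y = np.array([0, 0, 1, 1])
cents = fit_centroids(X, y)
print(classify(np.array([0.95, 0.95]), cents))  # prints 1
```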
Here are some Polish works that describe the topic more precisely:
Nowadays everything is much simpler using TensorFlow:
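As a rough sketch of what "simpler" means: the kind of classifier that once required hand-written filtering and clustering code can be expressed in a few lines with TensorFlow/Keras. The layer sizes and toy data below are illustrative placeholders, not a tuned model:

```python
import numpy as np
import tensorflow as tf

# Toy training data: two clusters, standing in for extracted feature vectors.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.9]], dtype="float32")
y = np.array([0, 0, 1, 1])

# A tiny dense network; illustrative sizes only.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=50, verbose=0)

# Probabilities over the two classes for a new point.
probs = model.predict(np.array([[0.95, 0.95]], dtype="float32"), verbose=0)
print(probs.argmax())
```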
The question is: would it be possible to use these algorithms as the basis of NOTAM data analysis…
Here are some examples of this approach:
1. Capstone project for Propulsion with SWISS international airlines. Machine learning classifier for Notices to Airmen (NOTAM).
2. NOTAM Classifier:
3. Using a model-driven, knowledge-based approach to cope with complexity in filtering of notices to airmen
4. Accurate NOTAM parsing with visualization:
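To make the question above concrete, here is a hedged sketch of applying the same nearest-mean idea to NOTAM-like free text. The messages and category labels are made-up toy examples, not real NOTAM data, and a serious classifier would need far richer features:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestCentroid
from sklearn.pipeline import make_pipeline

# Made-up NOTAM-style messages with illustrative category labels.
texts = [
    "RWY 09/27 CLSD DUE MAINT",
    "TWY B CLSD",
    "VOR XYZ U/S",
    "ILS RWY 27 U/S",
]
labels = ["closure", "closure", "navaid", "navaid"]

# Bag-of-words features plus a nearest-centroid (nearest-mean) classifier.
clf = make_pipeline(TfidfVectorizer(), NearestCentroid())
clf.fit(texts, labels)
print(clf.predict(["RWY 18 CLSD"]))
```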