Posts Tagged: deep neural networks


Transforming aerial video data into knowledge

DataFromSky is a unique solution for motion analysis of aerial video data. The ability to detect, track, and classify moving objects in aerial videos from drones, fixed cameras, or other video sources opens new possibilities in the field of traffic monitoring. The technology behind it is under intensive development and grows more efficient every day thanks to its built-in capability to learn from its mistakes. We are happy to announce that we have made a significant step towards enhanced accuracy through the full integration of deep neural networks into the DataFromSky framework. Our deep neural networks are being trained on millions of samples 24 hours a day, and the quality is already quite impressive!

Our trajectory-interpretation tool, DataFromSky Viewer, also comes with new functionality. The new version extracts parameters that are otherwise difficult to obtain, such as gap-time, time to follow, heatmaps, and vehicle dimensions. We recognize the potential of the data we are able to obtain, and therefore develop our tools together with traffic researchers and traffic engineers worldwide. The video below demonstrates the new functionality of DataFromSky Viewer. Take a look!
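To give a flavour of what such a parameter looks like in practice, here is a minimal sketch of a gap-time computation from vehicle trajectories. This is purely illustrative: the `TrackPoint` structure, the reference-line interpolation, and the function names are our own assumptions for this example, not the DataFromSky Viewer API. Gap-time is taken here as the time between successive vehicles crossing the same reference line in a lane.

```python
# Illustrative sketch only -- data structures and names are assumptions,
# not the DataFromSky API.
from dataclasses import dataclass


@dataclass
class TrackPoint:
    t: float  # timestamp in seconds
    x: float  # position along the lane in metres


def crossing_time(track, line_x):
    """Linearly interpolate the time at which a trajectory crosses
    the reference line at position line_x (None if it never does)."""
    for a, b in zip(track, track[1:]):
        if a.x <= line_x < b.x:
            frac = (line_x - a.x) / (b.x - a.x)
            return a.t + frac * (b.t - a.t)
    return None


def gap_times(tracks, line_x):
    """Gap-times: differences between successive crossing times of the
    reference line, over all tracks that actually cross it."""
    times = sorted(
        t for t in (crossing_time(tr, line_x) for tr in tracks)
        if t is not None
    )
    return [b - a for a, b in zip(times, times[1:])]


# Example: two vehicles crossing the line at x = 10 m, 3 seconds apart.
lead = [TrackPoint(0.0, 0.0), TrackPoint(2.0, 20.0)]
follower = [TrackPoint(3.0, 0.0), TrackPoint(5.0, 20.0)]
print(gap_times([lead, follower], 10.0))  # -> [3.0]
```

In a real deployment the trajectories would come from the tracker rather than being constructed by hand, and the reference line would be a 2D segment instead of a single coordinate, but the interpolation idea is the same.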


DataFromSky at Seminar in Liberec – Deep Neural Networks

A workshop titled "Modern methods of image recognition and processing" was recently organized by the Technical University of Liberec. Its aim was to offer insight into image-processing applications from industry, as opposed to a research setting.

Adam Babinec presenting DataFromSky at seminar in Liberec

We presented the principles of detecting and classifying vehicles in aerial video data using Deep Neural Networks. As you can imagine, Deep Neural Networks are an integral part of the DataFromSky platform and are used in many subtasks of the DataFromSky solution. If you want to know more about the magic behind it, you can download the presentation (PDF, 2 MB) and/or contact us.
