What is that I’m seeing?
Elad Denenberg and Dan Feldman - Department of Computer Science
Ilan Shimshoni - Department of Information Systems
Post Doc Grant 2020
In recent years many applications have been suggested for quadcopters. Quadcopters are flying platforms with four rotors that can hover, film, and fly around. Our research aims to develop an autonomous quadcopter that will fly indoors and map a floor or a given area. Such technology can be useful for construction purposes, as well as search-and-rescue applications. For autonomy, our platform would have to film its surroundings and deduce what it is “seeing”: whether it is in front of a wall, window, mirror, or door. Then we would like our quadcopter to reason about its surroundings and navigate: exit rooms, explore corridors, and advance intelligently.
In Prof. Dan Feldman’s Robotics and Big Data (RBD) Laboratory, a model was developed to identify rooms and exits. The quadcopter turns in a circle, mapping everything in its vicinity. The model is then used to identify the room structure and the way to the exit. With the Data Science Research Center’s support and the added supervision of Prof. Ilan Shimshoni, Elad Denenberg has developed a newer and simpler model. This model identifies certain characteristics in an image and decides whether it shows an exit, a corridor, or a wall. The model requires a set of parameters to be tuned and learned in order to classify scenes successfully. The parameters were learned using a state-of-the-art optimization technique known as surrogate-based optimization.
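The idea behind surrogate-based optimization is to avoid evaluating an expensive objective (here, classification performance as a function of the model's parameters) at every candidate point; instead, a cheap surrogate model is fitted to a handful of evaluations and minimized in place of the true objective. The following is a minimal one-dimensional sketch of that loop, with a hypothetical stand-in objective (the actual objective and optimizer used in the research are not specified here):

```python
import numpy as np

def expensive_objective(x):
    # Hypothetical stand-in for a costly evaluation, e.g. the
    # scene-classification error for a given parameter value x.
    return (x - 1.3) ** 2 + 0.5

def surrogate_optimize(f, lo=-3.0, hi=3.0, n_init=5, n_iter=10):
    """Sketch of surrogate-based optimization:
    fit a cheap quadratic surrogate to the evaluations gathered so far,
    minimize the surrogate analytically, evaluate f at that point,
    add the result to the sample set, and repeat."""
    xs = list(np.linspace(lo, hi, n_init))   # initial design points
    ys = [f(x) for x in xs]
    for _ in range(n_iter):
        a, b, _c = np.polyfit(xs, ys, 2)     # quadratic surrogate fit
        if a > 0:
            # Analytic minimizer of the surrogate, clipped to the bounds.
            x_next = float(np.clip(-b / (2 * a), lo, hi))
        else:
            # Surrogate is concave; fall back to the midpoint of the range.
            x_next = 0.5 * (lo + hi)
        xs.append(x_next)
        ys.append(f(x_next))                 # one true evaluation per round
    return xs[int(np.argmin(ys))]            # best point seen so far

best = surrogate_optimize(expensive_objective)
```

In practice the surrogate is usually a richer model (e.g. a Gaussian process or radial basis functions) over many parameters at once, but the evaluate-fit-minimize cycle shown above is the core of the technique.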
The next phase of the research will add another layer of autonomy: enabling the quadcopter to select a path and to optimize the positions and angles at which the camera is placed for the most efficient mapping of the desired area.