Capstone Project

Group 2017-8
Status Completed
Title Gesture-Based Drone Control
Supervisor Rastko Selmic
Description This project aims to develop a drone (quadrotor) that can be partially controlled with human hand gestures. Present drone control joysticks are not intuitive, and controlling a drone requires training and practice. The impact of hand-based gesture control is an improved user experience through a simplified control interface. In this project, students are required to research and develop gesture-based drone control, starting with simple gesture recognition and corresponding control actions, followed by more advanced learning mechanisms. The project can be staged from simpler to more advanced subtasks as follows (see the sketch after this description):
- Develop basic gesture detection for a fist and a stop sign. A recognized fist should cause the drone to hover; a recognized stop sign should cause it to land.
- Develop more advanced gesture detection that includes a finger pointing in a certain direction or inviting the drone to follow the user. The drone should follow the indicated direction in such cases.
- Develop gesture-based control that changes the pitch, roll, and yaw angles during the hover phase based on the user's hand position.
- Research and develop learning mechanisms where gesture detection is adjusted based on feedback from the user.
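As a rough illustration of the staged mapping above, the Python sketch below shows one way the gesture-to-command logic might be organized. The Gesture, HandState, and Drone types, the tilt and yaw limits, and the dispatch function are hypothetical placeholders, not part of the project specification; a real build would substitute the hand-pose detector and quadrotor SDK chosen in consultation with the supervisor.

"""
Minimal sketch of a gesture-to-command mapping for the staged subtasks.
All names here (Gesture, HandState, Drone, dispatch) are illustrative
placeholders, not a prescribed design.
"""
from dataclasses import dataclass
from enum import Enum, auto


class Gesture(Enum):
    FIST = auto()        # stage 1: hover
    STOP_SIGN = auto()   # stage 1: land
    POINT = auto()       # stage 2: fly in the pointed direction
    NONE = auto()        # no discrete gesture recognized


@dataclass
class HandState:
    gesture: Gesture
    # Normalized hand displacement from the camera-frame centre, in [-1, 1];
    # used in stage 3 to bias attitude setpoints during hover.
    x: float = 0.0
    y: float = 0.0
    point_heading_deg: float = 0.0  # direction indicated by a pointing finger


class Drone:
    """Placeholder flight interface; replace with the chosen quadrotor SDK."""
    def hover(self): print("hover")
    def land(self): print("land")
    def fly_heading(self, heading_deg: float):
        print(f"fly heading {heading_deg:.0f} deg")
    def set_attitude(self, roll_deg: float, pitch_deg: float, yaw_deg: float):
        print(f"attitude roll={roll_deg:.1f} pitch={pitch_deg:.1f} yaw={yaw_deg:.1f}")


MAX_TILT_DEG = 10.0  # assumed safety limit on commanded roll/pitch


def dispatch(drone: Drone, hand: HandState) -> None:
    """Map one detected hand state to a drone command (stages 1-3)."""
    if hand.gesture is Gesture.FIST:
        drone.hover()
    elif hand.gesture is Gesture.STOP_SIGN:
        drone.land()
    elif hand.gesture is Gesture.POINT:
        drone.fly_heading(hand.point_heading_deg)
    else:
        # Stage 3: during hover, hand displacement is scaled into small
        # roll/pitch corrections; yaw would be driven by a further hand
        # feature (e.g. wrist rotation), left at zero in this sketch.
        drone.set_attitude(
            roll_deg=MAX_TILT_DEG * hand.x,
            pitch_deg=MAX_TILT_DEG * hand.y,
            yaw_deg=0.0,
        )


if __name__ == "__main__":
    drone = Drone()
    dispatch(drone, HandState(Gesture.FIST))                          # -> hover
    dispatch(drone, HandState(Gesture.POINT, point_heading_deg=90))   # -> fly heading 90
    dispatch(drone, HandState(Gesture.NONE, x=0.3, y=-0.2))           # -> small attitude bias
    dispatch(drone, HandState(Gesture.STOP_SIGN))                     # -> land

Keeping detection (HandState), decision (dispatch), and actuation (Drone) separate lets the later stages, such as direction following, attitude biasing, and learning from user feedback, be added without reworking the basic fist/stop-sign loop.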
Requirement Students need a background in basic electronics, control systems, and software development. The systems engineering work will draw on a mix of different engineering areas.
Tools Specific hardware design tools and the software programming language will be determined in consultation with the supervisor.
Number of Students 3
Students Dayelena Pyaneandee, Johana Mallma Kostyuk, Abtin Ghodoussi-Jafari
Comments: rastko.selmic@concordia.ca
Links: