Capstone Project
Group | 2024-16 | Status | In Progress
Title | HeAR: Bridging Silence with Vision and Touch
Supervisor | Dr. Le Beux
Description | For individuals with hearing impairments, daily life can present significant challenges, from communicating effectively with others to recognizing important auditory signals such as alarms. Hearing aids, a common solution, are not suitable for all individuals, particularly those with severe hearing loss. They also come with a range of issues, including discomfort, acoustic feedback, susceptibility to moisture, and interference from other devices, and they present a steep learning curve for those who have recently lost their hearing and need time to adapt. The high cost of hearing aids and their limitations in certain environments further exacerbate the problem.
Given the shortcomings of traditional hearing aids, there is a need for a more inclusive, user-friendly solution that enhances communication and awareness without introducing the challenges that hearing aids bring. This project aims to design and prototype an all-in-one set of Augmented Reality (AR) glasses as a comprehensive solution for the deaf and hearing impaired. These glasses will leverage the user's visual and tactile senses, offering a virtual screen that overlays text and sound notifications while keeping the real world visible. By providing a visual interface with haptic feedback, they address the auditory functions that hearing aids attempt to restore, without the associated limitations. The deliverables include a functional AR glasses prototype with a clear, visible display that overlays real-time text and sound-related notifications to enhance situational awareness and accessibility.
Key Features:
• Real-World Integration: a transparent interface that lets users see both the real world and the AR-generated information simultaneously, ensuring a seamless experience.
• Haptic Feedback: silent vibrations or taps alert the user to important sounds, adding another layer of awareness without relying on audio signals.
• Speech-to-Text: transcribes spoken words in real time and displays them on the AR glass interface (see the transcription sketch after this list).
• Sound Awareness Interface: converts environmental sounds (e.g., fire alarms, phone ringing, door knocks) into visual signals and notifications (see the detection sketch after this list).
• Translation Mode: automatically detects spoken languages and translates them into the user's preferred language, or offers visual cues for communication.
By merging AR technology with real-time transcription, sound visualization, and haptic feedback, these glasses will provide an intuitive and comprehensive solution for the deaf and hearing-impaired community, offering a more accessible, convenient, and versatile alternative to traditional hearing aids.
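A minimal sketch of the real-time Speech-to-Text loop described above, assuming the open-source Vosk engine and the sounddevice library purely for illustration (the team may select a different transcription stack); the model path is a placeholder, and the print call stands in for drawing onto the AR overlay:

```python
import json
import queue

import sounddevice as sd
from vosk import Model, KaldiRecognizer  # offline speech-to-text engine (assumption)

SAMPLE_RATE = 16000
audio_q: queue.Queue = queue.Queue()

def on_audio(indata, frames, time_info, status):
    # Runs on the audio thread: hand raw 16-bit PCM chunks to the main loop.
    audio_q.put(bytes(indata))

model = Model("model")  # path to a downloaded Vosk model (placeholder)
recognizer = KaldiRecognizer(model, SAMPLE_RATE)

with sd.RawInputStream(samplerate=SAMPLE_RATE, blocksize=8000,
                       dtype="int16", channels=1, callback=on_audio):
    while True:
        if recognizer.AcceptWaveform(audio_q.get()):
            text = json.loads(recognizer.Result()).get("text", "")
            if text:
                print(text)  # placeholder for rendering on the AR display
```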
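The Sound Awareness Interface can be prototyped without a trained model by watching for energy in a characteristic frequency band. The sketch below is a simplified illustration, not the project's final method: the alarm band and threshold are assumptions that would be tuned empirically, and notify() is a hypothetical hook for the visual and haptic alert:

```python
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 16000
BLOCK = 2048
ALARM_BAND = (2800, 3600)  # Hz; typical smoke-alarm tone range (assumption)
THRESHOLD = 0.05           # spectral magnitude cutoff, tuned empirically (assumption)

def notify(message: str) -> None:
    print(message)  # hypothetical hook: would trigger the AR overlay and haptics

def on_block(indata, frames, time_info, status):
    mono = indata[:, 0]
    windowed = mono * np.hanning(len(mono))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(windowed), 1.0 / SAMPLE_RATE)
    in_band = spectrum[(freqs >= ALARM_BAND[0]) & (freqs <= ALARM_BAND[1])]
    if in_band.mean() > THRESHOLD:
        notify("Possible alarm detected")

with sd.InputStream(samplerate=SAMPLE_RATE, blocksize=BLOCK,
                    channels=1, callback=on_block):
    sd.sleep(10_000)  # listen for ten seconds in this demo
```

A production version would replace the fixed band with a trained sound classifier, as the AI skills listed below anticipate.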
Student Requirement | Proficiency in Programming Languages:
• Knowledge of Embedded C for microcomputer programming and Python for Artificial Intelligence (AI).
Microcontrollers and Embedded Systems:
• Familiarity with microcomputers and with interfacing various devices to them (see the GPIO sketch below).
• Working with embedded-system assembly.
Mobile Application Development:
• Application development knowledge, including UI/UX design, database building, and operating-system task handling.
Knowledge of Artificial Intelligence and Machine Learning:
• Working with AI and machine learning to train the device to detect hand motions and to translate and transcribe speech.
Circuit Design and Electrical Engineering:
• Skills in designing, building, and testing circuits, particularly with PCB (Printed Circuit Board) tools.
• Understanding of electrical properties, including power consumption, battery management, and current draw, to ensure efficient hardware integration.
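As one concrete instance of the device-interfacing work listed above, the haptic feedback feature could drive a vibration motor from a Raspberry Pi GPIO pin with PWM. This is a hedged sketch using the gpiozero library; the pin number, frequency, and pattern encoding are all assumptions:

```python
from time import sleep

from gpiozero import PWMOutputDevice  # high-level Raspberry Pi GPIO library

# GPIO 18 is an assumption; any PWM-capable pin driving the motor circuit works.
motor = PWMOutputDevice(18, frequency=200)

def pulse_pattern(pulses: int, strength: float = 0.8,
                  on_s: float = 0.15, off_s: float = 0.10) -> None:
    """Fire a short vibration pattern; distinct patterns could encode
    distinct sound events (e.g., three pulses for a fire alarm)."""
    for _ in range(pulses):
        motor.value = strength  # duty cycle, 0.0 to 1.0
        sleep(on_s)
        motor.off()
        sleep(off_s)

pulse_pattern(3)
```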
Tools | Hardware:
• Microphones for capturing ambient sounds and recognizing spoken language.
• Cameras for image processing, such as detecting people and measuring distance.
• Haptic actuators for delivering tactile feedback (vibrations) to the user.
• 2x mini LED displays with mirrors and lenses for overlaying text and sound notifications onto the user's view through holographic/virtual imaging.
• Smartphone application for managing the device's settings and modes.
• Buttons for interacting with the device interface.
• Micro-computer (e.g., Raspberry Pi CM4) for processing and memory.
• 3D-printed frame to house the hardware.
• IMU IC (gyroscope/accelerometer on the PCB) to track orientation and stabilize the UI.
• GPS IC (also on the PCB) to track the location of the glasses in case of loss.
• Rechargeable Li-ion coin-cell battery to power the device.
• Cooling system for heat dissipation.
Software:
• Software development kit (Flutter/React Native/Swift) for application development.
• AI training models for speech-to-text as well as image recognition.
• Python for backend programming.
• OpenCV for distance measurement using camera integration (see the sketch below).
• C/C++ for direct hardware programming and sensor interaction.
• Transcription libraries for speech-to-text integration.
• MongoDB/Firebase databases to store user data.
Test equipment:
• Multimeter to measure the electrical properties of the hardware components.
• Black-box testing to evaluate the overall functionality of the AR glasses.
• Oscilloscope to verify proper timing and signal processing across sensors and displays.
• Battery and power-consumption testers to optimize performance and longevity.
• Display calibration tools to measure and adjust the AR imaging.
• Test rig and motion-tracking systems to track orientation and movement.
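The OpenCV distance measurement listed above commonly relies on similar triangles: if the real width of an object and the camera's focal length in pixels are known, then distance = real width × focal length / pixel width. A minimal sketch, assuming a pre-calibrated focal length and using OpenCV's bundled Haar face detector as one convenient target:

```python
import cv2

FOCAL_LENGTH_PX = 600.0      # from a one-time camera calibration (assumption)
KNOWN_FACE_WIDTH_CM = 15.0   # average adult face width (assumption)

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def estimate_distance_cm(pixel_width: float) -> float:
    # Similar triangles: distance = (real width * focal length) / pixel width
    return KNOWN_FACE_WIDTH_CM * FOCAL_LENGTH_PX / pixel_width

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        print(f"Face at roughly {estimate_distance_cm(w):.0f} cm")
cap.release()
```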
Number of Students | 7
Students | Skevos Galanomatis, Rezeq Khader, Carl Nakad, Matthew Noah Ruffolo, Ethan Santhiyapillai, Joerexpradeepan Thambaiah, Ruaid Usmani
Comments:
Links: |