Introduction: Approximately 7 million people in the U.S. have visual impairments. Conditions such as cataracts, glaucoma, and macular degeneration cause people to lose their vision. Thousands of people with visual impairments rely on adaptive technology to assist them with daily tasks; from smart canes and vests to screen readers, these devices help them live with ease and comfort. Moreover, recent technologies, such as AI-assisted devices, have enabled a broader range of capabilities.
Significance: However, even within the adaptive technology market, little has been developed to help the visually impaired navigate difficult environments containing obstacles or people. Although devices such as phone apps and wrist-mounted cameras are sold to thousands of customers, they do not address the specific needs of the visually impaired: for example, some detect only objects below the waist, while others are bulky and uncomfortable. To address this problem, this project focused on developing technology that meets the needs of the visually impaired and better equips users with a personalized configuration.
Methods: To understand the needs of the visually impaired, members of the Carroll Center for the Blind community were interviewed. They requested features such as light weight, comfort, subtlety, and haptic feedback. These requests were integrated into the first prototype, which consisted of an Arduino Micro and an ultrasonic sensor. Next, because a two-part device would be more convenient, wireless communication between the parts was implemented. The first attempt, using HM-10 Bluetooth modules, failed to establish a connection; ultimately, two ESP-32 microcontrollers communicating over Espressif's ESP-NOW wireless protocol were used instead. These iterations were refined through trial and error, guided by feedback from our clients, the members of the Carroll Center for the Blind: project members proposed ideas, then built and tested them.
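The two-board ESP-NOW link can be illustrated with a minimal sketch. The packet layout below is an assumption for illustration only (the abstract does not specify the payload); on real hardware, the encoded bytes would be passed to `esp_now_send()` on the brooch unit and unpacked inside the receive callback registered with `esp_now_register_recv_cb()` on the wrist unit.

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical payload for the brooch-to-wrist link. ESP-NOW frames
// carry at most 250 bytes of application data, so a small fixed
// struct like this fits comfortably in a single frame.
struct DistancePacket {
    uint16_t distance_mm;  // reading from the Time-of-Flight sensor
    uint8_t  sequence;     // lets the receiver notice dropped frames
};

// Pack the struct into the raw byte buffer that ESP-NOW transmits.
size_t encode(const DistancePacket& p, uint8_t* buf) {
    std::memcpy(buf, &p, sizeof p);
    return sizeof p;
}

// Unpack on the wrist unit (in real firmware, inside the
// ESP-NOW receive callback).
DistancePacket decode(const uint8_t* buf) {
    DistancePacket p;
    std::memcpy(&p, buf, sizeof p);
    return p;
}
```

A fixed binary layout like this keeps each transmission well under the 250-byte ESP-NOW frame limit, so every sensor reading fits in one frame with no fragmentation logic needed.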
Results: Through weeks of trial and error, we developed a two-part device built around two ESP-32s, one connected to a Time-of-Flight distance sensor and the other to a haptic feedback buzzer. The first ESP-32 and the sensor are mounted on a brooch to ensure detection above the waist; the second ESP-32 and the buzzer take a watch-like form worn on the user's wrist. The first ESP-32 transmits distance data, which is converted into a haptic signal the user can feel. Additionally, new C++ code implements a logic system that enables faster and more accurate haptic feedback. With these components in place, we presented the device to members of the Carroll Center for the Blind, who tested and used it.
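The distance-to-haptic logic is not detailed in the abstract; one plausible sketch, with made-up thresholds and intervals rather than the project's actual tuning, maps the measured distance to a vibration pulse interval so that closer obstacles produce faster buzzing:

```cpp
#include <cstdint>

// Illustrative constants, not the project's actual values.
const uint16_t kMaxRangeMm = 2000;  // ignore obstacles beyond 2 m
const uint16_t kMinPulseMs = 100;   // fastest buzz: obstacle very close
const uint16_t kMaxPulseMs = 800;   // slowest buzz: obstacle at max range

// Returns the delay between haptic pulses in milliseconds,
// or 0 to keep the buzzer off entirely.
uint16_t pulseIntervalMs(uint16_t distance_mm) {
    if (distance_mm >= kMaxRangeMm) return 0;  // nothing nearby: silent
    // Linear interpolation: shorter distance -> shorter interval.
    uint32_t span = kMaxPulseMs - kMinPulseMs;
    return kMinPulseMs +
           static_cast<uint16_t>(span * distance_mm / kMaxRangeMm);
}
```

On the wrist unit, the main loop would pulse the haptic motor, wait `pulseIntervalMs()` of the latest received distance, and repeat; a return value of 0 keeps the motor off, so the device stays silent when the path is clear.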
Conclusion: This project delivered an effective solution to the navigation difficulties faced by the visually impaired. The device is lightweight and provides efficient above-the-waist detection, features that members of the visually impaired community requested during the design process. Through this device, we aim to address these challenges, improve quality of life for the visually impaired community, and encourage their independence.

Authors List:
Alyssa Yasuhara, Hannah Sheppard, Morgan Olszewski, Karina Florian, Sophia Fu
Presenting Author:
Alyssa Yasuhara
Affiliations:
Newton North High School, LigerBots
Email:
alyasu929@gmail.com
Key Words (5 Words Maximum):
Cataracts, Glaucoma, Macular Degeneration, ESP-32, HM-10