Students create machine to help visually impaired

CS students hold echolocating machine

SUMMARY: Computer science students and faculty design and create a machine that replicates echolocation to help people with vision disabilities.

By: Caleb Ayers, CISE 

While flying through the night sky, bats use sound waves as a substitute for vision. Bats rapidly emit sound pulses, most of which are outside the range of human hearing. In what is known as echolocation, a bat uses returning echoes to interpret their surroundings, navigate, and detect food. Whales and dolphins use this same tactic underwater. Amazingly, a few blind humans have also learned to echolocate with clicking noises.

During the summer 2017 Research Experience for Undergraduates (REU), Computer Science (CS) professor Nathan Sprague, CS major Diana Godja (’19) and Nhung Hoang (’19) from Swarthmore College began talking about this incredible human adaptation. “We decided it would be a really interesting project to see if we could build a system that could reproduce that ability: to be able to send out auditory signals, listen for the echoes, and then reconstruct a depth map based on that analysis,” said Sprague.

Intrigued by the idea, Godja and Hoang began working on the project over the summer. Under Sprague’s guidance, they successfully designed and created a machine that replicates echolocation, which they envision as a future asset for the visually impaired. They named the echolocating machine Bruce, referring to Batman’s alternate identity, Bruce Wayne.

Bruce performs echolocation using a neural network. This means that the machine continues to learn from each input it receives, producing increasingly accurate outputs over time.

Bruce’s physical form is inexpensive and simple: a commercial microphone, speakers, and a depth camera. The speakers emit an audible chirp that bounces off nearby objects and returns to the microphone. The returning echoes are used to plot a depth map of the room, which is checked against ground truth from the depth camera. All the data are then fed into the neural network.
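The training loop described above can be sketched in miniature. The code below is an illustrative assumption, not the team’s actual implementation: it substitutes synthetic echoes for real recordings, and all names, dimensions, and the tiny one-hidden-layer network are invented for demonstration. The key idea it shows is the supervision signal: the depth camera provides the ground-truth depth map, and the network learns to recover that map from the echo alone.

```python
# Minimal sketch (assumed, not Bruce's real code): train a tiny neural
# network to map a recorded echo back to a depth map, using the depth
# camera's output as the ground-truth training target.
import numpy as np

rng = np.random.default_rng(0)

ECHO_LEN = 64      # samples in each recorded echo (illustrative)
DEPTH_BINS = 16    # coarse depth-map resolution (illustrative)

# Synthetic stand-in for physics: each "echo" is a noisy linear
# function of the true depth map the depth camera would report.
true_mix = rng.normal(scale=0.1, size=(DEPTH_BINS, ECHO_LEN))

def record_pair():
    depth = rng.uniform(0.5, 4.0, size=DEPTH_BINS)          # metres
    echo = depth @ true_mix + 0.01 * rng.normal(size=ECHO_LEN)
    return echo, depth

# One-hidden-layer network trained by stochastic gradient descent.
W1 = rng.normal(scale=0.1, size=(ECHO_LEN, 32))
W2 = rng.normal(scale=0.1, size=(32, DEPTH_BINS))
lr = 5e-3
losses = []
for step in range(3000):
    echo, depth = record_pair()
    h = np.tanh(echo @ W1)        # hidden activations
    pred = h @ W2                 # predicted depth map
    err = pred - depth            # supervised by the depth camera
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the squared error through both layers.
    gW2 = np.outer(h, err)
    gh = (err @ W2.T) * (1 - h ** 2)
    gW1 = np.outer(echo, gh)
    W1 -= lr * gW1
    W2 -= lr * gW2

# After training, the network estimates depth from sound alone.
echo, depth = record_pair()
pred = np.tanh(echo @ W1) @ W2
```

In the real system the echo would first be transformed into features (e.g. a spectrogram) and the network would be far larger, but the train-against-the-camera structure is the same.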

During the summer, Godja and Hoang spent countless hours designing, coding, and testing Bruce. They met and consulted with bat researchers at Virginia Tech, and, to test and improve Bruce’s accuracy, they walked around ISAT, collecting a dataset of the results from more than one hundred thousand chirps.

When the REU concluded, Godja continued working with Bruce as her independent study for the fall semester, tweaking Bruce’s code and collecting more datasets. She also attended several conferences to present the invention and discuss potential uses. She and Sprague are currently working on a research paper about Bruce.

“The computer science department here is amazing,” Godja said. “There are so many opportunities to do so many things. I feel like I’m being supported by professors and the whole department -- in my future -- in whatever I want to do.”

Aside from the neural network knowledge she gained, Godja has developed practical skills, such as persevering through research failures, working in teams, and presenting research. Although the REU and her independent study have concluded, Godja will continue using her free time to perfect Bruce.

Sprague and Godja believe that Bruce has the potential to help others, but it still needs some improvements. For instance, it only functions in relatively small indoor areas. Bruce also cannot identify what the surroundings are; it merely indicates the general location of nearby masses. However, both Sprague and Godja believe that this technology could be combined with other types of sensors to create a trustworthy representation of reality for the visually impaired.

“If you really wanted to have a system that would be useful for blind people,” Sprague said, “you want it to be as robust as possible, and you want it to have multiple modalities for getting the depth information.”

Published: Thursday, March 8, 2018

Last Updated: Thursday, March 8, 2018
