Bachelor of Science
iOS, mobile application, blind, visually impaired, TensorFlow, CoreML, Apple, iBeacon, camera, vision, object detection, traffic light
In day-to-day life, visually impaired individuals face the challenge of crossing roads on their own. This project was designed and built to address that key issue. The system warns the user before they approach a crosswalk and tells them when it is safe to cross. An iOS application was developed because recent studies have found that a large number of visually impaired individuals use smartphones, iPhones in particular, for the ease and convenience they bring to daily life. The application is required to notify the user through audio and haptic vibration, announce by audio when it is safe to cross with 100% confidence, and accept all input by voice. In addition, it should need no internet connection, consume little battery power, and run in the background alongside other applications. The application is intended for use in Queens, NY in particular. For the warning notification, we used iBeacon to detect when a user is approaching a crosswalk; the iBeacon detection currently has a distance error of only ±3 feet. In our implementation, every output of the application is audio and haptic vibration, and all inputs are voice and haptic touch, which are standard for the visually impaired. The application informs the user when it is safe to cross by detecting traffic lights at 30 frames per second using the phone's built-in camera and an offline image-processing model. The model was built in Caffe2 and then converted to CoreML for this project. The model alone has an accuracy of 92.9%; on top of that, we added cross-checking, which increases the confidence level according to Bayes' theorem. There is also an additional state-machine layer that refines the output using the possible timing of the traffic light, producing a 100%-confident signal to cross the road while ensuring the user never waits more than 10 seconds without a decision.
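The cross-checking idea can be illustrated with a small Bayesian update: if each camera frame is treated as an independent observation from a detector of known accuracy, a few agreeing frames quickly raise the posterior confidence. The sketch below is illustrative, not the thesis code; the function name and the uniform prior are assumptions, with only the 92.9% per-frame accuracy taken from the abstract.

```python
# Hypothetical sketch of the cross-checking step: treat each camera
# frame as an independent observation from a detector with known
# accuracy, and use Bayes' theorem to update P(light is green).
# Names and the 0.5 prior are illustrative assumptions.

DETECTOR_ACCURACY = 0.929  # per-frame model accuracy reported in the abstract

def update_confidence(prior: float, detected_green: bool,
                      accuracy: float = DETECTOR_ACCURACY) -> float:
    """Posterior P(green) after one frame, via Bayes' theorem.

    P(obs=green | green)     = accuracy
    P(obs=green | not green) = 1 - accuracy
    """
    if detected_green:
        num = accuracy * prior
        den = accuracy * prior + (1 - accuracy) * (1 - prior)
    else:
        num = (1 - accuracy) * prior
        den = (1 - accuracy) * prior + accuracy * (1 - prior)
    return num / den

# Three consecutive "green" detections starting from a 50/50 prior:
p = 0.5
for _ in range(3):
    p = update_confidence(p, True)
# p is now above 0.999 -- agreement across frames compounds quickly.
```

This shows why cross-checking raises confidence well beyond the single-frame 92.9%: under the independence assumption, three agreeing frames already push the posterior past 99.9%.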
The application is only 30 MB in size, needs no internet connection, and has low to average power consumption. It can also run in the background, keeping power consumption low and allowing it to work alongside other applications. We have performed both unit testing and integration testing to verify every component of the application. The remaining step is real-world testing by visually impaired individuals to validate this initial prototype.
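The timing layer described in the abstract, a state machine that refines detector output and guarantees a decision within 10 seconds, might look like the following minimal sketch. All state names, thresholds, and the confirmation-streak rule are assumptions for illustration; only the 30 fps rate and the 10-second bound come from the abstract.

```python
# Illustrative state-machine sketch (not the thesis implementation):
# accumulate per-frame detections at 30 fps, commit to "safe to cross"
# after a sustained run of green detections, and fail safe if 10 s
# pass without a confirmed decision.

from enum import Enum

FPS = 30
MAX_WAIT_FRAMES = 10 * FPS   # never leave the user waiting > 10 s
CONFIRM_FRAMES = 15          # assumed streak length needed to commit

class Signal(Enum):
    UNDECIDED = 0
    SAFE_TO_CROSS = 1
    DO_NOT_CROSS = 2

class CrossingStateMachine:
    def __init__(self) -> None:
        self.frames_seen = 0
        self.green_streak = 0

    def on_frame(self, detected_green: bool) -> Signal:
        """Feed one frame's detector output; return the current decision."""
        self.frames_seen += 1
        self.green_streak = self.green_streak + 1 if detected_green else 0
        if self.green_streak >= CONFIRM_FRAMES:
            return Signal.SAFE_TO_CROSS
        if self.frames_seen >= MAX_WAIT_FRAMES:
            # 10 seconds elapsed without confirmation: fail safe.
            return Signal.DO_NOT_CROSS
        return Signal.UNDECIDED
```

Failing safe on timeout is the key design choice here: the bounded wait is satisfied by defaulting to "do not cross" rather than by forcing a crossing signal.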
Khan, Ali, "An iOS Application for Visually Impaired Individuals to Assist with Crossing Roads" (2022). Honors Theses. 2642.