Title page for ETD etd-05072012-161514

Type of Document Dissertation
Author D'Angio, Paul Christopher
URN etd-05072012-161514
Title Adaptive and Passive Non-Visual Driver Assistance Technologies for the Blind Driver Challenge®
Degree PhD
Department Mechanical Engineering
Advisory Committee
Name                          Title
Hong, Dennis W.               Committee Co-Chair
Leonessa, Alexander           Committee Co-Chair
Cooper, Robin K. Panneton     Committee Member
Southward, Steve C.           Committee Member
Sturges, Robert H.            Committee Member
Keywords
  • Real-Time Neural Network Driver Modeling
  • Driver Assistive Technologies
  • Model Predictive Control
  • Quasi-Newton Optimization
  • Non-Visual Human Computer Interfaces
Date of Defense 2012-04-30
Availability unrestricted
This work proposes a series of driver assistance technologies that enable blind persons to safely and independently operate an automobile on standard public roads. Such technology could additionally benefit sighted drivers by augmenting vision with suggestive cues during normal and low-visibility driving conditions. This work presents a non-visual human-computer interface system with passive and adaptive controlling software to realize this type of driver assistance technology. The research and development behind this work was made possible through the Blind Driver Challenge® initiative taken by the National Federation of the Blind.

The instructional technologies proposed in this work enable blind drivers to operate an automobile through the provision of steering wheel angle and speed cues to the driver in a non-visual manner. This paradigm imposes four principal functionality requirements: Perception, Motion Planning, Reference Transformations, and Communication. The Reference Transformations and Communication requirements are the focus of this work; together they convert motion planning trajectories into a series of non-visual stimuli that can be communicated to the human driver.

This work proposes two separate algorithms to perform the necessary reference transformations described above. The first algorithm, called the Passive Non-Visual Interface Driver, converts the planned trajectory data into a form that can be understood and reliably interacted with by the blind driver. This passive algorithm performs the transformations through a method that is independent of the driver. The second algorithm, called the Adaptive Non-Visual Interface Driver, performs similar trajectory data conversions through methods that adapt to each particular driver. This algorithm uses Model Predictive Control supplemented with Artificial Neural Network driver models to generate non-visual stimuli that are predicted to induce optimal performance from the driver. The driver models are trained online and in real-time with a rapid training approach to continually adapt to changes in the driver’s dynamics over time.
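The adaptive scheme described above can be sketched in miniature. The fragment below is an illustrative assumption, not the dissertation's implementation: it pairs a single-hidden-layer network trained online by stochastic gradient descent (standing in for the Artificial Neural Network driver model with rapid online training) with a one-step search over a discrete cue set (standing in for Model Predictive Control with quasi-Newton optimization). All names (`OnlineDriverModel`, `mpc_cue`) are hypothetical.

```python
import numpy as np

class OnlineDriverModel:
    """Minimal one-hidden-layer network trained online with SGD.
    Predicts the driver's next steering response from the current
    state and the cue presented (structure is illustrative only)."""

    def __init__(self, n_in=2, n_hidden=8, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, n_hidden)
        self.b2 = 0.0
        self.lr = lr

    def predict(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        return self.W2 @ h + self.b2, h

    def update(self, x, target):
        """One online SGD step on the squared prediction error."""
        y, h = self.predict(x)
        err = y - target
        dh = err * self.W2 * (1.0 - h ** 2)   # backprop through tanh
        self.W1 -= self.lr * np.outer(dh, x)
        self.b1 -= self.lr * dh
        self.W2 -= self.lr * err * h
        self.b2 -= self.lr * err
        return err

def mpc_cue(model, state, target, candidates):
    """One-step predictive cue selection: choose the cue whose
    predicted driver response lands closest to the target."""
    best, best_cost = None, np.inf
    for c in candidates:
        pred, _ = model.predict(np.array([state, c]))
        cost = (pred - target) ** 2
        if cost < best_cost:
            best, best_cost = c, cost
    return best
```

Because the model is updated after every observed driver response, the cue selection continually tracks drift in the driver's dynamics, which is the core idea behind the adaptive interface driver.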

The communication of the calculated non-visual stimuli is subsequently performed through a Non-Visual Interface System proposed by this work. This system comprises two non-visual human-computer interfaces that communicate driving information through haptic stimuli. The DriveGrip interface is a pair of vibro-tactile gloves that communicate steering information through the driver's hands and fingers. The SpeedStrip interface is a vibro-tactile cushion fitted on the driver's seat that communicates speed information through the driver's legs and back. The two interfaces work simultaneously to provide a continuous stream of directions to the driver as he or she navigates the vehicle.
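The hand-and-finger encoding used by such a glove interface can be illustrated with a toy mapping. The function below is a hypothetical sketch, not the actual DriveGrip protocol: it quantizes a signed steering correction into one of a few vibration zones per hand, with the zone count and saturation threshold chosen arbitrarily.

```python
def drivegrip_pattern(steer_error_deg, zones=4, max_error_deg=90.0):
    """Map a signed steering correction onto a (hand, zone) pair.

    Negative corrections vibrate the left glove, positive the right;
    larger magnitudes activate higher-numbered zones, saturating at
    max_error_deg. Thresholds and zone counts are illustrative only.
    """
    mag = min(abs(steer_error_deg), max_error_deg)
    zone = int(round(mag / max_error_deg * (zones - 1)))  # 0 .. zones-1
    hand = "left" if steer_error_deg < 0 else "right"
    return hand, zone
```

A speed cushion could use the same quantization idea along the seat's vibro-tactile rows, so that steering and speed cues stream to the driver simultaneously through separate haptic channels.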

Filename                         Size        Approximate Download Time (Hours:Minutes:Seconds)
                                             28.8 Modem   56K Modem   ISDN (64 Kb)   ISDN (128 Kb)   Higher-speed Access
DAngio_PC_D_2012_1.pdf           9.75 Mb     00:45:09     00:23:13    00:20:19       00:10:09        00:00:52
DAngio_PC_D_2012_Copyright.pdf   734.70 Kb   00:03:24     00:01:44    00:01:31       00:00:45        00:00:03
