Design and Analysis of an Artificial Intelligence Based System for Real-Time Detection of Texting and Driving

Using a cell phone to talk or text while a vehicle is in motion impairs driving performance and increases the likelihood of serious crashes. This solution uses a machine learning-enabled video camera to automatically recognize the usage of mobile devices in the driver’s seat through object detection.

Problem: Texting and Driving

Cell phone use has increased dramatically over the years, and distracted driving and its consequences have risen in tandem. According to the National Highway Traffic Safety Administration, in 2015, around 542,000 drivers were using their cell phones while on the road at any given point during daylight hours (Driver Electronic Device Use in 2015, NHTSA), creating great potential for crashes. The National Safety Council estimates that one in four motor vehicle crashes involves cell phone use.

The Centers for Disease Control and Prevention determined that one-third of drivers between the ages of 18 and 64 read or send text or email messages while driving. The problem is becoming more prevalent, and laws restricting driver cell phone use have not solved it. The statistics are sobering: the NHTSA estimated that in 2015, 30,000 people were injured in crashes involving cell phone use or other cell-phone-related activities.

Driver cell phone use while the vehicle is in motion, including texting and browsing the web, is extremely dangerous and has safety implications beyond the individual driver's wellbeing. The misconceptions that using a phone behind the wheel is not unsafe and that drivers can multitask effectively lead many people to risk their own lives, and the lives of others, for the sake of using a mobile device.

In fact, driver reaction times are significantly slower when a cell phone is involved (Distractive effects of cellphone use). A study by the University of Utah determined that the reaction times of drivers distracted by cell phone use were comparable to those of intoxicated drivers. A study conducted by Car and Driver Magazine has shown that texting and driving may be even more dangerous than drunk driving.

A challenge for current solutions that attempt to prevent texting and driving is distinguishing between a phone in the driver's hands and a phone in a passenger's hands.

With this in mind, we sought to leverage new technologies to address this issue, using the AWS DeepLens to detect phone use in the driver’s seat.

Solution

To address these problems, our proposed solution uses the machine learning-enabled video camera AWS DeepLens to automatically recognize the usage of mobile devices in the driver’s seat through object detection. Positioned overlooking the driver’s shoulder, the smart camera has a full view of the driver’s seat and the steering wheel, allowing it to monitor the driver’s actions at all times.

Our goal was to train the smart camera to detect driver cell phone use and respond by issuing an audio warning to the driver in real time. To achieve this, we applied a neural network for cell phone object detection through transfer learning.
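
The write-up does not name the training framework, so the snippet below only illustrates the transfer-learning mechanic itself: reuse a feature extractor pre-trained on a large dataset, freeze it, and fit a small new output layer on a few hundred labeled frames. As a stand-in it uses a Keras Inception V3 binary classifier (phone / no phone); the model actually deployed was an SSD + Inception V2 object detector fine-tuned along the same lines.

    import tensorflow as tf

    # Load an Inception backbone pre-trained on ImageNet, without its classification head.
    base = tf.keras.applications.InceptionV3(
        include_top=False, weights="imagenet", input_shape=(299, 299, 3), pooling="avg")
    base.trainable = False  # freeze the pre-trained features; only the new head is trained

    # New head: a single sigmoid unit deciding "phone in driver's hand" vs. "no phone".
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(base.output)
    model = tf.keras.Model(inputs=base.input, outputs=outputs)

    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds from the split below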

First, we manually located and labeled four hundred images of a phone in the driver's hand. We used 320 of these images to train the network and the remaining 80 for validation.
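
As a minimal sketch, the 320/80 split can be produced as follows; the directory layout and file names here are hypothetical and only stand in for the labeled data.

    import random
    from pathlib import Path

    # Hypothetical layout: one .jpg per labeled frame (annotations stored alongside).
    IMAGE_DIR = Path("data/driver_phone_frames")  # assumed path, not from the write-up
    random.seed(42)                               # fixed seed so the split is reproducible

    images = sorted(IMAGE_DIR.glob("*.jpg"))
    random.shuffle(images)

    split = int(0.8 * len(images))  # 80/20 split: 320 train / 80 validation for 400 images
    train_set, val_set = images[:split], images[split:]

    Path("train.txt").write_text("\n".join(p.stem for p in train_set))
    Path("val.txt").write_text("\n".join(p.stem for p in val_set))
    print(f"{len(train_set)} training images, {len(val_set)} validation images")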

Results & Discussion

The network achieved a mean average precision of 95.07%, and we found the overall detection-and-alert delay to be 0.31 seconds. Given the high precision and detection speed, we concluded that the transfer-trained SSD + Inception V2 network deployed to DeepLens can be used for real-time detection of driver cell phone use.
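
On the device, the detection-and-alert loop can be organized roughly as below. This sketch assumes the standard AWS DeepLens awscam inference API (getLastFrame, Model.doInference, Model.parseResult); the model path, class index, confidence threshold, and aplay-based audio warning are illustrative assumptions rather than details from the actual deployment.

    import os
    import awscam  # AWS DeepLens on-device inference library
    import cv2

    MODEL_PATH = "/opt/awscam/artifacts/ssd_inception_v2_phone.xml"  # hypothetical model path
    PHONE_LABEL = 1          # hypothetical class index for "phone in driver's hand"
    CONF_THRESHOLD = 0.6     # only alert on confident detections
    INPUT_SIZE = (300, 300)  # SSD input resolution

    model = awscam.Model(MODEL_PATH, {"GPU": 1})

    while True:
        ret, frame = awscam.getLastFrame()  # grab the latest camera frame
        if not ret:
            continue
        resized = cv2.resize(frame, INPUT_SIZE)
        result = model.parseResult("ssd", model.doInference(resized))
        phones = [d for d in result["ssd"]
                  if d["label"] == PHONE_LABEL and d["prob"] >= CONF_THRESHOLD]
        if phones:
            # Phone detected in the driver's seat: play a pre-recorded audio warning.
            os.system("aplay /home/aws_cam/alert.wav")

In practice, a short cooldown between warnings keeps the alert from repeating on every frame while the phone remains in view.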

In the future, this approach can be extended to address other dangerous driving behaviors, such as driving under the influence or drowsy driving.

Awards

  • 2019 Regeneron Science Talent Search (STS) Scholar
  • JSHS Competitive Poster Session B Physical Science First Place Winner