
A mobile app that notifies users of objects or hazards in their path while they are engaged with their smartphones.

Introduction


Most people use smartphones for various purposes while walking on roads or footpaths.

But while doing so, we can trip or fall without being aware of our surroundings.

Falls account for about 8 million hospital emergency room visits, making them the leading cause of such visits.

Survey

We conducted the first round of interviews with 18 participants. Here are the results:

  • 61% were male

  • 50% had sustained minor injuries from falls

  • 81% were aged 18-25 years

  • 79% use their smartphones while strolling in urban areas

  • 66% were proficient smartphone users

  • 61% expressed positive interest in a safety app

Methodology

Measurement criteria
  • Object detection accuracy (percentage)

  • Screen resolution of the real-time image

  • Category of detection (footpath, flower, door)

  • Processing time for real-time object detection

  • Number of threads required to classify objects in live video

  • Number of steps before path deviation

  • Reaction time of the participants

Technology used
  • Xcode

  • Python, image classification framework

  • TensorFlow, TFLite model using EfficientNet (see the inference sketch after this list)

  • Google Colab

  • Google Drive

  • Datasets: Roboflow, Kaggle
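
As a rough sketch of how this pipeline fits together, the snippet below loads a TFLite classifier and scores a single camera frame. It is a minimal illustration only; the model file name is a placeholder, not the project's actual artifact, and input normalization is model-dependent.

    import numpy as np
    import tensorflow as tf

    # Load the compiled TFLite classifier once at startup.
    interpreter = tf.lite.Interpreter(model_path="efficientnet_lite.tflite")
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    def classify_frame(frame: np.ndarray) -> tuple[int, float]:
        """Return (class_index, confidence) for one RGB camera frame."""
        h, w = inp["shape"][1], inp["shape"][2]
        img = tf.image.resize(frame, (h, w))          # fit the model input size
        img = tf.cast(img, inp["dtype"])[tf.newaxis, ...].numpy()
        interpreter.set_tensor(inp["index"], img)
        interpreter.invoke()
        scores = interpreter.get_tensor(out["index"])[0]
        idx = int(np.argmax(scores))
        return idx, float(scores[idx])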

Testing Participants
  • 5 participants

  • 4 males and 1 female

  • Mean age of 25 years

  • Graduate students at ASU

  • All participants knew how to use smartphones

  • All participants habitually use smartphones while walking

Testing Scenarios
  • 3 real-world test locations

  • Within living communities, for safety reasons

  • Participants walked from 60 ft toward an object that could trip them

  • Participants were fully engaged with the object detection app while walking toward the object

  • Once the app detected an object with greater than 80% confidence, the participant was expected to change their path (see the sketch after this list)
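
A minimal sketch of that trial rule, assuming a hypothetical next_frame() camera hook and the classify_frame() helper sketched earlier:

    DEVIATE_THRESHOLD = 0.80  # confidence at which the participant should change path

    def run_trial(next_frame, classify_frame, labels):
        """Step through camera frames until a detection is confident enough."""
        while True:
            idx, confidence = classify_frame(next_frame())
            if confidence > DEVIATE_THRESHOLD:
                return labels[idx], confidence  # cue: deviate from the path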


Testing

Participant 1 (Test Location 1)

  • Object detection percentage = 87%

  • Path deviation 7 steps before the obstacle

  • Reaction time: 1 second

Participant 3 (Test Location 2)

  • Object detection percentage = 93%

  • Path deviation 3 steps before the obstacle

  • Reaction time: 1 second

Participant 5 (Test Location 3)

  • Object detection percentage = 83%

  • Path deviation 1 step before the obstacle

  • Reaction time: 2 seconds

Results

Using Pearson correlation coefficients (see the sketch after this list), we found that:

  • Accuracy Percentage vs. Reaction Time: r = -0.4001

  • Accuracy Percentage vs. Number of Steps from Obstacle: r = -0.2103

  • Accuracy Percentage vs. Processing Time for Real-time Object Detection: r = 0.23

  • Reaction Time vs. Number of Steps from Obstacle: r = -0.4001

  • No strong correlations were found between the variables.

  • Further analysis and replication with larger sample sizes may be necessary to draw more generalizable conclusions.
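
For reference, such coefficients can be computed with a few lines of SciPy. The arrays below are placeholders drawn from the three runs shown on this page, not the full study data behind the reported values, so the printed r will differ.

    import numpy as np
    from scipy.stats import pearsonr

    accuracy = np.array([87.0, 93.0, 83.0])  # detection accuracy (%)
    reaction = np.array([1.0, 1.0, 2.0])     # reaction time (s)
    steps = np.array([7.0, 3.0, 1.0])        # steps from obstacle at deviation

    r, p = pearsonr(accuracy, reaction)
    print(f"accuracy vs. reaction time: r = {r:.4f}")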

Figure: Pearson correlation coefficients (accuracy percentage vs. reaction time).

Figure: Graph showing reaction time (seconds) vs. accuracy percentage (83%, 87%, 93%) for Participants 1, 3, and 5.

System Usability Scale (SUS) Results

  • Participant 1 = 67.5

  • Participant 2 = 80

  • Participant 3 = 90

  • Participant 4 = 55

  • Participant 5 = 92.5

Average SUS Score = 77
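
These scores follow the standard SUS formula (odd items contribute response minus 1, even items 5 minus response, summed and scaled by 2.5). A quick sketch, with the per-participant scores from above:

    def sus_score(responses):
        """responses: ten 1-5 Likert answers in questionnaire order."""
        total = sum((r - 1) if i % 2 == 0 else (5 - r)
                    for i, r in enumerate(responses))
        return total * 2.5

    participant_scores = [67.5, 80, 90, 55, 92.5]
    print(sum(participant_scores) / len(participant_scores))  # -> 77.0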


Limitations

  • Limited sample size for testing the prototype and understanding user behavior

  • Participant reaction time was measured manually and may contain human error

  • Notifications were not yet added as a response to detected objects

  • A framework change from image classification to an object detection CNN is needed for better accuracy

  • Shadows on objects reduced detection accuracy

User End Prototype

  • When the app detects an object with confidence greater than 97% (yet to be achieved), it sends a modal notification to the user (see the sketch after this list)

  • This serves as a warning to deviate from their path in order to avoid injury

  • This feature is envisioned as a built-in capability that works in the background
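
A sketch of that warning rule, assuming a hypothetical notify_user() hook standing in for the modal notification:

    WARN_THRESHOLD = 0.97  # target confidence, not yet reached by the prototype

    def maybe_warn(label, confidence, notify_user):
        """Fire a modal warning once detection confidence clears the threshold."""
        if confidence > WARN_THRESHOLD:
            notify_user(f"Hazard ahead: {label} ({confidence:.0%}). Change your path!")
            return True
        return False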


Future Scope

Prototype improvements:

  • Expand the dataset for better detection

  • Increase the accuracy percentage

  • Expand object classification categories

Product integration:

  • Google View with object detection

  • A built-in feature in Android and iOS to support background object detection

  • Lane detection in car software for more intelligent suspension control

