Pedestrian Location and Assistance

Freescale Sensor Fusion Software
Sponsor: Dr. Jim Frenzel
Team Name: The Neighborhood Watch
Duration: Spring 2015 - Fall 2015
Faculty Advisor: Dr. Feng Li
Faculty Mentor: Dr. Jim Frenzel
Team Members:
  • Alec Briggs
  • Tom Haney


Problem Definition

The Pedestrian Location and Guidance project aims to monitor an intersection and locate pedestrians while they are in the crosswalk. The system is geared toward assisting visually impaired pedestrians who may have trouble crossing the street safely. If a pedestrian strays out of the crosswalk or remains in it too long, the system should be able to recognize this and either give feedback or lengthen the time to cross.

Background

The University of Idaho has a history of assisting impaired pedestrians at intersections. A previous senior design project at the University used Ethernet over Powerline to design a crosswalk button that could communicate and exchange information with the traffic controller. The Pedestrian Location and Guidance project builds on that work to provide further protection to impaired pedestrians.

Deliverable

  • Methods and design setup, the code used for the system, experimental procedures, and anything else needed to replicate the project for future research and application.
  • A report on the feasibility and usability of the system design, assessing how the technology could be applied in real-world, everyday use.

Specifications

Each specification is rated by importance (5 = high, 1 = low):

  • Provide feedback to the pedestrian: 3
  • Be able to locate someone in the crosswalk: 5
  • Be able to work in all weather conditions: 4
  • Provide connectivity to the traffic controller: 3
  • Track multiple users at a time: 5
  • Be customizable to different intersections: 3
  • Be able to test on an actual phone: 1
  • Increase accuracy using a secondary device (DecaWave): 2

Project Learning

We ended up using two different methods, both relying on sensor fusion, to try to solve our problem. The first method uses the Freescale Sensor Fusion test kit. This kit is a two-part system consisting of an embedded microcontroller board and a set of sensors including an accelerometer, gyroscope, magnetometer, pedometer, and various others. This complement of sensors is available in most modern cell phones, and it lets us take advantage of Freescale's filtering and data-combination algorithms to use the sensor data in more useful ways than we could with any one sensor alone. These libraries also allow the frame of reference to be set to Earth rather than to the chip's orientation.
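
To illustrate what "setting the frame of reference to Earth" means in practice, here is a minimal C++ sketch that rotates a body-frame acceleration vector into the Earth frame using an orientation quaternion. The types and values are illustrative only and are not the Freescale library's actual interface.

```cpp
#include <cstdio>

// Minimal quaternion / vector types (hypothetical; the fusion library
// provides its own structures for orientation output).
struct Quat { double w, x, y, z; };
struct Vec3 { double x, y, z; };

// Rotate a body-frame vector into the Earth frame using the body-to-Earth
// orientation quaternion q (assumed to be unit length): v_e = q * v_b * q^-1.
Vec3 bodyToEarth(const Quat& q, const Vec3& v) {
    double w = q.w, x = q.x, y = q.y, z = q.z;
    Vec3 r;
    r.x = (1 - 2*(y*y + z*z))*v.x + 2*(x*y - w*z)*v.y + 2*(x*z + w*y)*v.z;
    r.y = 2*(x*y + w*z)*v.x + (1 - 2*(x*x + z*z))*v.y + 2*(y*z - w*x)*v.z;
    r.z = 2*(x*z - w*y)*v.x + 2*(y*z + w*x)*v.y + (1 - 2*(x*x + y*y))*v.z;
    return r;
}

int main() {
    // Example: an identity orientation leaves the vector unchanged.
    Quat q{1, 0, 0, 0};
    Vec3 accelBody{0.1, 0.0, 9.8};
    Vec3 accelEarth = bodyToEarth(q, accelBody);
    std::printf("Earth-frame accel: %.2f %.2f %.2f\n",
                accelEarth.x, accelEarth.y, accelEarth.z);
    return 0;
}
```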

The second method consisted of the Bosch BNO055 sensor fusion board, an Arduino Pro Mini (5 V, 16 MHz), and a microSD card. This board performs the sensor fusion itself, and we used the Arduino to read the fused data and save it to the microSD card. The Bosch board did not come with libraries like the Freescale board did, so it was a little more limited. The main limitation was that its frame of reference is fixed to the orientation of the chip.
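
A minimal sketch of this read-and-log loop, assuming the Adafruit BNO055 and Arduino SD libraries (the file name and chip-select pin are placeholders, and this is not necessarily the exact code we ran):

```cpp
#include <Wire.h>
#include <SD.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);   // BNO055 over I2C
const int chipSelect = 10;                   // SD card chip-select pin (placeholder)

void setup() {
  Serial.begin(9600);
  if (!bno.begin()) {
    Serial.println("BNO055 not detected");
    while (1) {}
  }
  bno.setExtCrystalUse(true);
  if (!SD.begin(chipSelect)) {
    Serial.println("SD init failed");
    while (1) {}
  }
}

void loop() {
  // Linear acceleration = accelerometer reading with gravity removed by the
  // on-chip fusion, still expressed in the sensor's own frame.
  imu::Vector<3> accel = bno.getVector(Adafruit_BNO055::VECTOR_LINEARACCEL);

  File logFile = SD.open("imulog.csv", FILE_WRITE);
  if (logFile) {
    logFile.print(millis());   logFile.print(',');
    logFile.print(accel.x());  logFile.print(',');
    logFile.print(accel.y());  logFile.print(',');
    logFile.println(accel.z());
    logFile.close();
  }
  delay(100);  // roughly 10 Hz logging
}
```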

Instead of using GPS, we are researching how accurately we can track someone's location using these other sensors as they cross the street. This is because GPS is typically accurate only to about 10 feet, which can be the difference between a person being inside the crosswalk or not.

2015 PLA SystemDiagram.png

Testing

The idea was to read the acceleration data, perform sensor fusion, then use simple kinematic equations to get velocity and then displacement. From there we would check the displacement against the given boundaries. If the pedestrian was in the crosswalk, we would start the process over. If they had veered out of the crosswalk, we would give some sort of feedback and start over. If they had successfully crossed the crosswalk, we would be done.

For the actual testing, we marked out a rectangular area that acted as a crosswalk. We then walked in different patterns to see how well the device could recognize where we were in the crosswalk. For feedback, we used the LED on the device, glowing in different colors depending on which condition was hit during boundary checking.
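
A rough sketch of this integrate-and-check loop in plain C++, with placeholder crosswalk dimensions and a simulated acceleration input standing in for real sensor data:

```cpp
#include <cstdio>

// Crosswalk modeled as an axis-aligned rectangle in the Earth frame
// (dimensions here are placeholders, in meters).
const double CROSSWALK_WIDTH  = 3.0;   // across the walking direction
const double CROSSWALK_LENGTH = 15.0;  // along the walking direction

enum Status { IN_CROSSWALK, OUT_OF_BOUNDS, CROSSED };

struct State { double vx = 0, vy = 0, px = 0, py = 0; };

// Integrate Earth-frame acceleration twice to update velocity and position,
// then compare the position against the crosswalk boundaries.
Status update(State& s, double ax, double ay, double dt) {
  s.vx += ax * dt;           // v = v0 + a*dt
  s.vy += ay * dt;
  s.px += s.vx * dt;         // p = p0 + v*dt
  s.py += s.vy * dt;

  if (s.px < 0 || s.px > CROSSWALK_WIDTH) return OUT_OF_BOUNDS;  // veered out
  if (s.py >= CROSSWALK_LENGTH)           return CROSSED;        // made it across
  return IN_CROSSWALK;
}

int main() {
  State s;
  s.px = CROSSWALK_WIDTH / 2;  // start in the middle of the near edge
  // Feed a constant forward acceleration just to exercise the boundary check.
  for (int i = 0; i < 200; ++i) {
    Status st = update(s, 0.0, 0.2, 0.1);
    if (st != IN_CROSSWALK) {
      std::printf("status %d at x=%.2f y=%.2f\n", st, s.px, s.py);
      break;
    }
  }
  return 0;
}
```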

Results

After running several tests we found that the sensors on both devices were not very accurate. Freescale confirmed this when we asked why we were getting strange values. The most troubling problem was that we saw a nonzero acceleration even when the device wasn't moving, and because we integrate twice, the corresponding displacement errors become large after a short period of time. One idea for fixing this was to find the bias of the motionless device and then add or subtract it from the acceleration readings. The problem we found was that the same bias did not seem to apply to the moving device, so correcting the stationary readings would just add error to the moving readings.

For many applications the stationary readings are not relevant, but for ours they are very important. If a pedestrian stopped in the intersection for a few seconds, the device would still record movement. After a few seconds the device would think the pedestrian had made it across when in reality they were still standing in the middle. Detailed results are shown in the document archive.
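
The bias-correction idea amounts to averaging readings from a known-stationary device and subtracting that average from later samples. A small C++ sketch of the idea, with made-up numbers:

```cpp
#include <cstdio>
#include <cstddef>

// Estimate the accelerometer bias by averaging samples taken while the
// device is known to be stationary.
double estimateBias(const double* samples, std::size_t n) {
  double sum = 0.0;
  for (std::size_t i = 0; i < n; ++i) sum += samples[i];
  return n > 0 ? sum / n : 0.0;
}

// Subtract the stationary bias from a new reading. In our tests this cleaned
// up the stationary readings, but the same offset did not match the error we
// saw while the device was moving.
double correct(double reading, double bias) {
  return reading - bias;
}

int main() {
  // Made-up stationary readings in m/s^2 (the device should report 0 here).
  double stationary[] = {0.12, 0.09, 0.11, 0.10, 0.13};
  double bias = estimateBias(stationary, 5);
  std::printf("bias = %.3f, corrected reading = %.3f\n",
              bias, correct(0.45, bias));
  return 0;
}
```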

Recommendations

We had a few thoughts on improving accuracy. One was to use multiple, different accelerometers: compare the readings from each, and if they matched fairly well, take their average; if they did not correlate, assume the device was not moving and that we were just seeing each sensor's bias. Another thought was to use the DecaWave, GPS, or some other type of location tracking to improve the accuracy. There are also several applications in which accelerometers are used to recognize a stepping motion, where the change from positive to negative z acceleration is counted as a step. This essentially builds a pedometer, with an assumed step length used to calculate how far one must have walked (a sketch of this idea follows below). It is an interesting approach, but it will ultimately not be very accurate because everyone's stride length is different.
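
A sketch of that zero-crossing step counter in plain C++, using a hypothetical fixed stride length and made-up acceleration samples:

```cpp
#include <cstdio>

// Count steps by watching the vertical (z) linear acceleration cross from
// positive to negative, then estimate distance with an assumed stride length.
// The 0.75 m stride and the sample data below are illustrative only.
const double STRIDE_M = 0.75;

int countSteps(const double* az, int n) {
  int steps = 0;
  for (int i = 1; i < n; ++i) {
    if (az[i - 1] > 0.0 && az[i] <= 0.0) ++steps;  // positive-to-negative crossing
  }
  return steps;
}

int main() {
  // Fake z-acceleration trace (gravity removed), two "steps" worth of swings.
  double az[] = {0.4, 0.9, 0.2, -0.5, -0.8, 0.1, 0.7, -0.3, -0.6, 0.2};
  int n = sizeof(az) / sizeof(az[0]);
  int steps = countSteps(az, n);
  std::printf("steps = %d, estimated distance = %.2f m\n",
              steps, steps * STRIDE_M);
  return 0;
}
```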

Team Information

2015 PLA BidPhoto sized.jpg
Alec Briggs
  Electrical Engineering
  Hometown: Boise, Idaho
  Interests: Sports (especially soccer), hiking, fishing, video games
  Email: brig9575@vandals.uidaho.edu

2015 PLA personalpic.png
Tom Haney
  Computer Engineering
  Hometown: Idaho
  Interests: ---
  Email: thaney@vandals.uidaho.edu

Document Archive

  • File:Agenda Minutes.pdf
  • File:Project Timeline.pdf
  • File:Cart slide.pdf - Slide of frictionless track. Acceleration should read 9.8 m/s^2
  • File:Freescale no movement.pdf - Data seen when the Freescale device was not moving