P5 – Team BlueCane (Group 17)


Team Members: Evan Strasnick, Joseph Bolling, Xin Yang Yak, Jacob Simon

Project Summary: We have created an add-on device for a blind user's cane that integrates GPS functionality via Bluetooth and gives cardinal and/or route-guided directions via haptic feedback.

Tasks Supported in this Prototype: Our first and easiest task arises whenever the user is in an unfamiliar space, such as a shopping mall or store. As they mentally map their surroundings, it's imperative that the user maintain a sense of direction and orientation. Our cane will allow users to find and maintain an accurate sense of north when disoriented, increasing the reliability of their mental maps. Our second and third tasks both confront the problems that arise when a user must rely on maps constructed by someone else in order to navigate an unfamiliar space, as is the case with navigation software and GPS walking guidance. In the second task (medium difficulty), our cane would assist users on an afternoon walk by providing haptic GPS directions, allowing them to explore new areas and discover new places. In our third and most difficult task, our cane alleviates the stress of navigating under difficult circumstances, such as those that frequently arise when running errands in an urban environment. In noisy, unfamiliar territory, the BlueCane would allow users to travel unimpaired by environmental noise or hand baggage, both of which can make traditional GPS systems very difficult to use.

How Our Tasks Have Changed Since P4: Because our P4 tests were conducted with sighted users unfamiliar with cane travel, we hesitate to generalize those findings to our target user group. Now that we have found blind volunteers from the community to test our next prototype, we can be more confident that our findings from P6 will generalize. The aim of our testing procedure remains largely the same: we still want our users to be able to navigate with one hand free while attending to other auditory cues. Since our previous round of tests did not yield much useful insight, we decided to keep most of the tasks the same. For example, sighted users found the walk-in-a-cardinal-direction-given-north task challenging, but we expect blind users to perform better at it, since they already orient themselves relative to a known direction without visual cues; their feedback on this task should therefore still be useful, and we are not changing it. Likewise, blindfolded sighted users walked unrealistically slowly and relied heavily on tactile feedback for guidance while performing the follow-the-direction-of-the-tactile-feedback task. We expect blind users to walk much faster, which would present a different set of challenges for our system, so we are not changing this task either. However, we also recognize that cane users normally rely on a great deal of tactile feedback to get around obstacles. For the task in which the user is given auditory distractions, we are therefore adding obstacles along the user's path to simulate a more challenging use case and to check whether the cane's vibration is too distracting.

Revised Interface Design: Because we were only able to recruit sighted participants for our first round of user testing, we were hesitant to drastically change aspects of our design in ways that may not actually be relevant to blind users. Most notably, our prototype now takes the form not of a cane itself but of a simple add-on that can be attached to a cane. We chose this design because we wanted to test the usability of our system without the confound of the additional learning a blind user would have to do simply to get used to a new cane. With our prototype, users can test with their own familiar cane, adding only the slight weight and form factor of the device. As noted in the discussion of our P4 blog post (http://blogs.princeton.edu/humancomputerinterface/2013/04/09/p4-bluecane/), we wanted to make it very clear to the user which of the navigation "modes" the cane was currently in. We therefore added a simple switch that alternates between the two modes, with braille markings on either side to make the distinction clear.
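As a concrete illustration of the mode switch, the firmware only needs to poll a single digital pin. The pin number and mode names below are our assumptions for the sketch, not final design decisions:

```cpp
const int MODE_SWITCH_PIN = 7; // assumed pin for the two-position switch

void setup() {
  // Use the internal pull-up so the switch just shorts the pin to ground.
  pinMode(MODE_SWITCH_PIN, INPUT_PULLUP);
}

// LOW = switch closed = compass ("find north") mode;
// HIGH = switch open = GPS route-guidance mode.
bool inCompassMode() {
  return digitalRead(MODE_SWITCH_PIN) == LOW;
}
```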

Updated Storyboards:

Task 1

Task 2

Task 3

Elements Still to Be Added:

The Eventual Smartphone App

Overview and Discussion of the New Prototype:

i. For our first working prototype, we used our magnetometer, Bluetooth Arduino shield, and vibration motor to implement the important features of our final design. Rather than constructing or modifying an entire cane, though, we decided to make the prototype as lean as possible by attaching it to the user's existing cane. The prototype is capable of telling users when they are pointing in a particular cardinal direction (currently magnetic north) using haptic feedback; a minimal sketch of this loop appears after this list. It is also capable of sending and receiving data wirelessly over Bluetooth, which can be used to provide turn-by-turn navigation in conjunction with a GPS-equipped device.
ii. There are some notable limitations to our prototype that we hope to address in future refinements. We hope to develop a more sophisticated mapping from the magnetometer that will allow us to send, receive, and store specific directional bearings (a hypothetical helper for this is sketched after this list). We may use some degree of machine learning to calculate the desired range of magnetometer values. We would also like to refine the Bluetooth interface by developing a simple Android app that can communicate with the cane. Our emphasis for this prototype was to build a reasonable proof of concept, though, so we have left these advanced functions on the back burner until we get more feedback. Finally, we are still discussing whether our final product should take the form of an actual cane.
iii. We Wizard-of-Oz'ed some of the cane's navigation features. For example, to give the user directions and turns, we wrote a Processing program that uses keyboard input to send commands to the cane in real time; this stands in for what a phone-based GPS application would do in the real world (the cane-side command handling is sketched after this list). Simulating these features without the complication of third-party hardware/software allows us to test features quickly, debug connection problems, and maintain control over the testing procedure.
iv. The code for the Bluetooth functionality was written with guidance and examples from the manufacturer's documentation and Arduino tutorials. We also used some of the example code from the SparkFun product page for our magnetometer.
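To make item i concrete, here is a minimal sketch of the north-detection loop. It assumes an HMC5883L-style magnetometer on I2C and a vibration motor driven from pin 9; our actual sensor, wiring, and tolerance may differ, so read this as an illustration rather than our exact firmware.

```cpp
#include <Wire.h>

const int MOTOR_PIN = 9;          // vibration motor (via transistor) -- assumed pin
const int MAG_ADDR = 0x1E;        // I2C address of the HMC5883L
const float TOLERANCE_DEG = 10.0; // how close to north counts as "pointing north"

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  Wire.begin();
  Wire.beginTransmission(MAG_ADDR);
  Wire.write(0x02);               // mode register
  Wire.write(0x00);               // continuous-measurement mode
  Wire.endTransmission();
}

// Read the magnetometer and return a heading in degrees [0, 360),
// with 0 = magnetic north. No tilt compensation or declination here.
float readHeadingDegrees() {
  Wire.beginTransmission(MAG_ADDR);
  Wire.write(0x03);               // first data register
  Wire.endTransmission();
  Wire.requestFrom(MAG_ADDR, 6);  // registers come out in X, Z, Y order
  uint8_t buf[6];
  for (int i = 0; i < 6; i++) buf[i] = Wire.read();
  int16_t x = (int16_t)((buf[0] << 8) | buf[1]);
  int16_t y = (int16_t)((buf[4] << 8) | buf[5]);
  float heading = atan2((float)y, (float)x) * 180.0 / PI;
  if (heading < 0) heading += 360.0;
  return heading;
}

void loop() {
  float heading = readHeadingDegrees();
  // Distance from north, handling the 359-to-0 wraparound.
  float error = min(heading, 360.0 - heading);
  digitalWrite(MOTOR_PIN, error < TOLERANCE_DEG ? HIGH : LOW);
  delay(100);
}
```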
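For the bearing support mentioned in item ii, a hypothetical helper (the names and conventions here are ours, nothing is final) could compute the signed error between the current heading and a target bearing received over Bluetooth; the sign could then select a left or right vibration pattern:

```cpp
// Signed angular error in (-180, 180]; negative means "turn left".
// targetBearing would eventually be set over Bluetooth by the phone.
float targetBearing = 0.0; // degrees clockwise from north

float signedError(float heading, float target) {
  float error = target - heading;
  while (error > 180.0) error -= 360.0;   // wrap so the shorter
  while (error <= -180.0) error += 360.0; // turn direction wins
  return error;
}
```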
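And for the Wizard-of-Oz setup in item iii, the cane-side handling can be as simple as reading single characters from the Bluetooth serial link. The specific command characters and pulse patterns below are our assumptions for illustration, not the exact protocol used by our Processing program:

```cpp
const int MOTOR_PIN = 9; // same assumed motor pin as above

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  Serial.begin(9600);    // the Bluetooth shield appears as a serial port
}

// Distinct pulse counts let the user feel which turn is meant.
void pulseMotor(int count) {
  for (int i = 0; i < count; i++) {
    digitalWrite(MOTOR_PIN, HIGH);
    delay(200);
    digitalWrite(MOTOR_PIN, LOW);
    delay(200);
  }
}

void loop() {
  if (Serial.available() > 0) {
    switch (Serial.read()) {
      case 'L': pulseMotor(1); break; // one pulse: turn left
      case 'R': pulseMotor(2); break; // two pulses: turn right
      case 'S': digitalWrite(MOTOR_PIN, LOW); break; // stop feedback
    }
  }
}
```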
Videos and Images:
The Cane Add-On
The Processor and Other Hardware
Attached to a “Cane”
The Prototype in Action