Group 17 – P6

Group Number: 17

Names: Evan, Jacob, Joseph, Xin Yang

Project Summary: We are testing an add-on device for the cane of a blind user which integrates GPS functionality via bluetooth and gives cardinal and route-guided directions via haptic feedback.

Introduction: Over the course of our project, we have prototyped an attachment for the long white cane used by the blind. Intended to work as a bluetooth extension to a GPS device, the BlueCane provides haptic and touch-based navigation guidance and features a passive mode in which it haptically gives the user an intuitive sense of compass orientation. We are now testing this prototype with blind users. Our purpose is to determine the usability of our current prototype, in terms of how much of an improvement (if any) it would provide over current systems, and to determine which features promote usability and which should be improved or removed. We hope to understand the usefulness of haptic and touch-based guidance in a navigation interface for the blind.


Link to our P5 prototype:

Since our P5 submission, we have made the following changes:

  • Cut down on the size of the PVC apparatus to facilitate easier attachment to individual canes

  • Consolidated and organized wiring to prevent shorts, breaks, and entanglements

  • Added an accelerometer and experimented with gravitational tilt compensation for the compass unit
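The tilt compensation we experimented with follows the standard approach: use the accelerometer's gravity vector to estimate pitch and roll, rotate the magnetometer reading back into the horizontal plane, and only then compute a heading. The sketch below is a minimal illustration of that idea; the axis conventions and function name are our own assumptions, not our exact firmware.

```cpp
#include <cmath>

// Estimate pitch and roll from the accelerometer's gravity vector,
// project the magnetometer reading onto the horizontal plane, and
// compute a heading in degrees (0 = magnetic north, illustrative axes).
double tiltCompensatedHeading(double mx, double my, double mz,
                              double ax, double ay, double az) {
    const double RAD_TO_DEG = 180.0 / 3.14159265358979323846;
    double roll  = std::atan2(ay, az);
    double pitch = std::atan2(-ax, std::sqrt(ay * ay + az * az));

    // Rotate the magnetic field vector back into the horizontal plane
    double xh = mx * std::cos(pitch) + mz * std::sin(pitch);
    double yh = mx * std::sin(roll) * std::sin(pitch) + my * std::cos(roll)
              - mz * std::sin(roll) * std::cos(pitch);

    double heading = std::atan2(-yh, xh) * RAD_TO_DEG;
    if (heading < 0) heading += 360.0;
    return heading;
}
```

When the device is level (gravity entirely on the z axis), this reduces to a plain two-axis compass heading, which is the behavior the unit had before the accelerometer was added.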


Participants: All three participants were blind or visually impaired individuals, with varying levels of mobility and experience with cane travel, living in the Mercer county area. They were recruited via a notice sent over the listserv for the New Jersey Foundation for the Blind which advertised an opportunity to help test a prototype for new technologies in the area of navigational tools for the blind. Participant #1 was a completely blind, retired female, who, despite having above-average mobility and confidence, was primarily a seeing-eye dog user and thus had limited experience with cane travel. She used a GPS device regularly. Participant #2 was a blind woman who held a full-time job, but primarily used transit services to get around. Though she had far more experience with cane travel, she had limited experience with GPS technology. Participant #3 was a legally blind, working male who had moderate experience with cane travel. He worked in the field of technology and had experience with GPS. No participant had any physical issues with mobility, and all seemed to understand well the nature of the task and were excited about the advancements that we were proposing.

Apparatus: Our prototype was essentially the same as demoed in our P5 video, with the small modification that we carved out a portion of the cane handle to give the attachment a better form factor. An accelerometer had been added to allow for more accurate directional calculations in the final version, but it had not yet been implemented at the time of testing. In addition, we used a small blue briefcase in the third task, as well as audio tracks of city background noise. In order to minimize the demand on our participants, who had difficulty traveling, all testing was performed at the house of the individual, typically outdoors in their yard or neighborhood because of space requirements. As a result, the participants all had a fair degree of familiarity with their environment, which, while perhaps allowing them to rely less on purely external directional instructions, lessened the already considerable stress associated with their participation.

Tasks: Our first, easiest task arises whenever the user is in an unfamiliar space, such as a shopping mall or store, but does not have a well-defined destination, as is often the case when browsing or shopping. As they mentally map their surroundings, it's imperative that the user maintain a sense of direction and orientation. Failure to do so can reduce the user's ability to do the things they want, and can even be a safety concern if the user becomes severely lost. Our cane will allow users to find and maintain an accurate sense of north when disoriented by providing intuitive haptic cues, increasing the reliability of their mental maps.
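Internally, "pointing north" is just a tolerance band around a target bearing. A sketch of the wrap-around check involved (the function name and tolerance are illustrative, not our exact firmware):

```cpp
#include <cstdlib>

// True when `heading` lies within `tolerance` degrees of `target`,
// handling the wrap-around at 0/360. In the prototype, a true result
// would pulse the vibration motor to signal the cardinal direction.
bool withinBearing(int heading, int target, int tolerance) {
    int diff = std::abs(heading - target) % 360;
    if (diff > 180) diff = 360 - diff;  // take the shorter way around
    return diff <= tolerance;
}
```

For example, a heading of 355° is within a 10° band around north, while 20° is not.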

Our second and third tasks both confront the problems that arise when a user must rely on maps constructed by someone else in order to navigate an unfamiliar space with the intent of reaching a specific destination, as is the case with navigation software and GPS walking guidance. In the second task (medium difficulty), our cane would assist users on their afternoon walk by providing haptic and tactile GPS directions, allowing users to explore new areas and discover new places, in much the way that a sighted person might want to visit a new street, building, or park on a leisurely stroll.

In our third and most difficult task, our cane alleviates the stress of navigation under difficult circumstances, such as frequently occur when running errands in an urban environment. In noisy, unfamiliar territory, the BlueCane would allow users to travel unimpaired by environmental noise or hand baggage, which can make it very difficult to use traditional GPS systems.

Procedure: Upon arriving at the participants' houses, we began by explaining who we were and what our system hoped to accomplish. After obtaining their informed consent, we introduced them to the prototype and its features, and allowed them to get familiar with it before explaining each task. We performed the tasks sequentially, gathering data during the trial itself (success in responding to cues and appropriately reaching destinations), and then obtained intermediate feedback after each task. After all tasks were completed, we reminded each participant to be as honest as possible and read out the survey questions, allowing them to qualify their answers as freely as they wished (after stating a value on the Likert-style questions). Finally, when all of our predetermined questions had been answered, we opened the conversation to a full discussion of any questions or other feedback they had.

Test Measures:

Task 1:

  • Whether the user was able to turn himself/herself towards a given cardinal direction using the cane's tactile feedback.
  • If unsuccessful, the approximate angle by which the user deviated from the correct direction.
  • Qualitative Feedback

Task 2:

  • Without any additional cues, we gave the user a few turns to follow using the raised ridges in our navigational hardware. Out of these, we counted how many they were able to follow.
  • Qualitative Feedback

Task 3:

  • Same as task 2.


Participants who succeeded in task 1: 3/3

Overall fraction of turns followed in task 2: 1/3, 0/5, 2/5

Overall fraction of turns followed in task 3: 3/3, 3/5, 2/2

Likert ratings: (1 for “Strongly Disagree”, 5 for “Strongly Agree”)

I found the vibration in the direction of North useful in maintaining a sense of my orientation.  4 + 5 + 4 (avg: 4.33)

I found the vibration in the direction of North intuitive and easy to use. 5 + 5 + 4 (avg: 4.67)

I found the turn-by-turn commands useful in navigating to a destination. 4 + 4 + 5 (avg: 4.33)

I found the turn-by-turn commands intuitive and easy to use. 3 + 4 + 5 (avg: 4)

I would prefer to have directions read to me aloud instead of or in addition to haptically (as in the current system). 3 + 2 + 4 (avg: 3)

I would prefer to use (a refined version of) this system over a standard cane. 5 + 5 + 4 (avg: 4.67)

I feel that having such a system available to me would increase my confidence or feeling of autonomy. 2 + 5 + 4 (avg: 3.33)

I feel that (a refined version of) such a system would help me navigate indoor spaces. 5 + 4 + 4 (avg: 4.33)

I feel that (a refined version of) such a system would help me navigate outdoors (with or without GPS navigation). 5 + 4 + 4 (avg: 4.33)


Given that the profiles of our 3 users varied considerably, it is likely that there are other profiles of blind users we have not considered or encountered. This makes us hesitant to draw strong conclusions about our external validity.

Variations between blind users:

– Amount of experience in cane travel

– Cane travel technique (how they like to hold the cane)

– How good their sense of direction is (none of them normally think in terms of cardinal directions, but they generally know how much a 90 degree turn is)

– Experience with assistive technologies

– Sense of autonomy.

Our system was generally well received (as indicated by the Likert feedback). All users were enthusiastic about our developments and asked to be informed of future opportunities to test the system.

Salient points from post-task discussions:

– Our method for indicating the turn-by-turn instructions needs to be more ergonomic – the current placement makes it difficult to detect both left and right signals with a single finger.

– Because of the many variations in hand placement, users are not always aware of when turn signals are passed.

– When a user missed a turn, it was hard to recover using the current system.

– We will need a way to adapt the layout of the ridges to many different hand placements and holding styles.

Discussion: For this round of testing, we were fortunate enough to work with visually impaired individuals and receive their feedback. We found that demonstrating, testing, and discussing our prototype with them was highly informative—affirming some features of our prototype and challenging others. The three individuals we visited had varying amounts of experience with cane travel, degrees of autonomy, navigational techniques, experience with technology, and senses of direction. Each participant acknowledged his or her degree of autonomy or "mobility" as well as how age has affected their ability to navigate independently. Furthermore, they lived in different environments and performed different tasks on a day-to-day basis. Together they provided a variety of responses to our questions and offered alternatives to some of our presumptions in the design process.

Naturally, it was more difficult to control the testing process, and our results were almost certainly influenced by testing location, individual preferences, and level of visual impairment. Whereas previously we performed the tasks in the confines of the electrical engineering lab with blindfolded students, this round of testing required traveling to participants’ neighborhoods. Even so, this revealed a range of use cases for our device and was ultimately helpful.

Participants' performance on the three tasks helped reveal differences between individuals, owing in part to their particular impairment. All three fared well on the first cardinal direction task. They understood the task and were able to identify the direction of north using haptic feedback from the cane; they also identified other cardinal directions using north as a point of reference within an acceptable degree of error. When asked if this feature was useful and intuitive, all three participants (as well as one participant's husband) responded either "Agree" or "Strongly Agree". One participant expressed an interest in being able to set the direction indicated by the cane, which affirmed our original intention. Interestingly, few if any of the participants said that they currently navigate with respect to cardinal directions; they prefer to think of their environment as a series of relative turns and paths. This challenged one of our presumptions about users' perception of their environment. We originally suspected that blind people discarded relative direction in favor of absolute direction, but this turned out to be incorrect. Nevertheless, all participants indicated that they were open to the idea of using the device to learn cardinal directions, and they acknowledged that the feature would be helpful in unfamiliar environments.

The turn-by-turn navigation task was more challenging and ultimately more informative. The task relied on the user's ability to perceive and respond to instructions sent from our laptop. Variation in grip technique and hand size led to some difficulty performing the task or accomplishing a turn in an adequate time frame. We found that users were better at the task when they were walking on well-defined paths (e.g. a sidewalk) where the location of the turn is already demarcated along the path itself. Navigation in the user's backyard was more difficult because it lacked these cues, and so the user had to infer the timing and magnitude of turns.

The first two users gripped the cane the way that we had anticipated in our design, but the third user preferred the less common "pencil grip," perhaps owing to height or cane length. As a result, we learned that the design of the cane handle should be more ergonomic—not only more comfortable but also flexible to different preferences, or at least designed to suggest the intended grip more clearly. We were also told that the distance between the turn indicators was too long and made it difficult to receive instructions exogenously (i.e. without attending to the device directly). Perhaps for this reason, most users agreed that they would prefer to use the cane in conjunction with an optional auditory GPS program. Despite these difficulties and qualifications, users still reported the turn-by-turn navigation feature as intuitive and easy to use in our survey questions. Two of the participants were especially optimistic about the potential of the device in indoor environments, and the third said that he would prefer to use a normal cane indoors.

In the third task—as in our previous round of testing—users were not hindered by the addition of background noise and even demonstrated a notable improvement over the second task. We were also informed by one user about the concept of “parallel traffic” noise, which is used for inferring traffic patterns and deciding when to cross roads. With this in mind, the ability to navigate without aural distractions seems more important than ever.

We also asked about the desired form factor for the final product, and participants gave varying responses. Some preferred the idea of a built-in, integrated navigational cane, but others decided that a device that attaches to their existing cane would be preferable (in case the cane breaks, for example). Most of the users expressed a desire simply to see more affordable technology, since existing screen-readers and navigational devices cost thousands of dollars and aren’t covered by health insurance. Overall, the participants were gracious with their feedback and asked to stay informed about the future of the project.


Document 1: Demo script

Document 2: Consent form

Document 3: Post-task questionnaire


Figure 1: Participants were introduced to the system and shown its relevant features.

Figure 2: Participants were tested on their ability to use the cardinal features of the BlueCane in task #1.

Figure 3: Participants followed directional cues in task #2.


Figure 4: Participants completed the same navigational task, but with the added distraction of background noise and luggage to carry.

Figure 5: A video of a participant undergoing testing is hosted at the link above.

P5 – Team BlueCane (Group 17)


Team Members: Evan Strasnick, Joseph Bolling, Xin Yang Yak, Jacob Simon

Project Summary: We have created an add-on device for the cane of a blind user which integrates GPS functionality via bluetooth and gives cardinal and/or route-guided directions via haptic feedback.

Tasks Supported in this Prototype: Our first, easiest task arises whenever the user is in an unfamiliar space, such as a shopping mall or store.  As they mentally map their surroundings, it’s imperative that the user maintain a sense of direction and orientation. Our cane will allow users to find and maintain an accurate sense of north when disoriented, increasing the reliability of their mental maps. Our second and third tasks both confront the problems that arise when a user must rely on maps constructed by someone else in order to navigate an unfamiliar space, as is the case with navigation software and GPS walking guidance. In the second task, (medium difficulty) our cane would assist users on their afternoon walk by providing haptic and tactile GPS directions, allowing users to explore new areas and discover new places. In our third and most difficult task, our cane alleviates the stress of navigation under difficult circumstances, such as frequently occur when running errands in an urban environment. In noisy, unfamiliar territory, the BlueCane would allow users to travel unimpaired by environmental noise or hand baggage, which can make it very difficult to use traditional GPS systems.

How Our Tasks Have Changed Since P4: As our tests in P4 were conducted on seeing users who are not familiar with cane travel, we hesitate to generalize our findings to our target user group. Now that we’ve managed to find blind people in the community to volunteer to test our next prototype, we can be more confident that our findings from P6 can be better generalized. The aim of our testing procedure remains largely the same – we still want our users to be able to navigate with one hand free while being able to pay attention to other auditory cues. Since our previous round of tests did not give us much useful insight, we decided to keep most of the tasks the same. For example, seeing users found the task of walking-in-a-cardinal-direction-given-North challenging, but we expect blind users to perform better at this task, since they already have to orient themselves relative to a known direction without visual cues. Thus, the feedback that blind users give while performing this task would still be useful, and we are not changing this task. Also, blindfolded seeing users walked slowly while being heavily reliant on tactile feedback for guidance as they performed the task of follow-the-direction-of-the-tactile-feedback, which is unrealistic. We expect blind people to walk much faster than blindfolded seeing people, and this would lead to a different set of challenges for our system. As such, we are not changing this task either. However, we also recognize that cane users also make use of a great deal of tactile feedback in normally getting around obstacles. Thus, for the task where the user is given auditory distractions, we are modifying the task by adding obstacles along the users’ path in order to simulate a more challenging use case and to check if the cane vibration would be too distracting.

Revised Interface Design: Because we were only able to locate seeing participants for our first round of user testing, we were hesitant to drastically change aspects of our design in ways that might not actually be relevant to blind users. Most notably, our prototype now takes the form not of a cane itself but of a simple add-on which can be placed on a cane. This design was chosen because we wanted to test the usability of our system without the confound of the additional learning a blind user would have to do simply to get used to a new cane. With our prototype, the user can test with their own familiar cane, adding only the slight weight and form factor of the device. As noted in the discussion of our P4 blog post, we wanted to make it very clear to the user which of the navigation "modes" the cane was currently in. Thus, we added a simple switch which alternates between the two modes and has braille markings on either side to make the distinction quite clear.

Updated Storyboards:

Task 1

Task 2

Task 3

Elements Still to Be Added:

The Eventual Smartphone App

Overview and Discussion of the New Prototype:

i. For our first working prototype, we used our magnetometer, bluetooth Arduino shield, and vibration motor to implement the important features of our final design. Rather than constructing or modifying an entire cane, though, we decided to make the prototype as lean as possible by attaching it to the user's existing cane. The prototype is capable of telling users when they are pointing in a particular cardinal direction (i.e. magnetic north) using haptic feedback. It is also capable of sending and receiving data wirelessly over bluetooth, which can be used for providing turn-by-turn navigation in conjunction with a GPS-equipped device.
ii. There are some notable limitations to our prototype that we hope to address in future refinements. We hope to develop a more sophisticated mapping from the magnetometer that will allow us to send, receive, and store specific directional bearings. We may use some degree of machine learning to calculate the desired range of magnetometer values. We would also like to refine the bluetooth interface by developing a simple Android app that can communicate with the cane. Our emphasis for this prototype was to build a reasonable proof-of-concept, though, so we have left these advanced functions on the back burner until we get more feedback. Finally, we are still discussing having our final product take the form of an actual cane.
iii. We wizard-of-oz'ed some of the cane's navigation features. For example, to give the user directions and turns, we wrote a Processing program that uses keyboard input to send commands to the cane in real time. This is a functional substitute for what a phone and GPS application would do in the real world. Simulating these features without the complication of third-party hardware/software allows us to test features quickly, debug connection problems, and maintain control over the testing procedure.
iv. The code for the bluetooth functionality was written with guidance and examples from the manufacturer’s documentation and Arduino tutorials. We also utilized some of the example code that came from the SparkFun page for our magnetometer.
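The wizard-of-oz setup in (iii) boils down to single-character commands sent over the serial link; on the cane side, a handler might look like the following sketch. The command characters and names here are our own illustration, not the actual protocol.

```cpp
// Map a single-character command from the serial link to an action.
// TURN_LEFT / TURN_RIGHT would raise the corresponding ridge on the
// handle; STOP would lower both. (Illustrative protocol only.)
enum Command { TURN_LEFT, TURN_RIGHT, STOP, UNKNOWN };

Command parseCommand(char c) {
    switch (c) {
        case 'l': case 'L': return TURN_LEFT;
        case 'r': case 'R': return TURN_RIGHT;
        case 's': case 'S': return STOP;
        default:            return UNKNOWN;
    }
}
```

Keeping the protocol this simple is what made it easy to drive the cane from a laptop keyboard during testing.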
Videos and Images:
The Cane Add-On
The Processor and Other Hardware
Attached to a “Cane”
The Prototype in Action

L2 – Team EyeWrist

Team Members: Evan Strasnick, Joseph Bolling, Xin Yang Yak, Jacob Simon

Group Number: 17

What We Built: We built a system that overlays tunes on top of a bass beat, using an accelerometer to select notes from within a given chord so that the result is always in tune for a specific key. A rotary pot allows the user to control the tempo, a push button on the larger breadboard varies the chord being used – either to toggle to the next set of notes or to modulate the sound through a continuous hold – the accelerometer picks out a note from within the currently selected chord, and the push button on the hand-held component plays the tune. To play our musical instrument, the user simply holds the hand-held controller (the smaller breadboard) in one hand and presses the button to create sound while controlling pitch via rotation. The user can also vary the tempo with the Tempo Knob and change the chord by pressing the Chord Progression button. For our demonstration, we chose a CM, FM, GM chord progression, but the instrument could theoretically support any arrangement or number of chords the user desires. This instrument was not only a success in terms of offering a number of ways in which the user could create an original beat, but it was also surprisingly addictive to play. We would have liked to use more speaker elements to offer different instrumental options as well.


1) Our first prototype was a simple mapping of the accelerometer tilt to allow the user to play melodies over a certain range of musical pitches. While this offered the most control over pitch in theory, we wanted the user to not have to worry so much about tilting correctly for the pitches that they wanted, allowing them to focus more on the beat that they were dropping. This philosophy inspired our next design…

2) Our second prototype instead preserved the fun of using the accelerometer to play, but aided the user by mapping its output to the notes within specific chords. After adding in the push button users were able to program in their own chord progressions for a given song and then simply iterate through them.

3) Our third prototype added in a constant bassline, which was always tuned to match the accelerometer’s output, keeping things easy for the user. Further, we added a potentiometer which allowed the user to control tempo.


Final Performance: Entitled “Lab 2 in A Minor,” this piece is a satire on satire, and a celebration of celebratory pieces (and poor percussionists). Enjoy:


Parts Used:

Wires + Crocodile clips
1 Arduino
2 breadboards
1 accelerometer
1 buzzer
1 piezo element
2 push buttons
1 rotary pot
1 * 220 ohm resistor
1 * 330 ohm resistor

Instructions to Recreate:

To build our instrument, mount the accelerometer and a push button on a small breadboard or other device that can be held and manipulated with one hand. Connect the accelerometer Vin, 3Vo, Gnd, and Yout to the A0, A1, A2, and A4 pins on the Arduino, respectively. Ground one terminal on the pushbutton, and connect the other to the Arduino's digital pin 2. Be sure to keep your wiring tight on this section. You may want to bundle your wires out of the way and secure the connections with electrical tape or solder, as this part of the device will be moving a lot as you play.

Connect the buzzer in series with a 220Ω resistor between ground and digital pin 5. Use alligator clips to connect the piezo element between digital pin 3 and ground. These two components will produce your sound, so feel free to experiment with different physical setups to see what kind of tonality you can get; we found that covering the sound hole on our buzzer actually made it considerably louder.

Mount the second pushbutton and the potentiometer on a separate breadboard. Ground one leg of the pushbutton, and connect the other to digital pin 4 on the Arduino. Connect the center leg of the potentiometer to pin A5, and connect the outer two legs to the Arduino's +5V and Gnd pins. When wiring this section, try to keep your wiring tight and out of the way; you'll want to have uninhibited access to the pushbutton and potentiometer as you play. Then just upload our code to your Arduino and have fun!
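The pitch tables in the code below store square-wave periods in microseconds rather than frequencies; a period is just 1,000,000 divided by the frequency in Hz. A quick sketch of the conversion (this helper is illustrative and not part of our sketch):

```cpp
// Convert a frequency in Hz to a square-wave period in microseconds.
// E.g., A440 maps to 2272, which matches the 'a' entry in the
// CMajorScale table below.
int periodForFrequency(double freqHz) {
    return (int)(1000000.0 / freqHz);
}
```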


Source Code:


// Accelerometer's power & ground pins
const int groundpin = A2;
const int powerpin = A0;

// Accelerometer's Y axis
const int ypin = A4;
const int tempopin = A5;

// Speaker pins
const int piezopin = 3;
const int buzzerpin = 5;

// Button pins
const int buttonpin = 2;
const int chordpin = 4;

int** progression;
int chordindex;
int basscounter;

boolean basson;
boolean buttonpressed;


int scaleN = 8;
int chordN = 4;

// C Major Scale: c, d, e, f, g, a, b, c
int CMajorScale[] = {3830, 3400, 3038, 2864, 2550, 2272, 2028, 1915};
char CMajorScaleNotes[] = "cdefgabc";

// C Major Chord: c, e, g, c
int CMajor[] = {3830, 3038, 2550, 1915};
char CMajorNotes[] = "cegc";

// G Major Chord: g, b, d, f
int GMajor[] = {5100, 4056, 3400, 2864};
char GMajorNotes[] = "gbdf";

// F Major Chord: f, a, c, f
int FMajor[] = {2864, 2272, 1915, 1432};
char FMajorNotes[] = "facf";

// A Minor Chord: a, c, e, g
int AMinor[] = {4544, 3830, 3038, 2550};
char AMinorNotes[] = "aceg";

int* CFCAC[] = {CMajor, FMajor, CMajor, AMinor, CMajor};
int* CFCGC[] = {CMajor, FMajor, CMajor, GMajor, CMajor};


void setup() {

  // Initialize serial communications for debug output
  Serial.begin(9600);

  // Power the accelerometer from the analog pins
  pinMode(groundpin, OUTPUT);
  pinMode(powerpin, OUTPUT);
  digitalWrite(groundpin, LOW);
  digitalWrite(powerpin, HIGH);

  // Input pins
  pinMode(ypin, INPUT);

  pinMode(buttonpin, INPUT);
  digitalWrite(buttonpin, HIGH); // enable internal pull-up
  pinMode(chordpin, INPUT);
  digitalWrite(chordpin, HIGH); // enable internal pull-up

  // Output pins
  pinMode(buzzerpin, OUTPUT);
  pinMode(piezopin, OUTPUT);

  progression = CFCGC;
  chordindex = 0;
  basson = true;
  basscounter = 0;
}

#define THRESHOLD 270

// Returns a value between lo and hi
// by taking the input and subtracting the threshold
int mapToRange(int in, int lo, int hi) {

  // Subtract threshold value
  int result = in - THRESHOLD;

  if (result > hi) return hi;
  else if (result < lo) return lo;
  else return result;
}

// Maps a value in [0, range) to one of the n note periods in the chord
int convertToTone(float value, float range, int* notes, int n) {

  // Clamp to the top note until we find something better
  if (value >= range)
    return notes[n-1];

  int i = (value / range) * n;
  return notes[i];
}

// Plays a note with the given period through the speaker on pin
void playTone(int period, int duration, int pin) {

  // Time between tone high and low
  int timeDelay = period / 2;

  for (long i = 0; i < duration * 1000L; i += period * 2) {
    digitalWrite(pin, HIGH);
    delayMicroseconds(timeDelay);
    digitalWrite(pin, LOW);
    delayMicroseconds(timeDelay);
  }
}

void playTwoTones(int period, int duration, int pin, int bassperiod, int basspin) {

  // Time between tone high and low
  int timeDelay = period / 2;

  for (long i = 0; i < duration * 1000L; i += period * 2) {
    if (buttonpressed) {
      digitalWrite(pin, HIGH);
    }
    if (basson) {
      digitalWrite(basspin, HIGH);
    }
    delayMicroseconds(timeDelay);
    if (basson) {
      digitalWrite(basspin, LOW);
    }
    if (buttonpressed) {
      digitalWrite(pin, LOW);
    }
    delayMicroseconds(timeDelay);
  }
}

// Not using this right now but keeping for reference
void playNote(char note, int duration) {

  char names[] = { 'c', 'd', 'e', 'f', 'g', 'a', 'b', 'C' };
  int tones[] = { 1915, 1700, 1519, 1432, 1275, 1136, 1014, 956 };

  // play the tone corresponding to the note name
  for (int i = 0; i < 8; i++) {
    if (names[i] == note) {
      playTone(tones[i], duration, buzzerpin);
    }
  }
}


void loop() {

  // Advance the bass beat: the bass sounds on one of every four ticks
  if (basscounter == 4)
    basscounter = 0;
  if (basscounter == 0)
    basson = true;
  else basson = false;
  basscounter++;

  // Advance to the next chord when the chord button is pressed
  if (digitalRead(chordpin) == LOW) {
    chordindex++;
    if (chordindex == 4) chordindex = 0;
  }

  // Play sounds if the button is pressed
  buttonpressed = (digitalRead(buttonpin) == LOW);
  if (buttonpressed)
    Serial.println("Button pressed!");

  // Analog Read
  int analog = analogRead(ypin);
  int tempo = analogRead(tempopin) + 10;

  // Debug: Print the analogRead value
  Serial.print("AnalogRead: ");
  Serial.println(analog);

  // Map the Y-axis value to the range [0, 135]
  float yValue = mapToRange(analog, 0, 135);

  // Calculate the tone (period value)
  int tone = convertToTone(yValue, 135, progression[chordindex], chordN);
  int basstone = progression[chordindex][0];

  // Debug: Print out the yValue and tone (period)
  Serial.print("yValue: ");
  Serial.println(yValue);
  Serial.print("tone: ");
  Serial.println(tone);

  // Actually produce some sound!
  playTwoTones(tone, 100, piezopin, basstone, buzzerpin);

  // Delay before next reading, scaled by the tempo knob
  delay(tempo);
}


Names: Joseph Bolling, Evan Strasnick, Jacob Simon, Xin Yang Yak

Group Number: 17

What we built: We created an alarm clock, aimed at heavy sleepers and chronic snoozers, that shuts off its buzz only when the user actually gets out of bed. It accomplishes this using a simple force-sensing resistor located under the bed that responds to the change in force when a human weight is lifted. Overall, we were delighted with how well the system functions, and how simple it was to implement such a practical feature that solves a real-world problem. We were dissatisfied, however, that the coding of the actual alarm had to be done by setting a variable in the source code of the Arduino (and that only one alarm could be set at a time). An actual implementation of the device would communicate with a real alarm clock (or alarm clock interface) to allow the user to more easily set and change alarms.
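The core wake-up test is a threshold comparison against a baseline reading taken when the alarm starts sounding. As a standalone sketch (the function name is our own, though the -50 threshold mirrors the source below):

```cpp
// True once the FSR reading has dropped at least 50 counts below the
// baseline recorded when the alarm began, i.e. the sleeper's weight
// has left the bed. (Illustrative helper; threshold from our source.)
bool weightLifted(int reading, int baseline) {
    const int releaseThreshold = -50;
    return (reading - baseline) <= releaseThreshold;
}
```

Comparing against a baseline rather than an absolute value means the alarm works regardless of the sleeper's weight or the sensor's placement.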



Our first design was of an insole that could be inserted into any shoe to add pedometer functionality. A force sensing resistor measured the number of steps, which could be uploaded via USB at the end of the day.

Our next design was of a device which used a thermistor to alert the user if (because of a malfunction or power outage) their refrigerator or freezer was at a temperature at which their food might spoil. In the event of a power outage, the user would know exactly when their food was no longer good.

Our final design, which we decided to implement, was the alarm clock which shuts off when the user actually gets out of bed.


Final System:

Here’s a video demonstration of our prototype. Since, to our surprise, we were unable to find a bed in the lab, our user demonstrated the system by falling asleep in a chair instead.

List of Parts:

  • 1 x Arduino UNO connected to power supply
  • 1 x Force sensing resistor
  • Approx. 5 feet of wire
  • 1 x 10 kΩ resistor
  • 1 x 330 Ω resistor
  • 1 x Buzzer

Instructions to recreate:

To construct our alarm clock, simply connect the parts as indicated in the diagram above.  The force-sensing resistor should be connected to the Arduino using two wires long enough to reach from the location of your alarm clock to the point where you’d like the force-sensing resistor to be located (we recommend near the center of your mattress or underneath one of the legs of your bed).  One of these wires should travel from the +5V pin on your Arduino, and the other should connect with a 10 kΩ resistor.  The other end of the 10 kΩ resistor should connect to ground.  Connect analog pin 0 on your Arduino to the junction between the force-sensing resistor and the 10 kΩ resistor. Then, connect the buzzer in series with a 330 Ω resistor between pin 3 and ground, set how long you’d like to sleep in our code (the “secondsUntilWake” parameter), and upload to your Arduino!
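The FSR wiring above forms a classic voltage divider: the FSR runs from +5V to a junction, the 10 kΩ resistor runs from the junction to ground, and analog pin 0 reads the junction. As a quick sanity check on why weight on the sensor raises the reading, here is a small C++ model of that divider (the 10-bit, 5 V ADC model matches the Arduino UNO; the example FSR resistances are illustrative, not measured):

```cpp
#include <cmath>

// Junction voltage of the divider described above: the FSR runs from
// +5 V to the junction, and a fixed 10 kOhm resistor runs from the
// junction to ground; analog pin 0 reads the junction.
double junctionVoltage(double fsrOhms, double fixedOhms = 10000.0,
                       double vcc = 5.0) {
    return vcc * fixedOhms / (fsrOhms + fixedOhms);
}

// What analogRead() would report: the UNO's 10-bit ADC, 5 V reference.
int adcCounts(double volts, double vcc = 5.0) {
    return (int)std::round(volts / vcc * 1023.0);
}
```

With no weight the FSR’s resistance is in the megaohm range, so the reading sits near zero; pressing it down into the kilohm range pushes the reading toward 1023. This is why the sketch can detect the user getting up as a sudden drop in the analogRead() value.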

Source Code:

/* COS 436 Lab 1
 * Joseph Bolling, Evan Strasnick, Jacob Simon, Xin Yang Yak
 * Force Sensing Alarm Clock
 */

int fsrAnalogPin = 0; // FSR is connected to analog 0
int buzzerAnalogPin = 3; // Buzzer is connected to pin 3

int fsrReading; // the analog reading from the FSR resistor divider
int fsrInitialRead; // the value of fsrReading when the alarm begins

int secondsLeft;
int releaseThreshold = -50; // drop in reading that counts as getting up
int buzzerTone = 50; // default buzzer pitch

// -------------------------------------------------------------------
// Set how long you'd like to sleep (in seconds) before the alarm sounds:
int secondsUntilWake = 5;
boolean alarmOn;

void setup(void) {
  secondsLeft = secondsUntilWake;
  alarmOn = false;
  pinMode(buzzerAnalogPin, OUTPUT);
}

void loop(void) {
  if (!alarmOn) {
    // Count down, one second at a time, until it's time to wake the user
    delay(1000);
    secondsLeft--;
    if (secondsLeft == 0) {
      alarmOn = true;
      markPressure(); // record the in-bed weight as a baseline
    }
  }
  else {
    // Buzz until the user's weight actually leaves the bed
    while (pressureOn()) {
      buzzOn();
      delay(100);
    }
    buzzOff();
    // Reset for the next alarm
    alarmOn = false;
    secondsLeft = secondsUntilWake;
  }
}

void markPressure(void) {
  fsrInitialRead = analogRead(fsrAnalogPin);
}

boolean pressureOn(void) {
  if ((analogRead(fsrAnalogPin) - fsrInitialRead) <= releaseThreshold) {
    return false;
  }
  else return true;
}

void buzzOn(void) {
  analogWrite(buzzerAnalogPin, buzzerTone);
}

void buzzOff(void) {
  analogWrite(buzzerAnalogPin, 0);
}

P1: EyeWrist

Evan Strasnick, Xin Yang Yak, Jacob Simon, Joseph Bolling


  1. Dielectric stimulation – Choreography is hard to communicate and record; teach it to dancers by stimulating their bodies directly.
  2. For anyone attempting hobbies, sports, or arts requiring tacit skills (e.g. skiing), a helmet could record the motions and perspectives of the master, and then the apprentice could have this perspective played back for them as they attempt the motions.
  3. Dynamically adapting screen that uses eye tracking to take advantage of foveal resolution / peripheral resolution and thereby optimize resources and space.
  4. Glasses or some form of headpiece that tracks direction of gaze to turn on lights only where a person is currently looking, to conserve energy (or just a cool social experiment)
  5. Bad habits tracker: a basic app combined with select hardware pieces could allow users to select their personal bad habits and collect statistics on how often they occur, as well as give advice (teeth-grinding, snoring, stuttering, etc.).
  6. A patch containing a variety of flex sensors could be applied to any of a number of target areas on patients with bad posture, helping to remind them with simple vibration when they sink into unhealthy positions.
  7. A flex-sensor based device to be placed on the neck, that stimulates (vibration, dielectric, etc.) students with a tendency to fall asleep during class.
  8. Speech recognition software that reads the user’s lips (and applies predictive software) instead of listening to an audio stream would allow users in public locations to have speech-to-text input without having to speak to their computer, or could serve as an enhancement to existing audio-based software, or could serve as input for disabled/blind patients.
  9. Door that is gesture-triggered, not just motion-triggered – automatic doors often open even when passersby don’t intend to pass through the door; this could be fixed with a more nuanced interface
  10. An automatic toilet that will never flush while you’re still using it. (heat or pressure sensors)
  11. Integrating an accelerometer component into a watch or wedding ring, allowing users to respond to common prompts instantly and effortlessly (answering/rejecting calls, turning off reminders, etc.)
  12. An alarm that wakes you up using gradually increasing light rather than sound. This not only allows users to wake up more naturally and peacefully, but does not disturb neighbors/roommates.
  13. Track REM periods through eyelids to monitor quality and duration of sleep states.
  14. For anyone who wishes to check time discreetly during class, work, a meeting, etc: a ring, bracelet or other touch-controlled item that can signal with vibrations
  15. Sensors (pressure sensors, proximity sensors) that can be placed around the house to automate the tasks a user must perform in various locations (e.g. turn off lights when pressure in bed registered after a certain time, open blinds by stepping in front of them, etc.)
  16. A device which monitors basic physiological readings (blood pressure, heart rate), and activates severity-determined anxiety prevention steps (from helpful text messages to emergency personnel alerts) to help patients with phobias
  17. 3D TV must be viewed from a specific angle to reach the full effect. We propose glasses that use accelerometers or eye tracking to constantly adjust the TV to the correct viewing angle in real time.
  18. Playing cards visible only to players – each player wears glasses which overlay images on specialized cards currently being held in front of them; other players are not shown these images.
  19. Splitscreen multiplayer that is player-specific, hiding other players’ screens
  20. A TV which allows family members to view different channels or programs at the same time, using polarized light and viewing glasses for example.
  21. For improved hygiene in public restrooms, an intelligent toilet that automatically raises or lowers its seat depending on user intention (penis identifier).
  22. Making it easier to observe Shabbat by automating common tasks using passive proximity technology.
  23. New types of trackpads which can distinguish and recognize individual fingers for advanced gestures – Modern trackpads are very accurate, but cannot identify which finger is being used; doing so would allow for much more nuanced interactions, shortcuts, etc. with a computer.
  24. Car safety (warning if you don’t check your mirrors and blind spots) – eye tracking software could be used to train new drivers to check their blindspots, one of the harder things to learn when driving.
  25. Phone mouse using visual and accelerometer input – The camera on a modern smartphone could be used with accelerometer data to let users accurately pan onscreen by moving their phone.
  26. Actively heated clothing – Clothing that is actively heated by warmed fluids or another mechanism would maintain body temperature in extremities during winter without adding too much bulk, and could be made smart through temperature sensors and shiver-detection.
  27. Study space noise tracker – It’s difficult to know which public spaces on campus are being used as social areas and which are available for quiet study at any given moment; this could be fixed with noise sensors and a web application.
  28. Dance-based music selection – We could solve the problem of poor party DJ-ing with a system that recognized the dances different dancers are performing and selects songs accordingly.
  29. Dance-controlled music synthesizer – Improvised dance performances are vastly improved by live musicians.  This device would produce a similar conversation between dancer and musician without the inconvenience of finding a live musician.
  30. Anti static phone charger – Harvesting the static electricity that builds up in clothing during winter could help charge small devices and solve a major annoyance.
  31. People who like to do gardening recreationally may not know what kinds of plants their garden soil/climate is suitable for. A device that monitors the soil and climate condition can help the planter figure out what types of plants are likely to do well in the garden.
  32. Dental sensors in dentures/retainers – Basic dental diagnostics such as bacterial populations, enamel health, and tooth position could be assessed using devices implanted in dentures and retainers
  33. Text input for severely disabled patients by touching their tongue to their teeth using a specialized retainer.
  34. A toothbrush that maps out where users reach (and don’t reach) while brushing, allowing them to improve their brushing habits and dental hygiene.
  35. Audiovisual recording buffer device – Allows you to retroactively record moments in your life or lecture by keeping a running buffer of the last few minutes. Allows users to never miss out on a moment of accidental hilarity, without the storage restraints of constantly recording the entire day
  36. Improve bad handwriting with a pen that actively corrects its user with counterbalanced weights.
  37. In some sports, it’s easier to spot bad form if you can see yourself in third-person view. Goggles that allow the athlete to see him/herself in third person while performing the action during athletic training might help to improve form.
  38. Monitoring the food that is being baked in the oven currently requires one to open the oven, but this reduces the oven temperature and requires that the chef always be nearby and alert. An oven camera which streams to a smartphone would allow constant monitoring without interfering with the baking.
  39. Smart microwave oven that both scans barcode of food and figures out how long to cook it, and also adjusts the microwaving plate using pressure sensors to always center food to cook evenly.
  40. It’s a hassle for hikers/athletes/patients who need to stick to a hydration plan to keep track of their fluid intake. A water bottle that tells you how much to drink would help.
  41. CPR instructor necklace/tattoo – A device worn by populations at risk for cardiac arrest could indicate to untrained rescuers how and where to apply chest compressions for CPR.
  42. Phone EKG patch/heart monitor – People could keep continuous track of their cardiac health by wearing small electrode devices that would communicate warnings to a cellphone or other handheld interface.
  43. Weight-lifting sensor gloves – Gloves with built in pressure sensors and accelerometers could track and deliver statistics on how much work is being done with each arm, to allow weightlifters to more accurately exercise their muscles.
  44. Customer heatmap for stores/attractions – A manager could track which areas or displays experience the most traffic by using pressure sensors or infrared cameras, and adjust product locations and inventory appropriately.
  45. Diagnostic toilet with fluid and fiber recommendations (tracks volume, frequency, and possibly chemical content)
  46. Eyeglasses that not only darken in bright light, but can accurately adjust to any desired level of light (using a liquid crystal layer, for example).
  47. Ctrl + F for physical books: so that a user could instantly find parts of a book, a piece of hardware with an equipped camera would be adjusted to rapidly flip through the pages of a book and photograph each one, using text-recognition software to compile a digital version within minutes.
  48. Kinect phantom limb therapy – Kinect could be used to display missing body parts in the mirror image of a user, helping ease phantom limb pain.
  49. “Intelligent mirror” – Kinect system + screen would allow users to virtually try on all of the clothes in a store’s inventory
  50. Kinect-based application that allows the user to turn their entire body into a musical instrument – e.g. control volume by opening mouth while playing out pitches with arm position – but user could select which parts of their body they wanted to control which elements of the music.
  51. Sonar glove (or infrared glove) – a glove which emits and records a high-frequency sound (or infrared beam), indicating the distance from an obstacle via vibrations and allowing blind users to detect obstacles in front of them.



After much discussion, we finally decided upon idea #51, the “sonar glove” (which may or may not actually employ sonar!). First and foremost, we believed that it would be a product that could completely change the way that the user base (the visually handicapped) faces its daily problems. By being something as inconspicuous as a glove, it would serve the same basic goal as a cane, but would draw much less attention and be far more convenient. By offering an analog scale of vibration frequency, users could know exactly how far away they were from an object, and if they wanted, they could even use more than one glove in order to scan even greater regions of space around them. Another advantage of this system is its design flexibility. While the idea was originally proposed as a glove, the same system could take the form of a more conventional cane (allowing the proprioceptive benefits of this more standard approach), or even be embedded in the user’s shoes. Similarly, we began by thinking in terms of sonar, but have already begun discussing the advantages and disadvantages of other approaches (e.g. infrared beams). We believe that this idea will allow us to make the most of the experience of iterative design, while realistically culminating in the creation of a prototype that could be nothing short of miraculous for its users.

Target User Group: Our main target user group is the blind and visually handicapped. Of course, there are many other possible applications for this technology (working in dark spaces, assisting patients recovering from eye surgery, etc.), but the blind face daily challenges that make even the most basic of tasks frustrating, not the least of which is simply navigating their environment. As evidenced by the fact that multiple coping strategies already exist (canes, seeing-eye dogs, etc.), safely traversing the world is a demanding task for the visually handicapped – one we believe would be much alleviated by constantly being able to sense the distance of obstacles. Of course, while this is a user group which is not terribly common on Princeton’s campus, we believe it would be perfectly feasible to get user feedback by reaching out to hospitals, assisted-living homes, or other members in the nearby community. We believe it is important for users who actually understand the struggles of a visual handicap to test the system, so we will not simply allow testing to consist of “closing one’s eyes.”

Problem Description: The visually handicapped often use canes or other aids in order to walk about without fear of tripping or walking into an unseen obstacle.  While effective, the cane provides a limited amount of information: the device does not alert the user to an obstruction until they are within a few feet of it, and the information refresh rate of the cane is limited by the speed with which the user can physically sweep it.  What’s more, the cane is noisy and physically intrusive, and can become entangled in objects or other pedestrians.  Our glove would serve instead of or in addition to a cane, and would provide more information while being less intrusive.

Technical platform: Our technological platform of choice is the Arduino. We chose this because our device needs to be portable and needs to be able to make use of input from the proximity sensors and to control the vibration motors on the glove. It also does not require much computational power. The Arduino would also make it easier for us to quickly iterate on technical parameters of the device such as the sensitivity of the proximity detectors or the intensity of the vibration motors to provide the optimal user experience.
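To make the “analog scale of vibration” concrete, the core of the sketch would be a mapping from measured obstacle distance to motor drive strength, expressed as an 8-bit PWM duty suitable for analogWrite(). A minimal sketch of that mapping follows; the 300 cm maximum sensor range and the linear ramp are our assumptions, to be tuned during iteration, not measured parameters:

```cpp
// Hypothetical distance-to-vibration mapping for the glove. Closer
// obstacles produce stronger vibration; anything beyond the assumed
// maximum sensor range turns the motor off entirely.
int vibrationDuty(double distanceCm, double maxRangeCm = 300.0) {
    if (distanceCm >= maxRangeCm) return 0;   // nothing in range: motor off
    if (distanceCm <= 0.0) return 255;        // touching: full strength
    double closeness = 1.0 - distanceCm / maxRangeCm; // closer -> larger
    return (int)(closeness * 255.0);          // scale to 8-bit PWM duty
}
```

On the device itself, this value would be written to the vibration motor pin each time the proximity sensor produces a new reading; swapping the linear ramp for, say, a logarithmic one would change only this one function, which is exactly the kind of rapid parameter iteration the Arduino platform makes easy.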