The Elite Four – Final Project Documentation

The Elite Four (#19)
Jae (jyltwo)
Clay (cwhetung)
Jeff (jasnyder)
Michael (menewman)

Project Summary

We have developed the Beepachu, a minimally intrusive system to ensure that users remember to bring important items with them when they leave their residences; the system also helps users locate lost tagged items, either in their room or in the world at large.

Our Journey

P1: http://blogs.princeton.edu/humancomputerinterface/2013/02/22/elite-four-brainstorming/
P2: http://blogs.princeton.edu/humancomputerinterface/2013/03/11/p2-elite-four/
P3: http://blogs.princeton.edu/humancomputerinterface/2013/03/29/the-elite-four-19-p3/
P4: http://blogs.princeton.edu/humancomputerinterface/2013/04/08/the-elite-four-19-p4/
P5: http://blogs.princeton.edu/humancomputerinterface/2013/04/22/the-elite-four-19-p5/
P6: http://blogs.princeton.edu/humancomputerinterface/2013/05/06/6008/

Videos

Photos

Changes Since P6

  • We added sound to the prototype’s first function: alerting the user if important items are left behind when s/he tries to leave the room. The system now plays a happy noise when the user opens the door with tagged items nearby; it plays a warning noise when the user opens the door without tagged items in proximity.

  • We added sound to the prototype’s item-finding feature. At first, we had used the blinking rate of the two LEDs to indicate how close the user was to the lost item. We improved that during P6 by lighting up the red LED when the item was completely out of range, and using the green LED’s blinking rate to indicate proximity. We now have sound to accompany this. The speaker only starts beeping when the lost item is in range, and the beeping rate increases as the user gets closer to the lost item.

By adding sound to the prototype, we made our system better able to get the user's attention; after all, it is easy to overlook a small, flashing LED, but it is much harder to both overlook the LED and ignore beeping. For the second function, sound also lets the user operate the prototype without taking their eyes off their surroundings, improving their ability to scan visually for missing items.
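
Concretely, the beep-rate mapping reduces to a function from signal strength to beep interval. The sketch below is illustrative plain C++ rather than our Arduino code, and the 0–100 strength scale and the 1000 ms/100 ms interval endpoints are invented placeholders, not the prototype's values:

```cpp
#include <algorithm>

// Hypothetical mapping from received signal strength (0 = out of
// range, 100 = right on top of the tag) to a beep interval in ms.
const int SILENT = -1;  // sentinel: speaker stays quiet

int beepIntervalMs(int signalStrength) {
    if (signalStrength <= 0) return SILENT;  // beeping starts only in range
    signalStrength = std::min(signalStrength, 100);
    // Linear interpolation: the weakest in-range signal beeps once per
    // second; the strongest beeps ten times per second.
    return 1000 - (signalStrength - 1) * 900 / 99;
}
```

On the Arduino side, the main loop would simply wait `beepIntervalMs(strength)` between tone() calls, staying silent on the sentinel.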

Goals and Design

Our primary goal has not changed since the beginning of the semester. Our prototype’s main function, as before, is to save users from their own faulty memories by reminding them not to forget important items when they leave their residences. However, our prototype has evolved to include a related secondary goal/task, which is finding items once they are already lost/forgotten. Because our hardware already incorporated item-detection via RFID, this was a natural extension not only of our stated goals, but also of our hardware’s capabilities.

Critical Evaluation

Our work on this project suggests that a system such as ours could become viable in the real world. The feedback we received about the concept shows that it has the potential to become a valuable product. We were able to address most of our testers' interface concerns with little difficulty, but hardware-based issues remained. With improved technology and further iterations, we could create a genuinely useful real-world system.

The primary issue with the feasibility of developing this project into a real-world system comes from hardware and cost constraints, rather than user interest. The current state of RFID presents two significant issues for our system. First, in order to function optimally, the system needs a range achievable only with active RFID or extremely sophisticated passive RFID. That level of passive RFID would be unreasonable in our system due to its astronomical cost (many thousands of dollars). Active RFID, which our prototype used, is also quite expensive but feasible; its primary issue is that the sizable form factor of most high-quality transmitters does not allow for easy attachment to keys, phones, wallets, etc. Ideally, our system would use high-powered yet affordable passive RFID, but that technology currently appears to be unavailable. EAS, the anti-theft system commonly used in stores, is another feasible alternative, but its high cost is also prohibitive.

Moving Forward

As stated in the previous section, the best path forward would be to improve the hardware, specifically by switching to more expensive, high-powered passive RFID. Other avenues of exploration include refining range detection for our first task, which would become increasingly important with more powerful RFID detection systems, and implementing tag syncing. Range limiting matters because, if our system can detect tags from a significant distance, it must not give a "false positive" when a user has left items elsewhere in a relatively small room but does not have them at the door. Syncing of tagged items would become important for a system with multiple tags; it would allow users to intentionally leave certain items behind, or allow multiple residents of a single room/residence to have different "tag profiles." Syncing could also permit detection of particular items, which would allow greater specificity for our second and third tasks. Finally, for most of our testing we used laptops as a power source for the Arduino. This was fine for the prototype, but a real version of this product would ideally have its own power source (e.g., a rechargeable battery).
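
The "tag profiles" idea can be sketched with a small helper that, on a door-open event, reports which of a resident's required tags were not detected. The profile contents and string tag IDs here are invented for illustration; our prototype used a single tag:

```cpp
#include <set>
#include <string>
#include <vector>

using TagId = std::string;

// On a door-open event, return the tags in this resident's profile
// that the reader did not detect near the door.
std::vector<TagId> missingTags(const std::set<TagId>& profile,
                               const std::set<TagId>& detected) {
    std::vector<TagId> missing;
    for (const TagId& tag : profile)
        if (detected.count(tag) == 0)
            missing.push_back(tag);  // required but not seen
    return missing;
}
```

Intentionally leaving an item behind would then be a temporary removal from the profile rather than a hardware change.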

Code

https://drive.google.com/folderview?id=0B5LpfiwuMzttMHhna0tQZ1hXM0E&usp=sharing

Third Party Code

Demo Materials

https://drive.google.com/folderview?id=0B5LpfiwuMzttX2ZPRHBVcmlhMUk&usp=sharing

P6 – The Elite Four

The Elite Four (#19)
Jae (jyltwo)
Clay (cwhetung)
Jeff (jasnyder)
Michael (menewman)

Project Summary

We have developed a minimally intrusive system to ensure that users remember to bring important items with them when they leave their residences; the system also helps users locate lost tagged items, either in their room or in the world at large.

Introduction

The system we will be testing is a prototype with the basic functionality described in the project summary above. The system is able to detect when a user leaves the room, alert him/her if tagged items are missing (task 1), and help him/her find the items either inside (task 2) or outside (task 3) of his/her room. For our tests, we will have our users perform each of the three primary tasks. The goal of these tests is to ensure that our prototype is an intuitive and effective system for the average user.

Implementation and Improvements

P5 Blog Post: http://blogs.princeton.edu/humancomputerinterface/2013/04/22/the-elite-four-19-p5/

We have not made any changes to our working prototype since submitting P5. However, we certainly plan on improving and adding features before the final submission, focusing on the feedback we receive from our participants.

Method

i. Participants:

Participants were randomly selected from Terrace Club. At the time of testing (during the afternoon), students working in the dining room were approached and asked to participate. Among the participants were a senior MOL major, a senior COS major, and a senior HIS major. Each participant was a member of our target user group, as all lived in on-campus dormitories with automatically locking doors. More specific demographic information can be found in the questionnaire data within the Appendix.

ii. Apparatus:

We conducted the tests using our prototype. The current prototype uses an Arduino Uno to control LEDs as well as our RFID receiver and transmitter. These components are connected with a breadboard and jumper wires, along with miscellaneous items like electrical tape. We conducted our tests at Terrace Club, which was not a dorm room per se but sufficed because it has doors.

iii. Tasks:

Users should be able to perform three tasks with our prototype. The first task (easy) is to identify when a door has been opened and alert the user if s/he tries to leave without tagged item(s). The second task (medium) is for the system to help the user locate lost tagged items in his/her own room. Our final task (hard) is to help the user locate lost tagged items outside of his/her room. This task is very similar to the second from the system's point of view, but for the user it is far more difficult, since the area in which the lost item(s) might be is much larger.

iv. Procedure:

We conducted the study by setting up in a semi-public area and asking random students if they would like to participate. At that point, we showed them the consent form, asked them to perform our prototype’s three tasks, and had them fill out the demographic and post-task questionnaires.

Test Measures

For task 1, we had each participant leave the room 10 times with the tagged item and 10 times without. The resulting raw data can be found in the Appendix. Our system identified the item 70% of the time when the user carried it. It is important to note that the RFID transmitter was generally placed in the user's sweatshirt or pants pocket, which appeared to interfere with the signal strength. We also had User 2 experiment with the left vs. right pocket of his sweatshirt, since the sensor was on his right side in our setup: in his right pocket, our system was 5 for 5, but in his left pocket, it was only 2 for 5. We also tested our system without the tagged item to check for false positives. For Users 1 and 3, we varied where the tag actually was during this test. Since a college dorm room isn't very big, there were realistic scenarios where the transmitter was in a pants pocket on the floor just a few feet from the door, and these were the situations where false positives occurred. This will be a difficult problem to solve.
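
For concreteness, the aggregate rates follow directly from the per-user counts in the appendix (21/30 = 70% correct with the tag; 26/30 ≈ 87% correct without). A small host-side sketch of that arithmetic:

```cpp
// Totals correct trials over all trials across users; the counts fed
// in come from the Task 1 table in the Appendix.
struct TrialCounts { int right; int wrong; };

double successRate(const TrialCounts users[], int n) {
    int right = 0, total = 0;
    for (int i = 0; i < n; i++) {
        right += users[i].right;
        total += users[i].right + users[i].wrong;
    }
    return static_cast<double>(right) / total;
}
```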

For tasks 2 and 3, we measured the time required to find an object in each of 3 preset hidden locations. We kept the test unbiased by having one group member hide the transmitter while a different group member, who did not know the hiding spot, followed our participant as he or she tried to locate it. Throughout all 3 tasks, we made sure to note any qualitative comments or suggestions participants made.

In addition, we had each participant complete a more quantitative response form, in which they rated the intuitiveness and effectiveness of our system during each task.

Results and Discussion

Users were not satisfied with the current alert system, which uses only red and green LEDs to inform the user of tag presence. This proved inadequate for all of our users, who were somewhat unaware of the LEDs' presence during all three tasks. For the first task, users would often be past the door before they could see the LED light up, especially when they simulated being in a rush. For the second and third tasks, users had to spend a lot of time looking down at the LEDs to see how quickly they were blinking, and since they also had to watch where they were going, they would sometimes miss important changes in blinking speed. As a solution, we will be adapting our prototype to add audio notifications as well. We hope that this change will make our system's alerts more intuitive to the user.

There were also some rather severe usability issues with the item-finding feature, which had trouble responding quickly and accurately to distance changes. This is a result of our RFID transmitter having a weak signal and only transmitting every ~2.4 seconds, which meant that proximity updates to the user were too slow and caused confusion. To remedy this issue, we will be adding an antenna to increase signal strength and reduce the time between received transmissions. This means faster updates for our user and a much more usable system.
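
A software-side complement to the antenna (a sketch of an idea, not something the prototype implements) would be smoothing the sparse readings with an exponential moving average, so that a single noisy sample cannot jerk the proximity display around; the 0.5 weight used in the example is a placeholder to tune:

```cpp
// Exponential moving average over signal-strength readings that
// arrive only every couple of seconds.
class SmoothedStrength {
public:
    explicit SmoothedStrength(double alpha) : alpha_(alpha) {}

    // Feed in a new raw reading; returns the smoothed estimate.
    double update(double reading) {
        if (!initialized_) {
            value_ = reading;  // first sample seeds the average
            initialized_ = true;
        } else {
            value_ = alpha_ * reading + (1.0 - alpha_) * value_;
        }
        return value_;
    }

private:
    double alpha_;            // weight of the newest reading
    double value_ = 0.0;
    bool initialized_ = false;
};
```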

Users were also confused by the use of the LEDs while using the finding feature. In the current prototype, both red and green LEDs flash at varying rates depending on tag proximity, and users complained that they were unsure whether the tag was merely far away or fully out of range. To fix this, we now use the green LED to display proximity and the red LED to indicate that the tag is out of range. One user even suggested including more LEDs to indicate proximity: for example, the system would flash red if the tag is out of range, orange if the tag is at the edge of its range, yellow if it is near the middle of its range, and green if it is very close. After implementing and testing the green-for-proximity, red-for-out-of-range scheme, we decided that users are satisfied with it and that we do not need to add more LEDs.
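
The two-LED scheme reduces to a function of signal strength: solid red when out of range, otherwise a green blink interval that shrinks as the signal strengthens. The in-range threshold and the 800 ms/100 ms interval endpoints below are invented placeholders, not our prototype's measured values:

```cpp
#include <algorithm>

struct LedState {
    bool redOn;        // true: tag fully out of range
    int greenBlinkMs;  // green blink interval; 0 while red is on
};

LedState ledFor(int signalStrength, int inRangeThreshold) {
    if (signalStrength < inRangeThreshold)
        return {true, 0};  // solid red: out of range
    int s = std::min(signalStrength, 100);
    // Stronger signal (closer tag) -> shorter interval -> faster blink
    return {false, 800 - (s - inRangeThreshold) * 700 /
                             std::max(1, 100 - inRangeThreshold)};
}
```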

Another issue users observed was that the tag was not always detected by our system when the door was opened. This occurred when the tag was covered (e.g., in a pants pocket or backpack), and it is a severe issue for us. It is caused by the weak signal strength of the transmitter, so the new antenna we will add should solve this problem as well. We will also need to tighten the signal-strength threshold at which the tag is considered found, since the stronger signal from the antenna means the tag could be detected from further away, which could create more false positives.
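
One detail worth noting when retuning that threshold: a single cutoff makes the tag flicker between "found" and "not found" right at the boundary. A common fix is hysteresis, i.e. two thresholds, one to enter the "present" state and a lower one to leave it. A sketch with placeholder values:

```cpp
// Debounced tag presence: the signal must climb above `enter` to count
// as present, and fall below `exit` to count as absent again.
class TagPresence {
public:
    TagPresence(int enter, int exit) : enter_(enter), exit_(exit) {}

    bool update(int signalStrength) {
        if (present_ && signalStrength < exit_)
            present_ = false;
        else if (!present_ && signalStrength > enter_)
            present_ = true;
        return present_;
    }

private:
    int enter_, exit_;  // enter_ should sit above exit_
    bool present_ = false;
};
```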

Appendices

Consent Form:

https://docs.google.com/document/d/16-kftTGzhnIGb6mRQMm_JerSnB0mmMQOlzKc81en9zk/edit?usp=sharing

Testing Script:

https://docs.google.com/document/d/13EEWWEgj6BAFCx76ZjiYTZKztoZ2mT1o5rwjDMgUgMo/edit?usp=sharing

Demographic Questionnaire:

https://docs.google.com/a/princeton.edu/forms/d/1MKLK28t85aLUPZSDv9xda7W7-b3OBaLaoA1kpgEPJ70/viewform

Demographic Results:

https://docs.google.com/spreadsheet/ccc?key=0AoQrPYxYPLESdEg1ZFFCWTBlam5fcFJWZ3AyZ20tZEE&usp=sharing

Post-Task Questionnaire:

https://docs.google.com/forms/d/1MK0oBRswYdxR7aZUiQvuD4YhM6Okxxce1kqyCGqxBg0/viewform

Post-Task Results:

https://docs.google.com/spreadsheet/ccc?key=0ApLpfiwuMzttdHVIV3VPRUkxS01vUFpJUW93RXZUMHc&usp=sharing

Task 1 Data:

         With tagged item (10 times)   Without tagged item (10 times)
User 1   7 right, 3 wrong              8 right, 2 wrong
User 2   7 right, 3 wrong              10 right, 0 wrong
User 3   7 right, 3 wrong              8 right, 2 wrong

Task 2 Data:

         Location 1           Location 2   Location 3
User 1   Time limit reached   2:26         Time limit reached
User 2   1:18                 2:18         1:59
User 3   1:23                 2:04         0:40

 

Assignment 2 – Clay Whetung

Observations

I performed two rounds of observation: one in the ten minutes before a precept for URB201, and the other in the ten minutes before URB201 lecture. I decided it would be beneficial to observe the two primary class "types" at Princeton and to consider any observations that appeared in both. URB201 lecture occurs at 1:30pm on Tuesdays and URB201 precept occurs at 1:15pm on Thursdays. During lecture I observed my professor (the lecturer) and two students, one who had a laptop and another who had not brought theirs to class. I observed the following:

URB201 Lecture:

Professor:

  • Spent a brief amount of time (~2 minutes) preparing the PowerPoint presentation for lecture
  • After the projector was prepared, the professor spent about 5 minutes conversing casually with students seated in the front rows
    • This was particularly interesting, as students who were not in the first few rows had no contact with the professor
  • The remaining time was spent talking with the preceptor who was present
  • After the professor had set up her lecture slides, the first slide was displayed on the screen, allowing students to view the start of the lecture
  • Before the start of class, the professor closed the main entrance to the lecture room

Laptop Student:

  • Student spent the entirety of pre-lecture time on their laptop
  • They performed simple procrastination tasks (e.g., Facebook, ESPN, etc.)
  • They were seated far from the front row (the very last row)
  • No contact was made between the professor (or the preceptor) and the student

Non-Laptop Student:

  • This student appeared to be doing readings for a class before lecture (unsure whether they were for URB201 or not)
  • He also spent a brief time conversing quietly with another student who sat next to him
  • The student checked their phone periodically

For the URB201 precept observations, I took notes on the actions of the preceptor and one other student.

URB201 Precept

Preceptor:

  • Preceptor spent ~2 minutes organizing papers, silently
  • She then began to engage some of the students in brief discussions about the class (e.g., whether the blog post due time was working)
  • The remaining time before precept was spent engaging students in casual conversation

Student:

  • The student was on his laptop throughout the time before precept (and during it)
  • Was procrastinating on the internet (Facebook, Reddit, etc.)
  • Did not converse with the preceptor

Overall Observations

  • Students seemed less inclined to engage others when their laptops were available
  • The professor and preceptor seemed very willing to engage students when it was possible
  • Physical distance made communication between people much less likely to occur
  • The rooms were mostly quiet and speech tended to be hushed
  • Many students appeared to have smartphones available

Brainstorm

 

  1. A live forum, similar to Piazza, is projected in front of the class. Students can log in and ask questions that will be answered in real time by the professor.
  2. Chat room for the students in the class to procrastinate together. Large chat room to post pictures of cats, or discuss the class possibly.
  3. A web space where students can log on and make plans for their next meal (since students often meal exchange, and classes usually occur before lunch or dinner). It would also be able to track the email exchanges so that students remember to take the meals they planned.
  4. An application that will allow students to mark e-mails that they didn’t have time to respond to, and reminds them to answer them when they have free time before class
  5. 1 vs 100 style game that students can log into before class starts. The game is played with trivia questions from the previous lecture (basically a trivia game where those who answer incorrectly are eliminated). If the “1” player (a random student) wins, then they may be given some reward.
  6. A twitter style feed where students can log in and post notifications to other students in the class. This can be used to post questions about the class, or to find study groups (i.e. “Hey, anyone want to work on this problem set Wednesday night? cwhetung@”)
  7. An arcade system that students (and professors) can log into from their laptops with their netIDs. There would be a selection of simple games (light bikes, Tetris, Pong, etc.) that students can play against each other. It would have chat to help facilitate communication between players and help members of the class get to know each other.
  8. A quick polling system that the professor can use to poll students 10 minutes before class. The poll would choose among several short (~6 minute) lectures on non-class material; the slides for the winner would be automatically shown and the professor would deliver the short lecture.
  9. A Pokémon-style game that can only be played against other students in the same class; each student is given a random starter Pokémon at the start of the semester. Since play is limited to the time before class, students are encouraged to arrive early to level their Pokémon.
  10. System that students log into as they arrive to precept, a student who is at precept is randomly selected to give a brief analysis of their thoughts on the week’s readings before class begins.
  11. An application that selects a random passage from the week’s reading and projects it before the start of precept. Then students are encouraged to discuss that small slice of the week’s information
  12. A webpage that students can log into: a stream of webpages that other students in the class have found interesting. It operates as a passive Reddit, where the user doesn’t need to take action, as the information is fed to them automatically.
  13. An app that can quickly calculate the time to walk between locations on campus. Can be used to ensure that you don’t arrive early to class and have to wait around!
  14. Before class begins, have last class’s lecture slides repeat on screen as a quick reminder to the students
  15. Digital doodle board that allows students to draw together
  16. Students can sign up online for an off-topic presentation before class (e.g., sing a short song, do a jig)
  17. Have an online poll that would allow students to say what they did/did not like about previous lectures for on the fly improvements

Chosen Ideas

Online Arcade: Gives students a way to have fun, relax and interact with each other easily before class.

Short Lecture Poll: A great way to learn more about a professor’s studies outside of the topic of the class.

Prototyping

Online Arcade:


Short Lecture Poll:


 

Feedback

Feedback was gathered from three testers: Paulius Paulaskas ’13 (ORFE), Mengou Zho ’13 (WWS), and Eric Penalver ’13 (CBE). I presented each of them with the welcome splash and informed them that it was an activity to be done in the ten minutes before class began.


  • It was unclear what the “Play Again” button was for
  • It was unclear whether it was a touch interface or a desktop interface
  • There were no instructions for the games
  • Users couldn’t log out, except after a game
  • Users would find it useful to see their records at other times as well
  • Users would like to be able to choose whom to play against in class
  • The ability to see classmates’ records was highly requested
  • It was unclear why a class had to be selected
  • Users were unsure whether they could leave a game in the middle
  • Users would like the ability to chat
  • Users would enjoy a friendly form of procrastination before class

Insights:

  • It is extremely important to make clear to your testers what they are experiencing
  • Students would like a chance to relax before class, rather than work
  • Users enjoyed the social aspect, but would like it to be more pronounced
  • There were aspects users expected, such as a log out or leave game button, that weren’t present
  • It would be beneficial to provide users with some analogue of the input tools they have available, such as giving them a keyboard that isn’t attached to anything
  • Users liked the competition aspect, but it should be made clearer whom they are competing with, why they are competing, and where the competition stands

 

Strength Testing Game

Group Members

Jeff Snyder (jasnyder@)
Clay Whetung (cwhetung@)
Michael Newman (menewman@)
Neil Chatterjee (neilc@)

Description

We built a “strongman game” that uses a flex sensor to measure a player’s “digital” strength. As the player flexes the sensor, the amount of flex is categorized into one of five strength levels, and we use three LEDs and a speaker to demonstrate which level has been reached. Within each level, the brightness of the brawniest LED reached is varied with pulse-width modulation to match the player’s exertions. At the lowest level, none of the LEDs are lit and the speaker produces no noise. At the next-lowest level, the green LED lights up. At the middle level, the green and yellow LEDs light up. At the second-highest strength level, the green, yellow, and red LEDs light up. At max strength, all three LEDs light up and celebratory music plays from the speaker. We chose to build this because we wanted a way to demonstrate our finger-flexing prowess, and the game’s cheerful blinking lights and victory song brightened our day while affirming our digital swole. In general, the project was a success, and we were especially pleased with the jingle. For future improvement we might want to change the kind of sensor being used (for example, a pressure sensor that measures the force from the strike of a comically large hammer) and/or the number of lights and tunes available (and consequently, the number of strength categories available).

Video

Arduino Strength Tester

Each Strength Level

Strength Level One

Strength Level Two

Strength Level Three

 

Design Sketches and Final Build

First Design – Night Light

Second Design – Clapping Binary Counter

Third Design – Strength Tester

Final Strength Tester Design

 

Parts Used

– 3 LEDs (green, yellow, red)
– 1 speaker
– 1 flex sensor
– 1 Arduino
– jumper wires
– 2 breadboards
– 3 330-ohm resistors
– 1 100-ohm resistor
– 1 10k-ohm resistor
– 1 laptop with USB cable
– 1 wax paper diffuser

Instructions

1. Set up the LEDs in a line on a breadboard next to the flex sensor. Place the speaker across the center divider of the other breadboard. You may find it helpful to connect ground and +5V to the power rails of the breadboard for the following steps.

2. Make the following connections:
– Connect the anode of the red LED to pin 6 of the Arduino, the anode of the yellow LED to pin 10, and the anode of the green LED to pin 11.
– Connect the cathode of each LED to ground via a 330 ohm resistor.
– Connect one pin of the speaker to pin 8 and the other to ground via a 100 ohm resistor.
– Connect one side of the flex sensor to a +5V pin.
– Connect the other side both to A0 and to ground via a 10 kilo-ohm resistor.

3. Check the values output by the flex sensor using the serial monitor and the Serial.println() function. In the code, change the FLEX_MAX and FLEX_MIN values as appropriate.

4. Mount the wax paper diffuser in front of the three LEDs.

5. Test your brawn!
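
To sanity-check step 3's calibration off-device, the stage arithmetic from the source code below can be reproduced in plain C++ (mapRange mirrors Arduino's integer map()). Note that the raw mapping can reach NUM_STAGES - 1 = 5, one past the highest defined LED state, so readings at or beyond FLEX_MAX need clamping in practice:

```cpp
// Placeholder calibration values from the sketch; yours will differ.
const int FLEX_MIN = 150;
const int FLEX_MAX = 348;
const int NUM_STAGES = 6;

// Same integer arithmetic as Arduino's map(): linear rescale with
// truncating division, no clamping of out-of-range inputs.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

int rawStageFor(int flexValue) {
    return static_cast<int>(
        mapRange(flexValue, FLEX_MIN, FLEX_MAX, 0, NUM_STAGES - 1));
}
```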

Source Code

/* 
  Authors: jasnyder, cwhetung, menewman, neilc
  Date: 2/11/2013
  COS 436 Lab L0: The Strength Test

  Test the user's strength with a flex sensor. Display the
  results to LEDs and play them a tune if they are truly
  powerful.
*/
#include "pitches.h"

const int FALSE = 0;
const int TRUE = 1;

//  Input / Output Constants
const int FLEX_MIN = 150;  //  set me as appropriate!
const int FLEX_MAX = 348;  //  set me as appropriate!
const int NUM_STAGES = 6;
const int STAGE_SIZE = ((FLEX_MAX - FLEX_MIN) / NUM_STAGES);
const int ANALOG_MAX = 255;

//  Possible States
const int SONG = 0;
const int ALL_ON = 1;
const int TWO_ON = 2;
const int ONE_ON = 3;
const int ALL_OFF = 4;

//  Musical Constants
const int NOTE_DELAY = 300; // (ms)
const int NOTE_DUR = 250;

//  Pin Connection Constants
const int green = 11;
const int red = 6;
const int yellow = 10;
const int flex = A0;
const int speaker = 8;

//  Variables
int flexvalue = 0;  //  value returned by flex sensor
int stage = 0;  //  current state of LEDs / sensor
int play = 0;  //  have we already played the tune?
int pwmout = 0;  //  dimming value

//  Configure the LED pins as outputs
void setup()
{
  pinMode(green, OUTPUT);
  pinMode(red, OUTPUT);
  pinMode(yellow, OUTPUT);
}

void loop() {

  //  Grab the bend value, clamp it to the calibrated range, and map
  //  it to one of the stages (readings outside the range would
  //  otherwise produce a stage that matches no state below)
  flexvalue = constrain(analogRead(flex), FLEX_MIN, FLEX_MAX);
  stage = constrain(map(flexvalue, FLEX_MIN, FLEX_MAX, 0, NUM_STAGES - 1),
                    SONG, ALL_OFF);

  //  Within each stage, dim the "last" LED to the approximate
  //  progression through the stage 
  pwmout = (flexvalue - FLEX_MIN) % STAGE_SIZE;
  pwmout = map(pwmout, 0, STAGE_SIZE, 0, ANALOG_MAX);

  //  Turn all LEDS on and play the song once
  if (stage == SONG) {
    digitalWrite(green, HIGH);
    digitalWrite(yellow, HIGH);
    digitalWrite(red, HIGH);

    //  If we have already played the song, do nothing
    if (play == FALSE)
    {
      playACongratulatoryTune();
      play = TRUE;
    }
  }

  //  All on, red variable
  if (stage == ALL_ON) {
    play = FALSE;  //  reset the song
    digitalWrite(green, HIGH);
    digitalWrite(yellow, HIGH);
    analogWrite(red, pwmout);
  }

  //  Green and yellow on, yellow variable
  if (stage == TWO_ON) {
    play = FALSE;
    digitalWrite(green, HIGH);
    analogWrite(yellow, pwmout);
    digitalWrite(red, LOW);
  }

  //  Green on and variable
  if (stage == ONE_ON) {
    play = FALSE;
    analogWrite(green, pwmout);
    digitalWrite(yellow, LOW);
    digitalWrite(red, LOW); 
  }

  //  All leds off
  if (stage == ALL_OFF) {
    play = FALSE;
    digitalWrite(green, LOW);
    digitalWrite(yellow, LOW);
    digitalWrite(red, LOW); 
  }
}

//  Play a classic little ditty!
void playACongratulatoryTune() {
  tone(speaker, NOTE_G4, NOTE_DUR);
  delay(NOTE_DELAY);
  tone(speaker, NOTE_C5, NOTE_DUR);
  delay(NOTE_DELAY);
  tone(speaker, NOTE_E5, NOTE_DUR);
  delay(NOTE_DELAY);
  tone(speaker, NOTE_G5, NOTE_DUR);
  delay(NOTE_DELAY*2);
  tone(speaker, NOTE_E5, NOTE_DUR);
  delay(NOTE_DELAY);
  tone(speaker, NOTE_G5, NOTE_DUR*4);
  delay(NOTE_DELAY*5); 
}