P5- Group 4

Basic Information (a,b,c)

Group Number: 4
Group Name: TFCS
Group Members: Farhan, Raymond, Dale, Collin
Project Description: We are making a hardware platform which receives and tracks data from sensors that users attach to objects around them, and sends them notifications, e.g. to build and reinforce habits.

Tasks (d,e)

For our easy task, we plan to track trips to the gym using the accelerometer built into our sensor tags. This task should be relatively easy, as the tracker only needs to detect accelerometer motion to trigger an event. For our medium task, we will track a user’s medication usage by tagging a pill box. This task is of medium difficulty because only certain changes in measured data actually correspond to task-related events; we have to filter our data to find out when events have really occurred. Finally, for our hard task, we hope to track the user’s reading habits by tagging textbooks. This will be the hardest task for us to implement because of the complexity of pairing a magnet and magnetometer on an arbitrary book in a way that is intuitive to the user. We also have to filter our data to trigger events only in the appropriate cases, just as in our medium task.

Our team has left these tasks fundamentally unchanged since P3. We found in P4 that user complaints focused almost exclusively on interface design and aesthetics; the overall use case received promising feedback from users. Ultimately, since our initial idea of tracking user habits remained the same, we kept the well-specified user interactions we had been using before.

Revised Interface (f)

In our P4 user test, we found that users had difficulty grasping the complexity of our notification system. We proposed that a simpler model would be a countdown timer, which is reset by the user interacting with an object. For P5, we prototyped such a system, both because we anticipated that it would be interesting to test with users in P6 and because, being technically simpler, it would get us to a state where we could start tracking tasks sooner. As a result, we removed most adjustable controls, so that a user only chooses the task name and task interval that they want each sensor to monitor. This has also allowed us to dedicate more screens to walking the user through configuring the system.
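
The countdown model can be reduced to a single per-task deadline that is pushed back whenever the object's sensor fires. The sketch below is a minimal C++ illustration of that idea; the `TaskTimer` name and the plain integer clock are hypothetical, not taken from our actual app:

```cpp
// Minimal sketch of the countdown-timer model (illustrative only):
// each tracked task keeps one deadline, reset on every sensor event.
struct TaskTimer {
    long intervalSec;   // user-chosen task interval
    long deadline;      // when the next reminder becomes due

    TaskTimer(long interval, long now)
        : intervalSec(interval), deadline(now + interval) {}

    // Sensor detected the user interacting with the tagged object.
    void recordInteraction(long now) { deadline = now + intervalSec; }

    // True once the interval has elapsed with no interaction.
    bool reminderDue(long now) const { return now >= deadline; }
};
```

For example, a task with a 100-second interval created at time 0 is not yet due at time 50, and an interaction at time 60 pushes the deadline out to time 160.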

Continue reading

P3

Team TFCS: Dale Markowitz, Collin Stedman, Raymond Zhong, Farhan Abrol

Mission Statement

In the last few years, microcontrollers finally became small, cheap, and power-efficient enough to show up everywhere in our daily lives — but while many special-purpose devices use microcontrollers, there are few general-purpose applications. Having general-purpose microcontrollers in the things around us would be a big step toward ubiquitous computing and would vastly improve our ability to monitor, track, and respond to changes in our environments. To make this happen, we are creating a way for anyone to attach Bluetooth-enabled sensors to arbitrary objects around them, which track when and for how long those objects are used. Sensors will connect to a phone, where logged data will be used to provide analytics and reminders for users. This will help individuals maintain habits and schedules, and allow objects to provide immediate or delayed feedback when they are used or left alone.

Because our sensors will be simple, a significant part of the project will be creating an intuitive interface for users to manage the behavior of objects, e.g. how often to remind the user when they have been left unused. To do this, Dale and Raymond designed the user interface of the application, including the interaction flow and screens, and described the actual interactions in the writeup. Collin and Farhan designed, built, and documented a set of prototype sensor integrations and use cases, based on the parts that we ordered.

Document Prototype

We made a relatively detailed paper prototype of our iOS app in order to hash out which components need to go in the user interface (though not necessarily how they will be sized or arranged, which will change) as well as which specific interactions could be used in the UI. We envision that many iOS apps could use this sensor platform if it were opened up; this one will be called Taskly.

Taskly Interface Walkthrough

Taskly Reminder App

Below, we have created a flowchart of how our app is meant to be used. (Right-click and open it in a new tab to zoom.)

Here we have documented the use of each screen:

IMG_0630

When a user completes a task, our sensor tags automatically detect it and push the user an iPhone notification: task completed!

IMG_0617

User gets a reminder: time to do reading!

IMG_0618

More information about the scheduled task: user can snooze the task, skip it, or stop tracking.

IMG_0619

Taskly start screen: user can see today's tasks, all tracked tasks, or add a new task.

IMG_0620

When user clicks on "My Tasks", this screen appears, showing weekly progress, the next scheduled task, and the frequency of the task.

IMG_0621

When user clicks on the stats icon from the My Tasks screen, they see this screen, which displays progress on all tasks. It also shows percent of assigned tasks completed.

IMG_0622

User can also see information about individual scheduled tasks, like previously assigned tasks (and if they were completed), a bar chart of progress, percent success at completing tasks, reminder/alert schedules, etc. User can also edit task.

IMG_0623

When user clicks "Track a New Action", they are brought to this screen, offering preset tasks (track practicing an instrument, track reading a book, track going to the gym, etc.), as well as "Add a custom action".

IMG_0627

User has selected “Track reading a book”. Sensor installation information is displayed.


IMG_0629

IMG_0625

User can name a task here, upload a task icon, set reminders, change sensor notification options (e.g. log when book is opened), etc.

IMG_0624

Here, user changes to log task when book is closed rather than opened.

IMG_0628

When a user decides to create a custom task, they are brought to the “Track a Sensor” screen, which gives simple options like “track light or dark”, “track by location”, “track by motion”, etc.

IMG_0626

Bluetooth sensor setup information

Document Tasks

Easy: Our easy task was tracking how often users go to the gym. Users put a sensor tag in their gym bags, and then our app logs whenever the gym bag moves, causing the sensor tag’s accelerometer to note a period of nonmovement followed by movement. We simulated this with our fake tags made out of LED timer displays (about the same size and shape as our real sensors). We attached the tags to the inside of a bag.

Our app will communicate with the tag via Bluetooth and log whenever the tag’s accelerometer experiences a period of nonmovement followed by movement (we’ve picked up the bag!), nonmovement (put the bag down at the gym), movement (leaving the gym), and nonmovement (bag is back at home). It will use predefined thresholds (a gym visit is not likely to exceed two hours, etc.) to determine when the user is actually visiting the gym, with the visit starting when the bag remains in motion for a while. To provide reminders, the user will configure our app with the number of days per week they would like to complete this task, and our app will send them reminders via push notification if they fall behind schedule, e.g. if they miss a day, at a time of day that they specify.
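
The movement pattern above can be checked with a single pass over movement segments. The C++ sketch below is a hypothetical illustration: the `looksLikeGymVisit` name and the 5-to-120-minute stationary window are assumptions, not measured thresholds:

```cpp
#include <cstddef>
#include <vector>

// One accelerometer-derived segment: was the bag moving, and for how long?
struct Segment { bool moving; int minutes; };

// Scan for the carry -> stationary-at-gym -> carry-home pattern.
// The 5..120 minute stationary window is an assumed plausible workout length.
bool looksLikeGymVisit(const std::vector<Segment>& segs) {
    for (std::size_t i = 0; i + 2 < segs.size(); ++i) {
        bool pattern = segs[i].moving && !segs[i + 1].moving && segs[i + 2].moving;
        int stillMinutes = segs[i + 1].minutes;
        if (pattern && stillMinutes >= 5 && stillMinutes <= 120)
            return true;
    }
    return false;
}
```

A ten-minute carry, an hour at rest, and a carry home would match, while an overnight rest between carries would not.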

Accelerometer Sensor for Gym Bags

Screen shot 2013-03-29 at 10.38.35 PM

Sensor is placed in a secure location in a gym bag. Its accelerometer detects when the bag is moved.

Medium: Our medium-difficulty task was to log when users take pills. We assume that the user’s pillbox is typically shaped, i.e. a box with a flip-out lid and different compartments for pills (often labeled M, T, W, etc.). This was exactly the same shape as our SparkFun lab kit box, so we used it and had integrated circuits represent the pills. We attached one of our fake tags (an LED timer display) to the inside of the box lid.

Our app connects to the tag via Bluetooth and detects every time the lid is opened, which corresponds to a distinct change of about 2 g in the accelerometer data from our tags. To provide reminders, the user sets a schedule of times in the week when they should be taking medication. If they are late by a set amount of time, or if they open the pillbox at a different time, we will send them a push or email notification.
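
Detecting lid opens then reduces to counting distinct spikes in accelerometer magnitude. A minimal C++ sketch follows, assuming magnitudes in g with a ~1 g resting baseline; the edge-triggered debounce is our assumption about how to avoid double-counting a single open:

```cpp
#include <cmath>
#include <vector>

// Count distinct lid-open events: a spike is a reading that departs from
// the ~1 g resting baseline by at least 2 g, and consecutive spiking
// samples are debounced so one open counts once.
int countLidOpens(const std::vector<double>& magnitudesG) {
    int opens = 0;
    bool inSpike = false;
    for (double g : magnitudesG) {
        bool spiking = std::fabs(g - 1.0) >= 2.0;
        if (spiking && !inSpike) ++opens;  // rising edge = one open event
        inSpike = spiking;
    }
    return opens;
}
```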

Accelerometer Sensor for Pill Containers

Screen shot 2013-03-29 at 10.40.00 PM

This “pillbox” is structurally very similar to the pillbox we imagine users using our product with (we even have IC pills!). A sensor is placed on the inside cover, and its accelerometer detects when the lid has been lifted.

Hard: Our hard task was to track how frequently, and for how long, users read tagged books. Users will put a sensor on the spine of the book they wish to track. They will then put a thin piece of metal on the inside of the back cover of the book. Using a magnetometer, the sensor will track the orientation of the back cover in reference to the book’s spine. In other words, it will detect when the book is opened. Our iPhone app will connect to the sensor via Bluetooth and record which books are read and for how long. It is important to note that this system is most viable for textbooks or other large books because of the size of the sensor which must attach to the book’s spine. Smaller books can also be tracked if the sensor is attached to the front cover, but our group decided that such sensor placement would be too distracting and obtrusive to be desirable.
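
Classifying open versus closed from the magnetometer needs some filtering, since readings near the decision boundary would otherwise flicker. A hysteresis classifier is one plausible approach; the sketch below is illustrative C++ with assumed field values (the magnet reads high when the cover sits against the spine sensor):

```cpp
// Open/closed classifier with hysteresis (thresholds are assumed values):
// a high field means the cover magnet is back against the spine sensor.
struct BookState {
    bool open = false;

    // Feed one magnetometer magnitude reading; returns the new state.
    bool update(double field) {
        if (open && field > 80.0)  open = false;  // cover closed on spine
        if (!open && field < 40.0) open = true;   // cover swung away
        return open;
    }
};
```

Readings between the two thresholds leave the state unchanged, so sensor noise around a single cutoff cannot toggle the book open and closed repeatedly.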

This is the most difficult hardware integration, since sensors and magnets must fit neatly in the book. (It might be possible for our group to add a flex sensor to the microcontroller which underlies the sensors we purchased, thus removing the issue of clunky hardware integration in the case of small books. In that case, neatly attaching new sensors to the preexisting circuit would likely be one of the hardest technical challenges of this project.)

To track how often books are read, the user will set a threshold of time for how long the book can go unused. When that time is exceeded, our app will send them reminders by push notification or email. The interface to create this schedule must exist in parallel to interfaces for times-per-week or window-of-action schedules mentioned above.

Magnetometer Sensor for Books

Screen shot 2013-03-29 at 10.37.26 PM

User attaches sensor to spine of a book. The magnetometer of the sensor detects when the magnet, on the cover of the book, is brought near it.

Screen shot 2013-03-29 at 10.37.42 PM

Sensor on spine of book.

Our Prototypes

How did you make it?:

For our iPhone app, we made an extensive paper/cardboard prototype with 12 different screens and ‘interactive’ buttons. We drew all of the screens by hand, and occasionally used folding paper flaps to represent selecting different options. We cut out a paper iPhone to represent the phone itself.

For our sensors, we used an LED seven-segment display, as this component was approximately the correct size and shape of the actual sensor tags we’ll be using. To represent our pillbox, we used a SparkFun box that had approximately the same shape as the actual pillboxes we envision using our tags with.

Did you come up with new prototyping techniques?:

Since our app will depend upon sensors which users embed in the world around them, we decided that it was important to have prototype sensors which were more substantial than pieces of paper. We took a seven-segment display from our lab kit and used that as our model sensor because of its small box shape. Paper sensors would give an incorrect sense of the weight and dimensions of our real sensors; it is important for users to get a sense for how obtrusive or unobtrusive the sensors really are.

What was difficult?

Designing our iPhone app GUI was more difficult than we had imagined. To “add a new task,” users have to choose a sensor and ‘program’ it to log their tasks. It was difficult for us to figure out how we could make this as simple as possible for users. We ultimately decided on creating preset tasks to track and what we consider to be an easy-to-use sensor setup workflow with lots of pictures of how the sensors work. We also simplified the ways our sensors could work. For example, we made sensor data discrete: instead of having our accelerometers report raw acceleration, we let users track movement versus no movement.

What worked well?

Paper prototyping our iPhone app worked really well because it allowed us, the developers, to really think through what screens users need to see to most easily interact with our app. It forced us to figure out how to simplify what could have been a complicated app user interface. Simplicity is particularly important in our case, as the screen of an iPhone is too small to handle unnecessarily feature-heavy GUIs.

Using a large electronic component to represent our sensors also worked well because it gave us a good sense of the kinds of concerns users would have when embedding sensors in the objects and devices around them. We started to think about ways in which to handle the relatively large size and weight of our sensors.

Assignment 3

Names:
Dale Markowitz
Raymond Zhong
Amy Zhou
Joshua Prager
Daniel Chyan

i.
What were the most severe problems with the app/site?
How do these problems fit into Nielsen’s heuristic categories? What are some suggestions to fix the UI, and how do your proposed changes relate to the heuristic categories?

One of the most egregious problems we found with Blackboard was its lack of naming conventions for the folders containing documents. For example, while Blackboard has an e-reserves tab, a course materials tab, and a syllabus tab, we have known professors to interpret these tabs completely inconsistently (syllabus in the course materials folder, etc.). This makes finding relevant course material extremely difficult. We believe this violates H4 (consistency and standards). We also found that H1 (visibility of system status) was violated: students cannot tell when important course material is updated, and we found an update about Blackboard going down seven days ago still listed on the site. Finally, we found H8 (signal-to-noise ratio) to be violated, as the home page was covered in somewhat irrelevant information and tools. For example, a good section of the home page is taken up by a side panel telling users how to use Blackboard, which really should be encapsulated in the “Help” tab since it is rarely needed.

ii.

H1 (Visibility of system status) gave us a better way to think about the top-down interface of Blackboard, starting from the front page. It made it obvious that there is a usability problem when students do not know the latest news, grades, or assignments in their courses.

H6 (Recognition rather than recall) exposed usability problems that would have been easy to gloss over after using Blackboard for multiple years. We noticed that recognition is difficult while using the Tools page (icons are nondescriptive, page is overfull, etc.) to access grades or send email. The heuristic also made it easier to observe when information was not exposed in previews, forcing students to recall the contents of a document with a given title, rather than recognizing it visually or through an excerpt.

iii.
One usability issue encountered was the presence of red herrings that linked to unhelpful pages. For example, clicking on the “Courses” tab leads to a page that contains all courses ever taken rather than a page of current courses. In addition, the prominent top Princeton logo links to the contextually unhelpful Princeton website rather than the landing page for Blackboard.

iv.
How does a physical book violate Nielsen’s guidelines for usability? Use a hardbound copy of Harry Potter as an example.

Joshua Prager’s Evaluation

Dale Markowitz’s Evaluation

Daniel Chyan’s Evaluation

Amy Zhou’s Evaluation (typed from a phone, so please excuse copious typos)

Raymond Zhong’s Evaluation

A2- Dale

Observations

1. Got to PHI 203 lecture about 5 minutes early. It’s a ~100-person class. The girl next to me is checking her email on her laptop; someone in front of me is checking his on his iPhone. At least 40% of people here with computers are checking Facebook. Someone to the left of me is setting up the headings for the notes he’s going to take in this class. Lots of people are bringing up this week’s readings. Calendar-checking and updating is also popular. Students who walk into the classroom now are searching for their friends. Some of them are awkwardly squeezing past rows of people to get to middle-row seats.

2. My roommate, at 8:00 AM, getting ready to go to her first class. Me in bed, wishing she weren’t so loud because I don’t have class until 10. She is getting all of her books from the bookshelf, unzipping her backpack (which is really loud), stuffing in books and papers, trying to figure out if she forgot anything. She checks the weather and sees it’s raining, grabs her umbrella. Grabs her helmet, key to bike lock, and runs out the door.

3. Math 217, I just got to class. Professor is writing the first ten minutes of class on the board. He writes an outline of what we are going to learn, and also puts up a short description of what we covered last week in class. Math is a special lecture because professors need chalkboards or whiteboards (a projector isn’t good enough because they need to write math problems and solutions which is easier to do with a pen/chalk). Is there a way we could save him the time of writing all this preliminary info? He writes things on the chalkboard from his written notes, so there is no digital copy, and no way for students who missed lecture to get notes online.

Brainstormed Ideas

  • GeoTask – an app that alerts users to complete certain tasks based not on time or date but on location. Passing Frist? It will remind you to pick up that package you received.
  • InClass – never text your friends in class again. This app augments your phone’s contact list with “In Class” or “Not in Class” next to each contact’s name, and also offers a drop-down class schedule option.
  • 5 minute language-learning app that will present users a 5-minute lesson on a single word in a foreign language. Presents user with spelling/pronunciation challenges plus in-context use of word.
  • ClassFM – lectures are sent live over campus radio, so that when students are running late to class, they can tune in to the first few minutes on any radio receiver.
  • ClassCall – small seminars start the first 10 minutes of their class as conference calls, so that if you’re going to be ten minutes late, you can phone-in and be involved in the discussion
  • NoteInit – a note-initializing app that saves you the trouble of making a new word document each time you want to take notes in class. Auto-creates a file with proper heading, date, class, lecture topic, etc.
  • Food-ordering app. Order food from Frist from your iPhone, pay from your iPhone, and specify when you’ll pick it up. Then when you pass Frist, all you have to do is grab your food and go.
  • PathUnPack – GPS-enabled app that tells you the best path to take from class A to B by considering how crowded they are. Especially useful for bikers who don’t want to be stuck behind walls of walkers.
  • BikePath – bike-optimized GPS app, that tells you the best path to take from A to B without encountering steps.
  • Laptop-battery vending machine. Check out laptop batteries from vending machines with PUID for a small fee, return them within 24-hours. Useful for when you forgot to charge your laptop before lectures.
  • App that maximizes the amount of energy you can expend while getting from class A to B, i.e. suggests the path with hills, steps, etc. Suggests easy, 5-minute workout routines users can do along the way.
  • Interval Alarm – alarm that alerts users not only at the scheduled time but also gives distinct 15-minute, 10-minute, and 5-minute warnings distinguished by ringtone (for use, say, when you wake up and are getting ready for class and want to know when you have to speed up)
  • TigerMunch – Check by dining hall where your friends are eating. Displays friends’ PUID swipe-in times for different dining halls.

2 Favorite Ideas

I like TigerMunch because sometimes I don’t necessarily want to text/call my friends to coordinate lunch in between classes, but I definitely might choose to go to one dining hall over another if I know lots of people there.

I like InClass because sometimes my friends will call me 2+ times while I’m in class to, say, ask me to have lunch, thinking that I’m simply ignoring them when really I just can’t answer because I’m in a small precept. If I could have my friends’ class schedules on-demand, it would be much easier to figure out when I should meet up with people/contact them/make lunch plans with them.

Prototypes

Photo Mar 01, 11 05 37 PM

Tiger Munch launch screen

Photo Mar 01, 11 05 59 PM

Scroll down to see who’s eating in Mathey

Photo Mar 01, 11 06 07 PM

Friends in Mathey, with sign-in times

Photo Mar 01, 11 06 38 PM

Who’s eating in Forbes today?

Photo Mar 01, 11 06 57 PM

Nobody. That is surprising.

Photo Mar 01, 11 07 25 PM

InClass launch screen in contacts list

Photo Mar 01, 11 07 32 PM

Scroll down to see more info.

Photo Mar 01, 11 07 49 PM

Kate’s class schedule, for the next few hours

User Testing

None of my users had a particular problem with figuring out the user interface or how they were supposed to use the app, but they did have really awesome ideas for how to extend TigerMunch. Originally I thought that users would “check in” when they get to a dining hall, but my friend David suggested it would be cooler if you could automatically be checked in when you swipe your PUID at a dining hall. This made me wonder if auto-check-in with GPS would be useful. Another user suggested I connect TigerMunch with Princeton’s TigerApp that pulls dining hall menus, so users could choose dining halls based on both friends and food.

Photo Mar 01, 11 09 23 PM

User seems to get the interface…

(Embedded video demo.)

L1

Names:
Farhan Abrol
Dale Markowitz
Collin Stedman
Raymond Zhong

Group Number: 4

Description

The interface consists of a force sensitive resistor, which the user either taps or holds to send dots or dashes, respectively. A piezo speaker beeps to provide feedback: a short pulse tells the user they have hit the FSR with enough force to trigger it, and a long pulse tells them they have held it down long enough to enter a dash.

Morse code is tricky to interpret because not all characters are the same length (one letter might be a single dot, while another is several dots and dashes). To get around this, we set a time interval within which any dots or dashes the user enters are grouped into a single character. Four LEDs “count down” to indicate how long the user has to enter the next dot or dash within the same character.
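
On the host side, the sketch prints “0” for a dot, “1” for a dash, and “2” at the letter-separation timeout, so the computer only needs to buffer symbols and look up a letter when a “2” arrives. The class below is a hypothetical C++ illustration of that decoding step, with the lookup table abbreviated to a few letters:

```cpp
#include <map>
#include <string>

// Host-side decoder for the serial codes the Arduino emits:
// '0' = dot, '1' = dash, '2' = end of letter.
class MorseDecoder {
    std::string buffer;
    const std::map<std::string, char> table{
        {".-", 'A'}, {"-...", 'B'}, {"-.-.", 'C'},
        {".", 'E'}, {"---", 'O'}, {"...", 'S'}, {"-", 'T'}};
public:
    // Returns the decoded letter on a '2' ('?' if the sequence is unknown),
    // or '\0' while a character is still being entered.
    char feed(char code) {
        if (code == '0') { buffer += '.'; return '\0'; }
        if (code == '1') { buffer += '-'; return '\0'; }
        auto it = table.find(buffer);  // code == '2': letter boundary
        buffer.clear();
        return it != table.end() ? it->second : '?';
    }
};
```

Feeding the codes 0, 0, 0, 2 yields ‘S’, and 0, 1, 2 yields ‘A’.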

Assessment

When deciding on what to build for this assignment, our team thought of many options involving interacting with sound through touch. We wanted to build something that was both simple and fun to play with. We went through many ideas involving sound synthesis (especially a drum simulator). Ultimately, we decided on a Morse code interpreter, because we felt it fit well with the supplies we had at hand.

All in all, we were very happy with the way our Morse code converter turned out. Although the description sounds complicated, the device is intuitive once you sit down in front of it — and we had fun typing letters on a screen. Perhaps the thing that we are unhappiest with about our device is that it is not super relevant today (what a shame! it’s so fun to play with). It’s a device that might only be appreciated by specialists or true nerds. Whatever those are. 🙂

Storyboard

Photo Feb 27, 12 50 29 PM

Sketches

Photo Feb 27, 12 49 07 PM Photo Feb 27, 12 49 12 PM  Photo Feb 27, 12 50 52 PM Photo Feb 27, 12 50 59 PM Photo Feb 27, 12 51 04 PM

Parts List

Breadboard, wire, wire cutter/stripper
Arduino Uno
Force Sensitive Resistor
3 red LEDs, 1 yellow LED, and appropriate resistors for a 5V source (varies by LED)
Piezo speaker
Computer and source code

Make it yourself!

Only requires everyday prototyping parts!

1. Obtain a force-sensing resistor, an Arduino microcontroller, an electronic breadboard, as well as several LEDs and appropriate resistors for each component.
2. Attach the force-sensing resistor to the center of the breadboard, in such a way that it can be taped onto the edge of the top of the breadboard.
3. Connect the FSR to the analog sensing port of the Arduino, using a pull-down resistor connected to ground.

circuit1

4. On the opposite side of the breadboard, attach a row of LEDs, connecting them to digital output pins on the Arduino through 330 ohm resistors.
5. Tape the FSR onto the breadboard so that it can be easily tapped or held with a finger.
6. Set the FSR pin, the first LED pin, and the number of LEDs used in the Arduino program.
7. Download the Arduino program, and run the keyboard filter on your computer.

Congrats, you can now type in Morse code!

Arduino Code

/* FSR Morse code input sketch.

Connect one end of the FSR to power and the other end to Analog 0.
Then connect a 10K resistor from Analog 0 to ground.
*/

int fsrPin = 0;     // the FSR and 10K pulldown are connected to a0
int fsrThreshold = 20;
int fsrReading;     // the analog reading from the FSR resistor divider
int delayTime = 10;
int buzzerPin = 6;

// number of delay loops for each symbol
int dotLength = 1;
int dashLength = 20; // 200ms
int letterSepLength = 100; // 1000ms
int pauseLength;
int tapLength;

// length of tones emitted for each button press
int dotToneLength = 30;
int dashToneLength = 100;

// LED status display
int firstLEDpin = 8;
int numLEDpins = 5;

void setup(void) {
  Serial.begin(9600);
  pinMode(buzzerPin, OUTPUT);
  for (int i = 0; i < numLEDpins; i++) {
    pinMode(firstLEDpin + i, OUTPUT);  // LED status display pins
  }
}

void sendLetterEnd() {
  Serial.println("2");
}

void sendDot() {
  Serial.println("0");
  tone(buzzerPin, 440);
  delay(dotToneLength);
  noTone(buzzerPin);   
}

void sendDash() {
  Serial.println("1");
  tone(buzzerPin, 1000);
  delay(dashToneLength);
  noTone(buzzerPin);
}

void lightLEDs() {
  for (int i = 0; i < numLEDpins; i++) {
    digitalWrite(firstLEDpin + i, 1);
  }
}

void dimLEDs() {
  for (int i = 0; i < numLEDpins; i++) {
    if (pauseLength > i*letterSepLength/numLEDpins) {
      digitalWrite(firstLEDpin + i, 0);
    }
  }
}

void beep() {
}

void loop(void) {
  fsrReading = analogRead(fsrPin);  

  if (fsrReading < fsrThreshold) {
    if (dashLength > tapLength && tapLength > dotLength) {
      sendDot();
      lightLEDs();
    }
    if (pauseLength == letterSepLength) {
      sendLetterEnd();
    }
    dimLEDs();
    tapLength = 0;
    pauseLength++;    
  } else {
    if (tapLength == dashLength) {
      sendDash();
      lightLEDs();
    }
    pauseLength = 0;
    tapLength++;
  }
  //Serial.print("Analog reading = ");
  //Serial.println(fsrReading);     // the raw analog reading
    delay(delayTime);
}

Processing Code:

github.com/dalequark/morsecode

(Embedded video demo.)

P1

TFCS

Collin Stedman, Raymond Zhong, Farhan Abrol, Dale Markowitz

Ideas

  • NFC device that allows user to “transport” documents and websites from one computer to another.
  • Habit-acquisition system. Tag physical objects like dumbbells, medicine bottles, etc. with RF tags or microcontrollers, and use them to detect and log users’ interactions with these objects.

  • Forget-me-not backpack. If it is zipped without an NFC-tagged laptop inside, it beeps to alert the user that he/she has left something behind.

backpack

  • Portable 3D pointer/mouse which consists of a string whose tension is measured to determine its position in 3D space.

  • Interface for navigation through multiple degrees of graphs (such as WordNet) with gestures using Kinect/Leap Motion.
  • Real-time tracking of facial microexpressions using a wearable camera, to provide instant feedback on social interactions and crowds.
  • Real-time tracking of facial microexpressions using a stationary webcam, to track energy and dynamics in a room during meetings, lectures, etc.
  • Intelligent conference tables that transfer powerpoints/handouts to and from users who place their devices on them.
  • Augmented-reality phone apps that overlay location data from Facebook, Linkedin, etc on the real world, like Yelp Monocle but with open data integrations.
  • Panoramic photography application for iPhone, which allows tagging things around you and immersively recreating the place where you took the photo.
  • Pen peripheral device for iPhone, which has a pencil tip and can write on paper, but also saves digital copies of what users have written.
  • Add electrical/magnetic field sensing to an iPhone or Pebble, and use it to provide an additional sense through vibration, tension, etc.
  • iPhone peripheral multimeter attachment

  • Geo-aware reminder app: App that reminds you to do tasks when you are in the appropriate location, sensed using GPS.
  • Public computer setup that senses users’ presence using face recognition and/or NFC, and automatically loads their personal desktop from the cloud.
  • An audio looper that uses Leap/Kinect to control dynamics, pitch, etc.
  • Virtual plants and creatures that simulate the health of real things, like work, schedules, or relationships (as measured by online communication). Best if a clear interface for tracking many different things.
  • Interactive canvas that can be touched or walked on to draw stories or animated art/videos/slideshows.
  • SMS enabled door lock – users text their friends keys which enable the friends to open the lock.
  • A tangible interface for representing and manipulating object-oriented programming constructs.
  • Input/output integrations between two computers that allow them to work together seamlessly.
  • Keyboard gloves. Design a different keyboard that allows typing from memory.
  • Digital business cards that are exchanged by phone via NFC.
  • A game played on a digital surface where the physical location and orientation of game pieces upon the board causes different game behaviors, events, and outcomes.
  • Photo frames whose photos are static until users stop to look at them, at which time the photos become videos and possibly react to the observer(s).
  • 3D scanner which reads and interprets braille.

  • A petlike robot which is meant to be a dynamic presence in the homes of people living alone.
  • Wifi-detector necklace/keychain, which glows when open wi-fi is available, or for important calls.

  • Kinect program which allows users to “walk through” panoramic google maps.
  • NFC-key bracelet, which stores NFC keys like a virtual keyring would.
  • Multiuser NFC laptop lock screen which tracks recent users of the computer, retaining traces of their presence and usage of the computer. (Useful for making computer applications more discoverable, if people can see what software is used at a certain workstation.)
  • Flash drives that plug into a computer and download content from Internet livestreams, and replay them until they are exhausted and must be recharged.
  • Lights that automatically turn off when the user gets into bed (detected with a pressure sensor)
  • Chandelier that displays when user receives an email, Tweet, etc with different color LEDs.

  • Persistence of Vision machine that sits on users’ desks and displays tweets, email titles, etc.
  • Lecture attendance taker using NFC login.
  • Headband that measures user’s concentration and vibrates to alert them when they lose concentration (for use in lecture, studying, etc).
  • Posture monitor using a stretch sensor, which vibrates to alert user when he/she is slouching.
  • Musical device that consists of streams of water (water harp, for example).  Perhaps for hearing-impaired users to visualize and create music.
  • Controllers for complex theater lighting configurations using gestures/tactile interfaces.
  • Redesigned controls for mixers and synthesizers, perhaps gesture-based using Kinect.
  • Interactive table that can be drawn on and that makes noise when users touch it, also responding to pressure of touch.
  • A set of triggers for safely receiving a delivery carried by a small quadrotor/helicopter.
  • A gestural user interface used to provide navigation instructions for a robotic reconnaissance drone before it is deployed.
  • Intelligent teapot that brews tea when it reaches the right temperature.
  • Canvases that interactively display text/poems/etc through gestural input. Think interactive museum exhibit but with text.
  • Journal that automatically logs daily temperature, pulse, location based on data taken by phone, fitbit, etc.
  • Directional sensing of wifi or cell signals – overlay where wifi or cell signals are coming from on a webcam view or Google Glass.
  • Virtual spatial interface across multiple computer screens: ability to throw virtual objects from one screen or computer to another in physical space using gestures.
  • Object oriented programming language which is embodied by physical lego-style blocks. Each block corresponds to an object with a series of functions. Plugging blocks together in different ways creates complete programs.
  • Shoes that sense and warn athletes of poor posture in real-time and historically.
  • Using a sheet as a screen. Video camera detects indents or ripples in sheet, interprets command and projects new image on sheet.
  • 3D scanner, and interface for virtually rearranging scanned objects in e.g. a room.

Our Choice

Habit Tracking for Object Interactions

We will create a variety of sensors to be attached to everyday objects like medicine boxes and exercise equipment, which detect when they are disturbed and update a habit-tracking application on your phone or computer.

We thought of many physical computing ideas, and chose the one that enabled the most interesting interaction cycle between the user and computer. Dale and Collin were interested in using NFC tags to remind themselves when they forgot to bring things with them. Raymond noticed he left things untouched that he wanted to use, like vitamins, whey, and dumbbells. We realized that it’s hard to profile what objects should be on someone at all times, but it is much easier to couple everyday interactions with objects like medicine bottles and dumbbells to a computer. What if those objects could remind us when they were left unused? This would solve problems for anyone on a medical regimen or trying to build a new habit. It also enables some of the most interesting human-computer interactions among all the ideas we considered, since it imbues common objects with intelligence that can go far beyond sensing disturbances, enabling a whole class of interactions between people and the things around them.


Target Users and Context

Habit tracking serves strong needs for a few user groups. The elderly and those suffering from Alzheimer’s would benefit greatly from reminders for their daily habits. For both the elderly and the infirm, daily medication schedules are vital; both overdosage and underdosage are unacceptable. Elderly individuals frequently use pill boxes labeled by day in order to remember which medicines to take on a particular day of the week. While this humble solution is simple and reliable, it does not provide active reminders and is only effective when used regularly. Even more importantly, the elderly population is provided with very few tools for remembering non-medical habits or behaviors, such as calling a family member, attending a scheduled appointment, feeding a pet, or checking the mail. This demographic is likely to have few, but significant, uses for mobile or desktop computers, such as calling and messaging relatives or reading the news.

Another group of people who would benefit from habit tracking is athletes. In order to prepare for a marathon, be in peak shape for a championship, or simply maintain their physique, athletes must be diligent in their workout routines. They need to remember to work out with a certain frequency, or want to make sure that they work out at a certain time of day and for a certain amount of time. Most importantly of all, they need to make sure that they maintain whatever changes they make to their established routines. The category of athletes extends to larger groups like students or the general population, as we are all too familiar with making resolutions to go to the gym or floss every day. Many people are interested in either acquiring new habits or reinforcing existing ones.

Problem Description

Our product solves two general problems. The first is struggling to maintain established routines, such as medication schedules, which requires a solution that is reliable and easy to use. A reminder system must not fail upon losing Internet connectivity, although it can expect a phone to stay powered on, and it should not disrupt existing non-technological solutions like pill boxes. The habit tracker will be used mostly at home, or perhaps at work, by people with limited experience with technology. The system should require little maintenance, even though these users are unlikely to be busy.

The second problem is acquiring new habits, like flossing, taking vitamins, or exercising. These users currently use Seinfeld-style calendars, where they cross out days on which they have accomplished their goals, or habit-tracking apps that are either general-purpose (like Lift or Beeminder) or specific (like RunKeeper). Calendars are limited in how many habits they can track and do not provide reminders. Habit-tracking apps require a superfluous interaction beyond the actual activity of the habit. Since these users are likely to have limited motivation most of the time (with occasional bursts of high motivation), a lower-friction interaction benefits them.

Choice of Technology

Our system will use a combination of platforms to sense and track object interactions. The frontend will be a smartphone or web app which lets the user choose the habits and objects to be tracked. It generates reminders and alerts for objects that haven’t been disturbed by the user within a set time period, reinforcing the important actions and habits the user is trying to follow.
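The countdown behavior described above can be sketched in a few lines. This is a minimal illustration rather than our actual implementation; the names `TrackedObject` and `due_reminders`, and the use of wall-clock time, are assumptions made for the sketch.

```python
from datetime import datetime, timedelta

class TrackedObject:
    """One sensor-tagged object with a per-habit reminder interval."""

    def __init__(self, name, interval_hours):
        self.name = name
        self.interval = timedelta(hours=interval_hours)
        self.last_disturbed = datetime.now()

    def record_disturbance(self, when=None):
        """Sensor event: the object was used, so reset its countdown."""
        self.last_disturbed = when or datetime.now()

def due_reminders(objects, now=None):
    """Return names of objects left untouched longer than their interval."""
    now = now or datetime.now()
    return [o.name for o in objects if now - o.last_disturbed > o.interval]
```

For example, a pill box tracked with a 24-hour interval would appear in `due_reminders` once a full day passes without its sensor reporting a disturbance.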

The sensors must detect usage of the object; since the interaction varies between objects, we would use a number of sensor integrations:

  • Weight-sensing pads for: pill boxes, dumbbells, laundry bags, sports equipment
  • Tilt/shake sensors for: dental floss, hygiene items, pill boxes
  • Light sensors for: unused rooms
  • Electrical sensors for: windows, doors, lights
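One way the app could represent this mapping is as a simple configuration table keyed by object. The structure below is purely illustrative; the object names and trigger descriptions are placeholder assumptions, not final design decisions.

```python
# Illustrative sensor-to-object configuration mirroring the list above.
# Trigger conditions are placeholder assumptions, not calibrated values.
SENSOR_CONFIG = {
    "pill box":     {"sensor": "weight",     "trigger": "weight change detected"},
    "dumbbells":    {"sensor": "weight",     "trigger": "pad unloaded"},
    "dental floss": {"sensor": "tilt",       "trigger": "shake detected"},
    "spare room":   {"sensor": "light",      "trigger": "light level change"},
    "front door":   {"sensor": "electrical", "trigger": "circuit opened"},
}
```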

Beyond just sensing force, these sensors should also record timing, so we can track how long the object in question was used. The sensors could be connected to microcontrollers which broadcast a signal. For a more polished system, we could use RF (radio frequency) tags and transmitters to reduce cost and size. The information would then be relayed to a hub or an Internet connection.
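The duration tracking described above could work by grouping timestamped sensor readings into usage sessions: readings closer together than some gap belong to one interaction. The sketch below is one assumption about how we might do this; the function name and the 30-second gap are illustrative.

```python
def usage_sessions(timestamps, max_gap=30.0):
    """Group event timestamps (in seconds) into (start, end) sessions.

    Readings separated by more than `max_gap` seconds start a new session,
    so a burst of sensor triggers becomes one logged interaction whose
    duration is end - start.
    """
    sessions = []
    for t in sorted(timestamps):
        if sessions and t - sessions[-1][1] <= max_gap:
            sessions[-1][1] = t          # extend the current session
        else:
            sessions.append([t, t])      # start a new session
    return [(start, end) for start, end in sessions]
```

For example, `usage_sessions([0, 5, 10, 120, 125])` yields `[(0, 10), (120, 125)]`: two interactions, one lasting 10 seconds and one lasting 5.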

NFC-based sensors and NFC-reader phones are another feasible implementation for the system. NFC tags are cheap and easy to attach to an object, and their data error rates are close to zero. They are also passive, requiring no power supply, which is ideal for sensors attached to objects of daily use. The caveat of an NFC-based system is that the reader (the phone) must be physically close (3-4 cm) to the tag in order to detect it and read data. This limits the information we can gather about patterns of object usage to situations where the user physically touches the object with the phone to register a reading. Another implementation choice in the same vein is to use RFID tags, active or passive, as the sensors, and an RFID reader for receiving data.