Miles Yucht – Assignment 2

Observations:
The bulk of my observations were made in various classrooms before and after class, although I did devote some time to observing students in transit between classes. For the most part, the activities conducted by students were indicative of idleness, such as socializing and resting; which activity in particular depended primarily on the size of the group the student was in. In larger groups (three or more students), students most often were chatting with friends. At one point or another, nearly every student I observed took out materials beforehand to take notes for class; likewise, after class, people packed up their things. The activities of students in small groups (alone or with one other person) were much more varied, including texting, eating, playing games on their computers, and checking email and social media. After class, students were observed standing up and stretching, and infrequently students were asleep at the end of a lecture.

In between classes, the set of activities was even smaller, limited to things that could be accomplished on a mobile device or in person. Most often, students were simply walking with their bags, sometimes with a phone in hand. When asked, these people were most often checking Facebook or texting friends. In groups, chatting (varying from somewhat quiet to quite raucous) was often conducted, typically with at most two members of the group using their mobile phones at one time. Additionally, the occasional running student sought to make it to class before the bell rang.

On reflection, I decided to focus my brainstorming on ways to keep students more active during this intermediary time period. I postulated that being more active before and in between lectures would help students pay more attention during lecture and perhaps make them less likely to fall asleep.

Brainstorming:

The following are my one-line ideas for the brainstorming component of the project:

1. Remind yourself of assignments/projects/readings/etc. for classes
2. In real time analyze what people are talking about right after class
3. Check out what’s for lunch today at your respective dining hall/eating club
4. Calculate the fastest path between two classes for high efficiency walking
5. Sleepy tracker – monitor wakefulness during the day as a function of sleep/naps
6. Princeton trivia game – cool facts you never knew about Princeton
7. Fun music player – plays bassline/guitar/drums, and you can play along with it
8. Save the day’s lecture slides to your computer
9. Jeopardy-style game about lecture, featuring material covered in class
10. Reminder to make sure you don’t leave any of your belongings behind
11. The 5-minute trainer makes a short workout before sitting down for an hour+
12. Add student events in the next three days to your calendar
13. Social game where you score points by interacting with classmates
14. Determine how many students are present so the teacher can begin lecture
15. Scavenger-hunt style game where you get points by going to places around campus

Ideas to prototype:
The ideas I’m picking to design prototypes for are ideas 11 and 4. The idea of a small personal trainer is interesting because, by the nature of lecture, we often spend a long time sitting down, and many people find they can stay focused for longer after exercising a bit. A mapping app would potentially help a student like those I observed running between classes get to class sooner, and would even let them see exactly how fast they need to go to make it on time.

Prototypes:

IMAG0033

For the workout prototype, I decided to use the smartphone form factor because it needs to be on a device that ideally is widely available and also extremely portable, and the smartphone fits both of these needs. When running the workout app prototype, the user is presented with a starting screen, from which they can start a random workout, check out their list of favorited workouts, see their friends’ usage of the app, and adjust their own personal settings. When pressing the random workout button, the user is immediately brought to a workout confirmation page notifying them about the duration and intensity of the workout, from which they can continue on to the workout. The workout duration is automatically calculated based on the starting time for the class and the current time. The workout screen presents one exercise at a time, showing the time remaining on the exercise and the workout and the number of exercises remaining in the workout. Upon finishing or cancelling the workout, the user is brought to the workout completion page, where the workout is logged and the user is given the option to favorite the workout.
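If this prototype were ever implemented, the automatic duration calculation might look something like the following minimal sketch; the five-minute packing-up buffer and all names here are my own assumptions, not part of the paper prototype:

```python
from datetime import datetime, timedelta

def workout_duration(class_start: datetime, now: datetime,
                     buffer_minutes: int = 5) -> timedelta:
    """Time available for a workout: the gap until class starts,
    minus an (assumed) buffer to pack up and get seated."""
    available = class_start - now - timedelta(minutes=buffer_minutes)
    return max(available, timedelta(0))

# e.g. class at 11:00 and it's 10:40 -> 15 minutes of workout time
d = workout_duration(datetime(2013, 3, 1, 11, 0), datetime(2013, 3, 1, 10, 40))
```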

IMAG0037 IMAG0036 IMAG0034 IMAG0038

Additionally, the user can view their favorite workouts and the number of times they’ve completed their favorite workouts on the Favorite Workouts page. The user can start any of their favorite workouts, or he/she can design a new workout. When making a new workout, the user can rename the current workout, type in an exercise and a duration for the exercise, and add/remove exercises. The total time for the workout is tallied at the bottom of the screen.

IMAG0035

When clicking on the friends link on the home page, one is brought to a list of friends. Clicking on one of those friends brings up their profile, where one can view that friend’s favorite workouts, how many workouts they have completed, and the time of their last workout. Clicking on one of these workouts brings you back to the workout confirmation screen, enabling you to try one of your friends’ workouts.

IMAG0032 IMAG0031

Clicking on the settings button enables one to change their personal settings. These include the default difficulty of the workout, the time to end the workout before lecture, the username, the sound level, and whether to use vibrate.

IMAG0030 IMAG0040 IMAG0039 IMAG0029

————————————————————————————–

IMAG0022 IMAG0021

For the mapping app, I decided to go with a much simpler layout, simply because the functionality of this program is quite a bit more limited. I figured that it ought to help a user accomplish the singular task of getting from point A to point B as quickly as possible. As such, picking points A and B should be a very easy task. On the home page, one can choose the starting point and the destination point by pressing on the corresponding buttons and immediately ask for directions or change settings for the app. The only settings that can be modified are the route type, which can take a value of “Fastest,” “Shortest distance,” or “Late Meal,” which directs the student towards Frist en route to the destination.

IMAG0024 IMAG0025 IMAG0026 IMAG0027

Once a path is requested, a map screen loads, displaying the starting and ending locations, the path to follow, the current location of the user via GPS, the user’s current speed, the remaining distance, and the time to arrive at the destination.

IMAG0023

Evaluation:

I completed testing with three real users. For the mapping app, I introduced the app to users as they were about to leave one hall en route to another; in one instance, I also gave the app to a user simply to play around with and describe the experience of using it, rather than to try to extract any useful information from it. Before each test, I mentioned to the user that every area of the screen with a black box around it was an interactive component, encouraging them to touch those points and see what happened. Nobody I asked to demo my prototype had ever used a paper prototype before, so they needed an acclimation period to become accustomed to it. After that short period, most people were able to navigate the interface with relative ease. However, some users felt they had exhausted the possibilities of the app rather quickly and became bored with it after a short time. One user suggested the possibility of viewing others on the map who were also using the application. Still, there was overwhelming appreciation for the “Late Meal” setting, which I meant to be more humorous than functional.

Insight:

I found that during the actual evaluation of prototypes, it was far more useful to give the user a task rather than simply letting them play with the application, especially since both of these applications are designed to accomplish a very specific task. I discovered this when I gave one user the app to explore without actually using it to find the shortest path between two places. Without a task, this user felt very undirected and said that, while he could see how the application would be useful for him, he didn’t enjoy the experience of using it.

Additionally, most people left me with the impression that they walked away unsatisfied with what the app could have provided them. In the next redesign, I would change the design to emphasize the final result of the calculated route. Because this app targets a very particular user space, the set of people interested in getting places efficiently, I may simply have picked testers outside this group, so the reviews were more negative than I would have hoped. However, this does indicate to me that I’m going to have to make the app more enjoyable or useful for people beyond this group if I want to garner more interest in it.

Assignment A2: Osman Khwaja

Observations:

On Thursday, I stood inside and around the Friend Center for two different class-change periods to observe how different people use the ten minutes. Of the various people I saw, three in particular interested me; I took their actions and generalized them to groups of people for whom I could design a product.

Candidate 1: Hurried bicyclist

This speed demon is trying to bike as fast as he can through the crowds of people who fill the walkways during class change. He gets stuck behind a crowd and is forced to slow down significantly, which probably bothers him. His motivation is unclear (he could be late to class, have forgotten something somewhere, or just have the need for speed), but his frustration with slow pedestrians isn’t. The one I saw struggled to navigate the walkway between Fisher and Friend, got stuck behind a group of walkers, and nearly hit someone while trying to weave through. Maybe I could design something that allows him to navigate crowds better.

Candidate 2: The Early Bird

This individual is the one you see trying to kill time outside the classroom. The one I observed came out of the Friend Library, went downstairs to the tables, and pulled out his phone. Eventually two students walked into one of the classrooms, and our subject followed them a minute or two later. My guess is that he didn’t want to be the first into the classroom (it might have been a little awkward to be alone with the professor). Maybe we could design something for this candidate that lets him kill time or even lets him know if people are in the classroom.

Candidate 3: The Kobayashi (google it if you don’t know!)

This student is the one who unfortunately scheduled classes such that she can’t enjoy a proper lunch break on certain days. Walking and eating quickly proves challenging as this person struggles to juggle the bunch of things in her hands. The one I saw walking into the Friend Center was trying to eat from a takeout tray, hold a water bottle against her side with her arm, and open the door. Needless to say, she had to wait until someone came by to get into the building. Maybe there’s some tool that would better enable her to enjoy her quick lunch, receive her lunch more quickly, or interact with her surroundings hands-free.

Brainstorm:

1. Real-time pedestrian traffic monitor and route suggester
2. Bike horn that sounds when it senses proximity to pedestrian
3. Pedestrian avoidance system with sensor and intelligent controller
4. Optimal path navigator based on location and end destination
5. A handlebar-shrinking system to enable better weaving
6. Fellow class student locator to see if there’s an empty classroom
7. Refresher material application based on classroom proximity
8. A betting application based on which students arrive to class earliest
9. A scenic route suggesting app to kill time walking to class
10. An estimator app that predicts time to eat given meal
11. An app to order lunch from an eating club to be picked up at a given time
12. An app that can suggest how best to hold your objects
13. An automated backpack zipper opener for hands-free opening/storing
14. A help signaler device that notifies people to help open doors
15. A food carrying tray that gently heats food as you walk

Favorite Ideas:

– I chose the pedestrian avoidance system (#3) because it has the most upside (helping bikers everywhere, avoiding accidents, etc.) and is doable given current technology (see Google cars).

– I also chose the student locator (#6) because I thought it was pretty neat and potentially doable given the prevalence of smartphones and OIT’s registered database of devices.

Quick Prototypes:

Pedestrian Avoidance

Description: The above picture shows the screen of the device that you’d attach to the front of your bike. The horizontal dashes with arrows show the detected obstacles and their trajectories. You are represented by the arrow, which also indicates your direction of travel. Using an intelligent system that takes in the velocities of the sensed obstacles, the device displays a suggested route through the crowd, signified by the dotted line.
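A real implementation might, as a first approximation, extrapolate each obstacle’s position over the next few seconds and score a handful of candidate headings by their worst-case clearance. The following sketch is purely illustrative; every name, number, and simplification here is my own assumption, not the device’s actual logic:

```python
# Obstacles are ((x, y) position, (vx, vy) velocity) pairs in a frame where
# the rider starts at the origin and pedals "up" the screen.

def predict(pos, vel, t):
    """Linearly extrapolated obstacle position after t seconds."""
    return (pos[0] + vel[0] * t, pos[1] + vel[1] * t)

def clearance(heading_x, rider_speed, obstacles, horizon=3.0, steps=6):
    """Minimum rider-obstacle distance over the prediction horizon."""
    worst = float("inf")
    for i in range(1, steps + 1):
        t = horizon * i / steps
        rx, ry = heading_x * t, rider_speed * t
        for pos, vel in obstacles:
            ox, oy = predict(pos, vel, t)
            worst = min(worst, ((rx - ox) ** 2 + (ry - oy) ** 2) ** 0.5)
    return worst

def suggest_heading(rider_speed, obstacles,
                    candidates=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """Pick the candidate sideways drift that stays farthest from everyone."""
    return max(candidates, key=lambda h: clearance(h, rider_speed, obstacles))
```

With one pedestrian walking toward the rider slightly to the right, the sketch steers left, which matches the dotted-line behavior described above.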

DSC00451

Description: The above picture shows the app interface that you’d open on your phone. People, including yourself, are represented as dots against the map layout of the building or area you are in. By looking at the map, you can see if anybody is in the classroom or on their way to the room. In this picture, two people are already in the classroom and four people are on their way to the building.

Testing and Feedback:

I chose to test the pedestrian avoidance system because it’s my personal favorite and I was really interested to see what people would think of it. I managed to catch up with three people who were extremely kind and gave me 5 minutes of their time.

– Person 1: Jason – I managed to meet up with Jason in the Prospect House garden. I put the device on his bike, as shown in Picture 1, and asked him what he thought he should do. He was a little confused at first, but after I told him to imagine the horizontal lines as people, he quickly figured out that he was the arrow and that he should follow the projected path. Clearly, the graphic wasn’t intuitive enough for him to pick up without a simple nudge. He also said, “It looks ugly. I would throw it away.”

– Person 2: Stephen – I managed to meet up with Stephen outside Brown Hall. I put the device on his bike and asked him how he would use it. Unlike Jason, Stephen immediately knew what to do and commented that he was familiar with this type of interface from GPS devices. Unfortunately, Stephen didn’t see the need for the device, saying something to the effect of “why would I let the device guide me when I can do it better with my own eyes?” He also commented that it wasn’t pretty looking.

– Person 3: Roy – When I showed Roy the device and asked him to use it without telling him how, he was initially confused. He soon figured out that he was the arrow, but couldn’t figure out what the horizontal lines were. Once I finally told him, he thought it was a cool idea and started telling me about how he could use it. He also asked some pretty insightful questions about how safe this would be if multiple people were using it, and whether the device really promotes safety if it encourages weaving through traffic.

Pictures of Testing:

I took some pictures (some of them staged) of the user testing process to show how the testing was conducted and how the prototype was used.

Picture 2

This picture gets at the essence of the problem. Hurried bicyclists often struggle to navigate through pedestrians on walkways, especially when they’re crowded during class change. I designed a tool that I hoped would make that experience less frustrating.

Picture 1

This picture shows how I mounted the prototype on the bicycle for user testing. Typically, I had the tester sit on the bike while I held the device in place with my hand and asked them to interact with it. They would start to think through it, ask some questions, and eventually figure out how it worked. Then I got their feedback.

Picture 3

Here’s the ideal usage of the prototype in action. Given a set of obstacles, the prototype maps out an optimal course through them in real time, and the bicyclist follows the path until he clears the obstacles. When I had my users try out the prototype, I made sure we waited until the walkway was crowded and then asked them to use the bike with the prototype.

Insights:

– Using simple symbols to represent complex objects adds a layer of abstraction that can take away from the intuitiveness of your design. In my example, using horizontal lines to represent people or objects caused two of my users to initially struggle to figure out how to use the device. In my next iteration, I could create a representative symbol, like a stick figure, to show an incoming person, and something else to show an inanimate object. Horizontal lines, while easy to draw, received pretty negative feedback.

– Aesthetics are extremely important. As two of my users noted, my device was definitely not the prettiest interface they’ve ever seen. In my next iteration, I would look to create something much more enjoyable. Color-coded objects and a 3D looking arrow are just some of the things I could use to improve how my device looks.

– Provide something unique. As Stephen pointed out, my device simply does something that a conditioned human can do pretty easily. While there is some value in that, it isn’t as likely to be as successful as a product that can do something humans can’t easily do. Maybe adding some small features to the device, like a flashlight, a speedometer, or a video camera to take some cool footage, could help push my device over the top. While that may stray from the original purpose of the device, these changes could make it a great product.

Assignment A2: Alan Thorne

1.) Initial Observations
I conducted observations before, after, and between almost all of my classes for a few days. Since much of my class time is spent in the CS department, I mostly saw people using their computers or talking to friends.

2.) Ideas
Here’s a list of ideas I came up with:
– Mini-game server for Princeton campus
– Interactive Princeton trivia screens
– Quick view of important daily information
– Coordinate local upcoming events with friend’s plans
– Quick facts/tips site about various things related to classes
– A minimal e-mail client: for speed over functionality
– Restroom/ snack/ coffee location database w/ map
– Route planner: route efficiency, avoid road blocks and congestion
– List of upcoming deadlines: What’s due when?
– School resource usage info (location based. for laundry, printers, funding even??)
– Quick view of menus / where friends are eating
– Random entertainment generator (like stumble upon but optimized for quick, transient browsing)

3.) Prototypes
I chose to flesh out my 2nd and 3rd ideas; the 3rd I’ve affectionately called “Nutshell.”
Princeton Trivia Network:
– Place large screens all over campus which display various photos and trivia about Princeton
– Play Jeopardy-like quiz games with a few contestants, possibly all over campus
– Waiting time is spent connecting with the school and cheering on friends

I chose this idea to prototype because it seemed like the kind of fun thing that would make people feel more connected to the University.

Nutshell:
– Bring important information into one mobile app:
– Schedule (list view)
– To-do list
– News headlines (with links to relevant stories on mobile site)
– Weather (linked to weather.com or similar service)
– Facebook and twitter feeds (linked to respective apps)
– Upcoming Important dates (From University calendar)

I chose to prototype this idea because I know it’s something I would use, and it also seems to be a recurring theme in software development (iGoogle anyone?) so it felt like a natural starting point.

4.) Prototype pictures
Nutshell:
Home_Screen

List view of calendar

View of an event when selected

To do list

News Headlines

Headlines link to relevant mobile websites

Weather from an online source

Clicking on the weather will redirect to the web

Recent activity from Facebook and Twitter

Selecting a story goes to that story's app

Lists important dates

Dates link to princeton.edu

My roommate pressing stuff

Princeton Trivia Network:

Shows random Princeton trivia. Changes every minute or so.

Asks people the answer to trivia questions on a timer.

Tells them if they got it right or wrong relative to others.

5.) Usage Observations
Nutshell: User testing went very smoothly. There was almost no confusion about anything, and on the whole, it seemed like a pleasant experience for everyone involved. There were two unexpected events:
– I envisioned using a swiping gesture between categories, but no one picked up on that.
– One tester swiped the home screen with his whole hand. It was weird.

Princeton Trivia Network: Again, everything was pretty straightforward for the users. There are only six buttons in the whole interface, so it was easy to figure out. I got a few “Really? THIS is what you came up with?” looks. In retrospect, I have to agree with them.

6.) Insights
The quotations below are from user testing.
General:
– Buttons are quite intuitive. Swiping is less so.

Nutshell:
– Weather by the hour would have been nice
– There needs to be a back button. Always
– Adding a reminder/alarm feature to the calendar would be helpful
– People value consolidation
– “Where are the games?” People want to be entertained while they wait?
– People want to easily “flip” from one information source to the next

Princeton Trivia Network:
– Not so great…
– “Cute”
– Maybe the answers could “lead to another game”
– “You should tell people how many people got it wrong.”
– “You should have different levels, like easy, medium, and hard.”

Individual Design Exercise

Observations

My COS583 professor arrives at the classroom ten minutes before class. She prepares for the class by writing everything she wants to use during class discussion on the board, copying the diagrams and illustrations from her notes.

To prepare for class, my ENG321 preceptor has several pages of printed notes on where to lead the precept in conversation. Before class, she reviews her notes and greets everyone as they come into class. She almost always seems very prepared and only gives her notes a very cursory glance.

Before lecture in COS 436, very few people are reviewing notes. The majority of people, including one senior I observed, simply try to kill time by talking to the people sitting next to them or browsing the internet. This senior in particular was reading TechCrunch, Facebook and Reddit. A few people pull up assignments from other classes or prepare their word processor for the lecture.

That’s a big change from a precept style class. In COS583, students are asked to read a seminal paper of computer science and discuss it in class. Before class, half the participants (including another senior I observed) will flip through the assigned paper and review the highlighting/notes they scribbled on the side of the paper. Some of these people are reviewing it to be studious and others simply haven’t read the paper and are trying to catch up. The other half of the class will kill time by talking to others or browsing the internet.

Talking to students who review their notes before class reveals that they often prepared for class a long time ago. Some of them read the paper a couple days back and need a refresher on the material.

Another interesting observation: students who come late have a harder time than you would expect finding a seat with certain classroom layouts. If either the layout was improved or if students were encouraged to sit more efficiently, this problem would be avoided.

Brainstormed ideas

1. Attach sensors and lights to seats to make sure students choose the most efficient seating pattern in lecture.
2. A platform for prepared students to sell summaries of readings to unprepared students.
3. Placards that allow students to find seats more easily and that allow the teacher to more adequately control discussion.
4. Integrate word processors inside desks to make sure no students are using their computers for the wrong purposes.
5. Allow teachers to upload a layout to a board with the click of a button
6. Allow student computers to take screenshots of what will be on the board/slides before class
7. A door to the lecture hall that locks from the outside immediately when class is supposed to start.
8. A web application that gives students a one-minute refresher of the last lecture to prepare them.
9. A mobile application that knows when you should leave for your next appointment based on the distance you must cover
10. A class instant message board, that allows students to ask questions to others or the lecturer before/during class.
11. A desktop application that will prepare your other desktop applications for class (open word processor, browser to certain tabs, etc.).
12. A mobile application that will tally up the total minutes of class missed by being late and donate money to charity proportional to that amount
13. An e-book platform that crowdsources highlighting and sidenotes.
14. A quick survey for students to show which topic they want the lecture to cover
15. A big screen in lecture that will show the faces of late students
16. Make the projector in lecture play the same games as an NBA big screen before class (find the person, kiss cam, etc.).

Ideas Chosen

In this project, I will be exploring ideas 9 and 13. Idea 9 would be very useful to me personally, because I have a very bad sense of how long it takes me to get places, and a truly smart alarm clock should take the distance to the destination into consideration. Idea 13 would be a more natural way of having a discussion in a book: as you read the content, you see what other people thought was important and what insights they had.

SmartGo

For idea 9, I envision an app named SmartGo. SmartGo starts off very simple and allows users to import calendars from different sources (Google, Apple, etc.). These events are then visible in a simple list “home screen” that is the default view for the app.

Each event can be opened by touching its row in the list. Once in the event view, users can edit the information of the event and when they should be reminded to leave. Users can drag/tap the map to change location or click the magnifying glass to search for a completely different map segment.

Adding an event is very simple – only three fields are needed in the beginning: time, description and location. Once a location is entered, a map is generated to allow the user to specify the exact destination of the event.

The app routinely checks in the background to see the user’s location, and once it realizes that the user needs to leave to arrive at the destination on time, it sends a push notification. Simple map APIs would allow the app to accomplish this.
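A minimal sketch of that background check, assuming the travel time would come from a maps API (the function and all of its parameters are hypothetical, not part of the envisioned app):

```python
from datetime import datetime, timedelta

def should_notify(event_start: datetime, now: datetime,
                  travel_seconds: float, lead_minutes: int = 5) -> bool:
    """True once it's time to leave: the travel time plus the user's
    'arrive early' lead (an assumed setting) fills the remaining gap."""
    leave_at = event_start - timedelta(seconds=travel_seconds,
                                       minutes=lead_minutes)
    return now >= leave_at
```

For example, for a 2:00 event that is a 15-minute walk away with a 5-minute lead, the push notification would fire at 1:40.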

CrowdNotes

CrowdNotes is a simple browser-based eBook platform that allows users to share eBooks with groups and lets others comment on and highlight them. The home screen is very simple and consists of only an upload button. Ideally, the uploader would accept various eBook file extensions as well as PDFs.

Next, users can share the file with their friends by clicking the share button in the top bar. Ideally, users could share via email addresses or social media, or even by giving the URL away. Visitors can see the notes of previous readers and the most-highlighted phrases within the book.

When users highlight a portion of the text with their cursor, a tooltip appears with two options: sidenote and highlight. Sidenotes appear in the margin of the screen once inputted and highlights are simply highlights.

User testing

For the most part, each of the three testers enjoyed the apps they saw. The majority of their confusion resulted from my poor handwriting, but the “flow” of both GUIs seemed to work fine. Several of them requested new features to make the apps more useful to them.

Our first tester enjoyed the SmartGo app, but believed it would be arduous to update the app regularly. While she appreciated that the application would have live updates from other calendar applications, she did not like that she needed to choose parameters like “mode of transportation” every time. She requested that the app’s settings page allow her to choose defaults like “remind me so I will get to my events five minutes before starting time” and “I have a car that I can use to get there”.

This tester also liked CrowdNotes, saying “it could be useful for reading-heavy classes”. However, she had some concerns about the usability of the app once a large number of people were sharing one document. She wants to see which person made which comment, in order to judge its relevance and the reliability of the commenter. Highlighting with the cursor was also not immediately obvious, so maybe a first-run tutorial would be useful.

Our second tester is a designer herself. She appreciated the purpose of SmartGo but commented that she “never really uses calendars anyway”. The biggest insight I got from this test was that the dynamically updating add-event view was confusing to some, because the submit button only appeared once the user had filled out the other fields. Users want to immediately understand how something will work, and having the submit button visible, even before the submission is valid, reassures them.

This user also had scaling concerns about CrowdNotes, asking me how text would appear if multiple people highlighted the same passage. There are only so many colors to assign, so perhaps there needs to be a better way to see the most-highlighted text. Her suggestion was to highlight only the most highlighted text overall; when a user clicks it, they can see each user who highlighted it.

Our last tester is an avid calendar user and appreciated the purpose of SmartGo. The import functionality was complimented because this tester uses Google Calendar religiously. They asked whether they could add the location of an event via Google Calendar, so they could avoid using the app as much as possible; I didn’t know the answer to that question. Alternatively, this tester suggested that SmartGo could sync with Google Calendar so that updates could go in both directions. This tester had difficulty choosing when the application should remind them about an event and figuring out how to edit those options.

This tester had the same problems with highlighting the text in CrowdNotes. It was not evident that highlighting with the cursor would lead to comments, and that could be made more easily understandable.

Insights

For SmartGo, I had the following insights:

  • Default settings are important. The app should take into consideration whether a user has a car, whether they like taking public transportation and how early they like to go to events.
  • It is important for a user to clearly understand the flow of submissions – hiding the submission button until the submission is valid is an easy way to confuse users.
  • The app shouldn’t just import from other calendar applications – it should sync with them to be more useful.
  • “Tap to edit” should be written somewhere if it’s true.
  • My handwriting is terrible.

For CrowdNotes, I had the following insights:

  • An actual user authentication system would be useful to more accurately share eBooks with others. Sharing just a URL may not be private enough.
  • It is difficult to show how many people highlighted an area given how few colors can be used. A number near the highlight could convey this, or even better, the highlight could grow brighter as the ratio of readers who highlighted it increases.
  • Some people highlight the dumbest things. Eventually, with enough users, everything could end up highlighted. It would be good for the application to discern which highlights are actually valuable.
  • Making the platform browser-based would reach the most users, but platforms with touch capabilities – tablets, phones, etc. – would provide a better user experience.
  • The margins will get cluttered from the number of sidenotes. A reputation system, or Facebook integration that weights friends’ submissions heavily, could increase relevancy.
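The brightness idea above can be sketched as a simple mapping from the fraction of readers who highlighted a passage to an alpha value. This is a hypothetical helper for illustration, not part of CrowdNotes as built; the 40/255 floor is an arbitrary choice:

```cpp
#include <algorithm>

// Map the fraction of readers who highlighted a passage (0.0 - 1.0)
// to a highlight alpha between a faint floor and full opacity, so
// widely highlighted text stands out without exhausting the palette.
int highlightAlpha(double ratio) {
    ratio = std::min(1.0, std::max(0.0, ratio));  // clamp to [0, 1]
    const int floorAlpha = 40;                    // faintest visible highlight
    return floorAlpha + static_cast<int>((255 - floorAlpha) * ratio);
}
```

A passage highlighted by everyone renders fully opaque, while a passage highlighted by a single reader stays faint but visible.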

I was pleasantly surprised at the quality of feedback received. The majority of collected responses resulted in actionable information to make the products more intuitive and easily understood.

L1 – Name Redacted

Group Members:
Brian Matejek
Joshua Prager
Matt Dolan
Ed Kelley

Group Number: 20

Photo Sketches:

photo1

This is a picture of a memory game with different sensors. The Arduino would tell the user what order to interact with the sensors by turning on and off LED lights associated with each sensor.

photo2

This is a flex sensor connected to an Arduino that allows users to measure finger strength. Alternatively, one could create a game by trying to match the sensor output with a randomly generated number.

photo3

Our third idea is to create a game of pong with different sensors. Competitors use the sensors to move the paddles up and down the screen.

photo4

Our last idea is to create some mechanism for students to provide real time feedback for teachers.

What We Built and Why:
We built an education utility that allows the students of a class to provide feedback, through potentiometers mounted in their desks, to a teacher or professor. This feedback is then amalgamated into a single graphic that lets the professor get a reading of the class at a glance. Our project was definitely successful: we were able to produce a reasonably good graphical display from three simulated “students.” We also added lifetime min and max lines that help the teacher gauge the current confusion level of the class against the extremes. Going forward, we think there is huge potential for the idea. Possible extensions would be to allow students to provide different types of feedback, such as engagement in addition to confusion. The real opportunity is in how we could improve the display of the information to the professor. We currently use a moving graph over time, but extensions like time decay or even adaptive learning algorithms might help the professor get a better snapshot of the class at a glance.
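One way to realize the “time decay” extension mentioned above is an exponential moving average over each student’s knob readings, so recent samples dominate and stale ones fade. This is a sketch under the assumption that readings arrive as 0–1023 integers; the weighting factor is a tunable placeholder:

```cpp
// Exponentially decaying average of a student's confusion readings.
// alpha is the weight given to the newest sample (0 < alpha <= 1):
// higher alpha reacts faster, lower alpha smooths more.
double decayedAverage(double previous, int newReading, double alpha) {
    return (1.0 - alpha) * previous + alpha * newReading;
}
```

Calling this once per serial line, per student, before plotting would smooth knob jitter while still tracking genuine shifts in confusion.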

Storyboard:
photo5

photo6

photo7

photo8

In use:
photo9

Feedback knobs that students control

photo9a

Overall setup

photo9b

Display shown to teacher. Each color corresponds to the sensor readings from an individual student. The two red lines are the maximum and minimum levels of confusion in the frame.

Video

List of All Parts Used:
1 Arduino
Assorted Wires
1 Breadboard
3 Potentiometers

Directions:
The setup for our device is fairly simple. Take a potentiometer and connect the left pin to 5V, the middle pin to analog input A0, and the right pin to ground. Repeat this with two more potentiometers with their middle pins connected to analog inputs A1 and A2 respectively. The rest of the setup is done in software.
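The Arduino-side sketch is not listed below; one minimal version that produces the format the Processing program expects (three space-separated analog readings per line, at 9600 baud) might look like the following. This is an untested reconstruction; pin assignments follow the wiring described above:

```cpp
// Read three potentiometers and print their values on one line,
// space-separated, so the Processing sketch can split on " ".
void setup() {
  Serial.begin(9600);  // must match the rate used in the Processing code
}

void loop() {
  Serial.print(analogRead(A0));
  Serial.print(" ");
  Serial.print(analogRead(A1));
  Serial.print(" ");
  Serial.println(analogRead(A2));  // println supplies the '\n' delimiter
  delay(10);  // small pause so the graph does not scroll too quickly
}
```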

Code:

// Graphing sketch
//Adapted from http://arduino.cc/en/Tutorial/Graph 
 
 import processing.serial.*;
 
 Serial myPort;        // The serial port
 int xPos = 1;         // horizontal position of the graph

 int zero_pos;
 int one_pos;
 float max_pos;
 float min_pos;


 
 void setup () {
	 // set the window size:
	 size(1080, 720);
	 max_pos = height; 
	 min_pos = 0;       

	 // List all the available serial ports
	 // println(Serial.list());
	 // I know that the first port in the serial list on my mac
	 // is always my  Arduino, so I open Serial.list()[0].
	 // Open whatever port is the one you're using.
	 myPort = new Serial(this, Serial.list()[0], 9600);
	 // don't generate a serialEvent() unless you get a newline character:
	 myPort.bufferUntil('\n');
	 // set inital background:
	 background(224,228,204); 
 }
 void draw () {
 	// everything happens in the serialEvent()
 }
 
 void serialEvent (Serial myPort) {
	 // get the ASCII string:
	 String inString = myPort.readStringUntil('\n');

	 if (inString != null) {
		 // trim off any whitespace:
		 inString = trim(inString);

		if (inString.length() != 0) {
		 String[] sensor_strings = inString.split(" ");

		 if (sensor_strings.length < 3) {
		 	println("RETURN");
		 	return;
		 }

		float inByte = float(sensor_strings[0]); 
		inByte = map(inByte, 0, 1023, 0, height/3);

		float yPos = height;
		// draw the line:
		stroke(105,210,231);
		line(xPos, yPos, xPos, yPos - inByte);
		yPos -= (inByte + 1);

		inByte = float(sensor_strings[1]); 
		inByte = map(inByte, 0, 1023, 0, height/3);

		stroke(167,219,216);
		line(xPos, yPos, xPos, yPos - inByte);
		yPos -= (inByte + 1);


		inByte = float(sensor_strings[2]); 
		inByte = map(inByte, 0, 1023, 0, height/3);

		stroke(250, 105, 0);
		line(xPos, yPos, xPos, yPos - inByte);


		if ((yPos-inByte) < max_pos) {
			max_pos = yPos-inByte;
		}
		if ((yPos-inByte) > min_pos) {
			min_pos = yPos-inByte;
		}
		drawMax(max_pos);
		drawMax(min_pos);


		 // at the edge of the screen, go back to the beginning:
		 if (xPos >= width) {
			 xPos = 0;
			 max_pos = yPos-inByte;
			 min_pos = yPos-inByte;
			 background(224,228,204); 
		 } 
		 else {
		 // increment the horizontal position:
		 xPos++;
		 }
		}
	 }
 }

 // draws a red marker dot at the given y position
 // (used for both the max and the min line)
 void drawMax(float y_pos) {
 	stroke(255, 0, 0);
 	ellipse(xPos, y_pos, 2, 2);
 }

Future Development:
In the future we would like to have the student sensors interact with the teacher’s computer without connecting wires. Each student would also have multiple knobs so that the students can express different emotions.

Team Colonial – L1

Team Colonial

Group Members: David Lackey, John O’Neill, Horia Radoi

Group Number: 7

Short Description

We built a physical interface (with potentiometers) to change the RGB values of a window filled with a single solid color.  RGB values can be tricky to understand.  Providing physical knobs to adjust them with immediate visual feedback allows the user to better grasp the concept of RGB values.  We feel that our project was successful and we really liked how Processing allowed us to give such quick feedback.  One of the main drawbacks of our project is that a user can only accurately control 2 potentiometers at a time, making it really difficult to affect all three RGB values simultaneously.

Sketches

RGB Interface

photo 1

This sketch involves our physical interface for adjusting the RGB values of a solid color, displayed via laptop.

Cup Temperature Indicator

photo 2

This sketch involves an LED setup mounted on the side of a cup.  A thermistor hangs just barely into the cup through a small hole in the lid.  The information from the temperature sensor is passed to the Arduino, which then powers certain LEDs based on the temperature of the inside of the cup.

Rooster

photo 3

This sketch involves a Photo Sensor attached to a window to get information about the lighting outside of the user’s room.  If the light outside reaches a certain threshold (daylight), a buzzer connected to the Arduino will go off, waking the user.

Storyboard

photo

System in Action

THE RGB PANEL (video)

Parts Used

  • Arduino, wires, USB cord, breadboard
  • 3 Rotary Potentiometers

Creation Instructions

  1. Add three rotary potentiometers to a breadboard
  2. Add power to the rotary potentiometers, connect them to ground, and connect each one to a separate analog input
  3. Read potentiometer values and convert them to RGB values
  4. Use Processing to display a window filled with a solid color
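Step 3’s conversion can be checked in isolation. The helper below mirrors the expression the group’s Arduino code uses, inverting a 0–1023 knob reading into a 0–255 color component; the function name is ours, added for illustration:

```cpp
// Convert a raw potentiometer reading (0-1023) into an RGB component
// (0-255). The value is inverted, matching the Arduino expression
// 255 - (255 * (analogRead(pin) / knobMax)), truncated to an int.
int knobToRgb(int reading) {
    const double knobMax = 1023.0;
    return static_cast<int>(255 - 255 * (reading / knobMax));
}
```

A reading of 0 yields full intensity (255) and a reading of 1023 yields 0, so turning the knob one way dims the channel smoothly.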

Source Code

Arduino

/* RGBKnobs */

// pins for knobs
int rPin = 0;     
int gPin = 1;
int bPin = 2;

// the analog reading from the knobs
int rReading; 
int gReading;
int bReading;

double knobMax = 1023.0;

void setup(void) {

  // We'll send debugging information via the Serial monitor
  Serial.begin(9600);   

}

void loop(void) {

  rReading = 255 - (255 * (analogRead(rPin) / knobMax));  
  gReading = 255 - (255 * (analogRead(gPin) / knobMax));
  bReading = 255 - (255 * (analogRead(bPin) / knobMax));

  Serial.print(rReading);
  Serial.print(","); 
  Serial.print(gReading);
  Serial.print(",");
  Serial.print(bReading);
  Serial.println();
}

Processing

/* RGB Knobs, created by Lab Group 7 (jconeill, dlackey, hradoi) */

 import processing.serial.*;
 Serial port;
 // rgb values
 int[] vals = {0,0,0};

 void setup() {
   size(555, 555);

   println("Available serial ports:");
   println(Serial.list());

   port = new Serial(this, Serial.list()[4], 9600);  // index 4 worked on our machine; pick your Arduino's entry from the printed list
   port.bufferUntil('\n');
 }

 void draw() {
   background(vals[0], vals[1], vals[2]);
 }

 void serialEvent(Serial port) {
   String in = port.readStringUntil('\n');

   // only parse complete lines
   if (in != null) {
     in = trim(in);
     vals = int(split(in, ","));
   }
 }

Lab 1 – Intuitive Computer Navigation

Group Members: Abbi Ward, Dillon Reisman, Prerna Ramachandra, Sam Payne

Group Number: 9

Ideas and Sketches:

1. Etch-A-Sketch – a version of the famous Etch-A-Sketch game using 2 rotational potentiometers

EtchASketch

2. Intruder Alert – An alert if someone opens the door to your room while you’re asleep using photocells and a potentiometer

CreeperInTheRoom

3. Intuitive Control for the Computer – Use a finger-flicking movement to quickly minimize windows on the computer.

FlexFinger

We decided to select Idea number 3, since it seemed the most intuitive and feasibly implementable for the lab.

Project Description:

For L1, we built a glove that allows the user to navigate the Internet more naturally. For this implementation we sought to enable a user to do a simple “flick” gesture to minimize the browser quickly for privacy. Currently, the process of minimizing all windows at best requires the user to enter a key macro on their keyboard. The problem with this is that when surfing the Internet the user is not always engaged with the keyboard. This glove uses a flex sensor on the user’s non-dominant index finger so the user does not have to inefficiently engage the keyboard in the event that they have to close their browser quickly. The difficulty with this project was that the Arduino Uno does not have a native API to override the keyboard. To accomplish the task of associating a flex sensor with the keyboard macro to minimize all windows, we used Processing and had the signal indicating a flick written to a file that is simultaneously read by a Java module. Java’s “Robot” class allowed us to trigger the keyboard macro we needed to minimize all windows. We are very pleased with the final result: our Java module could be adapted to implement even more functionality by simply triggering different keyboard macros given different signals from Processing. In the future, however, we would like to find a more efficient way to get the Arduino to interact with the computer than this write/read handoff between Processing and Java.

Project Storyboard

Frame 1

storyboardf1

Frame 2

storyboardf2

Frame 3

photo

Frame 4

photo (1)

Project Schematic

Photo Feb 23, 2 48 58 PM

Demo Video

List of Parts Used

  • Software
    • Arduino
    • Processing + our keyeventsprogram program
    • Java + our KeyboardInteractor.java
  • Hardware
    • flex sensor
    • Arduino
    • 10 kOhm pull up resistor
  • Other
    • glove (to hold the sensor)

Instructions to Recreate Design

  1. Follow the given schematic.
  2. On the Arduino, upload StandardFirmata (in the Arduino IDE’s example sketches, under Firmata) to enable the Arduino to work with Processing.
  3. Go to https://github.com/pardo-bsso/processing-arduino to get an arduino-processing interface library with fixed bugs.
  4. Open our Processing code (keyeventsprogram), our Java code (KeyboardInteractor.java), and a command line. You will need to adjust the pathnames stored in the files for your own computer.
  5. With the circuit all hooked up, start the Processing program. A small gray box should pop up.
  6. At the command line, type “tail -f desktop.txt | java KeyboardInteractor”. This takes the output of the Processing program (stored in desktop.txt) and pipes it to our Java program. Now, when you interact with the sensor (i.e. by flicking your finger), the shortcut should fire (e.g. Windows+D takes you to your desktop).

Source Code

Code for Arduino

/**
* Names: Sam Payne, Abbi Ward, Dillon Reisman, Prerna Ramachandra
* Dates: 2/22/13
* COS 436, L1
*
* NOTE: since Robot objects are not allowed due to graphical
* interference, this program interfaces with an external application
* to trigger desktop results
*/
import processing.serial.*;
import cc.arduino.*;
Arduino arduino;
int desktopPin = 0; // pin reads from flex sensor
int desktopReading; // reading from flex sensor pins
PrintWriter output; // write to file when triggered
//Parameters for sensor reading
int FLICKTOP = 280; //threshold to trigger flick motion
int desktopCounter = 0; //number of times which flick has triggered
//setup
void setup()
{ 
 arduino = new Arduino(this, Arduino.list()[0], 57600);
 //arduino.pinMode(ledPin, Arduino.OUTPUT); // disabled: ledPin is not declared in this sketch
 //create pipe file for Robot helper to read
 output = createWriter("C:/Users/Abbi/Programming/hcilab/src/desktop.txt"); 
}
void draw()
{
 // for this prototype only perform the shortcut once
 if (desktopCounter == 0) {
 desktopReading = arduino.analogRead(desktopPin);
 //output.println(desktopReading);
 }
// if above the threshold, then go to the desktop
 if (desktopReading > FLICKTOP) {
 desktopCounter++;
 desktopReading = 200;
 //Send command to buffer file for robot object to interpret
 output.println("D"); 
 output.flush();
 }

 //exit the program after 5 triggers, this is a prototype function
 if (desktopCounter > 5) {
 output.flush();
 output.close();
 exit(); 
 }
 //arduino.digitalWrite(ledPin, Arduino.HIGH);

 delay(20); // delay 20 ms
}

Java Code with Robot class as a workaround (the Arduino Uno cannot emulate a keyboard directly)

/**
* Names: Sam Payne, Abbi Ward, Dillon Reisman, Prerna Ramachandra
* Dates: 2/22/13
* COS 436, L1
* Keyboard interacting program to interface with Arduino
* since robot class is not allowed due to graphics interference
*/
import java.lang.Object;
import java.awt.Robot;
import java.awt.AWTException;
public class KeyboardInteractor {
 public static void main(String[] args) {
 Robot rob; 
 //In in = new In("C:/Users/Abbi/Programming/hcilab/src/desktop.txt");
 try {
 rob = new Robot();

 //character mappings in java
 int VK_D = 68;
 int VK_WINDOWS = 524;
 int VK_BACK_SPACE = 8;

 char c;
 String line;
 while(true) {
 //c = StdIn.readChar();
 // StdIn/StdOut are Princeton's stdlib classes (stdlib.jar) and must be
 // on the classpath; java.util.Scanner would work as a substitute.
 line = StdIn.readLine();
 if (line != null) StdOut.println(line);
 if((line != null) && (line.equals("D"))) {
 //StdOut.println("Desktop Command Detected");
 //trigger keypresses
 rob.keyPress(VK_WINDOWS);
 rob.keyPress(VK_D);
 rob.keyRelease(VK_D);
 rob.keyRelease(VK_WINDOWS);
 }

 //example for additional functionality
 //with additional sensors and triggers
 if((line != null) && (line.equals("B"))) {
 rob.keyPress(VK_BACK_SPACE);
 rob.keyRelease(VK_BACK_SPACE);
 }

 }
 } catch (AWTException e) {
 e.printStackTrace();
 }

 }

}

 

Team Colonial – P1

Team Colonial:

Dave Lackey (dlackey@)

John O’Neill (jconeill@)

Horia Radoi (hradoi@)

Brainstorming ideas:

  1. (Concentration – sketched) Device that looks at what you’re browsing and shocks you / punches you when you procrastinate. Sketch.
  2. (Education) Interactive game that allows you to control things virtually that would otherwise be too dangerous (nuclear waste or dangerous chemicals).
  3. (Exercise) A game involving lasers that force you to exercise by performing quick foot movements in order to win.
  4. (Health) Laptop stand that raises up after a certain amount of time so that you have to stand (relieves back pain). Sketch.
  5. (Leisure) Beverage launcher controlled by voice.
  6. (Music) Virtual DJ board controlled by hand gestures.
  7. (Navigation – sketched) A device that taps your shoulder when you should turn right or left (vehicle or on-foot navigation).  Could possibly integrate with voice navigation. Sketch.
  8. (Other) Coding through gestures and voice (in addition to keyboard).
  9. (Other) Device that helps you wake up by giving you a challenging set of physical and virtual tasks to turn off its alarm mechanism.
  10. (Other) Pads under your sheets that slowly vibrate to wake you up. Sketch.
  11. (Robotics – sketched) A robot who crawls across a chalkboard to clean it when activated. Sketch.
  12. (Accessibility) Heads Up Display – Attached to your glasses and connected to your phone through bluetooth, displays information about callers, number of unread texts and/or emails. Sketch.
  13. (Health) Barcode reader for fridge – lets you know when your food is going bad.
  14. (Accessibility) Round touchscreen phone, that can be worn on your forearm.
  15. (Entertainment) A walking FURBY® toy, that can follow you around when it is hungry.
  16. (Accessibility) Local GPS/WiFi signal generators that can be attached to important, easily misplaced items (phone, keys, prox, backpack) in order to find them more easily.
  17. (Accessibility) Turning any display into a touchscreen by using a glove with an emitter attached to the fingers, which transmits a signal whenever it senses pressure (touching a surface), and 4 receivers placed on the corners of the display, which pinpoint the position of the fingers.
  18. (Accessibility) Number 17 for generating 3D points by pressing a button and drawing in mid-air (would require the computer to see the generated image/series of points)
  19. (Lifestyle) Travel mug with a heater incorporated, that heats your beverage whenever it gets too cold. Sketch.
  20. (Lifestyle) Backpack with a solar panel incorporated, which can recharge your phone or iPod.
  21. (Autonomous Vehicle) Cars that park themselves in a designated parking lot.
  22. (Autonomous Vehicle) On-campus golf cart taxi service for injured athletes – fleet of self driven golf carts that pick up and drop off injured students in an efficient way.
  23. (Lifestyle) Breathalyzer car start – you can’t start your car unless your BAC is in normal limits.
  24. (Lifestyle) Roomba/Helicopter that brings you a glass of water when you are in bed (autonomous or pre-programmed).
  25. (Lifestyle) Polarized TV programs – two different programs running on the same device, but on different polarizing streams – you need a pair of glasses to watch your own show.
  26. (Lifestyle) Equipment that prepares a set breakfast when you wake up (start when you hit the alarm button on the alarm clock)
  27. (Lifestyle) Shampoo estimator – dispense enough shampoo based on the length of your hair
  28. (Lifestyle) Automated TShirt folder – give it crumpled Tshirts and it outputs them neatly folded.
  29. (Education) Piano assistant – load a music file and it teaches you how to play it on the piano, by lighting up the key you need to press.
  30. (Education) Number 29 for guitar – because guitars are cool.
  31. (Lifestyle) Device that changes the song on your iPod based on the intensity of your workout (mostly for running) or based on blood pressure. Sketch.
  32. (Lifestyle) Device that detects a song you dislike on the radio and replaces it with a song from your iPod (also serves as an iPod charger)
  33. (Lifestyle) Glasses that light up if alcohol is detected in a drink
  34. (Entertainment) Racing game controllable with your steering wheel and car levers, for when you have to wait in the car for long times.
  35. (Lifestyle) Radar that detects how far you are from the car in front of you/ Alerts you when it has moved far enough – for stop-and-go traffic
  36. (Lifestyle) Camera incorporated in your eyeglasses, that can take an exact picture of what you can see.
  37. (Accessibility) Automatic page turner when eyes reach last element on page
  38. (Health) Sensors that detect improper posture and deliver feedback – vibrations where posture is incorrect.
  39. (Health) Use kinect to train user into proper lifting / workout form.
  40. (Health) Use kinect to detect if people are stalking you / behind you.
  41. (Health) Toothbrush that tells you if you’ve missed any spots
  42. (Lifestyle) Helping blind people shop for clothes, or just choose clothes in the morning
  43. (Lifestyle) Glasses with facial recognition so you never forget someone’s name; this also helps those who have trouble recognizing faces, such as people with Asperger’s
  44. (Entertainment) Glasses that read the depth of objects in front of you and recreate relative depth on a surface that can be felt (like a pinpression) – this way, a blind person could “feel” the objects and space in front of them (e.g. the way performers move left/right and forward/back on a stage)
  45. (Music) Using nerf guns as musical inputs (place in space alters note)
  46. (Lifestyle) Using myoelectric sensors to help amputees control machines (e.g. something that controls a paintbrush or prepares dinner)
  47. (Music) Rockband using flex sensors – someone plays air guitar or air drums, and the sensors on their limbs translate motion into a sound
  48. (Lifestyle) Travel mug with an LED that displays how hot a beverage is.
  49. (Lifestyle) Stereo cameras for glasses that can approximate distances.
  50. (Kinect) TV remote control using gestures.
  51. (Kinect) Martial arts/Krav Maga form/posture corrector in real time
  52. (Kinect) Light art – Use a Kinect controller to detect the position of a digital spray can and, based on its position, back-project a graffiti drawing generated by the user. Sketch.

Idea Choice Explanation

For our project we are going with sensors that detect improper posture.  People who monitor their health and have seating posture issues could benefit from this.  A possible interface could be a simple mechanism to set a desired posture, and then an alert mechanism to notify the user when they stray from that posture.  To reach this idea, the group members each picked his favorite idea and polled third parties (students outside of the class) about their favorites.

Check out our sketch here.

Project Description

Our target users are people who sit in chairs, are interested in monitoring their health through the use of technology, and have issues with correct posture and/or back pain.

It’s not too difficult to find out what good seating posture is.  Most people are probably already aware of it.  However, people almost inevitably end up twisting their body frames into unhealthy positions.  If people are made aware that they’re straying from good posture, we believe it will help them stop.  For example, if they are slumping in their chair, the product may alert the user to the mistake or penalize them in some virtual way.  This is a product that could be used during long periods of sitting.  Another important thing to realize is that users are often sitting in a chair and in front of a screen, which makes the screen a perfect medium for relaying information about their posture.

One benefit of using flex sensors is that they can easily bend to any bodily contours.  A preliminary idea involved creating a chair with properly placed force sensors, but the arrangement of sensors would have to be reconfigured for each individual using it, unlike flex sensors, which adapt to the shape of the user.  Additionally, flex sensors are portable, allowing the product to be useful in other areas of life, such as maintaining good weight-lifting posture.
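The “set a desired posture, then alert on deviation” interaction described above reduces to a calibration value plus a tolerance band around it. A minimal sketch of that check (the function name and the example numbers are ours; real flex-sensor readings would need tuning):

```cpp
#include <cstdlib>

// Returns true when a flex-sensor reading strays from the calibrated
// "good posture" baseline by more than the allowed tolerance,
// i.e. when the user should be alerted.
bool postureAlert(int reading, int baseline, int tolerance) {
    return std::abs(reading - baseline) > tolerance;
}
```

The baseline would come from sampling the sensor while the user holds their chosen posture during setup, and the tolerance controls how forgiving the alerts are.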