Prakhar Agarwal (pagarwal), Colleen Carroll (cecarrol), Gabriel Chen (gcthree)
We are developing a glove that uses a variety of sensors to detect off-screen hand movement and link these to a variety of tasks on one’s cell phone.
When obtaining informed consent, our first priority was to make sure that users had the time and were willing to participate in our prototype testing. Additionally, we made sure that the users were okay with appearing in a video or picture to be published on the blog. We also gave each user a consent form (http://goo.gl/oYzug) to look over, and overall it was a smooth process. There wasn’t much else to warn users about for our testing, so we feel that a verbal and visual description of it was sufficient. We paraphrased from the following script: http://goo.gl/rjlmu
Our participants were selected by surveying a public area and looking for people who seemed to have free time to participate in the study. We happened to choose one person in the class, but also managed to find two strangers. All participants fell into our target group of people who use or have used their phones while walking outside.
Testing was conducted at a table in Frist. We used the same low-fidelity prototype we had built in the previous assignment, and had users try it on in order to conduct our tests. We mounted the paper prototype of our UI on one of our phones for ease of interaction. The phone was also used to simulate one of the tasks. This way, we were able to achieve a realistic feel of interacting with a phone while using our prototype.
For the testing procedure, Prakhar was the Wizard of Oz, performing on the phone all the actions that users triggered with our prototype. Both Gabe and Prakhar paraphrased the scripts and informed users about the tasks they would be doing. Colleen observed and was primarily in charge of taking notes on the interactions between the user and the system. She also called the phone during the task where the user had to answer a phone call. Gabe recorded a few videos and took pictures throughout testing.
After demoing key features of the system, we presented tasks to our users. We chose to give our users the tasks in order of increasing difficulty, so they could grow accustomed to the system. The first task was simply to answer a phone call and then hang up using built-in gestures. The second task was interacting with the music player using built-in gestures. The third task was by far the most difficult, and involved setting a string of gestures as a password using the user interface. See the scripts for details: http://goo.gl/F6OPY
A User on the Setup Screen
User Testing the Music Player
Summary of Results and Most Catastrophic Incidents
The most glaring critical incidents from our testing, with all three users, occurred in the setup screen. First of all, users did not understand the context in which the gloves would be useful from the information in our prototype alone. Instead, we had to interrupt our testing each time to explain it because they were too confused to move forward otherwise. Secondly, all of the users misunderstood how to use the password setting screen. They were not sure which buttons to press, in which order, and what they were going to achieve by the end of the task.
Other than the setup screen, all of the users had issues with the gesture for picking up a phone call. Two of the users found it awkward to do the setup with their non-dominant hand while the glove was on their dominant hand. All three held up both the phone and their gloved hand in the shape of a phone at the same time, which was not necessary. One user actually tried to speak into the glove instead of their phone; the gesture is intended only as a replacement for pressing the answer key, so the user should still talk into the actual phone. With the music player, two of the users tried to use the gestures for forward and back in the direction opposite to what was intended. Finally, one of our users could hardly refrain from repeating how stupid the gestures were during testing.
Discussion of Results
Judging by the catastrophic failure of our setup screen we need to thoroughly rethink our design for introducing a user to the system. It is clearly a very new idea that does not even make sense without the “cold weather” context, and this needs to be conveyed more clearly, perhaps through a demo video showing the system in use. The process for setting up a password also needs to be redone with more explanation and/or a more intuitive UI. Our original design seemed to be too cluttered and users were not able to discern the step-by-step process to setting up a new password.
It seems that our initial choice of gestures will require more user testing to get right. First, many of the users laughed at or felt embarrassed by the gestures when they first tried them. Referring to our “rock out” gesture for playing music, one user actually asked, “What if I want to play a mellow song?” It was also brought to our attention that pausing the music player might be mistaken for a very dorky high-five. The gestures overall need to be more discreet. In addition, we will have to be careful not to choose gestures that change the conventions of a familiar motion, since users may fall back on the old convention: for example, lifting a hand in the shape of a phone and trying to talk into the glove.
Plan for Subsequent Testing
As discussed above, while we validated the general usefulness of our system to certain users, we also uncovered a number of issues through the testing procedure. One recurrent problem stemmed from the fact that the prototype had users wear the gesture glove on their dominant hand, leading to confusion when using the phone in the other hand. We identified two solutions to this problem: first, we should have users wear the smart glove on their non-dominant hand, and second, it would be useful to have users watch an introduction video before they initially set up their glove. It would definitely be fruitful to conduct lo-fi testing once again so that we can gauge whether these changes make the system more intuitive to use.
It would also be useful to once again conduct lo-fi testing for the application used to set one’s password. The way we laid out the buttons (i.e., having both a “Set” and an “Edit” button visible at all times) made the interface quite confusing. Based on user feedback, we have discussed some simple ways to make the interface easier to use, but the fact that having a series of hand gestures act as an unlock password is an entirely new concept makes it quite difficult to represent in paper prototyping. It may actually be useful to quickly code up a dummy application that implements just the user interface and have users test it as a simple prototype.
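Since the gesture-sequence password is the core concept such a dummy application would need to model, here is a minimal sketch of how it might represent setting and checking a password. This is only an illustration, assuming hypothetical gesture names and length limits rather than our final gesture set:

```python
# Hypothetical gesture vocabulary -- placeholder names, not our final set.
RECOGNIZED_GESTURES = {"fist", "open_palm", "phone_shape", "rock_out", "point"}

def set_password(gestures):
    """Validate a sequence of gestures and return it as the stored password."""
    if not 2 <= len(gestures) <= 6:  # assumed length limits for the sketch
        raise ValueError("password must be 2-6 gestures long")
    for g in gestures:
        if g not in RECOGNIZED_GESTURES:
            raise ValueError("unrecognized gesture: " + g)
    return tuple(gestures)

def check_password(stored, attempt):
    """Unlock only when the attempted sequence matches exactly, in order."""
    return tuple(attempt) == stored

password = set_password(["fist", "rock_out", "open_palm"])
print(check_password(password, ["fist", "rock_out", "open_palm"]))  # True
print(check_password(password, ["rock_out", "fist", "open_palm"]))  # False: order matters
```

Even a stub like this, wired to on-screen buttons instead of a real glove, would let users walk through the set/edit flow and tell us whether the step-by-step process is clearer than it was on paper.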