Assignment 3

Group Members: Matt, Alice, Ed, Brian

What were the most severe problems with the app/site? How do these problems fit
into Nielsen’s heuristic categories? What are some suggestions to fix the UI, and
how do your proposed changes relate to the heuristic categories?

Logging in does not always work:

  • Logging in only works about half of the time.
  • This falls under H9: the site doesn’t tell you why there is an error, and there is no graceful recovery.

Help menus don’t actually exist:

  • Help menus are in Windows 98 format.
  • Most of the links crash.
  • This falls under H10.

https://dl.dropbox.com/u/49220792/score_menu.png

There are no icons anywhere

  • For example, finding quintile rank: you just have to know where it is or have someone tell you.
  • This falls broadly under H6

http://dl.dropbox.com/u/25731678/Screen%20Shot%202013-03-12%20at%204.00.10%20PM.png

No consistency on the site (Violates H4)

  • Mixture of links, tabs, and dropdowns
  • H2: Their conception of tabs is strange; clicking on a tab can bring you to a new page.

Which problems, if any, were made easier to find (or potentially easier to correct)
by the list of Nielsen’s heuristics? (That is, if we had asked you to perform a
usability evaluation of the app/site without first discussing heuristics, what might
you have overlooked?)

Help and Documentation

  • We would never have otherwise looked at the documentation.

Lack of Icons

  • It’s not something you actually think about: how much you use icons to navigate through a web page.

Did anyone encounter usability problems that seemed to not be included under
any of Nielsen’s heuristics? If so, what additional heuristics might you propose
for this type of application or site?

  • When the help just crashes or the login just doesn’t work, that is simply a bug.  Maybe add a heuristic for “just not working.”
  • Unnecessary functionality (it’s not quite minimalist design; it’s a little different): for example, you can email someone from inside SCORE, and no one ever does that.
  • Usability across different devices:  SCORE doesn’t work at all on mobile or tablet devices.

What might be some useful class discussion questions—or final exam questions—
related to heuristic evaluation?

  • Why is recognition vs. recall important?
  • What are some examples of “matching between the system and the real world”?
  • What are some methods of preventing users from making errors?
  • Which of these heuristics might have the biggest impact on users?  Which is most likely to lead to high-severity problems?

PDFs of Students’ Notes:

A3: Craigslist Heuristic Evaluation

Group Members:

Xin Yang Yak (xyak@)
Gabriel Chen (gcthree@)
Peter Yu (keunwooy@)

i. Most severe problems

The most severe problems we found with the site are listed as follows, with their corresponding violations:

H2: Unrealistic names and uncommon abbreviations for discussion subforum links.
H6: Homepage is recall dependent, lacking icons for recognition.
H7: Lack of login page impedes efficient use.
H8: Discussion subforum interface is cluttered and not minimalistic.
H10: Help function uses Google site search and documentation is not thorough.

ii. Problems exposed by Nielsen’s heuristics

We wouldn’t have thought to look at the help functionality of Craigslist if it weren’t included in the list of heuristics. The problems with its functionality were exposed by the list.

iii. Usability problems beyond Nielsen’s heuristics

The site is aesthetically unappealing, which discourages users from using it. The site looks unreliable, which is not a good quality for an e-commerce site.

Possible additional heuristics should thus incorporate the site’s look and feel, and how welcoming it is toward a user.

iv. Class discussion questions

1. What are the limitations of Nielsen’s heuristic evaluation?
2. What are the shortcomings of Nielsen’s heuristic evaluation?
3. Apply heuristic evaluation to the HCI course website.
4. What is the timeframe of Nielsen’s heuristic evaluations? Will the evaluation criteria persist as technology evolves?

Links

Xin Yang: https://docs.google.com/document/d/1LwwD1IGgCtKNfSXfeZU5TNabKaD32M2Srjl3vhDrHp0/edit?usp=sharing

Gabriel: https://www.dropbox.com/s/xs801hxemaiwa5b/A3.pdf

Peter: https://www.dropbox.com/s/zpwhr9btafk3f1b/inclass.pdf

Assignment 3

Names:
Dale Markowitz
Raymond Zhong
Amy Zhou
Joshua Prager
Daniel Chyan

i.
What were the most severe problems with the app/site?
How do these problems fit into Nielsen’s heuristic categories? What are some suggestions to fix the UI, and how do your proposed changes relate to the heuristic categories?

One of the most egregious problems we found with Blackboard is its lack of naming conventions for the folders containing documents.  For example, while Blackboard has an e-reserves tab, a course materials tab, and a syllabus tab, we have known professors to interpret these tabs completely inconsistently (e.g., putting the syllabus in the course materials folder). This makes finding relevant course material extremely difficult. We believe this violates H4 (consistency and standards). We also found that H1 (visibility of system status) was violated: in particular, students cannot tell when important course material is updated, and we found an announcement about Blackboard going down seven days ago still listed on the site.  Finally, we found H8 (aesthetic and minimalist design) violated by a poor signal-to-noise ratio, as the home page is covered in somewhat irrelevant information and tools. For example, a good section of the home page is taken up by a side panel telling users how to use Blackboard, which is rarely used and really should be encapsulated in the “Help” tab.

ii.

H1 (Visibility of system status) gave us a better way to think about the top-down interface of Blackboard, starting from the front page. It made it obvious that there is a usability problem when students do not know the latest news, grades, or assignments in their courses.

H6 (Recognition rather than recall) exposed usability problems that would have been easy to gloss over after using Blackboard for multiple years. We noticed that recognition is difficult while using the Tools page (icons are nondescriptive, page is overfull, etc.) to access grades or send email. The heuristic also made it easier to observe when information was not exposed in previews, forcing students to recall the contents of a document with a given title, rather than recognizing it visually or through an excerpt.

iii.
One usability issue encountered was the presence of red herrings that linked to unhelpful pages. For example, clicking on the “Courses” tab leads to a page that contains all courses ever taken rather than a page of current courses. In addition, the prominent top Princeton logo links to the contextually unhelpful Princeton website rather than the landing page for Blackboard.

iv.
How does a physical book violate Nielsen’s usability heuristics? Use a hardbound copy of Harry Potter as an example.

Joshua Prager’s Evaluation

Dale Markowitz’s Evaluation

Daniel Chyan’s Evaluation

Amy Zhou’s Evaluation (typed from a phone, so please excuse copious typos)

Raymond Zhong’s Evaluation

 

A3 Dohan Yucht Cheong Saha

Assignment 3: Heuristic Evaluation

Group Members

 

  1. Andrew Cheong

  2. Shubhro Saha

  3. Miles Yucht

  4. David Dohan

 

Some of the most severe problems in SCORE include the lack of address validation, the inability to log in to SCORE for certain periods of time, and terse, unhelpful error messages that do not facilitate recovery whatsoever. While SCORE does ask the user to double-check that his/her address is correct, SCORE itself will not check whether the address is plausible. This error falls under H5 (error prevention): SCORE should be capable of detecting obviously bad input such as “hodge podge” with simple (or more complex) regular expressions; a sketch of such a check appears below. The inability to log in falls under H9 (help users recognize, diagnose, and recover from errors), and is further compounded by the lack of helpful messages. One proposed solution is to provide more informative error messages that give the user a way to recover, for example by listing steps the user can take to diagnose their situation.
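The following is a minimal, illustrative sketch of the kind of regular-expression check described above. We do not know SCORE’s actual technology stack, so the language (Python), field names, and patterns are all assumptions; the point is only to show that obviously bad input can be rejected before it is saved.

    import re

    # Minimal sketch of a server-side address sanity check (field names and
    # patterns are assumptions). The street pattern only requires a leading
    # house number, a word, and common address characters -- enough to reject
    # obvious junk like "hodge podge", not to fully validate postal addresses.
    STREET_RE = re.compile(r"^\d+\s+[A-Za-z][A-Za-z0-9 .,'#-]*$")
    ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")  # US ZIP or ZIP+4

    def looks_like_address(street, zip_code):
        """Return True if the input is plausibly an address, False if obviously bad."""
        return bool(STREET_RE.match(street.strip())) and bool(ZIP_RE.match(zip_code.strip()))

    print(looks_like_address("123 Nassau St", "08544"))  # True
    print(looks_like_address("hodge podge", "hodge"))    # False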

 

I think that one good final exam question might be: develop a set of heuristics that you might use to evaluate an interface. The Nielsen heuristics are absolutely helpful in parsing the components of an interface and checking that it works in a way that is helpful and useful, but if you don’t think in that way, or perhaps if you think that some of the heuristics are unnecessary or should be evaluated in a different way, you might want to use a different set of standards to judge an interface by.

 

One other suitable question that could be posed on an exam might be something like the following: some components of the heuristic evaluation are arguably more important than others in terms of the overall experience (the inability to undo in a text-editing program might be a much more egregious flaw than a bogged-down interface). Which heuristics would you consider to be the most significant or important when evaluating a couple of sample interfaces, such as:

 

  • Email client,

  • Text editing software,

  • Image design software, or

  • File explorer?

 

 

While many elements of the list of heuristics would surface while exploring an interface even without an explicit list, Nielsen’s heuristics make it more likely that you will explore parts of the program that you might not otherwise have looked at.  H9 (help users recognize, diagnose, and recover from errors), for example, makes it far more likely that we will deliberately try to break the software.  This goes beyond finding situations that confuse us, to actively trying to confuse the software.

 

H10 (help and documentation) is also something that evaluators don’t usually think about when assessing the usability of a program. We tend to assume that help documentation is a section to be avoided because users go there only if they’re confused. Nonetheless, many applications reach a level of complexity where not everything can be understood at first glance. Help documentation is key in these cases.

 

This could just be a derivative of H8 (aesthetic and minimalist design), but we’d like to draw attention to the fact that the color scheme and element decoration in an application’s design are super important. No doubt, SCORE is an eyesore. Students want to log out as soon as possible to spare their inner design conscience. The choice of various shades of blue is sickening, and the occasional burst of sea green in the buttons does not help much.

 

Links to Original Observations:

 

Assignment 3: Bonnie, Harvest, Valya, Erica

Names


Bonnie, Valya, Harvest, Erica

Short summary of our discussion of each of the four discussion questions

1. Most severe problems with the app/site?

iTunes has a lot of problems with hidden options, features, and menus. Most icons don’t have any text associated with them, even on rollover, which meant that we had no idea what many of them did. To fix the UI, we recommend including more text along with the icons, as well as displaying important features like the sidebar by default. While iTunes 11 tries to look minimalist by hiding the sidebar by default, there are a lot of functions that aren’t available anywhere else. So either the sidebar should be visible by default, or there should be some sort of indicator showing that it can be expanded.

2. Which problems were made easier to find by the list of Nielsen’s heuristics?

Nielsen’s heuristics helped us realize how terrible hidden features are, and find some features that we may not even have known existed. Many of us discovered new features, including things like “play next” and “show sidebar.” We also thought that they helped us articulate why some UI issues were problems.

3. Did anyone encounter usability problems that seemed to not be included under any of Nielsen’s heuristics? Additional heuristics?

Nielsen’s heuristics are broad enough that generally everything can be crammed into one of the categories. However, Nielsen doesn’t have a clearly defined category that emphasizes user workflows. iTunes has several workflows that are too long, branch too much, or make it difficult to find the correct path; these kinds of usability issues plague many interfaces, but they don’t fall under a clear category in Nielsen’s heuristics.

4. Useful class discussion questions or final exam questions

  • How does your consideration of the heuristics change if thinking from the perspective of a disabled user?
  • What insights do Nielsen’s heuristics give you that general usage of the interface wouldn’t?
  • What happens when you can’t reasonably satisfy multiple Nielsen heuristics in a given design? Does that mean the design is fundamentally incorrect, or are there reasonable cases where a designer must choose which heuristic takes priority?

Links to individual heuristic evaluations


Erica: https://docs.google.com/document/d/1bO7fgCGKGg9M-pNOGSogmzvatyzKAcvbeYNMFblXo5E/edit?usp=sharing
Valya:
https://www.dropbox.com/s/cdkw98990o79jdz/Heuristics.pdf
Harvest:
https://docs.google.com/file/d/0B9ZsvvU1nAexM2xMenZCSTVnVWc/edit?usp=sharing
Bonnie:
https://docs.google.com/file/d/0BzHbdcSoIIg5MUVvckxmQmV2MEU/edit?usp=sharing

Assignment 3 – Heuristic Evaluation of SCORE

Junjun, Saswathi, Angela, Edward

1. Most Severe Problems

Login errors

When attempting to log in, users quite frequently get redirected to an error page saying that they are “not authorized to log into PeopleSoft”. This may happen even when the username and password are correct and it is before 2 a.m. This is a very annoying and severe error (we rated it a 3).
This violates the following heuristics:

  • Heuristic 9 (help users recognize, diagnose, and recover from errors): “Not authorized” is both unhelpful and seemingly incorrect (students should be authorized to log in), and does not propose any form of solution. The error message should at least be more informative (was it a bad username/password combination? Is the time invalid for this particular user?); it should also not occur when the username/password are correct.
  • Heuristic 7 (flexibility and efficiency of use): The user must go back to the original login page after receiving this error; it would be much more efficient to display error messages on the original login page itself, so that the user can easily try logging in again (see the sketch below).
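As an illustration of the second point, here is a minimal sketch of the pattern we are suggesting. It uses Flask purely as a stand-in (SCORE is a PeopleSoft system, and all names here are hypothetical): a failed login re-renders the same login form with a specific error message, rather than redirecting the user to a dead-end error page.

    from flask import Flask, request, render_template_string

    app = Flask(__name__)

    # Hypothetical login form; the {% if error %} block shows the problem inline.
    LOGIN_FORM = """
    <form method="post">
      {% if error %}<p style="color:red">{{ error }}</p>{% endif %}
      <input name="username" placeholder="NetID">
      <input name="password" type="password">
      <button type="submit">Log in</button>
    </form>
    """

    def check_credentials(username, password):
        # Stand-in for the real authentication backend (hypothetical).
        return username == "student" and password == "secret"

    @app.route("/login", methods=["GET", "POST"])
    def login():
        error = None
        if request.method == "POST":
            if check_credentials(request.form.get("username", ""),
                                 request.form.get("password", "")):
                return "Welcome!"
            # Stay on the login page and say what went wrong (H9), rather than
            # redirecting to a separate "not authorized" dead end (H7).
            error = "Incorrect NetID or password. Please try again."
        return render_template_string(LOGIN_FORM, error=error)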

Class Registration

Users must search for courses to enroll in by course number in order to find a specific course; this information is not easily available through SCORE.
This violates the following heuristic:

  • Heuristic 6 (recognition rather than recall): the course numbers must be looked up, as no one remembers them. Users should be able to search courses by keywords from the course titles (see the sketch below), or have SCORE integrated with ICE or even the Registrar’s page.
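A minimal sketch of the kind of keyword search we have in mind follows. The course data is a made-up stand-in; in a real integration it would come from the Registrar’s (or ICE’s) course listings.

    # Minimal sketch of keyword search over course titles (H6: recognition rather
    # than recall). The course list is a hypothetical stand-in; real data would
    # come from the Registrar's course database.
    COURSES = [
        ("COS 436", "Human-Computer Interface Technology"),
        ("COS 333", "Advanced Programming Techniques"),
        ("MUS 103", "Introduction to Music Theory"),
    ]

    def search_courses(query):
        """Return courses whose number or title contains every keyword in the query."""
        keywords = query.lower().split()
        return [(num, title) for num, title in COURSES
                if all(kw in (num + " " + title).lower() for kw in keywords)]

    print(search_courses("human computer"))  # [('COS 436', 'Human-Computer Interface Technology')]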

2. Problems Recognized Because of Nielsen’s Heuristics

We noticed a problem with consistency in the navigation interfaces – tabs, drop-down menus, and radio buttons are all used at different times for navigation. We only noticed this as a distinct problem by learning about Nielsen’s heuristics. Thus, H4 was a useful heuristic.

Also, we did not directly notice the lack of help and documentation as a problem. Since we all have used SCORE a lot, we already know what to click and what all the cryptic things mean. However, we realized (and remembered) how little sense it made for a first-time user after reading over the list of heuristics. Thus, H10 was a useful heuristic.

3. Non-Nielsen Usability Problems

Since the heuristics seem very broad, most of our problems fit into them (and several problems seem to fit more than one heuristic).

One problem that didn’t quite fit was the login error (“You are not authorized by PeopleSoft”) described above. While the error message isn’t very useful, the main problem seems to be an error of functionality. The user should be able to log in, but they cannot.

We might add a heuristic called “consistent functionality”: The system should behave predictably and consistently.

4. Useful Class Discussion Questions

  1. Are violations of some heuristics inherently more severe than violations of others? Explain why.
  2. Is there a line between usability flaws and functional flaws? How terrible can a UI be before the application is no longer functional?
  3. Here is a list of several usability problems. Under which heuristic violation would you categorize each, and why?
    • automatic logout without warning
    • no file previews available
    • faulty spell check
    • etc.
  4. Give examples of severity level 4 violations for each heuristic.

5. Solo Heuristic Evaluation Links

Junjun: https://dl.dropbox.com/u/49280262/JunjunChenA2.pdf
Saswathi: https://docs.google.com/a/princeton.edu/file/d/0By483E15Y63_cGpDQXVFanAzUE0/edit?usp=sharing
Angela: https://docs.google.com/file/d/0B0fj2iAnOQwcOGZGSnk4dFJJUlU/edit?usp=sharing
Edward: https://docs.google.com/document/d/11uVTSsP-xRlUDFN6l3am83bPBsQVlEhQ3oJNN0eUoP8/edit?usp=sharing