USABILITY TESTING REPORT
INDIANAPOLIS PUBLIC SCHOOLS
WWW.IPS.K12.IN.US
INSPECTED BY TEAM 2:
STEVEN ENTEZARI
HAI DAN HUANG
JAY WHEELER
Table of Contents
I. Executive Summary ................................................................................................. 4
II. Instruments and Methods ...................................................................................... 7
Participants ....................................................................................................................................................7
Data Collection .............................................................................................................................................7
Environment & Timeframe.......................................................................................................................8
Tools................................................................................................................................................................9
Tasks................................................................................................................................................................9
III. Results ................................................................................................................ 11
Task 1 – School Board Meetings .......................................................................................................... 15
Task 2 – Finding the Technology Plan ................................................................................................ 20
Task 3 – College Resources .................................................................................................................... 24
Task 4 – Find the Closest School ......................................................................................................... 27
Task 5 – Magnet School Application ................................................................................................... 31
Task 6 – Payroll Contact Information (Teachers Only).................................................................. 34
Task 7 – Teacher Amendment (Teachers Only) ............................................................................... 34
IV. Synthesis of Results from Inspection ................................................................... 36
Content ........................................................................................................................................................ 36
Information Architecture ........................................................................................................................ 37
Navigation................................................................................................................................................... 40
Presentation ................................................................................................................................................ 41
V. Overall Recommendations for Improvement ........................................................ 43
Content ........................................................................................................................................................ 43
Navigation................................................................................................................................................... 44
Information Architecture ........................................................................................................................ 45
Presentation ................................................................................................................................................ 46
VI. Appendixes ......................................................................................................... 48
Appendix A: Pre-test Questionnaire .................................................................................................... 48
Appendix B: Usability Test Script......................................................................................................... 50
Appendix C: Post-Task Questionnaire ................................................................................................ 54
Appendix D: Post-Test Questionnaire ................................................................................................ 55
Appendix E: Usability Testing Videos................................................................................................. 56
Table of Figures
Figure 1 - Average times it took each participant to finish each task. ..........................................11
Figure 2 - Full-Success/Partial-Success/Give-Up rates for each participant per task ..............12
Figure 3 - Summary of quantitative data showing average time on task, lostness, success rates
and ease of use for each task.........................................................................................................13
Figure 4- Correlation of measured items .............................................................................................14
Figure 5 - IPS Homepage .....................................................................................................................16
Figure 6 - Main Navigation for IPS Site..........................................................................................17
Figure 7 - Task 2 Quantitative Summary ......................................................................................20
Figure 8 - Divisions/Departments page with no Technology section ...............................21
Figure 9 - Task 3 Quantitative Summary ......................................................................................24
Figure 10 - Long list of state colleges and universities ...........................................................25
Figure 11 - Task 4 Quantitative Summary ...................................................................................27
Figure 12 - Task 5 Quantitative Summary ...................................................................................31
Figure 13 - Task 6 Quantitative Summary ...................................................................34
I. Executive Summary
The Indianapolis Public Schools (IPS) Corporation website is an informational
website about the Indianapolis Public School Corporation. The purpose of the site is to
communicate news, events and education-related information to its primary groups of users.
This report identifies key findings from the usability study conducted by our team on
the IPS Corporation website. The usability study included 11 participants: 5
community members, 4 parents and 2 teachers. All participants performed 5 tasks, except for
the teachers, who performed 7. Each task was recorded to capture data, which was later
analyzed and included in this report. Tasks were selected for their salience in providing
information to prospective users, based on activities that are performed both routinely,
such as finding school board meetings, and less frequently, such as finding
the costs associated with a technology plan budget. Teachers performed two additional tasks,
designed to help the team identify how recognizable the layout and content were
between the general users' understanding of the site and the teachers' understanding of the site. In
general, the tasks selected had to include at least 3 steps to be considered for the study. In
addition, an earlier usability inspection had been performed, which helped establish the direction
of this study.
The following are the key findings captured from both the usability
inspection and the usability study, identified as having the most impact on users' ability to find
information.
2. Navigation: Many problems were due to inconsistency in the location and style of
links. Additionally, the banner located on every page, due to its size and
distracting messages, caused users unnecessary scanning of pages to locate the
content they needed.
3. Information Architecture: The website needs to align with how the target users
organize topics within a subject area. Topics that belonged to different subjects were
grouped together.
After careful review of both the usability inspection and the evaluation study, and given the
four impacted areas mentioned above, our team recommends the following seven critical items.
Key Recommendations:
6. The design should include visual cues in the navigation that highlight the user's current
location. Ensure each page has a title matching the page's content.
7. Make the navigation bar around the banner area more salient. Avoid using
colors similar to the banner for the background of the navigation bar, and make the
navigation bar elements look more clickable, like buttons.
II. Instruments and Methods
This study employed several tools and techniques for collecting information; those
methods and instruments are explained below. A complete list of artifacts, including the Pre- and
Post-Test Questionnaires, Post-Task Questionnaire, and Usability Study Script, among others used in
the study, can be found in the appendixes.
Participants
The IPS district website caters to three primary groups of users: teachers, parents
and the community. For our study we focused on these key user groups, from which we solicited
participants. All participants had at least some computer experience, with 7 self-identifying as
very experienced. All 11 participants had over 20 hours of Internet usage per week. Additionally,
9 of the users had never visited the IPS website; however, 4 parents and both teachers had
visited their own school corporation's website.
Data Collection
Due to the complex nature of collecting both quantitative and qualitative data, our
team developed and employed a well-crafted process and set of artifacts, which were used in the
study (see Appendix B). The evaluator first traveled to the participant's location
with a laptop loaded with recording and capturing tools (see Tools below). Once
the evaluator was set up, they began by giving the participant a pre-test questionnaire, which
solicited demographic information along with computer and Internet usage information.
Once the participant completed the pre-test questionnaire, they were read a script that
provided an overview of the study, which included information regarding the purpose of the
study along with the intention of the information collected, privacy and confidentiality, and
their rights as a participant in the study. After the participant agreed to the terms, the
evaluator gave the participant an introduction to the computer system and to the
website, allowing them to complete a nominal task before the official test was initiated.
Once the participant was familiar with the setup of the laptop, browser and site, the
evaluator began the test. The evaluator started recording the session and read each
task out loud. The evaluator recorded the start and end times along with the success,
partial success or failure for each of the tasks. In addition, the participants were encouraged to
think out loud as they worked through each task. This helped provide additional insight and
rich qualitative data, which was noted and later analyzed by the evaluator and team. Also,
after each task the participant was given a closed-question post-task questionnaire, which
solicited their feedback on how satisfied they were with the ease of completing the task.
After all the tasks were completed, the evaluator moved to the last section of the
script, which included the post-test questionnaire and wrap-up. The post-test evaluation
included several open-ended questions to give the participant a chance to share their overall
impressions of the website and to reflect on the site as a whole. Once the post-test
questionnaire was completed, the participants were again thanked for their participation and
given the evaluator's contact information in case additional thoughts or concerns arose about
the study.
Environment & Timeframe
Due to the short timeframe for this study and the need to emulate an environment
that best represented our key user groups, most of the testing was conducted in either the
participant's home or place of work. The testing was conducted between November 1st and
November 8th, 2010, at times that were most convenient for the participants.
Tools
A variety of tools were used to help record and capture data from the participants.
These tools included Clearleft's Silverback 2.0, CamStudio and Synium's Screenium for local
participants, and Cisco's WebEx for remote participants. These tools allowed our team to
record the participant's actions on the screen, the participant's verbal comments and
the participant's face, which allowed us to capture both verbal and nonverbal cues.
Tasks
Below are the tasks participants were asked to complete in the study. Tasks 6 and
7 were included only if the participant was a teacher. Tasks were selected for their
salience in providing information to prospective users, based on activities that are performed
both routinely, such as finding school board meetings, and less
frequently, such as finding the costs associated with a technology plan budget. The usability
inspection was also used to identify steps within the tasks that would expose
problematic areas found by our experts. The additional tasks for teachers were designed to
help the team identify how recognizable the layout and content were between the general users'
understanding of the site and the teachers' understanding of the site. In general, candidate
tasks had to include at least 3 steps (or navigation points) to be considered for the study.
1. Find the next school board briefing meeting so that you can attend.
As a parent you are growing concerned about the direction the IPS School board is
moving towards concerning year round school. Another parent mentioned that there
are school board meetings where you can go and voice your concerns. Assume today
is December 1st and you want to go to the next briefing session. Please identify the
date for this meeting.
2. Find the dollar amount IPS has budgeted towards technology in its
Technology Plan.
As a parent you feel that your child's ability to learn and use technology is very
important. In the midst of growing concerns about budget cuts you are interested in
knowing how much money IPS has allocated to implementing new technology. Find
the budgeted amount IPS has allocated in its Technology Plan.
3. Find contact information about Sawyer College in Merrillville.
As a parent of a high school junior, another parent told you that IPS has resources for
you and your child to start exploring colleges. Find contact information about Sawyer
College in Merrillville.
4. Find the closest school to your house.
As a parent you want to identify the schools that are nearest to your home where your
child would attend middle school. Your house is located at 816 N. Audubon Road,
Indianapolis, IN 46219. Please name the closest high school to your house and find
the enrollment information for that school.
5. Find the 2011-2012 English Student Application for the Magnet Schools.
You noticed that a school located near your house was a magnet school. Please
locate the 2011-2012 English student application for this magnet school.
6. Find the payroll contact information (Teachers Only).
As a new teacher you had an issue with your paycheck this week. You contacted the
front office at your school; however, because they do not directly deal with payroll,
they directed you to the corporation website to find the appropriate contact
information and the location where you needed to go to sort everything out. Identify
the telephone number and location of the payroll office.
7. Find Teacher Contract Amendment 5233 (Teachers Only).
As a teacher you wanted to review updates to the 2010 teacher salary information,
specifically Amendment 5233. Locate information about Amendment 5233.
III. Results
After careful analysis during both live and recorded user-testing review, we identified
trends common to many of our participants. In our analysis, we looked at quantitative data:
questionnaire answers and a calculation of the 'lostness' of each participant. Qualitative data
that we focused on included our team observations, questionnaires, demographic data, and
other items that gave a sense of the user's state of mind, the site's pleasurability, and the
overall acceptance of the site by the user. These results were analyzed first individually, then
in comparison across team members for inter-rater reliability. They were then cross-
referenced to the heuristic and scenario-based evaluations conducted previously for this site.
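The 'lostness' calculation referenced throughout this report is not formally defined in this section. A common formulation, due to Smith (1996) and based purely on page counts, can be sketched as follows; the function name and the example counts are illustrative, not this study's actual data:

```python
import math

def lostness(total_visits: int, unique_pages: int, optimal_pages: int) -> float:
    """Smith's lostness measure, built from page counts.

    total_visits  -- S: every page view, counting revisits
    unique_pages  -- N: distinct pages visited
    optimal_pages -- R: minimum pages needed for the task
    Returns 0.0 for a perfectly efficient path; values above
    roughly 0.4 are conventionally read as "lost".
    """
    return math.sqrt((unique_pages / total_visits - 1) ** 2 +
                     (optimal_pages / unique_pages - 1) ** 2)

# A participant who needed 4 pages but viewed 12 (9 of them distinct):
print(round(lostness(total_visits=12, unique_pages=9, optimal_pages=4), 2))  # 0.61
```

The lostness scores reported below (e.g., 0.53 for task 1 and 0.59 for task 4) are consistent with this kind of page-count scale.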
Task 1 Find the next school board briefing meeting so that you can attend.
Task 2 Find the dollar amount IPS has budgeted towards technology in its Technology Plan.
Task 3 Find contact information for Sawyer College in Merrillville.
Task 4 Find the middle school closest to your house.
Task 5 Find the 2011-2012 English Student Application for the Magnet Schools
Task 6 Find the telephone number and location of IPS payroll
Task 7 Find amendment 5233 of Teacher Contract
[Figure 1 - Bar chart of the average time (in seconds, 0-350) each participant took to finish each of tasks 1-7, plotted alongside the participants' perceived ease for each task.]
We measured the time it took each participant to accomplish the tasks. Above you
see the average time on task for the participants per task. Notice the times for tasks two and
four are highest. These two tasks proved to be the most challenging for our participants,
as you will see in later parts of this report. This graph also compares the time on task with the
perceived ease for the participant. Notice, on task 2, that the perceived ease is significantly
lower than for the other 6 tasks. This indicates that users found it most difficult and spent the
most time attempting to complete it. One thing that should be mentioned with regard to the
time-on-task calculations is that the time on task includes attempts by participants that were
unsuccessful or incorrect. Next you will see the success rates, which put these times in context.
For every participant and for every task the participant started, we kept a tally of the
outcome: full success, partial success, or give-up. Full
completion occurs when the user completes the task with no input (or input that would not
alter the intended course of the participant) from the proctor. Partial success occurs when the
participant has successfully completed the task, but was, at some point during the scenario,
assisted by the proctor. Give up, or withdrawal, simply means that the participant never
completed the task, even after being given help. While task 2 has the lowest success rate of all the
tasks, notice, from before, that it still maintained the highest time on task of all tasks. This is
indicative of the user attempting, by all means, to complete the task yet failing to do so.
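As a sketch of how these tallies roll up into a single per-task success rate, using task 1's counts from this report (7 full successes, 2 assisted completions, 2 give-ups) and assuming the common half-credit weighting for partial successes (the report does not state its exact weighting):

```python
from collections import Counter

# Task 1 outcomes for the 11 participants, as reported below
outcomes = ["full"] * 7 + ["partial"] * 2 + ["gave_up"] * 2

tally = Counter(outcomes)
n = len(outcomes)

# Half credit for proctor-assisted completions is an assumed convention,
# not necessarily the scheme this study used
rate = (tally["full"] + 0.5 * tally["partial"]) / n
print(f"{rate:.0%}")  # 73%
```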
[Chart: Quantitative Summary - average time on task plotted on the left axis (0-350 seconds) alongside lostness on the right axis (0-3.5) for tasks 1-7.]
Figure 3 - Summary of quantitative data showing average time on task, lostness, success rates
and ease of use for each task.
After combining the two items listed before, time on task and success rate, with a
measure of lostness and ease of use, we can see that the most problematic tasks were Task 2
and Task 4. Both of these tasks displayed the highest time-on-task and lostness levels. They
also displayed the lowest success-rate and ease-of-use averages. This not only indicates that
these two tasks were quantifiably more difficult than the rest, but also that the users could
feel this while completing the tasks, based on the post-task questionnaire responses regarding
ease of use.
Another thing to note is that tasks 6 and 7 were performed only when the participant
was a teacher. There were a total of two teachers, both of whom were also parents. All 11 participants
completed tasks 1 through 5. The data shows that these two tasks (task 6 and task 7) were
both completed with complete success, virtually no lostness, and a comparatively low average
time on task. This indicates that the teacher-participants may have been experts with the
system, as they need to use it for daily work. While this may imply that increased exposure to
the site generates higher satisfaction with the usability, it is important to note that the user
may become habituated to a poor usability experience and learn the 'wrong ways' efficiently.
Another notable result concerns task 3. Out of the 5 main tasks, task 3 had the lowest time
on task of all of them. The ease of use was relatively high in comparison to the other tasks.
The success rate tended to be higher than average, and the lostness for this task was extremely
low. These facts may imply a conflict between our heuristic- and scenario-based usability
evaluation and the usability testing done by our participants. This disconnect is explored later
in this report.
                 Time on Task   Success Rate   Ease
Lostness             0.92          -0.93      -0.39
Time on Task                       -0.95      -0.68
Success Rate                                   0.66

Figure 4 - Correlation of measured items
After running a correlation on the measured items, we found three interesting, and
very strong, correlations. First, across participants there was a +0.92 correlation between
time on task and lostness. This indicates that the more time a participant spent,
the more lost they tended to be. It also implies that time was not spent solely
on a few pages: lostness factors in page counts, therefore
the user spent a great deal of time searching many pages to complete the task. We also found
a -0.93 correlation between success rates and lostness. This is very interesting, as it indicates
that as the user got more and more lost in the system, they had a much smaller chance of
succeeding. It also implies that the more the users succeeded, the less lost they were
in the system, meaning they needed fewer clicks to get to their final destination. The final and
strongest correlation, -0.95, relates the participants' time on task to their success rate.
Interestingly, as an individual's time on task went up, their success rate went down.
This implies that when participants did succeed, they did so quickly.
Task 1 – School Board Meetings
• Task Description
“As a parent you are growing concerned about the direction the IPS School board
is moving towards concerning year round school. Another parent mentioned that
there are school board meetings where you can go and voice your concerns. Please
identify the date for the next briefing session.”
Participants completed task 1 after completing a "warm-up" task to help offset the
learning curve of the site. While this does help avoid some skewed statistics, it does
not take away from the fact that users of any site will grow more proficient with its general
navigation, logical constructs, and overall usability, whether the design is good or bad.
Given that, however, the time on task for task 1 was right on average for the first five
tasks (the tasks intended for parents). Seven of the participants completed this task without
any help from the proctors, while 2 needed assistance to finish successfully and 2 failed or
gave up. The users had a lostness score of 0.53, which indicates a relatively high level of lostness.
* Problem Description:
Users were drawn in by the homepage's dashboard-like style. We observed that users tended
to spend a large amount of time on the homepage, which displayed the problems of a homepage
containing only links. We saw that this dashboard-like style led the participants down a path
that reinforced the belief that the homepage's main content had all the answers for their task.
This information architecture setup is one of the reasons contributing to a high lostness level.
Participants assumed that if information wasn't in the homepage's main content, it would be
hard to find elsewhere.
* Design defect | Navigation
"The access to the school board meetings should also be in the Parents part."
Participants passed the correct link, on average, 4 times before finding it. This accounts for
the high lostness displayed on this task.
* Severity | 2 – Medium
Figure 6 - Main Navigation for IPS Site
Participants were fixated on the fact that, when looking for school board meetings, they
approached the main navigation from the perspective of a parent. While, logically, school
board meetings may seem to belong under the category of 'School Board', our participants
still felt that it was an activity that 'Parents' attend. Given this, for the participants, it seemed
only logical to navigate to the main Parents link and search for the information in that
subdirectory. It was only after much confusion in the Parents directory that many of the users
finally decided to reevaluate their approach and search other main links, or simply gave up at
this point. This emphasizes the idea that the participants had strong expectations regarding
the location of school board meetings. The majority of participants expected to find school
board meetings within the Parents link as opposed to the School Board link.
* Severity | 2 – Medium
* Problem Description
One item that was not prevalent in our earlier usability evaluation, but became very
evident during our user testing, was the set of problems with the banner. For a few participants,
when the rotating banner rotated to a particular ad dealing with school boards (although it
was an ad that was not relevant to the specific task), the users felt the need to wait for the ad
to come full circle before proceeding. This rotation of the banner caused a higher time on task
because users paused to watch it cycle.
* Severity | 2 – Medium
* Problem Description:
Once the users navigated to the correct page, some characteristics of the
interaction were quite interesting. Users noted that when navigating through the list of
school board meetings, it was hard to identify the next meeting because of the order of the
list. They expected the next meeting to be displayed first and didn't want to hunt
down the page comparing every date to today's date. They needed some form of
distributed cognition to assist with their calculations involving today's date and the displayed
meeting dates.
* Severity | 2 – Medium
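The "distributed cognition" participants needed can be supplied by the page itself: sort the list and filter out past meetings so the next one appears first. A sketch of that fix follows; the meeting dates are hypothetical, and only the December 1st "today" comes from the task scenario:

```python
from datetime import date

# Hypothetical board-meeting dates; the live page listed them in an
# order that forced users to compare each date against today's
meetings = [date(2010, 10, 26), date(2011, 1, 11),
            date(2010, 12, 14), date(2010, 11, 16)]
today = date(2010, 12, 1)  # the date the task tells participants to assume

# Show only upcoming meetings, soonest first: no mental date math required
upcoming = sorted(d for d in meetings if d >= today)
print(upcoming[0])  # the "next briefing" the task asks for -> 2010-12-14
```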
Task 2 – Finding the Technology Plan
• Task Description
“As a parent you feel that your child’s ability to learn and use technology is very important. In
the midst of growing concerns about budget cuts you are interested in knowing how much
money IPS has allocated to implementing new technology. Find the budgeted amount IPS has
allocated in its Technology Plan.”
As evident from the graph above, the time on task for task 2 was higher than for any of
the other tasks in this report. At 328 seconds, participants took more time to
complete this task than a majority of the other tasks, even though the ideal path
consisted of only 4 pages. Also, there were fewer successes on this task than for
any of the other main tasks (tasks 1 through 5). Both the time on task and success rates
mentioned before point to a high lostness level. As shown in the graph above, the lostness
level for this task was higher than for any other task in this report.
Participants consistently skipped the navigation of the page and skimmed through
the page's content in hopes of finding the answer. This contributed a great deal to the average
time on task for this task. We believe the participants felt the need to skim every page for the
answer due not only to the banner blindness explained earlier, but also to the mislabeling of
pages. Page titles within the About IPS section are all the same, so participants had trouble
differentiating which "About IPS" page they were on. Also, relatedly, many
participants were unsure of their current location because the page title and navigation
did not reflect their visited or current pages. This issue was also raised in our earlier evaluation.
*Problem Description:
Participants looked for a page labeled "Technology"; in
actuality, the correct page was "Information Technology". Four users simply left the page,
assuming they were in the wrong area. A few of them were redirected back to the page and
given the hint of thinking of synonyms of "Technology", while others found it simply by
chance. This is a prime example of the need to ensure that the language written matches the
language understood by users. In this case, the users are parents who may or may not be
familiar with technical terminology, and often are not. Covering both forms of the language
will allow users to find what they are looking for, rather than simply leaving the page as our
participants so eagerly did.
* Severity | 2 – Medium
*Problem Description
A couple of our participants navigated to the superintendent's blog or some area of news,
such as recent performance metrics or school district goals. Some even expected it to be a
form of general information which, they assumed, would be located in an "About Us" type of
page. Regardless of where the users thought the budget, or technology plan, would be, they
didn't expect it to be where it actually is. They all got to the intended page quickly, but only
after some hesitation at the beginning of the encounter.
Participant quotes:
"I don't even know where I am."
"If I'm looking for something like 'report' or 'annual' then it's not going to be connected with a department; it's for the whole organization!"
"The technology budget plan should be under About IPS."
"Honestly, I would probably just end up calling them at this point."
#Problem 4: Navigation of PDF
*Problem Description:
When the users found the technology plan, the time on task was recorded. The goal here was
to see how long it would take our users to locate and access the technology plan. After the
task was deemed complete, we allowed the users to continue without telling them we weren't
timing them any more. This proved to be rather revealing about another issue we hadn't even
considered. Once the user had accessed the technology plan, finding the actual information
proved to be rather difficult. The navigation and information architecture of the PDF
document were not easily navigable and elicited some frustration in our users. The users, even
when in the document, didn't know where they were or how to get to where they wanted to
go. We recommend some form of hypertext linking, either within the PDF document or on an
HTML page with the same information as the PDF document. As of now, the table of contents
listed an approximate page that the user had to scroll to, if they used the table of contents at
all.
Task 3 – College Resources
• Task Description
“As a parent of a high school junior, another parent told you that IPS has resources for you and
your child to start exploring colleges. Find contact information about Sawyer College in
Merrillville. “
Time on task for task 3 was lower than for any of the other 5 main tasks. At 143 seconds, it
surpassed all of our expectations and gave some insight into what we thought was a clear
usability problem. Task 3 had more full and partial successes than any other task. Lostness,
while still high, was the lowest of all five tasks as well. The one participant who failed to
complete this task or gave up on it does not account for a major shift in the data, due to the
fact that the other 10 participants averaged approximately the same time on task.
Figure 10 - Long list of state colleges and universities
"(While browsing within the schools) How can I get back to that other page?"
Some participants were unsure which page was the appropriate place for state universities.
Others, similar to the problems outlined in task 1, believed that state university searches were
activities the parents would complete. Some other important discoveries were made regarding
the quantitative data as well. The success rate, on average, was ultimately higher than for any
of the other 5 main tasks for parents. We had assumed that the resources section would be a
very usability-error-prone section of the website due to the information architecture and
navigation presented. Our assumptions were proven false with regard to the quantitative
metrics. The participants were able to get to where they needed to go relatively efficiently.
However, getting there is only half of the battle with regard to usability. The pleasure attained
by the user in accomplishing the task is equally important.
* Problem Description:
Participants expressed distaste with their experience of the site based upon the psycho-pleasure related to page semiotics. Mislabeled pages and multiple links meaning the same thing caused grief for the participants. The information architecture the users had internalized to develop their mental map of the site was truly tested at this point. The users had developed a sense of how things were set up during the first two tasks and could now use that construct to accomplish this task. However, even with that two-task preparation, some users still felt they could not navigate the site appropriately.
* Severity | 2 – Medium
Task 4 – Find the Closest School
• Task Description:
“As a parent you want to identify the schools that are nearest to your home where your child
would attend middle school. Your house is located at 816 N. Audubon Road, Indianapolis, IN
46219. Please name the closest high school to your house and find the enrollment information
for that school."
The time on task for finding the school closest to your home was the second highest of all tasks. There were a total of 5 complete successes, 1 partial success (which involved redirection by the proctor), and 5 complete failures or give-ups. Because some participants gave up too quickly, these numbers could have dragged the observed time on task lower than the true average would have been. With 5 participants failing or giving up, and a lostness score of 0.59, it is evident that the participants, in general, were not pleased with what it took to accomplish the task.
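The lostness figures cited throughout these results can be reproduced with Smith's (1996) lostness measure, computed from each participant's page-visit counts. The sketch below is illustrative only; the visit counts are hypothetical, not taken from our data.

```python
import math

def lostness(unique_pages: int, total_pages: int, optimal_pages: int) -> float:
    """Smith's (1996) lostness measure: 0 means perfect navigation;
    scores above roughly 0.4 indicate the user is lost."""
    n, s, r = unique_pages, total_pages, optimal_pages
    return math.sqrt((n / s - 1) ** 2 + (r / n - 1) ** 2)

# Hypothetical participant: 10 unique pages over 14 total visits,
# when 5 pages would have been the optimal path.
print(round(lostness(10, 14, 5), 2))  # -> 0.58
```

A score of 0 means perfect navigation, and values above roughly 0.4 are generally taken to indicate that the user is lost, which puts this task's 0.59 average well into "lost" territory.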
* Problem Description:
Participants noticed that the banner was static on some pages and dynamic on others. This caused much confusion when users were attempting to figure out how they had ended up at certain points in their navigation. Upon mousing over a banner, the user was unable to determine whether it was a link or not. This reinforces the earlier concerns regarding banner blindness: if the banner acts like text, the user will be more likely to avoid the text and consider it part of the banner.

"The Boundary Map doesn't help. Where can I give my address?"
"The Map is too small to see the information"
"I'm not familiar with the school boundary map"
"I think I left the website"

* Severity | 2 – Medium
* Design Defect | Navigation
be together for most of our participants. When accessing the enrollment page and not finding a school locator, the users either remained confused and browsed randomly, or simply gave up. Some users even felt the need to search the school-specific sites to see if there was something there that could assist them. This led to some of the information architecture issues we explained earlier.
* Severity | 2 – Medium
* Problem Description:
Another logical area where the users expected to find a school's location was the Schools section of the site. Some participants went immediately there and assumed it was a "trick" question, because they couldn't believe the developers would not implement something like this here. The area they were looking for was the boundary map, which was not located in the school sub-pages. Given the semiotics used for "boundary map", the expectations of finding it under "Enrollment" or "Schools" are understandable, and links to a boundary map should be provided in both places.
We found that when users type in their address to locate the nearest school, they need to match the syntax rules of the system, displayed under the map and input field. When they enter an address that doesn't match the syntax rules, they are presented with an alternate address as a "Did you mean…" type of suggestion. However, upon clicking this, the subsequent page is another page notifying them that their input is incorrect: the suggestion itself is as incorrect as the user's input. Another, higher-level issue concerns the users' understanding of the semiotics of the page: many of our participants did not like "Boundary Map" as the label of the area they must navigate to when attempting to locate a school close to them.
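One way to avoid the broken "Did you mean…" behavior described above is to offer a suggestion only when the suggestion itself passes validation. The sketch below assumes a simple, hypothetical syntax rule (a house number followed by a street name); the IPS site's actual rules are not documented here.

```python
import re
from typing import Optional

# Assumed syntax rule (hypothetical): "<house number> <street name>",
# e.g. "816 N. Audubon Road" -- city/state/ZIP parts are not accepted.
ADDRESS_RE = re.compile(r"^\d+\s+[A-Za-z][A-Za-z.\s]*$")

def validate(address: str) -> bool:
    """True when the address matches the assumed syntax rule."""
    return bool(ADDRESS_RE.fullmatch(address))

def suggest(address: str) -> Optional[str]:
    """Offer a 'Did you mean...' alternative only when the cleaned-up
    address itself passes validation, avoiding invalid suggestions."""
    cleaned = re.sub(r"\s+", " ", address).strip().split(",")[0]
    if cleaned != address and validate(cleaned):
        return cleaned
    return None

print(suggest("816 N. Audubon Road, Indianapolis, IN 46219"))
# -> 816 N. Audubon Road
```

Because `suggest` reuses `validate`, any alternative it offers is guaranteed to be accepted on the next submission.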
Task 5 – Magnet School Application
• Task Description
“You noticed that a school located near your house was a magnet school. Please locate the
2011-2012 English student application for this magnet school.”
Task 5 could be denoted as the "average" task for our participants. The time on task was 212 seconds, with seven of our participants completely finishing, two finishing with assistance, and two failing or giving up. The lostness score is right at 0.50, which is about average for our tasks.
* Problem Description:
This task summarized many of the points previously made in other tasks. One point, referred to before in other sections and now in this one, is the labeling of pages and how those semiotics match the users' expectations. Users expected the magnet school applications to be available on the individual school pages.
Some participants felt the need to go to the schools list and find a magnet school to home in on while searching for the application for that particular school. At this point in the scenario, some participants were confused by some of the semiotics used to describe the schools: magnet schools were located under both high schools and other schools, which led users to think a magnet school had to be one or the other. However, even when a user found a magnet school, it was to no avail, as the individual school page did not have what the user expected to find there: an application to that school. This poor setup of the information architecture could leave participants feeling there is no place to go from here.
Users tended to want to read all of the material before clicking on the magnet link. This relates to an area we discussed in our evaluation testing: the side navigation being ignored by the user. Most of the time, and especially for this task (which is odd, because it was the last task for most people), the side navigation was not referenced as often as we expected. While time on task was not horrible, it was high enough to elicit concerns that navigation issues like this could cause major losses of pleasure with the site.
* Severity | 2 – Medium
Task 6 – Payroll Contact Information (Teachers Only)
• Task Description:
"As a new teacher you had an issue with your paycheck this week. You contacted the front office at your school; however, because they do not directly deal with payroll, they directed you to the corporation website to find the appropriate contact information and the location where you needed to go to sort everything out. Identify the telephone number and location of the payroll office."
"As a teacher you wanted to review updates to the 2010 teacher salary information, specifically Amendment 5233. Locate information about Amendment 5233."
In both Task 6 and Task 7 there were only two participants, due to the nature of the tasks: they involved items only teachers would need to do, so they were conducted only by our two teacher participants. Time on task was 60 and 120 seconds for Tasks 6 and 7, respectively. Both participants succeeded fully within the ideal number of clicks.
Both users found what they needed efficiently. This may not be due to great user experience design, however, but rather to the participants' adaptation to a poor design. Since both users were teachers within the IPS school district, they were our expert users and thus had much experience with this interface. Being accustomed to a poor user experience does not mean the poor user experience doesn't exist.
IV. Synthesis of Results from Inspection
The team analyzed the qualitative data discovered during the usability testing and grouped the findings into dimensions including Content, Navigation, Presentation, and Semiotics. The team then compared the testing results with the problems discovered during the inspection, to reaffirm the initial findings and to uncover new ones.
Content
New Problems
Within the Content section, participants encountered three new problems during the usability testing that were not discovered in the inspection (ID: 5, 6, 7). They included search problems caused by an "Unclear input requirement", an "Unclear boundary map", and "Vague enrollment information".
Inspection Problems
Of the four problems the team found in the inspection, the participants encountered none during the usability testing. The problem of "Missing and inconsistent information" (ID: 1) had already been fixed on the website by the time our participants began testing. The other three (ID: 2, 3, 4) were not confirmed: although they were covered by the tasks, participants who visited those pages during the test did not uncover any problems. The table below compares the problem findings from the inspection and the usability testing.
ID | Inspection Results | Usability Results | Conclusion
1 | Missing and inconsistent information | Not encountered | Repaired on the website before testing; not a problem
2 | Target user misidentified | Not encountered | Severity decrease
3 | Consistency of information and representation | Not encountered | Severity decrease
4 | When was the news new? | Not encountered | Severity decrease
5 | – | Unclear input requirement | Major new problem
6 | – | Unclear boundary map | Major new problem
7 | – | Vague enrollment information | Minor new problem
Information Architecture
New Problems:
"Mismatched information categorization" (ID: 1) is a new problem found in usability testing with a very high severity. About 8 participants encountered it, and it is a main cause of the highest time-on-task and lostness, and the lowest success rate, of Task 2. Users categorize information according to the semiotic closeness between the target information and the category, while the website is grouped according to the location and structure of the physical IPS departments, an organizational layout the users were not familiar with. "Problem finding items in a long list" (ID: 10) mainly occurred when users were looking for schools on a page containing long lists. When the users finally found the list of schools, they expected it to be sorted alphabetically by school name; however, the schools were listed in order of IPS school number, which did not provide much help.
Overlapping Problems:
The overlapping problems (ID: 2, 4, 5, 8) were predicted in the inspection and confirmed by the usability testing. Users encountered the "Problem of finding a school according to distance" (ID: 2). Their first reaction was to solve the problem using the boundary map, but the static, separate boundary map for each school provided little help for measuring the distance from home to school or for comparing locations; three of them used Google Maps to finish the task. Regarding the problem of "No way to go back" (ID: 4), users were confused about how to return to the previous page, but they overcame it by clicking the back button multiple times. The problem of "Where am I?" (ID: 5) was caused by both mislabeled pages and a lack of cues for the current location. This weak awareness of the current location makes it difficult for the user to find a page again, as well as to understand the website structure. Participants also encountered the problem of "Homepage without main content" (ID: 8): most of them wandered on the homepage for a long time, especially in Task 2, seeking the entrance to the information they were looking for, and found little help there.
Inspection Problems:
Two problems found in the inspection were not encountered during the usability testing: "Too much information accommodating too many people" (ID: 3) and "Archiving" (ID: 9). Due to the lack of evidence in testing, the severities for these were decreased. The same happened with unrelated "Navigation": when users were finding information within a page, they naturally ignored the unrelated navigation, so the severity for this problem was also decreased.
Navigation
Overlapping Problems:
"Undistinguishable links" (ID: 1) was a recurring problem: users were confused by static text, images (including banners), and links. Users also encountered the problem of "Too much navigation" (ID: 2), as predicted by the inspection; their reaction was to ignore the links, even the useful ones. For example, instead of clicking the in-page quick links at the top of the page, they scrolled down to find the section they were looking for. "No indication when leaving the website" (ID: 3) was yet another problem that confused the users: many expected to find the information within the IPS website and were surprised when they had to navigate back, or find the window back, to it. Again, as predicted by the inspection, "Provide choices without decision information" (ID: 4) was confirmed during testing.
New Problems:
The usability testing also uncovered a new problem, "Navigation bar blocks navigating within the topic" (ID: 5): the navigation bar does not consistently include links to all of the pages within a topic.
ID | Inspection Results | Usability Results | Comments
1 | Undistinguishable links | Undistinguishable links (text & banner) | Severity increase
2 | Too much navigation | Users missed navigation links | Severity remains
3 | No indication when leaving the website | Lack of external-website alerts led to confusion | Severity remains
4 | Provide choices without decision information | School links with vague descriptions | Severity increase
Presentation
Overlapping Problems:
Users encountered the problem of "Error Recovery" (ID: 7): in the search feature, they had difficulty recovering from an error when their input did not meet the required format, because there was no instruction telling them what to do. Regarding the problem of "Mislabeled Pages" (ID: 8), users were confused when a page's label did not match what they found on it.
New Problems:
The problem of "Banner Blindness" (ID: 1) was not predicted in the inspection; however, due to issues uncovered in the usability testing, the team classified it as high severity. Several users overlooked the navigation bar at the top, especially on the homepage, which made it difficult to gain access to the topics within the website. In contrast, the users did notice the so-called "deemphasized" menus on the right. Another problem is that "The Search Bar" (ID: 10) is too small to be noticed. A third problem, "Un-salient links" (ID: 11), is that links are not shown in a consistent format (i.e., there is no visual cue that the text is a link).
Inspection Problems:
The problems "Users don't know where they are" (ID: 2) and "Mixed representation of information and linked files" (ID: 3) are another aspect of the problems described above, so their severities should remain. Other problems (ID: 3, 5, 6, 9) were not discovered by users during testing.
V. Overall Recommendations for Improvement
The team's recommendations span four design dimensions: Content, Navigation, Information Architecture, and Presentation, based upon our inspection and user feedback. Items highlighted in yellow were identified in both the inspection and the usability testing.
Content
Regarding the severity of the problems, we recommend that the website supply a section providing the schools' enrollment information (ID: 6), as users expected. Moreover, the search feature (ID: 4) should be improved by providing a clear input-format requirement and error-recovery help. The boundary map (ID: 5) should also be improved.
The problems with lower severity can also be addressed if the development team has enough time; for the relevant recommendations and their references, please refer to the table below (ID: 1, 2, 3).
ID | Recommendation | Severity | Reference
5 | … | … | Testing: The boundary map doesn't meet the user's need; it only provides a static map of the area around the school, without any explanation, while the users want to know the distance between the school and their home.
6 | Enrollment information: clarify the enrollment process by including links in the various subject areas that provide enrollment information, including how to enroll, current enrollment status, where to find forms, etc. | 3 | Testing: Vague information: users expect to learn how to enroll and the current enrollment situation, while the current "enrollment information" is only a count of how many students are enrolled at the school.
Navigation
The navigation problems received high severity in the inspection and caused great difficulty for users finishing their tasks in the test. Regarding navigation links, make sure the overall navigation includes links to every page within a topic. The website should also keep links consistent and salient in appearance and layout, to make them easy to notice and to distinguish from text or static pictures. Moreover, strengthening the user's awareness of the boundary of the website would improve their confidence. It is also necessary to strengthen the users' ability to control the navigation, by adding controls where navigation is currently automatic (see the table below).
ID | Recommendation | Severity | Reference
3 | Keep only the most necessary navigation and remove items irrelevant to the page. Make sure the appearance and layout of the navigation are salient and consistent so it is easily noticed. Distinguish navigation at different levels with different presentation. | 3 | Inspection, Testing: Many navigation elements that users ignore.
4 | Provide clear awareness of the boundary of the website, e.g. an alert when the user is navigated to an external website, while keeping graphical and navigational consistency within the main website. | 2 | Inspection, Testing: No indication of leaving the website leads to confusion as to where IPS ended and a new site started.
5 | In the school list, provide more strategies for school searching, such as filtering or arranging the schools by specific attributes, and provide more information about each school (address, email, fax, etc.). A possible way to keep the list clear while rich in information is to redesign it as a folding list, as in Google Reader: users click a school name to unfold more information about it, and a "show all" button unfolds all items at once. | 2 | Inspection, Testing: Choices provided without decision information; users have problems with school links with vague descriptions.
6 | Provide controls for navigating through the flashed banner: there should be some way for the user to quickly return to a previously flashed banner. Links to each banner should be displayed below it, allowing the user to navigate them asynchronously. | 2 | Testing: The uncontrollable rotation of the flashed banner is difficult for users to browse and make use of.
Information Architecture
Given that this is a deep and wide website with the severe problem of mismatched information categorization between the users and the website (ID: 1), and that it would take great effort to re-architect the website, we recommend providing a sitemap as a solution. The link to the sitemap should be in a salient position for easy access at any time; it will show the overall website structure to the user, helping them understand the conceptual model of the website and easily locate the information they are seeking. The website should also provide visual cues indicating the user's current location (ID: 2); breadcrumbs and highlighted labels are possible solutions. The homepage should also be improved to provide a clear overall introduction to the website and helpful navigation into its inner pages (ID: 3).
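The breadcrumbs recommended above (ID: 2) can be generated mechanically from a page's URL path. The sketch below is a minimal illustration; the path segments and labels are hypothetical, not taken from the IPS site.

```python
def breadcrumbs(path: str, labels: dict) -> str:
    """Build a 'Home > Section > Page' trail from a URL path,
    giving each location a readable label."""
    trail = ["Home"]
    segments = [s for s in path.strip("/").split("/") if s]
    for seg in segments:
        # Fall back to a title-cased segment when no label is defined.
        trail.append(labels.get(seg, seg.replace("-", " ").title()))
    return " > ".join(trail)

# Hypothetical labels for illustration only.
labels = {"schools": "Schools", "boundary-map": "Boundary Map"}
print(breadcrumbs("/schools/boundary-map", labels))
# -> Home > Schools > Boundary Map
```

Each crumb would normally also link to its cumulative path, giving the user both a location cue and a way back up the hierarchy.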
ID | Recommendation | Severity | Reference
1 | Provide a sitemap of the website to help the user gain an overview of its logical architecture, so that they can find the information they need. | 3 | Testing: Mismatched information categorization between users and the website. Users categorize information by the closeness of names, but the website groups by department, and users are not familiar with the organization of the departments.
Presentation
One of the most critical problems the team feels is important to fix is Banner Blindness, especially on the homepage (ID: 1). A possible solution is making the navigation bar more salient and keeping it at a distance from the banner. The useful widgets, including links and the search button, should also be made more salient in visual and layout terms (ID: 2, 4). Clear error-recovery instructions should also be provided (ID: 3).
ID | Recommendation | Severity | Reference
1 | … use a color similar to the banner for the background of the navigation bar, and make the navigation bar look clickable, like a button. | … | …
2 | Make the links more visually salient. At the same time, put them in a more prominent position, such as the top of the page, and avoid mixing them into the body of the content. | 2 | Inspection: Mixed representation of information and linked files; un-salient links.
3 | Provide clear recovery instructions when an error appears, especially in the search feature; offer options the user can choose from (recognition rather than recall), and place the format requirement and an example near the input area. | 2 | Inspection, Testing: Error recovery; users don't know how to recover from search errors.
4 | Make the search button in the banner area more salient so that it is noticed, and provide an input area to improve its affordance. | 3 | Testing: The search bar is too small to be noticed.
VI. Appendixes
Appendix A: Pre-test Questionnaire
Date:
Age:
Gender:
Type of Computer:
Browser Used:
Location:
Race:
Very Somewhat Not At All
Very Somewhat Not At All
Email    Social Networks    Work    School    Browsing    Other
School and Computers
Yes No
If yes, was it IPS’s?
Yes No N/A
1–3 Times    4–9 Times    10–20 Times    20–30 Times    More than 30 Times
Would you ever access the IPS site if you were/are not a parent, student, or staff member?
Yes No
Appendix B: Usability Test Script
Introduction
Hi, I'm [Name]. Welcome and thank you for coming. How are you? Today we will be conducting an evaluation of the Indianapolis Public Schools (IPS) Corporation website.
I'm helping IPS understand how well their website works for the people who use it, and I will be observing what you do today. In the evaluation you will have a chance to use the website and tell us what you think of it: what seems to work for you, what doesn't, and so on.
We're going to be videotaping the screen as a record of what happens here today. This video is for our analysis only. Our conversation will be recorded as the soundtrack, to capture full information for analysis; it's primarily so I don't have to scribble notes and can concentrate on talking to you. The members of our evaluation team will also review the video and audio. The information obtained today is strictly for the purposes of this study, and your information will be kept confidential.
Like I said, we'd like you to help us with a product we're evaluating. It's designed for people like you, so we'd really like to know what you think about it and what works and doesn't work for you. You may run into features or functions that do not work right; please feel free to tell us about your experiences as you go through the evaluation.
Procedure:
The procedure we're going to follow today goes like this: we'll start out by talking for a few minutes about how you use the web, what you like, what kinds of problems you run into, that sort of thing. Then I'm going to show you the Indianapolis Public Schools website, which we are evaluating, and have you try out a couple of things with it. Then we'll wrap up: you will talk about your experience of using it, I'll ask you a few more questions, and we're done.
Any questions?
Now I'd like to read you what's called a statement of informed consent. It's a standard thing I read to
everyone I interview. It sets out your rights as a person who is participating in this kind of research.
Let's start!
Preliminary Interview
-Offline Habits-
1. (For parents) What kind of information do you want to know about your child's education?
From where do you usually get this information?
2. What kind of information do you want to know about the local schools or educational
system? How do you usually get it?
Evaluation Instructions
In a few minutes, I will ask you to begin your evaluation of the IPS website. However, before we get
started, let's talk a little about what will be happening.
Please try to feel as comfortable as possible while using the interface. We are studying how the
website elicits your actions. There is nothing that you can do wrong as a user of this website. We
want to observe your thoughts and reactions on the site's interface. With that, it is extremely helpful
to us if you could narrate your experience while using the site. We would love for you to have a nice
stream of consciousness while interacting with the site. If you don't like something, let us know. If
you really, really like something, let us know. If anything stands out, doesn't stand out enough, or if
you have an idea as to something that should be that isn't, your suggestions will help future users'
experiences.
So, here is just a quick summary of what we discussed. We are testing the site, not you. Make sure to
let us know what you are thinking as you are thinking it. Finally, and most of all, be comfortable. If
there is anything we can do throughout the process to make your experience as natural as it would be
at home, let us know.
Do you have any questions on the process? Does it all make good sense?
Great! Let's go to our Internet browser and go to www.ips.k12.in.us. As the site loads, please feel free
to move the chair, mouse, monitor, and keyboard to a comfortable position.
Tasks
1. As a parent you are growing concerned about the direction the IPS school board is moving concerning year-round school. Another parent mentioned that there are school board meetings where you can go and voice your concerns. Assume today is December 1st and you want to go to the next briefing session. Please identify the date for this meeting.
2. As a parent you feel that your child's ability to learn and use technology is very important. In the midst of growing concerns about budget cuts, you are interested in knowing how much money IPS has allocated to implementing new technology. Find the budgeted amount IPS has allocated in its Technology Plan.
3. As a parent of a high school junior, another parent told you that IPS has resources for you and your child to start exploring colleges. Find contact information for Sawyer College in Merrillville.
4. As a parent you want to identify the schools nearest to your home where your child would attend middle school. Your house is located at 816 N. Audubon Road, Indianapolis, IN 46219. Please name the closest high school to your house and find the enrollment information for that school.
5. You noticed that a school located near your house was a magnet school. Please locate the 2011-2012 English student application for this magnet school.
6. As a new teacher you had an issue with your paycheck this week. You contacted the front office at your school; however, because they do not directly deal with payroll, they directed you to the corporation website to find the appropriate contact information and the location where you needed to go to sort everything out. Identify the telephone number and location of the payroll office.
7. As a teacher you wanted to review updates to the 2010 teacher salary information, specifically Amendment 5233. Locate information about Amendment 5233.
Evaluation Wrap-up
Wonderful. Now, if you would, please, simply exit your Internet browser and we'll come together for
a quick wrap-up session.
If you could describe this site in one or two sentences to a colleague or friend of yours, how would you do so? What adjectives would you use?
Would you say, by the end of the process, that you were satisfied by the service provided by IPS?
When you first started the process, did you expect the site to contain what it did? More than it did?
Based on your experience with the website, can you give three general pros and three general cons
about the site?
Is there anything that you wish this site would provide to people in your shoes, with regard to their stake in the system (IPS)?
Thank you so much for your time and participation. Many times, after completing evaluations,
people tend to have amazing ideas when they are at home, browsing the Internet and utilizing
services like these in their home environment. If you have anything you could add to this evaluation
today, tomorrow, or even as far out as next week, please don't hesitate to contact us at
[email_address].
Appendix C: Post-Task Questionnaire
Strongly Disagree    Strongly Agree
Appendix D: Post-Test Questionnaire
2. How would you explain how you are feeling after completing the tasks?
3. Do you feel like you were “In Control” of navigating around the site?
5. Could you understand where you were in relation to the site's structure?
Appendix E: Usability Testing Videos
Please reference the attached CD File submitted with this project for a complete list of videos.