
INFO-I543 USABILITY AND EVALUATIVE METHODS

INDIANA UNIVERSITY SCHOOL OF INFORMATICS


FALL 2010
DR. DAVIDE BOLCHINI

USABILITY TESTING REPORT
INDIANAPOLIS PUBLIC SCHOOLS

WWW.IPS.K12.IN.US

INSPECTED BY TEAM 2:
STEVEN ENTEZARI
HAI DAN HUANG
JAY WHEELER
Table of Contents
I. Executive Summary ................................................................................................. 4
II. Instruments and Methods ...................................................................................... 7
Participants ....................................................................................................................................................7
Data Collection .............................................................................................................................................7
Environment & Timeframe.......................................................................................................................8
Tools................................................................................................................................................................9
Tasks................................................................................................................................................................9
III. Results ................................................................................................................ 11
Task 1 – School Board Meetings .......................................................................................................... 15
Task 2 – Finding the Technology Plan ................................................................................................ 20
Task 3 – College Resources .................................................................................................................... 24
Task 4 – Find the Closest School ......................................................................................................... 27
Task 5 – Magnet School Application ................................................................................................... 31
Task 6 – Payroll Contact Information (Teachers Only).................................................................. 34
Task 7 – Teacher Amendment (Teachers Only) ............................................................................... 34
IV. Synthesis of Results from Inspection ................................................................... 36
Content ........................................................................................................................................................ 36
Information Architecture ........................................................................................................................ 37
Navigation................................................................................................................................................... 40
Presentation ................................................................................................................................................ 41
V. Overall Recommendations for Improvement ........................................................ 43
Content ........................................................................................................................................................ 43
Navigation................................................................................................................................................... 44
Information Architecture ........................................................................................................................ 45
Presentation ................................................................................................................................................ 46
VI. Appendixes ......................................................................................................... 48
Appendix A: Pre-test Questionnaire .................................................................................................... 48
Appendix B: Usability Test Script......................................................................................................... 50
Appendix C: Post-Task Questionnaire ................................................................................................ 54
Appendix D: Post-Test Questionnaire ................................................................................................ 55
Appendix E: Usability Testing Videos................................................................................................. 56

Table of Figures

Figure 1 - Average times it took each participant to finish each task. ..........................................11
Figure 2 - Full-Success/Partial-Success/Give-Up rates for each participant per task ..............12
Figure 3 - Summary of quantitative data showing average time on task, lostness, success rates
and ease of use for each task.........................................................................................................13
Figure 4- Correlation of measured items .............................................................................................14
Figure 5 - IPS Homepage .....................................................................................................................16
Figure 6 - Main Navigation for IPS Site..........................................................................................17
Figure 7 - Task 2 Quantitative Summary ......................................................................................20
Figure 8 - Divisions/Departments page with no Technology section ...............................21
Figure 9 - Task 3 Quantitative Summary ......................................................................................24
Figure 10 - Long list of state colleges and universities ...........................................................25
Figure 11 - Task 4 Quantitative Summary ...................................................................................27
Figure 12 - Task 5 Quantitative Summary ...................................................................................31
Figure 13 - Tasks 6 & 7 Quantitative Summary ...................................................................34

I. Executive Summary
The Indianapolis Public Schools (IPS) Corporation website is an informational website about the Indianapolis Public School Corporation. The purpose of the site is to communicate news, events and education-related information to its primary groups of users: faculty, parents and the community.

This report identifies the key findings from the usability study our team conducted on the IPS Corporation website. The study included 11 participants: 5 community members, 4 parents and 2 teachers. All participants performed 5 tasks, except for the teachers, who performed 7. Each task was recorded to capture data, which was later analyzed and included in this report. Tasks were selected for their saliency in providing information to prospective users, based on activities performed both routinely (such as finding school board meetings) and less frequently (such as finding the cost associated with the technology plan budget). Teachers performed two additional tasks designed to help the team identify how recognizable the layout and content were to general users of the site compared with teachers. In general, a task had to include at least 3 steps to be considered for the study. In addition, an earlier usability inspection helped establish the direction and objectives for the final usability study.

The following key findings, captured from both the usability inspection and the usability study, have been identified as having the most impact on the usability of the website:

1. Content: Many participants became confused due to vague, undated or missing information.

2. Navigation: Many problems were due to inconsistency in the location and style of links. Additionally, the page banner located on every page, because of its size and distracting messages, forced all users into unnecessary scanning of pages to locate the appropriate navigation and/or content for the tasks.

3. Information Architecture: The website needs to align with how the target users understand a traditional website. Many problems surrounded the organization of topics within a subject area. Topics that belonged to one subject were filed under other, unrelated subjects, which created vague or inconsistent relationships between pieces of information and confused participants.

4. Presentation: A lack of consistency in the style and organization of page elements (i.e. links, headers, banners) caused additional overhead in the participants' ability to navigate and use the website effectively.

After careful review of both the usability inspection and the evaluation study, and given the four most impacted areas mentioned above, our team recommends the following seven critical areas for improvement. Additional recommendations can be viewed in the "Overall Recommendations for Improvement" section later in this report.

Key Recommendations:

1. Establish a checklist and a content review board to ensure information is up-to-date and correct.
2. Ensure dates are captured and identified on items that have relevance around a date or time.
3. Use proper formatting, ensuring consistency between text and links, as well as between banner pictures and links.
4. Keep only the most necessary navigation and remove links irrelevant to the page. Make sure the appearance and layout of the navigation are salient and consistent.
5. Provide a sitemap of the website to help users gain an overview of its logical architecture, so that they can find the information they need.
6. Include visual cues in the navigation that highlight the user's current location (see the sketch after this list). Ensure each page has a title matching the page's content.
7. Make the navigation bar around the banner area more salient. Avoid using colors similar to the banner for the background of the navigation bar. Make the navigation bar look clickable, like buttons.
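
As an illustration of recommendation 6, the following is a minimal sketch of one way to mark the user's current location in the navigation. The section names and URLs are hypothetical, not taken from the IPS site.

    # Hypothetical navigation structure; the real IPS sections would replace these.
    NAV = [
        ("Home", "/"),
        ("About IPS", "/about/"),
        ("Schools", "/schools/"),
        ("Parents", "/parents/"),
        ("School Board", "/board/"),
    ]

    def render_nav(current_path: str) -> str:
        """Render the navigation as HTML, flagging the section the user is in.

        The matching link gets class="current" (and aria-current for screen
        readers), giving the visual "you are here" cue participants lacked.
        """
        items = []
        for label, href in NAV:
            is_current = current_path == href or (
                href != "/" and current_path.startswith(href))
            attrs = ' class="current" aria-current="page"' if is_current else ""
            items.append(f'<li><a href="{href}"{attrs}>{label}</a></li>')
        return "<ul>\n" + "\n".join(items) + "\n</ul>"

    # Example: a user viewing /board/meetings/ sees "School Board" highlighted.
    print(render_nav("/board/meetings/"))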

II. Instruments and Methods
This study employed several tools and techniques for collecting information pertaining to the usability study. The following sections give an account of the participants, methods and instruments. A complete list of artifacts used in the study, including the Pre- and Post-Test Questionnaires, the Post-Task Questionnaire and the Usability Study Script, can be found in the Appendixes.

Participants

The IPS district website caters to three primary groups of users: teachers, parents and the community. For our study we focused on these key user groups, from which we solicited 11 participants (5 community members, 4 parents and 2 teachers). All 11 participants had at least some computer experience, with 7 self-identifying as very experienced, and all reported over 20 hours of Internet usage per week. Additionally, 9 of the users had never visited the IPS website; however, all 4 parents and both teachers had visited their own school corporation's website.

Data Collection

Due to the complex nature of collecting both quantitative and qualitative data, our team developed and employed a well-crafted process and set of artifacts for the study (see Appendix B). The evaluator first traveled to the participant's location with a laptop loaded with recording and capture tools (see Tools below). Once set up, the evaluator gave the participant a pre-test questionnaire, which solicited demographic information along with computer and Internet usage information. Once the participant completed the pre-test questionnaire, they were read a script that provided an overview of the study, including its purpose, the intended use of the information collected, privacy & confidentiality, and their rights as a participant. After the participant agreed to the terms, the evaluator gave them an introduction to the computer system and to the website, allowing them to complete a nominal task before the official test was initiated.

Once the participant was familiar with the setup of the laptop, browser and site, the evaluator began the test. The evaluator started recording the session and read each task out loud, recording the start and end times along with the success, partial success or failure for each task. In addition, participants were encouraged to think out loud as they worked through each task. This helped provide additional insight and rich qualitative data, which was noted and later analyzed by the evaluator and the team. After each task the participant was also given a closed-question post-task questionnaire, which solicited their feedback on how satisfied they were with the ease of completing the task. The questionnaire scale ranged from 0 (strongly disagree) to 5 (strongly agree).

After all the tasks were completed, the evaluator moved to the last section of the script, which included the post-test questionnaire and wrap-up. The post-test questionnaire included several open-ended questions to give participants a chance to share their overall impressions of the website and to reflect on the site as a whole. Once the post-test questionnaire was complete, participants were again thanked for their participation and given the evaluator's contact information in case additional thoughts or concerns arose about the website or the study in general.

Environment & Timeframe

Due to the short timeframe for this study and the need to emulate an environment that best represented our key user groups, most of the testing was conducted in either the participant's home or place of work. The testing was conducted between November 1st and November 8th, 2010, at times that were most convenient for the participants.

Tools

A variety of tools were used to record and capture data from the participants. These tools included Clearleft's Silverback 2.0, CamStudio and Synium's Screenium for local participants, and Cisco's WebEx for remote participants. These tools allowed our team to record the participant's on-screen actions and verbal comments along with the participant's face, which let us capture both verbal and nonverbal cues.

Tasks

Below are the tasks participants were asked to complete. Tasks 6 and 7 were included only if the participant was a teacher. Tasks were selected for their saliency in providing information to prospective users, based on activities performed both routinely (such as finding school board meetings) and less frequently (such as finding the cost associated with the technology plan budget). The earlier usability inspection was also used to identify steps within the tasks that would expose problematic areas found by our experts. The additional tasks for teachers were designed to help the team identify how recognizable the layout and content were to general users of the site compared with teachers. In general, candidate tasks had to include at least 3 steps (or navigation points) to be considered for the study.

1. Find the next school board briefing meeting so that you can attend.

As a parent you are growing concerned about the direction the IPS School Board is moving concerning year-round school. Another parent mentioned that there are school board meetings where you can go and voice your concerns. Assume today is December 1st and you want to go to the next briefing session. Please identify the date for this meeting.

2. Find the dollar amount IPS has budgeted towards technology in its Technology Plan.

As a parent you feel that your child's ability to learn and use technology is very important. In the midst of growing concerns about budget cuts you are interested in knowing how much money IPS has allocated to implementing new technology. Find the budgeted amount IPS has allocated in its Technology Plan.

3. Find contact information for Sawyer College in Merrillville.

As a parent of a high school junior, another parent told you that IPS has resources for you and your child to start exploring colleges. Find contact information about Sawyer College in Merrillville.

4. Find the middle school closest to your house.

As a parent you want to identify the schools that are nearest to your home where your child would attend middle school. Your house is located at 816 N. Audubon Road, Indianapolis, IN 46219. Please name the closest middle school to your house and find the enrollment information for that school.

5. Find the 2011-2012 English Student Application for the Magnet Schools.

You noticed that a school located near your house was a magnet school. Please locate the 2011-2012 English student application for this magnet school.

If you are a teacher, please complete the additional two tasks:

6. Find the telephone number and location of IPS payroll.

As a new teacher you had an issue with your paycheck this week. You contacted the front office at your school; however, because they do not directly deal with payroll, they directed you to the corporation website to find the appropriate contact information and the location where you needed to go to sort everything out. Identify the telephone number and location of the payroll office.

7. Find amendment 5233 of the Teacher Contract.

As a teacher you want to review updates to the 2010 teacher salary information, specifically Amendment 5233. Locate information about Amendment 5233.

III. Results
After careful analysis of both live sessions and recorded user-testing review, we identified trends common to many of our participants. In our analysis we looked at quantitative data, specifically time-on-task, success/partial-success/fail rates per task, post-task questionnaire answers, and a calculation of each participant's "lostness". The qualitative data we focused on included our team observations, questionnaires, demographic data, and other items that gave a sense of the user's state of mind, the site's pleasurability, and the overall acceptance of the site by the user. These results were analyzed first individually, then compared across team members for inter-rater reliability. They were then cross-referenced against the heuristic and scenario-based evaluations conducted previously for this site.

Here is a summary of the tasks, mentioned before:

Task 1 Find the next school board briefing meeting so that you can attend.
Task 2 Find the dollar amount IPS has budgeted towards technology in its Technology Plan.
Task 3 Find contact information for Sawyer College in Merrillville.
Task 4 Find the middle school closest to your house.
Task 5 Find the 2011-2012 English Student Application for the Magnet Schools.
Task 6 Find the telephone number and location of IPS payroll.
Task 7 Find amendment 5233 of the Teacher Contract.

Time on Task Analysis


[Chart omitted: average time (seconds, 0-400) to complete each task, tasks 1-7, with time-on-task and perceived ease plotted together.]

Figure 1 - Average times it took each participant to finish each task.

We measured the time it took each participant to accomplish the tasks. Above you see the average time on task per task. Notice that the times for tasks two and four are the highest. These two tasks proved to be the most challenging for our participants, as you will see in later parts of this report. The chart also compares time on task with the perceived ease reported by the participants. Notice, for task 2, that the perceived ease is significantly lower than for the other 6 tasks. This indicates that users found it the most difficult and spent the most time attempting to complete it. One thing that should be mentioned with regard to the time-on-task calculations is that they include attempts by participants that were unsuccessful or incorrect. Next you will see success rates, which correlate with these findings.

Analyzing Levels of Success


[Chart omitted: stacked percentages (0-100%) of complete success, partial success, and "give up / wrong / used outside tool" outcomes for tasks 1-7.]

Figure 2 - Full-Success/Partial-Success/Give-Up rates for each participant per task

For every participant and every task the participant started, we kept a tally of successful completions, partial successes, and complete withdrawals. A successful completion occurs when the user completes the task with no input from the proctor (or input that would not alter the participant's intended course). A partial success occurs when the participant successfully completed the task but was, at some point during the scenario, assisted by the proctor. Give up or withdraw means the participant never completed the task, even after being given help. While task 2 has the lowest success rate of all the tasks, notice, from before, that it also had the highest time on task. This is indicative of users attempting, by all means, to complete the task yet failing to do so.

Quantitative Summary

[Chart omitted: average time on task (0-350 sec, left axis) plotted with lostness, success rate and ease of use (0-3.5, right axis) for tasks 1-7.]

Figure 3 - Summary of quantitative data showing average time on task, lostness, success rates and ease of use for each task.

After combining the two items listed before, time on task and success rate, with measures of lostness and ease of use, we can see that the most problematic tasks were Task 2 and Task 4. Both of these tasks displayed the highest time-on-task and lostness levels; they also displayed the lowest success-rate and ease-of-use averages. This not only indicates that these two tasks were quantifiably more difficult than the rest, but also that users could feel this while completing them, based on the post-task questionnaire regarding ease of use.

Another thing to note is that tasks 6 and 7 were performed only when the participant was a teacher. There were a total of two teachers, both of whom were also parents. All 11 participants completed tasks 1 through 5. The data shows that tasks 6 and 7 were both completed with complete success, essentially no lostness, and a comparatively low average time on task. This indicates that the teacher-participants may have been experts with the system, as they need to use it for daily work. While this may imply that increased exposure to the site generates higher satisfaction with its usability, it is important to note that a user may become habituated to a poor usability experience and learn the "wrong ways" efficiently.

After examining tasks 1 through 5, we noticed an interesting phenomenon regarding task 3. Of the 5 main tasks, task 3 had the lowest time on task. Its ease of use was relatively high in comparison to the other tasks, its success rate tended to be higher than average, and the lostness during task completion was extremely low. These facts may imply a conflict between our heuristic- and scenario-based usability evaluation and the usability testing done by our participants. This disconnect is explored later when looking at each task individually.

              Time on Task   Success Rate    Ease
Lostness          0.92          -0.93       -0.39
Time on Task                    -0.95       -0.68
Success Rate                                 0.66

Figure 4 - Correlation of measured items

After running correlations on the measured items, we found three interesting, and very strong, relationships. First, across participants there was a +0.92 correlation between time on task and lostness. This indicates that the more time a participant spent, the more lost they became. It also implies that time was not spent solely on a few pages: lostness factors in page counts, so the user spent a great deal of time searching many pages to complete the task. We also found a -0.93 correlation between success rates and lostness. This is very interesting, as it indicates that as users got more and more lost in the system, they had a much lower chance of succeeding. It also implies that the more the users succeeded, the less lost they were in the system, meaning they needed fewer clicks to get to their final destination. The final, and strongest, correlation (-0.95) is between the participants' time on task and their success rate: as time on task went up, the success rate went down. This implies that when participants did succeed, they did so quickly, and the items on which they were unsuccessful took more time.
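
For reference, a common formulation of lostness is Smith's (1996) measure; the sketch below shows how per-trial lostness and the Figure 4 correlations could be computed. The exact formula behind our instruments is assumed here, and the sample trial numbers are illustrative.

    import math

    def lostness(total_views: int, unique_pages: int, optimal_pages: int) -> float:
        """Smith's (1996) lostness measure.

        total_views   (S): every page view, counting revisits
        unique_pages  (N): number of distinct pages visited
        optimal_pages (R): pages on the shortest path to the target

        0 means a perfect path; values above roughly 0.5 indicate serious lostness.
        """
        return math.sqrt((unique_pages / total_views - 1) ** 2
                         + (optimal_pages / unique_pages - 1) ** 2)

    def pearson(xs, ys):
        """Plain Pearson correlation, the statistic behind Figure 4."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # One illustrative trial: 11 page views over 8 distinct pages, 4-page ideal path.
    print(round(lostness(11, 8, 4), 2))  # 0.57 -- "relatively high"

    # Per-task averages from the summaries below (tasks 1-5):
    time_on_task = [229, 328, 143, 266, 212]        # seconds
    success_rate = [7/11, 3/11, 7/11, 5/11, 7/11]   # full-success fraction
    print(round(pearson(time_on_task, success_rate), 2))  # about -0.88 here; the
    # report's -0.95 was computed over the full participant-level data.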

Task 1 – School Board Meetings

• Task description
“As a parent you are growing concerned about the direction the IPS School board
is moving towards concerning year round school. Another parent mentioned that
there are school board meetings where you can go and voice your concerns. Please
identify the date for the next briefing session.”

Quantitative Summary

Average Time on Task / Average of All: 3:49 / 3:54
Success Rates: Full-Success – 7, Partial-Success – 2, Fail – 2
Lostness: 0.53

Participants completed task 1 after completing a "warm-up" task to help avoid the learning curve of the site. While this does help avoid some skewed statistics, it does not change the fact that users of any site become more proficient with its general navigation, logical constructs, and overall usability as they go, good or bad design.

Given that, the time on task for task 1 was right on the average for the first five tasks (the tasks intended for parents). Seven of the participants completed this task without any help from the proctors, while 2 needed assistance to finish successfully and 2 failed or gave up. The participants had a lostness of 0.53, which indicates a relatively high lostness for this task.

• Usability Issues encountered by participants:

#Problem 1: Missed Navigation

* Problem Description:

We noticed that participants failed to recognize the main navigation area as clickable links, a phenomenon commonly known as "banner blindness". This is especially harmful on the homepage of the IPS site because of its dashboard-like style. We observed that users tended to spend a large amount of time looking for their course of action specifically within the homepage content structure. This was emphasized in our usability inspection in a section denoted "Homeless Homepage", where we described the problems with a homepage containing only links. We saw that this dashboard-like style led participants down a path that reinforced the belief that the homepage's main content had all the answers for their task. This information architecture setup is one of the reasons contributing to the high lostness level: participants assumed that if something wasn't in the homepage area, the task could not be completed.

Figure 5 - IPS Homepage

* Severity | 1 (Show Stopper)

* Design Defect | Navigation

#Problem 2: Inconsistent Labeling

* Problem Description:

Users continuously tried to find school board calendars via the calendar links on both the homepage and the individual school sites. They were surprised, and a little frustrated, when the page that appeared after clicking the calendar link offered multiple different types of calendars, none pertaining to school board meetings. The content simply didn't match the participants' expectations. It was interesting to see that when users finally ended up on the IPS school board calendar page, they had, on average, passed the link 4 times before. This accounts for the high lostness displayed by a majority of the participants.

Participant comments:

"The access to the school board meeting should also be in the Parents part"

"The task is about parent's school board meetings, so I think I can find the meeting time in parents"

"I don't think it would be in general information because the title seemed to be about static information, but the meeting times seem more dynamic. Balanced Calendars seems more relevant."

"I would prefer it showed general information once I am on the school board page."

"The members list in school boards is distracting. It makes me think the School Board [link] is about personal introductions with no information about events"

"What is DOE?"

* Severity | 2 – Medium

* Design Defect | Navigation

#Problem 3: Logical Constructs Not Adhered To

* Problem Description:

Participants were fixated on the fact that when looking for school board meetings, you are looking from the perspective of a parent. While logically school board meetings may seem to belong under the category of "School Board", our participants still felt they were an activity that "Parents" attend. Given this, it seemed only logical to participants to navigate to the main Parents link and search for the information in that subdirectory. Only after much confusion in the Parents directory did many of the users finally decide to reevaluate their approach and search other main links, or simply give up at that point. This emphasizes how strong the participants' expectations were regarding the location of school board meetings: the majority expected to find them under the Parents link as opposed to the School Board link.

Figure 6 - Main Navigation for IPS Site

* Severity | 2 – Medium

* Design Defect | Information Architecture

#Problem 4: Banner Data Inaccessible

* Problem Description:

One item that was not prevalent in our earlier usability inspection, but became very evident during user testing, was the set of problems with the banner. For a few participants, when the rotating banner rotated to a particular ad dealing with school boards (although the ad was not relevant to the specific task), the users felt the need to wait for the ad to come full circle before proceeding. This rotation of the banner caused a higher time on task because of the wait for it to come back around.

* Severity | 2 – Medium

* Design Defect | Navigation

#Problem 5: User Calculation

* Problem Description:

Once users navigated to the correct page, some characteristics of the interaction were quite interesting. Users noted that when navigating through the list of school board meetings, it was hard to identify the next meeting because of the order of the list. They expected the next meeting to be displayed first and didn't want to hunt down the page comparing every date to today. They needed some form of distributed cognition to assist with the calculation involving today's date, the displayed meeting dates, and the difference between them.

* Severity | 2 – Medium

* Design Defect | Navigation
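
The fix is simple to express in code. Below is a minimal sketch, with hypothetical meeting data, of the support participants were asking for: show only upcoming meetings, soonest first, with the date arithmetic done for the user.

    from datetime import date

    # Hypothetical meeting list; the real entries would come from the site's data.
    meetings = [
        {"type": "Briefing", "date": date(2010, 11, 16)},
        {"type": "Action",   "date": date(2010, 11, 30)},
        {"type": "Briefing", "date": date(2010, 12, 14)},
    ]

    def upcoming(meetings, today):
        """Keep only future meetings and sort them soonest-first, so the
        user never has to scan past dates or compare them to today by hand."""
        return sorted((m for m in meetings if m["date"] >= today),
                      key=lambda m: m["date"])

    today = date(2010, 12, 1)  # the date Task 1 asks participants to assume
    for m in upcoming(meetings, today):
        days = (m["date"] - today).days
        print(f'{m["type"]} meeting on {m["date"]:%B %d, %Y} (in {days} days)')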

Task 2 – Finding the Technology Plan

• Task Description

“As a parent you feel that your child’s ability to learn and use technology is very important. In
the midst of growing concerns about budget cuts you are interested in knowing how much
money IPS has allocated to implementing new technology. Find the budgeted amount IPS has
allocated in its Technology Plan.”

Quantitative Summary

Average Time on Task / Average of All: 5:28 / 3:54
Success Rates: Full-Success – 3, Partial-Success – 4, Fail – 4
Lostness: 0.60

Figure 7 - Task 2 Quantitative Summary

As evident from the summary above, the time on task for task 2 was higher than for any other task in this report. At 328 seconds, participants took more time to complete this task than a majority of the others, even though the ideal path consisted of only 4 pages. There were also fewer successes on this task than on any of the other main tasks (tasks 1 through 5). Both the time on task and the success rate allude to a high lostness level; indeed, the lostness level for this task was higher than for any other task in this report.

• Usability Issues encountered by participants

#Problem 1: Skip Navigation

* Problem Description:

Participants consistently skipped the navigation of the page and skimmed through the page content in hopes of finding the answer. This contributed a great deal to the average time on task for this task. We believe the participants felt the need to skim every page for the answer due not only to the banner blindness explained earlier, but also to the mislabeling of pages. Page titles within the About IPS section are all the same, so participants had trouble differentiating which "About IPS" section they were in. Relatedly, many participants were unsure of their current location because the page title and navigation did not reflect their visited or current pages. This issue was addressed before in our inspection, in a section under Information Architecture labeled "Where am I".

* Severity | 1 – Show Stopper

* Design Defect | Information Architecture

#Problem 2: Language Disparities

* Problem Description:

There were some semiotics-related problems regarding the language used on some of the pages. As an example, consider this task's language of "Technology Department". When attempting to find the technology department on the Divisions/Departments page (which most participants navigated to quite efficiently), they all immediately scrolled to the T's to locate "Technology", when in actuality the correct page was "Information Technology". Four users simply left the page, assuming they were in the wrong area. A few of them were redirected back to the page and given the hint to think of synonyms for "Technology", while others found it simply by chance. This is a prime example of the need to ensure that the language written matches the language understood by users; in this case the users are parents, who may or may not be familiar with technology terminology. Covering both wordings would allow users to find what they are looking for rather than simply leaving the page, as our participants so eagerly did.

Figure 8 - Divisions/Departments page with no Technology section

* Severity | 2 – Medium

* Design Defect | Semiotics

#Problem 3: Where to Begin

* Problem Description:

A couple of our participants navigated to the superintendent's blog or to news areas such as recent performance metrics or school district goals. Some even expected the plan to be a form of general information which, they assumed, would be located on an "About Us" type of page. Regardless of where users thought the budget, or technology plan, would be, they didn't expect it to be where it actually is. They all reached the intended page quickly, but only after some hesitation at the beginning of the encounter.

Participant comments:

"I don't even know where I am"

"If I'm looking for something like 'report' or 'annual' then it's not going to be connected with a department; it's for the whole organization!"

"The technology budget plan should be under About IPS"

"I think the tech-budget belongs to the IPS budget in general."

"Honestly, I would probably just end up calling them at this point."

* Severity | 2 – Medium

* Design Defect | Information Architecture
#Problem 4: Navigation of PDF

* Problem Description:

When users found the technology plan, time on task was recorded; the goal was to see how long it would take users to locate and access the technology plan. After the task was deemed complete, we allowed the users to continue without telling them we were no longer timing them. This proved rather revealing about another issue we hadn't even considered: once the user has accessed the technology plan, finding the actual information within it proved rather difficult. The navigation and information architecture of the PDF document were not easily navigable and elicited some frustration in our users. Even within the document, users didn't know where they were or how to get to where they wanted to go. We recommend some form of hypertextual linkage, either within the PDF document or on an HTML page carrying the same information. As of now, the table of contents lists an approximate page that the user has to scroll to, if they use the table of contents at all.

* Severity | 1 – Show Stopper

* Design Defect | Navigation & Information Architecture
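
One inexpensive version of the hypertextual linkage we recommend is an HTML table of contents whose entries open the PDF at the cited page via the standard #page= open parameter supported by mainstream browser PDF viewers. The outline entries and file name below are placeholders, not the real plan's contents.

    # Placeholder outline; real entries would mirror the technology plan's TOC.
    toc = [
        ("Executive Summary", 2),
        ("Budget Overview", 12),
        ("Implementation Timeline", 27),
    ]

    PDF_URL = "technology-plan.pdf"  # hypothetical location of the document

    def toc_html(entries):
        """Build a linked table of contents: each link opens the PDF directly
        at the cited page instead of making the reader scroll to it."""
        links = "\n".join(
            f'  <li><a href="{PDF_URL}#page={page}">{title} (p. {page})</a></li>'
            for title, page in entries)
        return f"<ol>\n{links}\n</ol>"

    print(toc_html(toc))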

Task 3 – College Resources

• Task Description
“As a parent of a high school junior, another parent told you that IPS has resources for you and
your child to start exploring colleges. Find contact information about Sawyer College in
Merrillville.”

Quantitative Summary

Average Time on Task / Average of All: 2:23 / 3:43
Success Rates: Full-Success – 7, Partial-Success – 3, Fail – 1
Lostness: 0.31

Figure 9 - Task 3 Quantitative Summary

Time on task for task 3 was lower than for any of the other 5 main tasks. At 143 seconds it surpassed all of our expectations and gave some insight into what we had thought was a clear usability problem. Task 3 had more full and partial successes than any other task. Lostness, while still notable, was also the lowest of all five tasks. The one participant who failed or gave up on this task does not account for a major shift in the data, as the other 10 participants averaged approximately the same time on task.
The pages in this section of the site contained a lot of information in one location and were a major focus of our usability testing. Users searching for the school began to get frustrated at the number of scrolls and page clicks they had to go through just to find information on a state university. Some didn't think that "resources" was the appropriate place for state universities. Others, similar to the problems outlined in task 1, believed that state university searches were activities the parents would complete and, therefore, would be located in the Parents section.

Figure 10 - Long list of state colleges and universities

Some areas of focus identified during this task by our team dealt mostly with general elements needing correction, as opposed to task-specific items, including elements discussed earlier: navigation blindness due to the banner, mislabeled pages, and inconsistent sub-navigation.

Participant comments:

"Can I just Google it?"

"Does the navigation change every time I go to a new page?"

"Why are there two 'home' links?"

"(While browsing within the schools) How can I get back to that other page?"

"I would have just Googled this"

As discussed earlier, this task had a lower time on task than any of the other 5 main tasks for parents. Some other important discoveries were made regarding the quantitative data as well. The success rate, on average, was ultimately higher than for the other main tasks, and the lostness level was the lowest of all of them. Our usability inspection had indicated that the resources section would be a very error-prone section of the website due to the information architecture and navigation presented. Our assumptions were proven false with regard to the quantitative metrics: the participants were able to get where they needed to go relatively efficiently. However, getting there is only half of the battle with regard to usability. The pleasure the user attains in accomplishing the task is equally as important as the accomplishment of the task itself.

• Usability Issues encountered by participants

#Problem 1: Mislabeling of Pages

* Problem Description:

Participants expressed distaste with their experience of the site based upon psycho-pleasures related to page semiotics. Mislabeled pages and multiple links meaning the same thing caused grief for the participants. The information architecture from which users built their mental map of the site was truly tested at this point: users had developed a sense of how things were set up during the first two tasks and could now use that construct to accomplish this one. However, even with that two-task preparation, some users still felt they could not navigate the site appropriately because of the structure of the sub-pages.

* Severity | 2 – Medium

* Design Defect | Semiotics & Information Architecture

Task 4 – Find the Closest School
• Task Description:

“As a parent you want to identify the schools that are nearest to your home where your child
would attend middle school. Your house is located at 816 N. Audubon Road, Indianapolis, IN
46219. Please name the closest middle school to your house and find the enrollment information
for that school.”

Quantitative Summary

Average Time on Task / Average of All: 4:26 / 3:43
Success Rates: Full-Success – 5, Partial-Success – 1, Fail – 5
Lostness: 0.59

Figure 11 - Task 4 Quantitative Summary

The time on task for finding the school closest to your home was the second highest of all tasks. There were a total of 5 complete successes, 1 partial success (which involved redirection by the proctor), and 5 complete fails or give-ups. The level of lostness is almost equivalent to the highest lostness (only one hundredth off).

It is important to note that, even though the time on task for this scenario is the second highest of all tasks, it could be worse than it looks. Because 5 of the participants completely failed the task, the numbers may be slightly skewed: if these participants gave up too quickly, their attempts would have dragged the average time on task lower than it would otherwise have been. With 5 participants failing or giving up, and a lostness score of 0.59, it is evident that the participants, in general, were not pleased with what it took to accomplish the task. On top of this, the average ease-of-use index was the second lowest of all of the tasks.

• Usability Issues encountered by participants:

#Problem 1: Banner Inconsistencies

* Problem Description:

Participants noticed that the banner was static on some pages and dynamic on others. This caused much confusion when users were attempting to figure out how they had ended up at certain points in their navigation. When hovering over a banner, the user was unable to determine whether it was a link or not. This compounds the previous concerns regarding banner blindness: if the banner acts like text, the user becomes even more likely to avoid actual text, treating it as part of the banner.

Participant comments:

"The Boundary Map doesn't help. Where can I give my address?"

"The Map is too small to see the information"

"I'm not familiar with the school boundary map"

"I think I left the website"

"My first reaction is to use Google maps"

* Severity | 2 – Medium

* Design Defect | Navigation
#Problem 2: Logical Path Undefined

* Problem Description:

A logical path dealing with enrollment was selected by a few of our participants. When participants went to the enrollment section of the site, they expected to be able to locate a school there; locating a school and enrolling in it seemed to belong together for most of our participants. (One asked: "Can I use Google Maps?") When they accessed the enrollment page and did not find a school locator, the users either remained confused and browsed randomly, or simply gave up. Some users even felt the need to search the school-specific sites to see if there was something there that could assist them. This led to some of the information architecture issues we explained earlier.

* Severity | 2 – Medium

* Design Defect | Information Architecture

#Problem 3: Impossible Task – Does Not Exist

* Problem Description:

Another logical area where users expected to find a school's location was the Schools section of the site. Some participants went immediately there and assumed the task was a "trick" question, because they couldn't believe the developers would not implement something like this here. The area they were looking for was the boundary map, which was not located in the school sub-pages. The expectation that the "boundary map" would live under "Enrollment" or "Schools" is understandable, and links to the boundary map should be included in both of these sections.

* Severity | 1 – Show Stopper

* Design Defect | Semiotics

#Problem 4: Syntax Rules Not Obvious

* Problem Description:

The boundary map itself had some issues with technology and functionality. We found that when users type in their address to locate the nearest school, they need to match the syntax rules of the system, displayed under the map and input field. When they enter an address that doesn't match the syntax rules, they are presented with an alternate address as a "Did you mean..." type of suggestion. However, upon clicking this, the subsequent page is another page notifying the user that the input is incorrect; the suggestion itself is as invalid as the user's input. Another, higher-level issue deals with users' understanding of the semiotics of the page: many of our participants did not like "Boundary Map" as the label for the area they navigate to when attempting to locate a school close to them.

* Severity | 1 – Show Stopper

* Design Defect | Semiotics
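
Rather than rejecting input that misses the exact syntax, the search could normalize common variations before matching, so that a "Did you mean..." suggestion is only offered when it will actually resolve. A minimal sketch of that idea follows; the abbreviation rules and zone table are invented for illustration.

    import re

    # Common street-word variations folded to one canonical form (illustrative).
    ABBREVIATIONS = {
        "road": "rd", "street": "st", "avenue": "ave",
        "north": "n", "south": "s", "east": "e", "west": "w",
    }

    def normalize(address: str) -> str:
        """Lower-case, strip punctuation, and canonicalize common words so that
        '816 N. Audubon Road' and '816 north audubon rd' match the same key."""
        words = re.sub(r"[^\w\s]", "", address.lower()).split()
        return " ".join(ABBREVIATIONS.get(w, w) for w in words)

    # Hypothetical lookup from normalized address to school assignment.
    ZONES = {normalize("816 N. Audubon Road"): "a middle school zone (placeholder)"}

    for attempt in ["816 North Audubon Rd", "816 n. audubon ROAD"]:
        print(attempt, "->", ZONES.get(normalize(attempt), "no match"))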

Task 5 – Magnet School Application

• Task Description
“You noticed that a school located near your house was a magnet school. Please locate the
2011-2012 English student application for this magnet school.”

Quantitative Summary

Average Time on Task / Average of All: 3:32 / 3:43
Success Rates: Full-Success – 7, Partial-Success – 2, Fail – 2
Lostness: 0.50

Figure 12 - Task 5 Quantitative Summary

Task 5 could be denoted the "average" task for our participants. The time on task was 212 seconds, with seven of our participants finishing completely, 2 finishing with assistance, and 2 failing or giving up. The lostness score sits right at 0.50, which is about average for this site but completely unreasonable for usage in general.

• Usability Issues encountered by participants:

#Problem 1: Mismatched User Expectations

* Problem Description:

This task summarized many of the points previously made in other tasks. One point, raised before for other sections and now for this one, is the labeling of pages and how those semiotics match users' expectations. Users expected the magnet school applications to be in the general information area, especially in an area labeled "Application".

Some participants felt the need to go to the schools list and find a magnet school to home in on while searching for that particular school's application. At this point in the scenario, some participants felt confused by some of the semiotics used to describe the schools: magnet schools were located under both "High School" and "Other Schools", which led users to think a magnet school had to be one or the other. However, even when users found a magnet school, it was to no avail, as the individual school page did not have what they expected to be there: an application to that school. This poor information architecture setup could leave the participant feeling there is no place to go from here.

* Severity | 1 – Show Stopper

* Design Defect | Information Architecture & Semiotics

#Problem 2: Ignored Navigation

* Problem Description:

Users tended to want to read all of the page material before clicking on the magnet link. (One participant asked: "Is 'other' magnet?") This relates to an area we discussed in our inspection: the side navigation being ignored by the user. Most of the time, and especially for this task (which is odd, because it was the last task for most people), the side navigation was not referenced as often as we expected. While the time on task was not terrible, it was high enough to raise concerns that navigation issues like this could cause major pleasure losses for the site.

* Severity | 2 – Medium

* Design Defect | Navigation

Task 6 – Payroll Contact Information (Teachers Only)
• Task Description:

“As a new teacher you had an issue with your paycheck this week. You contacted the front
office at your school; however, because they do not directly deal with payroll, they directed you
to the corporation website to find the appropriate contact information and the location where
you needed to go to sort everything out. Identify the telephone number and location of the
payroll office.”

Task 7 – Teacher Amendment (Teachers Only)


• Task Description:

“As a teacher you want to review updates to the 2010 teacher salary information, specifically
Amendment 5233. Locate information about Amendment 5233.”

Quantitative Summary

Average Time on Task: Task 6 – 60 seconds; Task 7 – 120 seconds
Success Rates: Full-Success – 2, Partial-Success – 0, Fail – 0
Lostness: 0.00

Figure 13 - Tasks 6 & 7 Quantitative Summary

In both task 6 and task 7 there were only two participants, due to the nature of the tasks. The tasks involved items only teachers would need to do, so they were conducted only by our two teacher participants. Time on task was 60 and 120 seconds for tasks 6 and 7 respectively, and both participants succeeded fully within the ideal number of clicks.

Both users found what they needed efficiently. This may not be due to great user experience design, but rather to the participants' adaptation to a poor design. Since both users were teachers within the IPS school district, they were our expert users and thus have had much experience with this interface. Being accustomed to a poor user experience does not mean a poor user experience doesn't exist.

IV. Synthesis of Results from Inspection
The team analyzed the qualitative data discovered during the usability testing and categorized it according to the five design dimensions: Content, Information Architecture, Navigation, Presentation, and Semiotics. The team then compared the testing results with the problems discovered during the inspection to reaffirm the initial findings and to uncover any new ones.

Content

New Problems

Within the Content dimension, participants encountered three new problems during the usability testing which were not discovered in the inspection (ID: 5, 6, 7). They included search problems caused by an "Unclear input requirement", the "Unclear boundary map", and the inconsistency between participants' expectations and the "Vague enrollment information".

Inspection Problems

Of the four problems the team found in the inspection, participants encountered none during the usability testing. The problem of "Missing and inconsistent information" (ID: 1) had already been fixed on the website by the time our participants began testing. The other three (ID: 2, 3, 4) were not confirmed, as they were not directly covered by the tasks; however, participants who did visit those pages during the test did not uncover any problems. The table below compares the problem findings from the inspection and the usability testing.

ID | Inspection Results | Usability Results | Conclusion
1 | Missing and inconsistent information | | Repaired on the website; no longer a problem
2 | Target user misidentified | | Not encountered by participants; severity decreased
3 | Consistency of information and representation | | Not encountered by participants; severity decreased
4 | When was the news new? | | Not encountered by participants; severity decreased
5 | | Unclear input requirement | Major new problem
6 | | Unclear boundary map | Major new problem
7 | | Vague enrollment information | Minor new problem

Information Architecture

In the Information Architecture dimension, participants encountered 7 problems, including 4 problems predicted by the inspection and 3 new ones.

New Problems:

"Mismatched information categorization between user and website" (ID: 1) is a new problem found in usability testing with very high severity. About 8 participants encountered it, and it is a main cause of the highest time-on-task and lostness, and the lowest success rate, on Task 2. Users categorize information according to the semiotic closeness between the target information and a category, while the website is grouped according to the location and structure of the physical IPS departments, an organizational layout the users were not familiar with. "Problem finding items in a long list" (ID: 9) mainly occurred when users were looking for schools on pages containing long lists. When users finally found the school they were looking for, they had expected the list to be sorted alphabetically by school name; instead, the schools were listed in order of IPS school number, which did not provide much help.

Overlapping Problems:

The overlapping problems (ID: 2, 4, 5, 7) were predicted in the inspection and confirmed by the usability testing. Users encountered the "Problem of finding a school according to distance" (ID: 2). Their first reaction was to solve the problem using the boundary map, but the static, per-school boundary maps provided little help for measuring the distance from home to school or for comparing locations; three participants used Google Maps to finish the task. Regarding the problem of "No way back" (ID: 4), users were confused about how to return to the previous page, but they overcame it by clicking the back button multiple times. The problem of "Where am I?" (ID: 5) was caused by both mislabeled pages and a lack of cues for the current location; this weak awareness of the current location makes it difficult for users to find a page again, as well as to understand the website structure. Participants also encountered the problem of "Homepage without main content" (ID: 7): most of them wandered the homepage for a long time, especially in Task 2, seeking an entrance to the information they were looking for, and found little help in the dashboard-like homepage with its many links.
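
The distance problem (ID: 2) is straightforward to solve in software once each school has geocoded coordinates. A minimal sketch of a nearest-school lookup, with invented coordinates:

    import math

    # Invented coordinates; real values would come from geocoding each school.
    SCHOOLS = {
        "Middle School A": (39.7900, -86.0600),
        "Middle School B": (39.8200, -86.1100),
    }

    def haversine_km(a, b):
        """Great-circle distance in kilometers between two (lat, lon) pairs."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        dlat, dlon = lat2 - lat1, lon2 - lon1
        h = (math.sin(dlat / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))

    home = (39.7794, -86.0619)  # assumed geocode of the address used in Task 4
    nearest = min(SCHOOLS, key=lambda name: haversine_km(home, SCHOOLS[name]))
    print(nearest, "-", round(haversine_km(home, SCHOOLS[nearest]), 1), "km away")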

Inspection Problems:

Three inspection problems were not borne out by the usability testing. "Too much information accommodating too many people" (ID: 3) and "Archiving" (ID: 8) were not encountered by participants; due to this lack of evidence, their severity was decreased.

An interesting phenomenon appeared regarding the third, "Unrelated Navigation" (ID: 6): when users were finding information within a page, they naturally ignored the unrelated navigation. The severity of this problem was also decreased.

ID | Inspection Results | Usability Results | Conclusion
1 | (not predicted) | Mismatched information categorization between users and the website | New problem encountered by most users; high severity
2 | Visualization of school proximity | Problem of finding schools according to distance | Severity increased
3 | Too much information accommodating too many people | Not discovered by participants | Severity decreased
4 | No way back | Users were confused and had difficulty getting back from school websites | Severity increased
5 | Where am I? | No cues to the current location | Severity increased
6 | Unrelated navigation | Ignored by users | Severity decreased
7 | Homepage without main content (homeless homepage) | Homepage provides weak navigation assistance into the website; users fixate on the homepage, sure the information can be found there | Severity increased
8 | Archiving | Not discovered by participants | Severity decreased
9 | (not predicted) | Problem of finding items in a long list | New minor problem

Navigation

Overlapping Problems:

In Navigation, all problems predicted in the inspection were confirmed by the usability testing. “Undistinguishable links” (ID: 1) was a recurring problem: users could not tell static text and images (including banners) apart from links. Users also encountered the “Too much navigation” problem (ID: 2), as predicted by the inspection; their reaction was to ignore the links, even the useful ones. For example, instead of clicking the in-page quick links at the top of a page, they scrolled down to find the section they were looking for. “No indication when leaving the website” (ID: 3) was yet another problem that confused users: many expected to find the information within the IPS website and were surprised when they had to navigate back to it or hunt for the window that contained it. Finally, as predicted by the inspection, “Provide choices without decision information” (ID: 4) was encountered in testing and remains a problem.

New Problems:

The usability testing also uncovered a new problem, “Navigation bar blocks navigating within the topic” (ID: 5): the navigation bar does not consistently include all links within a topic.

ID | Inspection Results | Usability Results | Comments
1 | Undistinguishable links | Undistinguishable links (text & banner) | Severity increased
2 | Too much navigation | Users missed navigation links | Severity unchanged
3 | No indication when leaving the website | Lack of external-website alerts led to confusion | Severity unchanged
4 | Provide choices without decision information | School links with vague descriptions | Severity increased
5 | (not predicted) | Technically problematic navigation bar blocks navigating within the topic | New major problem

Presentation

Overlapping Problems:

Users encountered the “Error recovery” problem (ID: 7). In the search feature, they had difficulty recovering from an error when their input did not meet the required format, because no instructions told them what to do. Regarding the “Mislabeled pages” problem (ID: 8), users were confused that several different pages carried the same label.

New Problems:

The “Banner blindness” problem (ID: 1) was not predicted in the inspection; however, based on the issues uncovered in usability testing, the team classified it as high severity. Several users overlooked the navigation bar at the top, especially on the homepage, which made it difficult for them to reach the topics within the website; in contrast, they did notice the so-called “deemphasized” menus on the right. A second new problem is that “The search bar” (ID: 10) is too small to be noticed. A third, “Un-salient links” (ID: 11), is that links are not shown in a consistent format (i.e., there is no visual cue that the text is a link).

Inspection Problems:

The problems “Users don't know where they are” (ID: 2) and “Mixed representation of information and linked files” (ID: 4) are other aspects of the overlapping problems “Where am I?” and “Undistinguishable links”; thus their severity remains unchanged. The other inspection problems (ID: 3, 5, 6, 9) were not discovered by users in the usability testing, so their severity was decreased.

ID | Inspection Results | Usability Results | Conclusion
1 | (not predicted) | Banner blindness | New problem; high severity
2 | Users don't know where they are | See “Where am I?” above | Severity unchanged
3 | Inconsistent font sizes, weights & decoration (hyperlinks) | Not discovered by participants | Severity decreased
4 | Mixed representation of information and linked files | See “Undistinguishable links” above | Severity unchanged
5 | Inconsistent design of sub-domains | Not discovered by participants | Severity decreased
6 | Deemphasizing position of important menus | Not discovered by participants | Severity decreased
7 | Error recovery | The search feature provides too little information on how to recover when input does not meet the required format | Severity unchanged
8 | Mislabeled pages | Different pages have the same label | Severity unchanged
9 | Insufficient information | Not discovered by participants | Severity decreased
10 | (not predicted) | The search bar is too small to be noticed | New major problem
11 | (not predicted) | Un-salient links | New minor problem

V. Overall Recommendations for Improvement

The team's recommendations span four design dimensions: Content, Navigation, Information Architecture, and Presentation. The dimensions are ordered by the severity discovered in our usability inspection; within each dimension, recommendations are ordered by severity based on our inspection and user feedback. Recommendations identified in both our inspection and user testing cite both sources in their Reference column.

Content

In the Content dimension, no problems overlap between the inspection and the usability testing. Based on problem severity, the website should first supply a section providing the schools' enrollment information (ID: 6), as users expected. The search feature (ID: 4) should also be improved by providing a clear input-format requirement and error-recovery help, and the boundary map (ID: 5) should be improved to provide useful information that is clear enough to read.

The lower-severity problems can also be addressed if the development team has enough time; the relevant recommendations and their references appear in the table below (ID: 1, 2, 3). A sketch of the input-format hint for the search feature (ID: 4) follows the table.

ID | Recommendation | Severity | Reference
1 | Establish a checklist and a content review board to ensure information is updated and correct | 1 | Inspection: Missing and inconsistent information
2 | Notify the user of missing/incomplete information and when it will be updated | 1 | Inspection: Missing and inconsistent information
3 | Ensure dates are captured and identified on items whose relevance depends on a date/time | 1 | Inspection: When was the news new?
4 | Search feature: provide a hint about the required keyword format; for example, show an example address in the required format | 2 | Testing: Unclear input requirement
5 | Boundary map: make sure the boundary map provides the information users need | 2 | Testing: Unclear boundary map. The boundary map only provides a static map of the area around the school, without explanation, while users want to know the distance between the school and their home
6 | Enrollment information: clarify the enrollment process by linking to enrollment information from the relevant subject areas, covering how to enroll, current enrollment status, where to find forms, etc. | 3 | Testing: Vague information. Users expect to learn how to enroll and the current enrollment situation, while the current enrollment information is only a count of how many students are enrolled at the school
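
As an illustration of recommendation 4, the sketch below (TypeScript, browser DOM) shows one way to surface the required format and help users recover from a malformed query. The element ids, hint text, and address-validation rule are all hypothetical, not taken from the IPS site.

    // Hypothetical markup: <form><input id="school-search"><p id="search-hint"></p></form>
    const input = document.getElementById("school-search") as HTMLInputElement;
    const hint = document.getElementById("search-hint") as HTMLElement;

    // Always-visible format hint with a worked example, so users can
    // recognize the expected format rather than recall it.
    hint.textContent =
      'Enter a street address, e.g. "816 N. Audubon Road, Indianapolis, IN 46219"';

    // Loose, illustrative check: a street number, a comma, and a ZIP code.
    function looksLikeAddress(value: string): boolean {
      return /\d+\s+.+,.*\d{5}/.test(value);
    }

    // On an invalid submission, explain how to recover instead of failing silently.
    input.form?.addEventListener("submit", (event) => {
      if (!looksLikeAddress(input.value)) {
        event.preventDefault();
        hint.textContent =
          'We could not read that address. Please use the format ' +
          '"816 N. Audubon Road, Indianapolis, IN 46219" and try again.';
      }
    });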

Navigation

All of the Navigation recommendations (ID: 1, 2, 3, 4, 5) are priorities, as the underlying problems had high severity in the inspection and caused users great difficulty in finishing their tasks during testing. Regarding navigation links, make sure the overall navigation includes links to every page within a topic. The website should also keep links consistent and salient in appearance and layout, so that they are easy to notice and to distinguish from static text and pictures. Moreover, strengthen users' awareness of the website's boundary, by adding an alert on external navigation, to improve their confidence in the information.

It is also necessary to strengthen users' control over navigation by adding a controller to the Flash banner (ID: 6). Sketches of possible implementations for recommendations 5 and 6 follow the table below.

ID | Recommendation | Severity | Reference
1 | Fix the navigation bar to make sure its links cover all the pages within the topic | 3 | Inspection, Testing: Technically problematic navigation bar blocks navigating within the topic
2 | Use proper formatting to ensure consistency between text and links, as well as between banner pictures and links | 3 | Inspection, Testing: Undistinguishable links (text & banner)
3 | Keep the most necessary navigation and remove items irrelevant to the page. Make sure the appearance and layout of the navigation are salient and consistent so it is easily noticed. Distinguish navigation levels by different presentation | 3 | Inspection, Testing: With many navigation areas, users ignore one or two
4 | Provide clear awareness of the website's boundary: alert users when they are navigated to an external website, while keeping graphics and navigation consistent within the main website | 2 | Inspection, Testing: No indication of leaving the website leads to user confusion as to where IPS ended and the new site started
5 | In the school list, provide more search strategies, such as filtering or arranging schools by specific attributes. Provide more information or attributes for each school (i.e., address, email, fax, etc.). One way to keep the list clear while rich in information is to redesign it as a folding list, like Google Reader: users click a school name to unfold more information about it, and a "show all" button unfolds all items at once | 2 | Inspection, Testing: Provide choices without decision information; users had problems with school links with vague descriptions
6 | Provide a controller for navigating through the Flash banner: there should be some way for the user to return quickly to a previously shown banner. Links to each banner should be displayed below the banner, allowing the user to navigate them asynchronously | 2 | Testing: The uncontrollable rotation of the Flash banner makes it difficult for users to browse and make use of it
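
To make recommendation 5 concrete, here is a minimal sketch (TypeScript, browser DOM) of a folding school list. The School fields and the sample rendering are hypothetical; the real IPS school attributes are not known to us.

    interface School {
      name: string;
      address: string;
      phone: string;
    }

    function renderSchoolList(container: HTMLElement, schools: School[]): void {
      // Sort alphabetically by name, as users expected during testing,
      // rather than by IPS school number.
      const sorted = [...schools].sort((a, b) => a.name.localeCompare(b.name));

      // A "Show all" button unfolds every item in one click.
      const showAll = document.createElement("button");
      showAll.textContent = "Show all";
      showAll.addEventListener("click", () => {
        container.querySelectorAll("details").forEach((d) => { d.open = true; });
      });
      container.appendChild(showAll);

      for (const school of sorted) {
        const item = document.createElement("details"); // folds/unfolds natively
        const summary = document.createElement("summary");
        summary.textContent = school.name; // click the name to unfold details

        const info = document.createElement("p");
        info.textContent = `${school.address}, ${school.phone}`;

        item.append(summary, info);
        container.appendChild(item);
      }
    }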
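
Similarly, for recommendation 6, the following sketch replaces the uncontrollable rotation with explicit links below the banner. The banner container id and image URLs are hypothetical placeholders.

    const bannerImage = document.getElementById("banner") as HTMLImageElement;
    const bannerSources = ["banner1.jpg", "banner2.jpg", "banner3.jpg"]; // placeholders

    // One numbered link per banner, displayed below the banner itself,
    // so the user can jump back to any banner at any time.
    const controls = document.createElement("nav");
    bannerSources.forEach((src, i) => {
      const link = document.createElement("a");
      link.href = "#";
      link.textContent = String(i + 1);
      link.addEventListener("click", (event) => {
        event.preventDefault();
        bannerImage.src = src; // show the chosen banner immediately
      });
      controls.appendChild(link);
    });
    bannerImage.insertAdjacentElement("afterend", controls);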

Information Architecture

This is a deep and wide website with the severe problem of mismatched information categorization between users and the website (ID: 1). Because re-architecting the whole website would take great effort, we recommend providing a sitemap as a solution. The link to the sitemap should sit in a salient position, accessible at any time; the sitemap shows the overall website structure, helping users understand the site's conceptual model and locate the information they are seeking. The website should also provide visual cues indicating the user's current location (ID: 2); breadcrumbs and highlighted labels are possible solutions (a breadcrumb sketch follows the table below). The homepage should also be improved to provide a clear overall introduction to the website and helpful navigation assistance into its inner pages (ID: 3).

ID | Recommendation | Severity | Reference
1 | Provide a sitemap of the website to help users gain an overview of its logical architecture, so that they can find the information they need | 3 | Testing: Mismatched information categorization between users and the website. Users categorize information by the closeness of names, but the website groups information by department, and users are not familiar with the organization of the departments
2 | Include visual cues in the navigation that highlight the user's current location. A possible solution is a breadcrumb providing the trail and navigation within the hierarchy. Ensure each page has a title matching the page's content | 3 | Inspection, Testing: Where am I? Users get lost because there are no cues to the current location
3 | Develop an area where main content is displayed to the user, summarizing the site and what it represents | 2 | Inspection, Testing: Homepage without main content provides weak navigation assistance into the website
4 | Provide access to the School Board meeting timetable in the Parent section | 1 | Testing: Mismatched information categorization between users and the website
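
As a concrete illustration of recommendation 2, here is a minimal breadcrumb sketch (TypeScript, browser DOM). The trail data and the container element are hypothetical examples, not the IPS site's actual pages.

    interface Crumb {
      label: string;
      url?: string; // the current page carries no url
    }

    function renderBreadcrumb(container: HTMLElement, trail: Crumb[]): void {
      trail.forEach((crumb, i) => {
        if (i > 0) container.appendChild(document.createTextNode(" > "));
        if (crumb.url) {
          const link = document.createElement("a");
          link.href = crumb.url;
          link.textContent = crumb.label;
          container.appendChild(link);
        } else {
          // Highlight the current location as plain, emphasized text.
          const current = document.createElement("strong");
          current.textContent = crumb.label;
          container.appendChild(current);
        }
      });
    }

    // Example: renderBreadcrumb(nav, [
    //   { label: "Home", url: "/" },
    //   { label: "Schools", url: "/schools" },
    //   { label: "Current School Page" },  // highlighted, not linked
    // ]);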

Presentation

One of the most critical problems the team feels must be fixed is banner blindness, especially on the homepage (ID: 1). A possible solution is to make the navigation bar more salient and keep it at a distance from the banner. The useful widgets, including links and the search button, also need to be made more salient in both visual treatment and layout (ID: 2, 4). Clear error-recovery instructions should also be provided (ID: 3).

ID | Recommendation | Severity | Reference
1 | Make the navigation bar around the banner area more salient. A possible solution is to avoid giving the navigation bar a background color similar to the banner's, and to make the navigation bar look more clickable, like buttons | 3 | Testing: Banner blindness
2 | Make links more visually salient. At the same time, put them in a more prominent position, such as at the top of the page, and avoid mixing them into the body of the content | 2 | Inspection: Mixed representation of information and linked files; Un-salient links
3 | Provide clear recovery instructions when an error appears. For the search feature especially, give users options to choose from so they can recognize rather than recall; provide the format requirement and an example near the input area | 2 | Inspection, Testing: Error recovery; users don't know how to recover from a search error
4 | Make the search button in the banner area more salient so it is noticed. Provide an input area to improve its affordance | 3 | Testing: The search bar is too small to be noticed

VI. Appendixes
Appendix A: Pre-test Questionnaire

IPS Pre-Test Questionnaire


Name:

Date:

Age:

Gender:

Type of Computer:

Browser Used:

Location:

Race:

Type of User (Parent/Teacher/Community/Staff/etc…):


Computers

How familiar are you with computers?

[ ] Very        [ ] Somewhat        [ ] Not At All

How often do you use a computer?

[ ] Very        [ ] Somewhat        [ ] Not At All

What do you mostly use the computer for?

[ ] Email    [ ] Social Networks    [ ] Work    [ ] School    [ ] Browsing    [ ] Other
School and Computers

Have you ever visited a school corporation’s website?

[ ] Yes        [ ] No

If yes, was it IPS’s?

[ ] Yes        [ ] No        [ ] N/A

If yes, how often in the past month?

[ ] 1-3 Times    [ ] 4-9 Times    [ ] 10-20 Times    [ ] 20-30 Times    [ ] More than 30 Times

Would you ever access the IPS site if you were/are not a parent, student, or staff member?

[ ] Yes        [ ] No

Appendix B: Usability Test Script

Introduction
Hi, I'm [Name]. Welcome and thank you for coming. How are you? Today we will be conducting an evaluation of the Indianapolis Public Schools (IPS) Corporation website.

I'm helping IPS understand how well their website works for the people who use it. I will be observing what you are doing today. In the evaluation you will have a chance to use the website and tell us what you think of it: what seems to work for you, what doesn't, and so on.

This evaluation should take about half an hour.

We're going to be videotaping the screen as a record of what happens here today. This video is for our analysis only. Our conversation will also be recorded as an audio track to capture full information for analysis; it's primarily so I don't have to scribble notes and can concentrate on talking to you. The members of our evaluation team will review the video and audio. The information obtained today is strictly for the purposes of this study, and your information will be kept confidential.

Like I said, we'd like you to help us with a product we're evaluating. It's designed for people like you, so we'd really like to know what you think about it and what works and doesn't work for you. You may run into features or functions that do not work right. Please feel free to tell us about your experiences as you go through the evaluation.

Procedure:
The procedure we're going to follow today goes like this: we're going to start out and talk for a few minutes about how you use the web, what you like, what kinds of problems you run into, that sort of thing.

Then I'm going to show you the Indianapolis Public Schools website, which we are evaluating, and have you try out a couple of things with it. Then we'll wrap up: you will talk about your experience of using it, I'll ask you a few more questions about it, and we're done.

Any questions?

Now I'd like to read you what's called a statement of informed consent. It's a standard thing I read to everyone I interview. It sets out your rights as a person who is participating in this kind of research.

As a participant in this research:

- You may stop at any time.
- You may ask questions at any time.
- You may leave at any time.
- There is no deception involved.
- Your answers are kept confidential.

Any questions before we begin?

Let's start!

Preliminary Interview

[Audio recorder on]


-Online experience-
1. How much time do you normally spend on the web in a given week (or within a day)?
2. How much of that is for work use, and how much is for personal use?
3. Other than email, is there any one thing you do the most online?
4. Do you ever use educational system websites? What do you usually visit such a website for? How often do you use it?
5. Do you ever do research online for school information?

-Offline Habits-
1. (For parents) What kind of information do you want to know about your child's education? Where do you usually get this information?
2. What kind of information do you want to know about the local schools or the educational system? How do you usually get it?

Evaluation Instructions

In a few minutes, I will ask you to begin your evaluation of the IPS website. However, before we get
started, let's talk a little about what will be happening.

Please try to feel as comfortable as possible while using the interface. We are studying how the website elicits your actions. There is nothing that you can do wrong as a user of this website. We want to observe your thoughts and reactions to the site's interface. With that, it is extremely helpful to us if you could narrate your experience while using the site. We would love for you to keep a running stream of consciousness while interacting with the site. If you don't like something, let us know. If you really, really like something, let us know. If anything stands out, doesn't stand out enough, or if you have an idea for something that should be there but isn't, your suggestions will help future users' experiences.

So, here is just a quick summary of what we discussed. We are testing the site, not you. Make sure to let us know what you are thinking as you are thinking it. Finally, and most of all, be comfortable. If there is anything we can do throughout the process to make your experience as natural as it would be at home, let us know.

Do you have any questions on the process? Does it all make good sense?

Great! Let's go to our Internet browser and go to www.ips.k12.in.us. As the site loads, please feel free
to move the chair, mouse, monitor, and keyboard to a comfortable position.

Tasks

1. As a parent, you are growing concerned about the direction the IPS school board is moving in concerning year-round school. Another parent mentioned that there are school board meetings where you can go and voice your concerns. Assume today is December 1st and you want to go to the next briefing session. Please identify the date of this meeting.

2. As a parent, you feel that your child's ability to learn and use technology is very important. In the midst of growing concerns about budget cuts, you are interested in knowing how much money IPS has allocated to implementing new technology. Find the budgeted amount IPS has allocated in its Technology Plan.

3. As the parent of a high school junior, you have heard from another parent that IPS has resources for you and your child to start exploring colleges. Find contact information for Sawyer College in Merrillville.

4. As a parent, you want to identify the schools nearest your home that your child would attend. Your house is located at 816 N. Audubon Road, Indianapolis, IN 46219. Please name the closest high school to your house and find the enrollment information for that school.

5. You noticed that a school located near your house is a magnet school. Please locate the 2011-2012 English student application for this magnet school.

If you are a teacher, please complete the two additional tasks:

6. As a new teacher, you had an issue with your paycheck this week. You contacted the front office at your school; however, because they do not directly deal with payroll, they directed you to the corporation website to find the appropriate contact information and the location where you need to go to sort everything out. Identify the telephone number and location of the payroll office.

7. As a teacher, you want to review updates to the 2010 teacher salary information, specifically Amendment 5233. Locate information about Amendment 5233.

Evaluation Wrap-up

Wonderful. Now, if you would, please simply exit your Internet browser and we'll come together for a quick wrap-up session.

If you were to describe this site in one or two sentences to a colleague or friend of yours, how would you do so? What adjectives would you use?

Would you say, by the end of the process, that you were satisfied with the service provided by IPS?

When you first started the process, did you expect the site to contain what it did? More than it did?

Based on your experience with the website, can you give three general pros and three general cons of the site?

Is there anything you wish this site would provide to people in your shoes, with regard to their stake in the system (IPS)?

Thank you so much for your time and participation. Often, after completing evaluations, people have great ideas once they are back home, browsing the Internet and using services like these in their own environment. If you think of anything you could add to this evaluation today, tomorrow, or even as far out as next week, please don't hesitate to contact us at [email_address].

Thanks again and have a wonderful day!

Appendix C: Post-Task Questionnaire

IPS Post-Task Questionnaire


Evaluator: ________________ Participant ID: ________________ Task ID: ___________

I was satisfied with the ease of completing this task.

Strongly Disagree  [ ]  [ ]  [ ]  [ ]  [ ]  Strongly Agree

Appendix D: Post-Test Questionnaire

IPS Post-Test Questionnaire


Evaluator: ___________________        Participant ID: ___________________

1. How would you explain the setup of the site, i.e., its architecture?

2. How would you explain how you are feeling after completing the tasks?

3. Did you feel like you were "in control" of navigating around the site?

4. What would you change about the site?

5. Could you understand where you were in relation to the site's structure?

6. Is there anything you'd like to add?

Appendix E: Usability Testing Videos

Please reference the attached CD File submitted with this project for a complete list of videos.
