Abstract
We carried out a series of experimental assessments to learn how to perform effective assessment of user-centred design (UCD) processes. Our starting point was ISO 15504-based process assessment and the ISO/TR 18529 UCD process model. During the experiments, the assessment style evolved towards a novel assessment approach. Specific characteristics of the approach are a revised UCD process model, three process performance dimensions, and the implementation of the assessment as a workshop. The approach was found to be efficient and to give concrete, understandable and valid knowledge about the status of UCD. It also proved to be an effective instrument for training and for increasing awareness of UCD.
Introduction
Usability is recognised as one of the most important quality characteristics of software-intensive systems and products. Usability gives many benefits that can include "increased productivity, enhanced quality of work, reductions in support and training costs and improved user satisfaction" [1]. The prevailing paradigm of developing usable products and systems is user-centred design (UCD): usable products are created through processes of user-centred design. There is a body of literature on UCD, e.g. books such as [2], [3], [4], [5], and the standard ISO 13407 [1]. All of them incorporate the same basic principles for developing usable products and systems, such as user involvement, iterative design and multi-disciplinary teamwork.
[Figure 1: understand and specify the context of use; specify the user and organisational requirements; produce design solutions; evaluate designs against requirements; iterate until the system meets the specified functional, user and organisational requirements.]
The processes can be briefly described as follows:
- Understand and specify the context of use. Know the user, the environment of use, and the tasks for which he or she uses the product.
- Specify the user and organisational requirements. Determine the success criteria of usability for the product in terms of user tasks, e.g. how quickly a typical user should be able to complete a task with the product. Determine the design guidelines and constraints.
- Produce design solutions. Incorporate HCI knowledge (of visual design, interaction design, usability) into design solutions.
- Evaluate designs against requirements. The designs are evaluated against user tasks.
Research problem
A typical approach to starting an organisational improvement effort in any domain is to carry out a current state analysis. Through current state analysis, one can identify the strengths and weaknesses of an organisation, and thus get a good basis for planning and implementing improvement actions. The recognised process improvement models of software engineering, for example IDEAL of the Software Engineering Institute [11] and ISO/TR 15504-7 [12], include 'current state analysis' as an essential step in an improvement process. In software engineering, current state analysis is a widely used practice in the form of process assessment. Well-known approaches for software process assessment are CMM [13], Bootstrap [14], and ISO 15504 [15], formerly known as SPICE.

A number of approaches have also been proposed for the current state analysis of UCD, or, as we call it, usability capability assessment (UCA) [16]. The first UCA approaches were introduced in the first half of the 1990s. A remarkable achievement in this area was the publication of ISO/TR 18529 [17] in 2000, which defines the UCD processes in a format that complies with the requirements of ISO 15504. The UCD substance of the ISO/TR 18529 model comes mainly from ISO 13407. In spite of the many approaches, usability capability assessment is not yet a widely performed activity. According to a study by Rosenbaum et al. [6], 16 organisations out of 134 (12%) reported using 'organisational audits' as a means for enhancing 'strategic usability'. In the same study, audits were ranked as one of the least effective approaches in promoting strategic usability.

Our research problem is to learn how to effectively perform usability capability assessment in industrial settings.
Motivation
Our research background is in usability capability assessments that were carried out using ISO 15504-style software process assessment at Nokia Mobile Phones, Teamware and Buscom. In these assessments, we used ISO/TR 18529 - or its predecessor, UMM Processes [18] - as the UCD reference process model. We gathered feedback on the assessments from the organisations and the assessment teams. The feedback indicated that there is room for improvement in the assessment approach. The following kinds of problems were identified:
- The staff found the concepts of ISO 15504-style assessment difficult to understand.
- The results were perceived as not concrete enough.
- There were disagreements among the assessors on the interpretation of the processes (especially some base practices).
- There were disagreements about the ratings of the base practices.
The second assessment had one significant new feature: it was decided to carry it out as a half-day workshop. The reason for this was practical: there was no time for an assessment with multiple interviews. We used the process model and performance dimensions developed in the previous assessment. In the interview part, we asked the customer to describe the development process of the user manual and quick reference guide. We did not describe our reference model beforehand but used it as our reference in the interview. In the results session, the findings were contrasted against the process model. The lead assessor explained the model step by step, and interpreted the information received in the interview session against the model. We agreed on the areas where the organisation had strengths and weaknesses. The session was finished in about 1.5 hours. Some refinements and clarifications were made to the process model during the discussion.

One piece of feedback came immediately as we finished the session: one of the participants said that she had already started to think about how to implement improvements in one specific area. Other participants reported: "The assessment pointed out many issues to consider in the following documentation projects" and "Now we know that task analysis is important. We need to work on usability requirements."

The assessment team found the assessment a positive experience. We succeeded in getting a credible picture of the development process in a short time, and we felt that there was a positive atmosphere in the workshop. The specific implication of this assessment for us was to try the workshop approach again. We found it efficient: one workshop instead of a series of interviews. Moreover, a workshop may be a solution to one problem that we have faced: people being in different positions in assessments (those who are interviewed, and those who are not).
Third Assessment

Before the third assessment, we identified the different organisational contexts where improvements in UCD may take place (Figure 2):
- Improvement of UCD at the strategic level means the creation of a UCD strategy, and keeping it up to date. A UCD strategy may include issues such as the vision of the organisation in terms of UCD (e.g. compared with competitors), definitions of UCD policy, and selection of pilot projects where concrete improvement actions could be carried out.
- A typical way of introducing new methods into an organisation is to experiment with them in pilot projects. The project manager and other key persons of the project should be willing to try user-centred methods. As a result of a pilot project, the organisation gains experience and knowledge about the usefulness of the new methods.
Figure 2. KESSU improvement model: UCD improvement actions may take place in different contexts in organisations
It was easy to find the appropriate improvement context for the third assessment: it was clearly a pilot project. In the third assessment, we analysed the status of UCD in a previous representative project that we call the baseline project. As in the previous assessment, we first had an interview session. After that, the assessment team had a wrap-up session, which went smoothly: the team felt that it was easy to agree on the findings. In the results session, the findings were contrasted with the process model. On the basis of the results of the assessment, the project team, together with the assessment team, planned the UCD activities and methods that would be reasonable to implement in the pilot project (Figure 3). This paper is written after the planning phase; the implementation of the plan is just about to start.
The assessment team felt that they had learnt as much (or even more) about the UCD of the project as we did in the earlier assessments with numerous interviews. In the end, we found it a very good sign that planning activities for the pilot project truly started.
Figure 3. A past (baseline) project is assessed to give a basis for planning UCD processes in a customer project.
The process model is illustrated in Figure 4. We identify six main processes: Identification of user groups process, Context of use process (multiple instances), User requirements process, User tasks design process, Produce user interaction solutions process, and Usability evaluation process.
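For illustration only (this sketch is our own, not an artefact of the KESSU project), the model's processes can be written down as a simple data structure; note that the Context of use process is instantiated once per identified user group, and the outcome lists are abbreviated and partly hypothetical:

```python
# A minimal sketch of the KESSU UCD process model as a data structure.
# Outcome lists are abbreviated and partly hypothetical.
KESSU_PROCESSES = {
    "Identification of user groups": ["Identification of user groups"],
    "Context of use": [  # one instance per identified user group
        "Description of the characteristics of users",
        "Description of the environment of use",
        "Description of user tasks",
        "Identification of user task attributes",
    ],
    "User requirements": ["Usability requirements", "UI design requirements"],
    "User tasks design": ["Designs of user tasks"],
    "Produce user interaction solutions": ["User interaction designs"],
    "Usability evaluation": ["Usability evaluation results"],
}

def expand(user_groups):
    """Instantiate the Context of use process once for each user group."""
    processes = []
    for name in KESSU_PROCESSES:
        if name == "Context of use":
            processes += [f"Context of use (user group: {g})" for g in user_groups]
        else:
            processes.append(name)
    return processes

# Hypothetical usage: a product with two user groups yields seven
# process instances (six processes, Context of use counted twice).
print(expand(["operators", "maintenance staff"]))
```

A structure like this makes the "multiple instances" rule of the Context of use process explicit, which is easy to miss in a flat process list.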
[Figure 4: Identification of user groups; Context of use (one instance per user group); User requirements; User tasks design; Produce user interaction solutions; Usability evaluation.]
The difference between the process definitions of the KESSU model and the ISO/TR 18529 model is illustrated in Table 1 (example: the Context of use process). When we contrast the process definitions, we find that the outcomes of the KESSU model include only concrete deliverables.

Table 1. Illustration of differences between outcomes and base practices. Example: Context of use process

KESSU UCD process model (outcomes):
- Identification of user groups
- Description of the characteristics of users
- Description of the environment of use
- Identification of user accomplishments
- Description of user tasks
- Identification of user task attributes

ISO/TR 18529 (base practices):
- Define the scope of the context of use for the product system
- Analyse the tasks and worksystem
- Describe the characteristics of the users
- Describe the cultural environment/organisational/management regime
- Describe the characteristics of any equipment external to the product system and the working environment
- Describe the location, workplace equipment and ambient conditions
- Analyse the implications of the context of use
- Present these issues to project stakeholders for use in the development or operation of the product system
Process Performance Dimensions
We identify three performance dimensions: quantity, quality, and integration, as illustrated in Figure 5.

[Figure 5: the quantity and quality of the output of the process under assessment, and the integration of that output with other processes.]
- The quantity of the outcomes of the process. The more extensively an outcome exists, the higher the performance score it gets.
- The quality of the outcomes. With this dimension, we examine the quality and validity of the outcomes. For example, we want to distinguish whether an outcome is based on someone's opinions or derived by using recognised user-centred methods and techniques.
- The integration of the outcomes with other processes. The more extensively the outcomes are communicated to and incorporated in other relevant processes, the higher the rating given to integration.
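As an illustrative sketch (not part of the original method), the three-dimensional rating of a single outcome could be modelled as follows. The mapping from achievement percentages to the N/P/L/F scale borrows the ISO 15504-style rating bands; all names and thresholds are assumptions of ours:

```python
from dataclasses import dataclass

def to_scale(percent: float) -> str:
    """Map an achievement percentage to the N/P/L/F scale.
    Bands follow ISO 15504-style conventions (an assumption here):
    N = none (0-15%), P = partially (>15-50%),
    L = largely (>50-85%), F = fully (>85-100%)."""
    if percent <= 15:
        return "N"
    if percent <= 50:
        return "P"
    if percent <= 85:
        return "L"
    return "F"

@dataclass
class OutcomeRating:
    """Rating of one process outcome on the three KESSU dimensions."""
    outcome: str
    quantity: float     # how extensively the outcome exists (%)
    quality: float      # validity of the methods behind it (%)
    integration: float  # how widely it feeds other processes (%)

    def summary(self) -> str:
        return (f"{self.outcome}: quantity {to_scale(self.quantity)}, "
                f"quality {to_scale(self.quality)}, "
                f"integration {to_scale(self.integration)}")

# Hypothetical example: a user-task description that exists fully,
# but is based on opinions and barely feeds other processes.
r = OutcomeRating("Description of user tasks", 90, 33, 10)
print(r.summary())  # -> Description of user tasks: quantity F, quality P, integration N
```

The point of the three separate numbers is exactly the distinction made above: an outcome can exist extensively (high quantity) yet still score low on quality or integration.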
Representation of results
In addition, the new process and performance models called for a different way of presenting the results. We use different symbols to denote the different dimensions (Figure 6).
[Figure 6: symbols for the dimensions quantity, quality and integration; each is rated on the scale N (none), P (partially), L (largely), F (fully).]

[Example results: ratings for the outcomes user characteristics, task characteristics, environment, user tasks, user goals, usability requirements and UI design requirements.]
Assessment method
An assessment is carried out as a workshop which all the members of the pilot project should attend. The customer of the assessment is the project manager (or equivalent). A past project (or a project that is close to its end) is selected as the baseline project for the assessment. The baseline project should be one that the key persons of the customer project regard as relevant and representative. In the ideal case, the baseline project was carried out with the same management and staff as the new one.
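Purely as a sketch, the workshop-based method described above and in the earlier assessment accounts can be summarised as an ordered checklist; the step names and their wording are ours, distilled from this paper rather than taken from a formal KESSU definition:

```python
# Steps of a KESSU-style workshop assessment, distilled from the
# assessment accounts in this paper; wording is our own.
ASSESSMENT_STEPS = (
    "Select a relevant, representative baseline project",
    "Interview session: project staff describe how the baseline project was run",
    "Wrap-up session: the assessment team agrees on the findings",
    "Results session: contrast the findings against the KESSU process model",
    "Planning: project and assessment teams select UCD activities for the pilot project",
)

def print_agenda(steps=ASSESSMENT_STEPS):
    """Print the workshop agenda as a numbered list."""
    for i, step in enumerate(steps, start=1):
        print(f"{i}. {step}")

print_agenda()
```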
processes. We examine not only the extent of the practices but also the quality of how they are carried out, and how the results are integrated into design decisions. Our experience of UCD project assessment shows that the new process and performance models seem to make the assessment and UCD more understandable to the audience, and the assessment easier for the assessors in our context. The workshop type of assessment makes an assessment efficient and spreads UCD knowledge to a larger part of the staff than interviews of a limited number of people.

We want to emphasise that our assessment approach is complementary to traditional process assessment. Compared with an ISO 15504 assessment, we can say that we examine the UCD processes thoroughly below capability level 1. Traditional process assessment should be used in contexts where it is applicable to examine the management of UCD processes at higher levels of capability.

Limitations

Our goal for the assessments was to give a good basis for improvement actions in UCD. From this viewpoint, issues such as spreading knowledge about UCD to the staff and getting them committed to UCD improvement actions are important. A different target for an assessment could be to get exact ratings of usability capability for the selection of contractors. This was not our goal. Another viewpoint that we excluded is standardisation, which has been one driver of the ISO/TR 18529 and HSL models.

Another limitation is related to the assessments with the ISO/TR 18529 model. The assessments were very much based on the interpretations, experience and style of the lead assessors, while the interpretation of the ISO/TR 18529 model is based on documentation. Some other person might have conducted the assessments in a different way. A specific feature of our assessments is that most organisations represented a geographically limited industrial setting (the Oulu region in Finland).
Implications for Next Assessments

We will continue with the KESSU process model and performance scale (probably with refinements) in our next assessments. We find that the outcome-driven process model makes discussions concrete; the three-dimensional performance scale identifies different kinds of problems in the effectiveness of UCD and makes it possible to have an appropriate level of abstraction in the assessment; and implementation as a workshop spreads knowledge of UCD to the staff and gets everybody involved.

We also find that each new assessment should be regarded as a research activity: a researcher should try to get access to the following assessments and gather feedback from them. There is definitely room for improvement both at the level of the models and at the level of the assessment method (the steps of an assessment).
Acknowledgements
A number of people have contributed to this work. I wish to thank the members of the KESSU project: Netta Iivari, Mikko Jämsä, Tonja Molin-Juustila, Tero Posio, Mikko Rajanen, Samuli Saukkonen, and Kari Kuutti. I also wish to express thanks to the personnel of the partner companies Buscom, Teamware, Nokia Networks, and Nokia Mobile Phones; and to the national funding organisation TEKES.
References
1. ISO 13407, Human-centred Design Processes for Interactive Systems. 1999, International Organization for Standardization: Genève, Switzerland.
2. Nielsen, J., Usability Engineering. 1993: Academic Press.
3. Hix, D. and H.R. Hartson, Developing User Interfaces: Ensuring Usability Through Product & Process. 1993.
4. Holtzblatt, K. and H. Beyer, Contextual Design: Defining Customer-Centered Systems. 1998, San Francisco: Morgan Kaufmann Publishers.
5. Mayhew, D.J., The Usability Engineering Lifecycle. 1999.
6. Rosenbaum, S., J.A. Rohn and J. Humburg, A Toolkit for Strategic Usability: Results from Workshops, Panels, and Surveys. In CHI 2000. 2000, The Hague: ACM.
7. Bloomer, S. and S. Wolf, Successful Strategies for Selling Usability into Organizations. In Companion Proceedings of CHI '99: Human Factors in Computing Systems. 1999, Pittsburgh, USA: ACM Press.
8. Rosenbaum, S., What Makes Strategic Usability Fail? Lessons Learned from the Field. In CHI '99. 1999, Pittsburgh, USA.
9. Anderson, R., Organisational Limits to HCI: Conversations with Don Norman and Janice Rohn. Interactions, 2000. 7(2): p. 36-60.
10. Bevan, N. and J. Earthy, Usability Process Improvement and Maturity Assessment. 2001 (in these proceedings).
11. McFeeley, B., IDEAL: A User's Guide for Software Process Improvement. 1996, Software Engineering Institute: Pittsburgh.
12. ISO/TR 15504-7, Software Process Assessment - Part 7: Guide for Use in Process Improvement. 1998, International Organization for Standardization: Genève, Switzerland.
13. Paulk, M.C., et al., The Capability Maturity Model: Guidelines for Improving the Software Process. 1995: Addison-Wesley.
14. Kuvaja, P., et al., Software Process Assessment and Improvement - The BOOTSTRAP Approach. 1994, Cambridge, MA: Blackwell Publishers.
15. ISO/TR 15504-2, Software Process Assessment - Part 2: A Reference Model for Processes and Process Capability. 1998, International Organization for Standardization: Genève, Switzerland.
16. Jokela, T., Usability Capability Models - Review and Analysis. In HCI 2000. 2000, Sunderland, UK.
17. ISO/TR 18529, Human-centred Lifecycle Process Descriptions. 2000, International Organization for Standardization: Genève, Switzerland.
18. Earthy, J., Usability Maturity Model: Processes. 1999, Lloyd's Register of Shipping.