Virtual Necessities: Assessing Online Course Design

Rosetta McKnight, Healthcare Multimedia Design, USA
E-mail: rkmcknight@comcast.net

Integrating technology into curricula is a time-consuming and complex process that requires innovative approaches to pedagogical practice (Rodriguez et al., 2001; Nakatani, Edwards, & Zhu, 2001). Despite the fact that distance learning has gained momentum and now accounts for a significant proportion of course offerings in higher education, limited guidance is available to faculty and collaborators who use the Web to enhance or deliver courses. A description of research findings highlights the magnitude and importance of this faculty support issue. A report from the Campus Computing Project noted that "As in the past five years, survey respondents across all sectors of higher education identify assisting faculty integrate technology into instruction as the single most important IT issue confronting their campuses" (Greene, 2001, p. 2, italics added). Educause also conducted member surveys for the years 2000, 2001, and 2002 to identify the most pressing concerns regarding information technology in higher education (Gandel & The Educause Current Issues Committee, 2000; Lembke, Rudy, & The Educause Current Issues Committee, 2001; Kobulnicky, Rudy, & The Educause Current Issues Committee, 2002). These survey reports noted that faculty development, support, and training was ranked as one of the top three issues in all three surveys (Greene; Gandel et al.; Lembke et al.; Kobulnicky et al.). These surveys included all personnel from several institutions of higher education.

Moskal and Dziuban (2001) reported the results of a survey that was specific to 38 faculty who taught online web-enhanced or web-based courses at a single university. Eighty-five percent of participants reported that teaching online requires additional investments of time. Faculty participants' advice to other faculty "included the importance of preparation (30%), technical support (16%), technology knowledge (16%), clearly defined course design (8%) and goals (8%), significant time demands, and commitment (8%)" (Moskal & Dziuban, p. 178). These survey responses affirm the findings from the Educause and Campus Computing Project surveys, which included a broader sample of higher education professionals, including administrators. Both faculty and administrators identify a need for support and preparation to implement online courses.

In response to this need, several organizations in higher education have published guidelines and benchmarks of quality for distance learning (American Council on Education, 2001; Higher Education Program and Policy Council of the American Federation of Teachers, 2000; Institute for Higher Education Policy [IHEP], 2000). These guidelines have focused on course development in relation to the organizational infrastructure. In a critique of these guidelines, the Pew Learning and Technology Program stated that "What is missing is a process of quality assurance aimed at the course level" (Twigg, 2001, p. 14). The Principles of Online Design and Online Design Checklist were developed to focus on the course level. Referring to the IHEP principles, Twigg noted that "These principles of good practice are basically process-oriented and resemble current accreditation practices. How do we know the institutions and organizations in fact apply them?" (Twigg, p. 9).
The development of principles aimed at the application of good practice in the design of online courses has been the focus of ongoing development at Florida Gulf Coast University (FGCU).

DEVELOPMENT OF THE PRINCIPLES OF ONLINE DESIGN

A multidisciplinary collaborative process was used to survey and articulate best practices in online course design at FGCU. Initially, a team composed of faculty, instructional designers, and media support professionals reviewed the literature to identify guiding principles and best practices for developing courses for the Web. The Principles of Online Design described benchmarks of quality for the development and continuous improvement of online instruction. The first iteration of the principles yielded a lengthy and linear document that was published on the Web with links to websites that served as examples of the principles. Although this document was thorough, usability was a concern. During 2000-2001, a new study group at FGCU determined that faculty might be more inclined to use the principles if they were presented in a more concise and interactive format. This study group concluded that the linear format of the principles did not foster user control and interactivity. Hence, a second iteration of the Principles of Online Design, shown in Figure 1, was developed.

Content was arranged into a two-column format with each principle in the left column and examples or links to resources in the right column. The original document used a numbered coding or indexing system to catalogue each principle. In the second iteration, this coding system was simplified. Internet links to examples within the principles were also repaired or removed in the second iteration.

[Figure 1. Two-Column Format of the Principles of Online Design]

Once the principles were streamlined, the study group turned its attention to assessing the implementation of the principles. In other words, what indicators would provide evidence that the principles were being used to design online courses? To answer this question, a most important determination of the second study group was that a checklist, designed to correspond with the principles, would expedite implementation of the principles. Hence, a primary goal of the checklist has been to help faculty and course developers review design components more quickly and efficiently. To facilitate usability, each indicator on the checklist was linked to the original text of the Principles of Online Design using the indexing or coding system. In this way, with the click of a mouse, faculty and online course designers would be able to quickly access the more expansive information contained within the Principles of Online Design. This article describes the general design and the process used to develop the Online Design Checklist.

DEVELOPMENT OF THE ONLINE DESIGN CHECKLIST

The checklist was created by carefully paring down the Principles of Online Design through numerous collaborative meetings that took place over a period of five months. The checklist was numbered to correspond with the revised indexing system of the Principles of Online Design. Each indexed principle was restated as a concretely identifiable indicator of the principle.
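The cross-reference just described is essentially a lookup from each brief checklist indicator to the fuller principle it restates. The sketch below is a hypothetical illustration of that index; in the actual FGCU materials the linkage was implemented as hyperlinks between the checklist and the principles web pages, not as code. The data structures and the function name principle_for are assumptions made for illustration only, while the indicator and principle text are taken from the example discussed next.

```python
# Hypothetical sketch of the checklist-to-principles cross-reference described
# above. The FGCU implementation used hyperlinked web pages; the structures and
# names here are illustrative only.

# Principle text quoted from McKnight (2001).
PRINCIPLES = {
    "1.1.4": (
        "Audience analysis should determine the learners' personal "
        "characteristics, intellectual skills, subject knowledge level, "
        "and the purpose of taking the course."
    ),
}

# Each checklist entry restates an indexed principle as a concrete indicator
# and records which principle it links back to.
CHECKLIST = [
    {
        "item": "1.14",
        "indicator": "Educational prerequisites are listed.",
        "principle": "1.1.4",
    },
]


def principle_for(entry: dict) -> str:
    """Return the full principle text that a checklist indicator points to."""
    return PRINCIPLES.get(entry["principle"], "No matching principle found.")


if __name__ == "__main__":
    for entry in CHECKLIST:
        print(f"{entry['item']}: {entry['indicator']}")
        print(f"  -> links to principle {entry['principle']}: {principle_for(entry)}")
```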
For example, as shown in Figure 2, checklist number 1.14 states "Educational prerequisites are listed." This links to principle number 1.1.4, which states that "Audience analysis should determine the learners' personal characteristics, intellectual skills, subject knowledge level, and the purpose of taking the course" (McKnight, 2001). The following goals were articulated for the Online Design Checklist:

+ to serve as a brief and efficient guide to help faculty incorporate pedagogically sound design principles into online courses under development;
+ to be used as a tool to evaluate courses for continuous improvement and redesign; and
+ to be used to identify exemplars of quality course design.

During summer 2001, the Online Design Checklist was field-tested with eight instructional technology students at FGCU. Students were asked to review an online course using the checklist. Although the checklist, the principles, and the course are available online, participants used a printed version of the principles and the checklist for the field test. This precluded the necessity of toggling between multiple windows to access the principles website, the checklist website, and the course website.

Participants were asked to complete a survey that ranked their level of agreement regarding the usability of the checklist. The following criteria were used:

+ ease of using the checklist,
+ absence of jargon and ease of understanding,
+ clarity of checklist indicators, and
+ correspondence between checklist indicators and the principles.

Participants were asked to write their comments and thoughts in the margins as they used the checklist. They were also asked to answer the following questions about the checklist:

+ What additional aspects of online design should be included in the checklist?
+ How can the checklist be improved?
+ As a course developer, would you use the design checklist? Why or why not?
+ Will you refer to the Principles of Online Design and the Online Design Checklist as you develop online courses in the future?

Based upon feedback from these student field tests, in October 2001, minor modifications were made to the design checklist and approved by the Department of Course and Faculty Development Advisory Committee. During this same month, the FGCU Faculty Senate approved and recommended use of the Online Design Checklist for peer evaluation of distance learning courses. In October 2001, the Online Design Checklist was presented during a half-day tutorial at the Association for the Advancement of Computing in Education's (AACE) WebNet 2001 Conference in Orlando, Florida. Using the same surveys, the checklist and principles were demonstrated to 11 participants. Since conference participants did not have access to a computer to review an online course, they observed a demonstration of the checklist. This group, however, was not able to complete the checklist because participants did not have individual access to an online course and because of time limitations. In January 2002, the Board of Trustees of FGCU approved a recommendation from the Faculty Senate that the Online Design Checklist be used for peer evaluation of distance learning courses.

PARTICIPANT FEEDBACK ABOUT THE ONLINE DESIGN CHECKLIST

In this study, both survey data and open-ended questions were used to field-test the checklist.
The use of "multiple outcome measures" has been encouraged and demonstrated to be an effective method of evaluation (Moskal & Dziuban, 2001). Seven students participated in a field test in summer 2001. All students were in the instructional technology curriculum at FGCU. Five students had taken at least 10 internet-based courses. All students had used computers for at least two years. Three were females and four were males. Student participants took between 50 and 80 minutes to complete a paper-and-pencil version of the checklist. All seven participants completed this part of the field test. Of those seven, five returned the Design Checklist Usability Feedback Survey. Table 1 summarizes these survey data.

All 11 participants in the conference tutorial field test completed the Design Checklist Usability Feedback Survey. This international group of participants included an online learning systems administrator, a teacher, a human resources consultant, a user support consultant, a professor, a project director, a technology administrator, a technology trainer, and an instructional designer. Two participants did not provide this information. Table 2 summarizes these survey data.

Neutral responses to item four in the conference field test may reflect the fact that participants did not actually use the checklist but observed a demonstration.

[Figure 2. Correspondence Between the Principles and the Checklist]
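Tables 1 and 2 are not reproduced above. Purely to illustrate the kind of summary they contain, the sketch below tallies agreement ratings against the four usability criteria named earlier. The criteria labels come from the article; the response values, the three-point scale, and the function name summarize are invented for illustration and do not reflect the study's actual data.

```python
from collections import Counter

# Hypothetical sketch of summarizing usability-survey responses per criterion,
# in the spirit of Tables 1 and 2. The criteria are quoted from the article;
# the response data below are invented purely for illustration.
sample_responses = {
    "ease of using the checklist":
        ["agree", "agree", "neutral", "agree", "agree"],
    "absence of jargon and ease of understanding":
        ["agree", "agree", "agree", "neutral", "agree"],
    "clarity of checklist indicators":
        ["agree", "neutral", "agree", "agree", "disagree"],
    "correspondence between checklist indicators and the principles":
        ["agree", "agree", "agree", "agree", "neutral"],
}


def summarize(responses: dict) -> dict:
    """Count how many participants chose each agreement level per criterion."""
    return {criterion: Counter(ratings) for criterion, ratings in responses.items()}


if __name__ == "__main__":
    for criterion, counts in summarize(sample_responses).items():
        print(f"{criterion}: {dict(counts)}")
```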
