The Year 2013: ARDDIE Is In, ADDIE Is Out

By Benjamin E. Ruark

A research phase could be the missing link for the design model of the future.

ARDDIE is the acronym for analysis, research, design, development, implementation, and evaluation, with "research" being the belatedly new kid on the instructional systems design block.

Why research? Because for chief learning officers (CLOs), accountability plays a major role, and while research can deliver it, technology cannot and never will. CLOs must lead our community of practice (CoP) in adopting a long-overdue evidence-based posture. Professional disciplines and organizations that have committed to this posture are said to have adopted evidence-based practice (EBP).

44 | T+D | July 2008

"ARDDIE" may sound odd, and it may take some getting used to. But what is equally odd, and what we shouldn't get used to, is a dependence on Kirkpatrick's Level 1 reaction measures (78 percent) to demonstrate accountability, versus Levels 3 (9 percent) and 4 (7 percent) to demonstrate behavior change and results, as conveyed by ASTD's 2002 State of the Industry Report. Research is indeed our inroad to greater accountability, but not without a serious shift in our commitment to change how we view and conduct our work. Our worldview needs to shift away from the artificial buoyancy afforded us by distance learning technology. Currently, our three-legged stool of questionably stable practice rests on thin support: the always controversial ADDIE framework; a smattering of adult learning principles; and, for validity, reliance on input and sign-off from subject matter experts.

To boost learning and development's (L&D's) credibility in the workplace, we've latched onto innovations from web authoring, communications, and media technologies supporting distance learning. Even though research shows that the application of these technologies is not necessarily superior to other delivery mechanisms, they are very much the business leader's pet solution to reducing training logistics costs. What has been lacking all the while is research.

By introducing research to a far greater extent into how we do our work each day, we fortify the L&D side of instructional systems design (internal focus). We also raise literature reviews to the level of common practice during curriculum development (external focus on content area), arming developers with the latest know-how and why-to for a given content area. Research is the missing theme that integrates our framework, our discipline, and the validity of our claims of effectiveness. So for every L&D project we take on, we undertake three gap analyses instead of one.

Two more gap analyses needed

The one gap analysis with which we're all presently familiar seeks to clarify a client-specific metric differentiating desired versus current performance levels. From this point on, when we initiate a new training project, two other gaps should get equal attention:

- identifying how the latest research findings on curriculum design, learning, and instructional delivery can best guide the project's approach
- researching an applicable content area for its causal and key influential factors.

You may think that instructional designers currently

do some degree of research during content analysis of any curriculum. However true that may be, a literature review is lengthier, more formal, and far more thorough. It won't make a scholar of the literature reviewer, but the depth and breadth of knowledge synthesized yield a distinct curriculum-design advantage over anything produced by one's instructional design contemporaries.
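To make the three gap analyses concrete, here is a minimal sketch in Python. Everything in it is a hypothetical illustration, not drawn from the article: the metric, the numbers, and the example findings and factors are invented.

```python
# A minimal sketch of the three gap analyses proposed above.
# All metrics, findings, and content-area factors are hypothetical examples.

from dataclasses import dataclass, field

@dataclass
class ProjectGapAnalysis:
    # Gap 1: the familiar client-specific performance gap
    metric: str
    current_level: float
    desired_level: float
    # Gap 2: research findings on curriculum design, learning, and delivery
    design_findings: list[str] = field(default_factory=list)
    # Gap 3: causal or key influential factors in the content area
    content_factors: list[str] = field(default_factory=list)

    def performance_gap(self) -> float:
        """Desired minus current performance on the client's metric."""
        return self.desired_level - self.current_level

project = ProjectGapAnalysis(
    metric="first-call resolution rate (%)",
    current_level=62.0,
    desired_level=85.0,
    design_findings=["spaced practice beats massed practice for retention"],
    content_factors=["call scripting quality", "product knowledge depth"],
)
print(project.performance_gap())  # → 23.0
```

The point of the sketch is simply that gaps 2 and 3 become first-class project artifacts alongside the familiar performance gap, rather than informal notes.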

ARDDIE
This notion of being more research-centric probably sounded academic and too abstract to business-minded mental health, social services, healthcare, community corrections, and similar human services operations officers, at least until their budgets became directly contingent on demonstrated accountability. Then EBP began to take on new meaning and, eventually, personal importance and acceptance. It then became a matter of marketing EBP internally across one's organization. EBP has recently been adopted

by L&D's cousin, higher education, and in a matter of time it is destined to arrive at our doorsteps, depending on how soon CLOs unite to make this happen. EBP first got its walking legs in medicine, led by a McMaster University group headed by David Sackett and Gordon Guyatt in 1992.

For our L&D community, EBP is about every entity conducting its day-to-day practice as evidence-based professionals. This means that they are doing so in accordance with the most current and credible research evidence available on effective instructional design, learning optimization, and instructional delivery systems. "Current" is a key word because only research has the power to continually update our knowledge about what works and to what degree. And credibility is important because research dispels unproven claims by self-anointed experts. It debunks myriad myths specific to any discipline; clears up a number of ongoing misunderstandings and false assumptions; and fills perpetual knowledge gaps existing at large within unchallenged curricula bereft of the latest knowledge science offers.

An even more compelling view of an EBP-centric L&D means that we actually begin to identify true causal agents of each behavior change we target. Failing that, we at least identify key factors that influence or mediate desired organizational change. Research will enable instructional systems to elevate behavior change and performance improvement to unprecedented levels of replicable effectiveness across training events, organizations, and industries. This is the vision and promise EBP offers.

As a paradigm shift, EBP represents an unshakable reliance on research to reduce the many forms of ineffectiveness inherent in any practice. EBP is a construct of interrelating, effectiveness-driven concepts first recognized in D.L. Sackett and W.M. Rosenberg's 1995 article, "The Need for Evidence-Based Medicine" (Journal of the Royal Society of Medicine, vol. 88). I have reframed some of these concepts to more aptly represent the practice of L&D.

Making it informative. Our L&D decisions should be based on the best quantitative and qualitative research available about learning styles, learner populations, and experimentally tested instructional efficacy. To rashly settle on presumed best practices as our mainstay is no longer acceptable. Instead, we set a goal of producing copious amounts of research informing us on instructional methods.

Using content areas. Identifying the best evidence calls for probative thinking long before a curriculum outline is attempted. More content area-based research needs to happen first. This will inform L&D practitioners what works best, how and when it works, how to configure it, and with whom it works, all based on currently available evidence.

Defining the principles. The conclusions we draw from critical appraisals of "what works" evidence are worthwhile only if we translate them into actions that maintain fidelity with research-prescribed methods and mechanisms. This should occur in measurable ways that affect our customers' ability to learn and perform better on the job. We will need to generate a set of EBP-supported principles to which we all adhere with fidelity.

Keeping it current. We will continuously identify research gaps, close them, and then question whether our underlying assumptions and practices have kept abreast of current findings. Areas for extensive research and tacit knowledge scrutiny include:

- content-area research (conducting lit reviews, synthesizing findings)
- dashboard measures for calculating ROI
- implementation of start-up training (qualitative and quantitative research)
- instructional design methods (by delivery system)
- instructional delivery methods
- instructional media and learning optimization
- learning, learning styles, and learner populations
- client expectations and relationship management
- skills transfer and transfer strategies or models
- testing validity, adherence to Bloom's taxonomy, and simulations
- training facilitation and coaching
- training needs (front-end) analysis
- workplace consulting, and coaching strategies and techniques.

Managing the information. We also practice effective knowledge management, affording L&D organizations the ability to locally store research-related information and more effectively share it with our CoP. At minimum, L&D entities would subscribe to relevant online databases



such as ERIC (Education Resources Information Center) and EBSCOhost, which offers numerous database search engines. For storing and referencing downloaded and scanned research articles from the server, L&D entities could install some form of reference management resource, such as EndNote. And, yes, this necessitates two new sets of skills for L&D practitioners to better manage their knowledge assets.

Using and conducting research

Speaking of new skill sets, before our CoP can walk the talk of EBP, we must first learn to crawl in that unfamiliar research patch. L&D management and practitioners must assume an additional role as researcher, as have their counterparts in other EBP-centric disciplines. In addition to acclimating to a whole new vernacular, each of us must upgrade ourselves to use research, as was first identified by Robert Bogue in a 2006 curriculum plan designed for the National Institute of Corrections. Bogue's F*F*A*T skill set for effective use of research, under the new ARDDIE framework, refers to finding, filtering, assessing, and translating research.

Find. Use subscription-based online search engines and databases with accuracy; optimally search keywords or phrases to reap accurate online search hits; and expertly access peer-reviewed journals and other salient periodicals.

Filter. Screen search hits for the most relevant articles on the content area investigated.

Assess. Assess each research article you've decided to use for its level of design quality. Also assess the extent to which a research study's design, limitations, and findings are acceptable to the curriculum purpose at hand (and how its overall implications will shape a curriculum's design).

Translate. This occurs at two levels: the ability to explain a research-based issue in nonintimidating terms to stakeholders, and the ability to synthesize a literature review's findings into a design that raises the IQ of any curriculum. (Think of a lit review as consulting external SMEs of perhaps national recognition.)

After getting up to speed with these skills, we'll need to be trained on how to conduct research, by

- specifying a research problem or opportunity
- framing it as a research question or subject
- deciding on the best research design
- carrying out the research
- analyzing research data
- interpreting research results or implications
- reporting research findings.

Call for EBP adoption

Since most of us are not hardwired with the researcher's gene, I believe that this paradigm shift's chances of happening are best increased through a top-down approach. I therefore call upon CLOs to organize and set these wheels in motion.

Form a research consortium. This should be independent of any one L&D society, and it would lead our discipline's adherence to principles of EBP. Serving as a recognized national or international institute or clearinghouse, and staffed with EBP-fluent researchers, this entity is the one we would rely on for advice and research references; interpretation of EBP principles applied to everyday L&D practice scenarios; research-question formulation, consultation, and dissemination; and diffusion of cutting-edge EBP innovations.

Institute research criteria. The L&D community would not only formulate its own definition and principles of EBP, but also institute its own research-rigor gradient or pyramid, using explicit criteria that clearly define escalating levels of research rigor. The higher a research study's rigor level, the more credible, reliable, and generalizable its findings on effectiveness will be.

Use incentives. Create an incentive system that will encourage and reinforce research replications in support



of effectiveness studies performed in-house by L&D practitioners.

Implement practice-based research. Similarly, convert a wide collection of anecdotal evidence, assertions about what works, and best-practices claims into practice-based research. This ensures that tacit knowledge with great potential for broad application becomes more research-based. Practice-based research applies objective controls and methods to the collection and analysis of data describing, for example, a training method, learning enhancer, or course design feature.

Collaborate. We collaborate in our gathering of current research-supported phenomena toward an ultimate goal of identifying opportunities for metasynthesis and integration. From such attempts, we begin to build large-scale model change programs. Thus, you begin to see how EBP would function at many organizational levels across the L&D discipline's many service fronts, collaboratively drawing upon industry and organizational stakeholders to gain potency and collective thrust at critical mass.

The future

In the short term, EBP will silence all Monday-morning-quarterback calls as to the effectiveness of L&D's many diverse solutions. For the long term, EBP requires conspicuously stepped-up collaboration between university and corporate entities as sponsors of far more specialized studies known as randomized controlled trials. This would be in addition to more correlation studies, including multisite study replications to prove or disprove the significance of findings on effectiveness.

EBP is the only route open to real accountability for any discipline. Placing the science of L&D on equal footing with the technological advances achieved thus far is long overdue, but only if CLOs unite and take EBP to critical mass. How about a realistic five-year goal of 2013? T+D
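The F*F*A*T workflow described in this article reads naturally as a pipeline over search results. The following is a toy sketch under invented assumptions: the article records, the 1-to-5 rigor scale, and the rigor threshold are all hypothetical, chosen only to illustrate the find, filter, assess, and translate steps.

```python
# Toy sketch of the find -> filter -> assess -> translate (F*F*A*T) workflow.
# All article data and the rigor scale are hypothetical illustrations.

def find(database: list[dict], keywords: set[str]) -> list[dict]:
    """Find: keyword search over a subscribed database's records."""
    return [a for a in database if keywords & set(a["keywords"])]

def filter_relevant(hits: list[dict], content_area: str) -> list[dict]:
    """Filter: keep only hits on the content area under investigation."""
    return [a for a in hits if a["content_area"] == content_area]

def assess(articles: list[dict], min_rigor: int) -> list[dict]:
    """Assess: keep studies whose design quality meets the rigor bar."""
    return [a for a in articles if a["rigor"] >= min_rigor]

def translate(articles: list[dict]) -> str:
    """Translate: summarize the surviving evidence for stakeholders."""
    titles = "; ".join(a["title"] for a in articles)
    return f"supporting studies: {len(articles)} ({titles})"

database = [
    {"title": "Spaced practice RCT", "keywords": ["retention"],
     "content_area": "sales training", "rigor": 5},
    {"title": "Anecdotal case report", "keywords": ["retention"],
     "content_area": "sales training", "rigor": 1},
    {"title": "Media comparison survey", "keywords": ["media"],
     "content_area": "onboarding", "rigor": 3},
]

hits = find(database, {"retention"})
evidence = assess(filter_relevant(hits, "sales training"), min_rigor=3)
print(translate(evidence))
```

Note how the assess step encodes the research-rigor gradient the article calls for: raising `min_rigor` shrinks the evidence base to only the most credible designs.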

Benjamin E. Ruark is a senior instructional and performance transfer designer; benruark@yahoo.com.

WHAT DO YOU THINK? T+D welcomes your comments. If you would like to respond to this article, or any article that appears in T+D, please send your feedback to mailbox@astd.org. Responses sent to the mailbox are considered available for publication and may be edited for length and clarity.

COPYRIGHT INFORMATION

TITLE: The Year 2013: ARDDIE Is In, ADDIE Is Out
SOURCE: T+D 62, no. 7 (July 2008)

The magazine publisher is the copyright holder of this article and it is reproduced with permission. Further reproduction of this article in violation of the copyright is prohibited.
