
Instruction and Student Outcomes

Mathematics Dynamic Assessment


Informal Assessment That Responds to the Needs of Struggling Learners in Mathematics
David H. Allsopp Maggie M. Kyger LouAnn Lovin Helen Gerretson Katie L. Carson Sharon Ray
The MDA process provides teachers with important information about what students do and do not understand about foundational mathematics concepts, students' levels of understanding and abilities to express their understandings, and where students are in the learning sequence (frustration, instructional, mastery). The data collected through the MDA process provide teachers with an in-depth evaluation of their students' mathematical understandings and thinking that allows teachers to plan their instruction to address students' specific mathematical learning needs. Ms. Carlisi structures an MDA in the area of fractions with an emphasis on comparing fractions, an area in which her students demonstrate difficulty.

In today's high-stakes testing and school accountability culture, assessment that evaluates school and teacher effectiveness is a primary focus for policy makers. Unfortunately, policy makers do not emphasize with the same zeal the need for assessment that informs teachers about how to teach students. High-stakes testing that evaluates student outcomes for accountability purposes is primarily summative in nature. The results of such testing can help schools and teachers determine students' performance in general domains within the K-12 mathematics curriculum, but they are not designed to provide educators with the type of diagnostic information necessary to plan instruction for struggling learners. By their very nature, such assessments are not suited to individualization. Their adaptability for addressing both diverse curricula and student learning needs is limited.

On the other hand, formative assessments do provide teachers with information that can guide instructional decision making. An example of formative assessment that can inform instructional decisions is Curriculum-Based Measurement (CBM), an assessment process that is gaining greater attention in schools because of the response-to-intervention (RTI) requirement related to the Individuals With Disabilities Education Improvement Act of 2004 (Federal Register, 2006). CBM incorporates frequent short assessment probes that target concepts and skills for which students are receiving instruction. These data are typically displayed visually using line graphs. CBM data inform teachers about their students' progress toward specified learning goals. The data allow teachers to adjust instruction promptly according to what the data suggest in terms of students' learning. CBM is an effective informal assessment process for monitoring progress as instruction is implemented, an important activity for determining teaching effectiveness. However, it does not provide the types of diagnostic information needed to guide initial instructional planning for the essential core K-12 mathematical concepts.
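To make the progress-monitoring idea concrete, the sketch below (not part of the original article; the probe scores, goal, and labels are invented for illustration) shows how weekly CBM probe data might be charted as a line graph against an aimline so a teacher can see at a glance whether a student is on track.

```python
# Illustrative sketch: charting weekly CBM probe scores against an aimline.
# All data values here are hypothetical; any plotting tool would serve.
import matplotlib.pyplot as plt

weeks = list(range(1, 9))                  # eight weekly probes
scores = [4, 5, 5, 7, 6, 8, 9, 9]          # digits correct per probe (hypothetical)
baseline, goal = 4, 12                     # starting level and end-of-period goal

# Aimline: a straight line from the baseline to the goal across the period
aimline = [baseline + (goal - baseline) * (w - 1) / (len(weeks) - 1) for w in weeks]

plt.plot(weeks, scores, marker="o", label="Probe score")
plt.plot(weeks, aimline, linestyle="--", label="Aimline (goal trajectory)")
plt.xlabel("Week")
plt.ylabel("Digits correct")
plt.title("CBM progress monitoring (hypothetical data)")
plt.legend()
plt.show()
```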

Ms. Carlisi teaches middle school mathematics for students who were retained at least 1 year and who were identified as at risk of failure in mathematics. About half of her class of 14 students has identified learning disabilities and/or emotional and behavioral difficulties. Ms. Carlisi worries constantly that she is not going to be able to help them "catch up" by the time they take the mandated state high-stakes standardized assessment. Ms. Carlisi knows her students need to build proficiency in fractions. Her students continue to demonstrate difficulties despite previous instruction. She decides to use Mathematics Dynamic Assessment (MDA) to help her students. MDA is an informal mathematics assessment process that integrates four research-supported assessment practices for struggling learners:
- Assessment of students' interests and experiences.
- Concrete-representational-abstract assessment within authentic contexts.
- Error pattern analyses.
- Flexible interviews.


Mathematics Dynamic Assessment (Allsopp, Kyger, & Lovin, 2007) is a promising informal assessment process for mathematics that has potential for complementing practices such as CBM. This assessment process can provide teachers in-depth insight into the mathematical understandings and thinking of struggling learners as they relate to foundational focal points or big ideas in the K-12 mathematics curriculum. The MDA process provides teachers with important information about what students understand and don't understand; about students' levels of understanding (concrete, representational, abstract; CRA); about students' abilities to express their understandings; and insight into the possible reasons for misunderstandings. Implemented periodically in the course of a year when key mathematics focal points/big ideas are addressed, the MDA process can help teachers pinpoint their students' instructional needs, so that they can plan instruction that meets those needs as they teach the particular concepts/skills related to the key mathematics focal point/big idea. Then other formative assessment processes such as CBM can be used to help teachers evaluate the effectiveness of their instruction and make changes as needed.

MDA: Integration of Four Effective Assessment Practices

The MDA process is a flexible informal assessment process that teachers can implement within their own curriculum, not a prepackaged test. We developed and field-tested this process to evaluate its potential value for providing teachers a practical way to obtain data that provide insight into struggling learners' mathematical understandings using research-supported assessment practices. The MDA process integrates the four effective assessment practices in mathematics previously listed. The purpose of an MDA is to provide teachers with specific information about diverse students' mathematical understandings related to foundational curriculum focal points/big ideas across the mathematics curriculum. Teachers can decide which concepts to emphasize based on their individual contexts.

Some teachers might find it useful to focus on broader areas of the mathematics curriculum, although others may wish to focus on more specific areas. Teachers will probably make this decision based on the specific curriculum they use, their school's curriculum pacing guides, the particular needs of their students, and time available to develop and implement an MDA, as well as other factors. The National Council of Teachers of Mathematics (NCTM) identifies three curriculum focal points for each grade in the PK-8 mathematics curriculum (NCTM, 2006). These curriculum focal points can serve as an excellent guide for teachers as they plan where in their curricula to implement an MDA. Figure 1 shows the mathematics curriculum focal points for Grades 1 through 8.

Through the integration of the following four research-supported practices, the teacher in the field test used the MDA process to build a dynamic picture of her students' understandings in the area of fractions. By implementing the MDA process she was able to (a) identify her students' level of CRA understanding, (b) identify her students' abilities to express their understandings (receptive and expressive), (c) identify her students' misconceptions, and (d) gain insight into her students' mathematical thinking. The teacher used the information obtained from her MDA to develop an instructional hypothesis that guided her instructional planning to address her students' learning needs about fractions. The teacher's instructional hypothesis provided her with a concise statement about her students' understandings that contained what her students understood and what they didn't understand, including insight into potential reasons for what her students did and did not comprehend about fractions.

Evaluating Student Interests and Experiences

Students benefit from instruction and assessment that occurs within authentic contexts (e.g., Bottge, 1999; Bottge, Heinrichs, Chan, Mehta, & Watson, 2003; Bottge, Heinrichs, Mehta, & Hung, 2002; Gersten, 1998; Schumm et al., 1995; Wehmeyer, Palmer, & Agran, 1998).

Placing assessment within contexts meaningful to students is the foundation for an MDA. Teachers may be more familiar with integrating authentic contexts in their "teaching," but we believe that such contexts are relevant to the informal assessment process as well. Situating assessment within authentic contexts provides students with a familiar framework within which to demonstrate what they know. Students who struggle to learn may be less anxious about doing mathematics and may be more motivated to perform when they are doing it within a familiar context. In addition, students may be more likely to retrieve previous knowledge from memory when it is associated with interests and experiences that have meaning. Subsequent instruction can then incorporate similar contexts, thereby supporting students' transition between assessment and instructional activities.

Teachers can evaluate students' interests for the purpose of developing authentic contexts for assessment by asking students to identify activities and experiences of interest, recording their responses, correlating their interests with target mathematical concepts/skills, and then creating relevant problem situations that relate directly to students' interests. This process can be completed in a variety of ways, from having students write letters about what they like to do to using a more structured process such as the Mathematics Student Interest Inventory shown in Figure 2 and Figure 3. Figure 2 shows a structure for having students describe their interests and experiences across a range of situations. After evaluating students' responses, the teacher can use the structure illustrated in Figure 3 to integrate student interests into assessment and instruction.

Figure 1. NCTM Grades 1-8 Mathematics Curriculum Focal Points

Grade 1
- Number and Operations and Algebra: Developing understandings of addition and subtraction and strategies for basic addition facts and related subtraction facts
- Number and Operations: Developing an understanding of whole number relationships, including grouping in tens and ones
- Geometry: Composing and decomposing geometric shapes

Grade 2
- Number and Operations: Developing an understanding of the base-10 numeration system and place-value concepts
- Number and Operations and Algebra: Developing quick recall of addition facts and related subtraction facts and fluency with multidigit addition and subtraction
- Measurement: Developing an understanding of linear measurement and facility in measuring lengths

Grade 3
- Number and Operations and Algebra: Developing understandings of multiplication and division and strategies for basic multiplication facts and related division facts
- Number and Operations: Developing an understanding of fractions and fraction equivalence
- Geometry: Describing and analyzing properties of two-dimensional shapes

Grade 4
- Number and Operations and Algebra: Developing quick recall of multiplication facts and related division facts and fluency with whole number multiplication
- Number and Operations: Developing an understanding of decimals, including the connections between fractions and decimals
- Measurement: Developing an understanding of area and determining the areas of two-dimensional shapes

Grade 5
- Number and Operations and Algebra: Developing an understanding of and fluency with division of whole numbers
- Number and Operations: Developing an understanding of and fluency with addition and subtraction of fractions and decimals
- Geometry and Measurement and Algebra: Describing three-dimensional shapes and analyzing their properties, including volume and surface area

Grade 6
- Number and Operations: Developing an understanding of and fluency with multiplication and division of fractions and decimals
- Number and Operations: Connecting ratio and rate to multiplication and division
- Algebra: Writing, interpreting, and using mathematical expressions and equations

Grade 7
- Number and Operations and Algebra and Geometry: Developing an understanding of and applying proportionality, including similarity
- Measurement and Geometry and Algebra: Developing an understanding of and using formulas to determine surface areas and volumes of three-dimensional shapes
- Number and Operations and Algebra: Developing an understanding of operations on all rational numbers and solving linear equations

Grade 8
- Algebra: Analyzing and representing linear functions and solving linear equations and systems of linear equations
- Geometry and Measurement: Analyzing two- and three-dimensional space and figures by using distance and angle
- Data Analysis and Number and Operations and Algebra: Analyzing and summarizing data sets


Teachers can then determine which student interests might best apply to mathematical concepts/learning objectives that the teacher will cover during a given period. He or she can then develop a context for assessment and/or instruction using a given concept. A story problem and associated tasks that reflect the authentic context can be developed for assessment purposes.

CRA Assessment

Evaluating student mathematical understandings as well as teaching mathematics at the concrete, representational, and abstract levels are effective practices when teaching struggling learners (e.g., Kennedy & Tipps, 1998; Maccini & Gagnon, 2000; Mercer & Mercer, 2005; Miller, Butler, & Lee, 1998; Miller & Mercer, 1993; Van de Walle, 2006; Witzel, 2005). In an MDA assessment, a relevant story problem contains items to evaluate students' understandings of target concepts at three levels of understanding: (1) concrete (where students use materials to show their understanding of the concept), (2) representational (where students use drawings), and (3) abstract (where students use mathematical symbols without the use of concrete materials or drawings). Creating three assessment centers in the classroom to address the three levels of understanding is a convenient way to implement this component of the MDA process. Teachers should include an assortment of items in each of the assessment centers that evaluate students' abilities to demonstrate their understandings. Tasks should range from choosing the appropriate answer from among several choices (receptive/recognition level response) to solving problems (expressive level response). Varying the question type is important because, for example, teachers sometimes assume that students have no understanding of a concept or skill if they cannot solve an expressive level response item. However, students can often demonstrate some understanding when they see possible choices, and they must decide which choice is the best answer to the question or prompt.







Furthermore, it can be misleading to assume that if a student can successfully solve a receptive level response item, he or she has a complete understanding of the mathematical idea. Students often possess some understanding of a concept/skill but have not fully developed their understanding or become proficient with it. Finding the CRA level at which students understand a mathematical concept and the extent to which they can express that understanding (recognition vs. expressive) provides important information for the teacher. By examining student responses from a CRA assessment, a teacher can more precisely determine what a student understands and focus his or her instruction accordingly. For example, if a student demonstrates mastery at the concrete-expressive level (solving a problem using materials) but demonstrates frustration at the abstract-recognition level (selecting the best answer to a number statement), the teacher knows that instruction should build on the student's concrete understanding of the concept.

Error Pattern Analysis

Analyzing errors made by students provides teachers insight regarding their students' procedural and conceptual misunderstandings (Ginsburg, 1987; Howell, Fox, & Morehead, 1993; Mercer & Mercer, 2005; Woodward & Howard, 1994). The errors that students make can sometimes be even more informative to teachers than correct responses. Student errors often provide insight into students' misunderstandings about a particular mathematics concept or skill. When students make consistent errors, these "error patterns" highlight areas of students' misunderstandings. Equipped with a picture of what students misconceive, teachers are better able to provide instruction that supports students' development of appropriate mathematical understandings. For example, summing the individual digits of a multidigit addition problem instead of regrouping indicates that the student has little understanding of place value.

Figure 2. Mathematics Student Interest Inventory

Student Name: ______  Age/Grade Level: ______  Period/Class: ______
Columns: Things I Like to Do on My Own | Special Hobbies I Have | Things I Like to Learn About | Things I Like to Do With My Friends | Fun Things My Family Does

Students who use only the value of the numerator to determine the size of fractions are probably confused about the relationship of the numerator to the denominator or may not have the foundational understanding of what constitutes a fraction. Therefore, examining student "error patterns" can be a pivotal assessment practice, particularly for students who exhibit mathematical learning difficulties. When teachers understand why students are making mathematical errors, they are better able to direct their instruction toward developing conceptual understanding.

Flexible Interviewing

Error pattern analysis provides obvious places to probe a student's understanding. However, a correct answer does not necessarily mean the student used valid reasoning to get that answer. For example, a student might correctly respond that 2/3 is less than 3/4 when comparing fractions. However, when asked to explain his reasoning, the student states that because 2 is less than 3 in the numerators and 3 is less than 4 in the denominators, then 2/3 has to be less than 3/4. Hence, teachers need to ask students periodically to explain or justify correct answers to ensure that correct answers are not masking misconceptions.
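A worked comparison (added here for illustration; it is not part of the original article) shows why the student's answer is right even though the reasoning is unreliable. Rewriting both fractions over a common denominator,

$$\frac{2}{3} = \frac{8}{12}, \qquad \frac{3}{4} = \frac{9}{12}, \qquad \frac{8}{12} < \frac{9}{12} \;\Rightarrow\; \frac{2}{3} < \frac{3}{4}.$$

The rule of comparing numerators and denominators separately fails on other pairs. For instance, it would also claim that 2/3 is less than 3/5, yet

$$\frac{2}{3} = \frac{10}{15} > \frac{9}{15} = \frac{3}{5}.$$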

Students' mathematical thinking is not always obvious from their initial responses to mathematics tasks. Talking to students about their thinking processes when they problem solve is an excellent way to learn about their conceptual understandings (Bryant, 1996; Kennedy & Tipps, 1998; Liedtke, 1988; Zigmond, Vallecorsa, & Silverman, 1981). When interviewing students, it is important to be flexible, using a variety of interview methods. Flexible interviews can be accomplished in a variety of ways:
- Teachers can ask students to describe how they solved a previously completed problem.
- Teachers can ask students to "teach" the teacher by showing him or her how to solve a problem.
- Teachers can ask students to catch them doing something incorrect as the teacher completes a set of steps to a problem.
- Teachers can complete each step of a problem-solving procedure or strategy, asking the student to describe why the teacher might have done each step.

The intent of flexible interviews is to engage students in communicating their thinking about a particular mathematical concept, skill, or procedure. The specific approach should be based on the individual student's ability to communicate and the approach most likely to engage the student. Furthermore, the use of concrete materials or drawings to help students communicate their thinking can be effective, particularly when students possess expressive language difficulties. Flexible interviews are especially helpful when confirming interpretations regarding students' understandings that arise from an error pattern analysis.

Figure 3. Class Mathematics Student Interest Inventory


Period/Class: ______  School Year: ______
Columns: Interests | Relevant Mathematics Concepts/Skills I Teach That Match Interest | Ideas for Creating Authentic Contexts

Individual Interests/Activities (Columns 1-3 on Student Interest Inventory) 1. 2.


Peer Related Interests/Activities (Column 4 on Student Interest Inventory): 1. The local college football team | Fractions: Comparing fractions | Create story problems involving the team playing its big rival where they determine who gained the most yards in a play or quarter based on the fractional parts of the field (e.g., the home team gained yardage equivalent to 7/10 of the length of the 100-yard football field while the visiting team gained yardage equivalent to 3/5 of the length of the field).


2. Family Related Interests/Activities (Column 5 on Student Interest Inventory): 1. 2.






MDA Instructional Hypothesis


Teachers develop a premise about their students' mathematical understandings related to the target concept based on the results of these four assessment practices and articulate this in the form of an instructional hypothesis. The MDA instructional hypothesis is similar in purpose and structure to a hypothesis developed from a functional behavior assessment, where the teacher develops a rationale that suggests why a student may engage in a particular behavior that interferes with his or her success in school (Broussard & Northup, 1995; Larson & Maag, 1998; Sugai, Sprague, & Horner, 1999). In comparison, an instructional hypothesis for an MDA assessment includes four parts:
1. Mathematical ideas reflected in the target concept/task.
2. What students can do.
3. What students cannot do.
4. A rationale for students' current levels of understanding (CRA; receptive/expressive abilities; mathematical thinking).

The instructional hypothesis is based on data from the four assessments. By creating an instructional hypothesis, the teacher clearly articulates a description of the student's mathematics abilities, providing a foundation for developing an instructional plan that addresses the mathematical learning needs of the student. The structure for developing an instructional hypothesis is shown in Figure 4.

Figure 4. Structure for Developing an Instructional Hypothesis

Given: Teacher identifies the target mathematical ideas reflected in the task.
Students are able to: Teacher describes what students demonstrated they understand and are able to do.
Students are unable to: Teacher describes what students demonstrated they do not understand and are not able to do.
Because: Teacher writes a rationale to explain why students lack understanding of/ability to accomplish the target mathematics task.

MDA Process in Action

When the teacher integrates these practices within a single assessment process, he or she can obtain a deep level of information about students' mathematical understandings, resulting in a dynamic picture of what students know and do not know and possible reasons for their understandings and misunderstandings. The following vignette, adapted from a real-life example of a mathematics dynamic assessment (MDA) in a middle school classroom, portrays the implementation of the MDA process in a single 40- to 45-minute instructional period. The vignette includes examples of the information that the teacher obtains about her students' understandings of comparing fractions. Video of the teacher implementing the MDA process, including her thoughts, can be found at the MathVIDS Web site, http://coe.jmu.edu/mathvids2.

Ms. Carlisi teaches middle school mathematics for students who were retained at least 1 year and who were identified as at risk of failure in mathematics. About half of her class of 14 students has identified learning disabilities and/or emotional/behavioral difficulties. Ms. Carlisi worries constantly that she is not going to be able to help them "catch up" by the time they take the mandated state high-stakes standardized assessment. In addition to the students' diverse learning needs because of disabilities or other learning factors, Ms. Carlisi's class is diverse in other respects. Numbers of students who are African American, Caucasian, and Hispanic are about equal; additionally, five students are English-language learners. Ms. Carlisi knows her students need to build proficiency in fractions. Her students continue to demonstrate difficulties despite previous instruction. Last year's state assessment scores show that her students are deficient in mathematics overall and specifically in the area of fractions. Ms. Carlisi knows from her years of teaching struggling learners that students benefit from a concrete-to-representational-to-abstract sequence of instruction and practice and that they often have gaps in their knowledge and understanding of mathematical concepts. Moreover, she understands that she cannot teach her class as if the students were a homogeneous group. To meet her students' individual needs, Ms. Carlisi knows that she needs more information about their individual understandings of fractions. She decides to structure a mathematics dynamic assessment in the area of fractions with an emphasis on comparing fractions, an area with which her students demonstrate difficulty.

Authentic Assessment Context

Several years ago, Ms. Carlisi began asking her students at the beginning of each year to write a letter to her about their interests, their experiences outside of school, and their positive and negative experiences related to learning mathematics. From these letters, she creates a list that summarizes their collective interests and experiences. From time to time she reviews the list so that she can incorporate these relevant contexts in her teaching. Ms. Carlisi finds that her students learn more effectively when they learn mathematics in meaningful contexts. It occurred to Ms. Carlisi that perhaps the same thing might be true for assessment. She had never tried it before, but decided that it would be worth a try. She reviewed the interest list she had generated from her current students' letters and began thinking about a context that would match an interest of her students and that related well to fractions. She and her students lived in a college town, and there was great interest among her students in the local university's football team. She thought about how a football field is divided into equal parts, ten 10-yard sections, similar to the fraction bars she had often used in the past to teach fractions. She also knew that statistics are kept for each team based on how much yardage each team gains on offense for each of the four quarters of a football game. Ms. Carlisi thought that a logical application of her assessment objective would be to have students compare the fractional parts of a football field that each team gained during different quarters of a game. Ms. Carlisi created the following simple story context to serve as the foundation to her MDA:

During the second half of the football game on Saturday, the Wildcats began to move the ball both on the ground and in the air. In the fourth quarter, the Wildcats gained 5/8 of the football field and the Knights gained 3/4 of the football field. The television announcer said that the Wildcats really out-gained the Knights during the quarter.
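A quick check of the announcer's claim (a worked example added here for illustration; the article leaves the computation to the students) uses a common denominator of eighths:

$$\frac{3}{4} = \frac{6}{8} > \frac{5}{8},$$

so the Knights, not the Wildcats, covered the larger fraction of the field in the fourth quarter.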
CRA Assessment

Ms. Carlisi remembered that students often understand mathematical concepts at different levels: concrete, representational, and abstract. She also knew from experience that students do not always possess understanding of concepts at all three levels. These gaps in their level of mathematical understanding often resulted in varying degrees to which students understood and were able to apply particular mathematical concepts at the abstract level. Ms. Carlisi knew that many of her past students could perform an operation or procedure at the abstract level but never truly understood the mathematical concept underlying the procedure. Likewise, she knew of students who could not apply a mathematical concept abstractly, using only symbolic notation, but who could demonstrate conceptual understanding by using concrete materials or drawings. Ms. Carlisi appreciated the importance of knowing at what levels of understanding students could and could not demonstrate proficiency to help her determine where to place her instructional focus. Ms. Carlisi decided to create three assessment centers in her classroom to evaluate her students' levels of understanding in comparing fractions: a numbers and symbols center (abstract), a drawings center (representational), and a materials center (concrete). At each center, Ms. Carlisi created simple assessment sheets that prompted students to compare fractions at the appropriate level of understanding. The prompts for each center's assessment sheet came directly from the main assessment story context about the football game.

Figure 5. CRA Assessment Response Sheet Example: Concrete Expressive Level

Name: ______  Concrete Center Response Sheet
1) Below each item, use your fraction bars to show the first fraction and then show a fraction that makes each statement true. You can use any of the fractional parts listed in the parentheses for each item. Take a picture of your answer sheet when you've shown all your answers.


1a) (Use halves, thirds, sixths, eighths, tenths, or twelfths)  Wildcats 3/4  is less than  ______
[Image: example of student response, Knights 7/8 shown with fraction bars]

Fraction bars associated with numbers and symbols were incorporated into the materials center, drawings of fraction bars associated with numbers and symbols were incorporated into the drawing center, and numbers and symbols only were incorporated into the numbers and symbols center. Another important feature that Ms. Carlisi included in her assessment was to provide students with two types of prompts at each center.

Receptive/recognition prompts required students to choose the correct solution from among several choices. For example, at the materials center, Ms. Carlisi placed several different fraction bars next to the recognition prompts. Students would see a fraction bar representing 2/10 with the prompt asking them to choose an equivalent fraction bar. Below the prompt were three other fraction bars, each representing different fractions. Students used string to circle the one that was equivalent to the 2/10 fraction bar shown in the prompt. Expressive prompts required students to use fraction bars to create a fraction equivalent to the 2/10 fraction bar without providing choices. Students responded by taking a "whole" piece, attaching the appropriate fractional pieces to the bar, and then placing the fraction bar next to the prompt.
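As an aside (not from the original article), the arithmetic behind these prompts is simple fraction equivalence: multiplying or dividing the numerator and denominator by the same nonzero number names the same quantity,

$$\frac{2}{10} = \frac{2 \div 2}{10 \div 2} = \frac{1}{5} = \frac{1 \times 2}{5 \times 2} = \frac{2}{10},$$

so a student could satisfy the 2/10 prompt with, for example, a single 1/5 piece or two 1/10 pieces, provided the pieces come from same-sized wholes.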

Figure 6. CRA Assessment Response Sheet Example: Representational Receptive Level

Name: ______  Representational Center Response Sheet
1) For each item below you are shown a comparison of two fractions. A drawing of the first fraction is provided. Below this fraction are several other drawings. Circle the drawing that best illustrates the comparison.

1a) Wildcats 1/4  is greater than  Knights 1/8
[Image: drawings of the comparison; example of student response circled]
Ms. Carlisi placed a digital camera at the center so students could capture their responses. After responding to each prompt, students placed a name card next to their responses and took a digital picture. She reviewed the pictures to evaluate the responses. By including both recognition and expressive response items in her assessment, Ms. Carlisi provided her students an additional way to demonstrate their understanding. She knew from experience that students might actually have some understanding of a concept or skill even if they could not express their understanding without choices. She realized that students with cognitive processing difficulties and expressive language difficulties might actually understand more about a mathematical concept than they could express. Recognition prompts provide these students a way to demonstrate understanding. Figures 5 through 7 show examples of assessment items Ms. Carlisi used for concrete, representational, and abstract levels of understanding, including recognition and expressive response modes.

Ms. Carlisi knew that it was important for student transitions to be smooth and without distraction. She provided explicit directions and modeled how students would move from one center to another, including how they should use the materials at each center to respond to the assessment prompts. Ms. Carlisi wanted students to move through the centers sequentially (numbers-and-symbols center to drawings center to materials center). She decided to have students move in this sequence so that students' work with concrete materials and drawings would not bias their performance at the numbers and symbols center.


By starting at the abstract level center (numbers and symbols), students would not receive cues from prior work at the representational (drawings) and concrete (materials) centers. Students were organized into four groups. The first group started at the numbers and symbols center while the remaining groups worked independently at their desks on a maintenance activity involving a different mathematics concept/skill with which they were already proficient. Groups were provided 8 minutes to complete the assessment prompts at each center. After 8 minutes, the first group was directed to move to the drawings center, and the second group started at the numbers and symbols center. The teacher used a digital timer that projected on a dry-erase board the time left for each rotation. This process continued until all four groups completed each center. As students worked at the centers, Ms. Carlisi monitored, taking notes about common error patterns made by students and periodically interviewing students about their responses.

Ms. Carlisi's CRA assessment revealed important information about her students' understandings of fractions and comparing fractions. She learned that all of her students had some level of understanding. She also learned that there was much variability across the class. As she reviewed the results, she developed a table to help her evaluate the class as a group and to determine at what levels of understanding individual students were functioning. Table 1 shows the results of Ms. Carlisi's CRA assessment. Ms. Carlisi's CRA assessment results table presents her students' knowledge at each level of understanding and their ability to demonstrate this understanding through receptive/recognition and expressive response modalities. In order to do this she had students respond to 5 prompts at each level of understanding (CRA) and for each expressive modality (receptive/recognition and expressive). Therefore, students responded to 10 prompts at each center, 5 receptive/recognition prompts and 5 expressive prompts. Overall, students responded to 30 prompts.

Figure 7. CRA Assessment Response Sheet Example: Abstract Expressive Level

Name: ______  Abstract Center Response Sheet
1) For each written fraction, write a fraction in the space provided that makes each statement true. You can use any of the fractional parts listed in the parentheses for each item.

1a) (Use halves, thirds, sixths, eighths, tenths, or twelfths)  Wildcats 1/4 < ______
[Example of student response: Knights 3/6]

Ms. Carlisi decided on this number of items to ensure that there were enough responses to get an accurate representation of students' abilities while also staying within a manageable timeframe. Additionally, Ms. Carlisi predetermined a set of criteria for evaluating whether her students demonstrated mastery (highly proficient with the skill and prerequisite skills), instructional (some knowledge of the skill and prerequisite skills), or frustration level (little or no knowledge of the skill and/or prerequisite skills) abilities for each level of understanding and expressive modality.

Her criterion for mastery level was 5 of 5 correct (100%); for instructional level, 3 or 4 of 5 correct (60%-80%); and for frustration level, fewer than 3 of 5 correct (<60%). Ms. Carlisi recorded an "M" in the table for students who were at mastery level, an "I" for students working at an instructional level, and an "F" if students were at a frustration level. This display allowed Ms. Carlisi to identify students who had similar levels of understanding about fractions and comparing fractions. Student names are color-coded to show those who demonstrated similar levels of understanding.

Table 1. CRA Assessment Results for Ms. Carlisi's Class

Name | Abstract Expressive | Abstract Receptive | Representational Expressive | Representational Receptive | Concrete Expressive | Concrete Receptive
SA   | F | I | I | M | I | M
ZD   | F | I | I | M | I | M
JD   | F | I | I | M | I | M
AD   | M | M | I | M | M | M
RF   | I | M | M | M | M | M
F.)  | M | M | I | M | M | M
RJ   | M | M | I | M | M | M
SK   | F | I | I | M | I | M
NM   | I | M | M | M | M | M
JM   | I | M | M | M | M | M
XM   | I | M | M | M | M | M
TR   | F | I | I | I | I | M
JT   | M | M | I | M | M | M
TW   | M | M | I | M | M | M

Note. Red = group with least understanding; Blue = group with medium level of understanding; Green = group with greatest level of understanding.

Figure 8. Jamaal's Drawing of "1/2 Is Equal to 2/4" Showing Lack of Equivalency
[Image: Jamaal's drawings of 1/2 and 2/4 in which the wholes are different sizes]

Ms. Carlisi would use these data and subsequent student groupings the next day as she planned for differentiated instruction.

Error Pattern Analysis and Flexible Interviews

As students completed the assessment centers, Ms. Carlisi monitored students by conducting "impromptu" error pattern analyses and flexible interviews with individual students. As she monitored, she noted students who demonstrated difficulties. For example, Jamaal was working at the drawing center, and Ms. Carlisi noticed that his drawings of fractions did not show equivalency. In other words, the area of the "whole" for each fraction he drew varied. Although he did understand that fractions represent parts of a whole, he did not understand that in order to compare parts for equivalency, the wholes must be equivalent in size.

Also, the shaded areas, representing the fractional part, were not equivalent. Figure 8 shows an example of Jamaal's drawings. Ms. Carlisi conducted a quick flexible interview with Jamaal to better understand his thinking about his comparisons. She asked Jamaal to describe why he thought the two fractions he had drawn were equivalent. His response was simply that he remembered that one-half and two-fourths were equal. She then asked Jamaal to examine the drawings again and tell her other reasons why the two fractions were equivalent. He explained that both drawings were rectangles so the wholes were the same. He also said that the shaded-in places represented the top number in the fraction. As each group moved through the drawings center, Ms. Carlisi observed that several students in each group made drawings of fractions that lacked equivalency.

Their responses were similar to Jamaal's when she prompted them to describe what their drawings represented. At the numbers and symbols center, Ms. Carlisi noticed that students successfully identified fractions as greater than, less than, or equal to when using like denominators and for common fractions with unlike denominators (e.g., 3/4, 1/2). However, they consistently used the numerator to determine greater than, less than, and equal to for uncommon fractions with unlike denominators. For greater than, they wrote or identified a fraction with the largest numerator, and for less than, they wrote or identified the fraction with the smallest numerator, regardless of the value of the denominator. Ms. Carlisi also conducted quick flexible interviews with several students at this center. Students consistently said that the reason a fraction was greater than or less than was because the "top" number was bigger or smaller. When two fractions with unlike denominators had the same numerator, students rationalized that this indicated that the fractions were equivalent. Figure 9 shows the conclusions reached by Ms. Carlisi about her students' understandings of comparing fractions.

Ms. Carlisi's Instructional Hypothesis

At the end of the day, Ms. Carlisi reviewed results (see Figure 9) of the MDA, which informed her in three specific ways about her students' understandings. First, the results of the CRA assessment showed that her students could be separated into three distinct groups based on their levels of understandings (CRA/expressive modalities). These groupings provided Ms. Carlisi a basis for how she would differentiate instruction during the following days. Second, Ms. Carlisi gained insight into her students' thinking about fractions and comparing fractions. By monitoring her students as they moved through the CRA assessment centers and conducting "at the moment" error pattern analyses and flexible interviews, she learned quite a lot about what her students understood about fractions and how they could apply their understandings to different levels of problem solving.

Figure 9. Conclusions Reached by Ms. Carlisi Based on Error Pattern Analyses and Flexible Interviews
- Difficulty representing fractions that are greater than, less than, equal to using unlike denominators (abstract & representational)
- Difficulty determining greater than, less than, equal to using symbols between fractions with unlike denominators (abstract & representational); have some ability to do this with fractions that have natural relationships - 2/4 & 1/2; 4/6 & 2/3 (abstract)
- Difficulty relating written fractions to drawings; "meaning" of what a fraction actually represents may be lacking
- Concept of "equivalent area" of whole to part when drawing not evident


Importantly, she learned about several critical misunderstandings and gaps in knowledge her students had about fractions and comparing fractions. Third, based on what Ms. Carlisi learned through the CRA assessment, her analyses of error patterns, and flexible interviews, she had the information she needed to develop an instructional hypothesis that would guide her instructional planning and teaching focus. Based on these data, Ms. Carlisi decided that overall her students could determine greater than, less than, and equal to for fractions with like denominators at all three levels of understanding. She also decided that students could not determine greater than, less than, and equal to for fractions with unlike denominators and that they had difficulty with this at all three levels of understanding. Based on her error pattern observations and flexible interviews of students at the centers, she determined that her students' difficulties overall stemmed from their lack of understanding of equivalence as it relates to fractions. Figure 10 shows Ms. Carlisi's instructional hypothesis based on this MDA.

Outcomes of the MDA: What Ms. Carlisi Learned About Her Students' Understandings

By completing an MDA, Ms. Carlisi equipped herself with the kind of information about her students' understandings that would help her to provide them instruction that would truly meet their individual learning needs about fractions and comparing fractions. She had confidence in her understanding of what her students did know and what they did not know about fractions and comparing fractions, and, most importantly, of their levels of understanding. Ms. Carlisi knew that this information would empower her to help her students successfully move from concrete to abstract level success, developing conceptual understanding along the way. Her newfound knowledge of students' conceptual misunderstandings would allow her to focus her instruction to address their conceptual gaps. The instructional hypothesis she developed provided her a concise yet comprehensive statement about what she learned through the MDA and provided her with a meaningful road map for targeting her instruction.

Figure 10. Ms. Carlisi's Instructional Hypothesis

Given: two fractions
Students are able to: determine >, <, = when fractions have like denominators at concrete, representational, and abstract levels.
Students are unable to: determine >, <, = when fractions have unlike denominators at concrete, representational, and abstract levels.
Because: they do not understand the relationship between the numerator and the denominator in a fraction. They also lack understanding that the "whole" of the fraction must be equivalent when comparing fractions.

By determining that almost all of her students demonstrated a lack of understanding of the importance of equivalency with respect to fractions, she decided that this would be a primary focus for whole-class instruction. Ms. Carlisi would begin by modeling the comparison of fractions using fraction bars of different sizes, emphasizing the extent to which the areas of the fractional parts and wholes are equivalent. She would then engage students in pointing out and describing why certain fractional representations are or are not equivalent based on whether the areas of the fractional parts and wholes are the same. For example, she might display two different-sized fraction bars, each showing 1/2. Students would determine whether they are actually equivalent by examining whether the fractional parts represent the same area. Then a student would physically compare the concrete materials by placing one on top of the other to verify the class's decision. Initially, Ms. Carlisi would accomplish this within the context of a football game similar to the context used for assessment. Then, she would incorporate other contexts related to her students' interests, including the use of different area model concrete materials (e.g., fraction circles). Concrete level instruction for the whole class was appropriate because, based on the results of the MDA, all students were at least at the instructional level using concrete materials. The three groups identified through the CRA assessment (see Table 1) provided her a foundation for thinking about how she could differentiate her instruction after whole-class instruction. The group with the lowest level of understanding would continue working with comparing fractions using concrete materials. This group would receive a high level of support from Ms. Carlisi. She would continue to model, prompt students, and provide corrective feedback with this group. Concrete/expressive instruction was appropriate for this group because they all demonstrated instructional level abilities for the concrete/expressive domain in the CRA assessment.

The group with a medium level of understanding would work independently in a peer-tutoring format where each peer group was provided with different fractional parts from different area models (e.g., larger and smaller circles, fraction bars, etc.). Students had to determine which fractional parts (with unlike denominators) were comparable in relation to the area of the corresponding "wholes" (e.g., two 1/4 pieces from a small circle are not equivalent to a 1/2 piece from a larger circle, although two 1/4 pieces and one 1/2 piece from the same size circles would be equivalent). Each peer would take turns selecting fractional parts that were equivalent, providing each other feedback about their responses. They would align their materials so that Ms. Carlisi could evaluate their decisions at the end of the period.
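A small worked example (added for illustration; the specific bar lengths are hypothetical, not from the article) shows why the size of the whole matters in this sorting task:

$$\tfrac{1}{2}\text{ of an 8-inch bar} = 4\text{ inches}, \qquad \tfrac{1}{2}\text{ of a 12-inch bar} = 6\text{ inches}.$$

Both pieces are labeled 1/2, but they do not cover the same area, so comparisons such as 1/2 versus 2/4 are meaningful only when both fractions refer to wholes of the same size.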


Although this group was at mastery for concrete and representational levels, Ms. Carlisi wanted to use their concrete abilities to emphasize the concept of equivalency. The group with the greatest level of understanding would work together in cooperative learning groups to develop differentiated ways to draw fractions that showed equivalency (e.g., circles, boxes, bars, etc.). Below each set of drawings, students would write a rationale for why the drawings demonstrated equivalency. They would give Ms. Carlisi their drawings at the end of the period for review. Because this group demonstrated mastery at the representational level, Ms. Carlisi wanted to engage them in using their drawing abilities to emphasize the concept of equivalency. Ms. Carlisi would periodically monitor the groups for which she was not providing direct support, providing feedback and positive reinforcement as appropriate.

Indeed, probably for the first time, Ms. Carlisi felt confident about teaching mathematics to her struggling students. Finally, she felt as if she would be able to address all of her students' learning needs. She had a clear framework for interpreting what her students knew and did not know about a key mathematics concept, and she had valuable information about the possible conceptual understandings her students needed to develop in order to be more successful. This information gave her confidence to make decisions about how to differentiate her instruction across the CRA levels. She began to think creatively about enhancing her students' understandings of fractions, within a context that would provide them a springboard to develop their mathematical understandings even further.

References
Allsopp, D. H., Kyger, M. M., & Lovin, L. A. (2007). Teaching mathematics meaningfully: Solutions for struggling learners. Baltimore: Brookes.
Bottge, B. A. (1999). Effects of contextualized math instruction on problem solving of average and below-average achieving students. The Journal of Special Education, 33, 81-92.

Bottge, B. A., Heinrichs, M., Chan, S., Mehta, Z. D., & Watson, E. (2003). Effects of video-based and applied problems on the procedural math skills of average- and low-achieving adolescents. Journal of Special Education Technology, 18(2), 5-22.
Bottge, B., Heinrichs, M., Mehta, Z., & Hung, Y. (2002). Weighing the benefits of anchored math instruction for students with disabilities in general education classes. The Journal of Special Education, 35, 186-200.
Broussard, C. D., & Northup, J. (1995). An approach to functional assessment and analysis of disruptive behavior in regular education classrooms. School Psychology Quarterly, 10(2), 151-164.
Bryant, B. R. (1996). Using alternative assessment techniques to plan and evaluate mathematics instruction. LD Forum, 21(2), 24-33.
Federal Register. (2006). Assistance to states for the education of children with disabilities and preschool grants for children with disabilities, 34 CFR Parts 300 and 301, RIN 1820-AB57, 71(156), 46540-46845.
Gersten, R. (1998). Recent advances in instructional research for students with learning disabilities: An overview. Learning Disabilities Research and Practice, 13, 162-170.
Ginsburg, H. P. (1987). How to assess number facts, calculation, and understanding. In D. D. Hammill (Ed.), Assessing the abilities and instructional needs of students (pp. 483-503). Austin, TX: PRO-ED.
Howell, K. W., Fox, S. L., & Morehead, M. K. (1993). Curriculum-based evaluation: Teaching and decision-making (2nd ed.). Pacific Grove, CA: Brooks/Cole.
Kennedy, L. M., & Tipps, S. (1998). Guiding children's learning of mathematics (7th ed.). Belmont, CA: Wadsworth.
Larson, P. J., & Maag, J. W. (1998). Applying functional assessment in general education classrooms: Issues and recommendations. Remedial and Special Education, 19, 338-349.
Liedtke, W. (1988, November). Diagnosis in mathematics: The advantages of an interview. Arithmetic Teacher, 181-184.
Maccini, P., & Gagnon, J. C. (2000). Best practices for teaching mathematics to secondary students with special needs. Focus on Exceptional Children, 32, 1-22.
Mercer, C. D., & Mercer, A. R. (2005). Teaching students with learning problems (7th ed.). Upper Saddle River, NJ: Prentice Hall.
Miller, S. P., Butler, F. M., & Lee, K. (1998). Validated practices for teaching mathematics to students with learning disabilities: A review of the literature. Focus on Exceptional Children, 31, 1-24.
Miller, S. P., & Mercer, C. D. (1993). Using data to learn about concrete-semiconcrete-abstract instruction for students with math disabilities. Learning Disabilities Research & Practice, 8, 89-96.

National Council of Teachers of Mathematics. (2006). Curriculum focal points for prekindergarten through grade 8 mathematics. Reston, VA: Author.
Schumm, J. S., Vaughn, S., Haager, D., McDowell, J., Rothlein, L., & Saumell, L. (1995). General education teacher planning: What can students with learning disabilities expect? Exceptional Children, 61, 335-352.
Sugai, G., Sprague, J. R., & Horner, R. H. (1999). Functional-assessment-based behavior support planning: Research to practice to research. Behavioral Disorders, 24(3), 253-257.
Van de Walle, J. A. (2006). Elementary school mathematics: Teaching developmentally (5th ed.). White Plains, NY: Longman.
Wehmeyer, M. L., Palmer, S., & Agran, M. (1998). Promoting causal agency: The self-determined learning model of instruction. Exceptional Children, 66, 439-453.
Witzel, B. S. (2005). Using CRA to teach algebra to students with math difficulties in inclusive settings. Learning Disabilities: A Contemporary Journal, 3(2), 49-60.
Woodward, J., & Howard, L. (1994). The misconceptions of youth: Errors and their mathematical meaning. Exceptional Children, 61, 126-136.
Zigmond, N., Vallecorsa, A., & Silverman, R. (1981). Assessment for instructional planning in special education. Upper Saddle River, NJ: Prentice Hall.

David H. Allsopp (CEC FL Federation), Associate Professor, Department of Special Education, University of South Florida, Tampa. Maggie M. Kyger (CEC VA Federation), Associate Professor, Department Head, Department of Exceptional Education; and LouAnn Lovin, Associate Professor, Department Head, Department of Middle, Secondary, & Math Education, James Madison University, Harrisonburg, Virginia. Helen Gerretson, Assistant Professor, Mathematics Education, Department of Secondary Education, University of South Florida, Tampa. Katie L. Carson, Math Specialist, Cornerstone Learning Community, Tallahassee, Florida. Sharon Ray, Doctoral Candidate, Department of Special Education, University of South Florida, Tampa.

Address correspondence to David H. Allsopp, Department of Special Education, University of South Florida, 4202 E. Fowler Ave., EDU 162, Tampa, FL 33620 (e-mail: dallsopp@tempest.coedu.usf.edu).

TEACHING Exceptional Children, Vol. 40, No. 3, pp. 6-16. Copyright 2008 CEC.
