Chemistry Education Research and Practice

PAPER

Guiding teaching with assessments: high school chemistry teachers' use of data-driven inquiry

Jordan Harshman and Ellen Yezierski*

Department of Chemistry and Biochemistry, Miami University, Hughes Hall, 501 East High Street, Oxford, OH 45056, USA. E-mail: yeziere@miamioh.edu

Cite this: Chem. Educ. Res. Pract., 2015, 16, 93
Received 8th September 2014, Accepted 21st October 2014
DOI: 10.1039/c4rp00188e
www.rsc.org/cerp
Electronic supplementary information (ESI) available. See DOI: 10.1039/c4rp00188e

Data-driven inquiry (DDI) is the process by which teachers design and implement classroom assessments and use student results to inform/adjust teaching. Although much has been written about DDI, few details exist about how teachers conduct this process on a day-to-day basis in any specific subject area, let alone chemistry. Nineteen high school chemistry teachers participated in semi-structured interviews regarding how they used assessments to inform teaching. Results are reported for each step in the DDI process. Goals: there was a lack of specificity in learning and instructional goals, and the goals stated were not conducive to informing instruction. Evidence: at least once, all teachers determined student learning based solely on scores and/or subscores, suggesting an over-reliance on these measures. Conclusions: many teachers claimed that students either did or did not understand a topic, but the teachers did not adequately describe what this means. Actions: very few teachers listed a specific action as a result of analyzing student data, while the majority gave multiple broad pedagogical strategies to address content deficiencies. The results of this study show the limitations of teachers' DDI practices and inform how professional development on DDI should have a finer grain size and be based in disciplinary content.
Introduction

High school chemistry teachers have daily access to assessment data from their students. Whether from homework assignments, quizzes, informal questions, classroom discussions, or lab activities, teachers have virtually endless ways of eliciting student feedback and using the information gained in the process to evaluate and adjust their instruction if necessary. Throughout the process of informing their teaching via assessment results, teachers make multiple inquiries about teaching and learning, using data to drive any decisions made, which is why we refer to it as data-driven inquiry (DDI). This process goes by many other names in the literature (Calfee and Masuda, 1997; Brunner et al., 2005; Datnow et al., 2007; Halverson et al., 2007; Gallagher et al., 2008; Hamilton et al., 2009), but all describe a very similar framework (Harshman and Yezierski, in press).

DDI is carried out daily in the teaching of chemistry, and teachers are expected to use student data as a basis for improving the effectiveness of their practice (Means et al., 2011). All decisions that teachers make about what they will do differently or similarly next year, next topic, next class, and even next minute are in some way dictated by information that teachers consider (Calfee and Masuda, 1997; Suskie, 2004; Hamilton et al., 2009). Even so, previous research has failed to characterize this process at a level that is sufficient to inform advances teachers can make in their own DDI processes (Harshman and Yezierski, in press). Herein lies the purpose of our study: to thoroughly investigate how teachers use (or do not use) their classroom data to inform their teaching. Addressing this problem will allow teachers to tailor their instruction to meet the needs of their specific classroom environments with their specific content areas to promote lasting learning (Suskie, 2009). By enhancing assessment analysis and interpretation skills, teachers are better able to accommodate their individual students (Means et al., 2011). Further, the National Academy of Science reported in 2001 that classroom assessments are not being used to their fullest potential (National Research Council, 2001). Eight years later, the Institute of Education Sciences remarked that despite a keen interest in DDI, questions about how educators should use data to make instructional decisions remain mostly unanswered (Hamilton et al., 2009), and three extensive studies from the Department of Education (Gallagher et al., 2008; Means et al., 2010, 2011) suggest a similar need for investigation. Characterizing DDI practices in chemistry classrooms is a necessary first step in determining what reforms in assessment and teacher learning might be necessary to optimize such DDI practices, which have been deemed critical to effective teaching and student learning.
For any one unit, topic, or concept, teachers provide instruction to students and then assess students via the variety of assessment types (e.g. informal, diagnostic, formative, summative, etc.). We focus here on formative assessments, the definition of which revolves more around what one does with the results as opposed to what it physically is (Bennett, 2011; Wiliam, 2014). When we use the word assessment, we are referring to anything that elicits student responses regardless of whether it is an actual object or not (informal questioning, for example). Throughout this manuscript, we will detail the process of DDI in four distinct steps as described by general education research, provide specific science education research of DDI, describe our methods, and present results from the study according to each of the four DDI steps. The research presented here was born from an extensive literature review we conducted (Harshman and Yezierski, in press), reviewing over 80 sources on the topic of using assessment results to inform instructional practice. In a general sense, DDI is simple to explain because it closely mimics the familiar scientific process of inquiry. Specifically applied to high school chemistry education, we describe DDI in terms of four steps: determining goals, collecting evidence, making conclusions, and taking actions.

Goals are what teachers are hoping to find out via the results of their assessments. Goals can range from strictly content goals (e.g. "Students will be able to explain why electrons transfer in a single displacement reaction.") to more abstract ones (e.g. "I want my students to better appreciate the energy loss associated with internal combustion engines."). Similarly, teachers can set the goal to evaluate the effectiveness of an instructional strategy (e.g. "Does my implementation of the s'mores lab promote understanding of mole-to-mole ratios?"). Although this particular aspect of aligning assessments with goals is important to the entire process of assessment in chemistry, it is largely outside the scope of this study but investigated using the same interviews presented here (Sandlin et al., in preparation).

Once the goal(s) for an assessment is (are) set, teachers design an assessment that will elicit the data needed to evaluate the goals. After implementation and collection of this assessment, teachers seek out the relevant information that will inform their goal(s), referred to as evidence. Like goals, sources of evidence range widely from simple, objective scores and sub-scores (e.g. 90% of students responded correctly) to complex affective characteristics such as the disposition of specific students on the day of assessment (e.g. "this student didn't really try"). Much like in research fields (Linn et al., 1991; American Educational Research Association, 1999; Schwartz and Barbera, 2014), the validity and reliability of sources of evidence used in educational environments should be considered (Popham, 2002; Datnow et al., 2007; McMillan, 2011; Witte, 2012). Often in conjunction with choosing valid and reliable evidence, teachers make conclusions relevant to the goals of the assessment. Conclusions are any declarative statements that a teacher makes about teaching and learning, the assuredness of which is dependent upon what can be supported by the evidence examined. Of note, it is mostly unavoidable that personal and professional experience play a role in making conclusions about teaching and learning, which is not always a bad thing (Suskie, 2004). Once data are analyzed and interpreted, teachers address issues, if any arise, through proposed actions. From the teachers' perspective, all actions are instructional actions. Finally, that action (or inaction) can also be assessed with the same or a different assessment, making the DDI process cyclical.

The impetus for the investigation lies in two crucial ideas. Firstly, teachers will not adopt new pedagogical ideas if the task of translating those ideas into practice is the primary responsibility of the teachers (Black and Wiliam, 1998). Secondly, previous formative assessment research has been largely generalized, and the specifics of a given content area have not been adequately addressed when considering design and analysis of formative assessments (Coffey et al., 2011). Both ideas suggest the necessity to study formative assessment practices while the content, chemistry in this case, is fully integrated into the investigation. As an example of this integration of chemistry, instead of just asking why a teacher designed an assessment as they did, this study probed chemistry-specific considerations, such as why the teacher chose a certain mole-to-mole ratio, why s/he included unit conversions or not, and what the rationale was for providing additional information such as molar masses and chemical equations. Thus, the findings presented here were generated because the study was designed around high school chemistry teachers' course-specific assessments and the central tenet of data analysis was chemistry content learning.

Background

In our literature review, we found that research on DDI is not specific enough to guide day-to-day practice and that a vast majority of the research does not specify a discipline or education level (Harshman and Yezierski, in press). Although they do not directly call out a DDI process, there are a handful of projects that have investigated analogous data-analytic frameworks or various aspects of DDI in chemistry/science. Focusing on the practices of teachers' informal, formative assessments, Ruiz-Primo and Furtak (2007) reported the number of complete and incomplete ESRU cycles (Elicits a question, Student responds, Recognizes the response, and Uses the information) as well as individual components of each step in the process (e.g. "teacher revises students' words" was one specific thing the researchers coded for). In this study, three middle school science teachers' assessment practices were described according to steps of the ESRU framework. While this method provides valuable information on how these teachers implement ESRU in their classrooms, focusing only on theorized steps of ESRU neglects specific aspects not predicted by the ESRU model.

More specific to the secondary science setting, Tomanek et al. (2008) investigated several factors that explained the reasoning used to select tasks for students in science courses. This directly relates to the goal step of DDI as it dictates the alignment between the goals and the task. The major finding of this study was a detailed framework that delineates how teachers (experienced and preservice) choose a specific assessment task. While these two studies from Ruiz-Primo and Furtak (2007) and Tomanek et al. (2008) add valuable understanding of teachers' assessment processes, much more is left to be investigated.

Research questions

Although assessment also serves the very important purposes of evaluating and providing feedback to students, the focus of our research is how teachers use evaluation of students to inform their teaching. The research questions are aimed at closing a gap in the knowledge base about specific ways in which chemistry teachers inform their teaching with assessment results:

1. How do high school chemistry teachers guide their teaching via the design, implementation, and analysis of classroom assessments?

a. What goals do teachers hope to achieve throughout the assessment process in general and in one specific assessment?

b. What sources of evidence do teachers consider when determining the effect of their teaching and/or evaluation of learning?

c. What types of conclusions are made about students as well as teachers themselves as a result of analyzing student assessment data?

d. What kind of actions do teachers take to address the results of assessments?

Methods

Sample

A total of 19 (12 female, 7 male) high school chemistry teachers from 10 different states participated in an interview conducted either in person or via Skype. The sampling method used is difficult to label definitively, but had aspects of criterion and random purposeful sampling techniques (Miles and Huberman, 1994). The list of teachers invited to participate in interviews was compiled by selecting states from the low (approx. 140), middle (approx. 150), and high (approx. 160) performance bands on the science portion of the National Assessment of Educational Progress standardized tests. Using a random number generator, 3 states from each category were chosen, followed by 10 high schools in each state (found by randomizing a list of each state's counties available online and then searching for schools within those counties), and finally 1 high school chemistry teacher from each school was selected using school web directories. In addition to the 90 teachers chosen this way, more were added by listing acquaintances of the authors of this paper for a total of 126 teachers invited (15% response rate).
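The stratified recruitment just described can be sketched as a short script. This is our illustration, not the authors' procedure verbatim: the state, county, and school names are placeholders, the seed is arbitrary, and the real lists came from NAEP results and public web directories.

```python
import random

random.seed(0)  # arbitrary; the study used a random number generator

# Placeholder stand-ins for the public lists the authors consulted.
states_by_naep_band = {
    "low (approx. 140)": [f"LowState{i}" for i in range(1, 18)],
    "middle (approx. 150)": [f"MidState{i}" for i in range(1, 18)],
    "high (approx. 160)": [f"HighState{i}" for i in range(1, 18)],
}
counties = {state: [f"{state}-County{j}" for j in range(1, 12)]
            for band in states_by_naep_band.values() for state in band}
high_schools = {county: [f"{county}-HS{k}" for k in range(1, 4)]
                for county_list in counties.values() for county in county_list}

invited_schools = []
for band, states in states_by_naep_band.items():
    for state in random.sample(states, 3):            # 3 random states per band
        shuffled = random.sample(counties[state], len(counties[state]))
        schools = []
        for county in shuffled:                       # walk the shuffled counties,
            schools.extend(high_schools[county])      # collecting their high schools,
            if len(schools) >= 10:                    # until 10 schools are reached
                break
        invited_schools.extend(schools[:10])          # 10 schools per state

print(len(invited_schools))  # 3 bands x 3 states x 10 schools = 90 schools
# One chemistry teacher per school was then identified from each school's web
# directory; acquaintances of the authors brought the total to 126 invitations.
```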
Science Teacher Efficacy Beliefs Instrument (STEBI; Bleicher, 2004) distribution scores and years of teaching experience are available in Appendix A (ESI). We purposefully did not collect any information regarding previous professional development programs on chemistry-specific DDI for two reasons. First, we assumed that few professional development programs (and no chemistry-specific ones) directly address DDI, and it is even less likely that our sampled teachers had gained skills or knowledge from such a program (and our results support this). Secondly, we took great strides to discourage teachers from responding based on what they thought the interviewer wanted to hear rather than what they actually do in their classes, which could be prompted by even a rudimentary description of DDI.

Interviews

After informed consent was collected (the research was fully compliant with IRB), each of the 19 teachers participated in a semi-structured, three-phase interview that lasted between 30 and 90 minutes. Next, the teachers provided a homework assignment, quiz, laboratory experiment, or in-class activity (collectively referred to as an assessment) that had already been administered, collected, and evaluated by the teacher prior to the interview. This assessment was sent a day before the interview so that the interviewer could customize questions for specific items on the assessment provided. In Phase 1 of the interviews, teachers were asked about the definitions and purposes of assessment in general and in their own classrooms. For each assessment as a whole (Phase 2) and two to four individual items on the assessment (Phase 3), teachers were asked (1) what the goal(s) of the assessment/item was (were), (2) what conclusion(s) could be made about those goals, (3) how the teacher determined these conclusions, (4) what the teacher would do in light of the conclusions, (5) how the material was originally taught, and (6) any questions specific to the content area assessed by the item or assessment. For the complete interview protocol with an example assessment, see Appendix B (ESI).

Coding

All interviews were transcribed verbatim and all coding was carried out in NVivo 9.2 software (QSR International, 2011). Analysis of data went through multiple iterations within each of five methodological steps: (1) data were open coded (as described by Creswell, 2007) into 13 major categories; (2) horizonalization was used to develop clusters of meaning (Moustakas, 1994), which were transformed into specific codes; (3) data were coded according to these codes; (4) individual codes and associated references were organized and coded into the four main aspects of the DDI framework (closed coding); (5) the final organization of the codes was created by aligning the clusters of meaning hierarchically under the four main steps of DDI. See Appendix C (ESI) for an illustration of this entire process.

Reliability

Two separate stages of inter-rater reliability were conducted to determine the reliability of the coding performed. In both stages, Krippendorff's alpha (Krippendorff, 2004a) statistic was used. Unlike percent agreement, Krippendorff's alpha corrects for chance agreement among raters. Although kappa statistics also correct for chance agreement (Cohen, 1960; Fleiss, 1971), Krippendorff's alpha calculates chance agreement via the proportion of individual codes applied globally versus proportions of agreements and disagreements between raters. This attenuates the differences that individual raters bring to coding and emphasizes the reliable use of codes given any rater (Krippendorff, 2004b). Also, kappa statistics have been shown to suffer from a base rate problem (Uebersax, 1987), which is to say that the interpretation of kappa may depend on the proportions of agreement/disagreement. In the first stage, approximately one third of two different interviews were coded for the four main steps of DDI (goals, evidence, conclusions, actions) by three chemistry education researchers plus the first author of this paper (α = 0.889). In the second stage, one full transcript was coded by the same four researchers in accordance with the full list of codes (81 total; α = 0.787). In addition, three researchers applied these 81 codes to three other full transcripts (one for each of the three; the first author coded all three transcripts); the three pairwise alpha values were 0.848, 0.874, and 0.729. Cutoffs for Krippendorff's alpha have been documented between 0.667 and 0.800 (Krippendorff, 2004b). These results indicate that the codes were reliably applied throughout the data. Nonetheless, the 81 individual codes were revised as needed from this process and the entire data set was recoded according to these revisions.
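Since Krippendorff's alpha is less familiar than kappa, a minimal sketch of its nominal-data form may help. This is our illustration, not the study's analysis pipeline (the coding and reliability work were done in NVivo), and the rater data below are invented:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal codes.

    `units` is a list of coding units (e.g., transcript excerpts); each unit
    holds the codes assigned to it by however many raters coded it.
    """
    coincidence = Counter()
    for values in units:
        m = len(values)
        if m < 2:
            continue  # a unit coded by one rater carries no agreement information
        for i, j in permutations(range(m), 2):       # ordered pairs within the unit
            coincidence[(values[i], values[j])] += 1.0 / (m - 1)

    marginals = Counter()
    for (c, _k), count in coincidence.items():
        marginals[c] += count
    n = sum(marginals.values())

    observed = sum(v for (c, k), v in coincidence.items() if c != k)
    expected_pairs = n * n - sum(v * v for v in marginals.values())
    return 1.0 - (n - 1) * observed / expected_pairs

# Four raters applying DDI-step codes to six excerpts (invented data).
excerpts = [
    ["goal", "goal", "goal", "goal"],
    ["evidence", "evidence", "evidence", "conclusion"],
    ["action", "action", "action", "action"],
    ["conclusion", "conclusion", "evidence", "conclusion"],
    ["goal", "goal", "evidence", "goal"],
    ["action", "action", "action", "action"],
]
print(round(krippendorff_alpha_nominal(excerpts), 3))  # 0.673, near the 0.667 cutoff
```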

Results and discussion

Results will be presented in terms of the DDI steps, since this framework informed the interview protocol and analysis. Within each DDI step, major themes and representative quotations are given. To more efficiently present the results in the narrative, the number of teachers (out of 19) that were coded for a particular idea will be shown in boldface followed by the total number of times that idea was coded across all interviews in parentheses (NVivo terms these sources and references, respectively). Fig. 1 shows a graphical representation of the ideas coded along with the frequencies with which they manifested in the data. Appendix D (ESI) contains a list of all codes along with their descriptions.

Fig. 1 Coding organization chart.

Goals

Given the importance of delineating a goal for each item on any assessment, it is surprising that many teachers did not clearly articulate goals when asked (see Fig. 1). As opposed to stating objectives of the kind those in teacher education might be accustomed to seeing (i.e. "students will be able to. . ."), many teachers listed the required knowledge, equations, and concepts to solve the item or used colloquial terms like "just see if they get it" to describe their goals for an assessment or assessment item. Anecdotally, it seemed as if the teachers were unaccustomed to describing goals for specific assignments despite the heavy emphasis on lesson planning in high schools. Just because teachers did not delineate their goals clearly does not mean that they do not have well-defined goals for their assessments. However, it does indicate that they may not be consciously thinking about the specific reason for assessing students on a regular basis. Fig. 2 shows an example problem provided by Mandisa:
Fig. 2 Example of one of Mandisa's assessment items with goals.

Interviewer: So what were the goals for this specific item?

Mandisa: Number one, they had to remember that delta H was the sum of the products minus the sum of the reactants and they had to multiply those delta H values by the coefficients, they forget that. . . Then they had to understand how to take that data, and plug it into the Gibbs free energy. . . they don't look at units. That delta S was in joules, not kilojoules. . .

The goals given by the teacher represent the steps that students must complete to arrive at the correct solution. The analysis of this goal, therefore, will only ever tell if students are or are not following the correct steps to achieve the right solution, a conclusion that does not readily inform instruction or the degree to which students have learned chemistry concepts.
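For readers outside chemistry, the solution path Mandisa lists corresponds to a standard two-step calculation, written generically here since her item itself (Fig. 2) is not reproduced in the transcript:

```latex
% Step 1: Hess's-law sum over formation enthalpies, weighted by coefficients.
\Delta H^{\circ}_{\mathrm{rxn}}
  = \sum_{\mathrm{products}} \nu\,\Delta H^{\circ}_{f}
  - \sum_{\mathrm{reactants}} \nu\,\Delta H^{\circ}_{f}
% Step 2: substitute into the Gibbs free energy equation; the unit trap the
% teacher mentions is that \Delta S^{\circ} is tabulated in J K^{-1} mol^{-1}
% and must be divided by 1000 before combining with \Delta H^{\circ} in kJ mol^{-1}.
\qquad
\Delta G^{\circ} = \Delta H^{\circ} - T\,\Delta S^{\circ}
```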
In Phase 1 of the interviews (which does not allude to any specific assessment), 11 (13) teachers expressed at least one goal that was directly tied to their instruction. Examples of this were teachers that wished to use results of assessments to know if "what I'm doing is working" (Adie), "make the teaching better" (Chris), or "explain this better the next time" (Mark). However, in Phases 2 and 3 of the interview that probed the goals of specific assessments, only one teacher explicitly stated that she designed and used her specific assessment to inform her instruction:

Alyssa: But almost every day I give, it's kind of a video quiz, an assessment of what they saw last night. . . Sometimes that informs me as a teacher, did they pick up what I wanted them to get out of that video. If not then it's my job to fill the gaps. . . am I doing what I need to do to support them?

None of the teachers stated instructional improvement as a goal for an individual assessment item, but just because this goal was not explicitly stated does not mean the results did not inform instruction. However, other evidence presented suggests that these teachers spent little time thinking about their instruction in light of assessment results.

While no specific instructional goals were reported by the teachers, two types of broad instructional goals can be seen in Fig. 1. First, 9 (14) teachers expressed their interest in using assessments to determine how to "make the teaching better" (Chris) or "see what areas I need to improve" (Mandisa). Second, Mark and 9 (12) others discussed using their assessments to determine whether or not to continue on with the curriculum versus going back and reteaching in some way. Although these goals have a growth mindset, they are very limited: because of their lack of specificity, they do not inform a specific conclusion or action, and only being able to draw one or two general conclusions about teaching led to extremely vague suggested actions (detailed in further sections). For example, when Adie wanted to use assessments to know if "what [she's] doing is working," there are many inquiries that naturally follow: How did you teach it originally? How have students responded to that teaching style previously? How does what is measured (student understanding) translate to what you wish to conclude (effectiveness of teaching)? How do you measure student understanding of the chemistry phenomenon? Yet, these questions were not discussed by Adie or any of the other teachers that had similar instructional goals. In the data, setting general instructional goals led to broad and oftentimes not very helpful actions.

Not surprisingly, every teacher at some point defined goals for their specific assessment and assessment items pertaining to their students' understanding of certain chemistry and math concepts. This is consistent with a DDI model because, in order to make decisions about instruction, evaluation of student-level aspects must be made. However, just because a goal is about chemistry content does not mean that a meaningful interpretation can be gained from results. Surprisingly, 2 (2) teachers described their content-oriented goals in a manner that neither leads to a clearly assessable outcome nor yields valuable information, exemplified by Chris and Jess:

Chris: So the goal is basically to expose them to every type of chemical reaction that the CollegeBoard could possibly ask them and to try to give them a format exactly as they're going to see it.

Jess: Um, and so that was kind of the intent. So to give them practice with a variety of the problems and then in kind of a structured way.

"Exposing" students to formats and "giving" practice are verbs associated with teachers' actions that require no student feedback whatsoever. These teachers may not literally mean that their only goal is to expose content or give practice. However, their elaboration provides evidence that these are teacher-centric goals because they do not discuss anything related to students' learning when making conclusions. Because such goals do not depend on student performance, the results cannot directly inform teaching and learning, thus bringing the process of DDI to a halt.

Finally, 10 (18) teachers' goals for items and assessments revolved around the idea of understanding content. Unfortunately, the word "understand" appears to have many different meanings to many different teachers and will be discussed below.

Evidence

Chronologically, after teachers design and implement an assessment, evidence must be considered in order to make conclusions about teaching and learning. To make these conclusions, teachers used both evidence that can be measured with a high degree of certainty (calculated scores, attendance records, etc.) and evidence that is measured with a lesser degree of certainty (effort, confidence, paying attention, etc.).
We refer to these as suppositional (lesser certainty) and non-suppositional (greater certainty) evidence. The primary source of non-suppositional evidence was the use of a percent correct on an item or whole assessment. In this logic, teachers concluded that because a high percentage of the class got an item correct, they must understand the content. The problem with this logic is that it heavily relies on the assumptions that (1) the assessment is designed to appropriately elicit chemistry understanding; and (2) the responses from students represent those students' mental models. Neither researchers nor teachers can guarantee that responses to assessment items absolutely reflect students' mental models (Pelligrino, 2012). According to Fig. 1, this oversimplified view of assessment occurred in 16 (40) of the teachers; the three teachers that did not exhibit this code either never collected their assessment or took completion points and did not ever calculate percent correct. Bart provides an example of this code:

Interviewer: . . .do the assessment results seem to support or refute this goal, that you've met this goal, or your students have?

Bart: It seems to support it.

Interviewer: And then why would you believe that?

Bart: Because the vast majority of the students got most of that problem right if not completely right.

Beyond simple descriptive statistics of scores on an assessment, the only non-suppositional measures of evidence used by teachers were the current students' performance on future (13 (20)) and past (5 (6)) assessments with similar content, and the performance of previous classes on the same or similar assessments (9 (12)). This means that teachers will use past assessments or look ahead to assessments that are not yet given to students:

Matthew: . . . are the kids really paying attention or do they not know, and I typically find that out on the final assessment for that unit.

Nichole: Our final in this unit, one of my classes the average was 89% and the other was 80%, and last year it was down in the lower 70s. And I'll be honest with you, these kids struggle more than the kids from last year. Just, they're not as bright, and they're not as motivated. . .

In the two examples, both teachers use performance on a final exam as a source of evidence to make their conclusions, but Nichole also compares current students' performance to previous years' performance (a second source of evidence). Teachers also used the same sources of evidence in very different ways. For instance, consider how Natalie used the previous performance of her students compared to Britt:

Natalie: [The students] definitely have met [my goals]. I did have a wide range from As to Fs, but knowing the students' background, of course the students that made the As are the ones I anticipate getting extremely high scores on the national exams, those that made Fs are thinking they are getting out of the class by transfer anyways so they have not been trying, so I did not expect my two Fs to do well. . .

Interviewer: . . .was [the conclusion that students could not construct particulate drawings from scratch] based on what you saw in this assessment?

Britt: No, just previous knowledge of because I've done this, I've used these particle drawings before so my expectation is ya know I, from previous assessments, I know that if I were, would have thrown something more complicated on there, I ask them to um, draw something that we haven't seen yet, they might not be able to do that. . . I think its uh, ya know they've seen this before, but they've never had to draw them themselves. . .

Natalie makes the assumption that if a student performed well in the past they should perform well in the future, and vice versa for those that did not perform well. While arguments can be made for the validity of this claim, its use as evidence in data analysis is built on faulty logic, as a wide variety of factors that affect student performance on her assessment were not taken into account by Natalie. Alternatively, Britt not only realizes her assessment (which requires students to recognize features of particulate drawings but does not require them to draw them) cannot answer the question posed by the interviewer, but she also relies on the performance of previous classes and states that students' lack of familiarity with this type of problem may mask conclusions she can make about her students' understanding of particulate images.

Suppositional sources of evidence were used abundantly throughout all portions of the interviews. Comparisons using non-suppositional evidence generally led to more thoughtful conclusions in the sample of teachers, whereas the use of more suppositional measures led to greater uncertainty in conclusions. Of these suppositional sources, the most prevalent were observations from classroom discussions (13 (22)), teacher experience or instinct (9 (14)), and the level to which students were paying attention during classes (6 (8)). Teachers also used evidence that has both suppositional and non-suppositional characteristics. Examples of these sources of evidence were the previous math and chemistry courses of students and what that means (7 (8)) and the motivation of students (3 (4)). In this category of codes, special attention is given to teachers who made conclusions about students based on how much practice the teacher perceived the student completed. This idea can be seen in Fig. 1 as a goal (purpose of assessment or item was to give students practice, 4 (8)), a source of evidence (conclusions made were based on amount of practice, 10 (16)), a conclusion (students did not get enough practice, 9 (17)), and an action (give students more practice opportunities, 12 (18)). Many times, it seemed as though the teachers were entirely focused on getting practice with solving a particular type of problem instead of anything to do with understanding the chemistry concepts:

Alyssa: Yeah, I think that for the most part, um, I think that students were able to do exactly what I had asked them to do because they had that practice in advance.

Britt: Um, but I'm happy with where we're at so far because they're going to be getting more practice on it.

Bart: . . .they usually from enough practice can tell me what the limiting reagent is, they can differentiate from the limiting and actual and find the percent yield. . .

The use of suppositional evidence in making instructional decisions is as unavoidable as it is encouraged (Hamilton et al., 2009). However, when these teachers used more suppositional sources of evidence, it was often in isolation of, as opposed to in tandem with, non-suppositional evidence. Making decisions in this way can be detrimental to the process of DDI.

Conclusions

By far, codes that represent conclusions that teachers made about students were the most abundant throughout the interviews. Looking at the blue regions in Fig. 1, it is clear that student-centered conclusions (any conclusions about students or learning) grossly outnumber teacher-centered conclusions (any conclusions about teachers or teaching). This suggests that no matter how much the teachers set out to use the results of their assessments for instructional modification, few actually concluded anything about their instruction, and instead made many declarative statements about their students or their students' learning. Not all questions can be appropriately used to inform instruction, but the high number of conclusions that the teachers made about students that were unrelated to their understanding of chemistry content was of interest. These conclusions included making affective judgments and determining student characteristics:

Michael: Yeah, because as I was saying these are capable students and highly motivated, they're very math oriented. . . these are students who have a very high math capability and so once they understand the methods here, they can just go run by themselves after that.

Britt: . . .ya know it really wasn't anything new as far as the actual content or discussion but that they actually went and took some measurements, just engaged them more so that they, um, they understood it better.

The two examples from Britt and Michael above could just as easily be displayed in the evidence section of this paper, as the teachers made conclusions to be used as evidence of understanding and learning. Michael informs his instructional decision (deciding that he will let them "run by themselves") by stating that his students are capable, motivated, and math-oriented. Similar to the previous discussion, these may not be the best sources of evidence on which to base what a teacher should do next, as they could lead to false-positive (teacher believes instruction promoted learning when it did not) or false-negative (teacher believes instruction did not promote learning when it did) results. Beyond representing evidence, these and other quotes like them reveal what teachers were aiming to conclude about with their assessments. Teachers' comments on aspects of students that were not tied to the chemistry content were comparable in abundance to those that were tied to chemistry content. This finding is very interesting because it demonstrates the value of evidence that can only be obtained by looking outside of the responses to the assessment and outside of considerations from the chemistry content. It was difficult to know if the teachers were determining these characteristics based on the results of the assessment or simply based on their experiences as teachers. Regardless of which, these conclusions regarding students came up during interview prompts that specifically asked them to consider what information could be gleaned from the results, indicating that ideas not directly from assessment results were a large part of the teachers' data analysis process.

Conclusions tied to the chemistry content are integral to the DDI process, as they inform the specific content-area weaknesses of students, which in turn helps identify the pedagogical strategy used so that it also may be evaluated. As alluded to previously, a majority of the teachers mentioned conclusions revolving around the idea of student understanding (10 (18)). As an example, Mark used the root "understand" and its associated tenses 37 times in a 64 minute interview (not counting colloquial phrases such as "get it" and "on the same page"). Below is a variety of different ways in which teachers concluded that their students understood something:

Adie: . . . if they can represent the situation in more than one way, it shows me that they actually understand what's going on. . . they could participate in discussion, which tells me that at least they understood enough to discuss it.

Laura: . . .I don't think they're really understanding the difference between that that does not represent necessarily two molecules of hydrogen and two molecules of oxygen and then making the mole thing. . .

Nichole: . . .we just were not getting any clear understanding, what was the difference? What were the differences and what were the similarities of the four models [of bonding], we weren't getting those.

Amy: . . .they did pretty successfully, they understood the question, they understood the concept, they understood the math, and like I said that's what it's all about.

Here, a range of meanings of "understand" can be observed. To Adie, understanding means that students can represent phenomena in different ways with an emphasis on being able to talk about it, although she does not specify how the students talk about it, only that they do talk about it. Laura differs from this by equating understanding with demonstrating rote knowledge (of atomic versus molecular hydrogen and oxygen). Nichole likens understanding to being able to identify differences and similarities in different models of bonding. Lastly, Amy provides scarce details on what it means to understand something, saying only that students do or do not understand it. The ambiguity of "understanding" something is just as detrimental to conclusions as it is to goals: without the specification dictated by detailed models of chemistry phenomena, the teachers gathered very little useful information from assessments.

Actions

In contrast to conclusions, actions that teachers would take in light of assessment results were scarce. This is most likely a result of not having specific instructional goals and conclusions, because any prescribed action needs to address the conclusions made from assessment data. Similar to conclusions, teachers' proposed actions were often vague and marked with indecision:

Adie: Well, if they can't do those things then I need to go back and present the material again or do it in a different way.

Laura: So obviously they didn't get the concept, so I need to reteach it somehow, so if it doesn't work by just lecturing and putting an example on the board and doing a couple of them with it, I'll have to come up with a manipulative something. . .

Adie and Laura both desire to change their instruction, but do not detail what changes are going to be made or how the content deficiencies will be addressed. These were both coded as ambiguous actions (13 (36)) because it is unclear as to what the teachers will actually do. In addition, 10 (14) teachers responded with multiple options for instructional adjustments as a sort of laundry list of suggestions, but never committed to one even after follow-up questions trying to pinpoint which action(s) the teacher deemed necessary. While coming up with multiple options for how to adjust instruction is a good thing, teachers must eventually decide on one or more actions, or else no changes will be made. Because of the limited actions brought up, the data corpus holds limited findings on the decision-making processes behind teachers' courses of action.

Another noteworthy action exhibited by the teachers was the notion of reteaching a concept. While many more teachers used the words "reteach" or "recover," only 7 (10) gave evidence to suggest they would physically do or say the same exact things to address a content deficiency. Alternatively, many teachers implied that they would actually teach with a different pedagogy even though they used the word "reteach" and were thus coded as ambiguous as described above. As an example of the former group of teachers, Michael stated that he originally taught stoichiometry to his students using an analogy of turkey sandwiches and bicycles. When the assessment yielded less-than-expected results from his students, Michael's response indicates that he will do exactly as he did previously:

Interviewer: . . .what exactly are you going to do or what have you done to address the mole to mole ratio?

Michael: Well I go back to bicycles and turkey sandwiches. How many seats do you need for each bike, so that's a one to one ratio. How many wheels do you need for each bike, two, that's a two to one ratio, so now let's come back to the chemical substances and it's the same methodology.
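Michael's bicycle analogy maps directly onto mole-to-mole ratios; with a generic reaction of our choosing (not one from his assessment), the parallel is:

```latex
% 1 bike : 2 wheels is analogous to the coefficient ratio in, e.g.,
% 2 H_2 + O_2 -> 2 H_2O:
\frac{2\ \mathrm{mol\ H_2}}{1\ \mathrm{mol\ O_2}}
\quad\Longrightarrow\quad
3.0\ \mathrm{mol\ O_2}\times\frac{2\ \mathrm{mol\ H_2}}{1\ \mathrm{mol\ O_2}}
 = 6.0\ \mathrm{mol\ H_2}
```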
Our hypothesis to explain this finding is that these teachers adhere to the belief that students need to hear things multiple times before being able to understand the content and/or meet the teacher's goal(s). We argue for this belief because all but one of the teachers who said they would reteach exactly as originally taught also placed heavy emphasis on practice inside and outside of class, indicating the value of repetition in student work.

Exemplary DDI

Throughout this project, it was never our intention to criticize high school teachers or to actively search for things they were doing wrong. So far, we have purposefully shown places where teachers' DDI practices limit their ability to inform their instruction so that we can promote future research to focus on interventions in these areas. Despite the previous results, we are happy to report just a sample of the promising aspects seen throughout the data. For example, with a heavy emphasis on mathematical assessments and items, 9 (17) teachers noted that just because students were able to complete algorithmic problems did not mean that they understood the concepts behind them:

Laura: I consider chemistry and application so I feel like, hey, doing math problems is applying it but again, everybody can plug and chug if they know how to do it, but do they understand why they're doing it, that's, I've been asking a lot of why questions lately, which I think is helping.

Quotes like the one above from Laura indicate at least partial alignment between goals and items on assessments because the teachers demonstrate that they interpret results knowing that correct mathematical answers do not necessarily indicate (mis)understanding of chemistry concepts.

Additionally, 5 (7) teachers specifically sought feedback from their students by use of metacognitive questions beyond "what are you struggling with?" When teachers invest in collecting more information from their students, they are gathering more data than just the results of the assessment and are better able to inform appropriate instructional modifications. Lastly, a fair number (approximately half, although this was never specifically coded) of the sample of teachers mentioned writing notes for the next time they would implement an assessment or activity. This common practice aligns well with DDI as the teacher uses data to drive what changes (if any) should be made to instruction.

Conclusions

In our response to specific research questions 1a–1d, major themes were identified in each subheading above. The high school chemistry teachers in this study set goals that were not very conducive to instructional improvements, largely due to their ambiguous nature, as was determined by analyzing the chemistry content the teachers stated as goals. In evaluating these goals, teachers often used suppositional or non-suppositional evidence, but not both in conjunction with one another, thus missing opportunities to cross-reference sources of evidence upon which to draw conclusions. As a result, many of the conclusions about student learning were based on potentially faulty evidence, compromising the integrity of the DDI process. In addition, conclusions regarding the teachers themselves as effective educators were not reported in the same quantity as the goals that were made for instructional improvement. This led to an even lesser number of tangible, specific actions that teachers said they would take to improve their instructional effectiveness.

Our study also shows several overall themes associated with how teachers enact the DDI process (response to broad research question 1). First, in every step teachers demonstrated a lack of elaboration when responding to interview prompts. Teachers may be oversimplifying the processes of learning and measuring learning, which are both extremely complex, by reducing complex chemical phenomena to algorithms for which understanding can be determined by simple correct/incorrect dichotomies. Secondly, in several of the specific assessment items that teachers provided for the interviews, the DDI process was stunted by having dichotomous goals (e.g. see if the students "got it" or not) or by teachers' dismissal of assessment results because of affective aspects.
Lastly, based on the results discussed, the sample of high school chemistry teachers in this study demonstrated limited DDI practice. We label it limited because the teachers' implementation of DDI constrains what information teachers could draw from their assessment results and what pedagogical strategies could be employed to support effective pedagogy and improve upon weaker pedagogy.

Limitations

To exhibit transparency in our work, we have provided a discussion of some limitations of our study. First, we asked several questions in a format that implies a yes/no answer so that teachers would be prompted to commit to explicitly stated conclusions as opposed to supplying responses that make it unclear as to what the teacher actually thinks. This encouraged shorter responses to our prompts and, as a result, could have increased the prevalence of codes that targeted dichotomous responses. For this reason, the interviewer always asked unscripted questions to elicit more detail, and codes were given more stringent descriptions so that what was coded was a result of what the teachers expressed as opposed to a product of the prompt format. Another limitation was the variety of teacher-provided assessment content. To rule out differences and find similarities in DDI, we originally limited the content to a few topics, but ended up with nine chemistry topics throughout the interviews due to participant availability. While this provides breadth, we recognize that the DDI process may look different depending on the topic being assessed because pedagogical strategies and types of assessment may also change depending on these topics. This limitation has theoretical backing in that content knowledge interacts with pedagogical knowledge to engender pedagogical content knowledge (Shulman, 1987). In response to this issue, we sought information specific to the topic and later generalized the findings into themes that cut across multiple topics, such as conceptual vs. mathematical understanding. Lastly, the data used in this project were self-reported by the teachers that participated, and findings are not based on observational evidence. We do not see this as much of a limitation, but more as our focus on what teachers were thinking during assessment interpretation, an aim that observational data would not be able to validly measure.

Implications

This study generated many implications for chemistry teachers, professional development providers, and chemistry education researchers. The implications for teachers and professional development providers are organized by the steps of DDI, followed by the implications for researchers.

Goals

When teachers set goals for assessments, caution should be used when trying to assess students' understanding of a certain topic. As was seen in our study, many vague goals of understanding did not lead to fruitful conclusions regarding the quality of learning or instruction. A great way for teachers to do this is to take extra time to ask "what does it mean to understand this topic and what evidence would show this?" and then ensure that assessment items can actually elicit such evidence. High quality assessments available to teachers, such as Page Keeley's assessment probe books from NSTA Press (Keeley and Harrington, 2010), concept inventories (a sample of inventories: in redox, Brandriet and Bretz, 2014; bonding, Luxford and Bretz, 2014; general chemistry, Mulford and Robinson, 2002), and the American Association for the Advancement of Science's Assessment web site (http://assessment.aaas.org/pages/home), can alleviate the burden on teachers to generate items so that teachers can focus more on aligning assessments with specific learning goals. Secondly, determining how effective a lesson was in encouraging learning should be a goal of specific assessments and items, not just the general concern many teachers treated it as. Teachers that actively consider how their teaching has impacted learning, which is at least to a degree displayed in student responses, will have a heightened ability to reflect on their practice (for reflection theory, Schön, 1983; Cowen, 1998; for an example implementation in science education, Danielowich, 2007). We wish to encourage teachers to make specific conclusions tied to their instruction; without this reflection/consideration, very few changes to teaching will occur, as was the case in our sample.

Evidence

The most pressing issue, observed at least once from every single teacher (except three that did not collect their students' assessments), is the unquestioned assumption that student performance on assessments unequivocally correlates to understanding of the content. At a minimum, the use of scores and/or sub-scores on assessments needs to be considered in light of the design of the assessment and should be considered alongside other sources of evidence. For example, a teacher should consider more than a class average score to determine student understanding and gather evidence outside of scores, such as similar assessments given previously and/or students' questions during work time and discussions. Conversely, teachers overly favored suppositional evidence. While professional experience certainly has a place within DDI (Suskie, 2004), it should be noted that judgments of the affective characteristics of students do not come without bias, and caution is warranted when making such judgments. We suggest that teachers have at least 2–3 sources of evidence when making an affective judgment. For example, considering in conjunction a large absence/tardy record, multiple conversations revealing apathy, and unsuccessful attempts to get a certain group of students to participate strongly suggests a lack of motivation, where only one of these sources of evidence would weakly suggest motivational issues. In many cases, use of only one source of evidence (student performance data) will not enable the teacher to determine if one goal is fully met. The obvious solution to this is to collect more evidence, but this suggestion wasn't mentioned frequently by the teachers. Scaffolding problems is a way to access more information while providing feedback to students throughout the learning process (see Chapter 5: scaffolding learning in chemistry, in Taber, 2002).
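As a deliberately simplified sketch of the 2–3-source rule above, an affective judgment can be gated on agreement among independent indicators; the indicator names and thresholds here are invented for illustration:

```python
# Sketch: require at least two of three independent indicators to agree
# before concluding a student has a motivation problem (thresholds invented).
def motivation_concern(absences_and_tardies: int,
                       apathetic_conversations: int,
                       failed_engagement_attempts: int) -> bool:
    indicators = [
        absences_and_tardies >= 10,        # large absence/tardy record
        apathetic_conversations >= 3,      # repeated conversations revealing apathy
        failed_engagement_attempts >= 2,   # attempts to draw the student in failed
    ]
    return sum(indicators) >= 2            # one indicator alone is weak evidence

# A single signal does not trigger the judgment; two corroborating ones do.
print(motivation_concern(12, 0, 0))  # False
print(motivation_concern(12, 4, 0))  # True
```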
Conclusions

As was observed in the data, without a great deal of conclusions about teaching, there are few teaching actions that can be specified. Therefore, we reiterate the importance that teachers consider the impact that pedagogical strategies have on learning as measured by assessments. Also, whether students' results are good or bad, teachers should attempt to specify the "why" behind what was observed, asking clarifying questions such as "what was it specifically about this lesson that worked so well in helping them learn?" or venturing explanations such as "I think that showing a guided example of the whole stoichiometry process followed by an explanation of each step did not work as well as first explaining each step and then showing the whole process." In these examples, ideas directly tied to teaching are being assessed by the assessment as opposed to solely assessing student learning. Also, when student learning is being assessed, we suggest (as we did in the goals section) avoiding conclusions such as "the students understand this content." Teachers should clarify both what chemistry the students specifically understand and what it means to understand that specific chemistry content.

Actions

Consistent with our theme of calling for detail, we find it important that chemistry teachers choose a specific action for a specific purpose. If a teacher suspects that one type of activity did not promote learning, it is better to uncover (or at least hypothesize) a reason why it did not than to try something different just because it is different. Targeting a reason for less-than-expected performance specifically informs the new pedagogical strategy needed to address the reason that the original strategy was less effective.

Research

In addition to the above implications for teachers, this study also has implications for researchers studying high school chemistry teachers and instruction. Because the teachers from our sample spoke ambiguously about their assessment processes, other studies investigating DDI or general assessment characteristics may find a similar lack of specificity. As we have reported, this does not mean that teachers neglect to think about their assessment practices, but a carefully constructed data solicitation method is required to prompt teachers to elaborate on their assessment practices. We found that the discussion of assessment practices in situ with their chemistry-specific curriculum, assessments, and instruction worked well and would serve future inquiries. This outcome emphasizes the necessity of studying assessment in a disciplinary context.

Future work

While this study investigates the DDI process of 19 high school chemistry teachers, it does not necessarily translate to the majority of high school chemistry teachers. Future studies should focus on characterizing which of these traits exist in a representative sample of high school chemistry teachers to see how widespread these themes are. After characterization of a large sample, professional development and pre-service educational programs should be developed to address limitations in the process found in larger scale studies. Also, our qualitative study focused on a bigger picture of the entire DDI process with details at each step. Additionally, an investigation into the alignment between stated goals and assessment items, teachers' processes while developing assessments, and comparing assessment beliefs to assessment practices would be very valuable for both researchers and practitioners.

Acknowledgements

We would like to thank the teachers that participated in this study as well as the Yezierski and Bretz Research Groups at Miami University. Specifically, we thank Justin Carmel, Sara Nielsen, and Sarah Erhart for their assistance in qualitative coding for this project.

References

American Educational Research Association, American Psychological Association, National Council on Measurement in Education and Joint Committee on Standards for Educational and Psychological Testing (U.S.), (1999), Standards for educational and psychological testing, Washington, DC: American Educational Research Association.

Bennett R. E., (2011), Formative assessment: a critical review, Assess. Educ.: Princ., Pol., Pract., 18(1), 5–25.

Black P. and Wiliam D., (1998), Inside the black box: Raising standards through classroom assessment, Phi Delta Kappan, 80(2), 139–144, 146–148.

Bleicher R., (2004), Revisiting the STEBI-B: Measuring Self-Efficacy in Preservice Elementary Teachers, Sch. Sci. Math., 104(8), 383–391.

Brandriet A. R. and Bretz S. L., (2014), The development of the Redox Concept Inventory as a measure of students' symbolic and particulate redox understandings and confidence, J. Chem. Educ., 91, 1132–1144.

Brunner C., Fasca C., Heinze J., Honey M., Light D., Mandinach E., et al., (2005), Linking data and learning: The Grow Network study, Journal of Education for Students Placed at Risk, 10(3), 241–267.

Calfee R. C. and Masuda W. V., (1997), Classroom assessment as inquiry, in Phye G. D. (ed.), Handbook of classroom assessment. Learning, adjustment, and achievement, San Diego: Academic Press.

Coffey J. E., Hammer D., Levin D. and Grant T., (2011), The missing disciplinary substance of formative assessment, J. Res. Sci. Teach., 48(10), 1109–1136.

Cohen J., (1960), A coefficient of agreement for nominal scales, Educ. Psychol. Meas., 20, 37–46.

Cowen J., (1998), On Becoming an Innovative University Teacher, Buckingham: Open University Press.
Creswell J. W., (2007), Qualitative inquiry and research design: Choosing among five traditions, 2nd edn, Thousand Oaks, CA: Sage.
Danielowich R., (2007), Negotiating the conflicts: Reexamining the structure and function of reflection in science teacher learning, Sci. Educ., 91(4), 629–663.
Datnow A., Park V. and Wohlstetter P., (2007), Achieving with data: How high-performing school systems use data to improve instruction for elementary students, Los Angeles, CA: University of Southern California, Center on Educational Governance.
Fleiss J. L., (1971), Measuring nominal scale agreement among many raters, Psychol. Bull., 76(5), 378–382.
Gallagher L., Means B. and Padilla C., (2008), Teachers' use of student data systems to improve instruction, 2005 to 2007, U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service.
Halverson R., Prichett R. B. and Watson J. G., (2007), Formative feedback systems and the new instructional leadership, Madison, WI: University of Wisconsin.
Hamilton L., Halverson R., Jackson S., Mandinach E., Supovitz J. and Wayman J., (2009), Using student achievement data to support instructional decision making (NCEE 2009-4067), Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Harshman J. and Yezierski E., (in press), Assessment data-driven inquiry: A review of how to use assessment results to inform chemistry teaching, Sci. Educ., Summer, 2015.
Keely P. and Harrington R., (2010), Uncovering Student Ideas in Physical Science, Washington, DC: National Science Teachers Association Press, vol. 1.
Krippendorff K., (2004a), Content Analysis: An Introduction to its Methodology, 2nd edn, Thousand Oaks: Sage.
Krippendorff K., (2004b), Reliability in content analysis: some common misconceptions and recommendations, Hum. Commun. Res., 30(3), 411–433.
Linn R. L., Baker E. L. and Dunbar S. B., (1991), Complex, performance-based assessment: Expectations and validation criteria, Educ. Res., 20(8), 15–21.
Luxford C. J. and Bretz S. L., (2014), Development of the Bonding Representations Concept Inventory to Identify Student Misconceptions about Covalent and Ionic Bonding Representations, J. Chem. Educ., 91, 312–320.
McMillan J. H., (2011), Classroom assessment: principles and practice for effective standards-based instruction, 5th edn, Pearson.
Means B., Chen E., DeBarger A. and Padilla C., (2011), Teachers' ability to use data to inform instruction: challenges and supports, Office of Planning, Evaluation and Policy Development, U.S. Department of Education.
Means B., Padilla C. and Gallagher L., (2010), Use of education data at the local level: From accountability to instructional improvement, Office of Planning, Evaluation and Policy Development, U.S. Department of Education.
Miles M. B. and Huberman A. M., (1994), Qualitative Data Analysis: An Expanded Sourcebook, Thousand Oaks, CA: Sage.
Moustakas C., (1994), Phenomenological Research Methods, Thousand Oaks, CA: Sage.
Mulford D. and Robinson W., (2002), An inventory for alternate conceptions among first-semester general chemistry students, J. Chem. Educ., 79, 739–744.
National Research Council, (2001), Knowing what students know: The science and design of educational assessment, Committee on the Foundations of Assessment, in Pelligrino J., Chudowsky N. and Glaser R. (ed.), Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education, Washington, DC: National Academy Press.
Pelligrino J., (2012), Assessment of science learning: living in interesting times, J. Res. Sci. Teach., 49(6), 831–841.
Popham W. J., (2002), Classroom Assessment: What Teachers Need to Know, 3rd edn, Allyn and Bacon.
QSR International, (2011), NVivo qualitative data analysis software, Version 9.2.
Ruiz-Primo M. A. and Furtak E. M., (2007), Exploring teachers' informal formative assessment practices and students' understanding in the context of scientific inquiry, J. Res. Sci. Teach., 44(1), 57–84.
Sandlin B., Harshman J. and Yezierski E., (2014), Formative assessment in high school chemistry teaching: investigating the alignment of teachers' goals with their items, J. Chem. Educ. Res., in preparation.
Schon D. A., (1983), The Reflective Practitioner: How Professionals Think in Action, USA: Basic Books.
Schwartz P. and Barbera J., (2014), Evaluating the Content and Response Process Validity of Data from the Chemistry Concepts Inventory, J. Chem. Educ., 91(5), 630–640.
Shulman L. S., (1987), Knowledge and teaching: Foundations of the new reform, Harvard Educ. Rev., 57(1), 1–22.
Suskie L., (2004), What is assessment? Why assess? in Assessing student learning: A common sense guide, San Francisco: Jossey-Bass Anker Series, pp. 3–17.
Suskie L., (2009), Using assessment results to inform teaching practice and promote lasting learning, in Joughin G. (ed.), Assessment, Learning, and Judgement in Higher Education, Springer Science.
Taber K., (2002), Chemical Misconceptions – Prevention, Diagnoses, and Cure, Volume I: Theoretical Background, Piccadilly, London: Royal Society of Chemistry, pp. 67–84.
Tomanek D., Talanquer V. and Novodvorsky I., (2008), What do science teachers consider when selecting formative assessment tasks?, J. Res. Sci. Teach., 45(10), 1113–1130.
Ubersax J. S., (1987), Diversity of decision-making models and the measurement of interrater agreement, Psychol. Bull., 101(1), 140–146.
Wiliam D., (2014), Formative assessment and contingency in the regulation of learning processes, paper presented at the Annual Meeting of the American Educational Research Association, Philadelphia, PA.
Witte R. H., (2012), Classroom assessment for teachers, McGraw-Hill.