Wednesday, April 11, 2012

Claim Evidence Reasoning - More on Reasoning

Mylene asked about helping with the reasoning part of Claim, Evidence, Reasoning. (Mylene - get on twitter already. We could just talk about it.) Here's the relevant snip:
One thing I'm really struggling with is the concept of "logic" or a conclusion following from its premises. It's hard for my students to understand what I mean by this and it's hard for me to explain in other terms. So far, it appears that their definition of "logical" is something along the lines of "familiar" or "what I was expecting." Any suggestions? How do you handle reasoning that is preposterous or that makes leaps of faith?
This is definitely the hard part. I don't have any magic bullets, but I can tell you where I am right now.


1. Faulty reasoning is most often due to a lack of content knowledge. At the beginning and middle of a unit, this is expected. Honing reasoning as we go is one of our primary goals.

If things aren't going well, though, my best advice would be to narrow the question. Broader questions are usually more complex. I started conservation of mass with, "What happens to atoms in a chemical reaction?" (actually, the question was about why ice melts when you heat it but paper burns) and kids were flying all over the place. They generated their own claims and designed their own experiments, and it was a disaster. There was no real way to differentiate between most of their claims through experiment (atoms are being fused together, atoms are exploding, atoms turn into heat). I rebooted with a narrower question, whether or not atoms are destroyed in a chemical reaction, and it worked much better. We all did a similar experiment and students were able to make a well-reasoned argument that atoms weren't destroyed.

The counterpoint is that I start other topics, like kinetic molecular theory, very broadly. I can't say for sure why some topics need to be narrower but it is some combination of how much background knowledge students come in with, how well our classroom experimentation can differentiate between competing claims, and how much time I'm willing to devote to testing competing ideas.

2. There are mechanical issues. For middle school kids, sentence frames and sentence starters go a long way. Be sure to fade them as you go. Another teacher at my school has a lot of success with sentence starters in group discourse; he posts them at his tables. Instead of CER, some teachers like C-ER-ER-ER, where you give a reason after each piece of evidence.

3. Ask students to explain ahead of time what different results would mean with regard to the claim. Before they could start the experiment to test whether atoms are destroyed or not, my students needed to explain to me that if the bottle got lighter, it meant atoms were being destroyed; if it weighed the same, they weren't; and if it weighed more, something else was going on that we couldn't explain (this is an important and often overlooked step). In essence, I was locking them into a line of reasoning. Does that prevent them from doing any thinking? I say no, because I'm still asking my students to reason; it's just the timing that changes. If I do it afterward, students more often construct weird explanations for the results and engage in all kinds of magical thinking and confirmation bias. This assumes the purpose of a lab is to test a specific claim. Sometimes a lab is just to get ideas rolling, in which case we do our reasoning after. [Edit: Have your students sketch a "prediction graph" on their whiteboards ahead of time.]
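
If it helps to see the move laid bare, here's a minimal sketch of that pre-agreed decision rule as Python. The balance uncertainty and the masses are made-up numbers for illustration, not anything from an actual handout.

```python
# Sketch of a pre-agreed decision rule for the sealed-bottle reaction.
# The uncertainty and the masses are invented for illustration.

BALANCE_UNCERTAINTY_G = 0.1  # assumed precision of a classroom balance, in grams

def interpret(mass_before_g, mass_after_g):
    """Return the interpretation the class committed to before running the lab."""
    change = mass_after_g - mass_before_g
    if change < -BALANCE_UNCERTAINTY_G:
        return "Lighter: consistent with atoms being destroyed."
    if change > BALANCE_UNCERTAINTY_G:
        return "Heavier: something else is going on that we can't explain."
    return "No measurable change: atoms are not being destroyed."

print(interpret(152.4, 152.4))  # hypothetical masses -> "No measurable change..."
```

The point isn't the code; it's that the whole mapping from outcomes to conclusions exists before a single measurement is taken.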

4. Distinguish between reasoning and introducing a new claim. It is hard for students to see when their reasoning is backing up a claim and when they're making a brand-new assertion. I don't have a good method for this other than continually asking them to go back to what their claim actually predicts. It's also important to help them with the idea that a single experiment doesn't have to explain everything. We burn paper in a sealed bottle, weigh it, and it weighs the same; that only tells us the atoms aren't being destroyed. That means we often need to use multiple pieces of evidence to converge on a single, more complex explanation.

5. Differentiate between logical/scientific reasoning and reasoning from evidence. I think this idea came from a paper by Deanna Kuhn2 but I'm not sure. The part I really latched onto was that students come into experiments in science class with the idea that something has to covary: we're measuring this and manipulating that, so clearly there's a relationship between the two. Students aren't really asked to look at the data they created, just to reason about the science involved. In hindsight, this is certainly true in most of my classes. I run some kind of amazing science lab where the null hypothesis is always rejected. My takeaway is that students occasionally need to be doing experiments where no relationship will actually be found. I've gotten better at this, both by intentionally designing such experiments in and by letting students run an experiment that I know will show no relationship. And on the first few attempts, what do I find? Students can ALWAYS find a relationship. Breaking this habit is a long, slow process.

The other insight I got from that paper was that students don't really understand measurement uncertainty. I definitely find that to be true. I like Geoff's approach but I don't really do anything about this other than hate how I don't do anything about it.
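
To make that concrete, here's a rough sketch (hypothetical pendulum-period data, nothing real) of the question I want students asking before they declare a relationship: are the differences between conditions actually bigger than the scatter in their own repeated measurements? It's a classroom rule of thumb, not a formal statistical test.

```python
# Hypothetical data: pendulum period (seconds) for three bob masses, five trials each.
# Question: is the variation BETWEEN conditions any bigger than the scatter WITHIN them?
from statistics import mean, stdev

trials = {
    "50 g":  [1.41, 1.43, 1.40, 1.44, 1.42],
    "100 g": [1.42, 1.41, 1.44, 1.43, 1.42],
    "200 g": [1.43, 1.42, 1.41, 1.44, 1.43],
}

within = mean(stdev(t) for t in trials.values())    # typical trial-to-trial scatter
between = stdev(mean(t) for t in trials.values())   # spread of the condition averages

print(f"scatter within a condition: {within:.3f} s")
print(f"spread between conditions:  {between:.3f} s")
if between <= within:
    print("Differences are no bigger than the measurement noise: no relationship found.")
else:
    print("Differences stand out above the noise: maybe a real relationship.")
```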


This is getting long. I'm going to stop here. I've got two more posts on CER in my drafts but history has shown I'm awful at following up on promised posts.




-------

1: The flipside is the argument that more attention to persuasion will lead to better arguments. I'll write more on this next time. (EDIT: This footnote doesn't go anywhere but I'll keep it to remind me.)
2: I hadn't heard of the Education for Thinking Project until I googled her for a link. Looks right up my alley.

Friday, April 6, 2012

Claim Evidence Reasoning

By far, the biggest shift in my teaching from year 1 to year 7 has been how much emphasis I now place on evaluating evidence and making evidence-based claims.

I blame inquiry. Not inquiry in the generalized, overloaded, science teaching approach sense. Just the word. "Inquiry."

Even now, when I hear the word "inquiry," I still think mainly of asking questions and designing experiments. A bad side effect of thinking this way was that I would spend far too much time having students ask questions and design experiments and very little time evaluating evidence and generating claims. On good days, when we miraculously got cleaned up before the bell, I might have spent 10 minutes at the end of class telling students what they were supposed to have figured out, and students would answer questions like, "Explain how you know mass is conserved in a chemical reaction." (Answer: Because you just told me by asking that question.)

We were very busy and very engaged and learned very little.

There are a few structures I've been using to help shift the focus of the class to analysis and argument. One of them is the claim-evidence-reasoning framework. I'm not sure who the originator was, but most of what I do came from Katherine McNeill, who has published a ton on it. She wrote a book too.1


Claim-Evidence-Reasoning (pdf and pdf) is a framework for writing scientific explanations. Occasionally I use it as a probe in the style of Page Keeley. After most labs, students are asked to write about a paragraph's worth in this format, and a more extended version when we wrap up a unit.

As part of their lab handout, they get a prompt that looks like this:

[image: C-E-R prompt from the lab handout]

As the year goes on I remove most of the scaffolds until ultimately the students just get a prompt or question.

I've been happy with it. There's a fourth step, Rebuttal, which I've never gotten off the ground. 

I like frameworks a lot. I like having specific language I can refer to over and over. In a typical initiate-respond exchange, I follow up by asking students for their reasoning. When students want to whine or talk back to me, I get to ask them, "What is your evidence? How does that evidence support your claim that I'm the most annoying teacher in the world?" (Evidence #1: The fact that I ask that question.)

The key to implementation is that the structure of the class really has to be designed around C-E-R. Even if I'm just direct teaching something, I need to model how to think about the evidence that led to the claim.


I also give my students a whiteboard format now. It usually looks like this:

[image: whiteboard format]

I used to structure it in Claim-Evidence-Reasoning order but I realized students would then write their claim before they got their evidence.

McNeill uses C-E-R for essentially everything. Any major idea can be written up in that format. I think the framework works best when students truly have something to argue about. Of course, I'd say that all learning works best when students have something to argue about.

I've had a lot of success with using it when students are constructing the kinetic molecular theory or deciding if mass is conserved in a reaction or not. Why things float or sink was fun. "Is Pluto a planet?" is a good one. A lot of Dr. McNeill's papers are written around a multi-week unit designed around the question, "Are soap and fat different substances?"

I've been happy with the results. We used to spend all of our time just generating data. Now that data is being put to use.




----------------------

For more on writing, I show up in an interview in ASCD Education Update titled Improving Student Writing Through Formative Assessments. It's only for ASCD members but most of what I have to say is in Managing Feedback.


Addendum: Kirk shared a free NSTA article called Engaging Students in the Scientific Practices of Explanation and Argumentation. Berland and Reiser in particular have written a number of good articles on the topic.

1: I can't fully recommend the book. It's pretty basic, and you can find most of the information by googling around for her various research papers. If you're not interested in dealing with that, you might want to give it a go.