About a week ago, I asked Twitter the following question:
As expected, faculty meetings aren’t a beloved institution for teachers. But there are bright spots:
I, too, have hopes that faculty meetings can be useful and even inspiring gatherings of colleagues, and today I have just a bit more proof of how that is possible.
Our school participates in the College and Work Readiness Assessment (CWRA). The CWRA is a standardized test designed to measure those 21st Century critical thinking skills we always talk about, and it does it with a real world performance task that pushes you to synthesize a bunch of different sources of information to analyze an argument.
My school uses this test to get a very basic measure of how we are doing at teaching these skills and to begin to assess the value we are adding. Since the CWRA is changing its format a bit, and we haven't discussed it in a while as a faculty, our academic dean decided it would be a good idea to devote part of a faculty meeting to this topic.
And we did this not through death by PowerPoint but simply by taking the sample task that you can download from the website. After 30 seconds of introduction, he asked the faculty to start working on the task as if they were students in the middle of taking it, and it didn't take long for the faculty to break into deeply engaged small-group discussion. After about 15 minutes of discussion, our academic dean asked the faculty to switch gears and use the whiteboards to write out answers to two questions:
- (Skills) What critical thinking skills are required to answer this task successfully?
- (Validity) To what extent is this a valid measure of those skills?
I was amazed by how quickly the 20 whiteboards around the room were filled with all of the same skills and a recognition of the validity of this test. When we began to discuss our thoughts about the test, faculty quickly saw the value of collaboration in working on this task (something not permitted on the test itself, but a real world skill we want our students to have), and the need for our students to do more interdisciplinary work like this. All of this took only about 40 minutes of meeting time, and I think this exercise left all of us hungry to do more work around this assessment and thinking about our curriculum, which is a pretty exciting outcome of a faculty meeting, in my opinion.
And of course, I don’t think you actually need to participate in the CWRA to have this conversation with your faculty—this is just one way to assess student learning in a macro way, and it’s certainly better than many of the other high-stakes tests out there I read about. Simply get your faculty or department to spend 20 minutes taking this task, and see where the discussion takes you. You might even try it out with a small group of students to get a sense of how your own students wrestle with the question.
We write comments 3 times a year and give grades at the quarter and end of the semester. One of the things I dislike about this process is the tremendous lag between my writing a comment or determining a grade and a student actually getting that feedback, and this has been true at every school I’ve worked at. It’s not uncommon for comments to take a week or more to get proofread, and then it takes time to get them out to parents, who then pass them along to kids. Often, by the time kids get their comments, weeks have passed since they were written, which doesn’t do much to close the feedback loop or give students timely feedback that can benefit their learning. And of course, I dislike quarter grades themselves: a single number attempting to summarize what a student has learned, when we are really in the middle of the game and actually need to be having a conversation about learning rather than summing it up.
So I’ve tried to address both of these concerns in the small ways that I can. One thing I do is send my comments and/or grades to the student via email as soon as I finish them, and I usually enclose them in an email inviting them into a conversation. One side benefit of this is that I get one more set of eyes to proofread my comments, but the real reason I do it is to continue a conversation about learning with each student.
But sending emails to 50 students can be quite tedious. Luckily, I have an awesome tool in my arsenal that makes this a cinch—TextExpander. TextExpander lets you expand short abbreviations like wwork into longer strings of text, like my work email address. But it is so much more powerful than that. I can also create fill-in fields, so that when I type the expansion shortcut, I’m prompted to enter values for the fields that are incorporated into the text.
So now typing a long email with a grade and questions I’d like the student to reflect upon is as simple as typing “1qtr” and filling in the appropriate fields. I can even add a field for a quick personal note to each student, which means I can send personalized emails to 50 students in a little less than an hour. Not bad.
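For anyone who'd rather script this than use a Mac app, the same fill-in-the-fields idea can be sketched in a few lines of Python. Everything below (the template wording, student names, grades, and notes) is invented for illustration:

```python
# A sketch of the fill-in-fields idea behind my grade emails.
# The template text and all student data below are invented.
TEMPLATE = """Dear {name},

Your grade for the first quarter is {grade}.

As you look back on the quarter, what are you most proud of,
and what is one thing you'd like to improve? {note}
"""

students = [
    {"name": "Alex", "grade": "A-", "note": "Great work on the energy lab."},
    {"name": "Jordan", "grade": "B+", "note": "Your write-ups keep improving."},
]

def build_email(student):
    """Fill in the template's blanks for one student."""
    return TEMPLATE.format(**student)

for s in students:
    print(build_email(s))
```

From here, actually sending the messages is just a matter of handing each string to your mail client.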
And the payoff is huge. I’m convinced that there is great benefit to students engaging their teachers in 1-1 conversations, even if only in email. I’ve gotten incredibly thoughtful replies from students that give me real insight into their learning and are disarmingly honest, like this:
Last week we had the great honor of having PER legend Eugenia Etkina visit our department for the day to coach our faculty and talk about physics teaching. It was an incredible day.
Immediately upon arriving, Eugenia started observing classes, and by the time she’d seen one of my classes she had two great points.
In my intro physics class, we had gotten off on a tangent about whether air would have a higher or lower specific heat than water. I was trying to elicit arguments from the students, asking them to back them up with evidence, and then trying to compare them. One of the arguments revolved around comparing the mass of a water molecule and a nitrogen molecule (the main constituent of air). Without thinking too much about the final answer, I was sort of hoping that this student would help us to see that there are many more air molecules in a given mass than there are in an equal mass of water. But in our calculation, we saw that the molecular mass of water is 18, while the mass of a nitrogen molecule is 28. Puzzled, I realized I’d led us into a bit of a dead end, and so I admitted my confusion and set the students to work on a lab to measure the specific heat of zinc and copper. Not my finest moment. However, afterward, Eugenia reminded me that it is true that water molecules have less mass than nitrogen molecules, and this is exactly why we have clouds, and why water vapor remains suspended in the air. It was a flash of insight that helped me see just how much I was missing when I was trying to drive the class toward a “correct” justification that air has a lower specific heat than water.
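Her observation is easy to check with a bit of arithmetic: because a water molecule (molecular mass 18) is lighter than an N2 molecule (28), a gram of water actually contains more molecules than a gram of nitrogen, not fewer. A quick sketch:

```python
# Comparing molecules-per-gram for water and nitrogen.
AVOGADRO = 6.022e23  # molecules per mole

M_WATER = 18.0     # g/mol, molecular mass of H2O
M_NITROGEN = 28.0  # g/mol, molecular mass of N2

def molecules_per_gram(molar_mass):
    """Number of molecules in one gram of a substance."""
    return AVOGADRO / molar_mass

ratio = molecules_per_gram(M_WATER) / molecules_per_gram(M_NITROGEN)
print(round(ratio, 2))  # 28/18, about 1.56
```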
When Mark and I first debriefed with Eugenia, she jumped right into some direct and honest feedback: “You guys like Socratic dialogue way too much.” She explained that we were walking students through the arguments and lab setups we had devised in our own heads, when what we need to be doing is having students construct those arguments for themselves. She then told us that the group design discussion we were leading for the lab to measure specific heat (“What will you measure? How will you measure it?” etc.) was a waste of time, and that we should just have students design the lab.
At this point, I was skeptical. My students were struggling with designing experiments—they didn’t see the reason for the decisions we were making and often seemed to crave step-by-step instructions (which we did not provide). How would they be ready to design experiments for themselves? “Your students are ready,” Eugenia said, and so in the second section of my intro physics class, I simply wrote this question on the board: “Devise an experiment to measure the specific heat of Zinc or Copper.” I asked students to write up their proposed experiments on whiteboards, and here is what I got.
- Heat a copper rod by burning peanuts under it. You could then measure the temperature difference, and since you know each peanut has approximately 5 kcal, you can find the specific heat.
- Heat a copper rod by putting it in boiling water and then dropping it into room-temperature water.
- Put room temperature copper into warm water.
- Put 10 grams of each substance into a bunsen burner, and measure the time that it takes to raise the temperature of each substance by 10°. Then the ratios of the times are the same as the ratios of the specific heats.
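The boiling-water design above reduces to a standard calorimetry calculation. Here's a sketch with invented measurements (only the specific heat of water is a real constant):

```python
# Calorimetry for the "hot copper dropped into cool water" design.
# All measured values below are invented for illustration.
C_WATER = 4.18    # J/(g*K), specific heat of water

m_copper = 100.0  # g, mass of the copper sample
m_water = 200.0   # g, mass of water in the cup
T_copper = 100.0  # deg C, copper starts in boiling water
T_water = 20.0    # deg C, water starts at room temperature
T_final = 23.5    # deg C, final shared temperature

# Energy gained by the water equals energy lost by the copper:
#   m_w * c_w * (T_f - T_w) = m_cu * c_cu * (T_cu - T_f)
c_copper = (m_water * C_WATER * (T_final - T_water)
            / (m_copper * (T_copper - T_final)))
print(round(c_copper, 2))  # about 0.38, close to copper's accepted 0.385 J/(g*K)
```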
These experiments were incredible—students were connecting to some of our past understanding of the energy content of peanuts, as well as much of the work we’d done with heat transfer, to talk about why you would want to work with metal in very small bits rather than one large cylinder. Afterward, we had a discussion of the strengths and flaws of each experiment, and students were able to carefully refine their designs until they came away with an experiment that was very similar to the one I had intended to have them do, but so much better, because they had figured out the rationale for each design decision and had a far better grasp of how each decision they made would affect the overall quality of their measurement.
And of course, their engagement in this task was higher than almost any previous experiment we’ve done. It was a complete turnaround.
Here’s the puzzling thing to me. This idea that students are ready to design their own experiments is something I’ve known for years. We’ve built it into our curriculum many times, and we even started the intro class by having students design their own experiments to measure the energy content of peanuts and batteries. So why do we later turn to offering more scaffolding and, as Eugenia described it, “giving them a cookbook lab via discussion”? I’m not completely sure—I think it has something to do with feeling that when we are doing an experiment that is more than just exploration, students need to follow some sort of proper procedure that I’ve designed and given to them. I now see the error of my ways.
Wrap up time
One thing I’ve always done with my classes is tried to maximize the amount of time students are working and thinking about physics. This means that I often just have students working right up until the end of class, when I’ll then shout a quick reminder of the work I’d like them to do that evening.
Eugenia re-emphasized to me the importance of having students go back and reflect on what they learned that day, and offered a number of questions one could ask to help prompt this reflection:
- What did you learn today?
- How did you learn it?
- Why is it important?
Eugenia also reminded me of research into human learning showing that having students try to summarize what they learned is a powerful tool for understanding and helps form much stronger connections in their brains. So I’ve set my phone to remind me when there are 3 minutes left in class, and I take that opportunity to ask the class to try to summarize our learning.
Overall, it was a fabulous day that reminded me of the need to constantly seek out new and honest feedback on our teaching. Eugenia is a gifted teacher who had a major impact on my teaching with just a few questions.
This also reminds me of the virtual coaching project from a few years back. Maybe it’s time to get this idea going…
I’m in the market for new ideas for a paradigm lab for the unbalanced forces unit, so I turned to Twitter, and as usual, the awesome physics teachers of the Twittersphere did not disappoint.
I’ve put all of the responses in the Storify post below. I’m thinking I will probably stick with carts and springs for one more year, but I’m also seriously considering just opening things up and having students design their own experiments.
My Intro Physics class started the year with energy, and we incorporated the possibility of heat transfer between the system and surroundings from the very beginning. Our first experiment was to measure the energy content of a battery and a peanut. I’m going to be writing much more about this very soon.
Since then, we’ve been studying heat transfer more closely, and I’ve reached a bit of a mystery on an experiment I’d like some help with. We’ve already done a few experiments to compare the thermal conductivities of different materials, and now we are measuring the conductances of various building materials by constructing these light boxes—a 100 W lightbulb surrounded by 5 different types of building materials (wood, plexiglass, various types of foam insulation, and sheet metal—zinc-coated steel).
The mystery comes when we try to measure the conductance of the sheet metal. We are measuring the temperature of the air inside the box and of the inner and outer walls of the sheet metal using Vernier Surface Temperature Probes taped to the plate with white masking tape. The inner wall reads 7.1 degrees warmer than the outer wall.
To calculate the heat flowing through the plate, we need the thermal conductivity of steel (k), the area of the plate (A), the thickness (L), and the temperature difference (ΔT): Q = kAΔT/L.
Putting all of this together, I get a heat flow of roughly 54,000 W. This is obviously wrong, since we’ve only got a 100 W lightbulb inside the box. How can 54,000 W of energy be flowing through 1 of the 5 faces?
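To see the scale of the discrepancy, here's the conduction calculation with assumed dimensions (the area and thickness below are my illustrative guesses, not the actual measurements; only the 7.1-degree difference comes from the experiment):

```python
# Heat conduction through the sheet-metal face: Q = k * A * dT / L.
# The area and thickness are assumed values for illustration only.
k_steel = 50.0     # W/(m*K), typical thermal conductivity of steel
area = 0.15        # m^2, assumed area of one face of the box
thickness = 0.001  # m, assumed ~1 mm thick sheet metal
dT = 7.1           # K, measured inner-to-outer wall temperature difference

Q = k_steel * area * dT / thickness
print(round(Q))  # tens of thousands of watts, wildly more than a 100 W bulb
```

With any plausible dimensions, the formula predicts a heat flow hundreds of times larger than the bulb can supply, which is exactly the mystery.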
One explanation my colleague Mark Hammond came up with is that the surface temperature sensors aren’t reading just the temperature of the surface of the metal. The inside sensor is reading higher than it should because it is also partially reading the temperature of the air. Likewise, the outside sensor is partially measuring the temperature of the outside air, which is lower than the outside surface of the sheet metal. Both of these effects would combine to produce a larger temperature difference than the actual value and skew us toward a larger heat transfer value. Still, it doesn’t seem like this could account for how wildly off we are.
Shouldn’t it be that, once we are in steady state, the total heat transfer through the 5 faces should be close to 90 W, which is roughly the rate at which thermal energy is being produced by the 100 W lightbulb?
I’d love any ideas you might have for how to resolve this mystery.
The buggy lab is a staple of modeling and many other physics classes. Take a bunch of tumble buggies, modify a few to run slow (short one of the batteries by wrapping it in foil), and then send students off to devise an experiment to measure the velocity of their particular buggy. You can read a more thorough description of this experiment on Kelly’s blog.
I want to describe a particular variation on this lab I’ve created that helps students to see the power of computational thinking. Often, I’ve concluded the buggy lab by asking students to figure out, on the basis of their whiteboards, which buggy would win a race. Then, after the class reaches agreement, we test it. This is a great question I got from Frank Noschese that really pushes students to begin to put their newfound CVPM knowledge to use. But I’d like to push this even further and get students to make their first foray into computational modeling.
I’ve written a lot about computational thinking and modeling before, and I’ve sometimes had students modify a ready-made program so that the motion of the buggy on the screen matches the motion of their actual cart. This only takes two lines of editing (the position and velocity statements), and I think it shows students that they can begin to understand a fairly complicated program just by diving in and playing around.
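Graphics aside, the heart of such a buggy program is just a constant-velocity update loop. In this pure-Python sketch (the numbers are made up), the two marked lines are the ones a student would edit:

```python
# The core of the buggy model, minus the VPython graphics.
dt = 0.1  # s, time step

position = 0.0   # m   <-- student edits this to match their buggy's start
velocity = 0.25  # m/s <-- student edits this to their measured velocity

# Step the model forward for 10 s of simulated time.
for step in range(100):
    position += velocity * dt  # constant-velocity position update

print(round(position, 2))  # 2.5 m down the track after 10 s
```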
Still, creating a computer model of a single cart doesn’t really do much to highlight the power of computational thinking. What if, instead, we could somehow run a race of all of the carts together on the computer? Given that the buggies have a tendency to curve, which often makes a race impossible, being able to create a simulated race where the buggies move in straight lines is a clear example of building a model that disregards unimportant features (the curving motion) and allows us to predict and visualize a race we might not even be able to carry out in real life.
I don’t really want to force students to learn how to make 8 objects move in their program on the second day of class when many of them have no previous exposure to programming.
What if there were a way to automate this? What if I could write a program to take the student’s changes from their individual buggy code and merge them into a single program to simulate the race? Now that would highlight computational thinking.
Here’s how I managed to get this working.
- Students modify their code to simulate the motion of the buggy they studied.
- Students then cut and paste the two lines of code they modified into a google form.
- A python program running on my machine downloads the form entries from all of the students and then writes a python program that models the race.
Here’s a video of the merged program, which recreates the original experiment with all of the carts on the same table.
Here’s a video of the race program—the carts have all been turned to move in the same direction and start from the origin.
How this works
Thanks to Josh, I learned about the awesome googlecl (Google command line) interface, which lets you do almost anything you can do with Google apps on the command line. In this case, I’d like to access and download the form data from my Google form into a csv file.
To do this you must first install googlecl, which also requires you to install the Google Data python client, gdata. This took a bit of fiddling, as I found some incompatibilities between various versions of googlecl and gdata, but eventually I found that googlecl-0.9.13 and gdata 2.0.14 are compatible.
With this installed you can now have python make a system call along the following lines:
os.system('/Applications/googlecl-0.9.13/build/scripts-2.7/google docs get CVPMProgramResponses --format csv /Users/jburk/Dropbox/_archive/_Teaching/_SAS\ Teaching/Courses/Honors\ Physics\ 13-14/01-CVPM/Python')
This simply tells the OS to run the googlecl app, which gets the responses from the google doc CVPMProgramResponses in csv format and then saves the file on my machine.
After that, I wrote a python program to generate the two programs. I’ve put all of the necessary python files in this github repo. Be forewarned—this is mostly hacked spaghetti code I wrote to get the program working like I wanted.
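In spirit, that script is a small code generator: it reads each student's two edited lines from the form's CSV and stitches them into one program, renaming each cart along the way. A simplified sketch (the column names and sample rows below are invented; the real script is in the repo):

```python
import csv
import io

# Simplified sketch of merging student form responses into one race program.
# The CSV layout and sample responses below are invented for illustration.
FORM_CSV = '''name,position_line,velocity_line
Alex,"cart.pos = vector(-1.0, 0, 0)","cart.velocity = vector(0.3, 0, 0)"
Jordan,"cart.pos = vector(-0.5, 0, 0)","cart.velocity = vector(0.2, 0, 0)"
'''

def build_race_program(csv_text):
    """Combine every student's two edited lines into a single program."""
    lines = ["# auto-generated race program"]
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text))):
        # Rename each student's cart so the variables don't collide.
        lines.append(row["position_line"].replace("cart", f"cart{i}"))
        lines.append(row["velocity_line"].replace("cart", f"cart{i}"))
    return "\n".join(lines)

print(build_race_program(FORM_CSV))
```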
A gentle introduction
My hope is that this will be a short 5-10 minute introduction to VPython. Students will see that they can modify large programs and observe how easy it is to manipulate objects in VPython. They’ll also get a taste of the real power of computing when we merge all of our programs into a single program and create a more complete simulation that they can then play with and analyze.