Friday, May 02, 2008

Homework project progress

Grades are in (and so far have generated remarkably little student angst), so now the homework project shifts from teaching to research. I see that I haven't posted about this project here, so here's a link to a post about it on my research blog, RRResearch. Basically, my ~400 introductory biology students were split into two homework groups - one group got homework with multiple-choice questions to answer, and the other had to provide written answers (one sentence to one paragraph in length).

I'm working with a teaching fellow in our university's Carl Wieman Science Education Initiative (CWSEI); we're addressing two questions. First, does having to generate written answers and explanations improve students' understanding of course content? This will be assessed by comparing the scores of the two groups on different parts of the final exam. Second, do the homework writing and the feedback students get on it improve their ability to write clear and correct sentences and paragraphs? This will be assessed by scoring the quality of their writing on written-answer exam questions and on other components of the course. For most of the writing we'll only be looking at basic errors in grammar, spelling, punctuation, syntax, etc.

Now that the exams have been graded we have the data to answer the first question. I've just done some preliminary mean-calculating and graphing, but I'm not going to describe the results here yet, partly because these results need careful checking (I could have made yet another Excel error), and partly because I need to first discuss research-blogging issues with my teaching fellow partner in this project.
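(Purely as an illustration, not the actual analysis: the kind of first-pass comparison I mean, done in Python instead of Excel, might look roughly like the sketch below. The file name, column names, and group labels are all made up.)

```python
# Minimal sketch (hypothetical file and column names, not the real analysis):
# compare mean final-exam scores of the multiple-choice vs written-answer groups.
import pandas as pd
from scipy import stats

scores = pd.read_csv("final_exam_scores.csv")  # hypothetical: columns 'group', 'score'
mc = scores.loc[scores["group"] == "multiple_choice", "score"]
written = scores.loc[scores["group"] == "written_answer", "score"]

print("Multiple-choice group mean:", mc.mean())
print("Written-answer group mean: ", written.mean())

# A Welch two-sample t-test as a rough first look at the difference between groups
result = stats.ttest_ind(written, mc, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3g}")
```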

We can't answer the second question yet because the students' writing hasn't been scored. Luckily we don't have to do this ourselves; the CWSEI has given us funds to hire assistants to do this. The assistants will be Biology grad students, but we need to first check that the students we hire have good enough English skills to catch all of the students' errors. Our first idea was to put together a small set of error-filled student writing and ask potential assistants to grade it with the rubric that was used for grading the homework answers. We've now polished the rubric to make it better for this new purpose. But in the meantime we realized that we probably weren't the first researchers needing to assess basic writing skills, and that our research would have more credibility if we assessed our assistants using tools that had been previously validated. So this morning I called our Writing Centre, which provides a number of non-credit courses to improve students' ability to write in various contexts (Language Proficiency Exam, term papers, etc.). The helpful director suggested I call the English Department's first-year program, which she thought had a test they had previously used to assess potential tutors. I'm waiting to hear back from them.
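(A rough illustration of the screening idea, entirely hypothetical and not the test we'll actually use: if we plant a known set of errors in the sample writing, one simple check is what fraction of them a candidate marker flags.)

```python
# Hypothetical screening check: what fraction of planted errors does a candidate catch?
# 'reference' is the set of error IDs planted in the sample; 'flagged' is what the candidate marked.
reference = {"run_on_1", "spelling_2", "agreement_3", "comma_splice_4", "fragment_5"}
flagged = {"spelling_2", "agreement_3", "fragment_5", "false_alarm_6"}

caught = reference & flagged
recall = len(caught) / len(reference)   # share of planted errors the candidate found
false_alarms = flagged - reference      # items flagged that weren't planted errors

print(f"Caught {len(caught)}/{len(reference)} planted errors (recall = {recall:.0%})")
print("Missed:", sorted(reference - flagged))
print("False alarms:", sorted(false_alarms))
```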

1 comment:

  1. I am curious if you found a significant difference between the groups... Did your colleague advise against posting about your results here?

    Did you find a test for your markers?
