Tuesday, March 15, 2011

The 'calibration' part works!

My students have finished their Calibrated Peer Review assignment, and now I'm discovering the wealth of instructor resources for interpreting the outcomes and correcting discrepancies.

First is a Student Results page, listing each student's total score on the assignment (text plus reviewing activities), the points earned by the text they submitted, and their Reviewer Competency Index.  The latter is a measure of how well they reviewed the three calibration texts, and is used to modulate the ratings they assigned to the three student submissions they reviewed.  From this screen you can click on any student's name to see all the details of the scores they assigned and received.
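To make the weighting idea concrete, here's a minimal Python sketch.  CPR doesn't reveal its actual formulas, so the RCI calculation and the weighted average below are my own illustration of the general idea, not the system's real code.

```python
# Illustration only: CPR's real scoring formulas aren't published here.
# Assumes an RCI in [0, 1] reflecting how closely a reviewer's calibration
# ratings matched the instructor's, then used to weight the peer ratings.

def competency_index(calibration_ratings, instructor_ratings, max_rating=10):
    """Hypothetical RCI: 1.0 for perfect agreement on the calibration texts."""
    errors = [abs(r - i) for r, i in zip(calibration_ratings, instructor_ratings)]
    return max(0.0, 1.0 - sum(errors) / (len(errors) * max_rating))

def weighted_text_score(peer_ratings, reviewer_rcis):
    """Average the peer ratings, weighting each by its reviewer's RCI."""
    total_weight = sum(reviewer_rcis)
    if total_weight == 0:
        return None  # no competent reviews; the instructor would step in
    return sum(r * w for r, w in zip(peer_ratings, reviewer_rcis)) / total_weight

# A strong report rated 9 and 8 by competent reviewers and 2 by a reviewer
# with a poor calibration record barely feels the outlier:
print(weighted_text_score([9, 8, 2], [0.9, 0.85, 0.1]))  # ~8.2
```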

Next is a Problems List page.  This again lists the students, this time flagging in red any problems the system noted with their assignment.  Students who failed to complete the calibrations on time are flagged and their Reviewer Competency Index is zero.  Reports that were reviewed by only a single student, or not reviewed at all, are flagged.  So are reports that received discordant reviews (discrepancies exceeding a pre-set tolerance), and reports that were reviewed by students with very low Reviewer Competency Indexes.
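The flagging rules are simple enough to express in a few lines.  Here's a hedged sketch of how checks like these might look; the threshold values and names are mine, only the rules themselves come from the page described above (the discordance tolerance really is pre-set, though I'm guessing at its scale).

```python
# Hypothetical version of the Problems List checks; thresholds and
# names are my own inventions, only the rules come from the post.

DISCORDANCE_TOLERANCE = 3   # assumed pre-set spread allowed between ratings
LOW_RCI_CUTOFF = 0.3        # assumed cutoff for a "very low" RCI

def problem_flags(ratings, reviewer_rcis):
    """Return the red flags a single report would earn."""
    flags = []
    if not ratings:
        flags.append("not reviewed")
    elif len(ratings) == 1:
        flags.append("reviewed by only one student")
    elif max(ratings) - min(ratings) > DISCORDANCE_TOLERANCE:
        flags.append("discordant reviews")
    if any(rci < LOW_RCI_CUTOFF for rci in reviewer_rcis):
        flags.append("reviewed by a low-RCI student")
    return flags
```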

Only two reports had seriously discordant reviews, and rather than being evidence of problems with CPR, both of these demonstrate how well the CPR system works.  The first was a very good report that received one very low score because the reviewer had given too much weight to very minor flaws.  But because this reviewer had also performed badly on the calibration reviews, they had a low Reviewer Competency Index and this poor review didn't drag down the good report's score.  The second problem report also had one very low score and two good scores.  But this time the low score was from a very competent reviewer, and when I read the report I discovered strong evidence of plagiarism, which would certainly justify that low score.  (The other two reviewers evidently assumed that the professional writing was the student's own work.)

The third set of resources is on a Tools page.  Here you can change deadlines for individual students to allow them to submit after the original deadline has passed (I've done this for one student), change students' ratings and scores, and have the whole assignment re-marked to incorporate adjustments you've made to a Reviewer Competency Index.  You can also download the complete texts of submissions and evaluations.  You can even access the system as if you were any one of the students (this gets you a caution that any changes you make will appear to have been made by the student).

Overall I'm very pleased with the CPR system.  I've only encountered a few minor problems.  One is that many of my students earned low Reviewer Competency Indexes; I suspect this is because I had so many calibration questions about specific details of the submissions.  Several students had minor problems submitting the various components by their deadlines, I think mostly because they overlooked a final step needed to complete the submission.  The text-only interface didn't seem to cause many problems: some students had trouble with the word limit (because HTML tags are counted as words), but only one student's submission had text-formatting errors.
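To show why the tag counting bites, here's a tiny Python demonstration.  I can't see CPR's actual counter; this just reproduces the behaviour described, treating every HTML tag as a word.

```python
import re

def naive_word_count(html_text):
    """Count whitespace-delimited tokens after isolating tags, so every
    HTML tag counts as a 'word' -- the behaviour the word limit showed."""
    spaced = re.sub(r"(<[^>]+>)", r" \1 ", html_text)  # pad tags with spaces
    return len(spaced.split())

print(naive_word_count("Results were consistent with the hypothesis"))  # 6
print(naive_word_count(
    "<p>Results were <b>consistent</b> with the hypothesis</p>"))       # 10
```

Four tags, four extra "words": enough to push a report that looks under the limit over it.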

Now it's time to start spreading the word around campus about this new resource.  Hopefully there'll be enough interest that UBC will decide to purchase it once our free trial ends.
