Thursday, February 24, 2011

The CPR site counts html tags as words...

The instructions for my Calibrated Peer Review assignment specified that the reports should be between 300 and 500 words long.  A nice feature of the CPR assignment setup is that it polices this - the instructor specifies the acceptable word range for submissions, and submissions that are too short or too long are not accepted.  But a problem has arisen: the CPR submission form accepts only plain text, so even paragraph breaks require HTML tags.

Of course very few of my students will be familiar with HTML tags (it's a bit shocking that I'm more web-savvy than most of them).  I thought I had solved the formatting problem by giving them a link to a web page where they could paste in their formatted text (from a Word document) and have it converted to HTML, which they could then paste into the CPR submission form.

But the CPR form counts the HTML tags as words and, for students who have meticulously formatted their reports in Word, this adds hundreds of pseudo-words that push them way over the word limit.
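I don't know exactly how the CPR form counts words, but if it simply splits the pasted text on whitespace, the inflation is easy to reproduce. Here's a minimal sketch (hypothetical - the tag-stripping pattern and the sample text are mine, not CPR's):

```python
import re

def naive_word_count(text):
    # Split on whitespace, so free-standing HTML tags count as "words"
    return len(text.split())

def html_free_word_count(text):
    # Replace HTML tags with spaces before counting
    return len(re.sub(r"<[^>]+>", " ", text).split())

# Hypothetical snippet of Word-converted HTML
sample = "<p> <strong> Results: </strong> My SNP is rs1234. </p>"
print(naive_word_count(sample))      # 9 - four tags counted as words
print(html_free_word_count(sample))  # 5 - only the real words
```

A heavily formatted report multiplies that gap by every paragraph, heading, and boldfaced phrase, which is how a 500-word report ends up "over" 1000.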

So I've raised the CPR word limit to 1000, but warned students that their final submission (after CPR is complete) will still have to be no more than 500 words.

Friday, February 18, 2011

CPR almost ready to go, surpasses all expectations

The Calibrated Peer Review assignment for my genetics class is almost ready to go, thanks to lots of work by our Faculty of Science technical support person working with the Centre for Learning, Teaching and Technology (or is it 'Teaching, Learning and Technology'...?).

The legal indemnification issue has been resolved; apparently it arose due to a misunderstanding between the two parties about which university was meant by the phrase "the University".  The software has been installed into UBC's elearning system, and the course has been created.

Setting up the student accounts turned out to be a bit of a hassle.  The local CPR setup needs to be told, for each course, the names and IDs of the students who are entitled to create user profiles.  In principle this info should be uploadable from the class list for the course, but for some reason it had to be entered manually (by the tech person, not me).  This problem will need to be resolved by September, as we'll have >400 students in the course then.  (Note added later:  In my Instructor view I see a page for uploading students from a class list file; I don't know if this is what wouldn't work for the tech.  Later: this problem has been solved.)

The students have been sent emails telling them to set up their user profiles (just password, email address and secret question/answer) and complete the introductory tutorial and pretest, and several have already done so. Once they have an account they must agree to the CPR Terms of Use; this requires them to promise to abide by UCLA's Student Conduct Code!  (I couldn't check out the code because the link to it is broken.)

The tutorial and pretest are an excellent feature.  The tutorial takes students through ten pages of information about the different steps of a CPR assignment.  The pages are very clear, with good diagrams and illustrative screen shots.  Once the student has been through the tutorial they take a pretest to check that they've understood how the CPR process works, with 12 Yes/No questions.  The pretest isn't graded, but the system records that the student has completed it.

At this point students are ready to begin their assignment.  Unfortunately mine can't do this yet, because a setup problem won't let me import the assignment information from its home on the CPR Central website at UCLA.  I've finished creating all the assignment components (learning objectives, instructions for the students, calibration essays, review questions, answers and feedback to review questions for each calibration essay).  The local CPR interface asks me if I want to activate a new assignment, and then asks for my CPR Central userID and password so it can connect to the assignments I have there.  But it can't make the connection.

The support tech had her CTLT tech working on this problem yesterday (Later: they're waiting to hear back from the UCLA tech).  If it's solved today (and no other bugs surface) I'll be able to tell the students to submit their drafts by the Sunday midnight deadline.  If it persists I'll have to extend the submission deadline again, and push back the dates for completing the various stages of the assignment.  Fortunately I set up the original dates to have the final polished submission due several weeks before the end of term, so moving to a later deadline won't be a big problem.

Overall I'm very impressed with the high quality of the CPR resources and interface.  Every step has been very straightforward and easy to understand, and the supporting materials for both instructors and students are very well designed.  (Of course, the technical people may think differently...)

Tuesday, February 15, 2011

email to a textbook rep

Dear textbook rep,

I haven't received the brand-new textbook yet (author's name and title redacted), but today I did get the Instructor's Media CD for it, and I've been through all the sets of slides.  I'm afraid it's not at all what we're looking for.

Because the title promises a genomics approach, I was hoping for a text that presented the basic principles of genetics in the context of our new spectacular information about human and other genomes.  Instead it's yet another old-fashioned Genetic Analysis textbook, with no modern genomics at all! 

The first chapter covers molecular biology at a high-school senior/Intro Bio level, and is followed by the standard four chapters teaching classical genetics (Mendel and single-gene inheritance, mitosis and meiosis, linkage and mapping, chromosome structure and behaviour).  It's all the same material that's been taught since the 1960s, with the odd snippet about gene function, such as the molecular defect in Mendel's wrinkled pea allele.  Then come a series of more molecular chapters (DNA replication, bacteria and their viruses, gene expression, gene regulation), all still very classical in the information they present.

Chapter 10 claims to be about genomics.  But what does it contain?  The same old molecular cloning and genetic engineering methods, supplemented with explanations about how microarrays work and how a germ-line transformation is done.  The only genomics is the genome of the bacterium Mycoplasma genitalium, published 15 years ago!  No human examples at all, just flies and fish and rice.

Then more standard chapters with the standard material present in every other textbook: development, mutation and DNA repair, cell cycles and cancer, classical population genetics, classical quantitative genetics.

Students taking an introductory genetics course don't need to learn how to clone genes, they don't need to know what Mendel did, and they certainly don't need to understand how a Southern blot is done.  Even professional geneticists never do Southern blots any more!

More than anything students need to understand their own genomes.  They need to know how inheritance works, and how genes affect phenotypes.  They need to understand natural genetic variation, in their own and other species.  These concepts aren't particularly difficult, except maybe when they're embedded in the baggage of classical genetic analysis.

I'm quite disappointed, as I was hoping that this would finally be a textbook we could use.  But I guess I'll continue to make do without any assigned textbook.

I'm attaching a copy of the lecture schedule for this course, just in case you know of any other textbook that might take a more modern approach.


Sunday, February 13, 2011

Comparing the midterm results

Here's a graph comparing how each student in my pilot genetics course did on the first and second mini-midterms.  The first tested their understanding of the relationship between genotype and phenotype, especially in diploid organisms.  The second tested their understanding of how alleles are inherited through meiosis and mating.  The material on the two tests did not overlap at all.  Both were 25 minutes long, open book, and mostly multiple choice and short answer.

The blue dashed line separates students who did better on the first midterm (dots above the line) from students who did better on the second (dots below the line).   Although many students did worse on the second one, 14 of the 38 did better.

The two dots in the lower-left square are students who failed both tests.  The single dot in the upper-left pink square is the student who failed the first test but passed the second.  The ten dots in the lower-right pink square are students who passed the first midterm but failed the second (some very badly).

We've just entered the marks for each question, but I'm starting to think that the dataset is too small to allow any more useful generalizations.

Saturday, February 12, 2011

Interpreting a grade distribution

Yesterday my genetics students wrote their second 'mini-midterm'.  This was a 25-minute quiz on the material we've covered in the last two weeks.  Everything went smoothly, the papers are graded, and the grades are posted along with the answer key.  But I don't know how to interpret the grade distribution.  Here's the histogram:

The quiz was open book, with five questions that were designed to require some thoughtful analysis but be very easy to mark (grading 38 papers took four of us about 30 minutes).  The questions weren't too difficult - 3 students earned perfect scores, and most had finished before the time was up.  They also weren't too easy - 12 students failed, and the mean score was only 15.8/25 (62%).

We expect grades on most tests to give a 'normal' distribution - the bell-shaped curve.  The peak may be shifted to the right if the exam was too easy, or to the left if it was too hard.  A bimodal curve (with two humps) usually means that the students fall into two groups - those who have acquired some key skills and those who haven't.

But the grade distribution for this quiz looks flat to me, not really a curve at all.  It's flat all the way from 12% to 100%, with no more scores close to the mean than elsewhere.  I've never seen a grade distribution like this before and I don't know how it should be interpreted.  I can find technical descriptions of this shape in the context of a normal distribution (it's 'platykurtic') but I can't find any consideration of what this would imply about either students' abilities or the design of the test.
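One quick numerical check of flatness is excess kurtosis: it's near 0 for a normal distribution and about -1.2 for a perfectly uniform ('flat') one. A sketch with made-up scores (not the real class data):

```python
def excess_kurtosis(scores):
    # Population excess kurtosis: m4 / m2**2 - 3 (0 for a normal curve,
    # negative for platykurtic/flat distributions)
    n = len(scores)
    mean = sum(scores) / n
    m2 = sum((x - mean) ** 2 for x in scores) / n
    m4 = sum((x - mean) ** 4 for x in scores) / n
    return m4 / m2 ** 2 - 3

# Hypothetical flat spread of scores out of 25, roughly 12% to 100%
flat_scores = list(range(3, 26))
print(round(excess_kurtosis(flat_scores), 2))  # about -1.2, i.e. platykurtic
```

A strongly negative value would confirm quantitatively what the histogram suggests by eye, though it still wouldn't say anything about *why* the scores spread out this way.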

Maybe I'll ask my colleague in Curriculum Studies...

CPR update

The original plan for our Calibrated Peer Review assignment was that students would submit their draft assignments by last night.  Next week is Reading Week, a mid-term break with no classes or assignments due, so this would give them a week off before doing their three calibration reviews (one week) and their reviews of three student submissions (the next week).

But getting the CPR system up and running at UBC is taking longer than I had hoped, so I've set back the due date until next Sunday (at the end of Reading Week).

I had anticipated that there might be problems getting the CPR software to work as intended, and had set the original due date with the plan that it could be changed.  However the problem isn't software implementation at all, but the legal license agreement between UBC and UCLA.

The agreement includes an 'indemnity' clause that (I think) protects UCLA from the consequences of anything bad that UBC might do. But UBC's lawyers have provided a standard indemnification clause that isn't as sweeping as the one UCLA specifies.  Any changes to this could require months (years?) of back-and-forth between the legal counsels for the two institutions.

The new version of CPR (CPR4) is the first one to have a local component installed on the user institution's computers - the previous versions all ran entirely on the UCLA system and were remotely accessed by students at other institutions.  So I was concerned that UBC might be the first foreign user, and that all the legal bugs still had to be worked out.

Luckily my UCLA contact assures me that there should be no legal problems, so the local installation is proceeding.  With luck, we'll get all the bugs out next week and be ready for the students next weekend.

Saturday, February 05, 2011

Calibrated Peer Review is 'developmental' for instructors as well as students

I've set aside the original plan of having the students in my new genetics course write letters to the editor about genetics reporting errors, because finding suitable errors turned out to be too difficult for them (but thanks for the suggestions).  The new plan (they voted and chose it) is that they will instead write short reports titled 'My Favourite Human SNP'.  This looks like it will work well both for this small pilot class and for the ~500 students we expect in September, because the pool of SNPs with associated phenotype information is large and growing fast.  The assignment still has the benefit of letting each student choose their own topic, and of being very suitable for Calibrated Peer Review (CPR).

I've already posted on our course-management system a page of instructions to the students about what is expected in this assignment, but now I'm creating the assignment within the CPR system.  Although the CPR interface for the students is run locally (i.e. at UBC) under the new CPR4 system, assignments are generated centrally on the CPRCentral server at UCLA.  This allows the central server to maintain consistency (all authors have to use the same structure) and to provide a library of past assignments that any instructor may use or adapt.

In principle I could have adapted one of the many existing assignments from the CPR library, but none of them were suitable.  Instead I'm writing my own from scratch, using some of the library assignments as models.  Now I'm working through the surprisingly many steps of creating an assignment, guided by the detailed Authoring Guide that CPR provides.  This turns out to be much less daunting than I had expected, and much more enjoyable and educational (for me).

The first step is choosing a title and descriptive information for the assignment.  Because all assignments are put in the open library for others to later adapt, it's more important that this title be informative for future users than that it be the best title for the students.  A suggested student title can be included in the explanatory notes.  The assignment is also given a topic area (mine is Biology - Genetics), keywords, and a user level (mine is Lower-division undergraduate).  This information is used by other instructors but I don't think it's seen by the students.

The next step is writing explicit Learning Goals for the assignment.  I hadn't done this for an assignment before, so thinking through what I wanted the students to learn (guidelines are provided) helped to educate me about the value of setting such goals as well as providing the students with clear expectations (students see these at the top of the assignment page).

Writing the Learning Goals was the first place where the lack of formatting power raised its ugly head.  The CPR interface accepts only plain text, with or without HTML tags (e.g. you have to manually insert a <br> tag wherever you want a line break).  I expected this to be very frustrating, but the combination of a handy reference page of HTML tags and my kindergarten-level HTML skills let me format lists of points and boldface headings without a hitch.  However I anticipate that most of the students will have more difficulty with this - at a minimum they'll have to put in the line breaks.  (Yes, I know paragraph breaks are better, but if you don't have any idea how HTML tags work, line breaks are easier to understand.)  I'll need to create a short example page for them on Vista showing how to do this.  They can also just use a web page I've found that lets them paste their Word-generated text into a box that converts it to HTML.
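For the curious, a converter like the one I'm pointing the students at doesn't need to be fancy. A toy version (my own sketch, not the actual web page's code):

```python
def text_to_html(text):
    # Blank-line-separated chunks become <p> paragraphs;
    # single newlines inside a chunk become <br> line breaks.
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    return "\n".join("<p>" + p.replace("\n", "<br>") + "</p>" for p in paragraphs)

draft = "My SNP affects eye colour.\nIt is common in Europeans.\n\nHere is why I chose it."
print(text_to_html(draft))
```

The real converter handles bold, italics, and lists too, but the paragraph-and-break conversion above is the part every student will need.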

The next steps create the resources the students should use to carry out their assignment (to generate the reports that will be assessed by their peers).

I. Guidance for Studying Source Materials:  This tells students how they should gather the information and understanding that will go into their report.  It has two parts, some text describing the source materials (handouts, textbook, articles, etc) and then some hyperlinks to web pages.  Here I just used modified text from the first part of the instructions I'd already posted for my students. I didn't initially notice the second part of this section, so I hand-coded the hyperlinks into my text section - that worked fine.

II. Guidance for Writing Your Text:  This tells students what their reports should say and how this should be presented.  Here I used modified text from the second part of my posted instructions.

III.  A 'writing prompt'.  This appears above the text entry box and gives the student specific instructions about format and text entry.

Now comes the big task of preparing the 'calibration essays'.  Three of these are required; each student will evaluate these using the series of rubric questions that you create in the next step.  One calibration essay is tagged 'high quality', one 'middle quality', and one 'low quality', but they can differ along several axes, with the caution that students will have a hard time evaluating the content of an essay with too many writing errors.  I had already decided on the SNPs I would write these about, but they're only about half done right now so I'll write more about these later.

The next step is creating the series of questions that students will use to evaluate the essays.  The interface makes this quite easy; it lets you specify what kind of answer is expected (yes/no; none/some/many; or A/B/C, where you specify what A, B and C mean) and whether the student is expected to enter some text in a box.  I had some questions in mind from the original instructions I'd given the students, and I thought of more while I was working on the calibration essays.  Now I have 17 questions, which I suspect may be too many, even though most of them are very simple: "Does the report contain spelling errors? (none/some/many)"; "Does the report say how common the phenotype of interest is? (yes/no)".  I'll no doubt refine these once I'm into the next step.

The final big step is, for each evaluation question, designating the correct answer for each of the three calibration reports and writing a brief explanation of why this is the correct choice in each case.  This is going to take a while, and I can't begin until I finish writing the calibration essays.  But I'm looking forward to it, because I can see how valuable it will be.
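To keep the structure straight in my head: each of the three calibration essays needs a designated answer (plus a brief explanation) for every evaluation question. A sketch of the bookkeeping, with illustrative entries (the question wording is from my rubric above; the answers and explanations here are made up):

```python
# Evaluation questions, each with its expected answer type
questions = {
    "spelling": ("Does the report contain spelling errors?", "none/some/many"),
    "frequency": ("Does the report say how common the phenotype of interest is?", "yes/no"),
}

# For each calibration essay, the designated answer and a short explanation
answer_key = {
    ("high quality", "spelling"): ("none", "The report is carefully proofread."),
    ("high quality", "frequency"): ("yes", "It gives a population frequency for the phenotype."),
    ("low quality", "spelling"): ("many", "Several words are misspelled."),
    ("low quality", "frequency"): ("no", "No frequency information is given."),
}

print(answer_key[("low quality", "frequency")][0])  # "no"
```

With 17 questions and three essays that's 51 answer-plus-explanation entries to write, which is why this step can't start until the essays themselves are finished.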

So the above is a lot of detail - what's the big picture?  UBC's learning technology people agreed to set up this CPR trial because they saw CPR as much more 'developmental' for the students than our existing peer-review options (iPeer and the peer review component of Turnitin).  I agree, but I'm finding that the experience of creating the assignment is also developmental for me - I'm being gently led through a series of steps that greatly improve the learning experience I'm providing for my students, with instruction at each step so I see both why the step is valuable and how best to implement it.