Tuesday, May 13, 2008
In previous years my courses used versions of the WebCT course software, but this year they were 'upgraded' to the Vista platform; WebCT has been bought out by Blackboard, so this is now a Blackboard system.
Some new 'features' of Vista have worked very well (for example, creating and assigning homework to sub-groups of students), but others (and some old stuff) have not. I've been keeping a list of the issues that our local tech support people didn't solve, hoping that someone would someday ask me about the problems I've experienced. But nobody's asked, so I'm posting the list here (mainly so I can throw out the sheet of paper it's written on).
First, a complaint about the technical support. The problem isn't the support people, who do their best, but the way UBC allocates resources. Years ago UBC decided to transfer funds and responsibility for computer support to local units (Faculties and Departments), rather than providing it centrally for everyone. I don't know whether this was driven by budget issues (dreams of 'cost recovery' were in the air) or by the hope that local support would better suit users' needs, but the consequences for university-wide resources such as Vista have been disastrous. Instead of a central support group with experts available when we need them, we have many dispersed department-level support people, each working part-time (ours is only available after 4 pm), providing support for a system they haven't been able to learn in depth. Worse, each answer they provide is available only to the individual who asked the question. Because there is no discussion board, individual users have no way to learn from answers given to anyone else.
So here's my list:
I can't connect to Vista with Safari on my office computer. I can connect with Safari at home, and with Firefox in my office, but when I try to connect with Safari at work I get a message that I already have a connection and can't have two. And yes, I've emptied the Safari cache, and tried quitting Firefox.
When I do connect with Firefox, I get several problems that I didn't get before I upgraded to Leopard.
First, every time I connect I get two 'Code not verified' security warnings, and I can't find any way to stop them from appearing.
Second, when I try to upload files to Vista from my computer, I get a warning that a Java applet isn't working and that I will have to use a more cumbersome method.
Third, when I download results from quizzes, all of the question marks, percent symbols, and single and double quotes in students' answers have been replaced by 'Unicode' character codes (every '?', for example, comes through as a code like '&#63;').
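Until this is fixed properly, a few lines of Python can clean up a downloaded file. This is just a sketch, assuming (as it appears) that the codes are standard HTML numeric entities like '&#63;' and that the download is a plain-text file such as a CSV; the file names are placeholders:

```python
# Sketch of a cleanup script. Assumes the downloaded results file is
# plain text (e.g. CSV) and the codes are standard HTML entities;
# the file names are placeholders, not anything Vista produces.
import html

with open("quiz_results.csv", encoding="utf-8") as f:
    text = f.read()

# html.unescape turns &#63; back into ?, &quot; into ", etc.
cleaned = html.unescape(text)

with open("quiz_results_clean.csv", "w", encoding="utf-8") as f:
    f.write(cleaned)
```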
Other problems may not have anything to do with using Leopard:
For a while I couldn't upload student marks for one segment of the course. I'd get a 'System exception' error, which apparently just means that for unknown reasons Vista has failed to do what was asked of it.
When I click on View Submissions for an assessment in the Assessment Manager, sometimes I'm taken to the submissions for that assessment, and sometimes I'm instead taken to the submissions for whatever assessment the Manager feels I should be looking at. No rhyme or reason that I can detect.
Sometimes the settings for quiz questions appear to have been reverted, even though I'm quite sure I set them up correctly.
The system claims that it will show me the number of times individual students have read discussion board posts, but the numbers it provides are obviously very wrong.
I've had lots of other problems too; our local support person was able to solve those, but not these.
Friday, May 02, 2008
Homework project progress
Grades are in (and so far have generated remarkably little student angst), so now the homework project shifts from teaching to research. I see that I haven't posted about this project here, so here's a link to a post about it on my research blog, RRResearch. Basically, my ~400 introductory biology students were split into two homework groups - one group got homework with multiple-choice questions to answer, and the other had to provide written answers (one sentence to one paragraph in length).
I'm working with a teaching fellow in our university's Carl Wieman Science Education Initiative (CWSEI); we're addressing two questions. First, does having to generate written answers and explanations improve students' understanding of course content? This will be assessed by comparing the scores of the two groups on different parts of the final exam. Second, does the homework writing, and the feedback students get on it, improve their ability to write clear and correct sentences and paragraphs? This will be assessed by scoring the quality of their writing on written-answer exam questions and on other components of the course. For most of the writing we'll only be looking at basic errors in grammar, spelling, punctuation, syntax, etc.
Now that the exams have been graded we have the data to answer the first question. I've just done some preliminary mean-calculating and graphing, but I'm not going to describe the results here yet, partly because these results need careful checking (I could have made yet another Excel error), and partly because I need to first discuss research-blogging issues with my teaching fellow partner in this project.
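For anyone curious about the shape of that comparison, here's a minimal sketch in Python. The file name and column names are invented for illustration, and the real analysis (once the data are checked) may well use something more careful than a simple t-test:

```python
# Illustrative sketch only: compares final-exam scores between the two
# homework groups. The file and column names are hypothetical, not the
# real course data.
import pandas as pd
from scipy import stats

scores = pd.read_csv("exam_scores.csv")  # columns: student_id, group, exam_score
mc = scores.loc[scores["group"] == "multiple_choice", "exam_score"]
written = scores.loc[scores["group"] == "written_answer", "exam_score"]

print(f"Multiple-choice group: mean {mc.mean():.1f} (n={len(mc)})")
print(f"Written-answer group:  mean {written.mean():.1f} (n={len(written)})")

# Welch's t-test (doesn't assume equal variances in the two groups)
t, p = stats.ttest_ind(mc, written, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")
```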
We can't answer the second question yet because the students' writing hasn't been scored. Luckily we don't have to do this ourselves; the CWSEI has given us funds to hire assistants to do it. The assistants will be Biology grad students, but we first need to check that the people we hire have good enough English skills to catch all of the students' errors. Our first idea was to put together a small set of error-filled student writing and ask potential assistants to grade it with the rubric that was used for grading the homework answers; we've now polished the rubric to make it better suited to this new purpose. But in the meantime we realized that we probably weren't the first researchers needing to assess basic writing skills, and that our research would have more credibility if we assessed our assistants using tools that had been previously validated. So this morning I called our Writing Centre, which provides a number of non-credit courses to improve students' ability to write in various contexts (the Language Proficiency Exam, term papers, etc.). The helpful director suggested I call the English Department's first-year program, which she thought had a test it had previously used to assess potential tutors. I'm waiting to hear back from them.
Thursday, May 01, 2008
3491 questions about Biology
One innovation this year was intended to build students' abilities to ask questions. Before each week's lectures the students had to complete a brief multiple-choice reading quiz (usually about 5 questions) based on the assigned readings for the week. The last question on each quiz (worth 1 point, like the others) asked:
Please give one question about this week's material that you would like to have answered in class. To earn the point your question must be stated as a question in correct English (e.g. "How do birds fly?", not "how birds fly" or "I want to know how birds fly.").
The writing was initially bad, but both the writing and the quality of the questions quickly got much better, and I started posting each week's questions on the course web site for students to read, and using some of them in class.
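(As an aside, the correct-English rule above is concrete enough to sketch as a toy check in Python. This is emphatically not how anything was actually graded, just an illustration that the rule separates the good example from the bad ones:)

```python
# Toy heuristic, not how the quizzes were actually graded: flags
# submissions that clearly aren't phrased as questions (no question
# mark, or no initial capital letter).
def looks_like_a_question(answer: str) -> bool:
    text = answer.strip()
    return bool(text) and text.endswith("?") and text[0].isupper()

for example in ("How do birds fly?", "how birds fly",
                "I want to know how birds fly."):
    print(f"{example!r}: {looks_like_a_question(example)}")
```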
Now I've assembled all of the questions, unedited, into a single Word file titled "3491 questions about Biology", which I'm going to email to the other instructors teaching this course. I'll also post it on my own web site; here's a link.