If the thought of statistics leaves you cold, John Holcomb may be able to help. He prefers an everyday approach to the subject.
Jennifer is a typical introductory statistics student. She is 20, a biology major and, like everyone else in the class, taking statistics to fulfil a requirement for her degree. Jennifer, like most students at Cleveland State University, Ohio, commutes to classes from home and works more than 20 hours a week at a part-time job.
During office hours in the first week, she tells me how nervous she is about taking a statistics course. Mathematics has never been her best subject, she says, and her friends have told her "stats is hard".
In teaching introductory statistics for more than ten years, I have met hundreds of students like Jennifer. Forced to take statistics, they come to class with a great deal of apprehension and misconceptions. It is not their fault. For many years, a great number of statistics courses were taught by requiring students to memorise complicated formulae with little context and motivation.
To help such students, the statistics community has advocated redesigning introductory courses to emphasise the use of real data, computer software, collaborative learning and writing to create a more meaningful course. To this end, I developed, with Rochelle Ruffer of Youngstown State University, Ohio, a sequence of projects that require students to work in teams. Their task is to analyse a large data set on the birth weight of infants and to report their findings.
The data is a random sample of observations from the North Carolina birth registry. The variables describing the infant include sex, weight in total ounces and whether it was a single or multiple birth. Descriptive variables on the mother include weeks of gestation, education level, age, smoking status and drinking status during pregnancy. On completing its investigation, each team compiles a report that is graded 50 per cent on statistical accuracy and 50 per cent on written presentation.
The students complete four projects over the term, each requiring different techniques to extract the story the birth-weight data have to tell. Since the course draws students from a variety of majors, I chose birth weight because everyone knows someone who is, or has at some time been, pregnant.
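The kind of descriptive comparison the projects call for can be sketched in a few lines of Python. This is purely illustrative: the records, field names and values below are invented stand-ins for the North Carolina birth-registry variables described above, not the registry's actual format.

```python
# Illustrative sketch only: a tiny, made-up sample standing in for the
# birth-registry data. Field names ("weight_oz", "smoker") are assumptions.
from statistics import mean

births = [
    {"weight_oz": 112, "smoker": False},
    {"weight_oz": 98,  "smoker": True},
    {"weight_oz": 121, "smoker": False},
    {"weight_oz": 104, "smoker": True},
    {"weight_oz": 118, "smoker": False},
]

def mean_weight(records, smoker):
    """Mean birth weight in ounces for mothers with the given smoking status."""
    return mean(r["weight_oz"] for r in records if r["smoker"] is smoker)

print(mean_weight(births, smoker=True))   # mean weight, smoking mothers
print(mean_weight(births, smoker=False))  # mean weight, non-smoking mothers
```

A student team would run a comparison like this on thousands of registry records, then interpret and write up the difference between the groups rather than simply reporting the two numbers.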
Statistics almost always involves consultation with others, so teamwork is good preparation for the world of work. Another benefit for students such as Jennifer, who possess a high degree of mathematics anxiety, is that the team provides a natural environment for students to learn from each other.
When we originally designed these projects, we hoped that students could "do" statistics when they left the course. We wanted them to be able to analyse data and report their findings in a professional manner that would be useful for other courses. We had no idea how successful these projects would be.
Then I read about a movement in higher education in the United States called the scholarship of teaching and learning whereby educators conduct studies to evaluate student learning in their courses. Thus, I applied for and won a grant from the Carnegie Academy for the Scholarship of Teaching and Learning to assess the impact of the data analysis projects on students' understanding of statistics.
The main tools used to gauge how the projects were helping students were a take-home midterm exam and a final exam, each based on a student's own individual data set. I designed the data sets from my own research as an applied statistician. They involved variables from a nutrition study on the eating habits of first-year university students, a study on bone mass density and osteoporosis in ambulatory subjects, and taste-test evaluations of beef, turkey and emu meat. Students worked on the same data set for both the midterm and the final.
Drawing on ideas from the "authentic assessment" movement in education, the individual data sets allowed me to test student performance on skills that are essential in their subsequent coursework. Using a score of 80 out of a possible 100 points as a threshold, I found that 90 per cent and 86 per cent of students respectively achieved the threshold on the take-home components of the midterm and final exams.
The grading of these exams is subjective, but the benefit of this approach is that I now have a written record of student work. Professors from disciplines that require statistics can look to see how well students performed statistical tasks when they were enrolled in this course.
But there was a further worry: a number of students were not actively contributing to team assignments. I encouraged them to talk to me, not their team-mates, while completing the take-home exams, and I kept a log of the difficulties they were experiencing with the exams.
Other assessment tools for the course consisted of short questionnaires the students received after the homework projects and the midterm examinations. Lastly, they received an 18-question questionnaire at the end of the course, with five options ranging from "strongly agree" to "strongly disagree".
Conducting research on student learning is tricky. In this case, I had no "control group" with which to compare results. It did not seem plausible, or even ethical, to have students from a traditional statistics class complete the take-home exams without having any previous class experience with such a format. Preliminary results were reported at the First Annual Joint UK and US conference on the scholarship of teaching and learning sponsored by the University of East London and City University, London, in June.
Even though students such as Jennifer may not come to love statistics, I believe students like her are now leaving the introductory course having an authentic idea of what statistical analysis involves. Meanwhile, I am learning about how projects I devise impact on student learning.
John Holcomb is assistant professor in the mathematics department, Cleveland State University, Ohio, United States. Materials used in the course can be found at: http://academic.csuohio.edu/holcombj/