Forget school league tables. A US-developed expert system will give a fuller picture of school performance, says Carol Nahra. Researchers at the University of Maryland are developing an innovative software system linking results of the state's mandatory school testing programme to Internet resources.
The system, believed to be the first of its kind, will be based upon a remarkable bank of student-performance data gathered from every public primary and middle school in the state. The tests used to assess pupils' performance have been hailed by educationists as the true measure of student ability, and are at the forefront of school reform efforts in the United States.
A sharp departure from traditional fill-in-the-blank tests, the assessments measure students' critical thinking through group tasks, written essays, and multidisciplinary story problems. The test results provide detailed information about student performance, with specific categories such as estimation and probability in mathematics.
States such as California and Arizona have abandoned similar testing programmes under political and financial pressure. In Maryland, it has become a high-stakes game. The state has the power to take over schools which are not progressing adequately towards standards for the year 2000. Test results have begun to influence real estate prices, and the business community has rallied around the testing programme as the key to producing competent school leavers.
In four years of reporting results, the tests have caused friction between state and local government. Nearly all of the schools the state has placed on a "danger list" have been in Baltimore, Maryland's largest and poorest city. Yet although the programme has yielded the most comprehensive state-wide databank on student performance in the country, schools have had difficulties using it effectively. Without resources, poorer schools can do little more than recognise that they are at the bottom of the heap.
"The problem with providing schools with printed reports is that they have to be canned and of generic use," says Mark Moody, assistant state superintendent at the Maryland state department of education. "To do anything with the data you have to be statistically inclined, so you can analyse and manipulate it. So most people end up looking at the reports in a blunt fashion to see if they are better or worse off than before."
As the stakes tied to the testing programme have grown greater, school community members including businesses have increasingly asked for real access to the data, Dr Moody says.
The problem seemed custom-made for a technological solution. Dr Moody approached Denis Sullivan, director of the University of Maryland College of Education's centre for learning and education technology. They won an $800,000 grant from the Department of Education to develop and test the system over three years. Its development coincides with Maryland Governor Parris N. Glendening's efforts to get every school online by 1999.
The project is modelled after the type of expert system used by doctors to help make diagnoses. Users logging in will be offered information appropriate to their perspective as teacher, parent, principal or businessperson. From the abundance of data on each school, users will be provided with a tailored guide to school strengths and weaknesses. "You're not wading through a stereotypical tutorial of how to improve instruction, but the focus will be on the specific issues uncovered by analysis of the school's results," says Dr Moody.
A teacher will be able to look at the performance of her fifth-graders across a range of subject areas. She will be tutored in the meaning of the data and told which trends are significant and which might be anomalies. She will be able to compare her school to others of similar demographics throughout the state, identify which schools are high performing where hers is weak, and make contact through email with teachers in those schools. After providing a customised diagnosis, the system will point to best practices in identified areas of weakness. Rather than allowing users to wander through a maze of Internet resources, the system will recommend specific practices. "There is a lot of research out there being done on a daily basis in the university setting," says Dr Moody. But that research is not only unindexed but also unranked, so all ideas carry equal weight. The University of Maryland is convening a group of academics and practitioners to select resources the system should point to as relevant to Maryland's reform efforts. "The research recommendations will relate very specifically to the needs of particular schools and teachers," says Professor Sullivan.
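The diagnostic step described above — comparing a school's category scores against demographically similar peers and pointing to high-performing schools in each area of weakness — can be sketched in a few lines. This is purely an illustration: the school names, scores, and the `diagnose` function are hypothetical, not part of the Maryland system.

```python
# Hypothetical sketch of the peer-comparison diagnosis described in the
# article. All names and numbers are invented for illustration.

def diagnose(school, peers, threshold=5.0):
    """Flag categories where `school` trails the average of its
    demographically similar `peers` by more than `threshold` points,
    and suggest the top-scoring peer in each weak category as a contact."""
    report = {}
    for category, score in school["scores"].items():
        peer_scores = [p["scores"][category] for p in peers]
        avg = sum(peer_scores) / len(peer_scores)
        if avg - score > threshold:
            # Recommend the strongest peer school in this category.
            best = max(peers, key=lambda p: p["scores"][category])
            report[category] = {"gap": round(avg - score, 1),
                                "contact": best["name"]}
    return report

# Toy data: one school compared with two similar-demographic peers.
school = {"name": "Elm Street", "scores": {"estimation": 40, "probability": 62}}
peers = [
    {"name": "Oak Hill",  "scores": {"estimation": 55, "probability": 60}},
    {"name": "Pine View", "scores": {"estimation": 49, "probability": 64}},
]

print(diagnose(school, peers))
# Flags "estimation" as weak and names the strongest peer as a contact.
```

A real expert system would of course weigh many more factors — demographics matching, statistical significance of gaps, and curated research links — but the shape of the advice ("here is where you are weak, and here is who does it well") is the same.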
If successful, the system will accelerate school reform at a pace which has so far proved elusive. It will strengthen links between pre-service and in-service education. Maryland's emphasis on performance assessment is forcing teachers to focus on the use of content in problem-solving ways. "That is quite different than traditional methods, and requires that teacher trainees understand the difference between conveying facts and problem-solving performance," says Dr Moody. "Through this collaboration with the University of Maryland at College Park we hope to heighten the awareness of the university faculty who prepare these prospective teachers about the implications of using large-scale performance measures for teacher preparation."
Professor Sullivan agrees the collaboration has brought the worlds closer together: "It's one thing to do research, but if it doesn't affect practice, then it really isn't worth doing. What we're particularly focussed on in this project is bridging the gap."
Carol Nahra was formerly a grants officer at the Maryland State Department of Education.