Research intelligence - Now for post-launch lessons

The running of Australia's first research excellence scheme pleased its head but, she tells Paul Jump, there's room to improve

April 7, 2011



Sky-high plans: the exercise that mapped out Australia's research strengths found much of its science to be world-class (Credit: Swinburne Astronomy Productions)


Margaret Sheil, chief executive of the Australian Research Council, was handed oversight of the country's first national research assessment programme "for Christmas" in December 2007, but it was not exactly what she had always wanted.

As deputy vice-chancellor for research at the University of Wollongong, she had worked on developing the research quality framework, a programme "sort of" based on the UK's research assessment exercise, complete with impact assessments.

In November 2007, just three months after Professor Sheil took up her post at the council, a new Labor government came to power and asked the ARC to develop a cheaper, metrics-based alternative. Old hands warned her to steer clear, fearing that the inevitable controversy surrounding such an exercise could undermine confidence in the council.

Professor Sheil, a former chemist, was also keenly aware of how much damage being branded "the person who stuffed up" the Excellence in Research for Australia (Era) programme could do to her ambitions to return to higher education practice one day.

Nonetheless, she accepted the challenge. And although the Era elicited some very audible howls of outrage even before the results of its first iteration were released earlier this year, Professor Sheil said it went as well as she could have hoped and also helped re-establish the ARC as an "important policy player".

"The number of complaints is extraordinarily low for something of this scale," she told Times Higher Education. "The feedback we are getting is that it was fair and not too unexpected for universities. Where there has been angst, it has been where people haven't really engaged and have woken up to discover that they have an Era score they didn't know about or understand."

For the most part, she continued, everyone recognised that this was the ARC's "first go and that we can do some things better next time".

First among these are the journal rankings that formed part of a "dashboard" of indicators used by Era assessors alongside, where appropriate, citation averages, esteem indicators and peer-review reports of non-journal outputs.

Of some 22,000 journals ranked for the Era, Professor Sheil said "we didn't necessarily get it right" in just a small number of cases: "There are only about 200 where people are still really upset." But she admitted that the ARC had not scrutinised "as much as we would have liked" the methodology used by the various "peak bodies", such as subject associations, to which it delegated rankings decisions. This will be corrected in the next Era, she added.

The next exercise will take place in 2012, giving the ARC an early opportunity to correct such mistakes. The differences between the 2010 and 2012 results will inform decisions on the frequency of future exercises.

Before the next Era, Professor Sheil hopes to "educate" academics and administrators against trying to tailor all their research to top-rated journals, noting that journal rankings formed only a minor part of the assessment formula in most of the 157 fields into which research was separated. Specific formulas were suggested by experts from each field and endorsed by the eight discipline-wide assessment committees, which met to assess the metrics and peer-review reports.

No table talk

Unlike some countries' initiatives, the Era's aim was not to rank universities but to assess where national research stood internationally. Professor Sheil is sceptical about producing institutional rankings because results depend on a university's size and subject offerings.

She also dismissed the notion that Era data could be used to build meaningful league tables because they lack volume measures and fine distinctions. But as newspapers tried to do so anyway, and seem sure to do so again in 2012, she intends to give them "more help" to draw up something more "meaningful" than simply averaging averages.

She was not surprised to find that while much of Australian science was above the "world standard", significant areas of social science were below it. Noting that investment in social science had historically been low and that institutions had not had a funding imperative to develop "scale and focus" in the discipline, she said that social scientists had no cause for alarm.

A consultation on how to use the Era results is still under way, but Professor Sheil said their primary purpose would be to help the government, the ARC and universities make "strategic decisions" about which subjects to invest in. This could mean putting more money into vital and popular but underperforming subjects rather than concentrating on areas of international excellence.

"It will be more of a threat to expensive disciplines that are not performing and don't have high student numbers," she said.

paul.jump@tsleducation.com
