News 1 August 2011

Proposal puts “democracy” in research


AUSTRALIAN researchers have been given a big vote of confidence — from themselves.

One-quarter of Australian public health researchers voted themselves among the top five most influential researchers in their field, as part of a survey of 175 researchers on the nature of researcher influence.

“We were amused by that”, said Professor Simon Chapman, professor of public health at the University of Sydney, who was part of the team that conducted the survey, outlined in a ‘Viewpoint’ article published in this week’s MJA. (1)

“But it’s almost inevitable. People want to give themselves a chance”, Professor Chapman told MJA InSight.

The survey included Australia-based public health researchers who had published 10 or more papers that had been indexed by the Institute for Scientific Information in the past 10 years.

“It was a fairly soft entry point. You couldn’t really call yourself an expert without that.”

Self-nominations aside, there was “impressive consensus” on which researchers were the most influential within five of the six fields studied.

In the MJA ‘Viewpoint’, Professor Chapman and three other high-profile researchers called for a similar process to be used to assess researchers’ track records.

The track record rating is used when researchers apply for research grants or submit papers to journals. It is currently determined by a handful of peer reviewers and typically counts for 25% of the overall score in grant applications.

“An immense amount rides on it, and the system could so easily be improved”, Professor Chapman said.

The new proposal suggests that when researchers submit grant applications, they could be required to rank the track records of a reasonable number of peers in their field.

“We call it ‘democratising’ the track record assessment, because at the moment it is in the hands of two to three people”, Professor Chapman said. For example, it was proposed that each researcher could rank a random sample of 10 peers from within their field, producing an aggregate score between 10 (if all peers ranked them 10th) and 100 (if all ranked them first), which could be used as the track record score.
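The scoring scheme described above can be sketched in a few lines of Python. This is a hypothetical illustration only, not part of the published proposal: it assumes a rank of 1 (best) contributes 10 points and a rank of 10 contributes 1 point, which reproduces the range given in the article (100 if every peer ranks a researcher first, 10 if every peer ranks them tenth).

```python
def aggregate_score(ranks):
    """Convert ten peer rankings (1 = best, 10 = worst) into a 10-100 score.

    Assumption: each rank r contributes (11 - r) points, so ten rankings
    sum to 100 when all peers rank the researcher first and to 10 when
    all rank them tenth, matching the range described in the proposal.
    """
    if len(ranks) != 10 or any(not 1 <= r <= 10 for r in ranks):
        raise ValueError("expected ten ranks, each between 1 and 10")
    return sum(11 - r for r in ranks)

# All peers rank the researcher first: maximum score.
print(aggregate_score([1] * 10))   # 100
# All peers rank the researcher tenth: minimum score.
print(aggregate_score([10] * 10))  # 10
```

How rank positions are weighted (linearly here) is this sketch's own assumption; the article specifies only the 10-100 range.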

Links could be provided to publications, citations and grant successes, to assist the ranking process. Early career researchers would be placed in a separate category to ensure they weren’t disadvantaged.

“The proposal was prompted by many discussions that we have had as academics about the often appallingly inexpert and ill informed reviews that we get back from journals or grant assessments”, Professor Chapman said.

MJA InSight asked the NHMRC to comment on the proposal but it declined.

- Sophie McNamara

1. MJA 2011; 195: 147-148


