探花视频

Grant assessors accused of using ChatGPT to write feedback

Applicants to one of Australia's major funding streams report spotting tell-tale signs of chatbot use

Published on
June 30, 2023
Last updated
June 30, 2023

Peer reviewers assessing grant applications for one of Australia's major funding bodies have been accused of using artificial intelligence chatbots to produce their feedback.

Applicants for grants of up to A$500,000 (£262,000) awarded under the Australian Research Council's Discovery Projects scheme alleged spotting the “tell-tale” signs of ChatGPT when receiving feedback from assessors, according to the ARC Tracker account.

One had even forgotten to remove the “regenerate response” prompt that appears at the bottom of all ChatGPT-created text, it was claimed.

Applicants said the reports were a “generic regurgitation” of their applications with little evidence of critique, insight or assessment, ARC Tracker said.


It added that the practice was “entirely predictable” because of time pressures on researchers, and that part of the problem was that the ARC had “done nothing” to prevent the use of AI chatbots in assessing grants because it was not mentioned in the guidance issued to assessors.

ARC Tracker said the funder should consider banning those found to be using ChatGPT for this purpose and reporting them to their universities.


Philip Dawson, an academic integrity researcher and co-director of the Centre for Research in Assessment and Digital Learning at Deakin University, said that the behaviour “should be treated worse than just an inappropriate review”.

“Research grants are meant to be confidential and sharing them with [ChatGPT creator] OpenAI is a significant IP breach,” he added.

Responding to the concerns, the ARC said it was “considering a range of issues regarding the use of generative artificial intelligence (AI) that use algorithms to create new content (such as ChatGPT) and that may present confidentiality and security challenges for research and for grant programme administration”.

It reminded all peer reviewers “of their obligations to ensure the confidentiality of information received as part of National Competitive Grants Programme processes”.


It said the Australian Code for the Responsible Conduct of Research set out that “individuals are to participate in peer review in a way that is fair, rigorous and timely and maintains the confidentiality of the content”.

The ARC said it had robust processes in place to consider concerns about how confidentiality had been managed during a review.

“Release of material that is not your own outside the closed research management system, including into generative AI tools, may constitute a breach of confidentiality. As such, the ARC advises that peer reviewers should not use AI as part of their assessment activities.”

Guidance on this area would be updated “in the near future”, the ARC added.


tom.williams@timeshighereducation.com

