Edinburgh staff urge university to ditch OpenAI deal

Academics raise concerns over ChatGPT owner’s links to US military and claim tools are ‘looking to replace knowledge workers’

Published on
March 25, 2026
Last updated
March 25, 2026
[Image: ChatGPT on a phone. Source: iStock/hapabapa]

Staff at the University of Edinburgh have demanded an end to the institution’s partnership with technology giant OpenAI, calling it “unsafe” and “insecure”.

In an open letter signed by more than 350 people, academics say they fear that the company “does not align” with the university’s artificial intelligence and procurement principles. 

“We are publicly writing to you as the University of Edinburgh’s contract with OpenAI is coming to an end, and we, as members of the university, wish to express our concerns and ask that the relationship with OpenAI does not continue,” they write.

The letter argues that “there are numerous outstanding court cases regarding harm and death arising from the use of OpenAI models”, and that the company has suffered a high number of data breaches compared with other LLM providers. Labour practices are also flagged, citing news stories that alleged content moderators were being “exploited” on low wages. 


Concerns are also raised over allegations of bias in ChatGPT’s responses, and the letter claims the company has a poor history of transparency and accountability.

The letter also outlines concerns about recent geopolitical issues involving the company, including a newly announced partnership between OpenAI and the US Pentagon, which allows its AI tools to be used in the military’s classified systems.


Environmental issues are also raised, given the high energy usage of artificial intelligence, which is said to conflict with a university policy that states supply chains should “have targets and action plans in place to reduce their carbon emissions”.

One signatory of the letter, James Galbraith, a postdoctoral research associate in the School of Biological Sciences, told Times Higher Education he was uncomfortable with the relationship, and believed the partnership should not be renewed.

“The central issue is that contracting OpenAI to provide LLMs to staff and students does not follow the university’s AI policies, in particular the labour rights issues, the impact their data centres are having on the communities they have been built in, and their contracts with the US military and ICE,” he said.

Galbraith raised concerns that recent statements made by the company indicate that OpenAI is “looking to replace knowledge workers like us and the students we teach with predictive algorithms”.


“No university or education provider should be contracting such a company,” he said.

The letter notes that the calls to end the university’s contract with OpenAI have been “backed by several of the university’s AI experts, along with many members of the wider community”.

Several universities have now established relationships with the ChatGPT operator, including the University of Oxford, while in January the University of Manchester became the first higher education institution to partner with Microsoft Copilot.

Gavin McLachlan, vice-principal, chief information officer and librarian to the University of Edinburgh, said: “The university aims to provide all students and staff safer access to AI tools and technology in a way that aligns with our values.


“We welcome the opportunity to engage with our community on matters raised in the open letter, and plan to discuss the concerns directly with the authors and consider the matter with our governance groups.

“We already provide a large range of AI training as well as detailed guidance for students and staff on the use of AI. Access to the university AI platforms is contingent on each student and staff member reading and agreeing to the university AI guidelines.


“We want to support our research and teaching across the full spectrum of disciplines. Our AI platforms exist to provide secure, safer, more economical and managed options for those working and studying at the university. They offer users the choice of large language models within a framework that also protects data, usage security and privacy.”

juliette.rowsell@timeshighereducation.com

