Strange case of ‘Einstein AI’ spotlights chatbot concerns

New tool promising to watch lectures and complete assignments for students abruptly taken down amid concerns about cheating and data protection

Published on February 27, 2026
Last updated February 27, 2026
Montage of Albert Einstein with a cyborg, to illustrate Einstein AI.
Source: Getty Images/iStock montage

A new artificial intelligence chatbot that promised to watch lectures on behalf of students and complete their assignments for them has been taken offline days after launching – but critics say it was emblematic of how such tools are changing higher education. 

Einstein AI, launched by tech company Companion, promised to log into institutions’ virtual learning environments and complete students’ work “while they slept”, sparking concerns about “contract cheating”.

The tool is the latest in a string of AI offerings aimed at students but, unlike the “study modes” offered by the likes of Google and OpenAI, it explicitly offered to watch recorded lectures, participate in online discussions and submit assignments from students’ accounts “just like you would”.

After the tool prompted an online backlash, the company appears to have made substantial edits to its marketing materials before taking it down altogether on 26 February.


Analysis by Times Higher Education showed that the webpage for the AI had been edited to remove references to completing essays on behalf of students, instead emphasising it could work “with” students.

Promises to “knock out assignments” while students “sleep” were changed to commitments to “prepare flashcards, summaries, and study plans overnight – so everything is ready when you wake up”. 


It appeared that the tool was scuppered by copyright infringement rather than ethical concerns. After the site was taken down, CEO Advait Paliwal told Times Higher Education that it had received a “cease and desist” order from CMG Worldwide, which owns the licensing rights for the Einstein brand. He said he would now concentrate on promoting how the wider Companion AI could be used by students.

David Hitchcock, course director of the History Subject Suite at Canterbury Christ Church University, said that despite the tool’s short lifespan, it was emblematic of the challenges posed to educators by AI firms.

“At a very basic level, ‘Einstein’ was simply a distillation of what more general-purpose AI chatbots or agents already offer to students: the capacity to cease learning virtually anything at all or doing virtually any academic work for themselves, while retaining the prospect of still ‘earning a university degree’,” Hitchcock said.

It is the “most thorough example of an automated contract cheating engine that we have seen so far”, he said, adding that it “erases the process of learning entirely”.

Students and universities are being “deliberately targeted” by tech firms, he said, adding that such tools were similar to Facebook and traditional social media platforms that “require high user counts and high traffic in order to later ‘lock in’ a dependent population of individuals or businesses”.


Hitchcock feared that such trends could mean academics are “forced against their better judgement” to return to analogue exams, which would “represent a breakdown in a very important trust that students and their teachers should have”.

When interviewed by 404 Media, Einstein AI’s Paliwal said the tech was justified and even beneficial, comparing humans to horses. “[Horses] used to pull carriages but when cars came around, I’d argue horses became a lot more free…it would be weird if horses revolted and said ‘no, I want to pull carriages, this is my purpose in life’.”

For the bot to scrape lectures and seminar notes and to submit assessments, students must give the tool access to their accounts on Canvas, one of the world’s most popular virtual learning environments, which are accessed through their university login.


Damien Williams, assistant professor of philosophy and data science at the University of North Carolina, said giving an “unknown third party” access to students’ login details, which are also used for their email registration and financial aid portals, was “extremely dangerous”. 

“If and when a breach of that system happens, that data may well be exposed to bad actors, or even just universities which may want to review logins against their student rosters,” he told Times Higher Education. 

He said the tool’s ability to automatically complete assessments was “another front in the ‘AI edtech’ arms race”, where companies create tools that impact teaching and learning, “creating new pressures for students to use them or get ‘left behind’, which means they aren’t developing the skills they need”.

“This is another case where a technology is tossed out into the world without real consideration for its educational, let alone ethical, implications. We already know the harm done to critical thinking and skills development when people over-rely on AI tools; this system seems tailor made to exacerbate those problems, while potentially worsening adversarial atmospheres of suspicion and mistrust in the classroom,” said Williams.


juliette.rowsell@timeshighereducation.com
