AI and Academic Misconduct: What Every UK Student Needs to Know


5 mins

Posted on 06 Oct 2025

  • Check the rules of your particular institution and the specific assignment before using AI.
  • Explain and reference your use of AI, and keep drafts, notes, and version history.
  • Use AI to support your work, not to write or cite for you.

Generative artificial intelligence (“AI”) is now part of everyday study. Whether you’re using it to plan a reading strategy, unpack a complex concept, or tidy your drafts, AI is everywhere. In response, universities are setting out clearer rules, updating their misconduct procedures and, in many cases, providing AI-specific guidance. While complaints about misuse of AI in academic misconduct procedures remain low, the Office of the Independent Adjudicator (OIA) has identified gaps in students’ understanding of what counts as appropriate use of generative AI.

This guide examines how students can use AI responsibly in their academic work and highlights key considerations to keep in mind throughout the process.

1. Not reading (and following) course-specific AI guidance

There is no single rulebook on how to use AI in academia. Policies differ by university, by faculty, and even by assessment. Some institutions permit limited AI use for planning or formative tasks; others forbid AI in summative work unless explicitly authorised in the brief. Your module brief is important—read it. If in doubt, ask your module leaders (in writing).

Avoid the pitfall: Bookmark your institution’s AI guidance and your module handbook. If the brief does not specifically address your query, assume AI is not permitted in the submission itself and seek written clarification.

2. Using AI but failing to properly reference it

Even where AI use is allowed, failing to declare it can still be considered misconduct. Many universities require you to say how AI supported your work. Some university regulations treat unauthorised AI use (i.e., presenting AI-generated content as your own) as an academic offence, and distinguish it from permitted, declared use.

Avoid the pitfall: If AI is permitted, reference it. The method of doing so differs from university to university, but acknowledging your use of AI is good practice and can protect you from misunderstandings.

3. Letting AI write your assignment

Generative AI produces plausible text by predicting likely word sequences; it does not know your topic, assess sources, or think critically. AI should never replace conventional research; at best, it can sit alongside it. At some universities, using unacknowledged AI-generated content in summative work is considered misconduct unless the brief explicitly allows for it.

Avoid the pitfall: Use AI for process, such as brainstorming or testing your understanding, but draft and argue in your own words, based on sources you have read and can cite.

4. Letting AI cite for you

AI systems often produce confident-sounding but inaccurate claims and references. Uncritical use of AI can lead to poor citation practices and undermine academic standards. Institutions should communicate their expectations of students clearly and early, noting that improper use of AI harms academic integrity.

Avoid the pitfall: Never let AI fabricate your reading list. Find, read, and cite the original sources. If you use AI to discover leads, verify every citation via your library databases and reference managers before including it in your bibliography.

5. Trying to outsmart AI-detection tools

AI-detection tools are widely used by universities, but even these tools are indicators of possible AI use, not proof. “Counter-detector” tips circulate online, but relying on them is likely to push markers to scrutinise your submission more closely.

Avoid the pitfall: Write like a human. Keep your notes and drafts. If your university allows the use of AI, include a short methodology note explaining how you legitimately used AI in your process.

6. Poor “process hygiene”: no drafts, no notes, no version history

When questions arise, investigators will look for evidence that the work is yours, such as notes, outlines, version histories, and supervisor emails. The OIA’s Casework Note emphasises fair, transparent procedures and giving students a proper chance to respond; in practice, that is much easier if you can show how your work developed from beginning to end. Universities may consider the evidence you provide precisely because AI-detection scores alone are not determinative.

Avoid the pitfall: Work in a versioned environment (e.g., cloud-based documents with version history), keep dated drafts and reading logs, and save screenshots of any AI sessions you used for brainstorming. A paper trail shows authorship and intent, and supports you if questions ever arise.

7. Treating language support as a free pass

Many students, especially non-native English speakers or those with specific learning differences, use grammar and style tools. That can be acceptable if your university permits it and you acknowledge it. But there is a line between light language polishing and substantive rewriting that alters content.

Avoid the pitfall: If language tools are allowed, say so in your submission. Keep your original draft so you can show what changed (and why). If the brief does not allow AI, don’t assume that using it for grammar is exempt; check with your module leaders first (in writing).

A quick AI-safe study checklist

  • Read the brief. If AI use isn’t explicitly permitted for the submission, assume it isn’t—and ask.
  • Declare any permitted use. Acknowledgement can prevent misunderstandings.
  • Don’t outsource thinking. Use AI to support your process, not to write content.
  • Verify references. Never submit unverified citations or AI-invented sources.
  • Keep your drafts. Notes, outlines and version histories are your best evidence of authorship.
  • Be human. Write naturally; avoid over-polished, uniform language.

Final word

University AI policies will continue to evolve, but the core principles are likely to remain: clarity, acknowledgement, and genuine intellectual effort. Universities must run fair processes; students must follow clear instructions and uphold academic integrity. If you keep the paper trail of your learning process and use AI transparently—and only as permitted—you’ll reap the benefits without risking your degree.

Contact Us

Contact our education team online or call +44 (0)20 7329 9090

Victoria Denis

Victoria is an education law solicitor with a primary focus on further and higher education. She has extensive experience handling cases involving academic and non-academic misconduct, fitness to practise, PhD supervision, degree classification, fee disputes, and exclusions.

  • Solicitor
  • T: +44 (0) 207 123 8302


The articles published on this website, current at the date of publication, are for reference purposes only. They do not constitute legal advice and should not be relied upon as such. Specific legal advice about your own circumstances should always be sought separately before taking any action.
