Published on: 16/07/2025 · Last updated on: 19/09/2025
Introduction
It is important that staff promote academic integrity throughout their courses. To better understand their obligations in this regard, they are encouraged to consult the code of practice QA53: Promoting Academic Integrity. They may also find it useful to consult QA53 Appendix 1, which gives examples of offences grouped according to severity.
When considering a possible case of academic misconduct, it may be difficult to identify precisely which form of misconduct has occurred, and a single case may involve several different forms. The table below seeks to bring some clarity to this issue.
The left-hand column lists the types of academic misconduct, and the middle column defines them, using the same definitions supplied to students in the Academic Integrity Initiative Test that all students at the University of Bath must pass. The right-hand column gives examples of why an offence might be suspected, investigated and ultimately upheld or dismissed. When determining whether an offence has occurred, staff must decide, using their academic judgment, whether it is more likely than not to have occurred, that is, on the balance of probabilities.
Similarity Checking Software
One method of investigating misconduct that has several entries in the table below is the use of similarity checking software. It is important, however, not to overestimate the value of this tool or to misread the nature of the evidence it can provide. A high similarity score should not, in and of itself, be taken as evidence that an assessment offence has occurred, as there may be legitimate reasons for it: for example, where an assessment contains extensive quotation that is properly cited. A suspicion that an academic offence has occurred should therefore not rest solely on the report of similarity checking software, but must be supported by further evidence.
Please be aware that, when registering at the University, students give consent for their work to be put through the similarity checking software the University currently supports. This permission does not extend to the use of other tools, commonly marketed as ‘AI detection tools’, and so inputting student work into these (often unreliable) tools would likely constitute a breach of copyright. Even if such tools were accurate, the University does not list ‘misuse of AI’ as a form of academic misconduct in itself; rather, it may be a contributing factor in one of the offences listed below.
Oral Questioning
Another common method of investigating academic misconduct that features heavily in the table is to verify students’ familiarity with their work through oral questioning. This should be conducted sensitively.
Table Mapping Misconduct Against Indicators
Type of Academic Misconduct | Definition | Possible reasons for suspecting misconduct and methods for investigating it |
--- | --- | --- |
Plagiarism | Claiming, submitting or presenting existing work, ideas or concepts of others as if they were one’s own, without citing the original source. This includes work generated by AI. | Use of Plagiarism Detection Software: Similarity checking software allows tutors to check for copied text by comparing submissions against a large database of academic papers, articles, and online content. Please note: staff are only permitted to use software the University has a licence for; this is partly for copyright reasons. Style or Tone Inconsistencies: Sudden shifts in writing style, vocabulary, or tone within an assessment can indicate sections that might not have been written by the student. Lack of Understanding: When questioned about their work, a student might struggle to explain certain sections, methodologies, or technical terms, suggesting they didn’t write or fully understand those parts. Uncited Sources: The tutor may notice specific ideas, facts, or phrases that seem familiar or come from well-known sources but are not properly cited. Inconsistencies Between Submissions: A student’s submission might differ dramatically in quality or style from their previous work. Mismatched References: References listed in the bibliography may not match the content of the assessment or may include sources the student did not use. This could involve non-existent articles, incorrect formatting, or the inclusion of irrelevant references. Repetition of Commonly Plagiarised Content: Tutors might recognise commonly plagiarised content, such as overused introductory statements, famous quotes, or widely available model answers. Evidence of Copy-Pasting: Inconsistent fonts, formatting, or hyperlinks in the text may indicate content has been copied and pasted from a different source. Unusually High Word Count: If a submission significantly exceeds the typical word count, this could indicate the inclusion of uncited, copied material. |
Fabrication | Submitting false or misleading representations of evidence, results, data or information which form part of your assessed work, with the intention to deceive the marker. | Inconsistent or Unverifiable Data: A lecturer may notice irregularities in the data presented, such as values that don’t match realistic expectations or patterns that seem overly perfect. If asked to provide raw data, the student might fail to do so or present incomplete files. Lack of Supporting Evidence: There may be references cited that can’t be found or verified when the lecturer attempts to access the original sources. The student might struggle to explain specific aspects of the sources they claimed to have reviewed. Contradictions in Methodology: A lecturer may identify discrepancies between the student’s described research process and the results obtained. For example, the student’s timeline or resources may not align with the complexity of their reported findings. Supporting Documentation: If asked for additional information about fieldwork, surveys, or interviews, the student might provide vague or evasive responses. For example, they may struggle to produce interview transcripts or survey questionnaires. Cross-Verification: A lecturer may compare the work in question with the submissions of other students working on similar projects and notice discrepancies in methodologies, outcomes, or cited references. For instance, fabricated lab results might contradict the outcomes obtained by other students in the same experiment. |
Impersonation | Pretending to be another student or having someone pretend to be you in a class, at an exam or test, or in any other situation in which you are being evaluated. | Verification of Identity: Tutors can compare the student’s physical appearance to their university ID or previously submitted identification during exams or assessments; where there are discrepancies, this is a cause for concern. During online exams, tutors or invigilators might use video monitoring to detect a mismatch between a student’s registered identity and the person actually sitting the assessment. Consistency of Work: Tutors might notice that the style, quality, or tone of the submitted work differs from past assignments; a submission that appears significantly more sophisticated than the student’s usual work could raise particular suspicion. Technical Logs and Data: For online exams or submissions, tutors may review IP addresses, device information, or login timestamps and detect anomalies, such as a different location or unexpected access. Fraudulent Documentation: Tutors may detect discrepancies in attendance records, signatures, or participation logs where someone else might have pretended to be the student. |
Contract cheating | Presenting as your own, work that you have purchased, commissioned, or downloaded, or that has been prepared by someone/something other than yourself. | Inconsistent Writing Style: Tutors may notice a stark difference in writing style, quality, or tone compared to the student’s previous submissions. For instance, the use of vocabulary, phrasing, or technical terminology that seems beyond the student’s usual capability, or a dramatic improvement in structure and clarity. Misalignment with the Student’s Knowledge: The submitted work might demonstrate a level of subject knowledge or expertise not reflected in the student’s previous performance or classroom discussions. If questioned about their work, the student may struggle to explain key concepts, methodologies, or findings included in the assignment. Generic or Off-Topic Content: Contracted work might include overly generic or template-like responses that omit reference to course materials, lectures, or assignment guidelines, as someone external may not have access to these. Use of Uncommon Sources: The work may reference sources that the student is unlikely to have accessed, such as obscure journals, highly advanced technical texts, or sources unrelated to the course material. Similarities to Common Model Answers: Tutors who are familiar with common essay-writing services or repositories of model answers may recognise language, arguments, or structures often used in these purchased works. Overly Polished Work: Contract cheating may result in work that appears overly polished, with perfect grammar, formatting, and presentation, which can seem unnatural for a student under academic pressure. Metadata in Digital Files: For digital submissions, the file metadata might provide clues, such as a different author name or device information in the file’s properties. Plagiarism Detection Software: Similarity checking software might identify reused content from essay-writing services, other students’ work, or publicly available papers, even if the work was rephrased. Oral Follow-Up: Tutors may ask targeted questions about the student’s work, such as ‘Can you explain how you came to this conclusion?’ or ‘Walk me through your research methodology.’ Students who used contract cheating services may struggle to answer confidently or consistently, though be aware this could also be a result of nervousness. Group Projects and Peer Feedback: In group work, peers might report an apparent lack of participation from the student in question, which could signal the use of an external service to complete their individual contribution. |
Collusion* | Submitting work as if it is your own that has been done with someone else or something else, such as other people or artificial intelligence and technologies. | Duplication Across Submissions: If multiple students submit similar or identical work, this may suggest collaboration or copying. Behavioural Observation During Exams: Unusual behaviour during exams, such as the use of concealed devices or communication with others, might indicate collusion. Answer Discrepancies: If an exam response suggests expertise beyond the expected level, this might hint at collusion. Use of Plagiarism Detection Tools: Similarity checking software can highlight sections of text that are duplicated or suspiciously similar across multiple submissions, flagging possible collusion. Evaluation of Lab Results: In scientific or technical work, if multiple students report identical experimental data or findings, it may indicate unauthorised collaboration. Discrepancies in Group Work Contributions: In group assignments, tutors might notice inconsistencies in individual contributions, for example identical content from group members who were supposed to submit separate reflections. Response to Individual Questions: Tutors may ask follow-up questions or conduct oral assessments to verify whether each student can independently explain the content of their submission. Analysis of Online Exam Logs: For online exams, tutors can review login timestamps, IP addresses, and behaviour patterns; simultaneous submissions from different accounts with overlapping content can indicate coordination. External Observation: In practical sessions, tutors may directly observe students who appear to collaborate excessively on tasks meant to be completed individually. Cross-Referencing Feedback: Peer reviews or reports from group members may reveal collusion, especially if some students disclose unauthorised collaboration in their reflections. |
Self-plagiarism/auto-plagiarism | Duplication of all or parts of your own work (including work at previous institutions), and submitting it as if for the first time, without reference to your previous submission. | Plagiarism Detection Software: Similarity checking software can identify similarities between a student’s new submission and their prior work if it has been previously submitted or published and is included in the tool’s database. Familiarity with Past Work: Tutors who have previously graded a student’s work may recognise repeated arguments, sections, or ideas from prior submissions. Inconsistent Citations: A tutor may notice that specific parts of an assignment lack citations; this could indicate reused material from the student’s past work. Overlapping Topics: If a student submits assignments on very similar topics for multiple courses, tutors may examine the content for duplication. Cross-Checking Metadata: If digital files are submitted, tutors might examine file metadata to detect whether the content matches previously submitted work. Requesting Additional Context: Tutors may ask the student to explain their research process, sources, or methods. If the student struggles to demonstrate original effort, it could raise suspicions of auto-plagiarism. |
*When setting group assessments and/or Type B assessments, it is particularly important to outline clear expectations to students as to what constitutes collaboration (which is authorised) and what constitutes collusion (which is an assessment offence).