Guidance for working with Generative AI (“GenAI”) in your studies


The technology, ethics and use of AI are fast-moving areas. This guidance is current as of October 2024 and will be reviewed by 31st January 2025.

University position on GenAI

There is currently a lot of interest in Generative AI (GenAI) systems (e.g. ChatGPT, DALL-E, Microsoft Copilot, Claude, Llama and Google Gemini).

We recognise that developing skills in the responsible use of AI matters for you now and will be an important part of your future life and work.

We want to help you understand how GenAI may be used to support and aid your learning, research and assessments, while making you aware of the limitations and risks.

All University of Edinburgh students have free access to ELM (Edinburgh (access to) Language Models), the University’s AI access and innovation platform, which offers you a safer gateway to GenAI. ELM provides some key benefits over accessing AI through other routes.

We would encourage you to use ELM over other similar tools because:

  1. You can access a wider range of AI large language models (LLMs) through ELM, including the very latest and most powerful versions of ChatGPT and, coming soon, open-source LLMs.
  2. Your data is secure and will not be retained by third-party services to train their models or for any other purpose. The University has a Zero Data Retention agreement with OpenAI, which ensures that your data is secure and private. All your chat histories and document downloads are kept private to you on your instance of ELM.
  3. It is free to use for all staff and students, providing the same access for all.
  4. You can innovate on top of ELM by writing your own AI applications through our API (see the sketch after this list).
  5. ELM is fully supported by the University through your local IT teams, EdHelp and the IS Helpline.
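
If you want to try point 4, the short sketch below shows one way a small application might call a chat model from Python. It is illustrative only: it assumes ELM offers an OpenAI-compatible chat endpoint, and the base URL, model name and API key shown are placeholders, so please check the ELM API documentation for the actual details.

    # Illustrative sketch only. It assumes ELM exposes an OpenAI-compatible
    # chat endpoint; the base URL, model name and key below are placeholders,
    # and the real values should come from the ELM API documentation.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://elm.example.ed.ac.uk/v1",  # placeholder endpoint
        api_key="YOUR_ELM_API_KEY",                  # placeholder credential
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # whichever model ELM makes available to you
        messages=[
            {"role": "user",
             "content": "Write three practice questions on photosynthesis."},
        ],
    )
    print(response.choices[0].message.content)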

How to get started with ELM

Golden rules for GenAI use

When using any GenAI tool there are a few golden rules. By following these points, you will be able to benefit from using GenAI while also reducing the likelihood of engaging in academic misconduct.

  1. Learn, don’t copy: Use GenAI to aid your learning, but never copy-paste any GenAI outputs into your own assessed work. Doing so constitutes academic misconduct.
  2. Ask if uncertain: Always consult your Course Organiser if you are unclear about the use of GenAI in your assessed work. Some assessments may encourage GenAI use, while others may restrict it.
  3. Credit use of tools: Before handing in your assessed work, make sure you acknowledge the use of GenAI, where used.
  4. Protect personal data: Avoid uploading personal data (yours or anyone else’s) to a GenAI platform, unless you are using the University’s secure platform, ELM, and complying with the University's data protection policy.
  5. Respect copyrights: Never upload copyrighted materials to a GenAI platform without authorisation from the copyright owner. Even when using the University’s secure platform, ELM, ensure you have the right to use the material for that purpose.
  6. Verify facts: Always check GenAI output for factual accuracy, including references and citations.
  7. Diversify sources: Never rely solely on GenAI; it should supplement, but not replace, traditional sources.


Acceptable uses of GenAI

GenAI and reasonable adjustments

Please note that guidance on acceptable uses of GenAI does not preclude the use of AI tools in the context of a reasonable adjustment. For example, if you have an agreed Schedule of Adjustments (a list of modifications to how you experience your teaching, learning and research) arranged through the Disability and Learning Support Service, it may specify other ways in which you are permitted to use AI tools.


Before using GenAI in any assessed work, please check whether there are any restrictions. This should be mentioned in the assessment task. If not, then ask your Course Organiser.

Some assessments may explicitly ask you to work with AI tools and to analyse and critique the content it generates. Other assessments may specify, for good reason, that AI tools should not be used in particular ways.

You will never be asked to pay to use external GenAI tools.

Some of the positive, and generally acceptable, ways GenAI might be used include:

  • Brainstorming ideas through prompts
  • Getting explanations of difficult ideas, questions and concepts
  • Self-tutoring through conversation with the GenAI tool
  • Creating practice questions and self-tests
  • Organising and summarising your notes
  • Planning and structuring your writing
  • Summarising a text, article or book (check first that the copyright owner permits use of GenAI for this purpose)
  • Helping to improve your grammar, spelling and writing (check for restrictions where use of language is specified as an integral part of the assessment)
  • Translating texts in other languages (check for restrictions where translation is the purpose of an assessment)
  • Overcoming writer’s block through dialogue with the GenAI tool
  • Help with writing and debugging code, and with logical reasoning (check for restrictions where this is a core skill to be demonstrated in an assessment)

How to use ELM with prompts 

Will I be penalised for using GenAI in my assessed work?

You will not be penalised for using GenAI, as long as you work within the acceptable uses outlined above and adhere to any other restrictions specified for a particular assessment.

Please bear in mind, though, that use of GenAI carries a number of risks and limitations (outlined below) that could affect the quality of your work and, ultimately, the grade you receive. While you may use GenAI to aid your learning, you may be required (e.g. in an exam or other assessment) to demonstrate your abilities without the aid of such tools.

Citing and acknowledging the use of GenAI

Where GenAI is used, it is important to be transparent about how you have used it and what content has been generated from it.

As a minimum, you should include the following in an acknowledgement:

  • Name and version (if included) of the GenAI system used; e.g. ELM; ChatGPT-3.5
  • Publisher (the company that made the GenAI system); e.g. University of Edinburgh, Edina; OpenAI
  • URL of the GenAI system; e.g. the ELM or ChatGPT web address
  • Brief description (single sentence) of the context in which the tool was used.

Example of citing your use of GenAI

I acknowledge the use of ELM to help me generate initial ideas and proof-read my final draft.


Further requirements regarding acknowledging the use of GenAI may be stipulated for particular assessments. These should be made clear in assessment tasks. If you are unsure, please check with Course Organisers/Lecturers. You may be asked to include the following additional information in an appendix:

  • The prompts used to generate material from a GenAI tool
  • The date the output was generated
  • The output obtained/or an extract of the output obtained
  • How the output was used, edited or incorporated into a piece of work (e.g. in the case of proof-reading by including a tracked-changes document).

Risks of over-reliance on GenAI

While fully AI-generated outputs can seem impressive on the surface, they often contain factual errors and lack nuance, critical engagement, and depth of expression and understanding.

Importantly, over-reliance on AI tools simply to generate written content, software code or analysis reduces your opportunity to develop and practise key skills (e.g. writing, critical thinking, evaluation, analysis, coding, reasoning). These are all important aspects of your learning at university and will continue to be required in your working life.

Written work is a key way of demonstrating critical thinking and deep engagement with your course material, much of which happens during the process of writing. Relying on AI-generated output will prevent you from developing the skills you would otherwise acquire by doing the work yourself. A vital aspect of your learning at university is developing these advanced skills, learning how to think and build an argument through writing. GenAI is no substitute for this.

While GenAI can be useful for some tasks, it is essential that you are aware of its many limitations that include the following:

  • GenAI tools are language machines rather than databases of knowledge – they work by predicting the next plausible word, image, or snippet of programming code from patterns that have been ‘learnt’ from large data sets.
  • They have no understanding of what they generate.
  • The datasets that such tools learn from are flawed and contain inaccuracies, biases and limitations.
  • They generate text that is not always factually correct. A knowledgeable human must check the output.
  • GenAI can create software/code that has security flaws and bugs. Code or calculations produced by AI will often look plausible but, on closer inspection, contain errors in the detailed working (see the short illustration after this list). A human trained in that programming language should fully check any code or calculation produced in this way.
  • The data GenAI models are trained on are not necessarily up to date; they may have limited or no knowledge of the world after a certain cut-off point.
  • They can occasionally produce fake citations and references.
  • Such systems are amoral – they do not know that it is wrong to generate offensive, inaccurate or misleading content, and sometimes do so.
  • Their outputs can include hidden plagiarism, meaning that they make use of words and ideas from human authors without referencing them.
  • GenAI may draw on unlawfully sourced libraries of content, and material generated from AI may infringe copyright or intellectual property rights.
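
As a small, made-up illustration of the point about plausible-looking code (it is not taken from any real GenAI output): the function below is the kind of answer a GenAI tool might give if asked to compute the median of a list. It runs and looks reasonable, but it is wrong for lists with an even number of values, which is exactly the sort of error a quick human check should catch.

    # Hypothetical example of AI-style code that looks plausible but is wrong.
    # For an even-length list the median should be the average of the two
    # middle values; this version simply returns the upper middle element.
    def median(values):
        ordered = sorted(values)
        return ordered[len(ordered) // 2]

    print(median([3, 1, 2]))     # 2 - correct for an odd-length list
    print(median([1, 2, 3, 4]))  # 3 - wrong: the true median is 2.5

Writing a couple of quick checks like the two print lines above, or a proper unit test, is the kind of human verification this guidance has in mind before you rely on AI-suggested code.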

GenAI and academic misconduct

GenAI can be a valuable learning aid, but should never be used as a substitute for your own assessed work.

All work submitted for assessment should be your own original work. Some assessments may require you to sign a declaration to state this – please check with your Course Organiser on this point.

It is not acceptable to present AI-generated content as your own work. If you do, this will be regarded as academic misconduct.

Academic misconduct is defined by the University as:

the use of unfair means in any University assessment. Examples of misconduct include (but are not limited to) plagiarism, self-plagiarism (that is, submitting the same work for credit twice at the same or different institutions), collusion, falsification, cheating (including contract cheating, where a student pays for work to be written or edited by somebody else), deceit, and personation (that is, impersonating another student or allowing another person to impersonate a student in an assessment).