AI POLICY FOR STUDENTS AT THE UNIVERSITY OF EASTERN FINLAND
Last modified: 25.02.2025
The University of Eastern Finland follows an AI policy, which is introduced briefly on this page. The full version of the AI policy is available on the university’s intranet.
The use of AI does not absolve students of responsibility for the content and quality of their study attainments.
Guidelines in brief:
- AI can be used to support learning in study attainments, unless forbidden by the teacher.
- The use of a generative AI application or similar to support the preparation of study attainments must always be mentioned.
- If AI is used against the teacher’s instructions or the use of AI is not mentioned, the matter will be investigated in accordance with the university’s fraud process.
- The university recommends using the Copilot AI application that can be found in the app launcher via UEF login.
- A UEF email address must not be used when registering for external AI applications.
The use of AI to support learning
As a rule, AI can be used to support learning and the preparation of study attainments.
In these guidelines, AI and AI applications refer, in particular, to generative artificial intelligence that is based on large language models (both separate AI applications and add-ons integrated into other software that utilise generative AI) or similar AI applications. Such AI applications include ChatGPT and Copilot, but not, for example, the spell check in Word.
The use of AI applications in study attainments may be prohibited in the course description, or in writing by the teacher in charge at the beginning of the course, if there is a risk that the use of AI might have a negative impact on achieving the learning outcomes.
The teacher may give instructions on the use of AI, e.g., in the course description, in the learning environment (e.g., eLearn Moodle), by email or in the instructions for an individual assignment.
The teacher may, for example, give instructions on how AI applications can be used during the course in the brainstorming or information acquisition phase, or in language checking, stylistic improvement or translation.
If an AI application is used in the preparation of study assignments or other assignments, the student must report, in writing, which application was used and in which manner it was used.
Students should, therefore, always report how a generative AI application or similar was used in the study attainment. If a student uses an AI application during a course or as part of a study attainment (e.g., an exam or assignment) where the use of AI has been forbidden in advance, the student’s activities are deemed to be fraudulent and will be treated in the same way as other cases of fraudulent conduct. The same applies to situations where the student does not report the use of AI. The use of AI applications can be investigated, e.g., by means of an electronic plagiarism identification system or by using suitable random checks.
If there is reason to suspect AI-related fraud, the teacher will first investigate the nature of the fraud (its scope, intent, recurrence and systematic nature). Depending on the nature of the fraud, the teacher either instructs the student on the use of AI or refers the matter to the Dean of the faculty to determine the necessary disciplinary measures (including a written notification, a written warning and a fixed-term suspension). The university’s fraud investigation process is described separately.
Where required, the faculty, unit or independent institute (e.g., the Language Centre) may issue supplementary guidelines on the use of AI in their own teaching.
AI applications
The university offers Microsoft Copilot Enterprise to students over 18 years of age. Copilot refers to the Copilot service that is available via UEF login (login to the UEF Microsoft account), not to any other AI application known as ‘Copilot’.
The Copilot service can be accessed through the app launcher.
Students may also use AI applications other than those provided by the university. In such cases, students are responsible for ensuring the lawfulness of their activities (including how the information provided is used in the service) and for finding user instructions for the application. A UEF email address, or the same user ID and password combination used for logging into the university’s network, must never be used when logging into these external AI applications.
Teachers may only require their students to use AI applications that are free of charge to students. Teachers cannot require their students to create user IDs for AI applications that require the disclosure of personal data (e.g., an email address and contact information that can be linked to a person).
The use of AI in theses
If an AI application is used in a thesis, the student must report, in writing, which application was used and in which manner it was used.
Theses must not be completed entirely by using an AI application. Each thesis must include a sufficient amount of independent work to ensure that the learning outcomes set for the thesis are achieved.
The sufficiency of the independent part of the thesis is assessed as part of the evaluation process of the thesis.
When AI is used in thesis work, particular attention must be paid to legislation related to copyright, personal data processing and confidential information, as well as to the university’s instructions. The thesis must be drawn up in accordance with the relevant sections of the AI policy on research. The AI policy is available on the university’s intranet.
Maturity tests must be carried out in such a way that students cannot use AI in writing them.
Faculties and departments/schools will provide more detailed instructions on the maturity test.
Responsibility for content produced by using AI
The content and accuracy of materials produced by using AI are the responsibility of the student or other person who produced or published them. Artificial intelligence can make up information (hallucinate).
The university is not responsible for the content, reliability or ethics of the material produced by AI applications. The user of AI is always responsible for the use of AI and for assessing its ethicality and reliability.
As a rule, AI tools should not be used to produce material whose accuracy the student concerned cannot verify.
Copyright and rights of use
AI must not be listed as the author of texts or other outputs.
AI applications may have different terms and conditions regarding rights of use. When AI applications are used, it must be ensured that the terms and conditions of the application allow the material to be used in the manner intended by the user.
The copyright of a work produced by an AI application (e.g., an image, text, video or audio recording, or composition) does not always belong to the user of the AI application, such as the student. For the work to be protected by copyright, the author’s own contribution must be evident in the resulting work.
The ownership of a work or material used as background material always remains with the original owner or author, regardless of the use of an AI application. When prompting an AI application, the user must ensure that no material is fed into the application as background material unless the user holds the copyright to it, has a licence granted by the copyright holder, or the work is publicly available without restrictions.
This applies, in particular, to scientific texts, study materials, imaging materials and images, such as paintings and photographs.
AI applications may require that the application owner be granted, e.g., a parallel right of use to results generated by AI.
Personal data processing in AI applications
Personal data means any information relating to an identified or identifiable natural person.
Material that contains personal data must not be exported to AI applications, except when the university has concluded an agreement with the provider of the AI application that contains conditions related to the processing of personal data (e.g., Copilot used under the UEF Microsoft account).
If the material contains personal data, the research data cannot be analysed using an AI application without first conducting a data protection impact assessment (PIA/DPIA) appropriate to the specific purpose of the processing. Students often process personal data in connection with their thesis, and in such cases they must carry out a data protection impact assessment.
Data subjects (e.g., research subjects, students and job applicants) must always be informed of the processing of their personal data by means of AI.
Processing of confidential information
Confidential information must not be entered into or processed in external AI applications. Confidential information may, however, be fed into Copilot when Copilot is used via UEF login (UEF Microsoft account). The same applies to other AI applications and application add-ons that have been procured in a centralised manner through Digital Services.
Confidential material includes, e.g., research plans and research material (e.g., interview transcripts), study attainments (e.g., assignments, essays, examination answers and learning diaries), and research background material. Confidential materials are defined, in particular, in the Act on the Openness of Government Activities (621/1999, section 24).
Sustainability and responsibility
The goals and obligations related to sustainability and responsibility should be observed in the use of AI.
The use of AI consumes considerably more energy than ordinary online searches. For example, an AI-based search can consume up to ten times more energy than a query in a traditional search engine.