AI in Studies at Hanken – Guidelines for Students

Generative AI is changing how we learn, teach and create. At Hanken, we see AI as a tool that – when used responsibly – can improve learning, creativity and efficiency. At the same time, it is important to understand the legal, ethical and academic implications of AI.

At Hanken, we strive to strengthen students' AI knowledge, promote ethical and transparent use of generative AI, and ensure that AI supports – rather than replaces – learning. We safeguard academic integrity and originality and comply with the GDPR, Hanken's internal rules and the European AI Act.

To clearly show when and how generative AI may be used in studies, Hanken has developed a traffic light model – an educational framework that categorises tasks into six colour categories, from red (AI prohibited) to rainbow (thesis work with section-specific AI use). The model helps you understand what is permitted, what requires reporting, and how to use generative AI responsibly.

Hanken offers Microsoft Copilot as an AI tool for students. All students have access via their Hanken account, and the tool is safe to use – all data is handled in accordance with Hanken's data protection guidelines and GDPR.

The following principles apply to all use of generative AI in studies at Hanken. These are not recommendations – they are rules that must be followed:

Learning objectives first
AI must never replace learning. Assignments are designed to help you develop your own knowledge and skills. Examiners have the right to restrict or prohibit AI if it threatens the course objectives or learning outcomes.

Originality is a requirement
Everything you submit must be your own work. It is always prohibited to copy AI-generated content and present it as your own.

Transparency is mandatory
You must be able to explain when, how and why you have used AI. Some assignments require you to report this in a special template.

Ethical use is non-negotiable
AI use must always comply with Hanken's rules for academic integrity, data protection and copyright. You must act responsibly and cite tools correctly.

AI is a support tool – not a replacement
AI can help you develop ideas, structure text or improve language, but it must never replace your own analysis, reflection or understanding.

The traffic light model (below) is an educational framework that helps students understand when and how generative AI may be used in different types of assignments, courses or degree projects. It consists of six colour categories that indicate the degree of permitted AI use. As a student at Hanken, you must always follow the instructions for AI use in each assignment or course.

It is always the examiner who decides which category (colour) applies to a particular assignment or the entire course, and what is permitted within that framework.

Information about the AI category for a course or assignment can be found in Moodle – and if you are unsure, you should always contact your examiner for guidance.

Category | AI use | Example
🔴 Red | No AI use | Language test, personal reflections
🟡 Yellow | AI assistant | Outline, language review
🟢 Green | AI use allowed | Translation, background work
🔵 Blue | AI use required | Scenarios, simulations
🟣 Purple | Full AI integration | Creative projects, design
🌈 Rainbow | Thesis work – section-specific AI use | Planning, analysis, not conclusions

For assignments in the Yellow, Blue or Rainbow categories, you must complete an AI usage report. In it, you describe how AI has been used, how it has affected your work, and how you have reflected on the process. The report helps you demonstrate that you have followed ethical guidelines and Hanken's rules.

The examiner always decides whether the report is required for the assignment and how it should be submitted. You can find the report template below as a PDF guide and in an editable Word format. Don't forget to always list the AI tools you have used in the reference list; see the example below.

Example: OpenAI. (2025). ChatGPT (5 August version) [Large language model]. https://chat.openai.com/chat

The thesis is an independent and academically demanding process where the use of generative AI requires special consideration. In the Rainbow category, it is recognised that AI can be a valuable aid in certain parts of the work, but it must never replace your own analysis, reflection or conclusions.

You may use AI in areas such as:
•    Brainstorming and idea development
•    Structuring text or arguments
•    Language review and stylistic improvement
•    Data analysis or visualisation

However, other parts require independent work, such as:

•    Formulation of research questions
•    Critical discussion and interpretation of results
•    Conclusions and reflections

To use AI correctly in your thesis, you should:

•    Discuss AI use with your supervisor at an early stage
•    Document AI use in an AI log/report or in the methodology chapter
•    Save versions of text before and after AI use (e.g. if you have changed the language or structure using AI)
•    Be prepared to explain your work process and demonstrate your own contribution

The purpose of the Rainbow category is that you use AI as a support tool – not as a replacement – and clearly show what is your own work. By being transparent and reflective, you strengthen both your academic credibility and your competence.

Using generative AI in your studies requires responsibility, transparency and an understanding of academic integrity. Hanken prohibits cheating – even when it is done with the help of AI. You must never present AI-generated content as your own. Everything you submit must reflect your own thinking, and you must be able to show what is your own work.

The following are considered cheating:

•    Using AI in an assignment where it is prohibited (Red category)
•    Presenting AI-generated text as your own
•    Submitting AI-generated data, references or facts without acknowledging this

More information: Action Plan against Academic Dishonesty

To avoid suspicion of cheating:

•    Save all versions of your work until it has been approved and assessed
•    Follow Hanken's data management process, especially when using sensitive material
•    Describe openly and clearly how you have used AI in your report or methodology chapter

Examiners and supervisors have the right to use AI detection tools. These tools can sometimes produce false positive results, which makes it extra important that you can show how your work has developed – for example, by showing previous versions or explaining your work process.

Can I use ChatGPT or other AI tools to write my assignment?
It depends on which traffic light category the assignment belongs to. In the Red category, AI use is prohibited. In the other categories (Yellow, Green, Blue, Purple and Rainbow), AI use is permitted, recommended or required – but you must report it according to the instructions for the assignment. Hanken also recommends that all students use Microsoft Copilot, which handles data in accordance with Hanken's data protection guidelines.
 
What happens if I use AI in an assignment where it is prohibited?
It counts as cheating. You risk disciplinary action according to Hanken's action plan. AI detection tools can be used by the examiner, and even though they sometimes give false positives, you must be able to show your writing process and save versions of your work.

How do I report AI use correctly?
Use Hanken's AI report template. Describe:
•    What you used AI for
•    How much AI influenced your work
•    How you reflected on the role of AI
•    How you ensured academic integrity and data protection
•    How your work developed before and after AI

See the example (copying its content is prohibited):

What does AI literacy mean?
That you:
•    Understand how AI works and affects society
•    Can collaborate with AI in a critical and ethical manner
•    Use AI as a support – not as a replacement

Do I have to save old versions of my work?
Yes. Save all draft versions until the work has been approved and assessed – for example, include the date in the file name. This helps you show your own contribution and protects you against false accusations of cheating.

What is my responsibility if AI provides incorrect information?
You are always responsible for the content of your work – even if it has been generated by AI. You must verify facts and references. Submitting fabricated data or references is always cheating.

Can my teacher prohibit the use of AI?
Yes. Teachers have the right to prohibit AI if it hinders the achievement of the course learning outcomes. 

What AI tools does Hanken offer?
•    Microsoft Copilot (chatbot/language model)
•    Panopto (automatic transcription)
•    Feedback Fruits (peer review with AI support)
•    Scopus AI (literature search)