General principles
- Education providers are tasked with outlining the use of AI in their own organisations’ operations, such as study administration, teaching, learning and guidance. Education providers are also tasked with assessing the need for staff training and guidance.
- Education providers are responsible for the compliance of the AI systems that they provide or have procured for use in their operations. Education providers must assess the obligations imposed by legislation (including the AI Act, the General Data Protection Regulation, the Data Protection Act, the Copyright Act, the Act on Information Management in Public Administration, the Administrative Procedure Act, the Act on the Provision of Digital Services, procurement legislation, the Non-Discrimination Act, the Act on Equality between Women and Men and applicable legislation on education and training, such as the Act on Early Childhood Education and Care, the Act on Primary and Secondary Education, the Act on General Upper Secondary Education, the Act on Vocational Education and Training, the Act on Preparatory Education Leading to an Upper Secondary Qualification, the Act on Liberal Adult Education and the Act on Basic Education in the Arts) and determine how these obligations are met in practice.
- The use of AI must be in line with the national core curricula and qualification requirements, local curricula drawn up based on them and other statutory tasks of education providers.
- When using AI, education providers must ensure that learners are not subjected to marketing, advertising, ideological manipulation or to content or methods detrimental to the learners’ growth and development. Education providers must also take into account any age limits for services and applications utilising AI imposed by legislation or terms of use.
- As part of the deployment of AI systems, education providers must assess whether a deployment constitutes a procurement subject to procurement legislation and whether local procurement guidelines apply. Education providers must prepare operating instructions in advance on how various AI systems can be deployed locally, how their compliance is ensured and how the process is reliably documented.
- Education providers must actively promote equality and non-discrimination in their operations and ensure that the use of artificial intelligence does not result in unlawful discrimination, for example due to algorithmic or data bias.
Under the following legislation, the principles above entail, for example, the following:
AI Act
7. When planning the deployment of AI systems, education providers must assess whether the AI systems to be deployed fall within the scope of the EU AI Act. If the AI Act applies, the education provider must define the ways in which the system is used and determine its risk level. Based on the risk level, the education provider must then determine the obligations associated with the deployment of the system and plan their implementation.
8. Education providers that use AI systems subject to the AI Act must take measures to ensure, to the best of their ability, a sufficient level of AI literacy among their staff.
Data protection, information security and information management
9. When considering the deployment of AI systems, education providers must verify that the system's data protection, information security and information management comply with legislative requirements. This verification process must be documented and carried out before the deployment of the system. The system must comply with the requirements until it is decommissioned. The verification process must take into account the education provider's obligation to carry out a data protection impact assessment and the prior consultation of the data protection authority in the cases mentioned in legislation.
10. Education providers must instruct their staff and learners not to enter confidential or non-public information or personal data into an AI system until the system has been verified to meet the requirements for data protection, information security and information management. Such information and data include names, personal identity codes, verbal assessments of learners, pedagogical documents, health information and information related to religious beliefs. It is important to note that the concept of personal data is broad and also covers things such as images and the learner’s voice (such as speech or singing) when the image or voice can be linked to the learner. AI systems may also collect a wide range of data on their users and on the use of the system (including through monitoring technologies), and such data are classified as personal data.
11. Education providers must ensure that their staff and learners are informed of the processing of personal data in a concise, transparent, easily understandable and clear format, taking into account the needs and age of the target group.
Copyright
12. The Copyright Act (including its provisions concerning education) also applies to the educational use of AI applications. Education providers are obliged to take copyright issues into account in the procurement, deployment and governance of AI applications and systems and in staff guidelines.
13. Using copyrighted works to train AI without the copyright holder’s permission is prohibited. Content that is not protected by copyright, as well as open content whose use is not subject to restrictive conditions, can be entered into AI applications. Plenty of copyright-free material (such as CC0-licensed material) suitable for educational use and for training AI is available.
14. Content generated by AI is generally not protected by copyright because it does not meet the criteria for human creative work. To qualify for copyright protection, the final content must be primarily created by a human and exceed the threshold of originality.