Understanding the AI Act & Open Source: Key Updates, March 2025

01 April 2025

Author: Ciarán

For anyone trying to understand the AI Act, and in particular those of us working with open source, this post will hopefully provide some useful starting points.

First, some good news: the European Commission recently published a third draft of the AI Act Code of Practice (explained below), and it fixes the two problems which would have prevented the Code of Practice from being used by free and open source software (FOSS) projects and companies. Within the EU institutions there is a growing understanding of how to work with FOSS and of why it matters for the EU’s digital sovereignty and for our opportunities to grow European IT businesses.

But first, let’s look at some of the main aspects of the legal framework around AI and how they relate to the AI Act.

On training data

For the question of whether the use of data to train an AI model requires permission from a copyright holder, two important texts are the Information Society Directive of 2001 (“the InfoSoc Directive”) and the Directive on Copyright in the Digital Single Market of 2019 (“the Copyright Directive”, also called “the DSM Directive”). The Copyright Directive contains, for example, the regime for “text and data mining” (articles 3 & 4, and recitals 5-18). The InfoSoc Directive also contains exceptions for temporary reproductions (articles 2 & 5, and recital 33), and this is interesting because there is EU case law on how it is to be interpreted (Infopaq, Infopaq II). Another important ruling, from a German court, is Kneschke v. LAION e.V., 2024, which looks at both the InfoSoc and Copyright Directives.

For training data, there are also GDPR issues to consider. This article doesn’t aim to cover those, other than to highlight the EDPB opinion of December 2024. That opinion discusses, among many other topics, the obligation to include the training data when evaluating the risk of identification.

The AI Act

On the AI Act itself, there are many topics to discuss, but here I would like to highlight two. One is the timeline. The press may have reported that the AI Act was finished in 2024, but for most purposes the AI Act still isn’t actually in effect. The AI Act was signed on 13 June 2024 and published in the EU’s Official Journal on 12 July, but its various provisions become applicable according to a multi-year timeline. A good resource for this is the implementation timeline on artificialintelligenceact.eu.

The second is the FOSS provisions. Article 2(12) says: “This Regulation does not apply to AI systems released under free and open-source licences, unless they are placed on the market or put into service as high-risk AI systems or as an AI system that falls under Article 5 or 50.” However, recital 103 says “AI components that are provided against a price or otherwise monetised (…) should not benefit from the exceptions provided to free and open-source AI”. Recitals 102, 103 and 104 also generally provide details which lighten the regulatory regime for FOSS, in particular with regard to transparency obligations.

The AI Office and implementation instruments

Turning to the execution of the AI Act, it’s worth looking at the AI Office and at the various instruments that are being created, or will be created, to implement the AI Act.

The AI Office was created by the European Commission’s January 2024 Decision Establishing the European AI Office. In that decision, article 3 explains the AI Office’s role in implementing the AI Act, and article 2 details further tasks. For example, it will “contribute to the strategic, coherent and effective Union approach to international initiatives on AI (…)”.

The AI Act also requires various legislative instruments such as implementing acts, guidance, harmonised standards and a code of practice. The harmonised standards might be the most consequential. (People with a technical background may associate standards with data formats, but in EU regulation standards are usually a set of instructions to comply with the legislation.) They are important because if you use the harmonised standards – if your AI system is “in conformity” with the harmonised standards – then you get a legal presumption of being in conformity with the AI Act (see articles 40, 41, 55). Use of the harmonised standards is voluntary, but this presumption of conformity is very valuable and usually makes life a lot easier. It is thus vital that FOSS projects and companies are able to use the harmonised standards. A certain amount of FOSS expertise is required during the drafting to ensure that the described procedures are not incompatible with the licensing or business models of FOSS.

The standards are generally drafted by an approved European standards organisation (an ESO). There are three approved ESOs, listed in Annex I of Regulation 1025: CEN, CENELEC, and ETSI. When the EU wants an ESO to draft one or more harmonised standards (sometimes referred to as “hENs”), it does so via a standardisation request. In 2023, the European Commission published the standardisation request for the AI Act, and CEN and CENELEC are currently working on those standards. When drafting is finished, if the European Commission approves the standards drafted by the ESO, they become “harmonised standards”.

And finally this brings us to the Code of Practice. Drafting the standards will take years, but the European Commission wanted the AI Act to become applicable after just one year, so a faster procedure is being used to develop another set of instructions, namely the Code of Practice (AI Act article 56), to suggest how to comply with the legislation. The Code of Practice doesn’t have the legal weight of standards, and following the Code of Practice will not confer on you a legal presumption of conformity with the AI Act. However, until the standards are ready, the Code of Practice will be the main instrument provided by the Commission to help you demonstrate compliance. The Code of Practice is to be completed by 2 May 2025. The first and second drafts contained two requirements which would be impossible for FOSS to implement (because they would require imposing use restrictions), but the third draft has fixed these two issues. The third draft’s copyright document adds, on the last page in “Measure I.2.5 (2)”, that “this Measure does not apply to general-purpose AI models that are released under a free and open-source licence”. And in the “use” section of the transparency document, it now notes that “none exists” is an acceptable answer for details about the acceptable use policy.