Chapter III – Due diligence obligations for a transparent and safe online environment (Art. 11-48)
Art. 11 DSA - Points of contact for Member States’ authorities, the Commission and the Board
- Providers of intermediary services shall designate a single point of contact to enable them to communicate directly, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 61 for the application of this Regulation.
- Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact. That information shall be easily accessible, and shall be kept up to date.
- Providers of intermediary services shall specify in the information referred to in paragraph 2 the official language or languages of the Member States which, in addition to a language broadly understood by the largest possible number of Union citizens, can be used to communicate with their points of contact, and which shall include at least one of the official languages of the Member State in which the provider of intermediary services has its main establishment or where its legal representative resides or is established.
Recital 42
In order to facilitate smooth and efficient two-way communications, including, where relevant, by acknowledging the receipt of such communications, relating to matters covered by this Regulation, providers of intermediary services should be required to designate a single electronic point of contact and to publish and update relevant information relating to that point of contact, including the languages to be used in such communications. The electronic point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the electronic point of contact should serve operational purposes and should not be required to have a physical location. Providers of intermediary services can designate the same single point of contact for the requirements of this Regulation as well as for the purposes of other acts of Union law. When specifying the languages of communication, providers of intermediary services are encouraged to ensure that the languages chosen do not in themselves constitute an obstacle to communication. Where necessary, it should be possible for providers of intermediary services and Member States’ authorities to reach a separate agreement on the language of communication, or to seek alternative means to overcome the language barrier, including by using all available technological means or internal and external human resources.
Art. 12 DSA - Points of contact for recipients of the service
- Providers of intermediary services shall designate a single point of contact to enable recipients of the service to communicate directly and rapidly with them, by electronic means and in a user-friendly manner, including by allowing recipients of the service to choose the means of communication, which shall not solely rely on automated tools.
- In addition to the obligations provided under Directive 2000/31/EC, providers of intermediary services shall make public the information necessary for the recipients of the service in order to easily identify and communicate with their single points of contact. That information shall be easily accessible, and shall be kept up to date.
Recital 43
Providers of intermediary services should also be required to designate a single point of contact for recipients of services, enabling rapid, direct and efficient communication in particular by easily accessible means such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging. It should be explicitly indicated when a recipient of the service communicates with chatbots. Providers of intermediary services should allow recipients of services to choose means of direct and efficient communication which do not solely rely on automated tools. Providers of intermediary services should make all reasonable efforts to guarantee that sufficient human and financial resources are allocated to ensure that this communication is performed in a timely and efficient manner.
Art. 13 DSA - Legal representatives
- Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services.
- Providers of intermediary services shall mandate their legal representatives for the purpose of being addressed in addition to or instead of such providers, by the Member States’ competent authorities, the Commission and the Board, on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with necessary powers and sufficient resources to guarantee their efficient and timely cooperation with the Member States’ competent authorities, the Commission and the Board, and to comply with such decisions.
- It shall be possible for the designated legal representative to be held liable for non-compliance with obligations under this Regulation, without prejudice to the liability and legal actions that could be initiated against the provider of intermediary services.
- Providers of intermediary services shall notify the name, postal address, email address and telephone number of their legal representative to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is publicly available, easily accessible, accurate and kept up to date.
- The designation of a legal representative within the Union pursuant to paragraph 1 shall not constitute an establishment in the Union.
Recital 44
Providers of intermediary services that are established in a third country and that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives to the relevant authorities and make it publicly available. In order to comply with that obligation, such providers of intermediary services should ensure that the designated legal representative has the necessary powers and resources to cooperate with the relevant authorities. This could be the case, for example, where a provider of intermediary services appoints a subsidiary undertaking of the same group as the provider, or its parent undertaking, if that subsidiary or parent undertaking is established in the Union. However, it might not be the case, for instance, when the legal representative is subject to reconstruction proceedings, bankruptcy, or personal or corporate insolvency. That obligation should allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for a legal representative to be mandated, in accordance with national law, by more than one provider of intermediary services. It should be possible for the legal representative to also function as a point of contact, provided the relevant requirements of this Regulation are complied with.
Art. 14 DSA - Terms and conditions
- Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.
- Providers of intermediary services shall inform the recipients of the service of any significant change to the terms and conditions.
- Where an intermediary service is primarily directed at minors or is predominantly used by them, the provider of that intermediary service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.
- Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.
- Providers of very large online platforms and of very large online search engines shall provide recipients of services with a concise, easily-accessible and machine-readable summary of the terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.
- Very large online platforms and very large online search engines within the meaning of Article 33 shall publish their terms and conditions in the official languages of all the Member States in which they offer their services.
Recital 45
Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. Providers of the intermediary services should clearly indicate and maintain up-to-date in their terms and conditions the information as to the grounds on the basis of which they may restrict the provision of their services. In particular, they should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint-handling system. They should also provide easily accessible information on the right to terminate the use of the service. Providers of intermediary services may use graphical elements in their terms of service, such as icons or images, to illustrate the main elements of the information requirements set out in this Regulation. Providers should inform recipients of their service through appropriate means of significant changes made to terms and conditions, for instance when they modify the rules on information that is permitted on their service, or other such changes which could directly impact the ability of the recipients to make use of the service.
Recital 46
Providers of intermediary services that are primarily directed at minors, for example through the design or marketing of the service, or which are used predominantly by minors, should make particular efforts to render the explanation of their terms and conditions easily understandable to minors.
Recital 47
When designing, applying and enforcing those restrictions, providers of intermediary services should act in a non-arbitrary and non-discriminatory manner and take into account the rights and legitimate interests of the recipients of the service, including fundamental rights as enshrined in the Charter. For example, providers of very large online platforms should in particular pay due regard to freedom of expression and of information, including media freedom and pluralism. All providers of intermediary services should also pay due regard to relevant international standards for the protection of human rights, such as the United Nations Guiding Principles on Business and Human Rights.
Recital 48
Given their special role and reach, it is appropriate to impose on very large online platforms and very large online search engines additional requirements regarding information and transparency of their terms and conditions. Consequently, providers of very large online platforms and very large online search engines should provide their terms and conditions in the official languages of all Member States in which they offer their services and should also provide recipients of the services with a concise and easily readable summary of the main elements of the terms and conditions. Such summaries should identify the main elements of the information requirements, including the possibility of easily opting out from optional clauses.
Art. 15 DSA - Transparency reporting obligations for providers of intermediary services
- Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:
- for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;
- for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;
- for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;
- for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;
- any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.
- Paragraph 1 of this Article shall not apply to providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online platforms within the meaning of Article 33 of this Regulation.
- The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article, including harmonised reporting periods. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.
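Until the Commission adopts harmonised templates under paragraph 3, providers must structure these reports themselves. As a purely illustrative sketch (not an official schema; all field names are assumptions), the reporting categories of Article 15(1) can be modelled as a simple record:

```python
# Hypothetical data structure mirroring the reporting categories of
# Art. 15(1) DSA. Field names are illustrative assumptions, not an
# official Commission template.
from dataclasses import dataclass, field


@dataclass
class TransparencyReport:
    # Point (a): orders from Member States' authorities (Arts. 9 and 10),
    # categorised by type of illegal content and issuing Member State.
    orders_received: dict = field(default_factory=dict)  # e.g. {("hate_speech", "DE"): 12}
    median_days_to_acknowledge_order: float = 0.0
    median_days_to_give_effect_to_order: float = 0.0
    # Point (b): notices under Art. 16 (providers of hosting services only).
    notices_received: int = 0
    notices_from_trusted_flaggers: int = 0
    notices_processed_by_automated_means: int = 0
    # Point (c): own-initiative content moderation, by detection method
    # and type of restriction applied.
    own_initiative_measures: dict = field(default_factory=dict)
    # Point (d): internal complaint-handling (Art. 20 for online platforms).
    complaints_received: int = 0
    decisions_reversed: int = 0
    # Point (e): qualitative indicators for automated moderation.
    automated_moderation_error_rate: float = 0.0
```

A report for one period would then be assembled as, for example, `TransparencyReport(notices_received=1200, notices_from_trusted_flaggers=80)`, with the dictionaries keyed by the categorisations the Article requires.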
Recital 49
To ensure an adequate level of transparency and accountability, providers of intermediary services should make publicly available an annual report in a machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation in which they engage, including the measures taken as a result of the application and enforcement of their terms and conditions. However, in order to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro or small enterprises as defined in Commission Recommendation 2003/361/EC (1) and which are not very large online platforms within the meaning of this Regulation.
(1) Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Art. 16 DSA - Notice and action mechanisms
- Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means.
- The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers of hosting services shall take the necessary measures to enable and to facilitate the submission of notices containing all of the following elements:
- a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be illegal content;
- a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content adapted to the type of content and to the specific type of hosting service;
- the name and email address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;
- a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
- Notices referred to in this Article shall be considered to give rise to actual knowledge or awareness for the purposes of Article 6 in respect of the specific item of information concerned where they allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.
- Where the notice contains the electronic contact information of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.
- The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision.
- Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 5.
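The elements listed in paragraph 2 lend themselves to a simple completeness check at the point of submission. The following is a non-authoritative sketch (the dictionary keys and the function are assumptions for illustration, not a prescribed interface):

```python
# Illustrative validator for the notice elements of Art. 16(2)(a)-(d) DSA.
# Field names are hypothetical; the DSA prescribes the content, not a format.

def is_complete_notice(notice: dict) -> bool:
    """Return True if the notice carries all elements of Art. 16(2)."""
    # (a) sufficiently substantiated explanation of alleged illegality
    if not notice.get("explanation"):
        return False
    # (b) clear indication of the exact electronic location (e.g. URLs)
    if not notice.get("urls"):
        return False
    # (c) name and email of the notifier, EXCEPT where the notice concerns
    #     offences under Articles 3 to 7 of Directive 2011/93/EU
    if not notice.get("offence_2011_93_EU", False):
        if not (notice.get("name") and notice.get("email")):
            return False
    # (d) bona fide statement that the information and allegations
    #     contained in the notice are accurate and complete
    return bool(notice.get("bona_fide_statement", False))
```

For example, a notice lacking a URL fails the check, while a notice concerning an offence under Articles 3 to 7 of Directive 2011/93/EU passes without notifier identity, reflecting the carve-out in point (c).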
Recital 50
Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned (‘notice’), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content (‘action’). Such mechanisms should be clearly identifiable, located close to the information in question and at least as easy to find and use as notification mechanisms for content that violates the terms and conditions of the hosting service provider. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. The notification mechanism should allow, but not require, the identification of the individual or the entity submitting a notice. For some types of items of information notified, the identity of the individual or the entity submitting a notice might be necessary to determine whether the information in question constitutes illegal content, as alleged. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as hosting services covered by this Regulation.
Recital 51
Having regard to the need to take due account of the fundamental rights guaranteed under the Charter of all parties concerned, any action taken by a provider of hosting services pursuant to receiving a notice should be strictly targeted, in the sense that it should serve to remove or disable access to the specific items of information considered to constitute illegal content, without unduly affecting the freedom of expression and of information of recipients of the service. Notices should therefore, as a general rule, be directed to the providers of hosting services that can reasonably be expected to have the technical and operational ability to act against such specific items. The providers of hosting services who receive a notice for which they cannot, for technical or operational reasons, remove the specific item of information should inform the person or entity who submitted the notice.
Recital 52
The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and non-arbitrary processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. Those fundamental rights include but are not limited to: for the recipients of the service, the right to freedom of expression and of information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy; for the service providers, the freedom to conduct a business, including the freedom of contract; for parties affected by illegal content, the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination. Providers of hosting services should act upon notices in a timely manner, in particular by taking into account the type of illegal content being notified and the urgency of taking action. For instance, such providers can be expected to act without delay when allegedly illegal content involving a threat to life or safety of persons is being notified. The provider of hosting services should inform the individual or entity notifying the specific content without undue delay after taking a decision whether or not to act upon the notice.
Recital 53
The notice and action mechanisms should allow for the submission of notices which are sufficiently precise and adequately substantiated to enable the provider of hosting services concerned to take an informed and diligent decision, compatible with the freedom of expression and of information, in respect of the content to which the notice relates, in particular whether or not that content is to be considered illegal content and is to be removed or access thereto is to be disabled. Those mechanisms should be such as to facilitate the provision of notices that contain an explanation of the reasons why the individual or the entity submitting a notice considers that content to be illegal content, and a clear indication of the location of that content. Where a notice contains sufficient information to enable a diligent provider of hosting services to identify, without a detailed legal examination, that it is clear that the content is illegal, the notice should be considered to give rise to actual knowledge or awareness of illegality. Except for the submission of notices relating to offences referred to in Articles 3 to 7 of Directive 2011/93/EU of the European Parliament and of the Council (1), those mechanisms should ask the individual or the entity submitting a notice to disclose its identity in order to avoid misuse.
(1) Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Art. 17 DSA - Statement of reasons
- Providers of hosting services shall provide a clear and specific statement of reasons to any affected recipients of the service for any of the following restrictions imposed on the ground that the information provided by the recipient of the service is illegal content or incompatible with their terms and conditions:
- any restrictions of the visibility of specific items of information provided by the recipient of the service, including removal of content, disabling access to content, or demoting content;
- suspension, termination or other restriction of monetary payments;
- suspension or termination of the provision of the service in whole or in part;
- suspension or termination of the recipient of the service’s account.
- Paragraph 1 shall only apply where the relevant electronic contact details are known to the provider. It shall apply at the latest from the date that the restriction is imposed, regardless of why or how it was imposed. Paragraph 1 shall not apply where the information is deceptive high-volume commercial content.
- The statement of reasons referred to in paragraph 1 shall at least contain the following information:
- information on whether the decision entails either the removal of, the disabling of access to, the demotion of or the restriction of the visibility of the information, or the suspension or termination of monetary payments related to that information, or imposes other measures referred to in paragraph 1 with regard to the information, and, where relevant, the territorial scope of the decision and its duration;
- the facts and circumstances relied on in taking the decision, including, where relevant, information on whether the decision was taken pursuant to a notice submitted in accordance with Article 16 or based on voluntary own-initiative investigations and, where strictly necessary, the identity of the notifier;
- where applicable, information on the use made of automated means in taking the decision, including information on whether the decision was taken in respect of content detected or identified using automated means;
- where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;
- where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider of hosting services, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground;
- clear and user-friendly information on the possibilities for redress available to the recipient of the service in respect of the decision, in particular, where applicable through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.
- The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the possibilities for redress referred to in paragraph 3, point (f).
- This Article shall not apply to any orders referred to in Article 9.
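The minimum content of paragraph 3 is, in effect, a record with mandatory and conditional fields. The sketch below is hypothetical (the field names are assumptions; the DSA Transparency Database operated by the Commission defines its own official submission schema):

```python
# Hypothetical record for the minimum content of a statement of reasons
# under Art. 17(3) DSA. Illustrative only; not the official DSA
# Transparency Database schema.
from dataclasses import dataclass
from typing import Optional


@dataclass
class StatementOfReasons:
    restriction_type: str              # point (a): removal, demotion, suspension, etc.
    territorial_scope: Optional[str]   # point (a), where relevant
    duration: Optional[str]            # point (a), where relevant
    facts_and_circumstances: str       # point (b)
    triggered_by_notice: bool          # point (b): Art. 16 notice vs own initiative
    automated_detection: bool          # point (c): content detected by automated means
    automated_decision: bool           # point (c): decision taken by automated means
    legal_ground: Optional[str]        # point (d): for allegedly illegal content
    contractual_ground: Optional[str]  # point (e): for T&C incompatibility
    redress_options: list              # point (f): internal complaints, out-of-court
                                       # dispute settlement, judicial redress

    def is_grounded(self) -> bool:
        # Points (d) and (e) require either a legal ground (illegal content)
        # or a contractual ground (terms-and-conditions incompatibility).
        return bool(self.legal_ground or self.contractual_ground)
```

The `is_grounded` check reflects that every statement must rest on at least one of the two grounds in paragraph 1: illegality or incompatibility with the terms and conditions.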
Recital 54
Where a provider of hosting services decides, on the ground that the information provided by the recipients is illegal content or is incompatible with its terms and conditions, to remove or disable access to information provided by a recipient of the service or to otherwise restrict its visibility or monetisation, for instance following receipt of a notice or acting on its own initiative, including exclusively by automated means, that provider should inform in a clear and easily comprehensible way the recipient of its decision, the reasons for its decision and the available possibilities for redress to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Where the decision was taken following receipt of a notice, the provider of hosting services should only reveal the identity of the person or entity who submitted the notice to the recipient of the service where this information is necessary to identify the illegality of the content, such as in cases of infringements of intellectual property rights.
Recital 55
Restriction of visibility may consist in demotion in ranking or in recommender systems, as well as in limiting accessibility by one or more recipients of the service or blocking the user from an online community without the user being aware (‘shadow banning’). The monetisation via advertising revenue of information provided by the recipient of the service can be restricted by suspending or terminating the monetary payment or revenue associated to that information. The obligation to provide a statement of reasons should however not apply with respect to deceptive high-volume commercial content disseminated through intentional manipulation of the service, in particular inauthentic use of the service such as the use of bots or fake accounts or other deceptive uses of the service. Irrespective of other possibilities to challenge the decision of the provider of hosting services, the recipient of the service should always have a right to effective remedy before a court in accordance with the national law.
Art. 18 DSA - Notification of suspicions of criminal offences
- Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
- Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or where its legal representative resides or is established or inform Europol, or both.
- For the purpose of this Article, the Member State concerned shall be the Member State in which the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
Recital 56
A provider of hosting services may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the provider of hosting services is aware, the suspicion that that recipient may have committed, may be committing or is likely to commit a criminal offence involving a threat to the life or safety of person or persons, such as offences specified in Directive 2011/36/EU of the European Parliament and of the Council (1), Directive 2011/93/EU or Directive (EU) 2017/541 of the European Parliament and of the Council (2). For example, specific items of content could give rise to a suspicion of a threat to the public, such as incitement to terrorism within the meaning of Article 21 of Directive (EU) 2017/541. In such instances, the provider of hosting services should inform without delay the competent law enforcement authorities of such suspicion. The provider of hosting services should provide all relevant information available to it, including, where relevant, the content in question and, if available, the time when the content was published, including the designated time zone, an explanation of its suspicion and the information necessary to locate and identify the relevant recipient of the service. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by providers of hosting services. Providers of hosting services should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
(1) Directive 2011/36/EU of the European Parliament and of the Council of 5 April 2011 on preventing and combating trafficking in human beings and protecting its victims, and replacing Council Framework Decision 2002/629/JHA (OJ L 101, 15.4.2011, p. 1).
(2) Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
Art. 19 DSA - Exclusion for micro and small enterprises
- This Section, with the exception of Article 24(3) thereof, shall not apply to providers of online platforms that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC.
This Section, with the exception of Article 24(3) thereof, shall not apply to providers of online platforms that previously qualified for the status of a micro or small enterprise as defined in Recommendation 2003/361/EC during the 12 months following their loss of that status pursuant to Article 4(2) thereof, except when they are very large online platforms in accordance with Article 33.
- By derogation from paragraph 1 of this Article, this Section shall apply to providers of online platforms that have been designated as very large online platforms in accordance with Article 33, irrespective of whether they qualify as micro or small enterprises.
Recital 57
To avoid disproportionate burdens, the additional obligations imposed under this Regulation on providers of online platforms, including platforms allowing consumers to conclude distance contracts with traders, should not apply to providers that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC. For the same reason, those additional obligations should also not apply to providers of online platforms that previously qualified as micro or small enterprises during a period of 12 months after they lose that status. Such providers should not be excluded from the obligation to provide information on the average monthly active recipients of the service at the request of the Digital Services Coordinator of establishment or the Commission. However, considering that very large online platforms or very large online search engines have a larger reach and a greater impact in influencing how recipients of the service obtain information and communicate online, such providers should not benefit from that exclusion, irrespective of whether they qualify or recently qualified as micro or small enterprises. The consolidation rules laid down in Recommendation 2003/361/EC help ensure that any circumvention of those additional obligations is prevented. Nothing in this Regulation precludes providers of online platforms that are covered by that exclusion from setting up, on a voluntary basis, a system that complies with one or more of those obligations.
Art. 20 DSA - Internal complaint-handling system
- Providers of online platforms shall provide recipients of the service, including individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system that enables them to lodge complaints, electronically and free of charge, against the decision taken by the provider of the online platform upon the receipt of a notice or against the following decisions taken by the provider of the online platform on the grounds that the information provided by the recipients constitutes illegal content or is incompatible with its terms and conditions:
- decisions whether or not to remove or disable access to or restrict visibility of the information;
- decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
- decisions whether or not to suspend or terminate the recipients’ account;
- decisions whether or not to suspend, terminate or otherwise restrict the ability to monetise information provided by the recipients.
- The period of at least six months referred to in paragraph 1 of this Article shall start on the day on which the recipient of the service is informed about the decision in accordance with Article 16(5) or Article 17.
- Providers of online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
- Providers of online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the provider of the online platform to consider that its decision not to act upon the notice is unfounded or that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the measure taken, it shall reverse its decision referred to in paragraph 1 without undue delay.
- Providers of online platforms shall inform complainants without undue delay of their reasoned decision in respect of the information to which the complaint relates and of the possibility of out-of-court dispute settlement provided for in Article 21 and other available possibilities for redress.
- Providers of online platforms shall ensure that the decisions, referred to in paragraph 5, are taken under the supervision of appropriately qualified staff, and not solely on the basis of automated means.
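For compliance tooling, the window in which a complaint must be accepted under Article 20 can be modelled directly: the minimum period of six months runs from the day the recipient is informed of the decision (Article 20(2)). The sketch below is a hypothetical helper, not part of the Regulation; the function name and the 183-day approximation of "six months" are assumptions for illustration.

```python
from datetime import date, timedelta

def complaint_window_open(informed_on: date, today: date,
                          window_days: int = 183) -> bool:
    """Return True while the minimum Article 20 complaint period is running.

    The period starts on the day the recipient was informed of the decision
    (Article 20(2)); 183 days is used here as a rough stand-in for the
    "at least six months" the Regulation requires.
    """
    return informed_on <= today <= informed_on + timedelta(days=window_days)

# Informed on 10 January 2024: a complaint on 1 May 2024 is still in time.
print(complaint_window_open(date(2024, 1, 10), date(2024, 5, 1)))  # True
```

Note that Article 20(1) sets a floor ("at least six months"), so a provider may accept complaints beyond this window; the check above only flags when the mandatory period has lapsed.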
Recital 58
Recipients of the service should be able to easily and effectively contest certain decisions of providers of online platforms concerning the illegality of content or its incompatibility with the terms and conditions that negatively affect them. Therefore, providers of online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions that aim to ensure that the systems are easily accessible and lead to swift, non-discriminatory, non-arbitrary and fair outcomes, and are subject to human review where automated means are used. Such systems should enable all recipients of the service to lodge a complaint and should not set formal requirements, such as referral to specific, relevant legal provisions or elaborate legal explanations. Recipients of the service who submitted a notice through the notice and action mechanism provided for in this Regulation or through the notification mechanism for content that violates the terms and conditions of the provider of online platforms should be entitled to use the complaint mechanism to contest the decision of the provider of online platforms on their notices, including when they consider that the action taken by that provider was not adequate. The possibility to lodge a complaint for the reversal of the contested decisions should be available for at least six months, to be calculated from the moment at which the provider of online platforms informed the recipient of the service of the decision.
Art. 21 DSA - Out-of-court dispute settlement
- Recipients of the service, including individuals or entities that have submitted notices, addressed by the decisions referred to in Article 20(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 3 of this Article in order to resolve disputes relating to those decisions, including complaints that have not been resolved by means of the internal complaint-handling system referred to in that Article.
Providers of online platforms shall ensure that information about the possibility for recipients of the service to have access to an out-of-court dispute settlement, as referred to in the first subparagraph, is easily accessible on their online interface, clear and user-friendly.
The first subparagraph is without prejudice to the right of the recipient of the service concerned to initiate, at any stage, proceedings to contest those decisions by the providers of online platforms before a court in accordance with the applicable law.
- Both parties shall engage, in good faith, with the selected certified out-of-court dispute settlement body with a view to resolving the dispute.
Providers of online platforms may refuse to engage with such out-of-court dispute settlement body if a dispute has already been resolved concerning the same information and the same grounds of alleged illegality or incompatibility of content.
The certified out-of-court dispute settlement body shall not have the power to impose a binding settlement of the dispute on the parties.
- The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, for a maximum period of five years, which may be renewed, certify the body, at its request, where the body has demonstrated that it meets all of the following conditions:
- it is impartial and independent, including financially independent, of providers of online platforms and of recipients of the service provided by providers of online platforms, including of individuals or entities that have submitted notices;
- it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platform, allowing the body to contribute effectively to the settlement of a dispute;
- its members are remunerated in a way that is not linked to the outcome of the procedure;
- the out-of-court dispute settlement that it offers is easily accessible, through electronic communications technology and provides for the possibility to initiate the dispute settlement and to submit the requisite supporting documents online;
- it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one of the official languages of the institutions of the Union;
- the out-of-court dispute settlement that it offers takes place in accordance with clear and fair rules of procedure that are easily and publicly accessible, and that comply with applicable law, including this Article.
The Digital Services Coordinator shall, where applicable, specify in the certificate:
- the particular issues to which the body’s expertise relates, as referred to in point (b) of the first subparagraph; and
- the official language or languages of the institutions of the Union in which the body is capable of settling disputes, as referred to in point (e) of the first subparagraph.
- Certified out-of-court dispute settlement bodies shall report to the Digital Services Coordinator that certified them, on an annual basis, on their functioning, specifying at least the number of disputes they received, the information about the outcomes of those disputes, the average time taken to resolve them and any shortcomings or difficulties encountered. They shall provide additional information at the request of that Digital Services Coordinator.
Digital Services Coordinators shall, every two years, draw up a report on the functioning of the out-of-court dispute settlement bodies that they certified. That report shall in particular:
- list the number of disputes that each certified out-of-court dispute settlement body has received annually;
- indicate the outcomes of the procedures brought before those bodies and the average time taken to resolve the disputes;
- identify and explain any systematic or sectoral shortcomings or difficulties encountered in relation to the functioning of those bodies;
- identify best practices concerning that functioning;
- make recommendations as to how to improve that functioning, where appropriate.
Certified out-of-court dispute settlement bodies shall make their decisions available to the parties within a reasonable period of time and no later than 90 calendar days after the receipt of the complaint. In the case of highly complex disputes, the certified out-of-court dispute settlement body may, at its own discretion, extend the 90 calendar day period for an additional period that shall not exceed 90 days, resulting in a maximum total duration of 180 days.
- If the out-of-court dispute settlement body decides the dispute in favour of the recipient of the service, including the individual or entity that has submitted a notice, the provider of the online platform shall bear all the fees charged by the out-of-court dispute settlement body, and shall reimburse that recipient, including the individual or entity, for any other reasonable expenses that it has paid in relation to the dispute settlement. If the out-of-court dispute settlement body decides the dispute in favour of the provider of the online platform, the recipient of the service, including the individual or entity, shall not be required to reimburse any fees or other expenses that the provider of the online platform paid or is to pay in relation to the dispute settlement, unless the out-of-court dispute settlement body finds that that recipient manifestly acted in bad faith.
The fees charged by the out-of-court dispute settlement body to the providers of online platforms for the dispute settlement shall be reasonable and shall in any event not exceed the costs incurred by the body. For recipients of the service, the dispute settlement shall be available free of charge or at a nominal fee.
Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, known to the recipient of the service, including to the individuals or entities that have submitted a notice, and to the provider of the online platform concerned, before engaging in the dispute settlement.
- Member States may establish out-of-court dispute settlement bodies for the purposes of paragraph 1 or support the activities of some or all out-of-court dispute settlement bodies that they have certified in accordance with paragraph 3.
Member States shall ensure that any of their activities undertaken under the first subparagraph do not affect the ability of their Digital Services Coordinators to certify the bodies concerned in accordance with paragraph 3.
- A Digital Services Coordinator that has certified an out-of-court dispute settlement body shall revoke that certification if it determines, following an investigation either on its own initiative or on the basis of the information received by third parties, that the out-of-court dispute settlement body no longer meets the conditions set out in paragraph 3. Before revoking that certification, the Digital Services Coordinator shall afford that body an opportunity to react to the findings of its investigation and its intention to revoke the out-of-court dispute settlement body’s certification.
- Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 3, including where applicable the specifications referred to in the second subparagraph of that paragraph, as well as the out-of-court dispute settlement bodies the certification of which they have revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website that is easily accessible, and keep it up to date.
- This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive.
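The decision deadline above follows a fixed arithmetic: 90 calendar days from receipt of the complaint, extendable once for highly complex disputes by at most a further 90 days, for a maximum of 180 days in total. As a minimal sketch of that rule, assuming an invented function name (nothing here comes from the Regulation beyond the day counts):

```python
from datetime import date, timedelta

def decision_deadline(received_on: date, extension_days: int = 0) -> date:
    """Latest date for a certified body's decision under Article 21.

    Base period is 90 calendar days from receipt; for highly complex
    disputes the body may extend by up to a further 90 days (180 total).
    """
    if not 0 <= extension_days <= 90:
        raise ValueError("extension may not exceed 90 days")
    return received_on + timedelta(days=90 + extension_days)

# Complaint received 1 March 2024, full extension applied:
print(decision_deadline(date(2024, 3, 1), extension_days=90))  # 2024-08-28
```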
Recital 59
In addition, provision should be made for the possibility of engaging, in good faith, in the out-of-court dispute settlement of such disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The independence of the out-of-court dispute settlement bodies should be ensured also at the level of the natural persons in charge of resolving disputes, including through rules on conflict of interest. The fees charged by the out-of-court dispute settlement bodies should be reasonable, accessible, attractive, inexpensive for consumers and proportionate, and assessed on a case-by-case basis. Where an out-of-court dispute settlement body is certified by the competent Digital Services Coordinator, that certification should be valid in all Member States. Providers of online platforms should be able to refuse to engage in out-of-court dispute settlement procedures under this Regulation when the same dispute, in particular as regards the information concerned and the grounds for taking the contested decision, the effects of the decision and the grounds raised for contesting the decision, has already been resolved by or is already subject to an ongoing procedure before the competent court or before another competent out-of-court dispute settlement body. Recipients of the service should be able to choose between the internal complaint mechanism, an out-of-court dispute settlement and the possibility to initiate, at any stage, judicial proceedings. Since the outcome of the out-of-court dispute settlement procedure is not binding, the parties should not be prevented from initiating judicial proceedings in relation to the same dispute. 
The possibilities to contest decisions of providers of online platforms thus created should leave unaffected in all respects the possibility to seek judicial redress in accordance with the laws of the Member State concerned, and therefore should not affect the exercise of the right to an effective judicial remedy under Article 47 of the Charter. The provisions in this Regulation on out-of-court dispute settlement should not require Member States to establish such out-of-court settlement bodies.
Recital 60
For contractual consumer-to-business disputes regarding the purchase of goods or services, Directive 2013/11/EU ensures that Union consumers and businesses in the Union have access to quality-certified alternative dispute resolution entities. In this regard, it should be clarified that the rules of this Regulation on out-of-court dispute settlement are without prejudice to that Directive, including the right of consumers under that Directive to withdraw from the procedure at any stage if they are dissatisfied with the performance or the operation of the procedure.
Art. 22 DSA - Trusted flaggers
- Providers of online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay.
- The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:
- it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
- it is independent from any provider of online platforms;
- it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.
- Trusted flaggers shall publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report shall list at least the number of notices categorised by:
- the identity of the provider of hosting services,
- the type of allegedly illegal content notified,
- the action taken by the provider.
Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence.
Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.
- Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and email addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose trusted flagger status they have suspended in accordance with paragraph 6 or revoked in accordance with paragraph 7.
- The Commission shall publish the information referred to in paragraph 4 in a publicly available database, in an easily accessible and machine-readable format, and shall keep the database up to date.
- Where a provider of online platforms has information indicating that a trusted flagger has submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 16, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 20(4), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving the information from the provider of online platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. That investigation shall be carried out without undue delay.
- The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a provider of online platforms pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
- The Commission, after consulting the Board, shall, where necessary, issue guidelines to assist providers of online platforms and Digital Services Coordinators in the application of paragraphs 2, 6 and 7.
Recital 61
Action against illegal content can be taken more quickly and reliably where providers of online platforms take the necessary measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and non-arbitrary manner. Such trusted flagger status should be awarded by the Digital Services Coordinator of the Member State in which the applicant is established and should be recognised by all providers of online platforms within the scope of this Regulation. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content and that they work in a diligent, accurate and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and private or semi-public bodies such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. To avoid diminishing the added value of such mechanism, the overall number of trusted flaggers awarded in accordance with this Regulation should be limited. In particular, industry associations representing their members’ interests are encouraged to apply for the status of trusted flaggers, without prejudice to the right of private entities or individuals to enter into bilateral agreements with the providers of online platforms.
Recital 62
Trusted flaggers should publish easily comprehensible and detailed reports on notices submitted in accordance with this Regulation. Those reports should indicate information such as the number of notices categorised by the provider of hosting services, the type of content, and the action taken by the provider. Given that trusted flaggers have demonstrated expertise and competence, the processing of notices submitted by trusted flaggers can be expected to be less burdensome and therefore faster compared to notices submitted by other recipients of the service. However, the average time taken to process may still vary depending on factors including the type of illegal content, the quality of notices, and the actual technical procedures put in place for the submission of such notices.
For example, while the Code of conduct on countering illegal hate speech online of 2016 sets a benchmark for the participating companies with respect to the time needed to process valid notifications for removal of illegal hate speech, other types of illegal content may take considerably different timelines for processing, depending on the specific facts and circumstances and types of illegal content at stake. In order to avoid abuses of the trusted flagger status, it should be possible to suspend such status when a Digital Services Coordinator of establishment opened an investigation based on legitimate reasons. The rules of this Regulation on trusted flaggers should not be understood to prevent providers of online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council (1). The rules of this Regulation should not prevent the providers of online platforms from making use of such trusted flagger or similar mechanisms to take quick and reliable action against content that is incompatible with their terms and conditions, in particular against content that is harmful for vulnerable recipients of the service, such as minors.
(1) Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
Art. 23 DSA - Measures and protection against misuse
- Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
- Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
- When deciding on suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient of the service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online platforms. Those circumstances shall include at least the following:
- the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted within a given time frame;
- the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;
- the gravity of the misuses, including the nature of illegal content, and of its consequences;
- where it is possible to identify it, the intention of the recipient of the service, the individual, the entity or the complainant.
- Providers of online platforms shall set out, in a clear and detailed manner, in their terms and conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Recital 63
The misuse of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse, that need to respect the rights and legitimate interests of all parties involved, including the applicable fundamental rights and freedoms as enshrined in the Charter, in particular the freedom of expression. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded.
Recital 64
Under certain conditions, providers of online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by providers of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes, such as child sexual abuse material. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by providers of online platforms and they should be subject to oversight by the competent Digital Services Coordinator. Providers of online platforms should send a prior warning before deciding on the suspension, which should include the reasons for the possible suspension and the means of redress against the decision of the providers of the online platform. When deciding on the suspension, providers of online platforms should send the statement of reasons in accordance with the rules set out in this Regulation. The rules of this Regulation on misuse should not prevent providers of online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, including through the violation of their terms and conditions, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Art. 24 DSA - Transparency reporting obligations for providers of online platforms
- In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:
- the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;
- the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.
- By 17 February 2023 and at least once every six months thereafter, providers shall publish for each online platform or online search engine, in a publicly available section of their online interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted.
- Providers of online platforms or of online search engines shall communicate to the Digital Services Coordinator of establishment and the Commission, upon their request and without undue delay, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online platform or of the online search engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.
- When the Digital Services Coordinator of establishment has reasons to consider, based on the information received pursuant to paragraphs 2 and 3 of this Article, that a provider of online platforms or of online search engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it shall inform the Commission thereof.
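The averaging in paragraph 2 is plain arithmetic; the actual counting methodology is reserved to the delegated acts referred to in Article 33(3). A minimal sketch, assuming one deduplicated active-recipient count per calendar month (the function name and data shape are assumptions):

```python
def average_monthly_active_recipients(monthly_counts: list[int]) -> int:
    """Average monthly active recipients in the Union over the past six months.

    `monthly_counts` holds one deduplicated count per month, most recent last.
    The real methodology is laid down in the Article 33(3) delegated acts,
    where adopted; this shows only the six-month averaging of Art. 24(2).
    """
    last_six = monthly_counts[-6:]
    if len(last_six) < 6:
        raise ValueError("six full months of data are required")
    return round(sum(last_six) / 6)
```

The resulting figure is what the Digital Services Coordinator of establishment compares against the 45 million threshold of Article 33(1).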
- Providers of online platforms shall, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online platforms shall ensure that the information submitted does not contain personal data.
- The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.
Recital 65
In view of the particular responsibilities and obligations of providers of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms and online search engines may be very large online platforms or very large online search engines, respectively, that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms and online search engines should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union.
Recital 66
In order to ensure transparency and to enable scrutiny over the content moderation decisions of the providers of online platforms and monitoring the spread of illegal content online, the Commission should maintain and publish a database which contains the decisions and statements of reasons of the providers of online platforms when they remove or otherwise restrict availability of and access to information. In order to keep the database continuously updated, the providers of online platforms should submit, in a standard format, the decisions and statements of reasons without undue delay after taking a decision, to allow for real-time updates where technically possible and proportionate to the means of the online platform in question. The structured database should allow access to, and queries for, the relevant information, in particular as regards the type of alleged illegal content at stake.
Art. 25 DSA - Online interface design and organisation
- Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.
- The prohibition in paragraph 1 shall not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679.
- The Commission may issue guidelines on how paragraph 1 applies to specific practices, notably:
- giving more prominence to certain choices when asking the recipient of the service for a decision;
- repeatedly requesting that the recipient of the service make a choice where that choice has already been made, especially by presenting pop-ups that interfere with the user experience;
- making the procedure for terminating a service more difficult than subscribing to it.
Recital 67
Dark patterns on online interfaces of online platforms are practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them. Providers of online platforms should therefore be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof. This should include, but not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory, or other components, when asking the recipient of the service for a decision.
It should also include repeatedly requesting a recipient of the service to make a choice where such a choice has already been made, making the procedure of cancelling a service significantly more cumbersome than signing up to it, or making certain choices more difficult or time-consuming than others, making it unreasonably difficult to discontinue purchases or to sign out from a given online platform allowing consumers to conclude distance contracts with traders, and deceiving the recipients of the service by nudging them into decisions on transactions, or by default settings that are very difficult to change, and so unreasonably bias the decision making of the recipient of the service, in a way that distorts and impairs their autonomy, decision-making and choice. However, rules preventing dark patterns should not be understood as preventing providers to interact directly with recipients of the service and to offer new or additional services to them. Legitimate practices, for example in advertising, that are in compliance with Union law should not in themselves be regarded as constituting dark patterns. Those rules on dark patterns should be interpreted as covering prohibited practices falling within the scope of this Regulation to the extent that those practices are not already covered under Directive 2005/29/EC or Regulation (EU) 2016/679.
Art. 26 DSA - Advertising on online platforms
- Providers of online platforms that present advertisements on their online interfaces shall ensure that, for each specific advertisement presented to each individual recipient, the recipients of the service are able to identify, in a clear, concise and unambiguous manner and in real time, the following:
- that the information is an advertisement, including through prominent markings, which might follow standards pursuant to Article 44;
- the natural or legal person on whose behalf the advertisement is presented;
- the natural or legal person who paid for the advertisement if that person is different from the natural or legal person referred to in point (b);
- meaningful information directly and easily accessible from the advertisement about the main parameters used to determine the recipient to whom the advertisement is presented and, where applicable, about how to change those parameters.
- Providers of online platforms shall provide recipients of the service with a functionality to declare whether the content they provide is or contains commercial communications.
When the recipient of the service submits a declaration pursuant to this paragraph, the provider of online platforms shall ensure that other recipients of the service can identify in a clear and unambiguous manner and in real time, including through prominent markings, which might follow standards pursuant to Article 44, that the content provided by the recipient of the service is or contains commercial communications, as described in that declaration.
- Providers of online platforms shall not present advertisements to recipients of the service based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using special categories of personal data referred to in Article 9(1) of Regulation (EU) 2016/679.
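The per-advertisement disclosures of paragraph 1 and the prohibition of paragraph 3 can be pictured as a simple gate on each ad before it is presented. The record and check below are an illustrative sketch; the field names are assumptions and nothing here reflects a prescribed data format:

```python
from dataclasses import dataclass

# Illustrative per-advertisement record of the Art. 26(1) information.
@dataclass
class AdDisclosure:
    marked_as_ad: bool               # point (a): prominent marking as an advertisement
    on_behalf_of: str                # point (b): person on whose behalf it is presented
    paid_by: str                     # point (c): payer, where different from point (b)
    targeting_parameters: list[str]  # point (d): main parameters used, and how to change them
    uses_special_category_profiling: bool = False  # Art. 26(3) prohibition

def may_present(ad: AdDisclosure) -> bool:
    """True only if every Art. 26(1) disclosure is in place and Art. 26(3) is respected."""
    return (
        ad.marked_as_ad
        and bool(ad.on_behalf_of)
        and bool(ad.paid_by)
        and bool(ad.targeting_parameters)
        and not ad.uses_special_category_profiling
    )
```

In practice the disclosures must also be presented to the recipient in real time, in a clear, concise and unambiguous manner; this sketch checks only their presence.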
Recital 68
Online advertising plays an important role in the online environment, including in relation to the provision of online platforms, where the provision of the service is sometimes in whole or in part remunerated directly or indirectly, through advertising revenues. Online advertising can contribute to significant risks, ranging from advertisements that are themselves illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory presentation of advertisements with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, providers of online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is presented. They should ensure that the information is salient, including through standardised visual or audio marks, clearly identifiable and unambiguous for the average recipient of the service, and should be adapted to the nature of the individual service’s online interface. In addition, recipients of the service should have information directly accessible from the online interface where the advertisement is presented, on the main parameters used for determining that a specific advertisement is presented to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling.
Such explanations should include information on the method used for presenting the advertisement, for example whether it is contextual or another type of advertising, and, where applicable, the main profiling criteria used; it should also inform the recipient about any means available for them to change such criteria. The requirements of this Regulation on the provision of information relating to advertising are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein. Finally, this Regulation complements the application of Directive 2010/13/EU, which imposes measures to enable users to declare audiovisual commercial communications in user-generated videos. It also complements the obligations for traders regarding the disclosure of commercial communications deriving from Directive 2005/29/EC.
Recital 69
When recipients of the service are presented with advertisements based on targeting techniques optimised to match their interests and potentially appeal to their vulnerabilities, this can have particularly serious negative effects. In certain cases, manipulative techniques can negatively impact entire groups and amplify societal harms, for example by contributing to disinformation campaigns or by discriminating against certain groups. Online platforms are particularly sensitive environments for such practices and they present a higher societal risk. Consequently, providers of online platforms should not present advertisements based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679, using special categories of personal data referred to in Article 9(1) of that Regulation, including by using profiling categories based on those special categories. This prohibition is without prejudice to the obligations applicable to providers of online platforms or any other service provider or advertiser involved in the dissemination of the advertisements under Union law on protection of personal data.
Art. 27 DSA - Recommender system transparency
- Providers of online platforms that use recommender systems shall set out in their terms and conditions, in plain and intelligible language, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters.
- The main parameters referred to in paragraph 1 shall explain why certain information is suggested to the recipient of the service. They shall include, at least:
- the criteria which are most significant in determining the information suggested to the recipient of the service;
- the reasons for the relative importance of those parameters.
- Where several options are available pursuant to paragraph 1 for recommender systems that determine the relative order of information presented to recipients of the service, providers of online platforms shall also make available a functionality that allows the recipient of the service to select and to modify at any time their preferred option. That functionality shall be directly and easily accessible from the specific section of the online platform’s online interface where the information is being prioritised.
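Paragraphs 1 to 3 together amount to: disclose the main parameters of each ranking option, and let the recipient switch option at any time. The sketch below is illustrative only; the option names and parameter descriptions are assumptions, not taken from the Regulation:

```python
# Illustrative Art. 27 sketch: each ranking option is published together with
# its most significant criteria (Art. 27(1)-(2)); the names are assumptions.
RECOMMENDER_OPTIONS = {
    "personalised": ["interaction history", "account follows", "predicted interest"],
    "chronological": ["recency of the post"],
}

class RecommenderSettings:
    def __init__(self) -> None:
        self.selected = "personalised"  # assumed default ranking option

    def main_parameters(self) -> list[str]:
        """The most significant criteria for the currently active option."""
        return RECOMMENDER_OPTIONS[self.selected]

    def select(self, option: str) -> None:
        """Art. 27(3): the recipient may select and modify the option at any time."""
        if option not in RECOMMENDER_OPTIONS:
            raise ValueError(f"unknown option: {option}")
        self.selected = option
```

Under Article 27(3) such a switch must be directly and easily accessible from the section of the interface where the ranked information itself is shown, not buried in settings.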
Recital 70
A core part of the online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online, including to facilitate the search of relevant information for recipients of the service and contribute to an improved user experience. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, online platforms should consistently ensure that recipients of their service are appropriately informed about how recommender systems impact the way information is displayed, and can influence how information is presented to them. They should clearly present the parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients of the service understand how information is prioritised for them. Those parameters should include at least the most important criteria in determining the information suggested to the recipient of the service and the reasons for their respective importance, including where information is prioritised based on profiling and their online behaviour.
Art. 28 DSA - Online protection of minors
- Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service.
- Providers of online platforms shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.
- Compliance with the obligations set out in this Article shall not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor.
- The Commission, after consulting the Board, may issue guidelines to assist providers of online platforms in the application of paragraph 1.
Recital 71
The protection of minors is an important policy objective of the Union. An online platform can be considered to be accessible to minors when its terms and conditions permit minors to use the service, when its service is directed at or predominantly used by minors, or where the provider is otherwise aware that some of the recipients of its service are minors, for example because it already processes personal data of the recipients of its service revealing their age for other purposes. Providers of online platforms used by minors should take appropriate and proportionate measures to protect minors, for example by designing their online interfaces or parts thereof with the highest level of privacy, safety and security for minors by default where appropriate or adopting standards for protection of minors, or participating in codes of conduct for protecting minors. They should consider best practices and available guidance, such as that provided by the communication of the Commission on A Digital Decade for children and youth: the new European strategy for a better internet for kids (BIK+). Providers of online platforms should not present advertisements based on profiling using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor. In accordance with Regulation (EU) 2016/679, notably the principle of data minimisation as provided for in Article 5(1), point (c), thereof, this prohibition should not lead the provider of the online platform to maintain, acquire or process more personal data than it already has in order to assess if the recipient of the service is a minor. Thus, this obligation should not incentivize providers of online platforms to collect the age of the recipient of the service prior to their use. It should be without prejudice to Union law on protection of personal data.
Art. 29 DSA - Exclusion for micro and small enterprises
- This Section shall not apply to providers of online platforms allowing consumers to conclude distance contracts with traders that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC.
This Section shall not apply to providers of online platforms allowing consumers to conclude distance contracts with traders that previously qualified for the status of a micro or small enterprise as defined in Recommendation 2003/361/EC during the 12 months following their loss of that status pursuant to Article 4(2) thereof, except when they are very large online platforms in accordance with Article 33.
- By derogation from paragraph 1 of this Article, this Section shall apply to providers of online platforms allowing consumers to conclude distance contracts with traders that have been designated as very large online platforms in accordance with Article 33, irrespective of whether they qualify as micro or small enterprises.
Art. 30 DSA - Traceability of traders
- Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that traders can only use those online platforms to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services for those purposes, they have obtained the following information, where applicable to the trader:
- the name, address, telephone number and email address of the trader;
- a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council (1);
- the payment account details of the trader;
- where the trader is registered in a trade register or similar public register, the trade register in which the trader is registered and its registration number or equivalent means of identification in that register;
- a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law.
- Upon receiving the information referred to in paragraph 1 and prior to allowing the trader concerned to use its services, the provider of the online platform allowing consumers to conclude distance contracts with traders shall, through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources, make best efforts to assess whether the information referred to in paragraph 1, points (a) to (e), is reliable and complete. For the purpose of this Regulation, traders shall be liable for the accuracy of the information provided.
As regards traders that are already using the services of providers of online platforms allowing consumers to conclude distance contracts with traders for the purposes referred to in paragraph 1 on 17 February 2024, the providers shall make best efforts to obtain the information listed from the traders concerned within 12 months. Where the traders concerned fail to provide the information within that period, the providers shall suspend the provision of their services to those traders until they have provided all information.
- Where the provider of the online platform allowing consumers to conclude distance contracts with traders obtains sufficient indications or has reason to believe that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate, incomplete or not up-to-date, that provider shall request that the trader remedy that situation without delay or within the period set by Union and national law.
Where the trader fails to correct or complete that information, the provider of the online platform allowing consumers to conclude distance contracts with traders shall swiftly suspend the provision of its service to that trader in relation to the offering of products or services to consumers located in the Union until the request has been fully complied with.
- Without prejudice to Article 4 of Regulation (EU) 2019/1150, if a provider of an online platform allowing consumers to conclude distance contracts with traders refuses to allow a trader to use its service pursuant to paragraph 1, or suspends the provision of its service pursuant to paragraph 3 of this Article, the trader concerned shall have the right to lodge a complaint as provided for in Articles 20 and 21 of this Regulation.
- Providers of online platforms allowing consumers to conclude distance contracts with traders shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for a period of six months after the end of the contractual relationship with the trader concerned. They shall subsequently delete the information.
- Without prejudice to paragraph 2 of this Article, the provider of the online platform allowing consumers to conclude distance contracts with traders shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 10 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation.
- The provider of the online platform allowing consumers to conclude distance contracts with traders shall make the information referred to in paragraph 1, points (a), (d) and (e) available on its online platform to the recipients of the service in a clear, easily accessible and comprehensible manner. That information shall be available at least on the online platform’s online interface where the information on the product or service is presented.
(1) Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).
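Note that paragraph 7 requires only points (a), (d) and (e) of the paragraph 1 information to be displayed to recipients; the remaining items are stored for authorities and rightful claimants under paragraphs 5 and 6. A small illustrative sketch of selecting that public subset (the field keys and descriptions are assumptions, not a prescribed format):

```python
# Illustrative trader record keyed by the points of Art. 30(1).
TRADER_RECORD_FIELDS = {
    "a": "name, address, telephone number and email address",
    "b": "identification document or electronic identification",
    "c": "payment account details",
    "d": "trade register and registration number",
    "e": "self-certification of compliance with Union law",
}

PUBLIC_FIELDS = ("a", "d", "e")  # the subset Art. 30(7) makes available to recipients

def public_subset(record: dict[str, str]) -> dict[str, str]:
    """Information to display where the product or service offer is presented."""
    return {key: value for key, value in record.items() if key in PUBLIC_FIELDS}
```

Points (b) and (c), which contain identity documents and payment details, never reach the public interface under this scheme, consistent with the disclosure limits of paragraph 6.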
Recital 72
In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the providers of online platforms allowing consumers to conclude distance contracts with traders, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those providers of online platforms should store all information in a secure manner for the duration of their contractual relationship with the trader and 6 months thereafter, to allow any claims to be filed against the trader or orders related to the trader to be complied with.
This obligation is necessary and proportionate, so that the information can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation. This obligation leaves unaffected potential obligations to preserve certain content for longer periods of time, on the basis of other Union law or national laws, in compliance with Union law. Without prejudice to the definition provided for in this Regulation, any trader, irrespective of whether it is a natural or legal person, identified on the basis of Article 6a(1), point (b), of Directive 2011/83/EU and Article 7(4), point (f), of Directive 2005/29/EC should be traceable when offering a product or service through an online platform. Directive 2000/31/EC obliges all information society services providers to render easily, directly and permanently accessible to the recipients of the service and competent authorities certain information allowing the identification of all providers. The traceability requirements for providers of online platforms allowing consumers to conclude distance contracts with traders set out in this Regulation do not affect the application of Council Directive (EU) 2021/514 (1), which pursues other legitimate public interest objectives.
(1) Council Directive (EU) 2021/514 of 22 March 2021 amending Directive 2011/16/EU on administrative cooperation in the field of taxation (OJ L 104, 25.3.2021, p. 1).
Recital 73
To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, providers of online platforms allowing consumers to conclude distance contracts with traders should make best efforts to assess the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified payment accounts’ statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of online platforms concerned should not be required to engage in excessive or costly online fact-finding exercises or to carry out disproportionate verifications on the spot. Nor should such providers, which have made the best efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties.
Art. 31 DSA - Compliance by design
- Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that its online interface is designed and organised in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance and product safety information under applicable Union law. In particular, the provider concerned shall ensure that its online interface enables traders to provide information on the name, address, telephone number and email address of the economic operator, as defined in Article 3, point (13), of Regulation (EU) 2019/1020 and other Union law.
- Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that its online interface is designed and organised in a way that it allows traders to provide at least the following:
- the information necessary for the clear and unambiguous identification of the products or the services promoted or offered to consumers located in the Union through the services of the providers;
- any sign identifying the trader such as the trademark, symbol or logo; and,
- where applicable, the information concerning the labelling and marking in compliance with rules of applicable Union law on product safety and product compliance.
- Providers of online platforms allowing consumers to conclude distance contracts with traders shall make best efforts to assess whether such traders have provided the information referred to in paragraphs 1 and 2 prior to allowing them to offer their products or services on those platforms. After allowing the trader to offer products or services on its online platform that allows consumers to conclude distance contracts with traders, the provider shall make reasonable efforts to randomly check in any official, freely accessible and machine-readable online database or online interface whether the products or services offered have been identified as illegal.
Recital 74
Providers of online platforms allowing consumers to conclude distance contracts with traders should design and organise their online interface in a way that enables traders to comply with their obligations under relevant Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU, Article 7 of Directive 2005/29/EC, Articles 5 and 6 of Directive 2000/31/EC and Article 3 of Directive 98/6/EC of the European Parliament and of the Council (1). For that purpose, the providers of online platforms concerned should make best efforts to assess whether the traders using their services have uploaded complete information on their online interfaces, in line with relevant applicable Union law. The providers of online platforms should ensure that products or services are not offered as long as such information is not complete. This should not amount to an obligation for the providers of online platforms concerned to generally monitor the products or services offered by traders through their services nor a general fact-finding obligation, in particular to assess the accuracy of the information provided by traders. The online interfaces should be user-friendly and easily accessible for traders and consumers. Additionally and after allowing the offering of the product or service by the trader, the providers of online platforms concerned should make reasonable efforts to randomly check whether the products or services offered have been identified as being illegal in any official, freely accessible and machine-readable online databases or online interfaces available in a Member State or in the Union. The Commission should also encourage traceability of products through technology solutions such as digitally signed Quick Response codes (or ‘QR codes’) or non-fungible tokens. The Commission should promote the development of standards and, in the absence of them, of market led solutions which can be acceptable to the parties concerned.
(1) Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers (OJ L 80, 18.3.1998, p. 27).
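The random-check duty of Article 31(3) can be illustrated with a minimal sketch. Here the `illegal_ids` set stands in for an official, freely accessible and machine-readable database of products identified as illegal, and the listing field names are assumptions made for the example.

```python
import random

def random_illegality_check(listings, illegal_ids, sample_size=100, seed=None):
    """Randomly sample products offered on the platform and flag those whose
    identifier appears in a database of items identified as illegal.
    `illegal_ids` stands in for such an official database."""
    rng = random.Random(seed)
    # Sample without replacement; never ask for more items than exist.
    sample = rng.sample(listings, min(sample_size, len(listings)))
    return [item for item in sample if item["product_id"] in illegal_ids]
```

The sampling reflects that the Regulation asks for reasonable, random spot checks after the trader has been admitted, not for general monitoring of every listing.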
Art. 32 DSA - Right to information
- Where a provider of an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that an illegal product or service has been offered by a trader to consumers located in the Union through its services, that provider shall inform, insofar as it has their contact details, consumers who purchased the illegal product or service through its services of the following:
- the fact that the product or service is illegal;
- the identity of the trader; and
- any relevant means of redress.
The obligation laid down in the first subparagraph shall be limited to purchases of illegal products or services made within the six months preceding the moment that the provider became aware of the illegality.
- Where, in the situation referred to in paragraph 1, the provider of the online platform allowing consumers to conclude distance contracts with traders does not have the contact details of all consumers concerned, that provider shall make publicly available and easily accessible on its online interface the information concerning the illegal product or service, the identity of the trader and any relevant means of redress.
Art. 33 DSA - Very large online platforms and very large online search engines
- This Section shall apply to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online platforms or very large online search engines pursuant to paragraph 4.
- The Commission shall adopt delegated acts in accordance with Article 87 to adjust the number of average monthly active recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases at least by 5 % in relation to its population in 2020 or its population after adjustment by means of a delegated act in the year in which the latest delegated act was adopted. In such a case, it shall adjust the number so that it corresponds to 10 % of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions.
- The Commission may adopt delegated acts in accordance with Article 87, after consulting the Board, to supplement the provisions of this Regulation by laying down the methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1 of this Article and Article 24(2), ensuring that the methodology takes account of market and technological developments.
- The Commission shall, after having consulted the Member State of establishment or after taking into account the information provided by the Digital Services Coordinator of establishment pursuant to Article 24(4), adopt a decision designating as a very large online platform or a very large online search engine for the purposes of this Regulation the online platform or the online search engine which has a number of average monthly active recipients of the service equal to or higher than the number referred to in paragraph 1 of this Article. The Commission shall take its decision on the basis of data reported by the provider of the online platform or of the online search engine pursuant to Article 24(2), or information requested pursuant to Article 24(3) or any other information available to the Commission.
The failure by the provider of the online platform or of the online search engine to comply with Article 24(2) or to comply with the request by the Digital Services Coordinator of establishment or by the Commission pursuant to Article 24(3) shall not prevent the Commission from designating that provider as a provider of a very large online platform or of a very large online search engine pursuant to this paragraph.
Where the Commission bases its decision on other information available to the Commission pursuant to the first subparagraph of this paragraph or on the basis of additional information requested pursuant to Article 24(3), the Commission shall give the provider of the online platform or of the online search engine concerned 10 working days in which to submit its views on the Commission’s preliminary findings and on its intention to designate the online platform or the online search engine as a very large online platform or as a very large online search engine, respectively. The Commission shall take due account of the views submitted by the provider concerned.
The failure of the provider of the online platform or of the online search engine concerned to submit its views pursuant to the third subparagraph shall not prevent the Commission from designating that online platform or that online search engine as a very large online platform or as a very large online search engine, respectively, based on other information available to it.
- The Commission shall terminate the designation if, during an uninterrupted period of one year, the online platform or the online search engine does not have a number of average monthly active recipients of the service equal to or higher than the number referred to in paragraph 1.
- The Commission shall notify its decisions pursuant to paragraphs 4 and 5, without undue delay, to the provider of the online platform or of the online search engine concerned, to the Board and to the Digital Services Coordinator of establishment.
The Commission shall ensure that the list of designated very large online platforms and very large online search engines is published in the Official Journal of the European Union, and shall keep that list up to date. The obligations set out in this Section shall apply, or cease to apply, to the very large online platforms and very large online search engines concerned from four months after the notification to the provider concerned referred to in the first subparagraph.
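The four-month lead time of Article 33(6) is plain calendar arithmetic. A minimal sketch, with a small month-addition helper that clamps to month-end for notifications falling late in a month (the helper and its behaviour are this example's own, not prescribed by the Regulation):

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day to the end of the target month."""
    m = d.month - 1 + months
    y = d.year + m // 12
    m = m % 12 + 1
    return date(y, m, min(d.day, calendar.monthrange(y, m)[1]))

def obligations_start(notification: date) -> date:
    """Section 5 obligations apply from four months after the designation
    decision is notified to the provider (Art. 33(6), second subparagraph)."""
    return add_months(notification, 4)
```

For instance, a designation notified on 25 April makes the Section 5 obligations applicable from 25 August of the same year.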
Recital 75
Given the importance of very large online platforms, due to their reach, in particular as expressed in the number of recipients of the service, in facilitating public debate, economic transactions and the dissemination to the public of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on the providers of those platforms, in addition to the obligations applicable to all online platforms. Due to their critical role in locating and making information retrievable online, it is also necessary to impose those obligations, to the extent they are applicable, on the providers of very large online search engines. Those additional obligations on providers of very large online platforms and of very large online search engines are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
Recital 76
Very large online platforms and very large online search engines may cause societal risks, different in scope and impact from those caused by smaller platforms. Providers of such very large online platforms and of very large online search engines should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact. Once the number of active recipients of an online platform or of active recipients of an online search engine, calculated as an average over a period of six months, reaches a significant share of the Union population, the systemic risks the online platform or online search engine poses may have a disproportionate impact in the Union. Such significant reach should be considered to exist where such number exceeds an operational threshold set at 45 million, that is, a number equivalent to 10 % of the Union population. This operational threshold should be kept up to date and therefore the Commission should be empowered to supplement the provisions of this Regulation by adopting delegated acts, where necessary.
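The operational threshold the recital describes is simple arithmetic: 10 % of the Union population, rounded to the nearest million (Art. 33(2)), with a delegated act triggered by a population change of at least 5 % against the reference year. A minimal sketch, where the population figures are inputs rather than sourced data:

```python
def adjusted_threshold(population: int) -> int:
    """Art. 33(2): the active-recipient threshold corresponds to 10 % of the
    Union population, rounded up or down to be expressed in millions."""
    return round(population * 0.10 / 1_000_000) * 1_000_000

def adjustment_needed(current_pop: int, reference_pop: int) -> bool:
    """A delegated act adjusts the threshold where the Union population has
    increased or decreased by at least 5 % against the reference figure."""
    return abs(current_pop - reference_pop) / reference_pop >= 0.05
```

With a Union population of roughly 447 million in 2020, the formula yields the 45 million figure used in Article 33(1).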
Recital 77
In order to determine the reach of a given online platform or online search engine, it is necessary to establish the average number of active recipients of each service individually. Accordingly, the number of average monthly active recipients of an online platform should reflect all the recipients actually engaging with the service at least once in a given period of time, by being exposed to information disseminated on the online interface of the online platform, such as viewing it or listening to it, or by providing information, such as traders on an online platform allowing consumers to conclude distance contracts with traders.
For the purposes of this Regulation, engagement is not limited to interacting with information by clicking on, commenting, linking, sharing, purchasing or carrying out transactions on an online platform. Consequently, the concept of active recipient of the service does not necessarily coincide with that of a registered user of a service. As regards online search engines, the concept of active recipients of the service should cover those who view information on their online interface, but not, for example, the owners of the websites indexed by an online search engine, as they do not actively engage with the service. The number of active recipients of a service should include all unique recipients of the service that engage with the specific service. To this effect, a recipient of the service that uses different online interfaces, such as websites or applications, including where the services are accessed through different uniform resource locators (URLs) or domain names, should, where possible, be counted only once. However, the concept of active recipient of the service should not include incidental use of the service by recipients of other providers of intermediary services that indirectly make available information hosted by the provider of online platforms through linking or indexing by a provider of online search engine. Further, this Regulation does not require providers of online platforms or of online search engines to perform specific tracking of individuals online. Where such providers are able to discount automated users such as bots or scrapers without further processing of personal data and tracking, they may do so. 
The determination of the number of active recipients of the service can be impacted by market and technical developments and therefore the Commission should be empowered to supplement the provisions of this Regulation by adopting delegated acts laying down the methodology to determine the active recipients of an online platform or of an online search engine, where necessary, reflecting the nature of the service and the way recipients of the service interact with it.
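The counting rules in Recital 77 reduce to a de-duplicated tally: each unique recipient is counted once even when reached through several interfaces, and identifiable automated users may be discounted. A minimal sketch, in which the event field names are assumptions made for the example:

```python
def monthly_active_recipients(events):
    """Count unique recipients engaging with the service in the period.
    A recipient reaching the service through several interfaces (URLs, apps)
    is counted once; identifiable automated users (bots, scrapers) are
    discounted, as the recital permits."""
    return len({
        e["recipient_id"]           # set comprehension: one entry per recipient
        for e in events
        if not e.get("is_bot")      # discount identifiable automated users
    })
```

Note that, consistent with the recital, the sketch needs no tracking beyond distinguishing recipients within the provider's own service, and a registered-user count would be neither necessary nor sufficient.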
Recital 78
In view of the network effects characterising the platform economy, the user base of an online platform or an online search engine may quickly expand and reach the dimension of a very large online platform or a very large online search engine, with the related impact on the internal market. This may be the case in the event of exponential growth experienced in short periods of time, or by a large global presence and turnover allowing the online platform or the online search engine to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator of establishment or the Commission should be able to request more frequent reporting from the provider of the online platform or of the online search engine on the number of active recipients of the service to be able to timely identify the moment at which that platform or that search engine should be designated as a very large online platform or very large online search engine, respectively, for the purposes of this Regulation.
Art. 34 DSA - Risk assessment
- Providers of very large online platforms and of very large online search engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.
They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:
- the dissemination of illegal content through their services;
- any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high-level of consumer protection enshrined in Article 38 of the Charter;
- any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;
- any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.
- When conducting risk assessments, providers of very large online platforms and of very large online search engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:
- the design of their recommender systems and any other relevant algorithmic system;
- their content moderation systems;
- the applicable terms and conditions and their enforcement;
- systems for selecting and presenting advertisements;
- data related practices of the provider.
The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.
- Providers of very large online platforms and of very large online search engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital Services Coordinator of establishment.
Recital 79
Very large online platforms and very large online search engines can be used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. Effective regulation and enforcement is necessary in order to effectively identify and mitigate the risks and the societal and economic harm that may arise. Under this Regulation, providers of very large online platforms and of very large online search engines should therefore assess the systemic risks stemming from the design, functioning and use of their services, as well as from potential misuses by the recipients of the service, and should take appropriate mitigating measures in observance of fundamental rights. In determining the significance of potential negative effects and impacts, providers should consider the severity of the potential impact and the probability of all such systemic risks. For example, they could assess whether the potential negative impact can affect a large number of persons, its potential irreversibility, or how difficult it is to remedy and restore the situation prevailing prior to the potential impact.
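The severity-and-probability weighing the recital describes admits an illustrative scoring sketch. The 1–5 scales and the band cut-offs below are assumptions of this example only; the Regulation prescribes no numeric method, and providers remain free to choose their own assessment framework.

```python
def risk_significance(severity: int, probability: int) -> str:
    """Illustrative severity-by-probability screen on assumed 1-5 scales.
    Severity could reflect, e.g., the number of persons affected and the
    irreversibility of the impact; probability its likelihood of occurring."""
    score = severity * probability
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"
```

Such a matrix merely ranks risks for prioritisation; the qualitative factors named in the recital (breadth of impact, irreversibility, difficulty of remedy) still have to inform the scores fed into it.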
Recital 80
Four categories of systemic risks should be assessed in-depth by the providers of very large online platforms and of very large online search engines. A first category concerns the risks associated with the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech or other types of misuse of their services for criminal offences, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous or counterfeit products, or illegally-traded animals. For example, such dissemination or activities may constitute a significant systemic risk where access to illegal content may spread rapidly and widely through accounts with a particularly wide reach or other means of amplification. Providers of very large online platforms and of very large online search engines should assess the risk of dissemination of illegal content irrespective of whether or not the information is also incompatible with their terms and conditions. This assessment is without prejudice to the personal responsibility of the recipient of the service of very large online platforms or of the owners of websites indexed by very large online search engines for possible illegality of their activity under the applicable law.
Recital 81
A second category concerns the actual or foreseeable impact of the service on the exercise of fundamental rights, as protected by the Charter, including but not limited to human dignity, freedom of expression and of information, including media freedom and pluralism, the right to private life, data protection, the right to non-discrimination, the rights of the child and consumer protection. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or by the very large online search engine or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. When assessing risks to the rights of the child, providers of very large online platforms and of very large online search engines should consider for example how easy it is for minors to understand the design and functioning of the service, as well as how minors can be exposed through their service to content that may impair minors’ health, physical, mental and moral development. Such risks may arise, for example, in relation to the design of online interfaces which intentionally or unintentionally exploit the weaknesses and inexperience of minors or which may cause addictive behaviour.
Recital 82
A third category of risks concerns the actual or foreseeable negative effects on democratic processes, civic discourse and electoral processes, as well as public security.
Recital 83
A fourth category of risks stems from similar concerns relating to the design, functioning or use, including through manipulation, of very large online platforms and of very large online search engines with an actual or foreseeable negative effect on the protection of public health, minors and serious negative consequences to a person’s physical and mental well-being, or on gender-based violence. Such risks may also stem from coordinated disinformation campaigns related to public health, or from online interface design that may stimulate behavioural addictions of recipients of the service.
Recital 84
When assessing such systemic risks, providers of very large online platforms and of very large online search engines should focus on the systems or other elements that may contribute to the risks, including all the algorithmic systems that may be relevant, in particular their recommender systems and advertising systems, paying attention to the related data collection and use practices. They should also assess whether their terms and conditions and the enforcement thereof are appropriate, as well as their content moderation processes, technical tools and allocated resources. When assessing the systemic risks identified in this Regulation, those providers should also focus on the information which is not illegal, but contributes to the systemic risks identified in this Regulation. Such providers should therefore pay particular attention on how their services are used to disseminate or amplify misleading or deceptive content, including disinformation. Where the algorithmic amplification of information contributes to the systemic risks, those providers should duly reflect this in their risk assessments. Where risks are localised or there are linguistic differences, those providers should also account for this in their risk assessments. Providers of very large online platforms and of very large online search engines should, in particular, assess how the design and functioning of their service, as well as the intentional and, oftentimes, coordinated manipulation and use of their services, or the systemic infringement of their terms of service, contribute to such risks. 
Such risks may arise, for example, through the inauthentic use of the service, such as the creation of fake accounts, the use of bots or deceptive use of a service, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination to the public of information that is illegal content or incompatible with an online platform’s or online search engine’s terms and conditions and that contributes to disinformation campaigns.
Recital 85
In order to make it possible that subsequent risk assessments build on each other and show the evolution of the risks identified, as well as to facilitate investigations and enforcement actions, providers of very large online platforms and of very large online search engines should preserve all supporting documents relating to the risk assessments that they carried out, such as information regarding the preparation thereof, underlying data and data on the testing of their algorithmic systems.
Art. 35 DSA - Mitigation of risks
- Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:
- adapting the design, features or functioning of their services, including their online interfaces;
- adapting their terms and conditions and their enforcement;
- adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;
- testing and adapting their algorithmic systems, including their recommender systems;
- adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;
- reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;
- initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;
- initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;
- taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;
- taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;
- ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
- The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:
- identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;
- best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.
Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.
- The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
Recital 86
Providers of very large online platforms and of very large online search engines should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessments, in observance of fundamental rights. Any measures adopted should respect the due diligence requirements of this Regulation and be reasonable and effective in mitigating the specific systemic risks identified. They should be proportionate in light of the economic capacity of the provider of the very large online platform or of the very large online search engine and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on those fundamental rights. Those providers should give particular consideration to the impact on freedom of expression.
Recital 87
Providers of very large online platforms and of very large online search engines should consider under such mitigating measures, for example, adapting any necessary design, feature or functioning of their service, such as the online interface design. They should adapt and apply their terms and conditions, as necessary, and in accordance with the rules of this Regulation on terms and conditions. Other appropriate measures could include adapting their content moderation systems and internal processes or adapting their decision-making processes and resources, including the content moderation personnel, their training and local expertise. This concerns in particular the speed and quality of processing of notices. In this regard, for example, the Code of conduct on countering illegal hate speech online of 2016 sets a benchmark to process valid notifications for removal of illegal hate speech in less than 24 hours. Providers of very large online platforms, in particular those primarily used for the dissemination to the public of pornographic content, should diligently meet all their obligations under this Regulation in respect of illegal content constituting cyber violence, including illegal pornographic content, especially with regard to ensuring that victims can effectively exercise their rights in relation to content representing non-consensual sharing of intimate or manipulated material through the rapid processing of notices and removal of such content without undue delay. Other types of illegal content may require longer or shorter timelines for processing of notices, which will depend on the facts, circumstances and types of illegal content at hand. Those providers may also initiate or increase cooperation with trusted flaggers and organise training sessions and exchanges with trusted flagger organisations.
Recital 88
Providers of very large online platforms and of very large online search engines should also be diligent in the measures they take to test and, where necessary, adapt their algorithmic systems, not least their recommender systems. They may need to mitigate the negative effects of personalised recommendations and correct the criteria used in their recommendations. The advertising systems used by providers of very large online platforms and of very large online search engines can also be a catalyser for the systemic risks. Those providers should consider corrective measures, such as discontinuing advertising revenue for specific information, or other actions, such as improving the visibility of authoritative information sources, or more structurally adapting their advertising systems. Providers of very large online platforms and of very large online search engines may need to reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks, and conduct more frequent or targeted risk assessments related to new functionalities. In particular, where risks are shared across different online platforms or online search engines, they should cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. They should also consider awareness-raising actions, in particular where risks relate to disinformation campaigns.
Recital 89
Providers of very large online platforms and of very large online search engines should take into account the best interests of minors in taking measures such as adapting the design of their service and their online interface, especially when their services are aimed at minors or predominantly used by them. They should ensure that their services are organised in a way that allows minors to access easily mechanisms provided for in this Regulation, where applicable, including notice and action and complaint mechanisms. They should also take measures to protect minors from content that may impair their physical, mental or moral development and provide tools that enable conditional access to such information. In selecting the appropriate mitigation measures, providers can consider, where appropriate, industry best practices, including as established through self-regulatory cooperation, such as codes of conduct, and should take into account the guidelines from the Commission.
Recital 90
Providers of very large online platforms and of very large online search engines should ensure that their approach to risk assessment and mitigation is based on the best available information and scientific insights and that they test their assumptions with the groups most impacted by the risks and the measures they take. To this end, they should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. They should seek to embed such consultations into their methodologies for assessing the risks and designing mitigation measures, including, as appropriate, surveys, focus groups, round tables, and other consultation and design methods. In the assessment on whether a measure is reasonable, proportionate and effective, special consideration should be given to the right to freedom of expression.
Art. 36 DSA - Crisis response mechanism
- Where a crisis occurs, the Commission, acting upon a recommendation of the Board, may adopt a decision, requiring one or more providers of very large online platforms or of very large online search engines to take one or more of the following actions:
- assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to a serious threat as referred to in paragraph 2, or are likely to do so;
- identify and apply specific, effective and proportionate measures, such as any of those provided for in Article 35(1) or Article 48(2), to prevent, eliminate or limit any such contribution to the serious threat identified pursuant to point (a) of this paragraph;
- report to the Commission by a certain date or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken pursuant to point (b) and on any other issue related to those assessments or those measures, as specified in the decision.
When identifying and applying measures pursuant to point (b) of this paragraph, the service provider or providers shall take due account of the gravity of the serious threat referred to in paragraph 2, of the urgency of the measures and of the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter.
- For the purpose of this Article, a crisis shall be deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it.
- When taking the decision referred to in paragraph 1, the Commission shall ensure that all of the following requirements are met:
- the actions required by the decision are strictly necessary, justified and proportionate, having regard in particular to the gravity of the serious threat referred to in paragraph 2, the urgency of the measures and the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;
- the decision specifies a reasonable period within which specific measures referred to in paragraph 1, point (b), are to be taken, having regard, in particular, to the urgency of those measures and the time needed to prepare and implement them;
- the actions required by the decision are limited to a period not exceeding three months.
- After adopting the decision referred to in paragraph 1, the Commission shall, without undue delay, take the following steps:
- notify the decision to the provider or providers to which the decision is addressed;
- make the decision publicly available; and
- inform the Board of the decision, invite it to submit its views thereon, and keep it informed of any subsequent developments relating to the decision.
- The choice of specific measures to be taken pursuant to paragraph 1, point (b), and to paragraph 7, second subparagraph, shall remain with the provider or providers addressed by the Commission’s decision.
- The Commission may on its own initiative or at the request of the provider, engage in a dialogue with the provider to determine whether, in light of the provider’s specific circumstances, the intended or implemented measures referred to in paragraph 1, point (b), are effective and proportionate in achieving the objectives pursued. In particular, the Commission shall ensure that the measures taken by the service provider under paragraph 1, point (b), meet the requirements referred to in paragraph 3, points (a) and (c).
- The Commission shall monitor the application of the specific measures taken pursuant to the decision referred to in paragraph 1 of this Article on the basis of the reports referred to in point (c) of that paragraph and any other relevant information, including information it may request pursuant to Article 40 or 67, taking into account the evolution of the crisis. The Commission shall report regularly to the Board on that monitoring, at least on a monthly basis.
Where the Commission considers that the intended or implemented specific measures pursuant to paragraph 1, point (b), are not effective or proportionate it may, after consulting the Board, adopt a decision requiring the provider to review the identification or application of those specific measures.
- Where appropriate in view of the evolution of the crisis, the Commission, acting on the Board’s recommendation, may amend the decision referred to in paragraph 1 or in paragraph 7, second subparagraph, by:
- revoking the decision and, where appropriate, requiring the very large online platform or very large online search engine to cease to apply the measures identified and implemented pursuant to paragraph 1, point (b), or paragraph 7, second subparagraph, in particular where the grounds for such measures do not exist anymore;
- extending the period referred to in paragraph 3, point (c), by a period of no more than three months;
- taking account of experience gained in applying the measures, in particular the possible failure of the measures to respect the fundamental rights enshrined in the Charter.
- The requirements of paragraphs 1 to 6 shall apply to the decision and to the amendment thereof referred to in this Article.
- The Commission shall take utmost account of the recommendation of the Board issued pursuant to this Article.
- The Commission shall report to the European Parliament and to the Council on a yearly basis following the adoption of decisions in accordance with this Article, and, in any event, three months after the end of the crisis, on the application of the specific measures taken pursuant to those decisions.
Recital 91
In times of crisis, there might be a need for certain specific measures to be taken urgently by providers of very large online platforms, in addition to measures they would be taking in view of their other obligations under this Regulation. In that regard, a crisis should be considered to occur when extraordinary circumstances occur that can lead to a serious threat to public security or public health in the Union or significant parts thereof. Such crises could result from armed conflicts or acts of terrorism, including emerging conflicts or acts of terrorism, natural disasters such as earthquakes and hurricanes, as well as from pandemics and other serious cross-border threats to public health. The Commission should be able to require, upon recommendation by the European Board for Digital Services (‘the Board’), providers of very large online platforms and providers of very large search engines to initiate a crisis response as a matter of urgency. Measures that those providers may identify and consider applying may include, for example, adapting content moderation processes and increasing the resources dedicated to content moderation, adapting terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness-raising measures and promoting trusted information and adapting the design of their online interfaces. The necessary requirements should be provided for to ensure that such measures are taken within a very short time frame and that the crisis response mechanism is only used where, and to the extent that, this is strictly necessary and any measures taken under this mechanism are effective and proportionate, taking due account of the rights and legitimate interests of all parties concerned. 
The use of the mechanism should be without prejudice to the other provisions of this Regulation, such as those on risk assessments and mitigation measures and the enforcement thereof and those on crisis protocols.
Art. 37 DSA - Independent audit
- Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
- the obligations set out in Chapter III;
- any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.
- Providers of very large online platforms and of very large online search engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.
Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online platforms and of very large online search engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied with versions that do not contain any information that could reasonably be considered to be confidential.
- Audits performed pursuant to paragraph 1 shall be performed by organisations which:
- are independent from, and do not have any conflicts of interest with, the provider of very large online platforms or of very large online search engines concerned and any legal person connected to that provider; in particular:
- have not provided non-audit services related to the matters audited to the provider of very large online platform or of very large online search engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;
- have not provided auditing services pursuant to this Article to the provider of very large online platform or of very large online search engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;
- are not performing the audit in return for fees which are contingent on the result of the audit;
- have proven expertise in the area of risk management, technical competence and capabilities;
- have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
- Providers of very large online platforms and of very large online search engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:
- the name, address and the point of contact of the provider of the very large online platform or of the very large online search engine subject to the audit and the period covered;
- the name and address of the organisation or organisations performing the audit;
- a declaration of interests;
- a description of the specific elements audited, and the methodology applied;
- a description and a summary of the main findings drawn from the audit;
- a list of the third parties consulted as part of the audit;
- an audit opinion on whether the provider of the very large online platform or of the very large online search engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;
- where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.
- Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.
- Providers of very large online platforms or of very large online search engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.
- The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).
Recital 92
Given the need to ensure verification by independent experts, providers of very large online platforms and of very large online search engines should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. In order to ensure that audits are carried out in an effective, efficient and timely manner, providers of very large online platforms and of very large online search engines should provide the necessary cooperation and assistance to the organisations carrying out the audits, including by giving the auditor access to all relevant data and premises necessary to perform the audit properly, including, where appropriate, to data related to algorithmic systems, and by answering oral or written questions. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Providers of very large online platforms and of very large online search engines should not undermine the performance of the audit. Audits should be performed according to best industry practices and high professional ethics and objectivity, with due regard, as appropriate, to auditing standards and codes of practice. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks. This guarantee should not be a means to circumvent the applicability of audit obligations in this Regulation. Auditors should have the necessary expertise in the area of risk management and technical competence to audit algorithms. They should be independent, in order to be able to perform their tasks in an adequate and trustworthy manner. They should comply with core independence requirements for prohibited non-auditing services, firm rotation and non-contingent fees.
If their independence and technical competence is not beyond doubt, they should resign or abstain from the audit engagement.
Recital 93
The audit report should be substantiated, in order to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the providers of the very large online platform and of the very large online search engine to comply with their obligations under this Regulation. The audit report should be transmitted to the Digital Services Coordinator of establishment, the Commission and the Board following the receipt of the audit report. Providers should also transmit upon completion without undue delay each of the reports on the risk assessment and the mitigation measures, as well as the audit implementation report of the provider of the very large online platform or of the very large online search engine showing how they have addressed the audit’s recommendations. The audit report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A ‘positive opinion’ should be given where all evidence shows that the provider of the very large online platform or of the very large online search engine complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A ‘positive opinion’ should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A ‘negative opinion’ should be given where the auditor considers that the provider of the very large online platform or of the very large online search engine does not comply with this Regulation or the commitments undertaken. 
Where the audit opinion could not reach a conclusion for specific elements that fall within the scope of the audit, an explanation of reasons for the failure to reach such a conclusion should be included in the audit opinion. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited.
Art. 38 DSA - Recommender systems
In addition to the requirements set out in Article 27, providers of very large online platforms and of very large online search engines that use recommender systems shall provide at least one option for each of their recommender systems which is not based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679.
Recital 94
The obligations on assessment and mitigation of risks should trigger, on a case-by-case basis, the need for providers of very large online platforms and of very large online search engines to assess and, where necessary, adjust the design of their recommender systems, for example by taking measures to prevent or minimise biases that lead to the discrimination of persons in vulnerable situations, in particular where such adjustment is in accordance with data protection law and when the information is personalised on the basis of special categories of personal data referred to in Article 9 of the Regulation (EU) 2016/679. In addition, and complementing the transparency obligations applicable to online platforms as regards their recommender systems, providers of very large online platforms and of very large online search engines should consistently ensure that recipients of their service enjoy alternative options which are not based on profiling, within the meaning of Regulation (EU) 2016/679, for the main parameters of their recommender systems. Such choices should be directly accessible from the online interface where the recommendations are presented.
Art. 39 DSA - Additional online advertising transparency
- Providers of very large online platforms or of very large online search engines that present advertisements on their online interfaces shall compile and make publicly available in a specific section of their online interface, through a searchable and reliable tool that allows multicriteria queries and through application programming interfaces, a repository containing the information referred to in paragraph 2, for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and shall make reasonable efforts to ensure that the information is accurate and complete.
- The repository shall include at least all of the following information:
- the content of the advertisement, including the name of the product, service or brand and the subject matter of the advertisement;
- the natural or legal person on whose behalf the advertisement is presented;
- the natural or legal person who paid for the advertisement, if that person is different from the person referred to in point (b);
- the period during which the advertisement was presented;
- whether the advertisement was intended to be presented specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose including where applicable the main parameters used to exclude one or more of such particular groups;
- the commercial communications published on the very large online platforms and identified pursuant to Article 26(2);
- the total number of recipients of the service reached and, where applicable, aggregate numbers broken down by Member State for the group or groups of recipients that the advertisement specifically targeted.
- As regards paragraph 2, points (a), (b) and (c), where a provider of very large online platform or of very large online search engine has removed or disabled access to a specific advertisement based on alleged illegality or incompatibility with its terms and conditions, the repository shall not include the information referred to in those points. In such case, the repository shall include, for the specific advertisement concerned, the information referred to in Article 17(3), points (a) to (e), or Article 9(2), point (a)(i), as applicable.
The Commission may, after consultation of the Board, the relevant vetted researchers referred to in Article 40 and the public, issue guidelines on the structure, organisation and functionalities of the repositories referred to in this Article.
Recital 95
Advertising systems used by very large online platforms and very large online search engines pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s or search engine’s online interface. Very large online platforms or very large online search engines should ensure public access to repositories of advertisements presented on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements, including the name of the product, service or brand and the subject matter of the advertisement, and related data on the advertiser, and, if different, the natural or legal person who paid for the advertisement, and the delivery of the advertisement, in particular where targeted advertising is concerned. This information should include both information about targeting criteria and delivery criteria, in particular when advertisements are delivered to persons in vulnerable situations, such as minors.
Art. 40 DSA - Data access and scrutiny
- Providers of very large online platforms or of very large online search engines shall provide the Digital Services Coordinator of establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with this Regulation.
- Digital Services Coordinators and the Commission shall use the data accessed pursuant to paragraph 1 only for the purpose of monitoring and assessing compliance with this Regulation and shall take due account of the rights and interests of the providers of very large online platforms or of very large online search engines and the recipients of the service concerned, including the protection of personal data, the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
- For the purposes of paragraph 1, providers of very large online platforms or of very large online search engines shall, at the request of either the Digital Services Coordinator of establishment or of the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender systems.
- Upon a reasoned request from the Digital Services Coordinator of establishment, providers of very large online platforms or of very large online search engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 8 of this Article, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35.
- Within 15 days following receipt of a request as referred to in paragraph 4, providers of very large online platforms or of very large online search engines may request the Digital Services Coordinator of establishment to amend the request, where they consider that they are unable to give access to the data requested because of one of the following two reasons:
- they do not have access to the data;
- giving access to the data will lead to significant vulnerabilities in the security of their service or the protection of confidential information, in particular trade secrets.
- Requests for amendment pursuant to paragraph 5 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request.
The Digital Services Coordinator of establishment shall decide on the request for amendment within 15 days and communicate to the provider of the very large online platform or of the very large online search engine its decision and, where relevant, the amended request and the new period to comply with the request.
- Providers of very large online platforms or of very large online search engines shall facilitate and provide access to data pursuant to paragraphs 1 and 4 through appropriate interfaces specified in the request, including online databases or application programming interfaces.
- Upon a duly substantiated application from researchers, the Digital Services Coordinator of establishment shall grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of very large online platform or of very large online search engine pursuant to paragraph 4, where the researchers demonstrate that they meet all of the following conditions:
- they are affiliated to a research organisation as defined in Article 2, point (1), of Directive (EU) 2019/790;
- they are independent from commercial interests;
- their application discloses the funding of the research;
- they are capable of fulfilling the specific data security and confidentiality requirements corresponding to each request and to protect personal data, and they describe in their request the appropriate technical and organisational measures that they have put in place to this end;
- their application demonstrates that their access to the data and the time frames requested are necessary for, and proportionate to, the purposes of their research, and that the expected results of that research will contribute to the purposes laid down in paragraph 4;
- the planned research activities will be carried out for the purposes laid down in paragraph 4;
- they have committed themselves to making their research results publicly available free of charge, within a reasonable period after the completion of the research, subject to the rights and interests of the recipients of the service concerned, in accordance with Regulation (EU) 2016/679.
Upon receipt of the application pursuant to this paragraph, the Digital Services Coordinator of establishment shall inform the Commission and the Board.
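The conditions in points (a) to (g) above are cumulative: an application fails if any single condition is unmet. A minimal sketch of that cumulative check, with hypothetical field names that are illustrative only and not prescribed by the Regulation:

```python
from dataclasses import dataclass, fields

# Hypothetical representation of the cumulative Art. 40(8) conditions;
# the field names are illustrative, not taken from the Regulation.
@dataclass
class VettingApplication:
    affiliated_with_research_org: bool         # point (a), Directive (EU) 2019/790
    independent_of_commercial_interests: bool  # point (b)
    funding_disclosed: bool                    # point (c)
    security_and_confidentiality_measures: bool  # point (d)
    access_necessary_and_proportionate: bool   # point (e)
    research_serves_stated_purposes: bool      # point (f)
    results_made_publicly_available: bool      # point (g)

def meets_all_conditions(app: VettingApplication) -> bool:
    # The conditions are cumulative: failing any one point bars the status.
    return all(getattr(app, f.name) for f in fields(app))
```

Note that paragraph 12 relies on a subset of these same conditions (points (b) to (e)) for access to publicly accessible data, so modelling them individually rather than as a single flag keeps both checks expressible.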
- Researchers may also submit their application to the Digital Services Coordinator of the Member State of the research organisation to which they are affiliated. Upon receipt of the application pursuant to this paragraph the Digital Services Coordinator shall conduct an initial assessment as to whether the respective researchers meet all of the conditions set out in paragraph 8. The respective Digital Services Coordinator shall subsequently send the application, together with the supporting documents submitted by the respective researchers and the initial assessment, to the Digital Services Coordinator of establishment. The Digital Services Coordinator of establishment shall take a decision whether to award a researcher the status of ‘vetted researcher’ without undue delay.
While taking due account of the initial assessment provided, the final decision to award a researcher the status of ‘vetted researcher’ lies within the competence of the Digital Services Coordinator of establishment, pursuant to paragraph 8.
- The Digital Services Coordinator that awarded the status of vetted researcher and issued the reasoned request for data access to the providers of very large online platforms or of very large online search engines in favour of a vetted researcher shall issue a decision terminating the access if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, that the vetted researcher no longer meets the conditions set out in paragraph 8, and shall inform the provider of the very large online platform or of the very large online search engine concerned of the decision. Before terminating the access, the Digital Services Coordinator shall allow the vetted researcher to react to the findings of its investigation and to its intention to terminate the access.
- Digital Services Coordinators of establishment shall communicate to the Board the names and contact information of the natural persons or entities to which they have awarded the status of ‘vetted researcher’ in accordance with paragraph 8, as well as the purpose of the research in respect of which the application was made or, where they have terminated the access to the data in accordance with paragraph 10, communicate that information to the Board.
- Providers of very large online platforms or of very large online search engines shall give access without undue delay to data, including, where technically possible, to real-time data, provided that the data is publicly accessible in their online interface by researchers, including those affiliated to not for profit bodies, organisations and associations, who comply with the conditions set out in paragraph 8, points (b), (c), (d) and (e), and who use the data solely for performing research that contributes to the detection, identification and understanding of systemic risks in the Union pursuant to Article 34(1).
- The Commission shall, after consulting the Board, adopt delegated acts supplementing this Regulation by laying down the technical conditions under which providers of very large online platforms or of very large online search engines are to share data pursuant to paragraphs 1 and 4 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with researchers can take place in compliance with Regulation (EU) 2016/679, as well as relevant objective indicators, procedures and, where necessary, independent advisory mechanisms in support of sharing of data, taking into account the rights and interests of the providers of very large online platforms or of very large online search engines and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Recital 96
In order to appropriately monitor and assess the compliance of very large online platforms and of very large online search engines with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data, including data related to algorithms. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the very large online platform’s or the very large online search engine’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, including, where appropriate, training data and algorithms, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Such data access requests should not include requests to produce specific information about individual recipients of the service for the purpose of determining compliance of such recipients with other applicable Union or national law. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing providers of online platforms, providers of online search engines, Digital Services Coordinators, other competent authorities, the Commission and the public.
Recital 97
This Regulation therefore provides a framework for compelling access to data from very large online platforms and very large online search engines to vetted researchers affiliated to a research organisation within the meaning of Article 2 of Directive (EU) 2019/790, which may include, for the purpose of this Regulation, civil society organisations that are conducting scientific research with the primary goal of supporting their public interest mission. All requests for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including the protection of personal data, trade secrets and other confidential information, of the very large online platform or of the very large online search engine and any other parties concerned, including the recipients of the service. However, to ensure that the objective of this Regulation is achieved, consideration of the commercial interests of providers should not lead to a refusal to provide access to data necessary for the specific research objective pursuant to a request under this Regulation. In this regard, whilst without prejudice to Directive (EU) 2016/943 of the European Parliament and of the Council (1), providers should ensure appropriate access for researchers, including, where necessary, by taking technical protections such as through data vaults. Data access requests could cover, for example, the number of views or, where relevant, other types of access to content by recipients of the service prior to its removal by the providers of very large online platforms or of very large online search engines.
(1) Directive (EU) 2016/943 of the European Parliament and of the Council of 8 June 2016 on the protection of undisclosed know-how and business information (trade secrets) against their unlawful acquisition, use and disclosure (OJ L 157, 15.6.2016, p. 1).
Recital 98
In addition, where data is publicly accessible, such providers should not prevent researchers meeting an appropriate subset of criteria from using this data for research purposes that contribute to the detection, identification and understanding of systemic risks. They should provide access to such researchers including, where technically possible, in real-time, to the publicly accessible data, for example on aggregated interactions with content from public pages, public groups, or public figures, including impression and engagement data such as the number of reactions, shares, comments from recipients of the service. Providers of very large online platforms or of very large online search engines should be encouraged to cooperate with researchers and provide broader access to data for monitoring societal concerns through voluntary efforts, including through commitments and procedures agreed under codes of conduct or crisis protocols. Those providers and researchers should pay particular attention to the protection of personal data, and ensure that any processing of personal data complies with Regulation (EU) 2016/679. Providers should anonymise or pseudonymise personal data except in those cases that would render impossible the research purpose pursued.
Art. 41 DSA - Compliance function
- Providers of very large online platforms or of very large online search engines shall establish a compliance function, which is independent from their operational functions and composed of one or more compliance officers, including the head of the compliance function. That compliance function shall have sufficient authority, stature and resources, as well as access to the management body of the provider of the very large online platform or of the very large online search engine to monitor the compliance of that provider with this Regulation.
- The management body of the provider of the very large online platform or of the very large online search engine shall ensure that compliance officers have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3.
The management body of the provider of the very large online platform or of the very large online search engine shall ensure that the head of the compliance function is an independent senior manager with distinct responsibility for the compliance function.
The head of the compliance function shall report directly to the management body of the provider of the very large online platform or of the very large online search engine, and may raise concerns and warn that body where risks referred to in Article 34 or non-compliance with this Regulation affect or may affect the provider of the very large online platform or of the very large online search engine concerned, without prejudice to the responsibilities of the management body in its supervisory and managerial functions.
The head of the compliance function shall not be removed without prior approval of the management body of the provider of the very large online platform or of the very large online search engine.
- Compliance officers shall have the following tasks:
- cooperating with the Digital Services Coordinator of establishment and the Commission for the purpose of this Regulation;
- ensuring that all risks referred to in Article 34 are identified and properly reported on and that reasonable, proportionate and effective risk-mitigation measures are taken pursuant to Article 35;
- organising and supervising the activities of the provider of the very large online platform or of the very large online search engine relating to the independent audit pursuant to Article 37;
- informing and advising the management and employees of the provider of the very large online platform or of the very large online search engine about relevant obligations under this Regulation;
- monitoring the compliance of the provider of the very large online platform or of the very large online search engine with its obligations under this Regulation;
- where applicable, monitoring the compliance of the provider of the very large online platform or of the very large online search engine with commitments made under the codes of conduct pursuant to Articles 45 and 46 or the crisis protocols pursuant to Article 48.
- Providers of very large online platforms or of very large online search engines shall communicate the name and contact details of the head of the compliance function to the Digital Services Coordinator of establishment and to the Commission.
- The management body of the provider of the very large online platform or of the very large online search engine shall define, oversee and be accountable for the implementation of the provider’s governance arrangements that ensure the independence of the compliance function, including the division of responsibilities within the organisation of the provider of very large online platform or of very large online search engine, the prevention of conflicts of interest, and sound management of systemic risks identified pursuant to Article 34.
- The management body shall approve and review periodically, at least once a year, the strategies and policies for taking up, managing, monitoring and mitigating the risks identified pursuant to Article 34 to which the very large online platform or the very large online search engine is or might be exposed.
- The management body shall devote sufficient time to the consideration of the measures related to risk management. It shall be actively involved in the decisions related to risk management, and shall ensure that adequate resources are allocated to the management of the risks identified in accordance with Article 34.
Recital 99
Given the complexity of the functioning of the systems deployed and the systemic risks they present to society, providers of very large online platforms and of very large online search engines should establish a compliance function, which should be independent from the operational functions of those providers. The head of the compliance function should report directly to the management of those providers, including for concerns of non-compliance with this Regulation. The compliance officers that are part of the compliance function should have the necessary qualifications, knowledge, experience and ability to operationalise measures and monitor the compliance with this Regulation within the organisation of the providers of very large online platform or of very large online search engine. Providers of very large online platforms and of very large online search engines should ensure that the compliance function is involved, properly and in a timely manner, in all issues which relate to this Regulation including in the risk assessment and mitigation strategy and specific measures, as well as assessing compliance, where applicable, with commitments made by those providers under the codes of conduct and crisis protocols they subscribe to.
Art. 42 DSA - Transparency reporting obligations
- Providers of very large online platforms or of very large online search engines shall publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months.
- The reports referred to in paragraph 1 of this Article published by providers of very large online platforms shall, in addition to the information referred to in Article 15 and Article 24(1), specify:
- the human resources that the provider of very large online platforms dedicates to content moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;
- the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;
- the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States.
The reports shall be published in at least one of the official languages of the Member States.
- In addition to the information referred to in Article 24(2), the providers of very large online platforms or of very large online search engines shall include in the reports referred to in paragraph 1 of this Article the information on the average monthly recipients of the service for each Member State.
- Providers of very large online platforms or of very large online search engines shall transmit to the Digital Services Coordinator of establishment and the Commission, without undue delay upon completion, and make publicly available at the latest three months after the receipt of each audit report pursuant to Article 37(4):
- a report setting out the results of the risk assessment pursuant to Article 34;
- the specific mitigation measures put in place pursuant to Article 35(1);
- the audit report provided for in Article 37(4);
- the audit implementation report provided for in Article 37(6);
- where applicable, information about the consultations conducted by the provider in support of the risk assessments and design of the risk mitigation measures.
- Where a provider of very large online platform or of very large online search engine considers that the publication of information pursuant to paragraph 4 might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients, the provider may remove such information from the publicly available reports. In that case, the provider shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the publicly available reports.
Recital 100
In view of the additional risks relating to their activities and their additional obligations under this Regulation, additional transparency requirements should apply specifically to very large online platforms and very large online search engines, notably to report comprehensively on the risk assessments performed and subsequent measures adopted as provided by this Regulation.
Art. 43 DSA - Supervisory fee
- The Commission shall charge providers of very large online platforms and of very large online search engines an annual supervisory fee upon their designation pursuant to Article 33.
- The overall amount of the annual supervisory fees shall cover the estimated costs that the Commission incurs in relation to its supervisory tasks under this Regulation, in particular costs related to the designation pursuant to Article 33, to the set-up, maintenance and operation of the database pursuant to Article 24(5) and to the information sharing system pursuant to Article 85, to referrals pursuant to Article 59, to supporting the Board pursuant to Article 62 and to the supervisory tasks pursuant to Article 56 and Section 4 of Chapter IV.
- The providers of very large online platforms and of very large online search engines shall be charged annually a supervisory fee for each service for which they have been designated pursuant to Article 33.
The Commission shall adopt implementing acts establishing the amount of the annual supervisory fee in respect of each provider of very large online platform or of very large online search engine. When adopting those implementing acts, the Commission shall apply the methodology laid down in the delegated act referred to in paragraph 4 of this Article and shall respect the principles set out in paragraph 5 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.
- The Commission shall adopt delegated acts, in accordance with Article 87, laying down the detailed methodology and procedures for:
- the determination of the estimated costs referred to in paragraph 2;
- the determination of the individual annual supervisory fees referred to in paragraph 5, points (b) and (c);
- the determination of the maximum overall limit defined in paragraph 5, point (c); and
- the detailed arrangements necessary to make payments.
When adopting those delegated acts, the Commission shall respect the principles set out in paragraph 5 of this Article.
- The implementing act referred to in paragraph 3 and the delegated act referred to in paragraph 4 shall respect the following principles:
- the estimation of the overall amount of the annual supervisory fee takes into account the costs incurred in the previous year;
- the annual supervisory fee is proportionate to the number of average monthly active recipients in the Union of each very large online platform or each very large online search engine designated pursuant to Article 33;
- the overall amount of the annual supervisory fee charged on a given provider of very large online platform or of very large online search engine does not, in any case, exceed 0,05 % of its worldwide annual net income in the preceding financial year.
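The three principles above describe a computable allocation rule. The actual methodology is laid down in the delegated act referred to in paragraph 4 and the implementing acts referred to in paragraph 3; the sketch below merely encodes the principles of paragraph 5, and all provider names and figures are hypothetical:

```python
# Illustrative sketch of the Art. 43(5) principles, not the Commission's
# actual methodology: the overall amount reflects estimated supervisory
# costs (point (a)), each provider's share is proportional to its average
# monthly active recipients in the Union (point (b)), and no individual
# fee may exceed 0,05 % of worldwide annual net income (point (c)).

def allocate_fees(estimated_costs, providers):
    """Split estimated supervisory costs across designated providers.

    providers maps a provider name to a tuple of
    (average monthly active recipients in the Union,
     worldwide annual net income of the preceding financial year).
    """
    total_recipients = sum(recipients for recipients, _ in providers.values())
    fees = {}
    for name, (recipients, net_income) in providers.items():
        # Point (b): fee proportional to average monthly active recipients.
        share = estimated_costs * recipients / total_recipients
        # Point (c): ceiling of 0,05 % of worldwide annual net income.
        fees[name] = min(share, 0.0005 * net_income)
    return fees

# Hypothetical EUR figures, for illustration only:
fees = allocate_fees(
    estimated_costs=45_000_000,
    providers={
        "PlatformA": (200_000_000, 80_000_000_000),
        "PlatformB": (100_000_000, 2_000_000_000),
        "SearchC": (150_000_000, 60_000_000_000),
    },
)
```

When the ceiling of point (c) binds for a provider (as for the hypothetical PlatformB), the total collected falls short of the estimated costs; recital 101 addresses this by allowing the following year's estimate to take account of costs exceeding the fees charged in the previous year.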
- The individual annual supervisory fees charged pursuant to paragraph 1 of this Article shall constitute external assigned revenue in accordance with Article 21(5) of Regulation (EU, Euratom) 2018/1046 of the European Parliament and of the Council (1).
- The Commission shall report annually to the European Parliament and to the Council on the overall amount of the costs incurred for the fulfilment of the tasks under this Regulation and the total amount of the individual annual supervisory fees charged in the preceding year.
(1) Regulation (EU, Euratom) 2018/1046 of the European Parliament and of the Council of 18 July 2018 on the financial rules applicable to the general budget of the Union, amending Regulations (EU) No 1296/2013, (EU) No 1301/2013, (EU) No 1303/2013, (EU) No 1304/2013, (EU) No 1309/2013, (EU) No 1316/2013, (EU) No 223/2014, (EU) No 283/2014, and Decision No 541/2014/EU and repealing Regulation (EU, Euratom) No 966/2012 (OJ L 193, 30.7.2018, p. 1).
Recital 101
The Commission should be in possession of all the necessary resources, in terms of staffing, expertise, and financial means, for the performance of its tasks under this Regulation. In order to ensure the availability of the resources necessary for the adequate supervision at Union level under this Regulation, and considering that Member States should be entitled to charge providers established in their territory a supervisory fee in respect of the supervisory and enforcement tasks exercised by their authorities, the Commission should charge a supervisory fee, the level of which should be established on an annual basis, on very large online platforms and very large online search engines. The overall amount of the annual supervisory fee charged should be established on the basis of the overall amount of the costs incurred by the Commission to exercise its supervisory tasks under this Regulation, as reasonably estimated beforehand. Such amount should include costs relating to the exercise of the specific powers and tasks of supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines, including costs related to the designation of very large online platforms and of very large online search engines or to the set up, maintenance and operation of the databases envisaged under this Regulation.
It should also include costs relating to the set-up, maintenance and operation of the basic information and institutional infrastructure for the cooperation among Digital Services Coordinators, the Board and the Commission, taking into account the fact that in view of their size and reach very large online platforms and very large online search engines have a significant impact on the resources needed to support such infrastructure. The estimation of the overall costs should take into account the supervisory costs incurred in the previous year including, where applicable, those costs exceeding the individual annual supervisory fee charged in the previous year. The external assigned revenues resulting from the annual supervisory fee could be used to finance additional human resources, such as contractual agents and seconded national experts, and other expenditure related to the fulfilment of the tasks entrusted to the Commission by this Regulation. The annual supervisory fee to be charged on providers of very large online platforms and of very large online search engines should be proportionate to the size of the service as reflected by the number of its active recipients of the service in the Union. Moreover, the individual annual supervisory fee should not exceed an overall ceiling for each provider of very large online platforms or of very large online search engines taking into account the economic capacity of the provider of the designated service or services.
Art. 44 DSA - Standards
- The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:
- electronic submission of notices under Article 16;
- templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;
- electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;
- specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;
- auditing of very large online platforms and of very large online search engines pursuant to Article 37;
- interoperability of the advertisement repositories referred to in Article 39(2);
- transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);
- technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial communications referred to in Article 26;
- choice interfaces and presentation of information on the main parameters of different types of recommender systems, in accordance with Articles 27 and 38;
- standards for targeted measures to protect minors online.
- The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.
Recital 102
To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary standards covering certain technical procedures, where the industry can help develop standardised means to support providers of intermediary services in complying with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or standards related to terms and conditions or standards relating to audits, or standards related to the interoperability of advertisement repositories. In addition, such standards could include standards related to online advertising, recommender systems, accessibility and the protection of minors online. Providers of intermediary services are free to adopt the standards, but their adoption does not presume compliance with this Regulation. At the same time, by providing best practices, such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
Art. 45 DSA - Codes of conduct
- The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law in particular on competition and the protection of personal data.
- Where significant systemic risks within the meaning of Article 34(1) emerge and concern several very large online platforms or very large online search engines, the Commission may invite the providers of very large online platforms concerned or the providers of very large online search engines concerned, and other providers of very large online platforms, of very large online search engines, of online platforms and of other intermediary services, as appropriate, as well as relevant competent authorities, civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
- When giving effect to paragraphs 1 and 2, the Commission and the Board, and where relevant other bodies, shall aim to ensure that the codes of conduct clearly set out their specific objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, and in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments shall take into account differences in size and capacity between different participants.
- The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives, having regard to the key performance indicators that they might contain. They shall publish their conclusions.
The Commission and the Board shall also encourage and facilitate regular review and adaptation of the codes of conduct.
In the case of systematic failure to comply with the codes of conduct, the Commission and the Board may invite the signatories to the codes of conduct to take the necessary action.
Recital 103
The Commission and the Board should encourage the drawing-up of voluntary codes of conduct, as well as the implementation of the provisions of those codes in order to contribute to the application of this Regulation. The Commission and the Board should aim to ensure that the codes of conduct clearly define the nature of the public interest objectives being addressed, that they contain mechanisms for independent evaluation of the achievement of those objectives and that the role of relevant authorities is clearly defined. Particular attention should be given to avoiding negative effects on security, the protection of privacy and personal data, as well as to the prohibition on imposing general monitoring obligations. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidelines provided by the Commission and the Board, by participating in the same codes of conduct.
Recital 104
It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities or any adverse effects on minors. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of intentionally inaccurate or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as minors. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform or a very large online search engine may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by a provider of an online platform or of an online search engine of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform or the online search engine has infringed the obligations laid down by this Regulation. The mere fact of participating in and implementing a given code of conduct should not in itself presume compliance with this Regulation.
Recital 105
The codes of conduct should facilitate the accessibility of very large online platforms and very large online search engines, in compliance with Union and national law, in order to facilitate their foreseeable use by persons with disabilities. In particular, the codes of conduct could ensure that the information is presented in a perceivable, operable, understandable and robust way and that forms and measures provided pursuant to this Regulation are made available in a manner that is easy to find and accessible to persons with disabilities.
Recital 106
The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of understanding on the sale of counterfeit goods on the internet, the Code of conduct on countering illegal hate speech online, as well as the Code of Practice on Disinformation. In particular for the latter, following the Commission’s guidance, the Code of Practice on Disinformation has been strengthened as announced in the European Democracy Action Plan.
Art. 46 DSA - Codes of conduct for online advertising
- The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level by providers of online platforms and other relevant service providers, such as providers of online advertising intermediary services, other actors involved in the programmatic advertising value chain, or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency for actors in the online advertising value chain beyond the requirements of Articles 26 and 39.
- The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information that fully respects the rights and interests of all parties involved, as well as a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct at least address the following:
- the transmission of information held by providers of online advertising intermediaries to recipients of the service concerning the requirements set in Article 26(1), points (b), (c) and (d);
- the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 39;
- meaningful information on data monetisation.
- The Commission shall encourage the development of the codes of conduct by 18 February 2025 and their application by 18 August 2025.
- The Commission shall encourage all the actors in the online advertising value chain referred to in paragraph 1 to endorse the commitments stated in the codes of conduct, and to comply with them.
Recital 107
The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertisements with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertising for providers of online platforms, of very large online platforms and of very large online search engines set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. This should include facilitating the transmission of the information on the advertiser who pays for the advertisement when they differ from the natural or legal person on whose behalf the advertisement is presented on the online interface of an online platform. The codes of conduct should also include measures to ensure that meaningful information about the monetisation of data is appropriately shared throughout the value chain. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. In order to ensure the effectiveness of codes of conduct, the Commission should include evaluation mechanisms in drawing up the codes of conduct. Where appropriate, the Commission may invite the Fundamental Rights Agency or the European Data Protection Supervisor to express their opinions on the respective code of conduct.
Art. 47 DSA - Codes of conduct for accessibility
- The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level with the involvement of providers of online platforms and other relevant service providers, organisations representing recipients of the service and civil society organisations or relevant authorities to promote full and effective, equal participation, by improving access to online services that, through their initial design or subsequent adaptation, address the particular needs of persons with disabilities.
- The Commission shall aim to ensure that the codes of conduct pursue the objective of ensuring that those services are accessible in compliance with Union and national law, in order to maximise their foreseeable use by persons with disabilities. The Commission shall aim to ensure that the codes of conduct address at least the following objectives:
- designing and adapting services to make them accessible to persons with disabilities by making them perceivable, operable, understandable and robust;
- explaining how the services meet the applicable accessibility requirements and making this information available to the public in an accessible manner for persons with disabilities;
- making information, forms and measures provided pursuant to this Regulation available in such a manner that they are easy to find, easy to understand, and accessible to persons with disabilities.
- The Commission shall encourage the development of the codes of conduct by 18 February 2025 and their application by 18 August 2025.
Art. 48 DSA - Crisis protocols
- The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations. Those situations shall be strictly limited to extraordinary circumstances affecting public security or public health.
- The Commission shall encourage and facilitate the providers of very large online platforms, of very large online search engines and, where appropriate, the providers of other online platforms or of other online search engines, to participate in the drawing up, testing and application of those crisis protocols. The Commission shall aim to ensure that those crisis protocols include one or more of the following measures:
- prominently displaying information on the crisis situation provided by Member States’ authorities or at Union level, or, depending on the context of the crisis, by other relevant reliable bodies;
- ensuring that the provider of intermediary services designates a specific point of contact for crisis management; where relevant, this may be the electronic point of contact referred to in Article 11 or, in the case of providers of very large online platforms or of very large online search engines, the compliance officer referred to in Article 41;
- where applicable, adapt the resources dedicated to compliance with the obligations set out in Articles 16, 20, 22, 23 and 35 to the needs arising from the crisis situation.
- The Commission shall, as appropriate, involve Member States’ authorities, and may also involve Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.
- The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:
- the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;
- the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;
- a clear procedure for determining when the crisis protocol is to be activated;
- a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;
- safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;
- a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation.
- If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in paragraph 4, point (e), it shall request the participants to revise the crisis protocol, including by taking additional measures.
Recital 108
In addition to the crisis response mechanism for very large online platforms and very large online search engines, the Commission may initiate the drawing up of voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Such can be the case, for example, where online platforms are misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, providers of such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating providers of very large online platforms and of very large online search engines to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.