Digital Services Act: Reining in the ‘Tech’ Giants and Re-shaping Digital Rights

13 June 2023

Michelle Smith de Bruin BL examines the Digital Services Act, the changes it brings, and its implications for large technology companies in Ireland.

Regulation (EU) 2022/2065, more commonly referred to as the Digital Services Act (“DSA”), aims to tackle issues in the digital sphere, including the sale of illegal products, hate speech, fake news and disinformation. The Regulation applies to providers of intermediary services (“ISPs”) offering services to users in the European Union (EU), irrespective of whether those providers are established in, or outside, the EU. It focusses on creating a safer online environment for digital users (individuals and companies) and protecting fundamental rights in the digital space.


What is an Intermediary Service Under the Digital Services Act?

An intermediary service includes the following:

  • a ‘mere conduit’ service;
  • a ‘caching’ service; and
  • a ‘hosting’ service.

The DSA imposes obligations on ISPs at four different levels, as follows.

1. Universal Obligations

Basic obligations will apply to all providers of intermediary services. This first category includes internet service providers, direct messaging services, virtual private networks (VPNs), domain name systems, VoIP services and top-level domain name registries. All ISPs are bound by general rules relating to terms and conditions, reporting, transparency and the designation of representatives, or points of contact. Each ISP must designate a ‘single point of contact’ to enable Member States’ authorities, the Commission and the Board to communicate with it directly. A single point of contact must also be available to enable business users, consumers and other users to communicate with the ISP directly and rapidly, by electronic means and in a user-friendly manner.

An ISP may place restrictions on services, but if it does so, it must provide information regarding any such restriction in its terms and conditions. This includes information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making. If restrictions are placed on services, the ISP must also have due regard to the fundamental rights of users, freedom of expression and the pluralism of the media. ISPs will have to set out the process for making complaints about content, explain how the human review of complaints works and set out the rules applicable to their internal complaint-handling systems.

The Digital Services Act also contains important provisions for the protection of minors. If restrictions are placed on the use of services which are primarily aimed at minors, or predominantly used by them, the ISP must explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

2. Hosting Services / Online Platforms and Illegal Content

Hosting services include cloud service providers, social media, app stores and travel and accommodation platforms. Under the new rules, providers of hosting services and online platforms will be obliged to put in place ‘notice and action’ mechanisms, so that any individual or entity can notify the ISP of the presence on its service of material which they believe to be illegal content.

The concept of ‘illegal content’ (as opposed to harmful content) includes information relating to hate speech, terrorist content, unlawful discriminatory content, or illegal activities. Examples include the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the sale of products or provision of services in infringement of consumer protection law, the non-authorised use of copyright-protected material, the illegal offer of accommodation services, or the illegal sale of live animals.

Once notified of the potential presence of illegal content on its service, an ISP will be deemed to have actual knowledge or awareness of that content if a diligent provider could identify it as illegal without having to conduct a detailed legal examination of the material.

If the content is deemed to be illegal, the ISP will have the power to remove it, disable access to it, or demote it. It can also suspend or terminate the provision of the service in whole or in part, or suspend or terminate the account of the recipient of the service. The ISP must, however, provide a clear statement of reasons to any recipient of a service who has been affected by such restrictions.

3. Online Platforms

The third category of ISPs includes online marketplaces such as Amazon and eBay, app stores and social networks such as Facebook, TikTok, YouTube, Instagram and WhatsApp. Online platforms such as these will have to put in place an effective internal complaint-handling system. Each Member State will have a Digital Services Coordinator (DSC). Coimisiún na Meán (the Media Commission) has been designated as the Digital Services Coordinator for Ireland.

Any person or entity who wishes to make a complaint may submit a ‘notice’ and can opt to resolve the complaint using an out-of-court dispute settlement body which has been certified by the DSC. It is important to note, however, that unlike a Court, this body will not have the power to impose a binding settlement of the dispute upon the parties.

Online platforms will be obliged to comply with transparency reporting obligations, setting out details of notices received, disputes dealt with, and how they were resolved.

Platforms which are accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors. Further, they will not be allowed to present advertising to minors based on profiling when they are aware with reasonable certainty that the recipient of the service is a minor.

Advertising based on profiling will not be permitted where it uses special categories of personal data: data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership; genetic data; biometric data processed for the purpose of uniquely identifying a natural person; data concerning health; or data concerning a natural person’s sex life or sexual orientation.

Some platforms use recommender systems, which provide personalised suggestions relevant to the user. This could include, for example, what they might like to purchase on Amazon, music they might like on Spotify, or what a user might be interested in seeing in their newsfeed. If an online platform uses a recommender system, it must set out, in its terms and conditions, the main parameters used in that system. The user will also have the right to opt out of recommender systems based on personal profiling.

4. Very Large Online Platforms (VLOPs) and Very Large Search Engines (VLOSEs)

Much attention has been focussed on very large online platforms and very large search engines. Platforms will be classified as VLOPs and VLOSEs if they reach 45 million or more average monthly active users in the EU (representing around 10% of the EU population) and are designated as such by the Commission.

VLOPs (like Facebook, Twitter, Amazon, Wikipedia and YouTube) and VLOSEs (like Bing and Google Search) must diligently identify, analyse and assess any systemic risks stemming from the design or functioning of their service and related systems, including algorithmic systems, or from the use made of their services.

Enforcement and Penalties

In Ireland, enforcement and penalties for breaches of the Act will be overseen by Coimisiún na Meán (the Media Commission). The Digital Services Act provides for a wide range of possible enforcement measures. The maximum fine which can be imposed for a breach of the Act is 6% of the annual worldwide turnover of the ISP in the preceding financial year. If an ISP supplies incorrect, incomplete or misleading information, fails to reply to or rectify such information, or fails to submit to an inspection, it could face a fine of 1% of the annual income or worldwide turnover of the ISP, or person concerned, in the preceding financial year.

When Does it Apply?

The Digital Services Act entered into force on 16th November 2022. It will be directly applicable in all Member States from 17th February 2024, but applies to designated VLOPs and VLOSEs from an earlier date. The Government has published the General Scheme of the Digital Services Bill 2023. When enacted, it will give effect to the provisions contained in the Digital Services Act. It will also set out detailed provisions governing applications to Court for various Orders.

If the Act fulfils its aims, it will lead to increased protection for consumers and their fundamental rights online. It will also put in place a framework for transparency and accountability for online platforms. Whilst much of the focus has been on platforms such as Google, TikTok and WhatsApp, the Act will apply to all businesses having a digital presence in the EU. Exceptions are provided, however, for online platforms that qualify as ‘micro’ or ‘small’ enterprises. Businesses should therefore assess whether the Act applies to them and, if so, ensure that they are sufficiently prepared by the time the Act becomes directly applicable in Ireland.


The views expressed above are the author’s own and do not reflect the views of The Bar of Ireland.

