Digital Public Forums - Govt. vs. Elon

#28
The government should stay out of it. None of the proposed "worst case scenarios" are improved by more involvement from a government. It's a worldwide/Europe-wide HOA of Karens in one place deciding that your hedges are 2 inches too tall and thus should be completely removed.
Either big tech decides the rules, or the government does.

Here it's been mostly the former, but then a bunch of malcontents decided they have a "First Amendment" right to have their dopey hot takes promoted on Facebook or Twitter, and if they don't, they go crying to the politicians about "censorship."
 
#29
Just to play devil's advocate, what if someone anonymously posts slanderous, horrible things online about you or your loved ones?

You subsequently complain to Twitter/X, and they tell you "Sorry. We don't filter content."

So private industry won't intervene and, per your wishes, the government doesn't regulate the industry.

Now what do you do?
Act like an adult and get over it.
 
#30
Just to play devil's advocate, what if someone anonymously posts slanderous, horrible things online about you or your loved ones?

You subsequently complain to Twitter/X, and they tell you "Sorry. We don't filter content."

So private industry won't intervene and, per your wishes, the government doesn't regulate the industry.

Now what do you do?
Assume most people realize that what they read online isn't real? Or wait for them to get there.
 
#31
1. General information on the Digital Services Act
What is the Digital Services Act?

The Digital Services Act (DSA) regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content. This includes online marketplaces amongst others.

It will give better protection to users and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and provide a single, uniform framework across the EU.

The European Parliament and Council reached a political agreement on the new rules on 23 April 2022, and the DSA entered into force on 16 November 2022 after being published in the EU Official Journal on 27 October 2022.

The Digital Services Act is a Regulation that is directly applicable across the EU. Some of the obligations for intermediaries include:

Measures to counter illegal content online, including illegal goods and services. The DSA imposes new mechanisms allowing users to flag illegal content online, and for platforms to cooperate with specialised ‘trusted flaggers' to identify and remove illegal content;
New rules to trace sellers on online marketplaces, to help build trust and go after scammers more easily; a new obligation for online marketplaces to randomly check against existing databases whether products or services on their sites are compliant; sustained efforts to enhance the traceability of products through advanced technological solutions;
Effective safeguards for users, including the possibility to challenge platforms' content moderation decisions based on new obligatory information provided to users when their content gets removed or restricted;
Wide-ranging transparency measures for online platforms, including better information on terms and conditions, as well as transparency on the algorithms used for recommending content or products to users;
New obligations for the protection of minors on any platform in the EU;
Obligations for very large online platforms and search engines to prevent abuse of their systems by taking risk-based action, including oversight through independent audits of their risk management measures. Platforms must mitigate risks such as disinformation or election manipulation, cyber violence against women, or harms to minors online. These measures must be carefully balanced against restrictions of freedom of expression, and are subject to independent audits;
A new crisis response mechanism in cases of serious threats to public health or security, such as a pandemic or a war;
Bans on targeted advertising on online platforms by profiling children or based on special categories of personal data such as ethnicity, political views or sexual orientation. Enhanced transparency for all advertising on online platforms and influencers' commercial communications;
A ban on using so-called ‘dark patterns' on the interface of online platforms, referring to misleading tricks that manipulate users into choices they do not intend to make;
New provisions allowing researchers access to the data of key platforms, in order to scrutinise how platforms work and how online risks evolve;
Users will have new rights, including a right to complain to the platform, seek out-of-court settlements, complain to their national authority in their own language, or seek compensation for breaches of the rules. Representative organisations will also be able to defend user rights for large scale breaches of the law;
A unique oversight structure. The Commission is the primary regulator for very large online platforms and very large online search engines (reaching 45 million users), while other platforms and search engines will be under the supervision of Member States where they are established. The Commission will have enforcement powers similar to those it has under anti-trust proceedings. An EU-wide cooperation mechanism will be established between national regulators and the Commission;
The liability rules for intermediaries have been reconfirmed and updated by the co-legislator, including a Europe-wide prohibition of generalised monitoring obligations.
Does the Digital Services Act define what is illegal online?

No. The DSA sets out EU-wide rules that cover detection, flagging and removal of illegal content, as well as a new risk assessment framework for very large online platforms and search engines on how illegal content spreads on their service.

What constitutes illegal content is defined in other laws, either at EU level or at national level – for example, terrorist content, child sexual abuse material and illegal hate speech are defined at EU level. Where content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.

Will the Digital Services Act replace sector specific legislation?

No. The Digital Services Act sets the horizontal rules covering all services and all types of illegal content, including goods or services. It does not replace or amend, but it complements sector-specific legislation such as the Audiovisual Media Services Directive (AVMSD), the Directive on Copyright in the Digital Single Market, the Consumer Protection Acquis, or the Proposal for a Regulation on preventing the dissemination of terrorist content online.

What rules preceded the Digital Services Act, and why did they have to be updated?

The e-Commerce Directive, adopted in 2000, has been the main legal framework for the provision of digital services in the EU. It is a horizontal legal framework that has been the cornerstone for regulating digital services in the European single market.

Much has changed in more than 20 years and the rules needed to be upgraded. Online platforms have created significant benefits for consumers and innovation, and have facilitated cross-border trading within and outside the Union and opened new opportunities to a variety of European businesses and traders. At the same time, they are abused for disseminating illegal content, or selling illegal goods or services online. Some very large players have emerged as quasi-public spaces for information sharing and online trade. They pose particular risks for users' rights, information flows and public participation. In addition, the e-Commerce Directive did not specify any cooperation mechanism between authorities. The “Country of Origin” principle meant that the supervision was entrusted to the country of establishment.

The Digital Services Act builds on the rules of the e-Commerce Directive, and addresses the particular issues emerging around online intermediaries. Member States have regulated these services differently, creating barriers for smaller companies looking to expand and scale up across the EU and resulting in different levels of protection for European citizens.

With the Digital Services Act, unnecessary legal burdens due to different laws will be lifted, fostering a better environment for innovation, growth and competitiveness, and facilitating the scaling up of smaller platforms, SMEs and start-ups. At the same time, it will equally protect all users in the EU, both as regards their safety from illegal goods, content or services, and as regards their fundamental rights.

What is the relevance of the Regulation of intermediaries at global level?

The new rules are an important step in defending European values in the online space. They respect international human rights norms, and help better protect democracy, equality and the rule of law.

The DSA sets high standards for effective intervention, for due process and the protection of fundamental rights online; it preserves a balanced approach to the liability of intermediaries, and establishes effective measures for tackling illegal content and societal risks online. In doing so, the DSA aims at setting a benchmark for a regulatory approach to online intermediaries also at the global level.

Will these rules apply to companies outside of the EU?

They apply in the EU single market, without discrimination, including to those online intermediaries established outside of the European Union that offer their services in the single market. When not established in the EU, they will have to appoint a legal representative, as many companies already do as part of their obligations in other legal instruments. At the same time, online intermediaries will also benefit from the legal clarity of the liability exemptions and from a single set of rules when providing their services in the EU.

Does the Digital Service Act include provisions for digital taxation?

No, the Commission's proposal for an interim digital tax on revenue from digital activities is a separate initiative from the Digital Services Act. There are no provisions in the Digital Services Act in the field of taxation.

2. Impact on users
How will citizens benefit from the new rules?

Online platforms play an increasingly important role in the daily lives of Europeans. The rules will create a safer online experience for citizens to freely express their ideas, communicate and shop online, by reducing their exposure to illegal activities and dangerous goods and ensuring the protection of fundamental rights. The benefits include:

Better services for consumers: Online marketplaces will need to identify their business users and clarify who is selling a product or offering a service; this will help track down rogue traders and will protect online shoppers against illegal products, such as counterfeit and dangerous products. Online marketplaces will be required to inform consumers who purchased a product or service, when they become aware of the illegality of such products or services, about a) the illegality, b) the identity of the trader and c) any relevant means of redress. They will randomly check the documentation of products sold on their platform, and should increasingly rely on enhanced product traceability solutions, to make sure fewer and fewer non-compliant goods reach European consumers.
New rights for users: At the same time, citizens will be able to notify illegal content, including products, that they encounter and contest the decisions made by online platforms when their content is removed: platforms are obliged to notify them of any decision taken, of the reason to take that decision and to provide for a mechanism to contest the decision.
More transparency on advertising: Users will also receive more information about ads they are seeing on online platforms – for example, if and why an ad targets them specifically. Platforms will no longer present behaviourally targeted ads for minors and will no longer present ads to their users based on profiling that rests on special categories of personal data, such as their ethnicity, political views or sexual orientation.
More responsibilities for very large platforms: Specific rules are introduced for very large online platforms and very large online search engines that reach more than 45 million users, given their systemic impact in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas. When such platforms recommend content, users will be able to modify the criteria used, and choose not to receive personalised recommendations. Citizens will not have to take these companies at their word; citizens will be able to scrutinise their actions through the reports of independent auditors and vetted researchers.
Clearer consequences: Users will be able to seek compensation from providers of intermediary services for any damage or loss suffered due to an infringement of the DSA by such provider.
What measures does the legislation take to counter illegal content?

The Digital Services Act will set out effective means for all actors in the online ecosystem to counter illegal content, but also illegal goods and services:

Users will be empowered to report illegal content in an easy and effective way.
A priority channel will be created for trusted flaggers – entities which have demonstrated particular expertise and competence – to report illegal content to which platforms will have to react with priority.
When enabled by national laws, Member State authorities will be able to order any platform operating in the EU, irrespective of where they are established, to remove illegal content. Very large online platforms will need to take mitigating measures at the level of the overall organisation of their service to protect their users from illegal content, goods and services.
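To make the notice-and-action flow above concrete, here is a minimal sketch of a moderation queue in which trusted-flagger reports are reviewed before ordinary user reports. It is illustrative only; the function names and fields are assumptions, not anything the DSA prescribes.

```python
import heapq
import itertools

# Hedged sketch, not from the Regulation: one way a platform could
# queue notices so that reports from trusted flaggers are handled
# with priority over ordinary user reports.

_seq = itertools.count()   # tie-breaker so equal priorities keep arrival order
_queue: list = []          # min-heap of (priority, seq, notice)

def submit_notice(content_id: str, reason: str, trusted_flagger: bool = False) -> None:
    """File a notice of allegedly illegal content."""
    priority = 0 if trusted_flagger else 1  # trusted flaggers reviewed first
    heapq.heappush(_queue, (priority, next(_seq),
                            {"content_id": content_id, "reason": reason}))

def next_notice() -> dict | None:
    """Return the next notice for moderator review, or None if the queue is empty."""
    return heapq.heappop(_queue)[2] if _queue else None

submit_notice("post/123", "counterfeit goods")
submit_notice("post/456", "terrorist content", trusted_flagger=True)
assert next_notice()["content_id"] == "post/456"  # trusted report jumps the queue
```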
How will the DSA protect people from unsafe or counterfeit goods?

The Digital Services Act will set out effective means for all actors in the online ecosystem to counter illegal goods:

Platforms will have mandatory procedures in place for removing illegal goods.
Online marketplaces will also be requested to trace their traders (“know your business customer”). This will ensure a safe, transparent and trustworthy environment for consumers and discourage traders who abuse platforms from selling unsafe or counterfeit goods.
Online platforms will further be requested to organise their online interfaces in a way that allows traders to comply with their information obligations towards consumers.
A new system of trusted flaggers will also be available, for example, for brand owners fighting counterfeit goods, and for faster and easier flagging and removal of counterfeit goods.
Public authorities will have new tools to order the removal of unsafe products directly.
Marketplaces will also be required to make reasonable efforts to randomly check whether products or services have been identified as illegal in any official database and to take appropriate action (see the sketch after this list).
Very large online platforms will be subject to an audited risk assessment that will include an analysis of their vulnerability to illegal goods on their platforms, and their mitigation measures at this organisational level will also be subject to annual audits.
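As a rough illustration of the random-checking obligation, the sketch below samples listings and looks them up in an official database of flagged products (for example, an export of the EU Safety Gate). The function name and the sampling rate are assumptions; the DSA requires only "reasonable efforts", not any particular procedure.

```python
import random

# Illustrative only: sample a marketplace's listings at random and
# flag any that appear in an official database of illegal products.

def random_compliance_check(listings, flagged_ids, sample_rate=0.01, rng=random):
    """Sample listings at random; return any that appear in the flagged set."""
    k = min(len(listings), max(1, int(len(listings) * sample_rate)))
    return [item for item in rng.sample(listings, k)
            if item["product_id"] in flagged_ids]

listings = [{"product_id": f"SKU-{i}"} for i in range(1_000)]
flagged = {"SKU-7", "SKU-42"}
for hit in random_compliance_check(listings, flagged, sample_rate=0.10):
    print("delist and notify buyers:", hit["product_id"])  # the 'appropriate action'
```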
How will the DSA protect minors?

Under the new rules, providers of online platforms that are accessible to minors will be required to put in place appropriate measures to ensure a high level of privacy, safety and security for minors on their services.

In addition, the new rules will ban targeted advertising to minors based on profiling of their personal data whenever platforms can establish with reasonable certainty that the recipient of the service is a minor.

How can harmful but not illegal content be effectively addressed?

To the extent that it is not illegal, harmful content should not be treated in the same way as illegal content. The new rules will only impose measures to remove or encourage removal of illegal content, in full respect of the freedom of expression.

At the same time, the DSA regulates very large online platforms' and very large online search engines' responsibilities when it comes to systemic issues such as disinformation, hoaxes and manipulation during pandemics, harms to vulnerable groups and other emerging societal harms. Following their designation by the Commission as very large online platforms or very large online search engines reaching 45 million users, they will have to perform an annual risk assessment and take corresponding risk mitigation measures stemming from the design and use of their service. Any such measures will need to be carefully balanced against restrictions of freedom of expression. They will also need to undergo an independent audit.

In addition, the proposal sets out a co-regulatory framework where service providers can work under codes of conduct to address negative impacts regarding the viral spread of illegal content as well as manipulative and abusive activities, which are particularly harmful for vulnerable recipients of the service, such as children and minors.

The DSA will foster a co-regulatory framework for online harms, including codes of conduct such as a revised Code of Practice on disinformation, and crisis protocols.

How will you keep a fair balance with fundamental rights such as the freedom of expression?

The DSA puts protection of freedom of expression at its very core. This includes protection from government interference in people's freedom of expression and information. The horizontal rules against illegal content are carefully calibrated and accompanied by robust safeguards for freedom of expression and an effective right of redress – to avoid both under-removal and over-removal of content on grounds of illegality.

The DSA gives users the possibility to contest the decisions taken by the online platforms to remove their content, including when these decisions are based on platforms' terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body or seek redress before Courts.

The Digital Services Act proposes rules on transparency of content moderation decisions. For very large platforms, users and consumers will gain a better understanding of the ways these platforms affect our societies, and the platforms will be obliged to mitigate those risks, including as regards freedom of expression. They will be held accountable through independent auditing reports and specialised and public scrutiny.

All the obligations in the DSA, including the crisis response mechanism, are carefully calibrated to promote the respect of fundamental rights, such as freedom of expression.

How does the Digital Services Act tackle disinformation?

Through the proposed rules on how platforms moderate content, on advertising, algorithmic processes and risk mitigation, the DSA will aim to ensure that platforms – and in particular the very large ones – are more accountable and assume their responsibility for the actions they take and the systemic risks they pose, including on disinformation and manipulation of electoral processes.

The Digital Services Act will foster a co-regulatory framework, together with the updated Code of Practice on Disinformation and the new Commission Guidance, as announced in the European Democracy Action Plan.

How does the Digital Services Act regulate online advertising?

The Digital Services Act covers any type of advertising, from digital marketing to issues-based advertising and political ads and complements existing rules such as the General Data Protection Regulation, which already establishes, for example, rules on users' consent or their right to object to targeted digital marketing.

The DSA introduces two new restrictions concerning targeted advertising on online platforms. First, it bans targeted advertising of minors based on profiling. Second, it bans targeted advertising based on profiling using special categories of personal data, such as sexual orientation or religious beliefs.
 
#32
The new rules will empower users to understand and make informed decisions about the ads they see. They will have to be clearly informed whether and why they are targeted by each ad and who paid for the ad; they should also see very clearly when content is sponsored or organically posted on a platform, and when influencers are promoting commercial messages. Notice and action obligations also apply to potentially illegal ads, as for any other type of content.

For very large online platforms, the societal stakes are higher, and the rules include additional measures to mitigate risks and enable oversight. They will have to maintain and provide access to ad repositories, allowing researchers, civil society and authorities to inspect how ads were displayed and how they were targeted. They will also need to assess whether and how their advertising systems are manipulated or otherwise contribute to societal risks, and take measures to mitigate these risks.
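The ad-repository obligation is, in effect, a data-structure requirement. Below is a guess at the shape of a single repository record; the DSA specifies what information must be available (the ad, who paid for it, the display period, the main targeting parameters, aggregate reach), not a schema, so every field name here is an assumption.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical shape of one entry in a very large platform's ad
# repository; the field names are illustrative, not prescribed.

@dataclass
class AdRecord:
    ad_id: str
    creative: str                 # the ad content shown
    sponsor: str                  # the natural or legal person who paid
    shown_from: date
    shown_until: date
    targeting: dict = field(default_factory=dict)  # main parameters used
    reach: int = 0                # aggregate recipients only, no personal data

repository = [
    AdRecord("ad-001", "Buy widgets!", "WidgetCo",
             date(2024, 1, 1), date(2024, 3, 1),
             {"age_range": "25-40", "interest": "DIY"}, 1_200_000),
]

# Researchers, civil society and authorities could then query it, e.g.:
january_ads = [ad for ad in repository if ad.shown_from >= date(2024, 1, 1)]
```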

The rules are complemented by measures in the Digital Markets Act, which tackles the economic concerns over gatekeepers' advertising models.

How does the Digital Services Act protect personal data?

The DSA has been designed in full compliance with existing rules on data protection, including the General Data Protection Regulation (GDPR) and the ePrivacy Directive, and does not modify the rules and safeguards set out in these laws.

How does the Digital Services Act address dark patterns?

Under new rules, dark patterns are prohibited. Providers of online platforms will be required not to design, organise or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of users of their services to make free and informed decisions.

The ban complements, but does not overwrite, the prohibitions already established under consumer protection and data protection rules, where a large number of dark patterns that mislead consumers are already banned in the EU.

3. Impact on businesses
What digital services does the act cover?

The Digital Services Act applies to a wide range of online intermediaries, which include services such as internet service providers, cloud services, messaging, marketplaces, or social networks. Specific due diligence obligations apply to hosting services, and in particular to online platforms, such as social networks, content-sharing platforms, app stores, online marketplaces, and online travel and accommodation platforms. The most far-reaching rules in the Digital Services Act focus on very large online platforms, which have a significant societal and economic impact, reaching at least 45 million users in the EU (representing 10% of the population). Similarly, very large online search engines with more than 10% of the 450 million consumers in the EU will bear more responsibility in curbing illegal content online. The first Very Large Online Platforms and Very Large Online Search Engines were designated on 25/04/2023.

What impact will the Digital Services Act have on businesses?

The DSA modernises and clarifies rules dating back to the year 2000. It sets a global benchmark, under which online businesses will benefit from a modern, clear and transparent framework assuring that rights are respected and obligations are enforced.

Moreover, for online intermediaries, and in particular for hosting services and online platforms, the new rules will cut the costs of complying with 27 different regimes in the single market. This will be particularly important for innovative SMEs, start-ups and scale-ups, which will be able to scale at home and compete with very large players. Small and micro-enterprises will be exempted from some of the rules that might be more burdensome for them, and the Commission will carefully monitor the effects of the new Regulation on SMEs.

Other businesses will also benefit from the new set of rules. They will have access to simple and effective tools for flagging illegal activities that damage their trade, as well as internal and external redress mechanisms, affording them better protections against erroneous removal, limiting losses for legitimate businesses and entrepreneurs.

Furthermore, providers which voluntarily take measures to further curb the dissemination of illegal content will be reassured that doing so does not cost them their protection from legal liability.

What impact will the Digital Services Act have on start-ups and innovation in general?

With a single framework for the EU, the DSA makes the single market easier to navigate, lowering compliance costs and establishing a level playing field. Fragmentation of the single market disproportionately disadvantages SMEs and start-ups wishing to grow, due to the absence of a large enough domestic market and to the costs of complying with many different legislations. The costs of fragmentation are much easier to bear for businesses that are already large.

A common, horizontal, harmonised rulebook applicable throughout the Digital Single Market will give SMEs, smaller platforms and start-ups, access to cross-border customers in their critical growth phase. The rules are accompanied by standardisation actions and Codes of Conduct that should support a smooth implementation by smaller companies.

How does the Digital Services Act differentiate between small and big players?

The DSA sets asymmetric due diligence obligations on different types of intermediaries depending on the nature of their services as well as on their size and impact, to ensure that their services are not misused for illegal activities and that providers operate responsibly. Certain substantive obligations are limited only to very large online platforms, which have a central role in facilitating the public debate and economic transactions. Very small platforms are exempt from the majority of obligations.

By rebalancing responsibilities in the online ecosystem according to the size of the players, the proposal ensures that the regulatory costs of these new rules are proportionate.

What impacts does the proposed Digital Services Act have on platforms and very large platforms?

All platforms, except the smallest (employing fewer than 50 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million), are required to set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms, cooperate with trusted flaggers, take measures against abusive notices, deal with complaints, vet the credentials of third-party suppliers, and provide user-facing transparency of online advertising.
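The small-enterprise carve-out above reduces to a simple decision rule. A minimal sketch, with the thresholds taken from the text, the "and/or" read as either financial threshold sufficing, and a hypothetical function name:

```python
# Sketch of the carve-out described above: platforms under the staffing
# threshold and under at least one financial threshold are exempt from
# the extra platform obligations.

def exempt_small_platform(employees: int,
                          annual_turnover_eur: float,
                          balance_sheet_total_eur: float) -> bool:
    """True if the platform escapes the extra platform obligations."""
    financially_small = (annual_turnover_eur <= 10_000_000
                         or balance_sheet_total_eur <= 10_000_000)
    return employees < 50 and financially_small

print(exempt_small_platform(30, 8_000_000, 12_000_000))   # True: exempt
print(exempt_small_platform(120, 5_000_000, 5_000_000))   # False: must comply
```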

In addition, very large online platforms and very large online search engines, reaching at least 45 million users (i.e. representing 10% of the European population) are subject to specific rules due to the particular risks they pose in the dissemination of illegal content and societal harms.

Very large online platforms have to meet risk management obligations, external risk auditing and public accountability, provide transparency of their recommender systems and user choice for access to information, as well as share data with authorities and researchers.

What penalties will businesses face if they do not comply with the new rules?

The new enforcement mechanism, consisting of national and EU-level cooperation, will supervise how online intermediaries adapt their systems to the new requirements. Each Member State will need to appoint a Digital Services Coordinator, an independent authority which will be responsible for supervising the intermediary services established in its Member State and/or for coordinating with specialist sectoral authorities. To enforce the rules, it will be able to impose penalties, including financial fines. Each Member State will clearly specify the penalties in its national laws in line with the requirements set out in the Regulation, ensuring they are proportionate to the nature and gravity of the infringement, yet dissuasive enough to ensure compliance.

In the case of very large online platforms and very large online search engines, the Commission will have direct supervision and enforcement powers and can, in the most serious cases, impose fines of up to 6% of the global turnover of a service provider.

The enforcement mechanism is not limited to fines: the Digital Services Coordinator and the Commission will have the power to require immediate actions where necessary to address very serious harms, and platforms may offer commitments on how they will remedy them.

For rogue platforms refusing to comply with important obligations and thereby endangering people's life and safety, it will be possible as a last resort to ask a court for a temporary suspension of their service, after involving all relevant parties.

4. Impact on Member States
How can the gaps between laws in Member States be filled?

The experience and attempts of the last few years have shown that individual national action to rein in the problems related to the spread of illegal content online, in particular when very large online platforms are involved, falls short of effectively addressing the challenges at hand and protecting all Europeans from online harm. Moreover, uncoordinated national action puts additional hurdles on the smaller online businesses and start-ups who face significant compliance costs to be able to comply with all the different legislation. Updated and harmonised rules will better protect and empower all Europeans, both individuals and businesses.

The Digital Services Act provides one set of rules for the entire EU. All citizens in the EU will have the same rights, a common enforcement system will see them protected in the same way and the rules for online platforms will be the same across the entire Union. This means standardised procedures for notifying illegal content, the same access to complaints and redress mechanisms across the single market, the same standard of transparency of content moderation or advertising systems, and the same supervised risk mitigation strategy where very large online platforms are concerned.

At the same time, as a Regulation, the Digital Services Act applies directly and will supersede overlapping national laws that follow the same objective. Besides, as the DSA is a full harmonisation instrument, EU Member States cannot go beyond the Regulation in their national laws.

Which institutions will supervise the rules, and who will select them?

The supervision of the rules will be shared between the Commission – primarily responsible for platforms and search engines with more than 45 million users in the EU – and Member States, responsible for any smaller platforms and search engines according to the Member State of establishment.

The Commission will have the same supervisory powers as it has under current anti-trust rules, including investigatory powers and the ability to impose fines of up to 6% of global revenue.

Member States will be required to designate competent authorities – referred to as Digital Services Coordinators – by 17 February 2024 for supervising compliance of the services established on their territory with the new rules, and to participate in the EU cooperation mechanism of the proposed Digital Services Act. The Digital Services Coordinator will be an independent authority with strong requirements to perform their tasks impartially and with transparency. The new Digital Services Coordinator within each Member State will be an important regulatory hub, ensuring coherence and digital competence.

The Digital Services Coordinators will cooperate within an independent advisory group, called the European Board for Digital Services, which can support with analysis, reports and recommendations, as well as coordinating the new tool of joint investigations by Digital Services Coordinators.

What will the Commission's role be in the supervision of platforms?

The enforcement of the Digital Services Act for providers of intermediary services established on their territory is primarily a task for national competent authorities, notably the Digital Services Coordinators.

However, when it comes to supervision of very large online platforms and very large online search engines, the Commission will be the sole authority to supervise and enforce the specific obligations under the DSA that apply only to these providers. In addition, the Commission will, together with the Digital Services Coordinators, also be responsible for supervision and enforcement of any other systemic issue concerning very large online platforms and very large online search engines.

An important part of the supervisory and enforcement framework under the DSA will also be the Board, whose members will be independent Digital Services Coordinators.

How will the Commission finance costs associated with the new supervisory and enforcement competences?

In order to ensure effective compliance with the DSA, it is important that the Commission has the necessary resources at its disposal, in terms of staffing, expertise and financial means, for the performance of its tasks under this Regulation. To this end, the Commission will charge supervisory fees on such providers, the level of which will be established on an annual basis, for the first time at the end of 2023. The overall amount of annual supervisory fees charged will be established on the basis of the overall costs incurred by the Commission to exercise its supervisory tasks under this Regulation, as reasonably estimated beforehand.

The annual supervisory fee to be charged to providers of very large online platforms and search engines should be proportionate to the size of the service as reflected by the number of its recipients in the Union. To this end, the individual annual supervisory fee should not exceed an overall ceiling (set at 0.05% of the annual worldwide net income) for each provider of very large online platforms and very large online search engines, in order to take into account the economic capacity of the provider of the designated service or services.
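In other words, the fee is a capped pro-rata allocation. A minimal sketch under stated assumptions (all figures invented, the allocation rule simplified, names hypothetical):

```python
# Hedged sketch of the fee logic described above: each designated
# provider pays a share of the Commission's estimated supervision costs
# proportional to its EU user count, capped at 0.05% of its annual
# worldwide net income.

def supervisory_fees(total_costs_eur: float, providers):
    """providers: iterable of (name, eu_users, worldwide_net_income_eur)."""
    providers = list(providers)
    total_users = sum(users for _, users, _ in providers)
    fees = {}
    for name, users, income in providers:
        proportional = total_costs_eur * users / total_users
        fees[name] = min(proportional, 0.0005 * income)  # 0.05% ceiling
    return fees

print(supervisory_fees(
    45_000_000,                                   # assumed annual cost estimate
    [("PlatformA", 100_000_000, 20_000_000_000),  # 100M users, EUR 20bn income
     ("SearchB",    60_000_000, 150_000_000_000)],
))
```

The detailed methodology adopted in March 2023 is more involved; for instance, this sketch omits how amounts cut off by the 0.05% ceiling are handled.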

Detailed rules specifying the procedure and methodology for the application of the supervisory fees were adopted by the Commission on 2 March 2023 and sent to the European Parliament and Council for their three-month scrutiny period, before publication and entry into force.

When will the DSA start applying?

The rules will start applying in two steps:

The DSA will be directly applicable across the EU from 17 February 2024, fifteen months after entry into force. By then, Member States need to empower their national authorities to enforce the new rules on smaller platforms and rules concerning non-systemic issues on very large online platforms and very large online search engines.

For very large online platforms and very large online search engines, which are directly supervised by the Commission as regards systemic obligations, the new rules will kick in earlier. First and foremost, all online platforms, except micro and small ones, were required to publish information on the number of their active monthly users by 17 February 2023, an exercise which will be repeated at least once every 6 months afterwards. They are also invited to communicate these numbers to the Commission, which is responsible for assessing whether they reach the threshold of 45 million users and should therefore be designated as very large online platforms or very large online search engines. Once designated by the Commission, providers of very large online platforms and very large online search engines have four months to comply with the DSA, including undertaking and providing to the Commission their first risk assessment under the DSA.
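As a rough sketch of that two-step timeline: the 45 million threshold and the four-month compliance window come from the text above, while the helper names and the 30-day month approximation are assumptions.

```python
from datetime import date, timedelta

# Illustrative timeline helpers for the designation process described above.

VLOP_THRESHOLD = 45_000_000  # roughly 10% of ~450 million EU consumers

def next_user_count_due(last_published: date) -> date:
    """Active-user numbers must be re-published at least every six months."""
    return last_published + timedelta(days=182)

def compliance_deadline(designated_on: date) -> date:
    """A designated VLOP/VLOSE has four months to comply with the DSA."""
    return designated_on + timedelta(days=4 * 30)

monthly_active_users = 90_000_000  # invented figure
if monthly_active_users >= VLOP_THRESHOLD:
    print("designation candidate; comply by",
          compliance_deadline(date(2023, 4, 25)))  # date of first designations
print("next publication due", next_user_count_due(date(2023, 2, 17)))
```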

5. European Centre for Algorithmic Transparency
What technical expertise does the Commission have to supervise the biggest online intermediaries?

Since the end of the negotiations, the Commission has been preparing to take on the responsibility of supervising very large online platforms and search engines under the DSA, including efforts to increase staffing and expertise in the field of data science and algorithms, amongst others.

The Commission's supervisory role is enhanced by the European Centre for Algorithmic Transparency, housed in the Commission's Joint Research Centre (JRC). The Centre will contribute technical expertise, scientific research and foresight to the Commission's exclusive supervisory and enforcement role of the systemic obligations on very large online platforms and search engines provided for under the DSA. It will count on a team of specialised experts, who will also work on identifying and measuring systemic risks.

What is the role of the European Centre for Algorithmic Transparency?

The Centre provides in-house technical assistance in the area of algorithmic systems linked to the DSA's aim of ensuring a safe, predictable and trusted online environment, drawing from expertise in different disciplines to integrate technical, ethical, economic, legal and environmental perspectives.

The Centre will centralise research with a focus on algorithmic transparency, ensuring that decisions made by algorithms supporting the provision of digital services are transparent, explainable and in line with the risk management obligations of the very large online platforms and search engines.

When will the European Centre for Algorithmic Transparency be operational?

The ECAT was formally launched in April 2023. While most of its staff will be located at the Joint Research Centre site in Seville, Spain, it will also work closely with JRC colleagues based in Ispra, Italy, and Brussels, Belgium.

*Updated on 25/04/2023

 
#33
EU opens investigation into X over alleged disinformation

The EU is investigating Elon Musk's X over the possible spread of terrorist and violent content, and hate speech, after Hamas' attack on Israel.


The investigation, the first under the EU's new tech rules, will also look at the way complaints are handled.

X, formerly known as Twitter, said it had removed hundreds of Hamas-affiliated accounts from the platform.

TikTok and Meta have also been warned by the EU for not doing enough to tackle disinformation.
 
#35
1. General information on the Digital Services Act
What is the Digital Services Act?

The Digital Services Act (DSA) regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content. This includes online marketplaces amongst others.

It will give better protection to users and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and provide a single, uniform framework across the EU.

The European Parliament and Council reached a political agreement on the new rules on 23 April, 2022 and the DSA entered into force on 16 November 2022 after being published in the EU Official Journal on 27 October 2022.

The Digital Services Act is a Regulation that is directly applicable across the EU. Some of the obligations for intermediaries include:

Measures to counter illegal content online, including illegal goods and services. The DSA imposes new mechanisms allowing users to flag illegal content online, and for platforms to cooperate with specialised ‘trusted flaggers' to identify and remove illegal content;
New rules to trace sellers on online market places, to help build trust and go after scammers more easily; a new obligation by online market places to randomly check against existing databases whether products or services on their sites are compliant; sustained efforts to enhance the traceability of products through advanced technological solutions;
Effective safeguards for users, including the possibility to challenge platforms' content moderation decisions based on a new obligatory information to users when their content gets removed or restricted;
Wide ranging transparency measures for online platforms, including better information on terms and conditions, as well as transparency on the algorithms used for recommending content or products to users;
New obligations for the protection of minors on any platform in the EU;
Obligations for very large online platforms and search engines to prevent abuse of their systems by taking risk-based action, including oversight through independent audits of their risk management measures. Platforms must mitigate against risks such as disinformation or election manipulation, cyber violence against women, or harms to minors online. These measures must be carefully balanced against restrictions of freedom of expression, and are subject to independent audits;
A new crisis response mechanism in cases of serious threat for public health and security crises, such as a pandemic or a war;
Bans on targeted advertising on online platforms by profiling children or based on special categories of personal data such as ethnicity, political views or sexual orientation. Enhanced transparency for all advertising on online platforms and influencers' commercial communications;
A ban on using so-called ‘dark patterns' on the interface of online platforms, referring to misleading tricks that manipulate users into choices they do not intend to make;
New provisions to allow access to data to researchers of key platforms, in order to scrutinise how platforms work and how online risks evolve;
Users will have new rights, including a right to complain to the platform, seek out-of-court settlements, complain to their national authority in their own language, or seek compensation for breaches of the rules. Representative organisations will also be able to defend user rights for large scale breaches of the law;
A unique oversight structure. The Commission is the primary regulator for very large online platforms and very large online search engines (reaching 45 million users), while other platforms and search engines will be under the supervision of Member States where they are established. The Commission will have enforcement powers similar to those it has under anti-trust proceedings. An EU-wide cooperation mechanism will be established between national regulators and the Commission;
The liability rules for intermediaries have been reconfirmed and updated by the co-legislator, including a Europe-wide prohibition of generalised monitoring obligations.
Does the Digital Services Act define what is illegal online?

No. The new rules set out EU-wide rules that cover detection, flagging and removal of illegal content, as well as a new risk assessment framework for very large online platforms and search engines on how illegal content spreads on their service.

What constitutes illegal content is defined in other laws either at EU level or at national level – for example terrorist content or child sexual abuse material or illegal hate speech is defined at EU level. Where a content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.

Will the Digital Services Act replace sector specific legislation?

No. The Digital Services Act sets the horizontal rules covering all services and all types of illegal content, including goods or services. It does not replace or amend, but it complements sector-specific legislation such as the Audiovisual Media Services Directive (AVMSD), the Directive on Copyright in the Digital Single Market, the Consumer Protection Acquis, or the Proposal for a Regulation on preventing the dissemination of terrorist content online.

What rules preceded the Digital Services Act, and why did they have to be updated?

The e-Commerce Directive, adopted in 2000, has been the main legal framework for the provision of digital services in the EU. It is a horizontal legal framework that has been the cornerstone for regulating digital services in the European single market.

Much has changed in more than 20 years and the rules needed to be upgraded. Online platforms have created significant benefits for consumers and innovation, and have facilitated cross-border trading within and outside the Union and opened new opportunities to a variety of European businesses and traders. At the same time, they are abused for disseminating illegal content, or selling illegal goods or services online. Some very large players have emerged as quasi-public spaces for information sharing and online trade. They pose particular risks for users' rights, information flows and public participation. In addition, the e-Commerce Directive did not specify any cooperation mechanism between authorities. The “Country of Origin” principle meant that the supervision was entrusted to the country of establishment.

The Digital Services Act builds on the rules of the e-Commerce Directive, and addresses the particular issues emerging around online intermediaries. Member States have regulated these services differently, creating barriers for smaller companies looking to expand and scale up across the EU and resulting in different levels of protection for European citizens.

With the Digital Services Act, unnecessary legal burdens due to different laws will be lifted, fostering a better environment for innovation, growth and competitiveness, and facilitating the scaling up of smaller platforms, SMEs and start-ups. At the same time, it will equally protect all users in the EU, both as regards their safety from illegal goods, content or services, and as regards their fundamental rights.

What is the relevance of the Regulation of intermediaries at global level?

The new rules are an important step in defending European values in the online space. They respect international human rights norms, and help better protect democracy, equality and the rule of law.

The DSA sets high standards for effective intervention, for due process and the protection of fundamental rights online; it preserves a balanced approach to the liability of intermediaries, and establishes effective measures for tackling illegal content and societal risks online. In doing so, the DSA aims at setting a benchmark for a regulatory approach to online intermediaries also at the global level.

Will these rules apply to companies outside of the EU?

They apply in the EU single market, without discrimination, including to those online intermediaries established outside of the European Union that offer their services in the single market. When not established in the EU, they will have to appoint a legal representative, as many companies already do as part of their obligations in other legal instruments. At the same time, online intermediaries will also benefit from the legal clarity of the liability exemptions and from a single set of rules when providing their services in the EU.

Does the Digital Service Act include provisions for digital taxation?

No, the Commission's proposal for an interim digital tax for revenue from digital activities is a separate initiative to the Digital Services Act. There are no provisions in the Digital Services Act in the field of taxation.

2. Impact on users
How will citizens benefit from the new rules?

Online platforms play an increasingly important role in the daily lives of Europeans. The rules will create a safer online experience for citizens to freely express their ideas, communicate and shop online, by reducing their exposure to illegal activities and dangerous goods and ensuring the protection of fundamental rights. The benefits include:

Better services for consumers: Online marketplaces will need to identify their business users and clarify who is selling a product or offering a service; this will help track down rogue traders and will protect online shoppers against illegal products, such as counterfeit and dangerous products. Online marketplace will be required to inform consumers who purchased a product or service when they become aware of the illegality of such products or services, about a) the illegality, b) the identity of the trader and c) any relevant means of redress. They will randomly check the documentation of products sold on their platform, and should increasingly rely on enhanced product traceability solutions, to make sure fewer and fewer non-compliant goods reach European consumers.
New rights for users: At the same time, citizens will be able to notify illegal content, including products, that they encounter and contest the decisions made by online platforms when their content is removed: platforms are obliged to notify them of any decision taken, of the reason to take that decision and to provide for a mechanism to contest the decision.
More transparency on advertising: Users will also receive more information about ads they are seeing on online platforms – for example, if and why an ad targets them specifically. Platforms will no longer present behaviourally targeted ads for minors and will no longer present ads to their users based on profiling that rests on special categories of personal data, such as their ethnicity, political views or sexual orientation.
More responsibilities for very large platforms: Specific rules are introduced for very large online platforms and very large online search engines that reach more than 45 million users, given their systemic impact in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas. When such platforms recommend content, users will be able to modify the criteria used, and choose not to receive personalised recommendations. Citizens will not have to take these companies at their word; citizens will be able to scrutinise their actions through the reports of independent auditors and vetted researchers.
Clearer consequences: Users will be able to seek compensation from providers of intermediary services for any damage or loss suffered due to an infringement of the DSA by such provider.
What measures does the legislation take to counter illegal content?

The Digital Services Act will set out effective means for all actors in the online ecosystem to counter illegal content, but also illegal goods and services:

Users will be empowered to report illegal content in an easy and effective way.
A priority channel will be created for trusted flaggers – entities which have demonstrated particular expertise and competence – to report illegal content to which platforms will have to react with priority.
When enabled by national laws, Member State authorities will be able to order any platform operating in the EU, irrespective of where they are established, to remove illegal content. Very large online platforms will need to take mitigating measures at the level of the overall organisation of their service to protect their users from illegal content, goods and services.
How will the DSA protect people from unsafe or counterfeit goods?

The Digital Services Act will set out effective means for all actors in the online ecosystem to counter illegal goods:

Platforms will have mandatory procedures in place for removing illegal goods.
Online marketplaces will also be requested to trace their traders (“know your business customer”). This will ensure a safe, transparent and trustworthy environment for consumers and discourage traders who abuse platforms from selling unsafe or counterfeit goods.
Online platforms will further be requested to organise their online interfaces in a way that allows traders to comply with their information obligations towards consumers.
A new system of trusted flaggers will also be available, for example, for brand owners fighting counterfeit goods, and for faster and easier flagging and removal of counterfeit goods.
Public authorities will have new tools to order the removal of unsafe products directly.
Marketplaces will also be required to implement reasonable efforts to randomly check whether products or services have been identified as being illegal in any official database and take the appropriate action.
Very large online platforms will be subject to an audited risk assessment that will include an analysis on their vulnerability to illegal goods on their platforms, and their mitigation measures at this organisational level will also be subject to annual audits.
How will the DSA protect minors?

Under new rules, providers of online platforms that are accessible to minors will be required to put in place appropriate measures to ensure high level of privacy, safety and security of minors, on their services.

In addition, the new rules will ban targeted advertising to minors based on profiling using the personal data of users of their services when they can establish with reasonable certainty that the recipient of the service is a minor.

How can harmful but not illegal content be effectively addressed?

To the extent that it is not illegal, harmful content should not be treated in the same way as illegal content. The new rules will only impose measures to remove or encourage removal of illegal content, in full respect of the freedom of expression.

At the same time, the DSA regulates very large online platforms' and very large online search engines responsibilities when it comes to systemic issues such as disinformation, hoaxes and manipulation during pandemics, harms to vulnerable groups and other emerging societal harms. Following their designation by the Commission as very large online platforms and very large online search engines that reach 45 million users, they will have to perform an annual risk assessment and take corresponding risk mitigation measures stemming from the design and use of their service. Any such measures will need to be carefully balanced against restrictions of freedom of expression. They will also need to undergo an independent audit.

In addition, the proposal sets out a co-regulatory framework where service providers can work under codes of conduct to address negative impacts regarding the viral spread of illegal content as well as manipulative and abusive activities, which are particularly harmful for vulnerable recipients of the service, such as children and minors.

The DSA will foster a co-regulatory framework for online harms, including codes of conduct such as a revised Code of Practice on disinformation, and crisis protocols.

How will you keep a fair balance with fundamental rights such as the freedom of expression?

The DSA puts protection of freedom of expression at its very core. This includes protection from government interference in people's freedom of expression and information. The horizontal rules against illegal content are carefully calibrated and accompanied by robust safeguards for freedom of expression and an effective right of redress – to avoid both under-removal and over-removal of content on grounds of illegality.

The DSA gives users the possibility to contest the decisions taken by the online platforms to remove their content, including when these decisions are based on platforms' terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body or seek redress before Courts.

The Digital Services Act proposes rules on transparency of content moderation decisions. For very large platforms, users and consumers will be able to have a better understanding of the ways these platforms impact our societies and will be obliged to mitigate those risks, including as regards freedom of expression. They will be held accountable through independent auditing reports and specialised and public scrutiny.

All the obligations in the DSA, including the crisis response mechanism, are carefully calibrated to promote the respect of fundamental rights, such as freedom of expression.

How does the Digital Services Act tackle disinformation?

Through the proposed rules on how platforms moderate content, on advertising, algorithmic processes and risk mitigation, the DSA will aim to ensure that platforms – and in particular the very large ones – are more accountable and assume their responsibility for the actions they take and the systemic risks they pose, including on disinformation and manipulation of electoral processes.

The Digital Services Act will foster a co-regulatory framework, together with the updated Code of Practice on Disinformation and the new Commission Guidance, as announced in the European Democracy Action Plan.

How does the Digital Services Act regulate online advertising?

The Digital Services Act covers any type of advertising, from digital marketing to issues-based advertising and political ads, and complements existing rules such as the General Data Protection Regulation, which already establishes, for example, rules on users' consent and their right to object to targeted digital marketing.

The DSA introduces two new restrictions concerning targeted advertising on online platforms. First, it bans targeted advertising of minors based on profiling. Second, it bans targeted advertising based on profiling using special categories of personal data, such as sexual orientation or religious beliefs.
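To make the two restrictions concrete, here is a minimal sketch (Python) of how an ad-serving system might gate targeting requests against them. The field names, the list of special categories and the boolean logic are illustrative assumptions on my part, not definitions taken from the DSA itself.

# Illustrative only: names and the category list are assumptions, not DSA text.
SPECIAL_CATEGORIES = {"sexual_orientation", "religious_beliefs",
                      "political_opinions", "health", "ethnic_origin"}

def targeting_allowed(is_minor: bool, uses_profiling: bool,
                      data_categories: set[str]) -> bool:
    """Return False if a targeting request hits either DSA ban."""
    if uses_profiling and is_minor:
        return False  # ban 1: no profiling-based ads aimed at minors
    if uses_profiling and data_categories & SPECIAL_CATEGORIES:
        return False  # ban 2: no profiling on special categories of data
    return True

print(targeting_allowed(False, True, {"purchase_history"}))   # True
print(targeting_allowed(True,  True, {"purchase_history"}))   # False
print(targeting_allowed(False, True, {"religious_beliefs"}))  # False

Note that contextual, non-profiling advertising passes both checks, which matches the scope of the bans as described above: they restrict profiling, not advertising as such.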

Yep, nothing in there about how it will address slander.
 
#36
#36
The new rules will empower users to understand and make informed decisions about the ads they see. They will have to be clearly informed whether and why they are targeted by each ad and who paid for it; they should also see very clearly whether content is sponsored or organically posted on a platform, and when influencers are promoting commercial messages. Notice and action obligations also apply to potentially illegal ads, as to any other type of content.

For very large online platforms, the societal stakes are higher, and the rules include additional measures to mitigate risks and enable oversight. They will have to maintain and provide access to ad repositories, allowing researchers, civil society and authorities to inspect how ads were displayed and how they were targeted. They will also need to assess whether and how their advertising systems are manipulated or otherwise contribute to societal risks, and take measures to mitigate these risks.

The rules are complemented by measures in the Digital Markets Act, which tackles the economic concerns over gatekeepers' advertising models.

How does the Digital Services Act protect personal data?

The DSA has been designed in full compliance with existing rules on data protection, including the General Data Protection Regulation (GDPR) and the ePrivacy Directive, and does not modify the rules and safeguards set out in these laws.

How does the Digital Services Act address dark patterns?

Under new rules, dark patterns are prohibited. Providers of online platforms will be required not to design, organise or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of users of their services to make free and informed decisions.

The ban complements, but does not override, the prohibitions already established under consumer protection and data protection rules, where a large number of dark patterns that mislead consumers are already banned in the EU.

3. Impact on businesses
What digital services does the act cover?

The Digital Services Act applies to a wide range of online intermediaries, which include services such as internet service providers, cloud services, messaging, marketplaces, or social networks. Specific due diligence obligations apply to hosting services, and in particular to online platforms, such as social networks, content-sharing platforms, app stores, online marketplaces, and online travel and accommodation platforms. The most far-reaching rules in the Digital Services Act focus on very large online platforms, which have a significant societal and economic impact, reaching at least 45 million users in the EU (representing 10% of the population). Similarly, very large online search engines with more than 10% of the 450 million consumers in the EU will bear more responsibility in curbing illegal content online. The first Very Large Online Platforms and Very Large Online Search Engines were designated on 25/04/2023.

What impact will the Digital Services Act have on businesses?

The DSA modernises and clarifies rules dating back to the year 2000. It sets a global benchmark, under which online businesses will benefit from a modern, clear and transparent framework assuring that rights are respected and obligations are enforced.

Moreover, for online intermediaries, and in particular for hosting services and online platforms, the new rules will cut the costs of complying with 27 different regimes in the single market. This will be particularly important for innovative SMEs, start-ups and scale-ups, which will be able to scale at home and compete with very large players. Small and micro-enterprises will be exempted from some of the rules that might be more burdensome for them, and the Commission will carefully monitor the effects of the new Regulation on SMEs.

Other businesses will also benefit from the new set of rules. They will have access to simple and effective tools for flagging illegal activities that damage their trade, as well as internal and external redress mechanisms, affording them better protections against erroneous removal, limiting losses for legitimate businesses and entrepreneurs.

Furthermore, providers that voluntarily take measures to further curb the dissemination of illegal content can be reassured that doing so will not cost them their protection from legal liability.

What impact will the Digital Services Act have on start-ups and innovation in general?

With a single framework for the EU, the DSA makes the single market easier to navigate, lowering compliance costs and establishing a level playing field. Fragmentation of the single market disproportionately disadvantages SMEs and start-ups wishing to grow, due to the absence of a large enough domestic market and the costs of complying with many different pieces of legislation. The costs of fragmentation are much easier to bear for businesses that are already large.

A common, horizontal, harmonised rulebook applicable throughout the Digital Single Market will give SMEs, smaller platforms and start-ups access to cross-border customers in their critical growth phase. The rules are accompanied by standardisation actions and Codes of Conduct that should support smooth implementation by smaller companies.

How does the Digital Services Act differentiate between small and big players?

The DSA sets asymmetric due diligence obligations on different types of intermediaries depending on the nature of their services as well as on their size and impact, to ensure that their services are not misused for illegal activities and that providers operate responsibly. Certain substantive obligations are limited only to very large online platforms, which have a central role in facilitating the public debate and economic transactions. Very small platforms are exempt from the majority of obligations.

By rebalancing responsibilities in the online ecosystem according to the size of the players, the proposal ensures that the regulatory costs of these new rules are proportionate.

What impacts does the proposed Digital Services Act have on platforms and very large platforms?

All platforms, except the smallest (employing fewer than 50 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million), are required to set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms, cooperate with trusted flaggers, take measures against abusive notices, deal with complaints, vet the credentials of third party suppliers, and provide user-facing transparency of online advertising.

In addition, very large online platforms and very large online search engines, reaching at least 45 million users (i.e. representing 10% of the European population) are subject to specific rules due to the particular risks they pose in the dissemination of illegal content and societal harms.

Very large online platforms have to meet risk management obligations, external risk auditing and public accountability, provide transparency of their recommender systems and user choice for access to information, as well as share data with authorities and researchers.
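As a rough illustration of how these size tiers stack up, the sketch below (Python) applies the thresholds quoted above: at least 45 million EU users for designation as a very large online platform, and the fewer-than-50-staff / EUR 10 million ceiling for the small-enterprise carve-out. The function name and the simplified three-way split are my assumptions for illustration, not the Regulation's own wording.

# Hypothetical classifier using the thresholds stated in this Q&A.
VLOP_USER_THRESHOLD = 45_000_000           # roughly 10% of the EU population
SMALL_ENTERPRISE_MAX_STAFF = 50            # micro/small enterprise cut-off
SMALL_ENTERPRISE_MAX_TURNOVER_EUR = 10_000_000

def dsa_tier(monthly_eu_users: int, staff: int, turnover_eur: float) -> str:
    """Return a rough DSA obligation tier for an online platform."""
    if monthly_eu_users >= VLOP_USER_THRESHOLD:
        return "very large online platform: systemic-risk obligations apply"
    if staff < SMALL_ENTERPRISE_MAX_STAFF and turnover_eur <= SMALL_ENTERPRISE_MAX_TURNOVER_EUR:
        return "micro/small enterprise: exempt from several platform obligations"
    return "online platform: standard due-diligence obligations"

print(dsa_tier(50_000_000, 2_000, 5e9))  # -> very large online platform
print(dsa_tier(200_000, 20, 2e6))        # -> micro/small enterprise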

What penalties will businesses face if they do not comply with the new rules?

The new enforcement mechanism, consisting of national and EU-level cooperation, will supervise how online intermediaries adapt their systems to the new requirements. Each Member State will need to appoint a Digital Services Coordinator, an independent authority responsible for supervising the intermediary services established in that Member State and/or for coordinating with specialist sectoral authorities. To enforce the rules, it will be able to impose penalties, including financial fines. Each Member State will specify the penalties in its national law in line with the requirements set out in the Regulation, ensuring they are proportionate to the nature and gravity of the infringement yet dissuasive enough to ensure compliance.

In the case of very large online platforms and very large online search engines, the Commission will have direct supervision and enforcement powers and can, in the most serious cases, impose fines of up to 6% of a service provider's global turnover.

The enforcement mechanism is not limited to fines: the Digital Services Coordinators and the Commission will have the power to require immediate action where necessary to address very serious harms, and platforms may offer commitments on how they will remedy them.

For rogue platforms refusing to comply with important obligations and thereby endangering people's life and safety, it will be possible as a last resort to ask a court for a temporary suspension of their service, after involving all relevant parties.

4. Impact on Member States
How can the gaps between laws in Member States be filled?

The experience of the last few years has shown that individual national action to rein in the spread of illegal content online, in particular when very large online platforms are involved, falls short of effectively addressing the challenges at hand and protecting all Europeans from online harm. Moreover, uncoordinated national action puts additional hurdles on smaller online businesses and start-ups, which face significant costs to comply with many different pieces of legislation. Updated and harmonised rules will better protect and empower all Europeans, both individuals and businesses.

The Digital Services Act provides one set of rules for the entire EU. All citizens in the EU will have the same rights, a common enforcement system will see them protected in the same way and the rules for online platforms will be the same across the entire Union. This means standardised procedures for notifying illegal content, the same access to complaints and redress mechanisms across the single market, the same standard of transparency of content moderation or advertising systems, and the same supervised risk mitigation strategy where very large online platforms are concerned.

At the same time, as a Regulation, the Digital Services Act applies directly and will supersede overlapping national laws that pursue the same objective. Moreover, as the DSA is a full harmonisation instrument, EU Member States cannot go beyond the Regulation in their national laws.

Which institutions will supervise the rules, and who will select them?

The supervision of the rules will be shared between the Commission – primarily responsible for platforms and search engines with more than 45 million users in the EU – and Member States, responsible for any smaller platforms and search engines according to the Member State of establishment.

The Commission will have the same supervisory powers as it has under current anti-trust rules, including investigatory powers and the ability to impose fines of up to 6% of global revenue.

Member States will be required to designate competent authorities – referred to as Digital Services Coordinators – by 17 February 2024 for supervising compliance of the services established on their territory with the new rules, and to participate in the EU cooperation mechanism of the proposed Digital Services Act. The Digital Services Coordinator will be an independent authority with strong requirements to perform their tasks impartially and with transparency. The new Digital Services Coordinator within each Member State will be an important regulatory hub, ensuring coherence and digital competence.

The Digital Services Coordinators will cooperate within an independent advisory group, called the European Board for Digital Services, which can provide analysis, reports and recommendations, as well as coordinate the new tool of joint investigations by Digital Services Coordinators.

What will the Commission's role be in the supervision of platforms?

The enforcement of the Digital Services Act for providers of intermediary services is primarily a task for the national competent authorities of the Member State where those providers are established, notably the Digital Services Coordinators.

However, when it comes to the supervision of very large online platforms and very large online search engines, the Commission will be the sole authority to supervise and enforce the specific obligations under the DSA that apply only to these providers. In addition, the Commission will also be responsible, together with the Digital Services Coordinators, for supervision and enforcement of any other systemic issue concerning very large online platforms and very large online search engines.

An important part of the supervisory and enforcement framework under the DSA will also be the Board, whose members will be independent Digital Services Coordinators.

How will the Commission finance costs associated with the new supervisory and enforcement competences?

In order to ensure effective compliance with the DSA, it is important that the Commission has at its disposal the necessary resources, in terms of staffing, expertise and financial means, to perform its tasks under the Regulation. To this end, the Commission will charge supervisory fees on such providers, the level of which will be established on an annual basis, for the first time at the end of 2023. The overall amount of annual supervisory fees will be based on the overall costs the Commission incurs in exercising its supervisory tasks under the Regulation, as reasonably estimated beforehand.

The annual supervisory fee to be charged to providers of very large online platforms and search engines should be proportionate to the size of the service as reflected by the number of its recipients in the Union. To this end, the individual annual supervisory fee should not exceed an overall ceiling (set at 0.05% of the annual worldwide net income) for each provider of very large online platforms and very large online search engines, in order to take into account the economic capacity of the provider of the designated service or services.
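As a back-of-the-envelope illustration of that ceiling, the sketch below (Python) caps a provider's share of supervision costs at 0.05% of its annual worldwide net income. The proportional apportionment by user share is a simplifying assumption for illustration, not the Commission's adopted methodology.

FEE_CEILING_RATE = 0.0005  # 0.05% of annual worldwide net income

def supervisory_fee(total_costs_eur: float, provider_users: int,
                    all_designated_users: int, net_income_eur: float) -> float:
    """Estimate one provider's annual fee under the stated ceiling."""
    uncapped = total_costs_eur * provider_users / all_designated_users
    return min(uncapped, FEE_CEILING_RATE * net_income_eur)

# Example: EUR 45m of supervision costs, a provider accounting for 90m of
# 450m designated-service users, and EUR 10bn worldwide net income.
print(supervisory_fee(45e6, 90_000_000, 450_000_000, 10e9))
# -> 5000000.0: the 0.05% ceiling binds below the 9m uncapped share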

Detailed rules specifying the procedure and methodology for applying the supervisory fees were adopted by the Commission on 2 March 2023 and sent to the European Parliament and Council for their three-month scrutiny period, before publication and entry into force.

When will the DSA start applying?

The rules will start applying in two steps:

The DSA will be directly applicable across the EU from 17 February 2024, fifteen months after entry into force. By then, Member States need to empower their national authorities to enforce the new rules on smaller platforms and rules concerning non-systemic issues on very large online platforms and very large online search engines.

For very large online platforms and very large online search engines, which are directly supervised by the Commission as regards systemic obligations, the new rules will kick in earlier. First and foremost, all online platforms, except micro and small ones, were required to publish information on their number of monthly active users by 17 February 2023, an exercise that will be repeated at least once every six months afterwards. They are also invited to communicate these numbers to the Commission, which is responsible for assessing whether they reach the threshold of 45 million users and should therefore be designated as very large online platforms or very large online search engines. Once designated by the Commission, providers of very large online platforms and very large online search engines have four months to comply with the DSA, including undertaking and providing to the Commission their first risk assessment under the DSA.

5. European Centre for Algorithmic Transparency
What technical expertise does the Commission have to supervise the biggest online intermediaries?

Since the end of the negotiations, the Commission has been preparing to take on the responsibility of supervising very large online platforms and search engines under the DSA, including efforts to increase staffing and expertise in the field of data science and algorithms, amongst others.

The Commission's supervisory role is enhanced by the European Centre for Algorithmic Transparency, housed in the Commission's Joint Research Centre (JRC). The Centre will contribute technical expertise, scientific research and foresight to the Commission's exclusive supervisory and enforcement role over the systemic obligations on very large online platforms and search engines provided for under the DSA. It will rely on a team of specialised experts, who will also work on identifying and measuring systemic risks.

What is the role of the European Centre for Algorithmic Transparency?

The Centre provides in-house technical assistance in the area of algorithmic systems linked to the DSA's aim of ensuring a safe, predictable and trusted online environment, drawing from expertise in different disciplines to integrate technical, ethical, economic, legal and environmental perspectives.

The Centre will centralise research with a focus on algorithmic transparency, ensuring that decisions made by algorithms supporting the provision of digital services are transparent, explainable and in line with the risk management obligations of the very large online platforms and search engines.

When will the European Centre for Algorithmic Transparency be operational?

The ECAT was formally launched in April 2023. While most of its staff will be located in the Joint Research Centre site based in Seville, Spain, it will also work closely with JRC colleagues based in Ispra, Italy and Brussels, Belgium.

*Updated on 25/04/2023

And again nothing in there about how an individual can address slander through this new abomination.

Truthfully, the way this reads, it looks like a way for the EU to decide what people can and cannot see on the internet, with a few protections against online scams sprinkled in to distract the mouthbreathers from its true purpose: to protect the people and institutions in charge.
 
  • Like
Reactions: MemphisVol77
#37
#37
Either big tech decides the rules, or the government does.

Here it's been mostly the former, but then a bunch of malcontents decided they have a "First Amendment" right to have their dopey hot takes promoted on Facebook or Twitter, and if they don't, they go crying to the politicians about "censureship."
I am fine with Big Tech running their own websites. What I am not fine with is the government applying pressure on those companies to remove things it didn't like, like the FBI did with Facebook.

That was Big Tech working for the government, and that's wrong, especially in this case because it was pure politics. If the government were actually nonpartisan or capable of putting its bias aside, I wouldn't have an issue with it, but it being able to apply its own bias onto others is downright wrong. Giving the government more power would lead to even more cases of this new ruling body applying its own biases at important times to influence elections its way.

If people don't like what Facebook/Twitter allow to be posted, then leave. That goes for both sides, whether they allow too much or too little.

I see them the same as newspapers. They get to select what they think is important/relevant/should be shown to their consumers. Any editing they do is on their own, but they have to own it and not hide behind the government. They can present one side of the story, both sides fairly, or anything in between, and the consumers get to decide whether they want to read that paper. Just because a newspaper doesn't run a story you wanted to read doesn't make it a First Amendment issue. Just because they won't publish your letter to the editor doesn't mean it's a First Amendment issue. I think it does become a First Amendment issue when the government tells them what to post.

Just because people assume they are a public forum, where the 1A would apply, doesn't mean they are. Now if the government came in and started mandating specific forums (Facebook, Instagram, VN, Reddit), I think it starts to lean towards being a public resource where the 1A does apply. Currently they are/should be completely voluntary, privately owned entities not subject to the 1A. We are in their house; their rules apply.
 
#38
#38
And again nothing in there about how an individual can address slander through this new abomination.

Truthfully, the way this reads, it looks like a way for the EU to decide what people can and cannot see on the internet, with a few protections against online scams sprinkled in to distract the mouthbreathers from its true purpose: to protect the people and institutions in charge.

Curious. Do you drive the posted speed limit on roads?
 
#42
#42
Yes, no. Mind getting to whatever point you're (hopefully) trying to get to?

Seems you respect the government's driving regulations: when it tells you to change your habits and wear a seat belt (shouldn't this be a personal choice? After all, it's only your life you're risking), how fast you're allowed to drive (it used to be 55 on interstates... was that rational and reasonable? It's now 85 in Texas. Hmm.), etc.

Yet you don't seem to respect the government when it proposes to regulate the internet.

Non sequitur.
 
#43
#43
Seems you respect the government's driving regulations: when it tells you to change your habits and wear a seat belt (shouldn't this be a personal choice? After all, it's only your life you're risking), how fast you're allowed to drive (it used to be 55 on interstates... was that rational and reasonable? It's now 85 in Texas. Hmm.), etc.

Yet you don't seem to respect the government when it proposes to regulate the internet.

Non sequitur.

Actually I started wearing my seatbelt when a girlfriend was killed in a car wreck and the other 3 girls walked away with just bruises. She didn't want to wrinkle her dress.
 
  • Like
Reactions: MontyPython
#44
#44
Actually I started wearing my seatbelt when a girlfriend was killed in a car wreck and the other 3 girls walked away with just bruises. She didn't want to wrinkle her dress.
I'm sorry to hear that, Hogg.
 
#46
#46
I am fine with Big Tech running their own websites. What I am not fine with is the government applying pressure on those companies to remove things it didn't like, like the FBI did with Facebook.

That was Big Tech working for the government, and that's wrong, especially in this case because it was pure politics. If the government were actually nonpartisan or capable of putting its bias aside, I wouldn't have an issue with it, but it being able to apply its own bias onto others is downright wrong. Giving the government more power would lead to even more cases of this new ruling body applying its own biases at important times to influence elections its way.

If people don't like what Facebook/Twitter allow to be posted, then leave. That goes for both sides, whether they allow too much or too little.

I see them the same as newspapers. They get to select what they think is important/relevant/should be shown to their consumers. Any editing they do is on their own, but they have to own it and not hide behind the government. They can present one side of the story, both sides fairly, or anything in between, and the consumers get to decide whether they want to read that paper. Just because a newspaper doesn't run a story you wanted to read doesn't make it a First Amendment issue. Just because they won't publish your letter to the editor doesn't mean it's a First Amendment issue. I think it does become a First Amendment issue when the government tells them what to post.

Just because people assume they are a public forum, where the 1A would apply, doesn't mean they are. Now if the government came in and started mandating specific forums (Facebook, Instagram, VN, Reddit), I think it starts to lean towards being a public resource where the 1A does apply. Currently they are/should be completely voluntary, privately owned entities not subject to the 1A. We are in their house; their rules apply.
Gov't staying out is a no-brainer, and the courts have weighed in and agreed with us. But it could get complicated. Google is so big and strong; what if (it may have already happened) the powers that be at Google decide that they want to control the reach of conservatives? Google is kind of in charge of our information. They could effectively cut off our information through algorithms and whatnot. That could become a problem that would necessitate gov't intervention. Some conservative websites have complained that this is already happening.
 
#47
#47
Gov't staying out is a no-brainer, and the courts have weighed in and agreed with us. But it could get complicated. Google is so big and strong; what if (it may have already happened) the powers that be at Google decide that they want to control the reach of conservatives? Google is kind of in charge of our information. They could effectively cut off our information through algorithms and whatnot. That could become a problem that would necessitate gov't intervention. Some conservative websites have complained that this is already happening.
Then the conservative websites need to come up with a better product. It kinda defeats the purpose to be a conservative website relying on non-conservative platforms when your very complaint is the proliferation of non-conservative policies.
 
