Servizi online - 12.01.2021

The European proposal to regulate digital services

REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL

on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC


Valentino Spataro

 

Index generated by IusOnDemand software, based on legal design studies and textual and statistical analysis


Brussels, 15.12.2020

COM(2020) 825 final

2020/0361(COD)

Proposal for a

REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL

on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC


Articles:


HAVE ADOPTED THIS REGULATION:

Chapter I – General provisions


Article 1 Subject matter and scope

Article 2 Definitions

Article 3 ‘Mere conduit’

Article 4 ‘Caching’

Article 5 Hosting

Article 6 Voluntary own-initiative investigations and legal compliance

Article 7 No general monitoring or active fact-finding obligations

Article 8 Orders to act against illegal content

Article 9 Orders to provide information

Section 1 Provisions applicable to all providers of intermediary services

Article 10 Points of contact

Article 11 Legal representatives

Article 12 Terms and conditions

Article 13 Transparency reporting obligations for providers of intermediary services

Section 2 Additional provisions applicable to providers of hosting services, including online platforms

Article 14 Notice and action mechanisms

Article 15 Statement of reasons

Section 3 Additional provisions applicable to online platforms

Article 16 Exclusion for micro and small enterprises

Article 17 Internal complaint-handling system

Article 18 Out-of-court dispute settlement

Article 19 Trusted flaggers

Article 20 Measures and protection against misuse

Article 21 Notification of suspicions of criminal offences

Article 22 Traceability of traders

Article 23 Transparency reporting obligations for providers of online platforms

Article 24 Online advertising transparency

Section 4 Additional obligations for very large online platforms to manage systemic risks

Article 25 Very large online platforms

Article 26 Risk assessment

Article 27 Mitigation of risks

Article 28 Independent audit

Article 29 Recommender systems

Article 30 Additional online advertising transparency

Article 31 Data access and scrutiny

Article 32 Compliance officers

Article 33 Transparency reporting obligations for very large online platforms

Section 5 Other provisions concerning due diligence obligations

Article 34 Standards

Article 35 Codes of conduct

Article 36 Codes of conduct for online advertising

Article 37 Crisis protocols

Section 1 Competent authorities and National Digital Services Coordinators

Article 38 Competent authorities and Digital Services Coordinators

Article 39 Requirements for Digital Services Coordinators

Article 40 Jurisdiction

Article 41 Powers of Digital Services Coordinators

Article 42 Penalties

Article 43 Right to lodge a complaint

Article 44 Activity reports

Article 45 Cross-border cooperation among Digital Services Coordinators

Article 46 Joint investigations and requests for Commission intervention

Section 2 European Board for Digital Services

Article 47 European Board for Digital Services

Article 48 Structure of the Board

Article 49 Tasks of the Board

Section 3 Supervision, investigation, enforcement and monitoring in respect of very large online platforms

Article 50 Enhanced supervision for very large online platforms

Article 51 Intervention by the Commission and opening of proceedings

Article 52 Requests for information

Article 53 Power to take interviews and statements

Article 54 Power to conduct on-site inspections

Article 55 Interim measures

Article 56 Commitments

Article 57 Monitoring actions

Article 58 Non-compliance

Article 59 Fines

Article 60 Periodic penalty payments

Article 61 Limitation period for the imposition of penalties

Article 62 Limitation period for the enforcement of penalties

Article 63 Right to be heard and access to the file

Article 64 Publication of decisions

Article 65 Requests for access restrictions and cooperation with national courts

Article 66 Implementing acts relating to Commission intervention

Section 4 Common provisions on enforcement

Article 67 Information sharing system

Article 68 Representation

Section 5 Delegated acts

Article 69 Exercise of the delegation

Article 70 Committee

Article 71 Deletion of certain provisions of Directive 2000/31/EC

Article 72 Amendments to Directive 2020/XX/EC on Representative Actions for the Protection of the Collective Interests of Consumers

Article 73 Evaluation

Article 74 Entry into force and application

Article ………….

Memorandum:

EXPLANATORY MEMORANDUM

1. CONTEXT OF THE PROPOSAL

Reasons for and objectives of the proposal

Since the adoption of Directive 2000/31/EC 1 (the “e-Commerce Directive”), new and innovative information society (digital) services have emerged, changing the daily lives of Union citizens and shaping and transforming how they communicate, connect, consume and do business. Those services have contributed deeply to societal and economic transformations in the Union and across the world. At the same time, the use of those services has also become the source of new risks and challenges, both for society as a whole and individuals using such services. Digital services can support achieving Sustainable Development Goals by contributing to economic, social and environmental sustainability. The coronavirus crisis has shown the importance of digital technologies in all aspects of modern life. It has clearly shown the dependency of our economy and society on digital services and highlighted both the benefits and the risks stemming from the current framework for the functioning of digital services.

In the Communication ‘Shaping Europe’s Digital Future’ 2 , the Commission committed to update the horizontal rules that define the responsibilities and obligations of providers of digital services, and online platforms in particular.

In doing so, the Commission has taken account of the issues identified in the European Parliament’s own-initiative reports and analysed the proposals therein. The European Parliament adopted two resolutions on the basis of Article 225 of the Treaty on the Functioning of the European Union (TFEU) on the ‘Digital Services Act – Improving the functioning of the Single Market’ 3 and on the ‘Digital Services Act: adapting commercial and civil law rules for commercial entities operating online’ 4 . The European Parliament also adopted a resolution under the non-legislative procedure on the ‘Digital Services Act and fundamental rights issues posed’ 5 . In substance, the resolutions are complementary in many aspects. They include a strong call for maintaining the core principles of the e-Commerce Directive and for protecting fundamental rights in the online environment, as well as online anonymity wherever technically possible. They call for transparency, information obligations and accountability for digital service providers and advocate for effective obligations to tackle illegal content online. They also advocate for public oversight at EU and national level, and cooperation between competent authorities across jurisdictions in enforcing the law, especially when addressing cross-border matters.

The resolution on ‘Digital Services Act – Improving the functioning of the Single Market’ calls for an ambitious reform of the existing EU e-commerce legal framework while maintaining the core principles of its liability regime, the prohibition of general monitoring and the internal market clause, which it considers to be still valid today. Confirming the objectives of the e-Commerce Directive, the resolution calls for measures which have consumer protection at their core, by including a detailed section on online marketplaces, and which ensure consumer trust in the digital economy, while respecting users’ fundamental rights. The resolution also advocates for rules to underpin a competitive digital environment in Europe, and envisages the Digital Services Act as a standard-setter at global level.

The resolution on ‘Digital Services Act: adapting commercial and civil law rules for commercial entities operating online’ calls for more fairness, transparency and accountability for digital services’ content moderation processes, ensuring that fundamental rights are respected, and guaranteeing independent recourse to judicial redress. The resolution also includes the request for a detailed ‘notice-and-action’ mechanism addressing illegal content, comprehensive rules about online advertising, including targeted advertising, and enabling the development and use of smart contracts.

The non-legislative resolution on the ‘Digital Services Act and fundamental rights issues posed’ highlights the need for legal clarity for platforms and users, and respect for fundamental rights given the rapid development of technology. It calls for harmonised rules for addressing illegal content online and for liability exemptions and content moderation. The resolution also includes clear reporting and transparency responsibilities for platforms and authorities.

The Council’s Conclusions 6 also welcomed the Commission’s announcement of a Digital Services Act, emphasising ‘the need for clear and harmonised evidence-based rules on responsibilities and accountability for digital services that would guarantee internet intermediaries an appropriate level of legal certainty’, and stressing ‘the need to enhance European capabilities and the cooperation of national authorities, preserving and reinforcing the fundamental principles of the Single Market and the need to enhance citizens’ safety and to protect their rights in the digital sphere across the Single Market’. This call for action was reiterated in the Council’s Conclusions of 2 October 2020 7 .

Building on the key principles set out in the e-Commerce Directive, which remain valid today, this proposal seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services.

The proposal defines clear responsibilities and accountability for providers of intermediary services, and in particular online platforms, such as social media and marketplaces. By setting out clear due-diligence obligations for certain intermediary services, including notice-and-action procedures for illegal content and the possibility to challenge the platforms’ content moderation decisions, the proposal seeks to improve users’ safety online across the entire Union and improve the protection of their fundamental rights. Furthermore, an obligation for certain online platforms to receive, store and partially verify and publish information on traders using their services will ensure a safer and more transparent online environment for consumers. Recognising the particular impact of very large online platforms on our economy and society, the proposal sets a higher standard of transparency and accountability on how the providers of such platforms moderate content, on advertising and on algorithmic processes. It sets obligations for them to assess the risks their systems pose and to develop appropriate risk management tools to protect the integrity of their services against the use of manipulative techniques. The operational threshold for service providers in scope of these obligations includes those online platforms with a significant reach in the Union, currently estimated at more than 45 million recipients of the service. This threshold is proportionate to the risks brought by the reach of the platforms in the Union; where the Union’s population changes by a certain percentage, the Commission will adjust the number of recipients considered for the threshold, so that it consistently corresponds to 10 % of the Union’s population. Additionally, the Digital Services Act will set out a co-regulatory backstop, including building on existing voluntary initiatives.
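To illustrate the arithmetic of that threshold, the short sketch below (an editorial illustration, not part of the proposal) computes the recipient threshold as a 10 % share of an assumed EU-27 population of roughly 447 million, which yields the ~45 million figure cited above:

    # Illustrative sketch: the "very large online platform" threshold tracks
    # 10% of the Union's population (about 45 million at the time of the
    # proposal). EU_POPULATION is an assumed figure (EU-27, ~2020), used
    # here for illustration only; the adjustment mechanism itself is set
    # by the Regulation, not by this script.
    EU_POPULATION = 447_000_000

    def vlop_threshold(population: int, share: float = 0.10) -> int:
        # Recipient threshold = fixed share of the Union's population.
        return round(population * share)

    print(vlop_threshold(EU_POPULATION))  # -> 44700000, i.e. roughly 45 million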

The proposal maintains the liability rules for providers of intermediary services set out in the e-Commerce Directive – by now established as a foundation of the digital economy and instrumental to the protection of fundamental rights online. Those rules have been interpreted by the Court of Justice of the European Union, thus providing valuable clarifications and guidance. Nevertheless, to ensure an effective harmonisation across the Union and avoid legal fragmentation, it is necessary to include those rules in a Regulation. It is also appropriate to clarify some aspects of those rules to eliminate existing disincentives towards voluntary own-initiative investigations undertaken by providers of intermediary services to ensure their users’ safety and to clarify their role from the perspective of consumers in certain circumstances. Those clarifications should help smaller, innovative providers scale up and grow by benefitting from greater legal certainty.

A deeper, borderless single market for digital services requires enhanced cooperation among Member States to guarantee effective oversight and enforcement of the new rules set out in the proposed Regulation. The proposal sets clear responsibilities for the Member State supervising the compliance of service providers established in its territory with the obligations set by the proposed Regulation. This ensures the swiftest and most effective enforcement of rules and protects all EU citizens. It aims to provide simple and clear processes for both citizens and service providers to find relief in their interactions with supervisory authorities. Where systemic risks emerge across the Union, the proposed Regulation provides for supervision and enforcement at Union level.

Consistency with existing policy provisions in the policy area

The current EU legal framework regulating digital services is underpinned, first and foremost, by the e-Commerce Directive. This proposed Regulation is without prejudice to the e-Commerce Directive, and builds on the provisions laid down therein, notably on the internal market principle set out in Article 3. The proposed Regulation provides for a cooperation and coordination mechanism for the supervision of the obligations it imposes. With regard to the horizontal framework of the liability exemption for providers of intermediary services, this Regulation deletes Articles 12-15 in the e-Commerce Directive and reproduces them in the Regulation, maintaining the liability exemptions of such providers, as interpreted by the Court of Justice of the European Union.

Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content. Such orders, in particular where they require the provider to prevent that illegal content reappears, must be issued in compliance with Union law, in particular with the prohibition of general monitoring obligations, as interpreted by the Court of Justice of the European Union 8 . This proposal, in particular its Article 8, leaves this case-law unaffected. This proposal should constitute the appropriate basis for the development of robust technologies to prevent the reappearance of illegal information, accompanied with the highest safeguards to avoid that lawful content is taken down erroneously; such tools could be developed on the basis of voluntary agreements between all parties concerned and should be encouraged by Member States; it is in the interest of all parties involved in the provision of intermediary services to adopt and implement such procedures; the provisions of this Regulation relating to liability should not preclude the development and effective operation, by the different interested parties, of technical systems of protection and identification and of automated recognition made possible by digital technology within the limits laid down by Regulation 2016/679.

Consistency with other Union policies

The proposed Regulation introduces a horizontal framework for all categories of content, products, services and activities on intermediary services. The illegal nature of such content, products or services is not defined in this Regulation but results from Union law or from national law in accordance with Union law.

Sector-specific instruments do not cover all the regulatory gaps evidenced in the impact assessment report: they do not provide fully-fledged rules on the procedural obligations related to illegal content and they only include basic rules on transparency and accountability of service providers and limited oversight mechanisms. In addition, sector-specific laws cover situations where adapted approaches are necessary. In terms of scope, they are limited from two perspectives. First, the sector-specific interventions address a small subset of issues (e.g. copyright infringements, terrorist content, child sexual abuse material or illegal hate speech, some illegal products). Second, they only cover the dissemination of such content on certain types of services (e.g. a subset of online platforms for copyright infringements, only video-sharing platforms and only as regards audiovisual terrorist content or hate speech). However, it is important that the relationship between the new proposed Regulation and the sector-specific instruments is clarified.

The proposed Regulation complements existing sector-specific legislation and does not affect the application of existing EU laws regulating certain aspects of the provision of information society services, which apply as lex specialis. By way of example, the obligations set out in Directive 2010/13/EU, as amended by Directive (EU) 2018/1808, on video-sharing platform providers (“AVMSD”) as regards audiovisual content and audiovisual commercial communications will continue to apply. However, this Regulation applies to those providers to the extent that the AVMSD or other Union legal acts, such as the proposal for a Regulation on addressing the dissemination of terrorist content online, do not contain more specific provisions applicable to them.

The framework established in Regulation (EU) 2019/1150 on promoting fairness and transparency for business users of online intermediation services, which seeks to ensure that business users of such services, and corporate website users in relation to online search engines, are granted appropriate transparency, fairness and effective redress possibilities, will apply as lex specialis.

Furthermore, the rules set out in the present proposal will be complementary to the consumer protection acquis, specifically with regard to Directive (EU) 2019/2161 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU, which establishes specific rules to increase transparency as to certain features offered by certain information society services.

This proposal is without prejudice to the Regulation (EU) 2016/679 (the General Data Protection Regulation) and other Union rules on protection of personal data and privacy of communications. For example, the measures concerning advertising on online platforms complement but do not amend existing rules on consent and the right to object to processing of personal data. They impose transparency obligations towards users of online platforms, and this information will also enable them to make use of their rights as data subjects. They also enable scrutiny by authorities and vetted researchers on how advertisements are displayed and how they are targeted.

This proposal will be complemented by further actions under the European Democracy Action Plan COM(2020) 790 final, with the objective of empowering citizens and building more resilient democracies across the Union. In particular, the rules on codes of conduct established in this Regulation could serve as a basis and be complemented by a revised and strengthened Code of practice on disinformation, building on the guidance of the Commission.

The proposal is also fully consistent with, and further supports, the equality strategies adopted by the Commission in the context of the Union of Equality. The proposal is without prejudice to the Commission’s initiative aimed at improving the labour conditions of people working through digital platforms.

Finally, the proposed regulation builds on the Recommendation on illegal content of 2018. 9 It takes account of experiences gained with self-regulatory efforts supported by the Commission, such as the Product Safety Pledge 10 , the Memorandum of Understanding against counterfeit goods 11 , the Code of Conduct against illegal hate speech 12 , and the EU Internet Forum with regard to terrorist content.

2. LEGAL BASIS, SUBSIDIARITY AND PROPORTIONALITY

Legal basis

The legal basis for the proposal is Article 114 of the Treaty on the Functioning of the European Union, which provides for the establishment of measures to ensure the functioning of the Internal Market.

The primary objective of this proposal is to ensure the proper functioning of the internal market, in particular in relation to the provision of cross-border digital services (more specifically, intermediary services). In line with this objective, the proposal aims to ensure harmonised conditions for innovative cross-border services to develop in the Union, by addressing and preventing the emergence of obstacles to such economic activity resulting from differences in the way national laws develop, taking into account that several Member States have legislated or intend to legislate on issues such as the removal of illegal content online, diligence, notice and action procedures and transparency. At the same time, the proposal provides for the appropriate supervision of digital services and cooperation between authorities at Union level, therefore supporting trust, innovation and growth in the internal market.

Subsidiarity

Taking into account that the Internet is by its nature cross-border, the legislative efforts at national level referred to above hamper the provision and reception of services throughout the Union and are ineffective in ensuring the safety and uniform protection of the rights of Union citizens and businesses online. Harmonising the conditions for innovative cross-border digital services to develop in the Union, while maintaining a safe online environment, can only be achieved at Union level.

Union-level action provides predictability and legal certainty, and reduces compliance costs across the Union. At the same time, it fosters the equal protection of all Union citizens, by ensuring that action against illegal content online by providers of intermediary services is consistent, regardless of their place of establishment. A well-coordinated supervisory system, reinforced at Union level, also ensures a coherent approach applicable to providers of intermediary services operating in all Member States.

To effectively protect users online, and to avoid that Union-based digital service providers are subject to a competitive disadvantage, it is necessary to also cover relevant service providers established outside of the Union that operate on the internal market.

Proportionality

The proposal seeks to foster responsible and diligent behaviour by providers of intermediary services to ensure a safe online environment, which allows Union citizens and other parties to freely exercise their fundamental rights, in particular the freedom of expression and information. Key features of the proposal limit the Regulation to what is strictly necessary to achieve those objectives.

In particular, the proposal sets asymmetric due diligence obligations on different types of digital service providers depending on the nature of their services and their size, to ensure that their services are not misused for illegal activities and that providers operate responsibly. This approach addresses certain identified problems only where they materialise, while not overburdening providers unconcerned by those problems. Certain substantive obligations are limited only to very large online platforms, which due to their reach have acquired a central, systemic role in facilitating the public debate and economic transactions. Very small providers are exempt from the obligations altogether.

As regards digital service providers established outside of the Union offering services in the Union, the Regulation requires the appointment of a legal representative in the Union to ensure effective oversight and, where necessary, enforcement.

Proportionate to the obligations, and taking into account the cross-border nature of digital services, the proposal will introduce a cooperation mechanism across Member States with enhanced Union level oversight of very large online platforms. Additionally, the proposal does not amend sector-specific legislation or the enforcement and governance mechanisms set thereunder, but provides for a horizontal framework to rely on, for aspects beyond specific content or subcategories of services regulated in sector-specific acts.

By establishing a clear framework, accompanied by cooperation between and with Member States, as well as by self-regulation, this proposal aims to enhance legal certainty and increase trust levels, while staying relevant and effective in the long term because of the flexibility of the cooperation framework.

Choice of the instrument

Article 114 of the Treaty on the Functioning of the European Union gives the legislator the possibility to adopt regulations and directives.

The Commission has decided to put forward a proposal for a Regulation to ensure a consistent level of protection throughout the Union and to prevent divergences hampering the free provision of the relevant services within the internal market, as well as guarantee the uniform protection of rights and uniform obligations for business and consumers across the internal market. This is necessary to provide legal certainty and transparency for economic operators and consumers alike. The proposed Regulation also ensures consistent monitoring of the rights and obligations, and equivalent sanctions in all Member States, as well as effective cooperation between the supervisory authorities of different Member States and at Union level.

3. RESULTS OF EX-POST EVALUATIONS, STAKEHOLDER CONSULTATIONS AND IMPACT ASSESSMENTS

Ex-post evaluations/fitness checks of existing legislation

This proposal builds on the evaluation of the e-Commerce Directive, conducted as a ‘back to back’ evaluation with the Impact Assessment accompanying the proposal. The specific objectives of the e-Commerce Directive were to ensure (i) a well-functioning internal market for digital services, (ii) the effective removal of illegal content online in full respect of fundamental rights, and (iii) an adequate level of information and transparency for consumers.

As regards the effectiveness of the e-Commerce Directive, the evaluation shows that while the e-Commerce Directive has provided an important incentive for the growth of the internal market for digital services, and enabled entry and scaling up of new providers of such services, the initial objectives have not been fully achieved.

In particular, the dynamic growth of the digital economy and the appearance of new types of service providers raise certain new challenges, which are dealt with differently by Member States and in light of which the initial set of objectives needs to be clarified. In addition, these developments put an additional strain on achieving already existing objectives, as the increased legal fragmentation shows.

The evaluation also showed that while several new regulatory instruments make valuable contributions to the attainment of some of the policy objectives set out in the e-Commerce Directive, they provide only sector-specific solutions for some of the underlying problems (e.g. in addressing the proliferation of specific types of illegal activity). They therefore do not address such issues consistently for the entire digital ecosystem, as they are limited to certain types of services or certain types of illegal content. Furthermore, while self-regulatory initiatives have generally shown positive results, they cannot be legally enforced, nor do they cover all participants in the digital economy.

As regards the efficiency of the e-Commerce Directive, the Directive imposed only limited additional costs for Member States' administrations and providers of information society services. The evaluation has not revealed particularly high or disproportionate costs and no substantial concerns have been raised regarding impacts on small and medium-sized enterprises. The main concern in this regard is related to the lack of clarity in the cooperation mechanism across Member States, creating burdens and duplication of costs, despite the opposite objective of the Directive, in particular where the supervision of online platforms is concerned. This has essentially reduced its efficiency in maintaining the functioning of the internal market.

In relation to questions about the continued relevance of the objectives pursued by the e-Commerce Directive, the evaluation shows that the objectives of the e-Commerce Directive continue to remain valid, while at the same time there are several new developments that are not well reflected in the existing public policy objectives.

First, the open public consultation, targeted submissions by stakeholders, reports issued by the European Parliament 13 as well as Council conclusions 14 confirm that the existing principles and objectives of the e-Commerce Directive remain valid today. However, new information asymmetries and risks have arisen since the entry into force of the Directive, notably related to the emergence of online platforms, in particular very large ones, and the scale of the digital transformation. This is for example the case in the areas of algorithmic decision making (with an impact on how information flows are intermediated online), or in online advertising systems.

The evaluation showed that the e-Commerce Directive is coherent with other EU interventions that have taken place since its adoption. The evaluation also did not identify any internal incoherence of the e-Commerce Directive.

Finally, at least parts of the actual benefits of the e-Commerce Directive that the evaluation identified could be considered as EU added value. It is likely that Member States would have continued applying their own regulatory systems without any common set of principles and that some Member States would have continued to have no horizontal rules in place at all. In the absence of robust evidence, it is however not possible to draw firm conclusions on the extent of this EU added value.

Stakeholder consultations

Over the past five years, the Commission has consulted a wide range of different stakeholders, including providers of digital services such as online platforms and other intermediary services, businesses trading online, media publishers, brand owners and other businesses, social partners, users of digital services, civil society organisations, national authorities, academia, the technical community, international organisations and the general public. An array of targeted consultation steps has thoroughly captured stakeholder views on issues related to digital services and platforms in recent years.

The open public consultation on the Digital Services Act was open for 14 weeks, between 2 June and 8 September 2020, and received 2,863 responses and around 300 position papers from a diverse group of stakeholders. Most feedback was submitted by the general public (66% from Union citizens, 8% from non-EU citizens), companies/business organisations (7.4%), business associations (6%), and NGOs (5.6%). This was followed by public authorities (2.2%), academic/research institutions (1.2%), trade unions (0.9%), and consumer and environmental organisations (0.4%).

Overall, there is general agreement amongst stakeholders on the need for action, both in addressing online safety and in furthering the internal market for digital services.

Stakeholders converge on the continued relevance of the main principles of the e-Commerce Directive and agree that they should be maintained, including the internal market principle for the supervision of digital services, the liability regime, and the prohibition of general monitoring obligations.

Stakeholders also broadly agree on the need to upgrade the framework in light of today’s challenges by establishing clear obligations for service providers, harmonised across the EU. A majority of respondents, all categories included, indicated that they have encountered both harmful and illegal content, goods or services online, and specifically noted an alarming spike during the Covid-19 pandemic. A large share of respondents who say they have notified illegal content or goods to digital service providers expressed their dissatisfaction with the response and the ineffectiveness of reporting mechanisms after the exposure took place. Moreover, users perceive there to be a mismatch between providers’ policies as stated and their concrete actions.

There is broad consensus, including among service providers responding to the consultation, on the need for simple, standardised, transparent notice and action obligations, harmonised across the internal market. This is considered essential to enable rapid responses to illegal content and enhance legal clarity for users of platforms and for small platforms seeking to scale in the internal market. Respondents also agree on the importance of redress mechanisms.

Concerning online marketplaces, several stakeholders flagged the need for more targeted measures such as the identification of sellers.

Respondents also generally agree that the territorial scope for these obligations should include all players offering goods, information or services in the Union, regardless of their place of establishment. A large share of respondents also emphasized the importance of these issues in particular where large platforms are concerned.

There is a general agreement among stakeholders that ‘harmful’ (yet not, or at least not necessarily, illegal) content should not be defined in the Digital Services Act and should not be subject to removal obligations, as this is a delicate area with severe implications for the protection of freedom of expression.

However, the way algorithmic systems shape information flows online is an area of concern among a wide category of stakeholders. Several stakeholders, in particular civil society and academics, pointed out the need for algorithmic accountability and transparency audits, especially with regard to how information is prioritized and targeted. Similarly, regarding online advertising, stakeholder views echoed the broad concerns around the lack of user empowerment and lack of meaningful oversight and enforcement.

When it comes to enforcement, there is a general understanding among stakeholders that cooperation between authorities should be improved both cross-border and within each Member State. EU oversight is considered crucial and the majority of respondents seems to favour a unified oversight entity.

Collection and use of expertise

The preparatory steps for the proposal rest on an array of studies and expert advice, including a number of commissioned legal studies focusing on the implementation of the e-Commerce Directive and the state of legal fragmentation 15 , studies on algorithmic transparency and accountability 16 , as well as internal studies on the costs of content moderation, liability regimes for intermediaries, and the cost of non-Europe, with the support of the Joint Research Centre of the European Commission. For gathering the views and perceptions of the general public, the Commission ran a Eurobarometer survey in 2018 with a representative sample of over 33,000 respondents from all Member States 17 .

The legal analysis also rests on a rich collection of case law, in particular of the Court of Justice of the European Union, on several provisions of the e-Commerce Directive and related acts, such as provisions concerning the interpretation of the notion of “information society services” 18 or provisions concerning the liability of intermediary service providers 19 . The Commission also gathered expertise and views through targeted consultations and engagement activities, including a series of workshops, conferences, interviews with experts and judges, consultations of the Expert Group on e-commerce, as well as numerous bilateral meetings and analysis of ad hoc position and research papers from organisations, industry representatives, civil society and academia.

Finally, the analysis rests on additional literature review, studies and research papers submitted by academics in the open public consultation and other independent studies, including the collection of studies carried out for the European Parliament 20 .

Impact assessment

The Regulatory Scrutiny Board issued a positive opinion with reservations on the impact assessment, including suggestions for improvement 21 . The Impact Assessment report was further revised along these lines, notably by clarifying the interlinks between the Digital Services Act and the broader regulatory framework, providing more detailed descriptions of the policy options, and analysing the underlying evidence in more detail.

The importance of digital services in our economy and society, but also the risks they bring, will continue to grow. In the baseline scenario, the Commission will continue to enforce existing rules, including on sector-specific issues, and will support the self-regulatory efforts in place. However, faced with the evolving problems, Member States will continue to legislate independently. The legal fragmentation with the resulting patchwork of national measures will not just fail to effectively tackle illegal activities and protect citizens’ fundamental rights throughout the EU, it will also hinder new, innovative services from scaling up in the internal market, cementing the position of the few players which can afford the additional compliance costs. This leaves the rule setting and enforcement mostly to very large private companies, with ever-growing information asymmetry between online services, their users and public authorities.

Three main policy options were assessed, in addition to the baseline. Option 1 would codify the Recommendation of 2018: it would lay down a range of procedural obligations for online platforms to tackle illegal activities conducted by their users. The obligations would also include the necessary safeguards in order to protect users’ fundamental rights and ensure transparency. It would also enhance the administrative cooperation mechanisms for authorities to resolve cross-border issues through a digital clearing house, facilitating information flows. Option 2 would, in addition to the measures in Option 1, remove disincentives for service providers to take voluntary measures against illegal content, and introduce measures to enhance transparency around recommender systems and advertising. The enforcement and cooperation mechanism would be enhanced with the appointment of a central coordinator in each Member State. Option 3, building on the measures outlined in the previous options, includes targeted, asymmetric measures with stronger obligations for very large online platforms, which are prone to the highest levels of risk for the EU society and economy, as well as certain limited clarifications of the liability regime for providers of intermediary services and an EU governance system with reinforced oversight and enforcement powers.

The assessment of the economic and social impacts identified, and the comparison of the options’ effectiveness, efficiency, coherence and proportionality, showed that Option 3 would most effectively meet the objectives of the intervention by establishing a proportionate framework fit for adapting to emerging challenges in the dynamic digital world. The components included in Option 3 are also broadly supported by stakeholders, including the European Parliament and Member States.

The preferred option would support access to the internal market for European Union intermediary service providers and their ability to scale up by reducing costs related to legal fragmentation. While costs for compliance with the due diligence obligations are expected, these are estimated to be offset by the reduction of the current fragmentation through harmonisation. The option is expected to have a positive impact on competitiveness, innovation and investment in digital services, in particular for European Union start-ups and scale-ups offering platform business models, but also, to varying extents, for sectors underpinned and amplified by digital commerce.

The preferred option intends to define the appropriate division of responsibilities between intermediary services, their recipients and authorities when fighting illegal content online. To do so, it introduces an asymmetric approach to the due diligence obligations imposed on very large online platforms: a supervised risk management approach, in which the governance system plays an important enforcement role. The asymmetric obligations are only imposed on very large online platforms, which, based on current data, not only have the broadest reach but are also large companies with significant turnover. Consequently, while the targeted measures are more restrictive than those for other companies, they are proportionate to the companies’ ability to comply.

For public authorities, the proposed option would cut the costs brought by the inefficiencies and duplications in the existing set-up for the cooperation of authorities. While Member States would bear the costs of appointing a competent authority, new or already established, the efficiency gains are expected to outweigh them: for the individual authorities through the mutualisation of resources, better information flows, and straightforward processes for interacting with their counterparts across the internal market, as well as with service providers.

Regulatory fitness and simplification

The Impact Assessment accompanying this proposal identifies the sole added value of Union intervention addressing the risk of legal fragmentation triggered by divergent regulatory and supervisory approaches (hence without accounting for the increased safety and trust in digital services) as a possible increase of cross-border digital trade of 1% to 1.8%, i.e. the equivalent of an increase in turnover generated cross-border of EUR 8.6 billion and up to EUR 15.5 billion.
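As a back-of-the-envelope check of these figures (an editorial illustration; it assumes the percentage range and the euro range refer to the same baseline of cross-border turnover), both endpoints imply a baseline of roughly EUR 860 billion:

    # Consistency check: do the 1%-1.8% growth range and the EUR 8.6-15.5
    # billion turnover range imply the same baseline of cross-border trade?
    low_turnover, high_turnover = 8.6e9, 15.5e9   # EUR
    low_pct, high_pct = 0.01, 0.018

    baseline_low = low_turnover / low_pct     # ~EUR 860 billion
    baseline_high = high_turnover / high_pct  # ~EUR 861 billion

    print(f"implied baseline: {baseline_low/1e9:.0f} to {baseline_high/1e9:.0f} bn EUR")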

With regard to added value in the enforcement of measures, the initiative creates important efficiency gains in the cooperation across Member States and mutualising some resources for technical assistance at EU level, for inspecting and auditing content moderation systems, recommender systems and online advertising on very large online platforms. This, in turn, leads to an increased effectiveness of enforcement and supervision measures, whereas the current system relies to a large extent on the limited capability for supervision in a small number of Member States.

Fundamental rights

Union citizens and others are exposed to ever-increasing risks and harms online – from the spread of illegal content and activities, to limitations on their ability to express themselves and other societal harms. The envisaged policy measures in this legislative proposal will substantially improve this situation by providing a modern, future-proof governance framework, effectively safeguarding the rights and legitimate interests of all parties involved, most of all Union citizens. The proposal introduces important safeguards to allow citizens to freely express themselves, while enhancing user agency in the online environment, as well as the exercise of other fundamental rights such as the right to an effective remedy, non-discrimination, the rights of the child, and the protection of personal data and privacy online.

The proposed Regulation will mitigate risks of erroneous or unjustified blocking of speech, address the chilling effects on speech, stimulate the freedom to receive information and hold opinions, and reinforce users’ redress possibilities. Specific groups or persons may be vulnerable or disadvantaged in their use of online services because of their gender, race or ethnic origin, religion or belief, disability, age or sexual orientation. They can be disproportionately affected by restrictions and removal measures following from (unconscious or conscious) biases potentially embedded in the notification systems by users and third parties, as well as replicated in automated content moderation tools used by platforms. The proposal will mitigate discriminatory risks, particularly for those groups or persons, and will contribute to the protection of the rights of the child and the right to human dignity online. The proposal will only require the removal of illegal content and will impose mandatory safeguards when users’ information is removed, including the provision of explanatory information to the user, complaint mechanisms supported by the service providers, as well as external out-of-court dispute resolution mechanisms. Furthermore, it will ensure EU citizens are also protected when using services provided by providers not established in the Union but active on the internal market, since those providers are covered too.

With regard to service providers’ freedom to conduct a business, the costs incurred by businesses are offset by reducing fragmentation across the internal market. The proposal introduces safeguards to alleviate the burden on service providers, including measures against repeated unjustified notices and prior vetting of trusted flaggers by public authorities. Furthermore, certain obligations are targeted at very large online platforms, where the most serious risks often occur and which have the capacity to absorb the additional burden.

The proposed legislation will preserve the prohibition of general monitoring obligations of the e-Commerce Directive, which in itself is crucial to the required fair balance of fundamental rights in the online world. The new Regulation prohibits general monitoring obligations, as they could disproportionately limit users’ freedom of expression and freedom to receive information, and could burden service providers excessively and thus unduly interfere with their freedom to conduct a business. The prohibition also limits incentives for online surveillance and has positive implications for the protection of personal data and privacy.

All measures in the proposal are fully compliant and aligned with the high standard of personal data protection and protection of privacy of communications and private life, set in EU legislation.

4. BUDGETARY IMPLICATIONS

The budgetary impact of the proposal will be covered by the allocations foreseen in the MFF 2021-27 under the financial envelopes of the Digital Europe Programme and the Single Market Programme, as detailed in the legislative financial statement accompanying this proposal for a Regulation. These implications also require reprogramming of Heading 7 of the Financial Perspective.

The legislative financial statement accompanying this proposal for a Regulation covers the budgetary impacts for the Regulation itself.

5. OTHER ELEMENTS

Implementation plans and monitoring, evaluation and reporting arrangements

The Commission will establish a comprehensive framework for continuously monitoring the output, results and impact of this legislative instrument upon the date of its application. Based on the established monitoring programme, an evaluation of the instrument is envisaged within five years from its entry into force.

Detailed explanation of the specific provisions of the proposal

Chapter I sets out general provisions, including the subject matter and scope of the Regulation (Article 1) and the definitions of key terms used in the Regulation (Article 2).

Chapter II contains provisions on the exemption of liability of providers of intermediary services. More specifically, it includes the conditions under which providers of mere conduit (Article 3), caching (Article 4) and hosting services (Article 5) are exempt from liability for the third-party information they transmit and store. It also provides that the liability exemptions should not be disapplied when providers of intermediary services carry out voluntary own-initiative investigations or comply with the law (Article 6) and it lays down a prohibition of general monitoring or active fact-finding obligations for those providers (Article 7). Finally, it imposes an obligation on providers of intermediary services in respect of orders from national judicial or administrative authorities to act against illegal content (Article 8) and to provide information (Article 9).

Chapter III sets out the due diligence obligations for a transparent and safe online environment, in five different sections.

Section 1 lays down obligations applicable to all providers of intermediary services, in particular: the obligation to establish a single point of contact to facilitate direct communication with Member States’ authorities, the Commission and the Board (Article 10); the obligation to designate a legal representative in the Union for providers not established in any Member State, but offering their services in the Union (Article 11); the obligation to set out in their terms and conditions any restrictions that they may impose on the use of their services and to act responsibly in applying and enforcing those restrictions (Article 12); and transparency reporting obligations in relation to the removal and the disabling of information considered to be illegal content or contrary to the providers’ terms and conditions (Article 13).

Section 2 lays down obligations, additional to those under Section 1, applicable to providers of hosting services. In particular, that section obliges those providers to put in place mechanisms to allow third parties to notify the presence of alleged illegal content (Article 14). Furthermore, if such a provider decides to remove or disable access to specific information provided by a recipient of the service, it imposes the obligation to provide that recipient with a statement of reasons (Article 15).

Section 3 lays down obligations applicable to all online platforms, additional to those under Sections 1 and 2. The Section specifies that it does not apply to online platforms that are micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC (Article 16). The Section lays down the obligation for online platforms to provide an internal complaint-handling system in respect of decisions taken in relation to alleged illegal content or information incompatible with their terms and conditions (Article 17). It also obliges online platforms to engage with certified out-of-court dispute settlement bodies to resolve any dispute with users of their services (Article 18). It further obliges online platforms to ensure that notices submitted by entities granted the status of trusted flaggers are treated with priority (Article 19) and sets out the measures online platforms are to adopt against misuse (Article 20). Furthermore, this Section includes a requirement for online platforms to inform competent enforcement authorities in the event they become aware of any information giving rise to a suspicion of serious criminal offences involving a threat to the life or safety of persons (Article 21). The Section also obliges online platforms to receive, store, make reasonable efforts to assess the reliability of and publish specific information on the traders using their services where those online platforms allow consumers to conclude distance contracts with those traders (Article 22). Those online platforms are also obliged to organise their interface in a way that enables traders to respect Union consumer and product safety law (Article 22a). Online platforms are also obliged to publish reports on their activities relating to the removal and the disabling of information considered to be illegal content or contrary to their terms and conditions (Article 23). The Section also includes transparency obligations for online platforms in respect of online advertising (Article 24).

Section 4 lays down obligations, additional to the obligations laid down in Sections 1 to 3, for very large online platforms (as defined by Article 25) to manage systemic risks. Very large online platforms are obliged to conduct risk assessments on the systemic risks brought about by or relating to the functioning and use of their services (Article 26) and to take reasonable and effective measures aimed at mitigating those risks (Article 27). They are also to submit themselves to external and independent audits (Article 28). The Section includes also a specific obligation in case very large online platforms use recommender systems (Article 29) or display online advertising on their online interface (Article 30). Furthermore, the Section sets out the conditions under which very large online platforms provide access to data to the Digital Services Coordinator of establishment or the Commission and vetted researchers (Article 31), the obligation to appoint one or more compliance officers to ensure compliance with the obligations laid down in the Regulation (Article 32) and specific, additional transparency reporting obligations (Article 33).

Section 5 contains transversal provisions concerning due diligence obligations, namely the processes for which the Commission will support and promote the development and implementation of harmonised European standards (Article 34); the framework for the development of codes of conduct (Article 35); and the framework for the development of specific codes of conduct for online advertising (Article 36). There is also a provision on crisis protocols to address extraordinary circumstances affecting public security or public health (Article 37).

Chapter IV contains the provisions concerning the implementation and enforcement of this Regulation.

Section 1 lays down provisions concerning national competent authorities, including Digital Services Coordinators, which are the primary national authorities designated by the Member States for the consistent application of this Regulation (Article 38). The Digital Services Coordinators, like other designated competent authorities, are independent and perform their tasks impartially, transparently and in a timely manner (Article 39). Member States where the main establishment of the provider is located have jurisdiction to enforce this Regulation (Article 40). The Digital Services Coordinators are granted specific powers (Article 41). Member States are to lay down rules on penalties applicable to breaches of the obligations by providers of intermediary services under this Regulation (Article 42). Digital Services Coordinators can receive complaints against providers of intermediary services for breaches of the obligations laid down in this Regulation (Article 43). Digital Services Coordinators are required to publish annual reports on their activities (Article 44) and to cooperate with Digital Services Coordinators of other Member States (Article 45). Digital Services Coordinators can also participate in joint investigations with regard to matters covered by the Regulation (Article 46).

Section 2 lays down provisions regarding the European Board for Digital Services, an independent advisory group of Digital Services Coordinators (Article 47). It also sets out the structure of that Board (Article 48) and its tasks (Article 49).

Section 3 concerns the supervision, investigation, enforcement and monitoring of very large online platforms. It provides for enhanced supervision in the event such platforms infringe the provisions of Chapter III, Section 4 (Article 50). It also provides the possibility for the Commission to intervene vis-à-vis very large online platforms in case the infringements persist (Article 51). In these cases the Commission can carry out investigations, including through requests for information (Article 52), interviews (Article 53) and on-site inspections (Article 54), can adopt interim measures (Article 55) and make binding the commitments offered by very large online platforms (Article 56), as well as monitor their compliance with the Regulation (Article 57). In case of non-compliance, the Commission can adopt non-compliance decisions (Article 58), as well as fines (Article 59) and periodic penalty payments (Article 60) for breaches of the Regulation by very large online platforms, as well as for the supply of incorrect, incomplete or misleading information in the context of the investigation. The Regulation also sets a limitation period for the imposition of penalties (Article 61) and for their enforcement (Article 62). Finally, the Regulation sets out the procedural guarantees before the Commission, in particular the right to be heard and of access to the file (Article 63) and the publication of decisions (Article 64). The Section also provides for the cooperation of the Commission with national courts (Article 65) and for the adoption of implementing acts on practical arrangements for the proceedings (Article 66).

Section 4 includes the common provisions on enforcement. It first lays down rules on an information-sharing system supporting communications between Digital Services Coordinators, the Commission and the Board (Article 67). It also includes the right of recipients of intermediary services to mandate a body, organisation or association to exercise their rights on their behalf (Article 68).

Section 5 relates to the adoption of delegated and implementing acts in accordance with Articles 290 and 291 of the Treaty on the Functioning of the European Union, respectively (Articles 69 and 70).

Finally, Chapter V contains the final provisions of this Regulation, which concern the deletion of Articles 12 to 15 of the e-Commerce Directive given that they have been incorporated in the Regulation (Article 71), amendments to Directive 2020/XX/EC (Article 72), evaluation of the Regulation (Article 73), and its entry into force and application (Article 74).


12.01.2021 Valentino Spataro
Europa




