Digital Services Act.


The European Commission has made a proposal for a Digital Services Act. The act is a regulation (not a directive, so it applies directly in member states without national transposition) which creates a framework for member state requests to internet service providers to remove illegal content. At the same time, some fundamental rights of citizens are improved. Or as the proposal says:

The resolution on ‘Digital Services Act: adapting commercial and civil law rules for commercial entities operating online’ calls for more fairness, transparency and accountability for digital services’ content moderation processes, ensuring that fundamental rights are respected, and guaranteeing independent recourse to judicial redress. The resolution also includes the request for a detailed ‘notice-and-action’ mechanism addressing illegal content, comprehensive rules about online advertising, including targeted advertising, and enabling the development and use of smart contracts.

You may think "oh no, the Fourth Reich is at it again", but no, it isn't. Not this time at least. Although the framework could in theory be used to censor, the proposal itself does not allow censorship. In fact, the act does not define what illegal content is. The definition of illegal content is: ... any information ... which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law.

To be fair to the "Fourth Reichers", here is what illegal content is thought of as: In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law.

So it is a rather broad scope, and what counts as terrorist content, or as unlawful versus lawful discriminatory content, maybe should have been defined. Obviously, there is the problem that the EU doesn't currently do much to enforce fundamental rights, because that is the Council of Europe's job.

So what does the Digital Services Act define? An ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. An ancillary feature is something like a comment section under a news story. Hosting services are defined under intermediary services:

(f)‘intermediary service’ means one of the following services:
  • a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
  • a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;
  • a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service;
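The three categories above can be read as a classification where each adds a storage condition on top of the previous one. Here is a minimal sketch of that reading in Python; the field names, the helper `classify`, and the example services are my own illustration of how I read the definitions, not anything from the act itself:

```python
from dataclasses import dataclass

@dataclass
class Service:
    """Hypothetical description of a network service (illustrative only)."""
    transmits_user_info: bool  # transmits information provided by a recipient
    stores_temporarily: bool   # automatic, intermediate and temporary storage
    stores_at_request: bool    # stores information at the recipient's request

def classify(service: Service) -> str:
    """Rough reading of the intermediary service categories in the act."""
    if service.stores_at_request:
        return "hosting"
    if service.transmits_user_info and service.stores_temporarily:
        return "caching"
    if service.transmits_user_info:
        return "mere conduit"
    return "not an intermediary service"

# An ISP just forwarding packets:
print(classify(Service(True, False, False)))  # mere conduit
# A CDN node caching pages for faster onward transmission:
print(classify(Service(True, True, False)))   # caching
# An email provider storing your mailbox:
print(classify(Service(True, False, True)))   # hosting
```

Note that ‘online platform’ then sits on top of ‘hosting’: it is a hosting service that additionally disseminates the stored information to the public.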

Basically, an email server provider falls under hosting, while YouTube, Twitch, Reddit, Neocities and LBRY are online platforms. I'm not entirely sure where a general server provider like AWS would land. Maybe there are two different agreements: one on information storage, and another when Amazon allows public access. So until the public-access agreement, the service wouldn't be an online platform.

I'm sure that YouTube, Twitch, and Reddit have a substantial connection to the Union and so qualify as offering services in the Union regardless of establishment. Neocities and LBRY I am not so sure of. I don't think Neocities has an establishment in the Union, so it would have to have targeting of activities towards one or more Member States or a significant number of users in one or more Member States to qualify. This is a vague definition. On purpose, so that it is the ECJ's fun time.

The framework is covered in Articles 8 (Orders to act against illegal content) and 9 (Orders to provide information). The articles are similar but there are odd differences. The intermediary service must, without undue delay, report the effect given to the order; but an information request (Article 9) also requires an acknowledgement of receipt of the order, while a takedown request (Article 8) requires specifying the action taken and the moment when the action was taken. It isn't clear to me whether the acknowledgement and the effect have to be sent at the same time under Article 9, or why under Article 8 you don't have to acknowledge the order. I guess stating when the action was taken carries similar information. A major difference between the articles is that the reason for an information request doesn't have to be given if withholding it is necessary for an investigation or prosecution. Other than that, member states have the right to add to this.

A hosting service isn't liable for any piece of information as long as it does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent, or upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content (Article 5). I approve that disabling access is mentioned, since that emphasizes that a content takedown can be geographically limited within the EU. Meaning that if Germany continues to be overly scared of Nazis, a citizen of another member state wouldn't have to worry about complete speech denial.

For awareness of the content, Article 6 says: Providers of intermediary services shall not be deemed ineligible for the exemptions from liability ... solely because they carry out voluntary own-initiative investigations ... to illegal content ..., and Article 7 says there is no general monitoring or active fact-finding obligation. These two most likely lead to the best practice being rather passive content moderation, since you don't really gain anything by being proactive and aren't forced to be. This passivity is countered via Article 14: Providers of hosting services shall put mechanisms in place to allow any ... entity to notify them of the presence ... of specific items of information that the ... entity considers to be illegal content. A valid notice is considered to confer awareness. In fact, the hosting service must process these notices in a timely, diligent and objective manner. "Objective" is the key word here, since it indicates that the service can't just react without considering the content owner.
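The notice-and-awareness mechanics above can be sketched as a small data flow: a notice with the required elements, a validity check, and a queue that records when awareness attached. This is my own illustrative model under the assumption that a notice needs an explanation, the item's exact location, the notifier's name and email, and a good-faith statement; the field and function names are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    """Hypothetical Article 14 notice; field names are my own."""
    explanation: str            # why the notifier considers the item illegal
    item_url: str               # exact electronic location of the specific item
    notifier_name: str          # name of the notifying person or entity
    notifier_email: str         # email address of the notifier
    good_faith_statement: bool  # confirmation the notice is made in good faith

def is_valid_notice(n: Notice) -> bool:
    """A notice with all required elements gives the host actual
    knowledge/awareness for liability purposes (Article 5)."""
    return all([n.explanation, n.item_url, n.notifier_name,
                n.notifier_email, n.good_faith_statement])

@dataclass
class ModerationQueue:
    """Notices must be processed in a timely, diligent, objective manner;
    here we just timestamp the moment awareness attached."""
    pending: list = field(default_factory=list)

    def submit(self, n: Notice) -> bool:
        if is_valid_notice(n):
            self.pending.append((datetime.now(timezone.utc), n))
            return True   # the host is now "aware" of this specific item
        return False      # an incomplete notice confers no awareness
```

The design point of the sketch: validity is what flips the liability switch, so a host has an incentive to check completeness before anything else.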

On user rights, the provider must state its reasons. Where a provider of hosting services decides to remove or disable ... it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision (Article 15). But this isn't all. The statement must say: whether it is a removal or a disabling of access; the factual circumstances for the decision, including whether it was triggered by another user or by automated means; if the content is illegal, a reference to the legal ground; if it is a terms of service violation, a reference to the contractual ground; and the redress possibilities. YouTube pretty much does this already, but it is good that it is now mandatory. It helps that you don't have to argue small stuff in court, especially since the reason has to be clear enough to allow redress (Article 15(3)).
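The required contents of a statement of reasons listed above map naturally onto a record type with a completeness check. A minimal sketch, assuming my reading of the required elements is right; all names here (`StatementOfReasons`, `check`, the field names) are my own, not terms from the act:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StatementOfReasons:
    """Sketch of the Article 15 statement contents; names are mine."""
    action: str                  # "removal" or "disabling of access"
    facts: str                   # factual circumstances relied on
    automated_detection: bool    # was the content detected by automated means
    legal_ground: Optional[str]  # if illegal content: the legal ground and why
    tos_ground: Optional[str]    # if a ToS violation: the clause and why
    redress: str                 # redress possibilities available to the user

def check(s: StatementOfReasons) -> list:
    """Return the missing elements; an empty list means complete."""
    missing = []
    if s.action not in ("removal", "disabling of access"):
        missing.append("action")
    if not s.facts:
        missing.append("facts")
    if s.legal_ground is None and s.tos_ground is None:
        missing.append("legal_ground or tos_ground")
    if not s.redress:
        missing.append("redress")
    return missing
```

The either/or between `legal_ground` and `tos_ground` reflects that a decision has to be anchored in at least one of the two: a law or the terms of service.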

Then we get to something a bit bizarre. Article 15 paragraph 4: Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. This is weird since it applies to every hosting service. I get why the rule exists: it is to have an independent record for legal issues. However, does this mean that if we have a small forum for knitters, they have to inform the EU every time they remove porn left by a troll? It seems like the database the EU is building here is going to be too large for its own good. And although it is good as "case law" for big players like YouTube and Twitch, since it sits on an EU website, a normal person is not going to use it as case law to understand what the terms of service mean.

I think I covered the main things I wanted to say. There are other good things in the act, like larger enterprises having to provide an internal dispute mechanism, a six-month complaint period, and some transparency on recommendation systems. Other interesting things in the act are: very large platforms have to assess the risk they pose to society and how to mitigate it, member states have to create the position of Digital Services Coordinator, and there are rules about crisis protocols. Overall, I think the law is good even with its problems. There is a need for this type of framework for child pornography or revenge porn. Even if it doesn't make things cheaper for intermediary services, the protections for users are welcome. I would maybe go further on the wording that terms of service must have, but this is a good enough start.