Digital Services Act After Committee.
The Digital Services Act now has a list of amendments from the European Parliament's first reading. The additions are overwhelmingly positive, but some problems remain. For example, Article 15 Paragraph 4, about hosting services publishing their moderation decisions, now allows bulk publication (at least once a year), and the Commission's database of these decisions has to be machine-readable. An improvement, but the fundamental problem of the article applying to any hosting service still exists. Only one new problem was, kind of, introduced.
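The amendment does not specify a schema, but to make "machine-readable" concrete, here is a hypothetical sketch of what one entry in the Commission's database could look like. Every field name and value here is my own guess:

```python
# Hypothetical record for one published moderation decision.
# The amendment only demands machine-readability; this schema,
# its field names and its values are all my own invention.
moderation_decision = {
    "service": "example-hosting-service",
    "decided_at": "2022-01-20",
    "action": "removal",                  # removal, disabling, demotion, ...
    "ground": "terms_of_service",         # or the specific law for illegal content
    "detected_by_automated_means": True,
    "decided_by_automated_means": False,
}
```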
The draft lists only the amendments, with the original proposal next to them, so if you want better context for what has been amended, the Commission's original proposal is here. I only quote the amendments column here. In the quotes I have bolded the same text the committee bolded: bold marks what the committee added when the article already existed in the proposal. What the committee deleted from pre-existing articles is bolded in the original's column and hence is not shown here. I did not see the point in quoting both, as the deletions weren't that interesting.
The definitions were kept pretty much in line with the proposal. Most changes here are about how "substantial connection to the Union" is written. The biggest addition is that a trusted flagger is now defined. This makes clear that a trusted flagger within the act is someone who gets that title from a Digital Services Coordinator. YouTube, after all, has used the same name for its own trusted flagger program.
Among the recitals, an interesting example was given:
(27a) A single webpage or website may include elements that qualify differently between ‘mere conduit’, ‘caching’ or hosting services and the rules for exemptions from liability should apply to each accordingly. For example, a search engine could act solely as a ‘caching’ service as to information included in the results of an inquiry. Elements displayed alongside those results, such as online advertisements, would however still qualify as a hosting service.
The point that one can divide elements of a webpage into services is rather interesting. For example, YouTube could be divided into multiple different hosting services. I suppose that comment sections would be categorised as auxiliary services. Also, I guess the user of the advertisement service could be the website, not the end-user.
A new paragraph was added to Article 6 Voluntary own-initiative investigations and legal compliance: 1a. Providers of intermediary services shall ensure that voluntary own-initiative investigations carried out and measures taken pursuant to paragraph 1 shall be effective and specific. Such own initiative investigations and measures shall be accompanied by appropriate safeguards, such as human oversight, documentation, or any additional measure to ensure and demonstrate that those investigations and measures are accurate, non-discriminatory, proportionate, transparent and do not lead to over-removal of content. Providers of intermediary services shall make best efforts to ensure that where automated means are used, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content.
I like this, as the documentation requirement means I can ask for that documentation if evidence is needed in court. Any investigation must have human oversight, so if the bot says a video is against the rules, does a human have to sign off on it? No, I think oversight here means that a human has to check later what the bot did, or that some other data analysis tool has to be used. "Transparent" is a rather weird requirement and I am not sure what would constitute transparent. A bot doing something is not transparent. Is "the bot has learned this thing is against the rules"? I suppose a test on test data could determine something, and the EU cannot require the impossible.
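To illustrate the "check later what the bot did" reading, here is a minimal sketch of post-hoc oversight: sample a slice of the automated decisions for human re-checking. The record format and the 5% rate are my own assumptions, not anything from the act:

```python
import random

def sample_for_human_review(automated_decisions, rate=0.05, seed=0):
    """Pick a random slice of the bot's decisions for later human re-checking.

    `automated_decisions` is any list of decision records; the 5% rate
    is an arbitrary illustration, not a number from the act.
    """
    rng = random.Random(seed)
    k = min(len(automated_decisions), max(1, int(len(automated_decisions) * rate)))
    return rng.sample(automated_decisions, k)

# The human reviewers' findings on the sample would then be part of the
# documentation the amendment asks providers to keep.
```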
A couple of limits on member states were given. In Article 7 No general monitoring or active fact-finding obligations, a paragraph was added which makes clear that a member state cannot demand general monitoring, here in the form of limiting anonymous use or blanket data retention: 1c. Member States shall not impose a general obligation on providers of intermediary services to limit the anonymous use of their services. Member States shall not oblige providers of intermediary services to generally and indiscriminately retain personal data of the recipients of their services. Any targeted retention of a specific recipient’s data shall be ordered by a judicial authority in accordance with Union or national law.
My understanding right now is that general monitoring obligations wouldn't go down well legally anywhere in the EU, so this one is somewhat of a no-brainer. I would consider this a basic right.
Another limit is Article 23's new paragraph 2a, which makes it so that member states cannot add transparency reporting obligations: 2a. Member States shall refrain from imposing additional transparency reporting obligations on the online platforms, other than specific requests in connection with the exercise of their supervisory powers.
Article 23's new paragraph, I suppose, is there to stop member states from creating more obligations and hence fragmenting the law. "Supervisory" is for one-time requests, maybe? Not sure.
Back to Article 6's new paragraph 1a, quoted above: it also demands that when an intermediary service does voluntary own-initiative investigations, the measures are proportionate, and its last sentence about automated means deserves a separate look.
The error rate is a good measure for automated means, since it is rather questionable whether algorithms that understand context 100% of the time even exist. Hence there is always some number of false positives. I suppose there is a chance one could argue that, based on their track record, the algorithm should whitelist them. I don't know. If there is a whitelist for mainstream media companies, then the argument probably follows. If there is not, then I could see an intermediary service arguing that it cannot extend trust to anyone. An automated trust system could potentially be abused.
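To make "error rate" concrete, here is a minimal sketch of measuring the rate of wrongly flagged legal content on labelled test data; the boolean record format is my own assumption:

```python
def false_positive_rate(flagged, actually_illegal):
    """Share of legal items the automated means wrongly flagged as illegal.

    `flagged` and `actually_illegal` are parallel lists of booleans.
    This is the error the amendment singles out: information wrongly
    considered to be illegal content.
    """
    wrong = sum(1 for f, y in zip(flagged, actually_illegal) if f and not y)
    legal = sum(1 for y in actually_illegal if not y)
    return wrong / legal if legal else 0.0

# Example: the bot flags one of the two legal items -> rate 0.5.
print(false_positive_rate([True, False, True, True],
                          [True, False, False, True]))
```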
Article 8 Orders to act against illegal content got a bunch of amendments. Paragraph 2's point (b) was edited to make clear that orders should be limited to the member state issuing the order:
(b) the territorial scope of the order on the basis of the applicable rules of Union and national law in conformity with Union law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; the territorial scope of the order shall be limited to the territory of the Member State issuing the order unless the illegality of the content derives directly from Union law or the rights at stake require a wider territorial scope, in accordance with Union and international law;
The territorial scope clause further makes clear that Germany cannot decide the lowest bar of speech protection and have it applied to the rest of Europe. I also like that only matters of Union law can be EU-wide. This does mean that obvious defamation can only be disabled in the issuing member state, never removed as a whole.
A new point (cb) was added under paragraph 2 to handle multiple hosts: (cb) where more than one provider of intermediary services is responsible for hosting the specific items of illegal content, the order is issued to the most appropriate provider that has the technical and operational ability to act against those specific items.
The new point (cb), I think, will be confusing. For example, does an LBRY instance host or cache a video? Videos uploaded to it, sure, but videos which the portal merely shows would perhaps be caching. This probably is not clear to the issuer, and LBRY may get notices for instances that it does not control but that show up in Odysee. I have to wonder how Odysee would react if those begin to arrive in huge amounts. This wording better captures BitTorrent-type systems, where everyone is a host, rather than LBRY or Mastodon, where there are multiple distinct hosts.
A new paragraph 4a was added so that a member state may act if an applicant requests it: 4a. Member States shall ensure that the relevant authorities may, at the request of an applicant whose rights are infringed by illegal content, issue against the relevant provider of intermediary services an injunction order in accordance with this Article to remove or disable access to that content.
I think this is just making clear that member states can act on someone's behalf if that someone wants it. I don't know whether this means a member state can build a service where orders for known illegal content are searched for and served. Actually, I don't really know in which situation this would happen. During a police investigation? The "may" part here means the relevant authorities do not have to do anything, so it is not clear. Maybe this is just clarifying that something is not prohibited.
For Article 10 Points of contact and Article 11 Legal representatives, paragraphs were added that let intermediary services combine or share these roles. The paragraphs are: 2a. Providers of intermediary services may establish the same single point of contact for this Regulation and another single point of contact as required under other Union law. When doing so, the provider shall inform the Commission of this decision.
and 5a. Providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Service Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation.
I think this is fine, because it can perhaps create an industry of points of contact, spreading knowledge of the law more widely and lowering the entry price. There are some problems with this idea, as national law can say no, but let's see.
A new Article 13a Online interface design and organisation was added about website design where users' choices are concerned:
1. Providers of intermediary services shall not use the structure, function or manner of operation of their online interface, or any part thereof, to distort or impair recipients of services’ ability to make a free, autonomous and informed decision or choice. In particular, providers of intermediary services shall refrain from:
(a) giving more visual prominence to any of the consent options when asking the recipient of the service for a decision;
(b) repeatedly requesting that a recipient of the service consents to data processing, where such consent has been refused, pursuant to Article 7(3) of Regulation (EU) 2016/679, regardless of the scope or purpose of such processing, especially by presenting a pop-up that interferes with user experience;
(c) urging a recipient of the service to change a setting or configuration of the service after the recipient has already made a choice;
(d) making the procedure of terminating a service significantly more cumbersome than signing up to it; or
(e) requesting consent where the recipient of the service exercises his or her right to object by automated means using technical specifications, in line with Article 21(5) of Regulation (EU) 2016/679.
This paragraph shall be without prejudice to Regulation (EU) 2016/679.
2. The Commission is empowered to adopt a delegated act to update the list of practices referred to in paragraph 1.
3. Where applicable, providers of intermediary services shall adapt their design features to ensure a high level of privacy, safety, and security by design for minors.
I'm not sure whether there is some duplication here, but at least paragraph 1 point (b) sounds like a direct response to the GDPR causing websites to constantly ask for privacy consent. Paragraph 3 is a bit vague. What is safe for a minor on YouTube? Forcing them into YouTube Kids?
A new paragraph 5a was added to Article 14 Notice and action mechanisms: 5a. The anonymity of individuals who submitted a notice shall be ensured towards the recipient of the service who provided the content, except in cases of alleged violations of personality rights or of intellectual property rights.
I'm a bit worried that this allows harassment through the mechanism. But since there is a complaint mechanism, and harassers usually have content directly asking for notices to be sent, there are other ways the harassment gets punished. Hence this is not a deal-breaker.
Article 15 Statement of reasons and Article 17 Internal complaint-handling system were updated to have a bigger list of actions which a hosting service can take against a piece of content. The full list now is: removal, disabling of access, demotion, or other measures imposed on visibility, availability or accessibility.
I did get what I wished for with this: YouTube cannot do anything to content without it being open to challenge. Shadowbanning is also now strictly illegal. Still, the database problem exists, as I mentioned in the introduction.
Now we get to Section 3 of the proposal, Additional Provisions Applicable to Online Platforms. In the original, micro and small enterprises were excluded from it. Added to that, an online platform can now apply for a waiver from anything under Section 3. New paragraphs 2 to 7 under Article 16 Exclusion for micro and small enterprises cover the waiver:
2. Providers of intermediary services may submit an application accompanied by a justification for a waiver from the requirements of this section [section 3] provided that they:
(a) do not present significant systemic risks and have limited exposure to illegal content; and
(b) qualify as non-for-profit or qualify as a medium enterprise within the meaning of the Annex to Recommendation 2003/361/EC.
3. The application shall be submitted to the Digital Services Coordinator of establishment who shall conduct a preliminary assessment. The Digital Services Coordinator of establishment shall transmit to the Commission the application accompanied by its assessment and where applicable, a recommendation on the Commission’s decision. The Commission shall examine such an application and, after consulting the Board, may issue a total or a partial waiver from the requirements of this Section.
4. Where the Commission grants such a waiver, it shall monitor the use of the waiver by the provider of intermediary services to ensure that the conditions for use of the waiver are respected.
5. Upon the request of the Board, the Digital Services Coordinator of establishment or the provider, or on its own initiative, the Commission may review or revoke the waiver in whole or in parts.
6. The Commission shall maintain a list of all waivers issued and their conditions and shall make the list publicly available.
7. The Commission shall be empowered to adopt a delegated act in accordance with Article 69 as to the process and procedure for the implementation of the waiver system in relation with this Article.
Maybe the medium enterprise requirement here is meant to let waivers give a growing business some time to develop the required systems. The not-for-profit part I don't understand. Maybe there is something about not-for-profit growth which makes it better handled under the waiver mechanism. I don't know. This is better language than the earlier proposed amendments had, but there is no time limit requirement on waivers. That is kinda bad, as I have to trust the Commission does not fuck this up. There is not a lot end-users can do to complain if a waiver is granted, and to get it removed the user has to hope that the Commission's conditions can be used.
Article 17 Internal complaint-handling system had a couple of interesting additions. Paragraph 5, which demands that complaints cannot be handled just by bots, got an interesting addition about a human interlocutor: 5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. Online platform shall ensure that decisions are taken by qualified staff.
So what does "where necessary" mean here? What does "at the time of the submission" mean here? I hope "where necessary" means there is something to be clarified. Like, the complaint was for identity theft of X, but X was not "shown" in the video, so everyone is confused about what happened. I think "at the time of the submission" means there is a checkbox on the complaint form, "request human interlocutor" or "request a meeting" or something like that. I didn't find anything in the recitals to confirm this.
A new paragraph in Article 17 Internal complaint-handling system makes clear that member states must allow recipients of the service a fast path to judicial redress: 5a. Recipients of the service shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned.
This is great in situations where a political campaign is held on online platforms. Ask the court for a fast fix, and the online platform must fix it if ordered. I'm a bit surprised that the amendment about protecting politicians didn't make it. That amendment said, along the lines of, "if an online platform is taking down politicians, then a request has to be made to the Digital Services Coordinator first". But this does cover some of what went missing.
One thing missing from Article 23 Transparency reporting obligations for providers of online platforms was how many complaints about moderation decisions an online platform gets. This would be great to know, as false positives cannot be estimated without it. Paragraph 1 got a new point (aa) to add this: (aa) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed;
There is the problem of whether online platforms report only EU numbers, and how they would know when a user is from the EU, but overall this is great.
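For point (aa), the aggregates are simple to compute once complaints are logged. A toy sketch, with a record format I made up:

```python
from statistics import mean, median

# Toy complaint log; the field names are mine, the amendment does not
# prescribe any format.
complaints = [
    {"days_to_decision": 2, "reversed": True},
    {"days_to_decision": 5, "reversed": False},
    {"days_to_decision": 1, "reversed": False},
]

days = [c["days_to_decision"] for c in complaints]
report = {
    "complaints_received": len(complaints),
    "average_days_to_decision": mean(days),
    "median_days_to_decision": median(days),
    "decisions_reversed": sum(c["reversed"] for c in complaints),
}
print(report)
```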
The end of Section 3 has two new articles. The first one I like; the second one creates problems which I may ask to be amended. The new Article 24a Recommender system transparency requires online platforms to open up their recommender systems:
1. Online platforms shall set out in their terms and conditions and via a designated online resource that can be directly reached and easily found from the online platform’s online interface when content is recommended, in a clear, accessible and easily comprehensible manner the main parameters used in their recommender systems, as well as any options for the recipient of the service to modify or influence those main parameters that they have made available.
2. The main parameters referred to in paragraph 1 shall include, at a minimum:
(a) the main criteria used by the relevant system which individually or collectively are most significant in determining recommendations;
(b) the relative importance of those parameters;
(c) what objectives the relevant system has been optimised for; and
(d) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
The requirements set out in paragraph 2 shall be without prejudice to rules on protection of trade secrets and intellectual property rights.
3. Where several options are available pursuant to paragraph 1, online platforms shall provide a clear and easily accessible function on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
This doesn't really go far enough. Why can't I affect the recommender system at search time? With this wording, YouTube will have the option buried in settings. No one is going to use it. Steam's recommender system is pretty much the gold standard when it comes to affecting things at search time. One of the things which really annoys me is when I cannot exclude words in a search. I think it isn't done often because it requires computational power on more complex systems. It would be fine if these computations were offloaded to the client. I would just like to have the option. I still like the article, as it gives more power to the user. Why recommender system openness is not a general requirement, I don't know. Even small businesses should be able to make a transparent recommender system without huge maintenance costs.
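Excluding words at search time is cheap if it is done client-side over already-fetched results; a minimal sketch, with a made-up result format:

```python
def exclude_words(results, excluded):
    """Drop results whose title contains any of the excluded words.

    A client-side sketch of the 'exclude words at search time' option;
    `results` is assumed to be a list of dicts with a 'title' key.
    """
    excluded = {w.lower() for w in excluded}
    return [r for r in results
            if not excluded & set(r["title"].lower().split())]

results = [{"title": "DSA committee reaction"}, {"title": "cat video"}]
print(exclude_words(results, ["cat"]))  # only the DSA result remains
```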
The other, Article 24b Additional obligations for platforms primarily used for the dissemination of user-generated pornographic content, adds special requirements about pornographic content. It is horrible, and thank fuck it is in the online platform part, so the damage is limited.
Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure:
(a) that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration;
(b) professional human content moderation, trained to identify image-based sexual abuse, including content having a high probability of being illegal;
(c) the accessibility of a qualified notification procedure in the form that, additionally to the mechanism referred to in Article 14, individuals may notify the platform with the claim that image material depicting them or purporting to be depicting them is being disseminated without their consent and supply the platform with prima facie evidence of their physical identity; content notified through this procedure is to be suspended without undue delay.
The horribleness comes from two things. One, why does an online platform have to be primarily about pornography? Like, do www.pixiv.net or www.picarto.tv count here? They have pornographic content, but the general idea of the site is to be a content platform for artists. Second, neither of the sites is for pornographic content between real humans, but for artistic expression. Does www.pixiv.net have to have trained people to identify image-based abuse when the images themselves are not of real people? Actually, www.pixiv.net's terms say "Works that are mainly real-life photographs (*)" are prohibited (with some exceptions). So there shouldn't be a need for this at all in the first place. Now, this is not as bad as what was on the table, but this needs to be fixed.
Finally, for Section 4 on very large online platforms, Article 30a Deep fakes was added:
Where a very large online platform becomes aware that a piece of content is a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful (deep fakes), the provider shall label the content in a way that informs that the content is inauthentic and that is clearly visible for the recipient of the services.
This is somewhat odd, since it requires the very large online platform to know that a deep fake in fact falsely appears truthful. I could see someone using deep fake tools to make parody or satire. So if an online platform does not require in its terms of service that everything using deep fakes be labelled, the platform could face liability for incorrect labelling. A small thing. YouTube's lawyers will probably make a blanket ban on deep fake tool usage just to be sure.
Overall, improvements were made. Something probably has to be done about the wording of Article 24b, but things are going in the right direction.