EU lawmakers should set a feasible framework for user notification and redress.
The Digital Services Act (DSA) will significantly change how users interact with online platforms in the EU. As platforms have developed, the range of content available to users has increased substantially. This has created a need for robust content moderation and for ensuring that users can fully understand the content they are interacting with.
From the beginning, the DSA debate has focused mostly on “Big Tech”. However, Europe’s digital ecosystem is made up of a vast array of digital entrepreneurs, including ambitious software developers, looking to respond to user needs and innovate. They need a practically feasible legal framework that allows them to moderate content efficiently while preserving the rights of their EU customers.
As EU lawmakers reach the crunch point of their negotiations on the DSA, a number of issues remain outstanding. An important one is how exactly to ensure fair redress for users when it comes to content moderation decisions.
This is a complex task given the vast amount of content that appears online in Europe every day. Users should be able to understand why decisions about their content are taken. However, there is a fine line between providing users with sufficient information and overloading them. Digital entrepreneurs need a balanced, workable system of user redress that functions correctly while protecting free speech.
Many of the proposed amendments go beyond these goals. Of particular concern are proposals that extend user notifications beyond decisions on the mere availability of content. As drafted, a user would receive multiple notifications for any action that affects the ‘visibility’, ‘ranking’, or ‘demotion’ of content. There are many reasons a platform may change the visibility or ranking of content, including user preferences. If the definitions around user notification are not tightened, they could unleash a vast wave of notifications that overwhelms users and platforms alike. In such a situation, online platforms could struggle to adequately moderate illegal content, for example.
Moreover, there are proposals to extend the redress mechanism beyond users who upload content, and even to cases where platforms take no action at all. This expansion of scope is completely disproportionate and ignores the fact that a large volume of the user notifications platforms receive each day are inaccurate. Such excessive requirements could compromise content moderation systems, leaving them unable to cope with the increase in incoming notifications at the same time as they are issuing far more notices themselves.
The DSA proposal also sets out provisions on out-of-court and internal complaint redress, allowing users to challenge a decision made regarding their content. To ensure a feasible mechanism, the co-legislators should establish essential safeguards to prevent online platforms from being subject to multiple proceedings over the same dispute. Otherwise, nothing would stop bad actors from clogging the redress system with numerous spurious actions that divert platforms’ resources away from legitimate claims and content moderation.
This broad-brush approach to user redress fails to account for the realities of the digital environment and is a rough attempt to transpose standards from the offline world to the online one. It also overlooks the fact that the DSA will not apply in isolation: it is part of a broader set of rules, such as the GDPR and copyright legislation, forming an intricate regulatory framework. An impractical mechanism for user redress will only add legal complications and costs for European digital entrepreneurs. Now is the right time for EU lawmakers to lay down workable solutions.