December 21, 2021
Platform Power

Today, 15 December, the European Parliament approved its position on the Digital Markets Act (DMA). While it unfortunately scales down the DMA's scope by limiting who will be considered a gatekeeper, the Parliament's position adds a number of notable improvements from a digital rights perspective that help challenge digital gatekeepers' overwhelming power.


In particular, European Digital Rights (EDRi) welcomes the addition of interoperability requirements for instant messaging and social media services provided by gatekeepers. Such core service interoperability is a pro-competitive tool that can play a major role in ensuring market contestability, innovation, competition and the empowerment of all of us.

“Enabling providers of smaller social media and messenger apps to interoperate with gatekeeper services is an important milestone in bringing back real choice for people.”

– Jan Penfrat, Senior Policy Advisor, EDRi.

While we welcome the Parliament position's clear prohibition on gatekeepers combining or cross-using personal data without consent, we call on the Rapporteur to close potential legal loopholes in Article 5.1(a) that could allow gatekeepers to circumvent their legal obligation under the GDPR to request consent for each processing purpose.

We support the new prohibition of dark patterns and encourage negotiators to ensure that clearer language is used to describe what common dark patterns look like, as has been the case in the European Parliament IMCO Committee's position on the Digital Services Act. Dark patterns are user interface designs deliberately built to push users into making a choice they would otherwise never have made, like the way most cookie banners work today.

“We are very glad to see that the Parliament is taking a first step towards a fair and interoperable market. It will now be important to put the objective of ending corporate concentration into a clear and enforceable language.”

– Christoph Schmon, International Policy Director, EDRi member, Electronic Frontier Foundation (EFF).

Just like the original European Commission proposal, the Parliament's DMA position contains a number of additional important "dos and don'ts" that will increase user freedom and protection, and hopefully lead to a more diverse digital environment for all. For example, the freedom to install and remove the apps people actually prefer, as well as to choose the best app stores rather than the one imposed by the gatekeeper.

EDRi also very much welcomes the Parliament’s clear commitment to enabling end-users, consumer groups and other civil society organisations to support the DMA’s enforcement regime with a right to lodge complaints, provide input to the European Commission’s own investigations, and access documents relating to ongoing cases.

“Participation of end users and civil society organisations in the monitoring and enforcement of the DMA is fundamental to its successful implementation and to redress the imbalance of power between individuals and big platforms.”

– Tomaso Falchetta, Global Policy Lead, EDRi member, Privacy International.

The European Parliament and the Council are planning to commence trilogue negotiations on the Digital Markets Act in early 2022, expecting to finalise the Regulation by the spring.

September 24, 2021
Platform Power

“Why are some voices not heard online, while others are amplified globally?” – How many times have you asked yourself this question?

Sensationalist and misleading content goes viral. Why? Because it generates profit for Big Tech. Accurate information is buried. Why? Because it does not generate profit for Big Tech. We have all seen it. How can we change our society if we are consistently limited to hearing and reading only the content that makes Big Tech rich?

Prioritising some voices and silencing others online is a very lucrative model that Big Tech companies and governments take advantage of.

Companies will always prioritise their profit, which means only voices that generate profit are deemed worthy of exposure. This model is very convenient for governments that want to silence political dissidents or hide events. By exploiting Big Tech's hunger for profits, governments can have uncomfortable content arbitrarily taken down.

We must prevent arbitrary or abusive decisions regarding people's online content and empower everyone to raise their voices for social change.

To do so, Big Tech companies should not be the judge and jury of what can or can't be online: that would only reinforce their power and prevent us from having real solutions and conversations. We must always leave it to courts to decide what is legal and what is not.

Governments and companies should invest in real solutions, centred on the needs of victims of online harms.

Let’s repair social issues with social means, not tech “fixes”.

We urge all those involved in scrutinising and passing legislation in Europe to stand on the side of the people they represent and to defend our fundamental rights, critically the right to free speech for all, against abuses by Big Tech and overreaching governments alike.

Platform Power

83% of us don't want the personal data we've shared with social media companies to be used for targeted ads.

Who gets to choose what our lives look like and what opportunities we're offered?

Big Tech companies think they should be able to decide our lives and futures and freely influence what we see and do.

However, their model is based on exploiting our private conversations, our intimate moments with family and friends, and our interactions with the content we're most prone to engage with. People are reduced to mere data points that companies treat as us, points that ignore the context of our lives yet at the same time influence them. What we see online is limited to what Big Tech's algorithms have decided.

Turn off the manipulation machine! Big Tech’s toxic recommender systems, surveillance ads and algorithms must stop!

The effects of these tools are staggering. Surveillance ads and recommendation algorithms can change vote results, deprive people of life opportunities and harm our mental health. And we have seen all of this happen.

Big Tech companies argue that surveillance ads and toxic recommendation algorithms give us a better experience. Yet surveys show a huge majority of us oppose being targeted by surveillance ads. Despite this, we are harmed, discriminated against and denied a say in what our experience should look like. We must be able to choose our platform experience, independent of Big Tech's commercial choices.

It is time for these platforms to de-risk their design, detox their algorithms and give users real control over them.

September 25, 2021
Platform Power

Who gets to decide how to fix Big Tech in a democratic society? So far we have been asked to trust Big Tech companies to make changes based on their own goodwill.

We have been expected to sit and wait for structural changes that never came.

We have refused to wait. Instead, for years, we have been contesting Big Tech's power and pointing at its business model.

We asked questions about Big Tech's lobbying, called out their harmful actions and demanded changes to the way we regulate them. Big Tech's response has always been weak, based on superficial tweaks rather than addressing the real issues.

It is clear that the changes made are designed within the same set of rules and the same model that created the issue in the first place. Big Tech's profit will always come first, second and third. People are treated as a mere side effect of Big Tech's profit.

Big Tech’s self-regulation has done nothing but prove that we can’t let the wolf guard the sheep. We need independent bodies that can evaluate the harms to society, and then apply the laws to mitigate those harms. This is the way we can guarantee transparency, accountability and that people’s interests are at the core of the solutions.

Let’s hold Big Tech to account!
