July 6, 2022
Platform Power

The biggest youth platform in the world is joining the Platform Power Coalition for a Digital Services Act that empowers young people. The European Youth Forum will bring youth voices to the coalition, affirming that digital rights are youth rights. Young people should be able to enjoy their digital environment without fearing privacy violations, discrimination or manipulation. Here is what you need to know about this alliance.

Platform Power Campaign

The Platform Power campaign is focused on three main issues around the Digital Services Act. The Digital Services Act (or DSA) is a piece of legislation that aims to regulate and consolidate various separate pieces of EU legislation addressing illegal or harmful online content and the provision of digital services across the EU. In April 2022, a political deal was reached on the DSA, and on 4 July this will be sealed by the EU Parliament.

We work as a coalition of 12 organisations (more to follow!) and coordinate actions to get the Digital Services Act done right! The European Youth Forum (YFJ) is one of them. The YFJ works on youth rights to make sure that young people are empowered and encouraged to achieve their fullest potential as global citizens.

As the biggest youth platform in the world, the YFJ represents over 10 million young Europeans; bringing it into the Platform Power coalition means adding youth voices to some of the most important discussions of our time.

Why is the DSA important for young people?

Today’s young people are the first generation whose entire lives are encoded in digital data. As the biggest group of users on social media platforms, young people are most affected by changes to the way the online world is managed.

In fact, 75% of young people reported wanting to know how their data is used when they use their social media accounts to access other websites, and 90% of young people in the EU would find it useful to know their digital rights. However, young people’s rights often fly under the radar. Strict rules that are in place to protect children suddenly fall away for those aged 18 and above, exposing young people to a deliberately confusing landscape of data extraction, unwanted content and cyberbullying.

Far from the common belief of being “digital natives”, young people are not always aware of the harms facing them in the online world. Whilst the majority are at ease using social media apps and entertainment platforms, this does not necessarily translate into an innate knowledge of how to browse safely, how algorithms function or how to protect oneself online.

In fact, 60% of young people surveyed do not believe that social media companies know their ethnicity, sexual orientation, religion or political beliefs. Yet personal information about young people is being inferred by data profiling all the time, resulting in a risk of discrimination.

Young people are not in favour of targeted advertising: invasive, manipulative and triggering ads based on our personal data. Yet we know that Meta’s algorithms target young people with personalised ads that exploit users’ mental vulnerabilities, such as trauma.

While we are all affected by this, not all young people experience the digital space in the same way. LGBTQ+ people, young activists, young women and people of colour are often more vulnerable to these harms. Big Tech companies use algorithms and surveillance ads designed to make us stay longer on their platforms, and these tend to amplify polarisation and misinformation. All of this – surveillance ads, algorithms that shape what we do or do not see on the internet, and deceptive design practices – reduces young people’s ability to organise for the causes we care about, such as climate change, social justice, access to fair remuneration and employment, and democratic engagement.

Big Tech companies should not have a free playground to decide young people’s lives and future.

Young people should have the power to choose what they want to see online, and not be drawn into manipulative design practices that lull us into doing or buying things we did not want. When young people don’t have this power, their life choices are narrowed and their opportunities diminished.

What needs to be done?

Young people should have the power and right to choose from a world of opportunities and discovery on the internet and in their digital lives, not be subject to data extraction, manipulation and illegal content.

The DSA is a good first step. Now the implementation of this law and follow-up recommendations need to stay true to the intentions behind this legislation.

We now need:

  • a real end to surveillance ads for all
  • rights-respecting content governance, in all languages used on the platform
  • limits to algorithmic recommendations pushing young people to act in ways that do not reflect their true wants or needs
  • proper enforcement of laws
  • enough political will from our decision-makers
  • Big Tech taking the law seriously
  • people looking for alternatives to Big Tech platforms.

We are excited to work together and make sure that the implementation of the Digital Services Act empowers young people!

Join us in this movement, and make digital spaces places you can flourish and thrive in.

(Contribution by: Maria Belén Luna Sanz, Campaigns Officer, EDRi and Lauren Mason, Policy and Advocacy Manager, European Youth Forum)

June 9, 2022
Platform Power

Friday night’s political agreement on the Digital Services Act (DSA) is a good first step towards protecting people’s rights on the internet and to some extent limiting the immense power that Big Tech companies have over people and democracies.

EDRi welcomes the conclusion of a political agreement for the DSA on the night of 22nd to 23rd April. The DSA has the potential to serve as a global benchmark for how to regulate today’s hyper-centralised platform economy while also protecting people’s fundamental rights online, including freedom of expression and access to information, the rights to privacy, and non-discrimination.

In particular, we welcome the DSA’s appeals and redress mechanisms, which will allow users to flag potentially illegal online content to hosting intermediaries, who in turn will be required to react through a transparent response process. Crucially, intermediaries will be able to follow that process carefully without the threat of immediate legal liability coming at the expense of the rule of law, and without replacing users’ options for independent judicial redress.

“It’s relieving to see that EU legislators have learned from past mistakes such as the Copyright Directive, and in the DSA stayed away from creating a general monitoring obligation for online platforms,”

Jan Penfrat, Senior Policy Advisor leading EDRi’s work on the DSA.

EDRi also welcomes the first timid steps taken by the EU to limit the harmful business model of surveillance advertising and prohibit some of the most deceptive interface design practices deployed by online platforms against their users. Such a move by the EU was unthinkable just a few years ago. EDRi and other civil society organisations have consistently raised lawmakers’ awareness of surveillance ads as one of the root causes of online harms and polarisation. It is regrettable, however, that the political agreement has watered down this much-needed systemic change to such an extent that it is unclear how much it will actually bring visible positive change for people.

“At EDRi, we have advocated hard for the DSA to enable a real transformation of the online advertising industry—away from cheating and spying on users and towards a safer, privacy-respecting ad ecosystem. While the DSA compromise will help phase out some of the worst practices of ad-driven online platforms, it can only be the starting point for more profound change,”

Jan Penfrat

For example, the DSA’s prohibition of using highly sensitive personal data to target people with surveillance ads will be limited to online platforms only, and will therefore leave the vast majority of ad networks embedded in common websites—as well as the data extraction industry behind them—untouched. Similarly, the prohibition of deceptive interface designs will likely exclude the most pervasive and annoying ones: cookie and tracking banners.

“Although the DSA’s final ad regulation is not perfect, at EDRi we are proud that we were able to help bring a ban of surveillance ads into the mainstream political debate in the EU”

Diego Naranjo, Head of Policy at European Digital Rights.

As a human rights organisation, EDRi has been very concerned about the new Crisis Response Mechanism that was added to the DSA at the 11th hour as a response to the Russian war of aggression against Ukraine. Temporary crises should not lead to permanent infrastructures of state control. While earlier versions of this mechanism would have given the EU Commission unchecked executive power to unilaterally declare an EU-wide state of ‘digital emergency’, it is not least thanks to a strong advocacy effort from EDRi and many other civil society organisations that, according to press reports, negotiators eventually built in a requirement for the Commission to obtain the green light of national independent platform regulators first.

“We celebrate today a first step to reduce the power of Big Tech companies. Even though we would have preferred that both the DSA and DMA were more ambitious, the compromise reflects what was possible given the current political majorities. We will work towards advancing safeguards and rights during the implementation of the new rules and support strong enforcement by regulatory authorities. By doing so, we also aim at supporting real alternatives to the currently dominant surveillance business model.”

Diego Naranjo

EDRi and its member organisations will continue to advocate for legislation, policies and practices that uphold people’s rights and that shape the internet as an open, fair and inclusive digital environment for everyone.

April 19, 2022
Platform Power

The dangers of online tracking have been shown once again: French presidential candidate Eric Zemmour has bought tens of thousands of phone numbers of Jewish voters from a data broker to incite division with unsolicited anti-Muslim text messages. The DSA must put an end to the use of this kind of inferred sensitive data for ads.

On Friday evening, April 8, many members of the Jewish community in France received text messages from the far-right “Reconquest!” party of presidential candidate Eric Zemmour. The texts speak of an “expansion of Islam”, “Islamic terrorism”, “anti-Semitism”, and “daily violence of scum”. The text messages specifically targeted people based on inferred sensitive data, namely their religious views, trying to goad one religious community to hate another. This scandal shows once again how the massive collection of inferred personal data for targeting ads can be used to polarise our societies so far as to sow hatred among religious groups.

Inferred data: Where did the phone numbers of French Jewish voters come from?

Victims of Zemmour’s incendiary messages and the underlying data abuse might not have made their religious views public. For a data broker to know about them, people did not have to explicitly tell Facebook or Instagram which religious group they identified with. Instead, this sensitive personal information has most likely been inferred from other types of personal data, such as ‘likes’ and comments on social media, or geolocation data revealing regular attendance at a synagogue.

In March, over 35 civil society organisations called on Emmanuel Macron to commit to protecting people against these practices. Also in March, over 70 NGOs wrote to over 20 state representatives from 11 Member States involved in the DSA negotiations.

Many recipients of these intrusive messages complained publicly, and rightly so. Information about someone’s religious views is highly sensitive and—alongside political opinions, health data and sexual preferences—enjoys particular protection under the European data protection law GDPR. That is also why the French National Commission for Information Technology and Civil Liberties (CNIL) has opened an investigation into the data abuse scandal.

What can we do about the misuse of sensitive data for ads?

Surveillance ads drive a whole data industry that needs strong regulation, ideally at the EU level in order to avoid the fragmentation of the EU’s digital single market. More concretely:

  1. The Digital Services Act (DSA) should prohibit the use of sensitive and inferred data for the purpose of ads. The DSA’s Article 24 is the perfect place to put an end to the kind of data misuse that has damaged the presidential election in France. But its protections go further: it would also help us curb the targeted spread of disinformation and the systematic manipulation of our public debates by foreign powers.
  2. The DSA should prohibit the use of deceptive interface designs (‘dark patterns’). Data brokers like the one that sold phone numbers of Jewish voters to Eric Zemmour hold huge amounts of other types of personal data that are being sold to whoever puts money on the table. Often, people’s consent for this kind of data trade is collected by pop-up banners that make it incredibly hard to say No. The DSA’s Article 13a provides a unique opportunity to put an end to those cheating interface designs and give people a real choice of whether or not they agree to their data being sold.

The EU should act now to protect its citizens through the DSA.

March 31, 2022
Platform Power

As civil society, we welcomed Minister Cédric O’s commitment on Friday to prohibit targeted advertising to minors, as well as the use of sensitive data for ad targeting, in the DSA. Now the French Council Presidency must follow through and protect citizens, 35 NGOs write in an open letter.

The open letter was sent to France’s President Emmanuel Macron and the French Presidency of the EU Council by 35 NGOs

Dear President Macron,

We are writing to you on behalf of the People vs Big Tech Coalition to express our deep concern regarding France’s failure to follow through on its promise to meaningfully protect EU citizens from the invasive use of their sensitive personal data for targeted advertising in the Digital Services Act.

This is a system that has been weaponised by foreign and nefarious actors to distort public debate and democracy – not least by Russia. It is also a system that routinely tramples on the rights of European citizens.

According to our newly published YouGov poll, an overwhelming majority of French citizens (70%) support a ban on the use of people’s sensitive personal data to target online advertisements. They are counting on you to secure this baseline protection in the DSA.

While we commend you for France’s tenacity in seeing through sweeping reform of the Big Tech platforms in the form of the Digital Markets Act, agreed last week, one of our movement’s core demands is that the Digital Services Act and Digital Markets Act take adequate steps to rein in the most invasive and harmful practices in online advertising.

This is why we were so encouraged to hear Minister O’s commitment on Friday, announcing that the DSA would include the proposal to prohibit targeted advertising to minors as well as the use of sensitive information for ad targeting.

Minister O rounded off his commitment with a reference to “how much trust there was between (the negotiators) to allow us to move forward and to take the most logical approach” on this all-important issue.

To our dismay, that trust now appears to have been broken. Mere days later, the French Council Presidency appears to have diluted the provision on sensitive data in ad targeting by moving it to a recital and severely weakening it so that it no longer meaningfully protects citizens from this exploitative practice.

This means European citizens will continue to be exposed to intrusive advertising on the basis of inferences about them which they may never choose to explicitly share or meaningfully consent to – including sensitive categories such as religious or political views, health conditions, and sexual preferences.

Beyond the well-documented harms to people’s rights, the use of sensitive data for advertising raises serious democracy and national security concerns. By segmenting the paid-for messages that are seen by specific groups of the electorate, dialogue between communities is prevented and disinformation can more easily thrive.

This type of advertising can and has already been weaponised by nefarious actors to distort public debate and influence democratic processes in Europe.

Russian interference in the US 2016 election via targeted ads was a clear example and, at a time when the world order is increasingly precarious and actors such as Russia seek to undermine the EU, the risks are now even higher.

The Digital Services Act is a vital opportunity to move towards a safer online advertising system which European consumers and businesses are able to trust and which safeguards citizens’ fundamental rights.

European citizens are counting on France to follow through on its promise to ensure that a final deal on the DSA prohibits the use of sensitive data, including the drawing of inferences about a person’s sensitive characteristics, for the purpose of displaying advertisements.

This is a critical baseline protection, already limited in scope to online platforms only, proportionate to the harms and necessary to achieve the aims.

If France wants a swift deal on the Digital Services Act, it cannot afford to betray European citizens at the eleventh hour. We hope that instead you will lead the way in ensuring a Digital Services Act that offers vital and overdue protections for European citizens.

Yours sincerely,

Access Now

All Out

Alliance4Europe

Avaaz

Bits of Freedom

Bulgarian Helsinki Committee

Civil Liberties Union for Europe (Liberties)

Citizen D / Državljan D

Cultural Broadcasting Archive (CBA)

Defend Democracy

D3 – Defesa dos Direitos Digitais

Democracy and Human Rights Education in Europe – DARE network

Digitas Institute

European Digital Rights (EDRi)

Fair Vote

Federation of German Consumer Organisations (vzbv)

Fix the Status Quo

Global Action Plan UK

Global Forum for Media Development

Global Witness

HateAid

Institute for Strategic Dialogue

Irish Council for Civil Liberties 

#JeSuisLa

LobbyControl

Panoptykon Foundation

Peter Tatchell Foundation

Ranking Digital Rights

Sum of Us

The Coalition For Women In Journalism

The Daphne Caruana Galizia Foundation

The Signals Network

Vrijschrift.org

Waag

Wikimedia Deutschland

Wikimedia France

March 28, 2022
Platform Power

Last night, 24 March, the European Union took a great step forward towards better protecting our rights online when it approved the political trilogue compromise for the Digital Markets Act (DMA). This decision promises to challenge the strongly centralised environment of Big Tech platforms exerting too much power over our rights and over the flow of information in society. Tech companies like Facebook, Google, Amazon, and Apple will have to start following strict rules that ensure free and fair competition in the digital markets.

Following intense discussions and months of advocacy and campaigning from civil society, negotiators from the European Parliament, the French Presidency of the Council of the EU and the European Commission agreed on the final version of the DMA. In effect, this legislation will contribute to reining in Big Tech’s grip on people’s experience online as well as on some of the major digital markets.

From a human rights perspective, this is a major win to ensure the protection of people’s rights and enable open, fair, competitive digital spaces.

“The DMA will put an end to some of the most harmful practices of Big Tech and narrow the power imbalance between people and online platforms. If correctly implemented, the new agreement will empower individuals to choose more freely the type of online experience and society we want to build in the digital era.”

– Diego Naranjo, Head of Policy, EDRi

Civil society has successfully pushed for a DMA that builds on existing data protection rules under the General Data Protection Regulation (GDPR). As a result, the DMA will protect people’s data from being used across several gatekeeper services like Gmail, YouTube, Facebook and WhatsApp, without a person’s explicit consent.

According to French State Secretary and DMA chief negotiator for the Council, Cédric O, there was an agreement between negotiating bodies to include a ban on the use of sensitive personal data for surveillance advertising in the Digital Services Act (DSA). While this DMA deal will limit some of the worst data use practices of the surveillance ads industry, EDRi will continue to advocate for a complete phase-out of surveillance-based online advertising in Europe.

EDRi is pleased about the inclusion of interoperability requirements for instant messaging services of gatekeepers. Under the new rules, gatekeepers that offer messaging services will be required to allow their users to connect and communicate with people on similar services while preserving the privacy protection afforded by end-to-end encryption. This interoperability requirement will empower people to move away from the dominant gatekeeper’s service without losing their secure connections with friends and family who decide to stay there. It will also enable the creation of a whole new ecosystem of interoperable chat apps that work in the interest of their users rather than for the benefit of advertisers and data brokers.

It is disappointing, however, to see that the final DMA compromise allows gatekeepers an unnecessarily long transition period to introduce the interoperability of voice calls and group chats. Both features are considered by users to be irreplaceable standards in chat apps and their absence is likely to seriously limit the benefit of the whole obligation.

The DMA will also empower the European Commission to add additional obligations for gatekeepers in the future, notably an interoperability obligation on social networks like Facebook. We hope the European Commission will consider using this power soon.

Now, it is vital that the Council and the European Parliament approve the negotiated political compromise and ensure that people’s rights are put before Big Tech’s corporate interests and profit.

“The DMA is a major step towards limiting the tremendous market power that today’s gatekeeper tech firms have. We must now make sure that the new obligations not to re-use personal data and the prohibition of using sensitive data for surveillance advertising are respected and properly enforced by the European Commission. Only then will the change be felt by people who depend on digital services every day.”

– Jan Penfrat, Senior Policy Advisor, EDRi
Platform Power

Ahead of the upcoming Digital Services Act (DSA) trilogue meeting on 15 March, EDRi, Liberties, Amnesty International and 69 other civil society organisations sent a joint open letter to 20 ministers and state secretaries in 9 EU Member States. On Tuesday, 1 March 2022, several organisations in the Netherlands, Denmark, Germany, France, Spain, Italy, Luxembourg, Austria and Croatia delivered the letter to the decision-makers responsible for their country’s position in the EU negotiations.

UPDATE: After publication, the letter was also delivered to the responsible ministries in Ireland and Poland, bringing the total number of addressed Member States to 11.


Read the original open letter (17/03/2022) 

The Digital Services Act is an enormous opportunity to address the toxic consequences of online platforms’ business models based on exploitative tracking and targeting of people. Such consequences include the amplification of harmful content, distorted public debate and discrimination against people.

The letter calls for a phase-out of unwanted online tracking ads and dark patterns. Additionally, the call emphasises the need for the DSA to safeguard the privacy protections we all have under the EU Charter of Fundamental Rights.

“The DSA must put an end to manipulative tracking ads practices that have infested every corner of the internet. Pervasive online tracking is fueling harm – from discriminatory job ads to Russian lies about their war in Ukraine. We urge Ministers in the EU to support the European Parliament’s proposals against tracking ads and dark patterns.”

Jan Penfrat, Senior Policy Advisor at EDRi

Dark patterns are manipulative interfaces designed to trick users into unintentionally consenting to share their personal data. 

Tracking-based online advertising relies on mass harvesting of personal data and algorithmic inferences which threaten our human rights, above all the right to privacy. However, this has a series of knock-on effects on other rights including freedom of opinion and expression, freedom of thought, and the right to equality and non-discrimination. 

“The targeted advertising model is based on personal data harvesting. It manipulates public debate, amplifies harmful content, and sows division in society. We can see long-term consequences of keeping people in an information bubble, decreasing the level of public discussion, and using targeting to spread disinformation, which impacts free and fair elections.”

— Eva Simon, Senior advocacy officer at Civil Liberties Union For Europe

The majority of people actually do not want personalised ads and opt against tracking when given a real choice. Even small and medium-sized businesses would like to see large online platforms face stricter regulation on how they use personal data for ad targeting.

 “By supporting key amendments to the Digital Services Act, EU member states will take important steps towards changing the current system of invasive data harvesting and microtargeting.”

Alia Al Ghussain, Campaigns and Communications Officer at Amnesty Tech

The DSA has the potential to change the broken system based on data harvesting and protect the fundamental rights of internet users. In order to do so, the DSA must phase out the pervasive online tracking business model and prohibit dark patterns that trick users into sharing personal data they would not otherwise want to.

Please note that this letter was originally published on 17 March 2022 with 35 signatory organisations, and remains open for additional signatures. To add your organisation, please fill in the sign-up form.


January 31, 2022
Platform Power

Thousands of people took action in the days before the Digital Services Act (DSA) vote in the EU Parliament, asking Members of the EU Parliament (MEPs) to end surveillance advertising.

As part of the Platform Power campaign, we coordinated with many civil society organisations and raised our voices for stronger laws against the business model of Big Tech online platforms. Together, we successfully pressured law-makers to put people at the centre of the debate.

On 20 January 2022, Members of the EU Parliament (MEPs) decided that Big Tech platforms should no longer be allowed to use surveillance ads on children, and significantly limited surveillance ads for adults. Moreover, the EU Parliament voted that Big Tech platforms should be prohibited from using ‘dark patterns’, so-called manipulative interfaces.

Banning surveillance ads

See below how our representatives in the EU Parliament voted on the restrictions to surveillance ads.

Although MEPs did not vote for a full ban on surveillance ads, they adopted a strong amendment that severely restricts the use of people’s most sensitive personal data to target them with paid messages. In detail, the amendment prohibits targeting or amplification techniques that process, reveal or infer the personal data of children or the sensitive data of adults for the purpose of displaying advertisements. Such sensitive data includes religious beliefs, sexual orientation and racial or ethnic origin.

Prohibition on ‘dark patterns’

MEPs also agreed to prohibit the use of ‘dark patterns’, so-called manipulative interfaces that are designed to trick users into unintentionally consenting to sharing their personal data. Dark patterns are systematically used by Big Tech platforms like Facebook and YouTube, but also by countless apps and websites, to push users into consenting to surveillance-based advertising.

See below how our representatives in the EU Parliament voted on ending ‘dark patterns’.

People power makes change happen

The actions digital rights activists took ahead of the vote were crucial to this success. We cherish the energy you put into making the digitalised society one based on fairness, equal opportunities, choice and justice.

The January DSA vote in the EU Parliament secured important protections for our rights – also with regard to people’s freedom of expression online and their right to secure communications.

Moving ahead, the EU negotiations on how to regulate Big Tech go into trilogues – a process notorious for its opacity and lack of democratic scrutiny. Do you want to know more about how the EU works and why the trilogues are special?


January 17, 2022
Platform Power

Ask your EU representatives to END SURVEILLANCE ADS. Join the tweetstorm before 19 January.

On 20 January, your representatives will vote on whether surveillance advertising should be allowed. It’s time they hear from you.

Almost 3 years ago, you voted for politicians to represent your rights in the EU Parliament. At the same time, Members of the EU Parliament (MEPs) promised to defend those who voted for them.

In a few days, MEPs will vote on whether Big Tech platforms should be allowed to continue targeting all of us with surveillance ads. Will they side with PEOPLE or BIG TECH?

Surveillance ads sit at the core of Big Tech’s business model. By now, everyone knows how damaging Big Tech’s business model is – for our democratic systems, our rights as consumers and as human beings. By now, everyone knows surveillance ads polarise, discriminate, and affect the mental well-being of children, youth and adults alike.

In Brussels, Big Tech lobbying is intense and deceitful.

Big Tech’s lobbyists are talking to EU legislators, trying to reshape the narrative. They claim people want surveillance ads and that small businesses are happy with Big Tech’s business model.

Big Tech lies. You must tell the truth.

The truth is, 83% of people recently surveyed in Germany and France refuse to be targeted based on their personal data. Moreover, another survey shows that 75% of leaders of small and medium-sized enterprises (SMEs) in Germany and France think surveillance ads intrude on privacy, and 69% of SMEs interviewed feel they have no option but to use surveillance ads due to Big Tech’s industry dominance.

This is the reality Big Tech lobbyists are trying to hide. We need you to raise your voice against Big Tech’s power. Ahead of Thursday’s vote, tweet at your MEPs.

Let your representatives know we need to put an end to surveillance ads:

Platform Power
January 13, 2022
0
Platform Power

No, thank you.

That’s what people think about surveillance ads.

Global Witness commissioned a survey of 2,034 regular social media users (active in the past 30 days) in France and Germany to find out whether the biggest argument Big Tech uses to keep prying into our personal data holds true. The results show that 83% don’t want the personal data they’ve shared with a social media company to be used for targeted ads.

Big Tech companies argue that surveillance ads are needed for us to get the full experience: get the info we care about, find the shoes we want, and discover new worlds. But we know that is not true. Most often, surveillance ads end up discriminating against us, limiting our choice and opportunities, and harming our mental health.

However, one question remained: ‘Do people want personalised ads online?’

The research found that people don’t want the personal data they shared with the social media company to be used for targeted ads (83%), nor their behavioural data tracked outside the platform (78%). In fact, regardless of the purpose, the majority of people don’t want to receive personalised ads online at all (57%), whether commercial or political. Furthermore, in the particular case of political ads, there is also a solid understanding of how surveillance ads can hinder democracy (44%).

There is much more interesting data in the full report, but if you take one message away from this blog, let it be this: surveillance ads only benefit Big Tech. People don’t want them.

Big Tech
Platform Power
December 21, 2021
0
Platform Power

Today, 14 December, the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) approved its much-anticipated report on the Digital Services Act (DSA).

The DSA affects how intermediaries like Google and Amazon regulate and influence user activity on their platforms, including people’s ability to exercise their rights and freedoms online. The DSA also aims at limiting the negative impact of the most powerful online platforms on people and puts limits on how EU Member States can interfere with people’s free expression online.


European Digital Rights (EDRi) welcomes IMCO’s clear commitment to the cornerstones of EU internet regulation, namely the conditional liability regime for online intermediaries and the prohibition of general monitoring obligations. Conditional liability removes platforms’ incentive to over-remove legitimate online speech in order to avoid liability risk, and therefore protects people’s freedom of expression.

The prohibition of general monitoring obligations prevents EU Member States from requiring online platforms to scan and unilaterally judge all the information people share online. Those foundations are vital to protect our freedom of expression and access to information in a digital society.

“The IMCO committee has done a good job in fending off some of the worst ideas that had floated around in the negotiations, like short removal timelines for online content or the exemption of online content posted by TV and radio stations from any scrutiny – even if those outlets can spread disinformation or are mere government propaganda tools.”

– Jan Penfrat, Senior Policy Advisor at EDRi.

EDRi also welcomes IMCO’s decision to add strong protections to the DSA against the manipulation of people’s consent online via ‘dark patterns’, and to take a clear position in favour of the overwhelming need of users and businesses for secure end-to-end encryption technology. Dark patterns are user interface designs deliberately built to push users into making a choice they would otherwise never have made, like the way most cookie banners work today.

EDRi is, however, disappointed by the lack of ambition in regulating surveillance advertising. The surveillance-based business model of dominant technology companies rests on pervasive profiling and on extracting as much personal information as possible from individuals and groups, both online and offline.

“An overwhelming majority of experts agree that the best way of protecting people against pervasive surveillance by companies is to ban those right-infringing, manipulative and discriminatory practices in favour of an advertising ecosystem that respects people and the law.

Many small and medium-sized tech enterprises in Europe, the EDPS, the EDPB, as well as a large coalition of consumer groups, social justice and human rights advocates have pushed for the DSA to better protect Europeans and foster innovation in the online advertising market. Yet, the full might of Big Tech’s lobbying apparatus in Brussels eventually prevented any meaningful reform.”

– Jan Penfrat, Senior Policy Advisor at EDRi.

Also disappointing is the lack of interoperability for recommender systems, which could have enabled concrete alternatives to the current addictive, engagement-based systems that fuel polarisation and disinformation. IMCO missed a real opportunity here to improve people’s online experience by giving them control over the kind of content they wish to see and interact with.

Furthermore, we regret the inclusion of mandatory identification on porn platforms, which falls short of the holistic solutions needed to address online gender-based violence and the publication of non-consensual sexual images. This measure will be detrimental to the rights of sex workers and online content creators. By forcing sex workers to expose their real identities and contact details, it puts these often stigmatised and criminalised communities at risk of hacking and abuse.

EDRi and partners invite anyone who would like to continue to challenge the grip that platforms have over our lives, communities and democracies to join our campaign to take back our power.

Big Tech
Platform Power