How big tech is changing who’s in charge of our rights and freedoms


Since the end of the 20th century, daily life for most of us has increasingly moved into the digital sphere. This has led to the rise of the so-called “onlife” dimension: the intimate intertwining of our online and offline lives. One day we may even see the creation of a metaverse, a persistent online environment providing new digital spaces where people can interact, work and play as avatars.

The result is that people’s rights and freedoms are increasingly shaped by the rules set by big technology firms. Twitter’s decision to silence the former US president Donald Trump in the aftermath of the violence at the US Capitol, Facebook’s banning of Australian publishers and users from sharing or viewing news content, and YouTube’s decision to block anti-vaccine content that spread misinformation are just some examples of how tech firms have expanded their role, not only as global gatekeepers of information but also as private powers.

These examples raise constitutional questions about who has legitimacy, who should have power, and how democracy can best function in the digital age. This points to the rise of digital constitutionalism, a new phase where individual rights and public powers are “relocated” among different groups – such as technology companies – on a global scale.

A new power play

Digital constitutionalism does not mean revolutionising the roots of modern constitutionalism, the principles of which include responsible and accountable government, individual rights and the rule of law. Rather, it is about reframing the role of constitutional law in the digital age.


Modern constitutionalism has always pursued two missions: protecting fundamental rights and limiting powers through checks and balances.

In the digital age, one of the primary concerns is the exercise of public powers that threaten rights and freedoms, such as internet blackouts or surveillance. This was underlined by the Snowden affair, in which a former CIA employee and NSA contractor leaked documents revealing the extent of surveillance by the US National Security Agency (NSA), prompting debate about national security and individual privacy.

But private companies now dominate the internet and enforce terms of service or community guidelines which apply to billions of users across the globe. These rules provide alternative standards which compete with the constitutional protection of fundamental rights and democratic values.

The challenge for constitutional democracies no longer comes only from state authorities. Rather, the biggest concerns come from entities that are formally private but control functions traditionally governed by public authorities – without any safeguards. The capacity of tech firms to set and enforce rights and freedoms on a global scale is an expression of their growing power over the public.

For example, when Facebook or Google moderate online content, they are making decisions on freedom of expression and other individual rights, or on the public interest, based on private standards that do not necessarily reflect constitutional safeguards. And these decisions are enforced directly by the company, not by a court.

This situation has led to calls for transparency and accountability. The Cambridge Analytica scandal, which highlighted the extensive collection of personal data for political advertising, and the recent revelations that Facebook’s own research showed the potentially harmful effects of social media on young people’s mental health, have intensified the debate about the responsibilities of these big tech companies.


Addressing big tech powers

Constitutional democracies are still figuring out how to deal with tech firms’ powers. And though they face the same global challenge, countries don’t always react in the same way. Constitutional democracies generally protect rights and freedoms as part of everyday democratic life, but this protection is not equal across the world.

In Europe, the Digital Services Act and the General Data Protection Regulation arose from the desire to make tech firms more accountable when it comes to content moderation and data protection.

Mark Zuckerberg’s Facebook recently launched its Oversight Board to satisfy concerns about transparency and accountability.
Frederick Legrand/Shutterstock

But the US still sees self-regulation as the best approach to protect freedom of expression in the digital age. Even the US Supreme Court has underlined that the internet – and particularly social media – plays a critical role as a democratic forum.

As a result, online platforms have lost no time in consolidating their policies. The introduction of social media councils such as Facebook’s Oversight Board has been welcomed as a critical step for transparency and accountability. But this could also be seen as another step towards the consolidation of power, with companies adopting the veneer of a more institutional system, such as a “supreme court”.

Digital constitutionalism offers a variety of perspectives for analysing the protection of rights and the exercise of power by big tech companies. It should also prompt us to debate how individual rights and freedoms are subject not only to the powers of the state, but increasingly to those of big tech companies too.


Giovanni De Gregorio does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.