
The Second-Hand Saviour Complex: How Data, Surveillance and Big Tech Resurrected Colonialism

By Sakthisree, One Future Collective


Are we really free from colonialism?


You may think that our fight against colonialism ended long ago and that the world and our country now live (largely) independently and freely. But as a people, are we truly "free"? What if we were, in fact, still living under some invisible form of colonialism? Sounds a bit crazy, doesn't it?


We often think of colonialism as belonging exclusively to our history textbooks and of technology as belonging exclusively to Silicon Valley culture, but if you put two and two together, you might catch a clearer glimpse of reality. Indeed, we are a free nation, and you and I are empowered with the fundamental rights enshrined in our Constitution. Yet there are times when we all feel that our actions are not under our own control. Be it through social media advertisements or our addiction to the virtual world, there is always something pulling or pushing us to do things we may not necessarily want to do.


What is data colonialism?


Historical colonialism usurped territories, ravaged natural resources and exploited the work of native people. Today, colonialism has taken a new form, where exploitation is not of resources or land, but of the human experience. This is facilitated by the advancing technology of data collection and data processing, enabling powerful entities to reap the benefits of personal data for unfair profits and behavioural influence.


“Data colonialism’s power grab is both simpler and deeper: the capture and control of human life itself through appropriating the data that can be extracted from it for profit,” explain Couldry and Mejías in their book, The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism.

This form of colonialism, called “data colonialism”, is a new social order which operates on the principle of continuous tracking, processing and surveillance. It allows unrestricted and unprecedented opportunities for corporations and governments to control and profit from the data extracted through the digital world. Simply put, people are making money from your online existence: your Instagram likes, your Facebook posts, your tweets, even your messages are watched. With all this data, your experience online is tailored to suit your behaviour.


It is true that data colonialism may not have the exact features of the historic colonialism we all know of. But when you look at its core function (to exploit the world’s resources on an extravagant scale, and to redefine human behaviour and relations for capitalistic production), the parallel is clear.


Big Data and surveillance


Over the past five years, there has been growing scepticism and caution about the social impact of Big Data projects, especially in the Global North, with the European Union enacting the GDPR to protect the data privacy of its residents. In the Global South, however, the narrative around data technologies remains overwhelmingly positive, with little heed paid to their hidden consequences. For instance, India largely celebrated Aadhaar, the nationwide project to issue unique identification numbers, and praised the adoption during the pandemic of the Aarogya Setu app, which tracks the proximity and number of Covid-19 cases around you based on self-reported information fed into the app by its users. As real as the benefits of these projects are, they come at a hidden cost: our personal data left within vulnerable reach.


For example, in September 2020, major privacy concerns were raised when the National Law School of India University in Bengaluru decided to implement an online, home-based remote proctoring technology (RPT) for its National Law Admission Test. The consequences of RPTs are dangerous: strangers watch young students for extended periods of time, in their homes, while they write an exam. Sensitive information such as students’ names, contact details, biometric records, room scans, and audio and facial data is collected in the process. Such data, if leaked or misused, could lead to cybercrimes such as fraud, profiling and, worse, online violence targeted especially at young girls and other marginalised communities.


It is important to adopt this critical stance because unfair data extraction can discriminate against society’s most vulnerable through the automation of inequality, or perpetuate “surveillance capitalism” through social media and other ‘free’ services that monitor our behaviour in extraordinary and precise detail. This is especially true in the Global South, where the personal data of millions of people rests in the hands of companies based largely in the Global North.


Tech and policy should go together


More often than not, data systems and models are given a "heroic" narrative: that data is good, and that it helps us live easier, more efficient lives. There is a cost to this. As much as some projects help our society, many pose a serious threat if developers, founders, funders, and policymakers are not careful about their governance, development and use. From Silicon Valley tech giants to AI-powered credit and surveillance systems in China, it is crucial that we observe, analyse, critique and challenge power wherever it is imperative. This calls for reform, and for watertight policies to ensure that neither our government nor corporations overstep the boundaries of invasiveness and violate our basic rights to privacy and free choice.


A noteworthy example of the need for tech policy comes from the encryption debate in America. In the wake of the San Bernardino shootings, there was an intense debate surrounding Apple’s iPhone encryption. Encryption is an imperative tool for protecting privacy and ensuring security online, but it also raises challenges for law enforcement investigations. The lesson learnt was that engineers cannot make development and design decisions without an informed understanding of policy tradeoffs, and policymakers cannot make choices without understanding the nuances and consequences of technology. This paved the way for America to cultivate tech-policy experts who can best govern and lead in such situations.


Closer home, the introduction of the Personal Data Protection Bill, 2019 in India has been progressive in its approach to addressing data privacy and the future of data technologies. However, some of its major revisions carry worrisome indications of the state exercising power over healthcare and employment data, amongst other things. Inspiring campaigns such as the “Save Our Privacy” movement highlight the consequences of such provisions and lead extraordinary collective action through strategic litigation, policy recommendations and awareness drives to ensure our country stays true to its democratic principles.


Beyond policy: how to fight data colonialism


How can we ensure that our government and social institutions have the skills and capacity to confront these significant challenges? And how do we inspire the next generation of leadership to guide our society through these coming challenges?


We can fight the consequences of data colonialism by actively learning, and unlearning societal structures of oppression. Education is absolutely vital. From high schools to universities and beyond, we should follow fundamental principles by which we can develop technology through the lens of justice and intersectionality. People have already begun to do this: Catherine D'Ignazio and Lauren Klein outline seven principles of Data Feminism: (1) examine power, (2) challenge power, (3) elevate emotion and embodiment, (4) rethink binaries and hierarchies, (5) embrace pluralism, (6) consider context, and (7) make labour visible. We could use these to build a framework of principles relevant to our own socio-economic and cultural contexts.


Young people in India are raising the alarm about data colonialism. People are slowly, but surely, waking up to the realities of what technology can perpetuate and to the importance of conscious choice. This was evident in the recent WhatsApp privacy policy update, when numerous people understood the importance of privacy and immediately moved to more secure platforms. And this is only the beginning! Independent organisations such as the Internet Freedom Foundation are working to make sure citizens know their privacy rights, and are also going to court to defend these rights and challenge unconstitutional policies that affect them.


Data colonialism strikes at the heart of our fundamental rights to privacy and free choice. In the wrong hands, our data can be used for mass surveillance, to swing elections, influence the economy, and shape everyday human behaviour. As a democracy, we must regulate who can collect data and in what manner, and ensure that there are enough safeguards and checks on its use and dissemination. As a people, we have the chance to participate, along with the rest of the nation, in a conversation about a fledgling fundamental right in India. It is up to us to protect the way our data is collected and used, because it has the potential to affect how all of us exist in the world.

 

The People's Podium is a monthly column, in collaboration with One Future Collective, that explores current policy-legal affairs in India and around the world, from a lens of social justice and intersectionality.


About the author

Sakthisree is a CPL Fellow 2021, One Future Fellow 2020, and an intersectional feminist. She is passionate about looking at data through the lens of justice and intersectionality. She hopes to work in the global policy field of AI/tech and is currently conducting research on the democratisation of data and data feminism in the Global South.


One Future Collective is a feminist, youth-led not-for-profit based in India. Their mission is to nurture radical kindness in people, communities and organisations through their work on gender justice, feminist leadership and mental health, to enable a world built on social justice and led by communities of care.

