Looking Beyond Eyes: What We Lose When Technology Watches Us
Rahul Ramya
16.09.2027
When you open Google Maps to find the quickest way to the market, you might feel grateful for the convenience. Yet in that moment, your phone is not just helping you—it is also quietly recording where you are, where you came from, and how long you stay. What feels like a service is actually surveillance. In 2018, Google received fewer than 1,000 “geofence warrants” from U.S. police asking for data on everyone in a certain location. By 2020, the number had exploded to over 11,000 (Wired, 2020). Innocent bystanders walking near a protest or crime scene suddenly found their movements stored in criminal investigations. The shift is stark: the tool that helps us navigate the city is also what governments use to navigate our freedoms.
This isn’t just a story about the United States. In India, millions of people live with Aadhaar numbers linked to SIM cards, bank accounts, and even ration systems. For gig workers on delivery apps, GPS never sleeps. Rani, a delivery worker in Delhi, recalls how the app docked her rating because she stopped briefly for a cup of chai. The system flagged her as “idle,” reducing her chance of future work. She said, “I wasn’t lazy, I was just human.” Yet in this new world, being human is almost a liability. A survey by Fairwork India in 2022 found that 83% of Indian gig workers reported working more than 10 hours a day under constant algorithmic surveillance (Fairwork India Report, 2022).
Companies like Google frame these tools as helpful. In 2015, the company introduced Your Timeline, which allows people to see where they have been on a neat visual map. Many found it amusing or useful. But behind the glossy interface, the real purpose was clear: users were nudged to add photos, comments, and corrections, effectively training Google’s surveillance engine for free. In Brazil, people later discovered how location data tied to WhatsApp forwards was used to flood neighborhoods with political disinformation ahead of Jair Bolsonaro’s election victory in 2018. As The Guardian reported, business groups secretly financed mass WhatsApp campaigns that targeted voters by geography and demographics (The Guardian, 2018). One resident of São Paulo admitted: “It felt like they knew when I came home, what street I walked, and they used that to scare me into supporting them.”
The same pattern repeats globally. In China, the Social Credit System is perhaps the most visible symbol of surveillance politics. By 2019, it covered over a billion people. Its reach is not symbolic—it is literal. That year alone, 2.5 million citizens were blocked from flights and 90,000 from high-speed rail because of low credit scores, often linked to unpaid debts or “untrustworthy behavior” (BBC, 2019). Courts publicly named 300,000 people as “untrustworthy.” The lesson is clear: once surveillance is tied to identity, freedom itself becomes conditional.
Even where democratic safeguards exist, surveillance still gnaws at freedoms. In Hong Kong, during the 2019 protests, young demonstrators described the fear of carrying smartphones, knowing police could trace their locations. One student confessed she stayed home, not because she disagreed with the cause, but because “I did not want my face to be linked to my phone forever” (Reuters, 2019). Surveillance didn’t have to jail her—it silenced her before she even spoke.
In India, the chilling effect has been just as visible. During the farmers’ movement (2020–21), the government used Twitter and telecom companies to block protest hashtags and suspend accounts critical of farm laws (BBC, 2021). The anti-CAA protests (2019–20) saw police using CCTV and facial recognition software in Delhi to identify demonstrators (Indian Express, 2020). Protesters reported that fear of being recorded kept many away from marches. This shows how surveillance shrinks not just private space but also public space—the ground on which democracy stands.
Meanwhile, the money flowing from surveillance is enormous. The global market for location-based advertising was worth $111 billion in 2023 and is projected to nearly triple to $296 billion by 2030 (Statista, 2024). In India alone, it is expected to leap from $2.8 billion in 2023 to $10.9 billion by 2030 (IMARC Group, 2024). For tech companies, our daily movements are not private—they are profit streams. The more data extracted, the more precisely ads, nudges, and even political messages can be targeted.
This brings us to the corporate–government handshake. In the United States, under Donald Trump, law enforcement agencies leaned heavily on Google’s location data and Palantir’s predictive policing tools. In India, Narendra Modi’s government welcomed huge investments by Facebook and Google in Jio Platforms, cementing a nexus between political power and corporate data giants (Reuters, 2020). In China, the Communist Party directly integrates Alibaba and Tencent data into governance systems. Across these cases, the pattern is the same: surveillance capitalism and political authority feed each other, building an infrastructure of control that is hard to resist.
What does this mean for democracy? It means free speech is reshaped when political campaigns use Google Ads to geofence voters near polling stations. It means the right to assemble is eroded when geofence warrants track everyone in a protest zone. It means public debate is tilted when misinformation finds its way more quickly to the people most likely to believe it, guided by location and behavioral data. Citizens are not just persuaded—they are engineered.
The human cost is not only about privacy but about dignity. Rani’s pause for tea, the São Paulo resident’s manipulated fear, the Hong Kong student’s silent absence, the Indian farmer silenced online—all show how surveillance convenience turns into surveillance deprivation. We lose not only our right to choose freely but also the basic respect of being treated as more than a data point.
Technology cannot be abolished, and indeed it brings genuine benefits. But if left unchecked, it transforms into a system where freedom itself feels like a service we must subscribe to—if we can afford it. The task before us is not to reject technology but to reclaim it: through strong laws like the EU’s GDPR and India’s new Digital Personal Data Protection Act, through political debate that refuses digital resignation, and through daily acts of civic resistance—from unions of gig workers demanding transparency to citizens supporting local, non-surveilled alternatives.
As philosophers from Aristotle to Amartya Sen remind us, freedom is not only about survival but about dignity, reflection, and agency. When our choices are silently nudged by systems we cannot see, our democracy weakens long before our vote is taken away. Looking beyond eyes, we must ask: do we want tools that serve us, or systems that watch us until we forget what it means to be free?
A Roadmap to Reclaiming Freedom in the Age of Surveillance
Technology is here to stay. We cannot abolish it, nor should we—because it also brings us opportunities for knowledge, connection, and growth. What we need is not rejection but reclamation: a way to use technology without letting it quietly strip away our freedom. Below is a step-by-step roadmap that works at four levels—personal, civic, political, and legal—woven with real stories.
1. Personal: Building Everyday Resistance
Digital literacy as self-defense: In 2021, a group of college students in Hyderabad launched workshops teaching auto drivers how to use Signal and DuckDuckGo instead of WhatsApp and Google Maps, after finding their location data was being sold to advertisers. This shows that even simple shifts in apps can reduce exposure.
Owning your data trail: Ola and Uber drivers in Delhi began using a browser extension called “Uber Cheats” to track how fares were being manipulated. One driver, Shyam Singh, told The Wire (2022): “I realized the app was taking a bigger cut at night. Once we exposed it, more drivers joined our union.”
Reclaiming attention: Globally, YouTube has said that more than 70% of viewing time comes from algorithmic recommendations. Choosing your own playlist instead of autoplay is a way of breaking free from nudging. In India, parents increasingly turn off YouTube autoplay to protect children from endless “Baby Shark” loops—proof that resisting algorithms starts at home.
2. Civic: Collective Action in Communities
Digital unions and cooperatives: In Bengaluru, the Indian Federation of App-Based Transport Workers (IFAT) has organized strikes demanding algorithmic transparency from Uber and Swiggy. In 2022, gig workers protested in Delhi after Swiggy’s algorithm cut delivery times, making them drive recklessly to earn the same pay. Their collective voice forced Swiggy to revise targets.
Community alternatives: Kerala’s Kudumbashree women’s collective uses a local e-commerce app to sell farm produce, bypassing Amazon and Flipkart. One farmer told Frontline (2023): “For the first time, we control prices instead of being trapped by middlemen.” Such experiments show that community-owned tech can loosen corporate control.
3. Political: Demanding Accountability from Power
Speaking truth to surveillance power: During the 2020–21 farmers’ protest, Delhi police used drones and facial-recognition cameras to monitor gatherings. Yet, farmers held their ground. One protester, Gurpreet Singh, said to The Indian Express: “If the government wants to see us, let them. We will not hide, because our fight is just.” Their defiance forced Parliament to repeal the controversial farm laws—an example of resistance winning even under surveillance.
Campaign financing transparency: A 2019 investigation revealed that the BJP used thousands of WhatsApp groups to push targeted propaganda, many run by volunteers trained in “IT cells.” Ordinary voters often didn’t know they were receiving paid political content. Without transparency in political advertising, citizens remain in the dark while corporations and governments collude.
Citizen-driven platforms: India’s UPI (Unified Payments Interface) now handles over 10 billion transactions monthly. Unlike Paytm or Google Pay, UPI is state-backed, fee-free, and open. If such infrastructure is protected from corporate monopolization, it can show how technology can serve the public good instead of private profit.
4. Legal: Building Guardrails for Freedom
Robust data protection laws: When Jharkhand’s 11-year-old Santoshi Kumari died of starvation in 2017 because her Aadhaar-linked ration card was canceled, the tragedy showed how weak protections allow data systems to decide who lives with dignity and who doesn’t. Stronger consent and accountability laws could have prevented such an outcome.
Algorithmic accountability: In Europe, the EU’s AI Act is forcing companies to explain how AI decides loan approvals or welfare allocations. In India, gig workers still face “black box” algorithms. A Swiggy worker in Chennai told Scroll.in (2023): “One day my ID was blocked without reason. I had no way to appeal.” Unless laws demand algorithmic transparency, workers remain helpless.
Safeguarding dissent: India saw over 700 internet shutdowns between 2012 and 2022—the highest in the world. From Kashmir to Delhi protests, shutdowns cut citizens off from organizing. The right to privacy and the right to assemble must be treated as inseparable pillars of democracy, or else free speech becomes conditional on government approval.
---
A Final Word
Reclaiming freedom in the age of surveillance is not about rejecting technology but about reshaping its use. At a personal level, it means small acts of resistance. At a civic level, it means building solidarity. At a political level, it means forcing governments to choose people over corporations. And at a legal level, it means demanding guardrails that protect rights.
If Indians—and citizens everywhere—can weave these layers together, then technology can move from being a tool of quiet domination to a platform for human flourishing. The challenge before us is not whether technology will define our future, but who controls it, and for what purpose.