MANILA, Philippines – Chances are you’ve been hit by an online advertisement for a product you were searching for just minutes ago. This is among the effects of the surveillance business model, which Meredith Whittaker, president of The Signal Foundation, the nonprofit behind privacy-focused chat app Signal, discussed during Rappler’s 2024 Social Good Summit, held on Saturday, October 19.
Whittaker, who joined the summit virtually, explained that surveillance has become central to the tech industry and to artificial intelligence as we know it. Companies rely on advertising to better understand their customers, which now means gathering large amounts of data on potential consumers in the pursuit of business growth and profit.
“There were decisions made in the neoliberal spirit to allow internet companies to conduct mass surveillance at any level they wanted, [to get] as much data as they wanted, and to endorse advertising as the engine of the tech economy — which of course matters, because advertising is the mandate to know your customer. So how do you know your customer as an internet company? Well, you gather more and more data on them,” she said.
But Whittaker said the design of social media and other tech platforms often risks creating “deeply perverse incentives and consequences” when it comes to online advertising. Outside commercial purposes, advertising tactics can also be used to spread propaganda at a large scale, ultimately influencing behaviors and public opinion.
For instance, Whittaker said advertising can seem like a “very benign thing” when brands persuade users to buy everyday products like bags or coffee beans. The danger lies in platforms microtargeting users to influence their belief systems, to divide communities, and to dictate which authority figures are trustworthy and which are not.
“We kind of realize just how collapsed the categories of propaganda and advertising are when we look at them in terms of behavior and incentives, and not in terms of whether they’re selling a handbag or a presidential candidate,” she added.
The discussion around microtargeting, the siphoning of personal data, and their harms has been going on for quite some time. But the lack of meaningful change, with the practice remaining harmful in its current form, means the conversation needs to continue.
What happens when personal, private data is exploited?
Whittaker said privacy measures have become “an increasingly valued set of capabilities” in tech. She cited apps like WhatsApp and iMessage, which have recently been advertising themselves as private, taking pride in their secure messaging features.
“You see tech companies across the board trying to advertise themselves as private, adopting small measures to ensure different forms of privacy because people are simply realizing that it is too important, after data breach, after data breach,” she said.
In recent years, data has been exploited and abused for various purposes. Among the most notorious cases is the Cambridge Analytica scandal, revealed in 2018, in which the personal data of 87 million Facebook users was harvested for political campaigning. Of the 87 million, 1.2 million were from the Philippines.
In the United States, personal data has been used against women seeking reproductive health care. In 2022, a mother and daughter from Nebraska were charged after their private Facebook messages showed they had accessed medication to induce an abortion at home, months before Roe v. Wade was overturned.
During the summit, Whittaker and Rappler CEO and Nobel laureate Maria Ressa also talked about how tech and data can be weaponized for military purposes, particularly in Gaza.
“The real stakes of this are every day becoming more and more chillingly apparent to everyday people. And we are seeing a move toward privacy, we’re seeing a push for better and more rights-respecting technology,” Whittaker said.
Reclaiming tech for good
When asked whether data privacy still exists today, Whittaker turned to Signal, her foundation’s messaging app, which has become known for secure, end-to-end encrypted communication.
“Signal is, I think, just a shining example that it is possible…at a large scale to build pleasing and useful and real-time technologies that do respect privacy, that get as close to collecting no data as possible, that respect the right to private communication, and that do not participate in the practice of making money from surveillance that is at the core of the tech industry,” she said.
But what about larger tech companies that are driven by profit? Whittaker said it’s not about dismantling tech per se, but about recognizing what role tech can play in fulfilling a vision or goal for a certain sector.
She cited the education system as an example, as teachers everywhere have been adopting AI and other advanced technologies to reinvent their teaching methods. Whittaker emphasized the need to conduct research and consult stakeholders to “ensure the absolute best environment and outcomes for the children and other people who are being educated.”
“Let’s walk back to a place where safe, respectful tech is based on the desires of the people who deeply know that institution and love that profession and understand those dynamics. And I think in every sector, that is really the starting point,” she added. – Rappler.com