Would you be willing to let the government scan all your digital communications if it meant catching more child predators? What about your online dating profile? Would you still want Tinder to run background checks on potential matches if it meant discriminating against people based on their race?
Good intentions are no protection against unrestricted data surveillance becoming a human rights disaster. Governments and companies alike, despite noble intentions to protect people from harm, can end up restricting rights or further oppressing marginalized communities when they fail to strike the right balance between security and privacy.
The problem with the ‘nothing to hide’ argument
Digital rights organizations have lambasted the European Commission in recent weeks for its heavy-handed approach to combating child sexual abuse and exploitation online. The EC’s proposed legislation would enable and even encourage companies to carry out comprehensive, automated scanning of everybody’s private messages and chats instead of limiting searches to potential suspects. After all, innocent people shouldn’t worry about this type of surveillance as long as they “have nothing to hide,” right?
Such a move gives companies and governments enormous power and poses a serious risk to journalists, whistleblowers, and others who rely on the privacy of online communications for their safety. Furthermore, it calls into question the rights to privacy and data protection enshrined in the European Convention on Human Rights and the EU Charter of Fundamental Rights. Last month, 48 European digital rights advocacy associations signed a letter urging the European Commission to consider other (more proportionate but equally effective) solutions for protecting children, which is certainly a shared and commendable goal.
The U.S.-based dating app company Tinder has also been at the center of recent controversy over its use of data, following the launch of a new feature that allows users to run criminal background checks on potential matches. While Tinder’s move is intended to make meeting strangers from the internet safer, it likewise takes the approach of indiscriminately surveilling personal data and risks reinforcing existing inequities.
Background checks are not an appropriate proxy measure for the risk involved in meeting up with someone from a dating app. Experts have pointed out that the checks (which are run by a third party) could give users a false sense of security, as only about a third of sexual assaults are reported to police and even fewer result in criminal convictions. In the United States, background checks also disproportionately flag people who have had a greater number of interactions with police. Amid a national awakening to the injustices baked into criminal justice systems, banning people based on their interactions with police reinforces discrimination and oppression against specific groups.
Proportionality is key.
Using all the personal data that’s available in service of protecting security isn’t necessarily appropriate, ethical, or even effective. Finding the right policy balance between security and privacy is complex; it requires the public and private sectors to exercise self-restraint, to use only the minimum data necessary for the purpose at hand, and to anticipate unintended consequences for privacy and people’s rights.
In this context, participatory approaches (ones that involve people or their representatives directly in these processes) can help flag potential issues and give people the chance to share in decision making on the inevitable trade-offs between security and privacy. The #RestoreDataRights movement has called upon African governments to include people in decisions about how to manage and protect personal data both during and after COVID-19, noting that failure to do so “can detrimentally impact on how citizens view the government, and by extension it can undermine their trust in the government as a responsible data steward. It can also create or perpetuate policy blind spots, for instance around how data about marginalized or vulnerable groups are collected or used.”
People want transparency when it comes to how their personal data is handled.
We saw the benefits of participatory approaches in practice in the way contact tracing apps were rolled out in Germany and France during the COVID-19 pandemic. The Corona-Warn-App in Germany is open source and privacy-focused. Its code was published on GitHub and shaped by a collective effort: more than 100 people contributed, and more than 5,300 commits were made to review and refine the code in the month following its release. Corona-Warn-App has been downloaded more than 44 million times.
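To give a sense of what “privacy-focused” means here, the sketch below illustrates, in very simplified form, the decentralized exposure-notification model that apps like Corona-Warn-App build on: phones broadcast short-lived random identifiers, store the identifiers they observe locally, and only check them against keys voluntarily published by users who test positive. The function names and key-derivation details are hypothetical simplifications for illustration, not the app’s actual code.

```python
import hashlib
import os

def derive_broadcast_ids(daily_key: bytes, intervals: int = 144) -> set:
    """Derive the short-lived identifiers a phone would broadcast over
    Bluetooth during one day, from a random daily key kept on the device."""
    return {
        hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
        for i in range(intervals)
    }

def check_exposure(heard_ids: set, published_positive_keys: list) -> bool:
    """Runs entirely on the user's device: compares identifiers the phone
    has overheard against daily keys voluntarily uploaded by users who
    tested positive. No central server ever learns who met whom."""
    return any(heard_ids & derive_broadcast_ids(key) for key in published_positive_keys)

# Toy example: a phone that overheard one identifier from an infected user's key.
infected_key = os.urandom(32)    # published after a positive test
heard = {next(iter(derive_broadcast_ids(infected_key))), os.urandom(16)}
print(check_exposure(heard, [infected_key]))              # True: possible exposure
print(check_exposure({os.urandom(16)}, [infected_key]))   # False: no match
```

Because the matching happens on the device rather than on a government server, publishing the full source code lets outside experts verify that the privacy claim actually holds, which is precisely the kind of scrutiny the French app did not invite.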
Meanwhile, in June 2020, the French government launched a similar app, StopCovid. While parts of the source code were made public in May before the release, the most important lines remained undisclosed or subject to proprietary rights. Furthermore, no participatory approach was taken to the further development and refinement of the app beyond allowing some technical auditing by security and privacy experts. The app was downloaded just 2.5 million times before the French government replaced it with yet another app, TousAntiCovid.
Contact tracing apps record people’s proximity to other users and notify those who may have been exposed to the virus. For such potentially intrusive applications, citizens saw the greater stakeholder involvement and openness of the German app as a sign of better oversight and therefore trusted that initiative more. The success of the German app has been ascribed, at least in part, to the greater transparency and the more participatory, collective approach taken in its development.

Such approaches don’t always have to involve open source code development. In Ghana, for example, a multi-stakeholder steering committee was set up to review unanticipated requests for access to mobile phone data covered by a data sharing agreement between a telecommunications company and a government agency for a specific purpose. Bringing interested stakeholders together through the committee ensures that the purpose of each new data request is considered separately.
Effectively addressing the trade-off between security and privacy became a key concern during the pandemic, but this debate has been around for a long time. Data and digitization have amplified the costs and benefits on both sides and made participatory and redress mechanisms even more important to striking the right balance.
Increasing public awareness of and engagement in these decisions is key to thwarting overreach and harmful practices. Otherwise, well-intentioned efforts to protect people that include surveillance of personal data will likely harm the people who stand to gain—and lose—the most from data use.