This op-ed was originally published in Just Security.
By Ken Gude
Western military assistance to Ukraine has clearly been an essential part of its ability to defend itself against Russian attacks. Another critical factor, highlighted during a series of recent transatlantic dialogues organized by the Munich Security Conference, has been widely available technology that has empowered everyday Ukrainians to participate in a truly whole-of-nation effort to repel the Russian invaders. After Russia began its full-scale assault in February, downloads of encrypted messaging apps such as Signal surged as users looked for safe channels to communicate. Major technology companies responded by further expanding their use of encrypted technologies.
That is why it is worrying to also learn that European governments are considering proposals that, while technically not affecting encryption itself, could unintentionally undermine the very reason the Ukrainian people, and endangered human rights defenders around the world, put their trust in that technology: the ability to keep their communications confidential. These policies could force some providers of encrypted messaging services to abandon the basic guarantee of confidentiality or pull out of the market.
Ukraine has a vibrant tech sector and a tech-savvy population, with many highly skilled technology workers in Kyiv and other major cities employed by domestic companies or Silicon Valley giants. Immediately upon the Russian invasion, this community sprang into action. Encrypted communication tools empowered Ukrainians to combat disinformation, organize relief efforts, and protect evacuees. Some apps helped protect citizens and soldiers, dramatically improving Ukraine’s Cold War-era warning system to alert mobile devices of incoming missile attacks. Others allowed anyone to monitor Russian troop movements and send the information to the authorities, crowdsourcing military intelligence. Without internet communication and encrypted messaging and tracking apps, it is unlikely the Ukrainians would have been able to resist the Russian war machine as effectively as they have.
At the same time, the European Union is pressing ahead with a mechanism to combat child sexual abuse and exploitation online that has prompted strong criticism from privacy experts. The proposed legislation would first assess which technology companies are at high risk of being used to transmit or store child sexual abuse material, and then obligate those companies to detect and report such material. The categories of material would be known images, new images, and grooming activities or soliciting children for sexual abuse, meaning the law covers both pictures and text. The proposal does not mandate that providers of communications and hosting services use specific technical approaches, and it allows significant discretion in how the law would be implemented and how the obligation to detect this material is met.
Potential Serious Privacy Concerns
The juxtaposition of the critical role encryption has played in the war in Ukraine with the concern about this proposal’s potential unintended consequences was another major takeaway from these discussions among transatlantic security officials and experts. For encrypted communications and storage capabilities, the means to detect this material could raise serious privacy concerns for all users, even though it would not affect the actual encryption.
Even though the EU proposal does not specify exactly how the material would be detected, it is likely that some form of content scanning will be used. In the context of child sexual abuse material, images are compared to a central database of known material through a process called hashing or fingerprinting. A hash, a compact digital fingerprint derived from the content at issue, is compared to the hashes in the database, which under the EU proposal would be maintained by a new European Center to Prevent and Counter Child Sexual Abuse that would be independent of the EU yet funded by it. When notified of a match, the European Center would determine whether to transmit it to law enforcement for possible further action. It is unclear what method would be used to detect grooming activities, but it would likely require scanning the contents of messages for keywords.
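In rough terms, hash-and-compare detection works as sketched below. This is a minimal illustration, not any provider’s actual implementation: real systems such as PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, whereas this sketch substitutes an exact SHA-256 digest, and the database contents are hypothetical.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    # Stand-in fingerprint function; real deployments use perceptual
    # hashes (e.g., PhotoDNA), not exact cryptographic digests.
    return hashlib.sha256(content).hexdigest()

# Hypothetical database of fingerprints of known material.
KNOWN_HASHES = {fingerprint(b"example-known-image-bytes")}

def scan(content: bytes) -> bool:
    """Return True if the content's fingerprint matches the database."""
    return fingerprint(content) in KNOWN_HASHES

print(scan(b"example-known-image-bytes"))  # True: match, would be reported
print(scan(b"ordinary-photo-bytes"))       # False: no match
```

The essential point is that the scanner never needs to interpret the image itself; it only checks whether the fingerprint appears in a list supplied by a central authority.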
Of critical significance is where this scanning takes place. One way the EU legislation could be implemented is by scanning content on a provider’s servers. Users agree, as they do now when they accept a provider’s terms of service, to transmit or store only certain types of material on those servers, so providers can scan material on their servers for content that is illegal or violates their terms of service. This is how most such scanning is done today, whether for comparatively innocuous material like email spam or in existing voluntary efforts to detect child sexual abuse material. An example is Microsoft’s PhotoDNA, which compares hashes of user uploads against the hash database of known child sexual abuse images maintained by the National Center for Missing and Exploited Children.
For encrypted communications and storage, however, this option of scanning on the server is not available, because the content is encrypted and the provider cannot access it. If any content scanning is to occur, it must be done on each individual user’s device, in a process called client-side scanning. The method is the same: hashes are compared to those in a database, but the location of the scan shifts to the user’s device. Such a potentially generalized search of all content on a user’s device is what triggers the concerns of privacy advocates and others.
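The shift in location can be made concrete with a small sketch, assuming hypothetical function names and a toy stand-in for encryption: in client-side scanning, the hash check runs on the user’s device before anything is encrypted, so it sees content that end-to-end encryption would otherwise keep confidential.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    # Stand-in for the perceptual hash a real scanner would use.
    return hashlib.sha256(content).hexdigest()

# Hypothetical on-device copy of the hash database.
KNOWN_HASHES = {fingerprint(b"example-known-image-bytes")}

def send(content: bytes, encrypt):
    """Hypothetical client-side flow: scan first, encrypt second.
    Returns ciphertext, or None if the content was flagged."""
    if fingerprint(content) in KNOWN_HASHES:
        return None  # flagged on the device, before encryption
    return encrypt(content)

# Toy reversible "encryption" for illustration only.
toy_encrypt = lambda m: m[::-1]
print(send(b"hello", toy_encrypt))                      # b'olleh'
print(send(b"example-known-image-bytes", toy_encrypt))  # None
```

Because the check precedes encryption, every outgoing item passes through the scanner, which is why critics describe it as a generalized search of the device rather than a limit on what a server may host.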
U.K. Pursues This Route Too
It’s not just the EU going in this direction. The U.K.’s Online Safety Bill makes a similar, yet broader, proposal that is primarily focused on protecting children from sexual abuse online but also covers detection of content related to terrorism, threats, stalking, drugs, weapons sales, and disclosure of private sexual images. This bill likewise does not specify a method of detecting this material, but U.K. authorities appear to support client-side scanning after two of Britain’s top cyber officials wrote a paper in favor of it.
A growing number of encryption experts, computer scientists, human rights advocates, privacy experts, and civil libertarians are warning about serious privacy concerns, in addition to the uncertain effectiveness of client-side scanning.
The EU’s own European Data Protection Supervisor and the European Data Protection Board issued a joint statement about the EU proposal, stating that, “measures permitting the public authorities to have access on a generalised (sic) basis to the content of a communication in order to detect solicitation of children are more likely to affect the essence of the rights guaranteed in Articles 7 and 8 of the Charter” of Fundamental Rights of the European Union.
A report on the right to privacy in the digital age by the United Nations High Commissioner for Human Rights, released in August, warned that “mandating general client side scanning would inevitably affect everyone using modern means of communications,” and that “frequent false positives cannot be avoided.” It concluded that client-side scanning “would constitute a paradigm shift that raises a host of serious problems with potentially dire consequences.”
The High Commissioner also specifically identified risks to human rights defenders if the confidentiality of encryption is undermined: “In specific instances, journalists and human rights defenders cannot do their work without the protection of robust encryption, shielding their sources and sheltering them from the powerful actors under investigation.”
Writing earlier this year about the U.K. proposal, a group of 45 organizations and experts highlighted the “possibility of similar approaches being taken to infiltrate private communications channels for other purposes both in the UK and around the world, including to further violate human rights.”
Law Professor Jeffrey Vagle, writing in Just Security last year, raised important questions about client-side scanning, noting that forced implementation would essentially require everyone to relinquish control over what technologies have access to sensitive data. Vagle argued that, to maintain trust, users should be able to choose who has access to their data and what technologies have access to their devices, and that in his view maintaining trust “means ensuring users retain control over the devices they own.”
A “General Mass-Surveillance Tool?”
Additionally, more than a dozen prominent computer scientists, privacy experts, and academics wrote a detailed paper in October 2021 identifying the risk that client-side scanning could be abused as “a general mass-surveillance tool,” and warning that implementing it “would be much more privacy invasive than previous proposals.” Their conclusion:
“Plainly put, [client-side scanning] is a dangerous technology. Even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.”
Will Cathcart, the chief executive officer of encrypted messaging app WhatsApp, told the BBC’s Tech Tent podcast that WhatsApp is a global product and that the company would not change its technology to comply with any new regulation. If either the EU or U.K. proposal were to become law, those governments could choose to block the app. As Cathcart said, WhatsApp offers “a global service and some countries choose to block it. I hope that never happens in a liberal democracy.”
We are currently witnessing a significant real-world example of the power of widely available commercial encryption technology, which is helping a democratic nation defend itself against a more powerful invader. Western democracies should support access to effective means of secure communication, particularly when it is used to preserve democratic institutions. Policymakers should guard against the unintended consequences of well-meaning proposals that could require embedding tools today that are used tomorrow in ways far beyond their original purpose. The ability to choose tools like encryption to keep communications confidential gives individuals confidence that they can control the technology they use. It is a key driver of a future digital ecosystem based on trust.
Ken Gude is the Executive Director of Trusted Future. He previously worked 15 years at the Center for American Progress, including as Chief of Staff and Vice President and Managing Director of the National Security and International Policy Team.