Shane Tews with Jeff Greene, Jim Kohlenberger, Jennifer Huddleston:
Encryption at a Crossroads: Can We Keep Data Secure Without Sacrificing Safety?

Originally published by American Enterprise Institute

Strong encryption is the backbone of digital privacy and secure data. Government pressure to weaken encryption is mounting, which raises serious concerns. How do we preserve strong encryption standards in the face of security concerns? And must we choose between security and privacy?

Earlier this year, Shane Tews moderated a panel titled Privacy and Governmental Surveillance at the Technology Policy Institute’s 2025 Aspen Forum. Panelists for this discussion included The Aspen Institute’s Jeff Greene, JK Strategies’ Jim Kohlenberger, and the Cato Institute’s Jennifer Huddleston. Their extensive and well-rounded professional backgrounds made for an engaging and productive conversation.

Below is a lightly edited and abridged transcript of our discussion. You can listen to this and other episodes of Explain to Shane on AEI.org and subscribe via your preferred listening platform. If you enjoyed this episode, leave us a review, and tell your friends and colleagues to tune in.

Shane Tews: Let’s talk about something that was a wake-up call for people—the issue of Salt Typhoon. Tell us what happened and what your recommendations were, both at the time, when you were in your position at CISA, and what you would say now.

Jeff Greene: Salt Typhoon is an industry term for a People’s Republic of China hacking group; the name has also come to describe that group’s intrusion into the wireless backbone of the United States, which we detected last year.

We, the government, were able to share some information and activity that we had seen with private sector partners. This group had burrowed deeply into the wireless backbone of the United States. Each victim—and I encourage you to think of them as victims; they are not bad actors in this situation—was differently situated in terms of how far in the PRC got and whether the intruders were just popping in and out.

There were some very big names that you can read about. And then there were some very small parts of the overall ecosystem (providers and infrastructure companies whose names you never would have heard) that even we in the government were surprised to realize were so centrally positioned. Unfortunately, the Chinese had figured that out, which is why they targeted them.

This activity was entirely on private sector networks. The intruders exfiltrated call records and call metadata for a large number of devices and calls. Public reporting will tell you that it was primarily in the Washington, D.C., Maryland, and Virginia area.

They were also able to target individual devices, including those of some high-profile people. The final thing they did was access the parts of some providers’ networks where what’s called CALEA information—that is, lawful intercept information—is stored, and they were able to exfiltrate a significant volume of it. This was deep, broad access, probably gained over a long period of time. Because these networks were put together over the course of years, if not decades, through mergers and acquisitions, the intrusion is very hard to mitigate and even to detect.

In terms of recommendations for the telcos, we put out, along with our U.S. government partners and many of our international partners, hardening guidance that, if followed, would kick out most of the activity and detect additional activity in the future. And I think the head of the FBI’s Joint Cyber Center recently said that they believe they have fully kicked out this intrusion.

For individuals, there’s not a lot you can do, because there is no malicious activity on your device and no effort to compromise your individual device. So, you do what you can. And I got in a bit of trouble for saying this on a press call, but our simple advice was: Encryption is your friend. Whether it is messaging or voice calls, find a way to use encrypted communications. That way, even if the adversary is on the network, and even if they target you individually, your information remains private.

That was a bit controversial inside the government, but really, that was our key recommendation: Make sure you’re using encryption, keep your device as updated as possible, and control what you control.

Shane Tews: Knowing how important encryption is, why do we have friction between encryption and law enforcement?

Jim Kohlenberger: I think we’re at a turning point. After the Salt Typhoon hack, we’re increasingly recognizing encryption’s value. The question is, how do we make it more ubiquitous?

Every day we rely on encryption in important ways: every website purchase, our text messages, our phones if they’re lost or stolen, and increasingly our data in the cloud.

Why? Because end-to-end encryption has a superpower: the host doesn’t have access to the key. If anybody gets access to the information, it just looks like gibberish.
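[Editor’s note: To make that property concrete, here is a minimal sketch in Python using the widely available cryptography package. The scenario and variable names are illustrative, not drawn from any particular messaging service: the key exists only at the two endpoints, so a host relaying the message, or an intruder on the network, sees only gibberish.]

```python
# Minimal sketch of the end-to-end principle: only the endpoints hold
# the key, so the relaying host (or a network intruder) sees ciphertext.
from cryptography.fernet import Fernet

# In a real messaging app, the endpoints would negotiate this key with
# a key-exchange protocol; the server never learns it.
key = Fernet.generate_key()

sender = Fernet(key)
ciphertext = sender.encrypt(b"meet at 6 pm")

# What the host, or an adversary on the network, actually sees:
print(ciphertext)  # opaque bytes, "gibberish" without the key

# Only the other endpoint, holding the same key, recovers the message.
receiver = Fernet(key)
print(receiver.decrypt(ciphertext))  # b'meet at 6 pm'
```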

We saw recommendations that everybody use end-to-end encrypted communications. We also saw a joint announcement from the NSA, FBI, and foreign governments urging us to encrypt our internet communications as much as possible.

But we’re seeing increasing threats to that vital encryption that runs every part of our economy.

Last week, Telegram and WhatsApp stood up to Russian regulators who tried to degrade their services, saying no to backdoors. Yesterday, Proton Mail announced they’re leaving Switzerland because of a law requiring secret backdoors—“snoopers’ charters.” Most importantly, in February, The Washington Post broke the news that the U.K. had issued a secret order to Apple to create a master backdoor to all cloud-encrypted data. That’s access to all of our data—your data and my data.

Why do countries want this? They think it will help them catch more criminals. But it’s easy for a bad actor to just use an open-source tool and re-encrypt their data. Governments argue they could keep a master key safe, but there has never been a backdoor created for the good guys that didn’t get abused by the bad guys.

In the U.K. case, it’s secret. None of the companies have been able to acknowledge they’ve received such an order, and we don’t know how many U.S. companies it may have applied to. Fortunately, bipartisan members of Congress expressed outrage, and Director of National Intelligence Tulsi Gabbard stepped forward, recognizing this was potentially a violation of the CLOUD Act. I’m happy to report that last night she tweeted that the U.K. has decided to withdraw that backdoor demand and remove this restriction.

She thanked the President and Vice President for helping push the U.K. back to, quote, “ensure Americans’ private data remains private and our constitutional rights and civil liberties are protected.”

That’s exactly right. But we need more companies to step up and say they won’t create backdoors. We need more folks to stand up to these countries as they try to put our data at risk.

Shane Tews: What we’re seeing now is a patchwork of state laws being billed as privacy or child online safety laws. What guidance are you giving people in the states when you go out and talk to state representatives and legislatures right now?

Jennifer Huddleston: When we talk about privacy, we’re oftentimes talking about several different things. Sometimes we’re talking about the individual’s relationship to the government and questions around encryption. Sometimes we’re talking about individual consumer privacy choices.

Most action at the state level has been creating some framework for those consumer choices, oftentimes privileging particular privacy preferences that may or may not reflect what an individual consumer wants and may or may not reflect the way data security and data privacy are going to evolve. Law is static; technology is dynamic. In privacy, we’re seeing new solutions that can improve our privacy and cybersecurity develop all the time.

We’re also seeing companies offer certain privacy features as a way to distinguish themselves in the market. A one-size-fits-all approach that presumes everyone always wants the most privacy-protective version, whatever the other concerns—whether around speech or innovation—can limit the way we’re able to see beneficial technology evolve, including in the privacy space.

We’re up to 19 or 20 states that have their own consumer data privacy framework. That is confusing for consumers, particularly those who might not know what their rights are or why they got 50 new pop-ups. It is especially burdensome for small and mid-sized businesses, which now might have to comply not only with existing laws in their industry but with a whole plethora of laws in every state in which they operate, potentially preventing them from operating nationwide.

What we actually see, particularly in the privacy space, is this “Brussels–Sacramento sandwich”: companies find themselves regulated by California’s laws—like the California Consumer Privacy Act—which require compliance even beyond the state’s borders, as well as by the European regulatory model, where regulation has become Europe’s greatest tech policy export. We no longer see many innovative European companies; we see attempts to regulate American ones. The losers are American consumers, who miss out on innovation as money goes toward compliance rather than toward improving products, and who see changes that might not best serve them but are necessary merely to comply.

When we’re thinking about the potential of a state-by-state approach, I am normally a big fan of federalism. I think the states serving as laboratories of democracy, and governing at the most local level possible, is a great thing. This is where I’m going to stop myself from fully nerding out about the Dormant Commerce Clause for half an hour—you can catch me outside in the lobby if you want to hear the full rant.

The internet, by its very nature, is almost always interstate. Very rarely is it an intrastate issue, which means the most local level of governance, if we look at this from a constitutional perspective, is going to be a federal data privacy framework.

Jim Kohlenberger: I agree with what Jennifer said. We do need a national data privacy framework and a national kids’ data privacy piece.

But I want to drill down into some of these age verification bills, because I think they create very significant privacy challenges and can actually undermine the goal we’re trying to achieve in protecting kids. At Trusted Future, we did surveys showing that 90 percent of parents are deeply concerned about protecting children’s privacy, identity, and safety online. But when you look at some of these proposals, they have a number of flaws.

In some cases, they require a third party to collect government-issued IDs, which is exactly the problem we saw in the Tea breach that impacted so many women. In other cases, they do age estimation using facial recognition or other techniques that are not actually very accurate. And in addition to the constitutional issues in many of these cases—which I know NetChoice has successfully challenged in court as conflicting with the First Amendment—some of these proposals put significant privacy at risk.

I’ll take a couple of examples. In Utah, Texas, and Louisiana, there’s a set of bills—and there’s companion legislation in the Senate as well as in the House—that would shift responsibility for age verification to app stores.

That approach has really significant problems. First, it requires sharing the specific age of the child not just with the companies that need that data, but with all of the millions of apps that are out there. And they do that without requiring parental consent. One of the things we see in our survey data is that parents want to be in charge of the information that gets shared about their children. There are also few controls on how that data can be used. It creates a ginormous honeypot, with millions of apps having to build databases of kids’ private information.

Instead, there are much better ways to do this. People are developing really great tools that let parents approve in advance which apps their kids can download, keep control over whether their children’s private data gets shared, and use age-range data that is sent only to the apps that specifically need it. There are ways to make progress in this debate that don’t require us to undermine privacy and safety.
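[Editor’s note: Below is a minimal sketch, in Python, of the data-minimization approach Jim describes. The bracket labels, the needs_age_signal flag, and the function names are hypothetical, invented for illustration; they are not drawn from any real app store API.]

```python
# Hypothetical sketch: the platform knows the child's exact birthdate,
# but an app receives at most a coarse age range, and only when a
# parent has approved the app and the app actually needs the signal.
from datetime import date

def age_bracket(birthdate: date, today: date) -> str:
    """Collapse an exact birthdate into a coarse, less sensitive range."""
    age = (today - birthdate).days // 365  # approximate age in years
    if age < 13:
        return "under_13"
    if age < 16:
        return "13_15"
    if age < 18:
        return "16_17"
    return "18_plus"

def signal_for_app(app: dict, birthdate: date, parent_approved: bool):
    """Share an age-range signal only with approved apps that need it."""
    if not parent_approved or not app.get("needs_age_signal"):
        return None  # neither the exact age nor the range is shared
    return age_bracket(birthdate, date.today())

# A parent-approved social app learns only "13_15", never a birthdate.
print(signal_for_app({"name": "chat_app", "needs_age_signal": True},
                     date(2011, 5, 2), parent_approved=True))
```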

Jeff Greene: I’ve got some thoughts echoing what Jim said, but first, on post-quantum cryptography: The migration to PQC is one of the biggest problems we’re facing that we’re not doing enough on, in government or the private sector. When I was at NIST, we started a project looking at how you actually deploy these algorithms. The data we saw suggested it takes an average of 10 to 15 years to deploy a new algorithm, and that was when we used encryption far less broadly than we do now.

Companies and organizations need to be working on this now. If you’re in a position in your company where you can ask, “Hey, what are we doing?” ask that question, drive that change, because this could come on us more quickly than we think.
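[Editor’s note: For readers curious what deploying a new algorithm looks like at the protocol level, here is a minimal sketch of a post-quantum key encapsulation handshake. It assumes the open-source liboqs Python bindings (the oqs package) with the ML-KEM-768 parameter set; it illustrates the mechanics only and is not deployment guidance.]

```python
# Sketch of a post-quantum key exchange with a NIST-standardized KEM
# (ML-KEM, FIPS 203), assuming the liboqs-python bindings are installed.
import oqs

ALG = "ML-KEM-768"

with oqs.KeyEncapsulation(ALG) as client, oqs.KeyEncapsulation(ALG) as server:
    # The client publishes a public key; the private key never leaves it.
    public_key = client.generate_keypair()

    # The server derives a shared secret plus a ciphertext to send back.
    ciphertext, server_secret = server.encap_secret(public_key)

    # The client recovers the same secret from the ciphertext.
    client_secret = client.decap_secret(ciphertext)

    assert client_secret == server_secret  # both sides now share a key
```

In practice, early migrations typically run a KEM like this in hybrid mode alongside a classical exchange such as X25519, which is one reason full rollouts take as long as Greene describes.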

Two things on what Jim and Jen said. First, the data collected is itself a target. Bad guys are incredibly creative at finding ways to use data against individuals, against families, against businesspeople. It worries me when the government mandates mass collection of data, because I know from experience that it’s really hard to protect, particularly when companies are often working on a shoestring budget. We need to think hard before we say, okay, this is for a public good, so it’s okay to do this. We need to focus on the end state—what do we think the actual result will be?

In that vein, the second thing, going back to the U.K. law—which, thankfully, Jim, I didn’t even know it sounds like they’re pulling back on—my first thought when I read about it, fairly fresh out of government, was that it would lead not to much actual data being shared with the government, but to a significant pullback in how companies are willing to work with the government.

I have yet to meet a company that wants to support criminals, child predators, or terrorists. In almost every case, when I’ve been in a company or in different parts of the government, the companies want to help. What they don’t want to do is undermine the core security of their devices. And in many situations where the government has a legal mandate or legal authority to get information, it does not have to resort to breaking encryption, provided it engages early enough and the company is willing to work with it.

If we create an environment where companies are incredibly defensive because they’re worried about being mandated to build a backdoor, or about the government using some other digital means, then in my experience there’s a good chance it leads to less cooperation and less security. I bristle at these broad proposals—what I call fairy dust mandates—which say, okay, create a process where you can break your encryption but keep it equally secure. Math is still math; that doesn’t work. We need to think about that going in. And as Jen said, these proposals are often really well-intentioned, but often don’t consider the practical outcomes.