Just over a year ago, Meredith Whittaker stepped into the role of president of the Signal Foundation — and from the beginning, she has been dealing with political threats to encryption.

The foundation’s flagship product, the Signal messaging app, has drawn in users with its default end-to-end encryption and an uncompromising stance on privacy. But those same features have also made it a target. Governments in China, Egypt, Cuba, Uzbekistan and, most recently, Iran have banned Signal outright. In the U.K., recently passed legislation could require a messaging service like Signal to moderate harmful material such as terrorist content or child abuse imagery. To find that content, Signal would need access to user conversations, which would mean breaking the service’s end-to-end encryption. Similar bills have already been passed in India and proposed in Brazil. Whittaker doesn’t mince words, calling such laws an existential threat to Signal.

Rest of World spoke to Whittaker in late September at a coworking space in New York, where she leads the fully remote Signal team. Her bags were (nearly) packed before she set off on a six-week-long world tour, including stops in Tokyo, Bengaluru, and Brussels. Our conversation touched on the Indian government’s recent ban on more than a dozen encrypted messaging apps, her concerns about Signal competitor Telegram, and her headline-making threat to pull Signal out of the U.K. earlier this year.

This interview has been edited for length and clarity.


Let’s start with something you’ve been quite vocal about recently: the U.K. Online Safety Bill. The bill will require platforms to monitor and take action against several kinds of harmful content. You’ve said Signal will stop operating in the U.K. altogether if one specific provision is enforced: a requirement that Signal scan user messages for child abuse imagery. Why did you decide to take such a public stance on this law specifically?

It’s pretty simple. It was the first in an array of global laws that mirrored the same pretext, that mirrored the same legislative intent, that were all aiming to effectively undermine the right to private communication digitally. So the U.K. was just the tip of the spear in what I see as a global campaign.

It was important to us to make our analysis and the dangers of this law very clear, very loudly, because it’s existential to Signal, right? If we are forced to undermine the technology that guarantees privacy for the people who rely on us in the U.K., then we cannot operate in the U.K.

We were able to bring the long-standing technical consensus to bear on the conversation. The government acknowledged that there’s no technology that can magically scan everyone’s communications in a private and secure way. There are a number of interested vendors that have been, in one way or another, claiming the technology exists. It doesn’t exist in any safe form. There’s no way to square that circle.

Signal has chosen to speak publicly about legislative shifts around encryption in the U.S., the U.K., and the EU. How does Signal go about tracking developments in privacy law and policy in the non-Western countries where it operates?

We don’t have a policy team. We’re about 45 people now. I’ve done [policy] work, but I don’t have time to read a 200-page bill every three hours when another one in another country is published. So we’re lucky to have an informal but very good network of folks around the world.

I joined in September of last year. One of the first things I did was convene a meeting in Berlin of a bunch of the old heads of the digital rights and policy space to be like, let’s map what’s happening. What’s the fight like? Who are our allies, what are the pretexts they’re using? So we work with InternetLab in Brazil. We work with Internet Freedom Foundation in India.

We have a geopolitical position that is similar to Meta and others in terms of the global importance of Signal. What we don’t have is armies of policy folks in offices in every capital city, or the ability to just pay lobbyists or external law firms. We don’t have any of those resources. So we have to be smart and we basically have to organize with people who are doing this work — to be [well] networked.

It sounds like a real infrastructural disadvantage in some ways, compared to many global social media companies, even if you’re finding solutions and adapting.

On the infrastructural disadvantage, that’s true, but we also have the advantage of … we don’t have to be full of shit. We’re not actually a surveillance company. I’m not trying to pretend Facebook is good. I don’t have to toe a party line that is divorced from reality. And we aren’t Big Tech.

There’s one Signal for a reason, right? There isn’t a proliferation of small, independent tech efforts that are actually shipping high-availability tech at scale, and that’s because it costs millions and millions of dollars a year to maintain that. That cost is forever ongoing. There’s a lot that Signal is doing that just doesn’t have an analog. In part, that’s because of the new political economy of the tech industry, which is built on surveillance, which we don’t participate in.

A policy in India called the IT Rules, 2021, could similarly put pressure on Signal to moderate encrypted content. In part, it mandates that social media platforms remove content when ordered to by the Ministry of Electronics and Information Technology. Twitter and WhatsApp have sued the Indian government, citing takedown orders they received under the IT Rules, 2021. Has Signal taken any specific action to oppose this policy, as you have in the U.K.?

Our perspectives don’t change based on jurisdiction. We do one thing: We provide private communications. We don’t know who you’re talking to. No one else knows who you’re talking to. We’re not looking to be the world’s advocate. Part of what we’re going to do in [my trip to] India is just learning.

I’m a white lady. I’m based in the U.S. I don’t have an instinct for the nuances and realpolitik of India, which itself is heterogeneous. We’re going there to learn. We’re going there to understand who’s fighting this, what are the tactics that have worked and then how do we support you. If it does turn out that the way we can support is to be vocal, of course. But we’re going to take our guidance from people who have that expertise and not sort of be obnoxious and American about it.

Has Signal received any direct communication from the Indian government based on IT Rules, 2021?

My answer to that is, actually, I don’t know. It’s difficult to authenticate requests, right? You get an email and there’s a little footer, and it could be spoofed or not. The big companies have a huge apparatus for authentication. So there’s a real resource issue around that. If you want to serve a request on Signal, there’s a P.O. box to send it to (that’s the authorized channel), and it gets checked periodically.

We fight the subpoena requests we get. And if we aren’t able to fight them, we then provide the data we have: the fact that a given phone number registered a Signal account, when it was registered, and when it last logged into Signal.

So, I mean, you get litanies of requests. [We say] we don’t know, we couldn’t know [even] if we wanted to, which is the whole premise. We want to guard against [the possibility that] I turn evil. But I’m not planning on it.

In May, the Indian government banned 14 messaging apps, most of which use end-to-end encryption, claiming they were being used for terrorist activity. Do you think Signal could be targeted using similar grounds?

It was clearly an escalation, but it’s an escalation in step with a global attack on encryption. I can’t prove coordination. Obviously, coincidence happens. But there has been a sort of renewed focus on encryption in the last five years. [India’s app ban] was in line with that sentiment. And it was disturbing.

I want to bring up a specific incident that happened in August that you commented on at the time. The Iraqi government lifted a ban on Telegram after several days. The government came out and said it was because Telegram complied with their orders to moderate several channels on its platform. Telegram, just for the record, said that they did not share any user data with the government in that process. Why did you feel compelled to come out and publicly criticize Telegram’s decision?

There wasn’t a strategy there, but we frequently discuss the fact that there are actual stakes to Telegram’s misrepresentations. They’ve been really scammy insofar as they make a huge public show of a commitment to human rights and privacy, while not actually offering [end-to-end] encryption other than in opt-in one-to-one private chats. The pattern you see is of Telegram making a bunch of noise and then quietly cooperating with governments. This means that people are, I would say, tricked into trusting Telegram as a safe app for communications in high-stakes environments like Hong Kong. There are literally life-or-death consequences, or at least existential consequences. You can get disappeared.

[A spokesperson for Telegram disputed Whittaker’s comments, saying, “Telegram has not and will not participate in political censorship and was created to protect the right to protest.” The company also emphasized that all Telegram chats are subject to server-client encryption.]

“You’re not just trusting me to not play nice with governments. We literally don’t have the data, which is the only way to actually preserve privacy.”

How do you respond to the government pressure to comply when you’re threatened with an outright ban?

I mean, it’s simple: We can’t provide the information. So you can put a gun to my head — you’ll have to shoot. We don’t have it because it’s end-to-end encrypted. There’s no balancing that’s in our hands because of the guarantees we make. In part, [governments] are threatened because we can’t provide that information. There’s no way to sort of sit on us hard enough so that we start to undermine what we’re doing. Our code is open, it’s verified; our protocol implementation is open, and it’s been tried and tested. Everyone’s thrown everything against it, trying to find a vulnerability. They haven’t [found it]. This is a robust system. You’re not just trusting me to not play nice with governments. We literally don’t have the data, which is the only way to actually preserve privacy.

When you’re weighing the possibility of a ban, do you consider the communities of dissidents, activists, and marginalized people who rely on Signal?

Ensuring that the people who are most at risk have a meaningful way to communicate privately, outside the surveillance of oppressive governments or corporations or employers, is core to our mission. So yeah, we are very disturbed by those bans. We’re not casual about them. In Iran, for example, we worked with the community of people who are dedicated to Signal privacy to set up proxy servers. That helped somewhat: people who had already downloaded Signal could reach it through a proxy, which got some of them around the ban.

Our hope would be that we can implement Signal in a way that [would require a government to] cut off all of Google, or all of the internet, to block Signal. And thus, we have some level of cover there. This has been referred to as domain fronting. I won’t get into detail on how we use those techniques, because there’s an advantage to not [getting into it]. We are actively working on it.
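
[For readers unfamiliar with the technique Whittaker references: the core idea behind a circumvention proxy is a relay on an unblocked address that forwards encrypted traffic to Signal’s servers untouched, so a censor watching the client sees only a connection to the relay. The sketch below is a deliberately minimal illustration of that relay pattern, not Signal’s actual proxy software; the upstream hostname chat.signal.org and the listening port are assumptions made for the example.]

```python
# Minimal sketch of a passthrough relay (illustration only; NOT Signal's
# real proxy). The relay never decrypts anything: it copies already-
# encrypted bytes in both directions, so a censor observing the client
# only ever sees the relay's address, not Signal's.

import socket
import threading

LISTEN_ADDR = ("0.0.0.0", 8443)       # where blocked clients connect (assumed port)
UPSTREAM = ("chat.signal.org", 443)   # assumed Signal endpoint for this sketch

def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until either side closes the connection."""
    try:
        while chunk := src.recv(4096):
            dst.sendall(chunk)
    except OSError:
        pass
    finally:
        for s in (src, dst):
            try:
                s.shutdown(socket.SHUT_RDWR)  # unblock the opposite direction
            except OSError:
                pass

def handle(client: socket.socket) -> None:
    """Connect to the upstream server and relay traffic in both directions."""
    try:
        upstream = socket.create_connection(UPSTREAM)
    except OSError:
        client.close()
        return
    threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
    pipe(upstream, client)

def main() -> None:
    with socket.create_server(LISTEN_ADDR) as server:
        while True:
            client, _addr = server.accept()
            threading.Thread(target=handle, args=(client,), daemon=True).start()

if __name__ == "__main__":
    main()
```

[Domain fronting, which Whittaker names, goes a step further than this sketch: the client addresses its connection to a large, hard-to-block domain, and the traffic is routed to the real destination inside that provider’s infrastructure, so blocking the app means blocking the whole provider.]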

Have you actually been seeing success with these efforts, including in Iran?

It’s not sufficient, but it’s what we can do. It does highlight the fact that in regions where the government either owns the telecommunications operator or is tightly entwined with it, the state still has power. Look, we are an extremely proficient but small nonprofit tech organization that does not control the tech ecosystem on which we’re deploying. We’re doing everything we can. But in some sense, I think there’s a point at which the boundaries of what we’re in control of can’t be superseded by technical cleverness. One of those boundaries is that nation-states with infrastructure are sometimes threatened by privacy — whether it’s Iran or the U.K.

It’s clear that Russia is keenly aware of Signal’s user base. The New York Times reported earlier this year that Russian security services are using digital surveillance tools to track the activity of encrypted chat app users, including Signal users. While they are unable to read the content of messages, they can track whether someone uses multiple phones, map contact networks, and triangulate when a phone has been in a certain location. Do you consider this surveillance a privacy breach for Signal users?

Well, let’s be clear. It doesn’t mean that Signal itself has been breached. The integrity of Signal’s encryption, the integrity of Signal, is maintained. But it does mean that people need to do more than simply install Signal on [an insecure] device and assume it’s safe.

We are living in a world where we have vanishingly little control over, or insight into, how the computational technology we rely on works. And Signal doesn’t control your device. It doesn’t control the metadata that might be sent by your device outside of Signal, particularly if [you’re] using it in a highly sensitive context.

Signal is the best choice. We do encounter a lot of bad-faith ops, where it’s very clear that sowing doubt about Signal’s integrity is their main goal. I’m trying to thread the needle here: my core concern is making it clear that Signal has a lot of integrity, but there’s also a giant, feral technological ecosystem out there. If you depend on that ecosystem, you can also suffer consequences.