The Signal app and the danger of privacy at all costs


Two weeks ago, the Twitter co-founder Jack Dorsey argued passionately in a blog post that neither Twitter, nor the government, nor any other company should exert control over what participants post.

“It’s critical,” he said, “that the people have tools to resist this, and that those tools are ultimately owned by the people.”

Mr Dorsey is promoting one of the most potent and fashionable notions in Silicon Valley: that a technology free of corporate and government control is in the best interest of society. To that end, he announced he would give US$1 million (S$1.34 million) a year to Signal, a text-messaging app.

Like Messages on your iPhone, Facebook Messenger and WhatsApp, Signal uses end-to-end encryption, making it impossible for the company to read the contents of user messages. But unlike those other companies, Signal also refrains from collecting metadata about its users. The company does not know the identity of users, which users are talking to each other or who is in a group message. It also allows users to set timers that automatically delete messages from the sender’s and receiver’s respective accounts.

The company – a limited liability company (LLC) that is governed by a non-profit – is founded on the belief that it needs to combat what it calls “state corporate surveillance” of our online activities in defence of an uncompromisable value: individual privacy. Distrustful of government and large corporations and apparently persuaded that they are irredeemable, technologists look for workarounds.

This level of privacy can be beneficial on a number of fronts. For instance, Signal is used by journalists to communicate with confidential sources.

But it is no coincidence that criminals have also used this government-evading technology. When the United States’ Federal Bureau of Investigation (FBI) arrested several members of the right-wing Oath Keepers militia for rioting at the US Capitol on Jan 6, 2021, one of its primary pieces of evidence was messages on Signal. It is unclear how the FBI got access to the messages in this instance; there is a longstanding cat-and-mouse game between law enforcement and technologists.

The ethical universe, according to Signal, is simple: The privacy of individuals must be respected above all else, come what may. If terrorists or child abusers or other criminals use the app, or one like it, to coordinate activities or share child sexual abuse imagery behind impenetrable closed doors, that is a shame – but privacy is all that matters.

One should always worry when a person or organisation places one value above all. The moral fabric of the world is complex. It is nuanced. Sensitivity to moral nuance is difficult, but unwavering support of one principle to rule them all is morally dangerous.

The way Signal wields the word “surveillance” reflects its coarse-grained understanding of morality. To the company, surveillance covers everything from a server holding encrypted data that no one looks at, to a law enforcement agent reading data after obtaining a warrant, to East Germany randomly tapping its citizens’ phones. One cannot think carefully about the value of privacy – including its relative importance to other values in particular contexts – with such a broad brush.

What’s more, the company’s proposition – that if anyone has access to data, then many unauthorised people probably will too – is false. This response reflects a lack of faith in good governance, which is essential to any well-functioning organisation or community seeking to keep its members and society at large safe from bad actors. Some people have access to the nuclear launch codes, but Mission: Impossible movies aside, we are not particularly worried about a slippery slope leading to lots of unauthorised people gaining access to those codes.

There is a bigger issue here: Small groups of technologists are developing and deploying applications of their technologies for explicitly ideological reasons, with those ideologies baked into the technologies. To use those technologies is to use a tool that comes with an ethical or political bent.

Signal is pushing against businesses like Meta that turn users of their social media platforms into the product by selling user data. But Signal embeds within itself a rather extreme conception of privacy, and scaling its technology is scaling its ideology. Signal’s users may not be the product, but they are the witting or unwitting advocates of the moral views of the 40 or so people who operate Signal.

There is something somewhat sneaky in all this, though I do not think the owners of Signal intend to be sneaky. Usually advocates know that they are advocates. They engage in some level of deliberation and reach the conclusion that a set of beliefs is for them.

But users of apps like Signal need not have such beliefs. They may merely (mistakenly) think, “Here’s a way to message people that my friends are using.” Signal’s influence does not necessarily hit us at the belief level. It hits us at the action level: what we do, how we operate, day in and day out. In using this technology, we are acting out the ethical and political commitments of the technologists.

