The debate on freedom of expression on social networks is over

The Supreme Court of the United States is firmly committed to the “marketplace of ideas”. It tends to believe, in the words of Justice Louis Brandeis, that the remedy for lies and fallacies is “more speech, not enforced silence”.

If you believe this, you might also believe that when people lie about COVID-19, the 2020 presidential election, a politician, a journalist, a neighbor – or you or me – nothing can be done. Of course, you can answer with “counter-speech”: the truth. And that’s all.

The problem is that in many cases counter-speech is ineffective. Lies lodge in the human mind. They’re like cockroaches: you can’t quite get rid of them.

This psychological reality raises serious questions about current constitutional understandings, and also about the current practices of social media platforms, including Facebook, YouTube, and Twitter, for dealing with lies. Ironically, these understandings and practices may themselves be based on an error of fact – something like disinformation.

In United States v. Alvarez, decided in 2012, the Supreme Court appeared to rule that lies and falsehoods are protected by the First Amendment. The court struck down a provision of the Stolen Valor Act that made it a federal crime to falsely claim to have won the Medal of Honor. According to the court, this provision is unconstitutional; the government cannot punish this lie.

As the court said, “A database created by the government could list Medal of Honor winners. If a database were accessible via the Internet, it would be easy to verify and expose false claims.” In short: the right remedy for lies is more speech, not forced silence.

The Alvarez case involved a boastful lie about oneself, and it is not entirely clear how it applies to vicious lies about others, or to lies about health, safety, and elections. In limited circumstances, judges have allowed civil actions for defamation, even where a public figure is involved. But in general, the court has been reluctant to allow any sort of “truth policing”. Social media providers, notably Facebook, have felt the same.

But the broad protection of lying, defended by reference to the marketplace of ideas, rests on an insufficient understanding of human psychology.

Whenever someone tells us something, our initial tendency is to believe it. If we’re told it’s going to snow tomorrow, that the local sports team won today’s soccer game, or that a good friend just had a heart attack, we tend to think that we were told the truth.

Of course, you might not believe a source you’ve learned to be wary of. But most of the time, we assume that what we are hearing is true.

This is called “truth bias”, and it extends further than you might think. Suppose you hear something that you know perfectly well is wrong. Or suppose that right after you have been told something, you are explicitly told, “That was a lie!” For example, the lie could be that a vaccine does not work, that a business executive has engaged in sexual harassment, that an aspiring politician was once a member of the Communist Party, or that a prominent sociologist is a cocaine addict.

Even if you are told that what you heard is false – a joke or a malicious fabrication – you will likely have a persistent feeling that it is true, or at least that it could be true. This impression can last a long time. It will likely create a cloud of suspicion, fear, or doubt. It can easily affect your behavior.

It can make you fear or hate someone, or believe that there is something wrong with that person, even though there really isn’t. You might conclude that, on balance, the statement is probably false.

But “probably false” does not mean “definitely false”. It means “maybe true”.

University of Pennsylvania psychologist Paul Rozin has undertaken some fascinating experiments that help explain what’s going on here. In one of his experiments, people were asked to put sugar from a commercial sugar package into two similar brown bottles. Then people were given two labels, “sugar” and “sodium cyanide,” and were asked to put them on the two bottles however they wished.

After doing this, people were reluctant to take sugar out of the bottle labeled “sodium cyanide” – even though they had affixed the label themselves! Seeing the “cyanide” label on a bottle, people did not want to use what was inside, even though they knew it was just sugar. This helps to explain why lies and falsehoods are so corrosive; part of us believes them, even though we know we shouldn’t.

Lies and falsehoods, including conspiracy theories, often have a lasting detrimental effect, long after they have been successfully debunked. This finding has strong implications for practice. It suggests, for example, that social media providers should not assume that corrections, labels, warnings, and clarifications will reverse the effects of lies.

A much more effective approach would be to reduce the likelihood that the most harmful lies circulate in the first place – not necessarily by removing them, but by reducing their prominence and visibility (in, for example, the News Feed) and therefore the probability that they will spread. Facebook should do more. And when serious harm is otherwise unavoidable, removing lies, or not allowing them in the first place, would of course be even more effective.

No one – much less public officials – should take on the role of a Ministry of Truth. But informed by psychological research, some social media providers have improved their policies for dealing with COVID-19 lies – sometimes by removing them.

That is strong medicine, generally to be avoided. But when there is a serious threat to health or safety, or to democracy itself, it is exactly what the doctor ordered.
