Vaccines need effective messengers

This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.

Getting the science right is only one element of coronavirus vaccine success. People have to trust the vaccines, too, and that requires effective communication.

My colleague Davey Alba co-wrote an article on Tuesday about the recent surge in misleading claims about vaccines against the coronavirus. She spoke with me about the challenge health professionals and others face in getting the message across about the effectiveness and safety of the vaccines, and about the role internet companies play in slowing the spread of misleading information.

Shira: You can't blame people for being cautious about new vaccines to prevent a virus that scientists don't fully understand.

Davey: Absolutely not. I want to distinguish between misinformation and the understandable caution we all have with regard to new vaccines.

Here is an example. Regulators are monitoring whether the Pfizer coronavirus vaccine causes harmful reactions in people who have a history of a certain type of severe allergic reaction. Misinformation narratives seize on this to spread the idea that allergic reactions to the vaccines are widespread, which is not true.

(New York Times journalists have answers to frequently asked questions about coronavirus vaccines.)

There is a history of Black Americans being mistreated or abused by the health care system. Does that legacy show up in misinformation about the pandemic or vaccines?

Previous misinformation campaigns have tried to capitalize on people's existing fears or divisions. In 2016, for example, Russian disinformation targeted Black Americans and immigrant groups because its operators believed that inflaming existing social divisions was an effective tactic. I have seen some early signs that people may be trying to take advantage of vaccine hesitancy among Black Americans.

How much are internet companies to blame for the spread of misleading information about vaccines?

This year, internet companies seemed prepared to act more aggressively against the urgent risks of misleading information. Facebook and YouTube have announced that they will remove coronavirus vaccine claims that have been debunked by public health experts, and Twitter says it is working on its own policy.

But writing guidelines is one thing. Whether they are effective and how companies enforce their policies is another question.

Once misleading information has become widespread, as we saw with false claims of election fraud in the recent U.S. election, it is difficult for internet companies to rein it in after the fact. The companies often enforce their own rules unevenly, and some people take advantage of loopholes.

The fundamental problem is that Internet platforms depend on maximizing people's attention. And false information is effective in getting people's attention.

The misinformation researcher Renée DiResta wrote that health officials have not done enough to make reliable health information convincing and understandable. Are health professionals or government officials trying to fix this?

I am grateful to the health professionals who take the time to communicate clearly with people about the coronavirus and vaccines. Here, too, news organizations play a role in making health messages accessible and understandable.

(Here is Shira's interview with a doctor who makes popular TikTok videos to educate people about the coronavirus and vaccines.)

In the days before the internet, people got important information mainly from friends and family, from others they interacted with in person, and from traditional news outlets. In a hypothetical world without the internet, would we be better informed about coronavirus vaccines?

I don't know. The flood of information now available, both good and bad, means we have to be more careful consumers of it.

It also makes it more important for researchers, science and health professionals, journalists and others to think about ways to communicate information effectively, so that people aren't left to make sense of what is happening on their own.

If you don't already get this newsletter in your inbox, please sign up here.

I'm going to give Periscope, an app you've probably never used, a moment in the (newsletter) sun.

Periscope was one of the first breakout apps that gave anyone the ability to broadcast whatever they wanted in real time. Twitter, which bought the app in 2015, announced on Tuesday that it would pull the plug on Periscope by March.

Periscope has been on its deathbed for a while, partly because most of us don't want to broadcast live video of whatever we're doing. But its influence lives on; live video is now everywhere. If you've hung out on Zoom during this pandemic year, watched real-time guitar lessons on Instagram, or interacted live with adult performers on sites like OnlyFans, then Periscope deserves a little credit.

My big question is not why some ideas like Periscope fail, but why relatively similar ideas have very different results.

Why did Periscope wither while Twitch built a thriving community of people entertaining one another by playing video games or just sitting around talking? Why did live video never really catch on for Facebook, even though the company tried very hard, yet it did for Facebook-owned Instagram?

(As an aside, live video remains a feature I feel conflicted about because of the real and difficult-to-control danger of people broadcasting terrible things.)

There will be post-mortems about what went wrong for Periscope, and Twitter certainly deserves at least some of the blame. Twitter is known for taking a promising new concept and squandering it by not investing in it, failing to improve its features, or mismanaging it in other ways.

The exact diagnosis of failures or successes is not easy, however. There is a magical alchemy of good idea, good execution, and good luck as to why some products live on and others don't. And sometimes, as with Periscope, failure isn't the end of the story.

We haven't heard the last of the massive cyberattack: My colleague David E. Sanger explained on The Daily what was behind the computer attack that hit several U.S. government agencies and why it is still unfolding.

Technology is not the answer, example infinity: The Markup selected the worst algorithms of 2020. The losers include data-driven systems that influence who receives critical medical treatment, a law enforcement agency's misuse of facial recognition technology that led to the wrongful arrest of a man in Detroit, and educators who used software to assign students' grades.

But sometimes people can use technology for good: This is a remarkable story about Ben Gardiner, who used the early internet to help people share information about AIDS, find support, and organize to change government policy. "His legacy lives on in anyone who now goes on the internet in good faith to provide information and support to those who are suffering," OneZero wrote.

"I'm tossing my hair, checking my nails …" Boston Medical Center staff danced to celebrate the distribution of the first batch of coronavirus vaccines to their colleagues. (I was definitely dancing to Lizzo when I typed this.)

We want to hear from you. Tell us what you think of this newsletter and what else you would like us to explore. You can reach us at [email protected]

If you don't already get this newsletter in your inbox, please sign up here.