

Why YouTube’s Ban On Anti-Vaccine Misinformation Is Step In Right Direction

By News Creatives Authors, in Leadership, at September 29, 2021

YouTube’s announcement today that it will ban anti-vaccine misinformation is an important step toward eliminating one factor that has prolonged the Covid pandemic: misinformation.

One of the best practices for managing and communicating about a crisis is to share only the latest and most accurate information about it, and to discourage others from posting wrong, outdated, or misleading “facts,” opinions, and advice about the situation and the steps being taken to address it.

Good And Bad News

The good news about social media is that it is an effective way to immediately inform hundreds of millions of people about a crisis.

The bad news is that the same platforms that are used to inform can also be used by others with their own agendas to misinform and thereby lengthen or worsen the crisis.

From a crisis management perspective, YouTube’s announcement is welcome news for several reasons. It:

  • Will eliminate an important source of vaccine misinformation.
  • Could help encourage people to get fully vaccinated, and thereby help end the coronavirus crisis sooner rather than later.
  • May spur other companies and organizations to take steps against vaccine misinformation.

Statement From YouTube

In a blog post, YouTube said, “Today, we’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO.”

YouTube said it will remove “content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines…”

It noted that, “This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them. Our policies not only cover specific routine immunizations like for measles or Hepatitis B, but also apply to general statements about vaccines.”

The Larger Fight Against Misinformation

As I wrote last October, “Misinformation continues to grow at an alarming rate, along with its potential to infect corporations and organizations. No matter how it spreads — via fake news sites, traditional news organizations, or social media — companies that ignore and hope this crisis goes away may be putting their image, credibility, and reputation at risk.”

Not The Real Thing

Coca-Cola has had first-hand experience with misinformation. In 2018 there was a false report that its Dasani brand of water was contaminated with parasites and that the Food and Drug Administration had shut down the company’s manufacturing facility and issued a major recall of the product.

In a statement, the Coca-Cola Company said that “The source of this false and inflammatory information about our brand is a hoax news website. There is no recall of Dasani being conducted in the U.S., so please confidently continue to enjoy Dasani bottled water.”

Marshall Van Alstyne, a professor of information systems at Boston University’s Questrom School of Business, specializes in curbing the spread of misinformation. “The bigger brands are more common targets because their name recognition make them better clickbait,” he told me in October.

No Help For Business From U.S. Government

Business leaders should not hold their breath waiting for the federal government to help them in their battles against misinformation and disinformation — two continuing crises that can put the image and reputation of their companies at risk. Companies will have to continue to fend for themselves the best they can.

That was the takeaway message for executives from last October’s hearing by the U.S. Senate Committee on Commerce, Science, and Transportation where the CEOs of Facebook, Google, and Twitter were questioned — and sometimes grilled — about their policies, practices, and procedures for monitoring and moderating content on their platforms. 

Lawmakers on both sides of the aisle did voice their concerns, accusations, and counter-accusations about misinformation and disinformation. But the senators focused exclusively on national political campaigns and the 2020 election. There was no mention of the impact of disinformation and misinformation on businesses, or of how current law could be changed to help protect them.

Future Federal Elections

The Hill reported on Wednesday that the Cybersecurity and Infrastructure Security Agency (CISA) “said it will use its ‘rumor control’ website to counter disinformation and misinformation during future elections despite the site’s role in former President Trump ousting several of the agency’s top officials in 2020.

“CISA Director Jen Easterly made the announcement…noting her concern around misleading election claims and saying the site would be one of the efforts to combat disinformation and misinformation that the agency is pursuing ahead of next year’s midterm elections. 

“So rumor control, when I looked at this as a private citizen, I saw what CISA was doing, which is really making sure that the American people have the facts that they need,” Easterly said during a keynote at the Aspen Institute’s Cyber Summit. “I worry a lot about misinformation and disinformation as a citizen, but also as a mom.”
