YouTube blocks content with misinformation about COVID vaccines


In the midst of the pandemic caused by the Covid-19 virus, the presence of radical groups on YouTube and other social networks has become increasingly notable.

Given this, each of these platforms has had to adapt through the implementation of strategies and the creation of new tools.

In recent days, YouTube announced that it had decided to delete content that leads to misinformation, specifically content related to Covid-19 and the development of vaccines to eradicate the virus.

Do you want to know what measures YouTube took to stop the spread of fake news? Here we tell you!

  • The fight against disinformation continues on digital platforms; this time, YouTube has announced that it will block content related to Covid
  • All erroneous or unreliable content related to Covid vaccines will be removed from YouTube
  • The intention is to stop the spread of false news that may put the public's safety at risk

More and more radical groups are choosing social media as a means of sharing unreliable information on sensitive topics.

In this case, one of the most sensitive issues that has been recorded in recent months is the pandemic caused by the Covid-19 virus.


This is especially true because, from the outset, some groups believed the pandemic was a government hoax.

Today, a few months after the pandemic began, several laboratories around the world are working to develop a vaccine against Covid; however, this has further alarmed these groups, who consider vaccines a danger to society.

Facebook, for its part, has already decided to remove any advertisement related to vaccine development, especially if it contains false information or claims not confirmed by an authorized source.

Now it's YouTube's turn to stop the spread of false information related to a potential vaccine against Covid.

In a statement issued last Wednesday, YouTube assured that it will take on the task of eliminating videos that contain false or unconfirmed information about vaccines against Covid-19.

In practice, this represents an extension of the platform's current rules, which already prohibit content promoting conspiracy theories about the pandemic.

In addition, YouTube was emphatic that it would block any content discussing Covid vaccines that contradicts official information issued by local health authorities or the World Health Organization.

But what content does YouTube actually consider false or unreliable?

Given the danger posed by the spread of false information, YouTube has confirmed that it will remove from the platform any publication claiming that vaccines cause death or infertility in women. Likewise, it will remove videos asserting that people who receive the vaccine will have a microchip implanted against their will.

Like Facebook, YouTube will allow people to continue to voice their concerns regarding the potential Covid vaccine – as long as they don’t violate the above rules.


YouTube and the fight against false information related to Covid

Disinformation is an issue that YouTube has shown it takes very seriously, especially since unreliable content has been shown to put people's lives at risk.

For example, it has been found that videos discussing Covid (its contagion, medical care, home remedies, or even its veracity) can lead to negligence that puts the public who consults them at risk.

One of the issues that most concerns those responsible for the platform is conspiracy theories about Covid.

In particular, those related to vaccines that are currently being tested to combat and eradicate this virus.

So far, YouTube claims to have removed at least 200,000 videos containing false, unverified, or unreliable information about Covid.

Andy Pattison, head of digital solutions at the World Health Organization, said the organization is in direct contact with YouTube to discuss content trends and to identify videos that can be classified as 'problematic'.

The Covid- and vaccine-related content that will be removed from YouTube is anything that presents:

  • The Covid vaccine as a risk or cause of death
  • The Covid vaccine as a direct cause of infertility
  • The vaccine against Covid as a pretext for the implantation of microchips

A spokesperson for the YouTube platform stated: "Given the imminent arrival of a vaccine against Covid-19, we are striving to guarantee the necessary policies to remove misinformation about it. Content that includes information about Covid vaccines that contradicts the experts and the authorities of the World Health Organization will be removed from YouTube."

Thus, YouTube joins the efforts of platforms such as Facebook, Instagram and Twitter, which have already implemented measures against the spread of false information; however, each of them has made sure not to shut down the conversation, so discussion groups that do not pose a safety threat will continue to be allowed on these platforms.


The post YouTube blocks content with misinformation about COVID vaccines appeared first on Hispanic World.
