YouTube may contain misinformation, researchers say

No, China did not work with the Democrats to steal the midterm elections, as some videos on YouTube claim. Neither did Saudi Arabia.

And there is no evidence that Pennsylvania was “overwhelmed by fraud” in 2020, or that electronic voting machines will manipulate the results next week, as one conservative activist claims in a video on the platform.

Ahead of the midterm elections, disinformation watchdogs say they are concerned that what YouTube has described as an aggressive effort to combat misinformation on the Google-owned platform has blind spots. In particular, they are worried about YouTube Shorts, the platform’s TikTok-like service for very short videos, and about its Spanish-language videos.

But it is hard to get a clear picture of the problem, more than a dozen researchers said in interviews with The New York Times, because they have limited access to YouTube’s data and because studying the videos is time-consuming.

“It’s easier to do research with other forms of content,” such as text on Facebook or Twitter, said Jiore Craig, director of digital integrity at the Institute for Strategic Dialogue, or ISD, a nonprofit that counters extremism and disinformation. “It puts YouTube in a position where they get off easier.”

While Facebook and Twitter are closely scrutinized for misinformation, YouTube has often flown under the radar, despite the broad influence of video. The platform reaches more than two billion people and ranks as the second most popular search engine on the internet.

YouTube banned videos alleging widespread fraud in the 2020 presidential election, but it has not instituted a comparable policy for the midterm elections, a decision that has drawn criticism from some observers.

“You don’t build a sprinkler system when a building is on fire,” said Angelo Carusone, president of Media Matters for America, a nonprofit that monitors conservative misinformation.

A YouTube spokeswoman said the company disagreed with some of the criticism of its work combating misinformation. “We have invested heavily in our policies and systems to ensure that we successfully combat election-related disinformation with a multi-layered approach,” the spokeswoman, Ivy Choi, said in a statement.

YouTube said it removed a number of videos that The New York Times flagged for violating its policies on spam and election integrity, and that it determined other content did not violate its policies. The company also said it removed 122,000 videos containing misinformation between April and June.

“Our community guidelines prohibit misleading voters about how to vote, encouraging interference with the democratic process, and falsely claiming that the 2020 US election was rigged or stolen,” Ms. Choi said. “This policy applies worldwide, regardless of language.”

YouTube stepped up its stance against political misinformation after the 2020 presidential election. Some YouTube creators live streamed the attack on the Capitol on Jan. 6, 2021, as it happened. Within 24 hours, the company began punishing people who spread lies that the 2020 election was stolen, and it later revoked President Donald J. Trump’s upload privileges.

YouTube committed $15 million to hiring more than 100 additional content moderators to help with the midterm elections and with the presidential election in Brazil, and the company has more than 10,000 moderators worldwide, according to a person familiar with the matter, who was not authorized to discuss personnel decisions.

The company has improved its recommendation algorithm so that the platform does not suggest political videos from unverified sources to viewers, according to another person familiar with the matter. YouTube has also set up an election war room staffed by dozens of employees and was prepared to quickly remove videos and live streams that violate its policies on Election Day, the person said.

Still, the researchers argued that YouTube could be more proactive in cracking down on false narratives that may continue to resonate after the election.

The most popular election conspiracy theory on YouTube is the baseless claim that some Americans cheated by stuffing drop boxes with multiple ballots. The idea came from “2000 Mules,” a discredited, conspiracy-filled documentary that claimed the 2020 election was stolen from Mr. Trump through illegal ballots deposited in drop boxes.

ISD examined YouTube Shorts and found at least a dozen short videos that repeated the ballot-trafficking claims of “2000 Mules” without warning labels to counter the misinformation or provide authoritative election information, according to a report shared with The New York Times. Views of the videos varied widely, from a few dozen to tens of thousands. Two of the videos contained references to the film itself.

ISD found the videos through keyword searches. Its list was not intended to be exhaustive, but “these shorts were identified with relative ease, indicating that they remain readily available,” three ISD researchers wrote in the report. Some of the videos feature men addressing the camera from a car or at home, professing their belief in the film. Others promote the documentary without personal commentary.

The nonprofit group also looked at TikTok and Instagram’s Reels, which compete with YouTube Shorts, and found that both spread similar misinformation.

Ms. Craig, of ISD, said nonprofit groups like hers are working hard in the run-up to Election Day to catch and counter disinformation left on the tech giants’ social media platforms, even though those companies have billions of dollars and thousands of content moderators.

“Our teams are trying to pick up the slack for well-resourced organizations that can do this kind of work,” she said.

Although YouTube Shorts videos run no longer than a minute, they are more difficult to moderate than longer videos, according to two people familiar with the matter.

The company relies on artificial intelligence to scan what people upload to its platform. Some of the AI systems work in minutes and others over hours, looking for signals that something may be wrong with the content, one of the people said. Short videos generate fewer signals than long ones, the person said, so YouTube has begun building systems that work more effectively on its short-form content.

YouTube has also struggled to curb Spanish-language misinformation, according to research and analysis by Media Matters and Equis, a nonprofit focused on the Latino community.

Almost half of Latinos turn to YouTube weekly for news, more than to any other social media platform, said Jacobo Licona, a researcher at Equis. Those viewers encounter an abundance of misinformation and one-sided political propaganda on the platform, he said, with Latin American influencers based in countries like Mexico, Colombia and Venezuela weighing in on U.S. politics.

Many of them have picked up familiar stories, such as false claims of dead people voting in the US, and translated them into Spanish.

In October, YouTube asked a group that monitors Spanish-language misinformation on the site for access to its tracking data, according to two people familiar with the request. The company was seeking outside help in policing its platform, and the group worried that YouTube had not made the investments needed to monitor Spanish-language content, the people said.

YouTube said it reached out to subject matter experts to gain additional insight ahead of the midterm elections. It also said it has invested heavily in combating harmful misinformation in various languages, including Spanish.

YouTube has a somewhat laborious process for moderating Spanish-language videos. The company employs Spanish-speaking moderators, who help train the AI systems that also review content. One method involved transcribing videos and reviewing the resulting text, one of the people said. Another used Google Translate to convert Spanish transcripts into English. Those methods have not always proven accurate, because of idioms and slang, the person said.

YouTube said its systems also evaluated visual cues, metadata and on-screen text in Spanish-language videos, and that its AI was able to learn new trends such as emerging phrases and slang.

Researchers also found English-language claims of voter fraud from personalities with large followings, including Charlie Kirk; Dinesh D’Souza, who created “2000 Mules”; and Tim Pool, a YouTube personality with 1.3 million followers who is known for casting doubt on the results of the 2020 election and on the use of drop boxes.

“One of the things that is most troubling to me is that people who are surveilling drop boxes are praised and encouraged on YouTube,” Kayla Gogarty, deputy director of research at Media Matters, said in an interview. “It’s a very clear example of something going from online to offline that can cause harm in the real world.”


