<p>YouTube, like other social media platforms, spent years expanding its efforts to tackle misinformation after the 2016 election. It hired policy experts and content moderators and invested in more technology to limit the reach of false narratives. Not anymore.</p>.<p>Last month, the company, owned by Google, quietly reduced its small team of policy experts in charge of handling misinformation, according to three people with knowledge of the decision. The cuts, part of the reduction of 12,000 employees by Google’s parent company, Alphabet, left only one person in charge of misinformation policy worldwide, one of the people said.</p>.<p>The cuts reflect a trend across the industry that threatens to undo many of the safeguards that social media platforms put in place in recent years to ban or tamp down on disinformation — like false claims about the Covid-19 pandemic, the Russian war in Ukraine or the integrity of elections around the world. Twitter, under its new owner, Elon Musk, has slashed its staff, while Meta, which owns Facebook, Instagram and WhatsApp, has shifted its focus and resources to the immersive world of the metaverse.</p>.<p><strong>Also Read | <a href="https://www.deccanherald.com/business/business-news/elon-s-twitter-ripe-for-a-misinformation-avalanche-1181620.html" target="_blank">Elon’s Twitter ripe for a misinformation avalanche</a></strong></p>.<p>Faced with economic headwinds and political and legal pressure, the social media giants have shown signs that fighting false information online is no longer as high a priority, raising fears among experts who track the issue that it will further erode trust online.</p>.<p>“I wouldn’t say the war is over, but I think we’ve lost key battles,” said Angelo Carusone, the president of the liberal media watchdog Media Matters for America. After years of effort, he described a mounting sense of fatigue in the struggle. “I do think we, as a society, have lost the appetite to keep battling. And that means we will lose the war.”</p>.<p>The companies maintain they remain diligent, but the efforts to combat false and misleading information online — which arguably peaked during the Covid pandemic and the 2020 presidential election — have waned at a time when the problem of misinformation remains as pernicious as ever, with a proliferation of alternative sites competing for users.</p>.<p>Meta restored the accounts of former President Donald Trump on Facebook and Instagram on Thursday, barely two years after suspending him for inciting violence ahead of the storming of the Capitol. It did so even though his posts on his own platform, Truth Social, are rife with extremist content, like the conspiracy theories spread by the QAnon movement that Facebook has previously declared unacceptable. (Trump has not yet resumed posting on Facebook or Instagram.)</p>.<p><strong>Also Read | <a href="https://www.deccanherald.com/international/world-news-politics/as-covid-19-continues-to-spread-so-does-misinformation-about-it-1177114.html" target="_blank">As Covid-19 continues to spread, so does misinformation about it</a></strong></p>.<p>Musk has also invited Trump back to Twitter, one of many steps he has taken to dismantle the platform’s previous policies, boasting that he wanted to undo censorious decisions made by its previous owners.
The team that oversaw trust and safety issues — including misinformation — was among those eliminated under Musk’s leadership.</p>.<p>YouTube’s staff reductions in January were not as drastic but were significant for the small teams assigned to set and refine the platform’s policies. YouTube fired two of its five misinformation experts, including the team’s manager, leaving behind one person for political misinformation and two for medical misinformation, one of the people with knowledge of the decision said.</p>.<p>It also shed two of its five policy experts, called leads, who work on hate speech and harassment issues, the person said. These experts have played critical roles in determining where YouTube’s line between acceptable and unacceptable content should be and have advised executives on difficult content decisions.</p>.<p>YouTube also made small reductions among teams that enforce its policies and its rapid response team, which is involved in addressing problematic content on the platform. Outside of YouTube, Google’s trust and safety department shed a team of program managers who assisted policy experts, according to a message about the decision reviewed by The New York Times.</p>.<p>Policy experts working on other issues like extremism, child safety and policies for new products were unaffected by the layoffs.</p>.<p>YouTube said that overall, the layoffs were consistent with the 6 per cent job cuts across Alphabet.</p>.<p>“Responsibility remains our top priority,” Elena Hernandez, a YouTube spokesperson, said in a statement. “We’ll continue to support the teams, machine learning and policies that protect the YouTube community, and pursue this work with the same focus and rigor moving forward.”</p>.<p>Still, critics are concerned that social media companies have put the bottom line above the public good.</p>.<p>“Basically, trust and safety was the first thing to go,” said Joel Finkelstein, a co-founder of the Network Contagion Research Institute, which tracks hate and extremism on social media. “The veneer of civility is expensive.”</p>.<p>None of the companies say they are abandoning the fight against misinformation. Nick Clegg, the president of global affairs at Meta, wrote after Trump’s reinstatement on Facebook that the company would continue to intervene “where there is a clear risk of real-world harm.”</p>.<p>“The public should be able to hear what their politicians are saying — the good, the bad and the ugly — so that they can make informed choices at the ballot box,” he wrote in a statement. “But that does not mean that there are no limits to what people can say on our platform.”</p>.<p>Meta declined to comment further for this article.</p>.<p>Hernandez from YouTube said the company tries to show users information from trusted sources. YouTube said it had a dedicated team working on the midterms and had removed more than 10,000 videos about the elections.</p>.<p>Researchers at Media Matters found several examples of Covid-19 misinformation being spread on YouTube Shorts, the platform’s service for minute-long videos, in the past week. They also found an array of videos that espoused hateful, misogynistic and transphobic views. Some were from well-known creators, such as Candace Owens and Nick Fuentes.
Other videos and accounts featured Andrew Tate and Sneako, who were both barred from YouTube last year after violating its content policies.</p>.<p>“YouTube and the other social media platforms are inconsistent in their enforcement of their policies,” said Kayla Gogarty, the deputy research director for Media Matters.</p>.<p>YouTube said it was always working to strike a balance between allowing free expression and protecting online and real-world communities from harm. Nicole Bell, a spokesperson for the company, said that YouTube had removed six videos flagged by Media Matters for violating its policies, and it had terminated a channel for uploading content from a barred creator. But most of the more than two dozen videos flagged by Media Matters did not break the platform’s rules, she said.</p>.<p>Last year, the International Fact-Checking Network, representing more than 80 organisations, warned in a letter addressed to YouTube that the platform was “one of the major conduits of online disinformation and misinformation worldwide,” and that it was not addressing the problem.</p>.<p>The consequences of easing up on the fight against misinformation have become clear on Twitter. A new report by two advocacy groups, the Network Contagion Research Institute and the Combat Antisemitism Movement, found a surge in antisemitic content as Musk took over.</p>.<p>It described an organized campaign by extremists who had previously been barred from the platform. One, Tim Gionet, who used the name Baked Alaska online, was recently convicted and sentenced to 60 days in prison for his part in the Jan. 6, 2021, riot at the Capitol. Tweeting this month, he pressed what he called a conspiracy theory: “twitter unbanned all of us cuz their engagement was tanking w/o us.”</p>.<p>“It is true that the trust and safety efforts we have had to date have been really broken, but at least there were efforts,” said Finkelstein, an author of the report. “And there was some baby in the bath water.”</p>.<p>Despite Musk’s avowed commitment to unfettered speech on the platform, he has also moved to suspend accounts, like Kanye West’s, after a series of antisemitic remarks.</p>.<p>Nora Benavidez, senior counsel at Free Press, an advocacy group for digital rights and accountability, said the experience at Twitter showed that moderating offensive content remained important for the viability of platforms, regardless of economic considerations.</p>.<p>“Content moderation is good for business, and it is good for democracy,” she said. “Companies are failing to do that because they seem to think they don’t have a big enough role to play, so they’re turning their back on it.”</p>