By Paula Parisi, March 7, 2022
In an effort to thwart misinformation, Amazon-owned live-streaming video service Twitch is cracking down on bad actors. “We do not believe that individuals who use online services to spread false, harmful information, have a place in our community,” the company stated. Twitch worked with researchers and experts to identify three characteristics that all bad actors share: an online presence dedicated to (1) persistently sharing (2) widely disproven and broadly shared (3) harmful misinformation topics, such as conspiracies that promote violence. Twitch specified that it will not take action against “one-off” statements containing misinformation. Continue reading Twitch Aims to Remove Channels That Spread Misinformation
By Debra Kaufman, March 26, 2021
Prior to a House hearing on social media’s role in extremism and disinformation, Facebook chief executive Mark Zuckerberg submitted written testimony on Section 230, suggesting that “platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it.” Section 230 of the 1996 Communications Decency Act holds that platforms are not liable for content posted by their users. In a bipartisan effort, lawmakers are pushing for change. “Our nation is drowning in disinformation driven by social media,” suggested Rep. Mike Doyle (D-Pennsylvania). “We will legislate to stop this.” Continue reading Congress Grills Big Tech Executives on Accountability Issues
By Debra Kaufman, October 9, 2020
Small social video app Triller saw an opportunity to grow in July when news of a potential TikTok ban was brewing. Triller’s first “get” was the Sway Boys, a group of TikTok influencers who had been mulling over the idea of starting their own app but were convinced to join Triller by its majority owner, entertainment executive Ryan Kavanaugh. He offered them a juicy deal: In exchange for joining, he told them, he’d give them equity and roles within Triller. And they could still post on TikTok, just less frequently. Continue reading Triller Rolls Out the Red Carpet to Attract TikTok Influencers
By Debra Kaufman, September 30, 2020
Given that 26 percent of Americans say they get news on YouTube, the Pew Research Center in January surveyed 12,638 U.S. adults who consumed news on YouTube about their experiences. The study also analyzed the news channels respondents watched and the content of videos on those channels, relying on a subset of videos published in December 2019. Pew found a news environment on YouTube in which established news organizations and indie news channels “thrive side by side.” Continue reading YouTube Users Turn to Established and Indie News Channels
By Debra Kaufman, September 22, 2020
Facebook vowed to stop QAnon, a conspiracy theory movement claiming that a satanic cult led by Democratic politicians and entertainers engages in child trafficking and cannibalism. Instead, QAnon’s Facebook groups have grown by hundreds of new followers, as have the Facebook pages of a violent militia movement. More disturbing, a study showed that Facebook’s own recommendation engine drove users toward these groups. YouTube is another social platform that reportedly recommends the content of fringe groups. Continue reading Social Media Platforms Struggle to Subdue Conspiracy Groups
By Debra Kaufman, September 3, 2020
Facebook and Twitter reported that Russia’s Internet Research Agency, which reportedly interfered in the 2016 U.S. presidential election, is again using fake accounts and has created Peace Data, a fake left-wing website. The operation, likely aimed at influencing the 2020 election, is believed to be spreading disinformation about Democratic presidential candidate Joseph Biden. U.S. intelligence agencies have warned for months about Russian meddling. Both social platforms have already taken steps to address such disinformation; most recently, Facebook announced plans to block political ads during the week before the November election, and Twitter is adding more context to Trending Topics. Continue reading Russia Pushes More Disinformation via Facebook and Twitter
By Debra Kaufman, August 21, 2020
According to global civic movement Avaaz, over the past year Facebook enabled 3.8 billion views of misinformation related to health, almost four times the views of sites such as the World Health Organization (WHO) and Centers for Disease Control (CDC). This has occurred despite Facebook’s partnership with these organizations to expose users to reliable information. In another effort to squelch misinformation, Facebook removed 790 QAnon groups and restricted another 1,950 groups, 440 pages and 10,000+ Instagram accounts. Continue reading Facebook Struggles to Contain Health Misinformation, QAnon
By Debra Kaufman, July 23, 2020
Twitter removed about 150,000 accounts disseminating QAnon right-wing conspiracy theories for violating the platform’s policies by distributing harassment and misinformation that could potentially lead to harm. The company added that it will no longer recommend QAnon-related accounts and content, including in email recommendations. Twitter also said it will work to keep these theories out of trending topics and search, and to limit users from posting links affiliated with them. Continue reading Twitter Bans Accounts Promoting QAnon Conspiracy Theories