Dozens of States Sue Meta for Social Media Addiction in Kids

Meta Platforms has been sued in federal court by 33 states, including California and New York, that claim its Instagram and Facebook platforms addict and harm children. The lawsuit is to date the most sweeping state effort to contend with the impact of social media on children's mental health. The suit, filed Tuesday in U.S. District Court for the Northern District of California, alleges Meta violates consumer protection laws by targeting children and deceiving users about platform safety. The same day, the District of Columbia and eight states filed separate complaints addressing the same issues.

Illinois Law Protecting Child Vloggers Will Take Effect in 2024

Illinois has become the first state in the nation to pass legislation protecting children who are social media influencers. Beginning in July 2024, children under 16 who appear in monetized video content online will have a legal right to compensation for their work, even if that means litigating against their parents. “The rise of social media has given children new opportunities to earn a profit,” Illinois Senator David Koehler said about the bill he sponsored. “Many parents have taken this opportunity to pocket the money, while making their children continue to work in these digital environments.”

Government Advances Online Safety Legislation for Children

The Senate has cleared two children’s online safety bills despite pushback from civil liberties groups that say the digital surveillance used to monitor behavior will result in an Internet less safe for kids. The Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) are intended to address a mental health crisis experts blame in large part on social media, but critics say the bills could cause more harm than good by forcing social media firms to collect more user data as part of enforcement. The bills — which cleared the Senate Commerce Committee by unanimous vote — are also said to reduce access to encrypted services.

Snapchat+ Introduces ‘My AI Snaps’ for Chatbot Snap Backs

Snapchat is rolling out a new feature for its premium Snapchat+ tier: users who send Snaps to My AI, letting the artificial intelligence know what they’re up to, will “receive a unique generative Snap back that keeps the conversation going” via My AI Snaps. The feature was previewed at the Snap Partner Summit in April as part of a larger push on AI updates, including the ability to invite the My AI chatbot to participate in group chats with friends and the ability to get AI Lens suggestions and place recommendations. In addition, the My AI chatbot — made free to all users this year — was updated to reply to users’ Snaps with a text-based response.

Utah’s Social Media Law Requires Age Verification for Minors

Utah has become the first state to pass laws requiring social media platforms to obtain age verification before users can register. The law is designed to force social networks to enforce parental consent provisions. As of March 2024, companies including Facebook, Instagram, Snap, TikTok and Twitter will be required to secure proof of age for Utah users via a valid ID instead of just letting people type in their birth date at sign-up. While Utah is out front on the issue, nine other states have proposed legislation that includes age checks, most recently Arkansas.

Meta’s Penalty Reforms Designed to Be More Effective, Fair

Meta Platforms is reforming its penalty system for Facebook policy violations. Based on recommendations from its Oversight Board, the company will focus more on educating users and less on punitive measures like suspending accounts or limiting posts. “While we are still removing violating content just as before,” explains Meta VP of content policy Monika Bickert, “under our new system we will focus more on helping people understand why we have removed their content, which is shown to help prevent re-offending, rather than so quickly restricting their ability to post.” The goal is fairer and more effective content moderation on Facebook.

Biden Challenges Big Tech, Calls for Children’s Online Safety

President Biden’s second State of the Union speech Tuesday night included calls for stronger consumer privacy protections and tougher antitrust laws in direct challenge to what many perceive as the unchecked power of Big Tech. “Pass bipartisan legislation to strengthen antitrust enforcement and prevent big online platforms from giving their own products an unfair advantage,” Biden stated, urging Congress to “stop Big Tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data these companies collect on all of us.”

UK Online Safety Bill to Exert Pressure on Social Media Execs

British legislators seem ready to make good on a threat to add criminal liability and jail time for high-level social media executives who fail to protect children from online harm as part of the Online Safety Bill. While the bill also aims to protect adults from fraud and malfeasance, its strictest provisions are geared toward child protection. The current proposal could win approval by the House of Commons within the week, and would then move to the upper chamber, the House of Lords, later in the quarter for further revision. Enactment is anticipated by year’s end.

Advocacy Groups Seek to Enact Online Rules to Protect Kids

A coalition of more than 20 advocacy groups with an interest in child safety is petitioning the Federal Trade Commission to prohibit social media platforms including TikTok, as well as online games and other services, from bombarding kids with ads and using other tactics that may hook children online. Regulators are being lobbied to prevent online services from offering minors “low-friction rewards” — unpredictably granting positive reinforcement for scrolling, tapping or logging on in order to prolong use. The groups say it is the same technique slot machine makers use to keep gamblers engaged.

Online Safety Act Paused as Ofcom Reports on Net Neutrality

UK watchdog Ofcom has proposed a loosening of the nation’s net neutrality rules so as to not unduly restrict innovation and development. While it is up to government and Parliament to change the law, recommendations from Ofcom — which was created to monitor compliance with net neutrality laws — are influential. “Since the current rules were put in place in 2016, there have been significant developments in the online world, including a surge in demand for capacity,” the regulator said, pointing also to the rollout of 5G and the emergence of large players like Netflix and Amazon Prime.

California’s Online Child Safety Bill Could Set New Standards

A first-of-its-kind U.S. proposal to protect children online cleared the California Legislature Tuesday and was sent to the desk of Governor Gavin Newsom. The California Age-Appropriate Design Code Act will require social media platforms to implement guardrails for users under 18. The new rules will curb risks — such as allowing strangers to message children — and require changes to recommendation algorithms and ad targeting where minors are concerned. The bill was drafted following Facebook whistleblower Frances Haugen’s 2021 congressional testimony about the negative effects of social media on children’s mental health.

TikTok Draws Criticism for Undisclosed Sponsored Content

TikTok is facing blowback for lax advertising disclosures. While the platform offers various ways to identify paid promotion, its marketing policies appear to operate on an honor system, and while some creators label their posts as advertising or partnerships, many do not. Where a financial relationship exists with regard to products mentioned, truth-in-advertising rules enforced by the Federal Trade Commission and state attorneys general require disclosure that money is changing hands. As part of a renewed national interest in digital consumer protections, particularly related to child safety, the area is getting increased scrutiny.

EU Checks Power of Big Tech with Digital Services Regulation

The European Parliament has adopted two digital acts, one focused on leveling the competitive playing field, the other on protecting consumer rights online. The Digital Markets Act and the Digital Services Act are both expected to take effect this fall, after the European Commission signs off. “We are finally building a single digital market, the most important one in the ‘free world,’” EU commissioner for the internal market Thierry Breton said Tuesday. “The same predictable rules will apply, everywhere in the EU, for our 450 million citizens, bringing everyone a safer and fairer digital space.”

Meta Adding Parent Controls for Instagram and Virtual Reality

Meta Platforms is beginning to implement parental controls on Instagram and Quest. Last week, Instagram added a Family Center that will eventually expand to allow parents and guardians to “help teens manage experiences across Meta technologies from one central place.” Meta says parental controls will be added to Quest VR in May, and hinted that others, like Facebook, are queued up to join. The Family Center will allow parents to monitor how much time their teens spend on Instagram, setting limits if they choose. Additionally, accounts teens follow and accounts following them will be trackable.

Senate Tells Instagram CEO the ‘Time for Self-Policing is Over’

Instagram CEO Adam Mosseri spent more than two hours in the Senate hot seat last week, answering questions about the platform’s safety policies and impact on teens’ mental health. A bipartisan phalanx grilled the executive on topics ranging from algorithms to eating disorders. Mosseri, who was appearing before Congress for the first time, defended his social platform, a division of Meta Platforms, which also owns Facebook. He resisted pressure to throw in the towel on launching an Instagram for kids, telling lawmakers only that no child would have access to such a platform “without their explicit parental consent.”