By Paula Parisi, September 6, 2024
YouTube is adding a Family Center hub along with a feature that allows parents to link their accounts to those of their teen children for insight into their teens' usage patterns. Linked parents will receive alerts with aggregated information about things like the number of new uploads, subscriptions and comments, or when a teen starts a live stream. What they won’t get are details about the content itself. YouTube calls it “a collaborative approach to teen supervision on YouTube.” The move comes as federal and state legislators get more aggressive about regulating online safety for minors. Continue reading YouTube Adds Family Center, Parent Insights on Teen Viewing
By Paula Parisi, September 3, 2024
In an effort to create a safer environment for teens, social platform Snapchat is providing educators with resources to familiarize them with the app and help them understand how students use it. The company has launched a website called “An Educator’s Guide to Snapchat.” The announcement, timed to the start of the new school year, comes as lawmakers have been pressuring social networks to do more to protect children, with Florida and Indiana going so far as to enact school cell phone bans. Legislators in California and New York have been exploring similar prohibitions. Continue reading Snapchat Puts Focus on Teen Safety Resources for Teachers
By Paula Parisi, June 19, 2024
United States Surgeon General Dr. Vivek Murthy has renewed his push for Congress to enact a social media warning label advising of potential mental health damage to adolescents. Murthy also called on tech companies to be more transparent with internal data on the impact of their products on American youth, requesting independent safety audits and restrictions on features that may be addictive, including autoplay, push notifications and infinite scroll, which he suggests “prey on developing brains and contribute to excessive use.” His federal campaign joins a groundswell of local laws restricting minors’ access to social media. Continue reading U.S. Surgeon General Calls for Social Media Warning Labels
By ETCentric Staff, March 27, 2024
Florida Governor Ron DeSantis has signed a bill into law preventing children under 14 from creating new social media accounts, and requiring platforms to delete existing accounts, with no opportunity for parental consent. For children 14 to 15 years of age, consent of a parent or guardian is required to create or maintain accounts. Without it, or upon request, the accounts and personal data must be deleted, with fines of up to $50,000 per incident per platform. The law, set to take effect in January 2025, is being called the most restrictive passed by any state and is sure to face First Amendment scrutiny by the courts. Continue reading Florida Enacts the Nation’s Most Restrictive Social Media Law
By Paula Parisi, January 29, 2024
New York has become the first city in the nation to designate a public health crisis with regard to use of social media by young children. In a State of the City address, Mayor Eric Adams name-checked TikTok, YouTube and Facebook, calling them (and “companies like” them) “addictive and dangerous.” Adams referenced last week’s advisory from the city’s Department of Health as “officially designating social media as a public health crisis hazard in New York City.” The advisory urges adults to establish “tech free times” for kids, and delay smartphone access until age 14. Continue reading New York City Classifies Social Media a ‘Public Health Threat’
By Paula Parisi, October 26, 2023
Meta Platforms has been sued in federal court by 33 states, including California and New York, that claim its Instagram and Facebook platforms addict and harm children. It is the most sweeping state action to date to contend with the impact of social media on the mental health of children. The suit, filed Tuesday in U.S. District Court for the Northern District of California, alleges Meta violates consumer protection laws by targeting children and deceiving users about platform safety. Also that day, the District of Columbia and eight states filed separate complaints addressing the same issues. Continue reading Dozens of States Sue Meta for Social Media Addiction in Kids
By Paula Parisi, July 31, 2023
The Senate has cleared two children’s online safety bills despite pushback from civil liberties groups that say the digital surveillance used to monitor behavior will result in an Internet less safe for kids. The Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) are intended to address a mental health crisis experts blame in large part on social media, but critics say the bills could cause more harm than good by forcing social media firms to collect more user data as part of enforcement. The bills — which cleared the Senate Commerce Committee by unanimous vote — are also said to reduce access to encrypted services. Continue reading Government Advances Online Safety Legislation for Children
By Paula Parisi, June 14, 2023
A bill passed by the Louisiana State Legislature that bans minors from creating social media accounts without parental consent is the latest in a string of legal measures that take aim at the online world to combat a perceived mental health crisis among America’s youth. Utah also recently passed a law requiring consent of a parent or guardian when anyone under 18 wants to create a social account. And California now mandates some sites default to the highest privacy for minor accounts. The Louisiana legislation stands out as extremely restrictive, encompassing multiplayer games and video-sharing apps. Continue reading Louisiana Approves Parental Consent Bill for Online Accounts
By Paula Parisi, May 1, 2023
A bipartisan bill introduced in the Senate last week seeks to establish a federal age limit for using social media that would prohibit children 12 and under from creating their own accounts as a way to prevent them from independently logging on to social platforms. The Protecting Kids on Social Media Act takes issue with the engagement algorithms Big Tech uses to keep kids glued to their sites and would limit the type of coding that could be deployed to target young users between the ages of 13 and 17. If not logged into an account, users under 13 could still access other online content. Continue reading New Federal Bill Would Restrict Social Media Use for Minors
By Paula Parisi, September 20, 2022
Governor Gavin Newsom signed the California Age-Appropriate Design Code Act into law last week, making his state the first in the nation to adopt online child safety measures. The bipartisan legislation requires online platforms to default to privacy and safety settings that protect children’s mental and physical health. The new law, cosponsored by Assemblymembers Buffy Wicks (D-15th District) and Jordan Cunningham (R-35th District), prohibits companies that provide online services and products in California from using a child’s personal information and forbids collecting, selling, or retaining a child’s geolocation, among other things. Continue reading California Governor Signs Online Child Protection Bill into Law
By Paula Parisi, September 1, 2022
A first-of-its-kind U.S. proposal to protect children online cleared the California Legislature Tuesday and was sent to the desk of Governor Gavin Newsom. The California Age-Appropriate Design Code Act will require social media platforms to implement guardrails for users under 18. The new rules will curb risks — such as allowing strangers to message children — and require changes to recommendation algorithms and ad targeting where minors are concerned. The bill was drafted following Facebook whistleblower Frances Haugen’s 2021 congressional testimony about the negative effects of social media on children’s mental health. Continue reading California’s Online Child Safety Bill Could Set New Standards
By Paula Parisi, June 17, 2022
Online privacy protections for consumers are in focus on Capitol Hill, with the Kids Online Safety Act (KOSA) getting particular attention. A coalition of more than 100 organizations, including Fairplay and the American Psychological Association, is calling on senators to advance KOSA this month. Co-sponsored by senators Richard Blumenthal (D-Connecticut) and Marsha Blackburn (R-Tennessee), the legislation would require social media platforms to conduct annual audits to identify risks to minors, as well as take more concrete steps like letting users opt out of algorithmic recommendations and disable “addictive” features. Continue reading Online Child Safety Gains Steam at State and Federal Levels
By Paula Parisi, February 28, 2022
Taiwanese electronics company HTC has introduced a new Vive Guardian feature for its popular VR headset, the HTC Vive. The safeguard is designed to limit access to apps while children are cavorting in the metaverse, and experts say it’s a much-needed step in an environment that thus far lacks kid profiles and parental safety settings. HTC, Meta Platforms and others suggest VR be used only by those over the age of 13, but at this point, it’s only a recommendation, and calls are already amplifying to put child safety measures in place. Continue reading HTC Adds Vive Guardian to Protect Kids in Volatile Metaverse
By Debra Kaufman, January 4, 2022
According to CTA vice president of research Steve Koenig’s “Tech Trends to Watch” presentation at CES in Las Vegas, developments in 2022 will emerge from the transportation, space tech, sustainable technology and digital health sectors. Innovations will include electric vehicles, micro-mobility solutions and space tourism as well as alternative power sources, smart cities and homes and, in digital health, an increased use of wearables as well as an emphasis on solutions for mental health. Last year also saw historic highs of consumer demand in a wide variety of sectors. Continue reading CES: CTA’s Research VP Steve Koenig on ‘Trends to Watch’
By Paula Parisi, December 20, 2021
TikTok is tweaking its For You feed so users won’t be recommended too much of the same content. The move follows a global spate of regulatory hearings that took social media companies to task for promoting content that intentionally polarizes adults and potentially harms children. In addition to “diversifying recommendations” in order to “interrupt repetitive patterns” around topics that may provide negative reinforcement, like “loneliness or weight loss,” the popular ByteDance platform said it is also introducing a new feature that will allow users to avoid specific topics or creators. Continue reading TikTok Adjusts Feed to Curb Repetition, Offers Users Control