Roblox Tightens Child Safety Guidelines Amidst Media Outcry

After a barrage of media reports citing unsafe conditions for minors, Roblox is capitulating to outside pressure and implementing new safeguards. Parents can now access parental controls from their own devices, in addition to their child’s device, and monitor their child’s screen time. New content labels and improvements to how users under age 13 can communicate on Roblox are additional protections now baked into the platform. “We’ve spent nearly two decades building strong safety systems, but we are always evolving our systems as new technology becomes available,” Roblox explained.

Roblox’s New Child Safety Measures Target Hangout Spaces

Online gaming platform Roblox has banned kids from “social hangout” spaces — areas that feature communication through voice chat or text and offer “free-form 2D user creation” experiences where users can do things like share drawings. Roblox has also added safeguards to prevent those under the age of 13 from playing, searching or discovering unrated games. Roblox has imposed the restrictions following allegations that it has failed to protect its younger users. This is the latest in a string of updates by social platforms imposing guardrails designed to protect young users as lawmakers turn up the heat on child online safety.

Instagram Sets Its New ‘Teen Accounts’ to Private by Default

Nine months after lawmakers grilled social networks for exposing children to harm, Meta Platforms has announced that Instagram’s teen accounts will be set to “private” by default. Instagram Teen Accounts have built-in protections, limiting who can contact the underage account holders as well as the content they see. “We’ll automatically place teens into Teen Accounts, and teens under 16 will need a parent’s permission to change any of these settings to be less strict,” Meta revealed in a blog post. To avoid leaving teens feeling their wings were clipped, Meta says there will also be new features designed expressly for them.

YouTube Adds Family Center, Parent Insights on Teen Viewing

YouTube is adding a Family Center hub along with a feature that allows parents to link their accounts to those of their teen children for insight on child use patterns. Linked parents will receive alerts with aggregated information about things like the number of new uploads, subscriptions and comments, or when a teen starts a live stream. What they won’t get are details about the content itself. YouTube calls it “a collaborative approach to teen supervision on YouTube.” The move comes as federal and state legislators get more aggressive about regulating online safety for minors.

Judge Blocks Sections of a Texas Law Meant to Protect Minors

A federal judge has partially blocked a new Texas law, disallowing requirements that social platforms identify minors and filter content for their safety. The court ruled that the Securing Children Online Through Parental Empowerment (SCOPE) Act, signed last year, threatens free speech through its “monitoring and filtering” requirements, and issued a temporary injunction on that basis. Under the law, registered users under 18 are subject to limited data collection, targeted advertising bans and parental consent requirements for financial transactions. SCOPE would affect a range of online services, with large social platforms a focus.

Snapchat Puts Focus on Teen Safety Resources for Teachers

In an effort to create a safer environment for teens, social platform Snapchat is providing educators with resources to familiarize them with the app and help them understand how students use it. The company has launched a website called “An Educator’s Guide to Snapchat.” The announcement, timed to the start of the new school year, comes as lawmakers have been pressuring social networks to do more to protect children, with Florida and Indiana going so far as to enact school cell phone bans. Legislators in California and New York have been exploring similar prohibitions.

U.S. Raises Stakes in TikTok Legal Battle, Suing Under COPPA

The U.S. Department of Justice has filed suit against TikTok and its parent company, ByteDance, charging that they violated the Children’s Online Privacy Protection Act (COPPA) by allowing children to create TikTok accounts without parental consent and by collecting their data. The suit also alleges TikTok retained the personal data of minors who joined prior to COPPA going into effect in 2000, even after parents demanded it be deleted, a right under COPPA. This latest move in the ongoing legal battle with ByteDance follows the Chinese company’s own lawsuit against the U.S. government.

Popular Messaging App Banned from Servicing Young Users

Federal regulators have taken the unprecedented step of banning the NGL messaging platform from providing service to users under 18. The action is part of a legal settlement between NGL Labs, the Federal Trade Commission and the Los Angeles District Attorney’s Office. NGL, whose niche is “anonymous” communication and features the tagline “Ask me anything,” has also agreed to pay $5 million in fines. An FTC investigation found that in addition to fraudulent business claims about divulging the identities of message senders for a fee, NGL also falsely claimed it used artificial intelligence to filter out cyberbullying and harmful messages.

U.S. Surgeon General Calls for Social Media Warning Labels

United States Surgeon General Dr. Vivek Murthy has renewed his push for Congress to enact social media warning labels advising of potential mental health harm to adolescents. Murthy also called on tech companies to be more transparent with internal data on the impact of their products on American youth, requesting independent safety audits and restrictions on features that may be addictive, including autoplay, push notifications and infinite scroll, which he suggests “prey on developing brains and contribute to excessive use.” His federal campaign joins a groundswell of local laws restricting minors’ access to social media.

Florida Enacts the Nation’s Most Restrictive Social Media Law

Florida Governor Ron DeSantis has signed a bill into law preventing children under 14 from creating new social media accounts, and requiring platforms to delete existing accounts, with no opportunity for parental consent. For children 14 to 15 years of age, consent of a parent or guardian is required to create or maintain accounts. Without it, or upon request, the accounts and personal data must be deleted, with fines of up to $50,000 per incident per platform. The law, set to take effect in January 2025, is being called the most restrictive passed by any state and is sure to face First Amendment scrutiny in the courts.

Florida Pushes Forward a Social Media Ban for Kids Under 16

Florida’s legislature has passed a bill banning children younger than 16 from having social media accounts despite some pushback from Governor Ron DeSantis, who said he will be wrestling with whether to sign the measure into law. Due to a procedural requirement, DeSantis will have to sign or veto the proposed legislation before lawmakers conclude the current session in a matter of weeks. He has expressed dissatisfaction with the lack of a provision to let parents override the restriction, which would curtail access to the most popular sites, potentially impacting TikTok, Instagram, Facebook, Snapchat and YouTube.

OpenAI Partners with Common Sense Media on AI Guidelines

As parents and educators grapple with figuring out how AI will fit into education, OpenAI is preemptively acting to help answer that question, teaming with learning and child safety group Common Sense Media on informational material and recommended guidelines. The two will also work together to curate “family-friendly GPTs” for the GPT Store that are “based on Common Sense ratings and standards,” the organization said. The partnership aims “to help realize the full potential of AI for teens and families and minimize the risks,” according to Common Sense.

New York City Classifies Social Media a ‘Public Health Threat’

New York has become the first city in the nation to designate social media use by young children a public health crisis. In a State of the City address, Mayor Eric Adams name-checked TikTok, YouTube and Facebook, calling them (and “companies like” them) “addictive and dangerous.” Adams referenced last week’s advisory from the city’s Department of Health as “officially designating social media as a public health hazard in New York City.” The advisory urges adults to establish “tech free times” for kids and to delay smartphone access until age 14.

FTC Seeks to Bolster COPPA So Firms Can’t Surveil Children

The Federal Trade Commission has proposed new rules to strengthen the Children’s Online Privacy Protection Act (COPPA), further limiting the collection of children’s data, particularly by companies that seek to monetize the information through targeted advertising. FTC Chair Lina Khan says the proposed changes aim to prevent tech firms “from outsourcing their responsibilities to parents” when it comes to ensuring privacy for children’s data. The FTC says it has issued fines totaling hundreds of millions of dollars to Google’s YouTube and, to a lesser extent, ByteDance’s TikTok for mishandling the data of children under 13.

Second Meta Whistleblower Testifies to Potential Child Harm

A second Meta Platforms whistleblower has come forward, testifying this week before a Senate subcommittee that the company’s social networks were potentially harming teens and that his warnings to that effect were ignored by top leadership. Arturo Bejar, a Facebook engineering director from 2009 to 2015 and an Instagram consultant from 2019 to 2021, told the Senate Judiciary Subcommittee on Privacy, Technology and Law that Meta officials failed to take steps to protect underage users on the platforms. Bejar follows former Facebook whistleblower Frances Haugen, who provided explosive Senate testimony in 2021.