By Paula Parisi, November 11, 2024
Online gaming platform Roblox has banned kids from “social hangout” spaces — areas that feature communication through voice chat or text and offer “free-form 2D user creation” experiences where users can do things like share drawings. Roblox has also added safeguards to prevent those under the age of 13 from playing, searching or discovering unrated games. The restrictions follow allegations that Roblox has failed to protect its younger users, and they make it the latest in a string of social platforms to impose guardrails designed to protect young users as lawmakers turn up the heat on child online safety. Continue reading Roblox’s New Child Safety Measures Target Hangout Spaces
By Rob Scott, November 7, 2024
The government of Canada has ordered social video app TikTok to shut down its business operations in the country, following a national security review under the Investment Canada Act that examined potential risks posed by TikTok and parent ByteDance. “The government is not blocking Canadians’ access to the TikTok application or their ability to create content,” explains François-Philippe Champagne, minister of innovation, science and industry. “The decision to use a social media application or platform is a personal choice.” Canada previously banned the TikTok app from official government devices, while the U.S. passed a law that could also ban the app. Continue reading Canada Orders TikTok to Shut Down Its Business Operations
By Paula Parisi, September 20, 2024
Nine months after lawmakers grilled social networks for exposing children to harm, Meta Platforms has announced that Instagram’s teen accounts will be set to “private” by default. Instagram Teen Accounts have built-in protections, limiting who can contact the underage account holders as well as the content they see. “We’ll automatically place teens into Teen Accounts, and teens under 16 will need a parent’s permission to change any of these settings to be less strict,” Meta revealed in a blog post. To avoid leaving teens feeling their wings were clipped, Meta says there will also be new features designed expressly for them. Continue reading Instagram Sets Its New ‘Teen Accounts’ to Private by Default
By Paula Parisi, September 3, 2024
In an effort to create a safer environment for teens, social platform Snapchat is providing educators with resources to familiarize them with the app and help them understand how students use it. The company has launched a website called “An Educator’s Guide to Snapchat.” The announcement, timed to the start of the new school year, comes as lawmakers have been pressuring social networks to do more to protect children, with Florida and Indiana going so far as to enact school cell phone bans. Legislators in California and New York have been exploring similar prohibitions. Continue reading Snapchat Puts Focus on Teen Safety Resources for Teachers
By Paula Parisi, August 6, 2024
The U.S. Department of Justice has filed suit against TikTok and its parent company, ByteDance, charging that they violated the Children’s Online Privacy Protection Act (COPPA) by allowing children to create TikTok accounts without parental consent and by collecting their data. The suit also alleges TikTok retained the personal data of minors even after parents demanded it be deleted, a right under COPPA, which went into effect in 2000. This latest move in the ongoing legal battle with ByteDance follows the Chinese company’s own lawsuit against the U.S. government. Continue reading U.S. Raises Stakes in TikTok Legal Battle, Suing Under COPPA
By Rob Scott, August 1, 2024
Two landmark bills designed to bolster online safety for children — the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) — were overwhelmingly approved by the U.S. Senate on Tuesday in a bipartisan 91-3 vote. If approved by the House, the legislation would introduce new rules regarding what tech companies can offer to minors and how those firms use and share children’s data. The three senators who voted against the bills cited concerns that the regulations could stifle free speech, open the door to government censorship, and fail to adequately address the greatest threats to children online. Continue reading Senate Passes Two Bills to Strengthen Children’s Online Safety
By Paula Parisi, July 23, 2024
Streaming rose to a 40.3 percent share of TV viewing in June, setting a record as it nudged past the previous single-category high of 40.1 percent, set by cable in June 2021. It is the highest share ever reported in the three years since Nielsen debuted its monthly measurement report The Gauge. Google’s YouTube and Fox’s Tubi both claimed personal bests, respectively hitting 9.9 and 2.0 percent of TV viewing. Four streaming platforms achieved double-digit usage growth: Disney+ (+14.8 percent), Tubi (+14.7 percent), Netflix (+11.8 percent) and Max (+11.0 percent) — each with 20 percent or more of that growth attributable to younger viewers. Continue reading Nielsen: Streaming Reps 40 Percent Share of June TV Viewing
By ETCentric Staff, February 27, 2024
Florida’s legislature has passed a bill banning children younger than 16 from having social media accounts despite some pushback from Governor Ron DeSantis, who said he will be wrestling with whether to sign the measure into law. Due to a procedural requirement, DeSantis will have to sign or veto the proposed legislation before lawmakers conclude the current session in a matter of weeks. He has expressed dissatisfaction with the lack of a provision to let parents override the restriction, which would curtail access to the most popular sites, potentially impacting TikTok, Instagram, Facebook, Snapchat and YouTube. Continue reading Florida Pushes Forward a Social Media Ban for Kids Under 16
By Paula Parisi, January 29, 2024
New York City has become the first city in the nation to declare the use of social media by young children a public health crisis. In a State of the City address, Mayor Eric Adams name-checked TikTok, YouTube and Facebook, calling them (and “companies like” them) “addictive and dangerous.” Adams referenced last week’s advisory from the city’s Department of Health as “officially designating social media as a public health crisis hazard in New York City.” The advisory urges adults to establish “tech free times” for kids and to delay smartphone access until age 14. Continue reading New York City Classifies Social Media a ‘Public Health Threat’
By Paula Parisi, November 29, 2023
The California Privacy Protection Agency (CPPA) is preparing new regulations to protect consumers from the ways businesses may use AI. The state regulator, whose rulings have an outsized influence on Big Tech given the many large firms headquartered there, has issued draft rules for how consumer data can be used in what it calls “automated decisionmaking technology,” or ADMT. The proposed regulations give consumers the right to opt out of ADMT and entitle them to on-demand information about how AI is interacting with their data and how businesses plan to use it. Continue reading California Privacy Protection Agency Issues Draft Rules for AI
By Paula Parisi, November 21, 2023
Samsung TV Plus reports it has seen enthusiastic consumer use over the past year, with a 60 percent rise in global viewership. Accordingly, the TV maker is upgrading its free streaming service — available on Galaxy devices, Samsung Smart TVs, Smart Monitors and Family Hub appliances and on the Web — with an emphasis on discoverability for kids and music programming. Launched in 2015, the free ad-supported TV (FAST) and ad-based video on-demand (AVOD) service offers content spanning news, sports, entertainment, music, and more, in 24 countries where it is accessed on 535 million TV and mobile devices. Continue reading Samsung TV Plus Hits Refresh on a 60 Percent Viewer Surge
By Paula Parisi, November 9, 2023
A second Meta Platforms whistleblower has come forward, testifying this week before a Senate subcommittee that the company’s social networks were potentially harming teens and that his warnings to that effect were ignored by top leadership. Arturo Bejar, a Facebook engineering director from 2009 to 2015 and an Instagram consultant from 2019 to 2021, told the Senate Judiciary Subcommittee on Privacy, Technology, and the Law that Meta officials failed to take steps to protect underage users on the platforms. Bejar follows former Facebook whistleblower Frances Haugen, who provided explosive Senate testimony in 2021. Continue reading Second Meta Whistleblower Testifies to Potential Child Harm
By Paula Parisi, October 26, 2023
Meta Platforms has been sued in federal court by 33 states, including California and New York, that claim its Instagram and Facebook platforms addict and harm children. The suit is to date the most sweeping state action to contend with the impact of social media on the mental health of children. Filed Tuesday in U.S. District Court for the Northern District of California, it alleges Meta violates consumer protection laws by targeting children and deceiving users about platform safety. That same day, the District of Columbia and eight states filed separate complaints addressing the same issues. Continue reading Dozens of States Sue Meta for Social Media Addiction in Kids
By Rob Scott, September 15, 2023
Ireland’s Data Protection Commission (DPC) today announced a fine of about $368 million against TikTok based on how the popular social platform processes the data of younger users. The DPC announced in 2021 that it was investigating TikTok’s compliance with the European Union’s General Data Protection Regulation (GDPR) privacy and security laws. The investigation identified specific problems with TikTok’s default account settings, its Family Pairing settings, and its age verification process (although the age verification model did not violate GDPR, the probe found that TikTok did not sufficiently protect the privacy of children under 13 who were able to create an account). Continue reading Ireland Fines TikTok $368 Million for Mishandling of User Data
By Paula Parisi, August 21, 2023
Illinois has become the first state in the nation to pass legislation protecting children who are social media influencers. Beginning in July 2024, children under 16 who appear in monetized video content online will have a legal right to compensation for their work, even if that means litigating against their parents. “The rise of social media has given children new opportunities to earn a profit,” Illinois State Senator David Koehler said about the bill he sponsored. “Many parents have taken this opportunity to pocket the money, while making their children continue to work in these digital environments.” Continue reading Illinois Law Protecting Child Vloggers Will Take Effect in 2024