New Jersey and Pennsylvania attorneys general to take part in multi-state probe of TikTok’s impacts
New Jersey is co-leading a multi-state probe into the impact of TikTok on its youngest users, Acting Attorney General Matthew Platkin said on Thursday. Pennsylvania Attorney General Josh Shapiro later announced that the state would join the investigation.
Several attorneys general throughout the United States launched the investigation to determine whether TikTok’s promotional material and recommendation algorithm violate consumer protection laws, particularly by inducing young people to use the app for long periods of time.
“Many parents and child advocates are rightfully concerned about the impact of social media usage on young people’s safety and wellbeing,” said Platkin. “If social media platforms like TikTok and Instagram are violating our laws and exposing young users to psychological and physical harms, we will hold them accountable.”
Other states taking part in the probe include California, Florida, Kentucky, Massachusetts, Nebraska, Tennessee, and Vermont, along with a number of others that have pledged support.
The investigation focuses primarily on TikTok’s elevated levels of engagement among minors and young adults, and on whether the mental health harms these young people face are intentional on the part of the app’s creators.
TikTok spokesperson Mahsau Culliane told NJ.com that TikTok’s leadership will cooperate with the investigation, adding that the app’s creators care about improving user experience and making it safe for its entire user base.
“Our children are growing up in the age of social media — and many feel like they need to measure up to the filtered versions of reality that they see on their screens,” said California Attorney General Rob Bonta. “We know this takes a devastating toll on children’s mental health and well-being. But we don’t know what social media companies knew about these harms and when.”
TikTok has, in many ways, revolutionized social media, as shown in part by other platforms working to emulate the short-form video app’s opaque recommendation algorithm. Little has been published about how the algorithm works, though it has proven effective at keeping users engaged for long periods of time.
In June 2020, TikTok published a blog post detailing how it suggests video content on users’ “For You” pages. The signals include user interactions, such as the accounts a user follows, the videos they have liked or scrolled past quickly, the videos they make themselves, and the comments they leave on other users’ videos.
Other facets of the platform’s recommendation system include video information, such as hashtags, sounds, and captions, as well as user information. TikTok’s system also lets users manually indicate that they are not interested in a particular video, and similar content is then shown to them less often.
In the post, TikTok details an inherent issue with the recommendation system: it can create a “filter bubble,” which works much like an echo chamber, serving content skewed directly toward what a user already knows and believes.
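The kind of signal-weighted scoring described above can be illustrated with a toy sketch. To be clear, everything here is an illustrative assumption: the field names, weights, and scoring formula are invented for explanation and bear no relation to TikTok’s actual implementation, which is not public.

```python
# Toy sketch of a signal-weighted recommendation score, loosely modeled on
# the categories of input described in TikTok's 2020 blog post (user
# interactions, video information, a "not interested" option). All weights
# and names below are hypothetical, not TikTok's real system.
from dataclasses import dataclass, field

@dataclass
class Video:
    creator: str
    hashtags: set
    watch_completion: float  # fraction of the video the user watched, 0.0-1.0
    liked: bool
    commented: bool

@dataclass
class UserProfile:
    followed_creators: set
    interested_hashtags: set
    not_interested_hashtags: set = field(default_factory=set)

def score(video: Video, user: UserProfile) -> float:
    """Combine interaction and video-information signals into one number."""
    s = 0.0
    s += 2.0 if video.creator in user.followed_creators else 0.0
    s += 1.0 * len(video.hashtags & user.interested_hashtags)
    s += 3.0 * video.watch_completion          # lingering is a strong signal
    s += 1.5 if video.liked else 0.0
    s += 1.0 if video.commented else 0.0
    # A "not interested" tap heavily downweights similar content.
    if video.hashtags & user.not_interested_hashtags:
        s *= 0.2
    return s
```

Ranking candidate videos by such a score also makes the “filter bubble” mechanic visible: content matching a user’s existing interests is systematically scored higher, so it is what keeps being shown.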
TikTok also reviews some content manually to ensure that videos depicting graphic or explicit material are removed quickly after being uploaded.
In reality, though, young users are often exposed to that content anyway. A Wall Street Journal investigation in 2021 used a bot disguised as a 13-year-old user to see whether the app would show “adult-themed” content despite the age listed on the account.
The longer the bot lingered on videos with sexual content meant for adults, the more such videos appeared on its “For You” page. TikTok told The Wall Street Journal that it did not differentiate between minor and adult accounts at the time, but was working to implement additional filtering on the app.
In December 2021, TikTok updated its recommendation guidelines, making some adult-themed video content ineligible for recommendation altogether.
In February, the app’s creators updated its Community Guidelines further, attempting to address an influx of complaints that the app’s recommendation system was exposing users to eating-disorder and self-harm triggers.
A similar multi-state investigation was launched in November 2021 into Meta’s consumer protections for young people on both Facebook and Instagram.
In May 2021, 44 state attorneys general penned a letter to Meta CEO Mark Zuckerberg, urging the company to abandon its plans to create an Instagram platform for children under 13.