The war over TikTok has many battle lines. A Biden-era law aimed at national security concerns would have banned TikTok as of January 19, 2025 unless its U.S. operations found an American owner. Congress passed the Protecting Americans from Foreign Adversary Controlled Applications Act to protect the data of U.S. users of the social media platform from the Chinese government and to deter its ability to manipulate TikTok’s content. A deal to spin those operations into an entity majority-controlled by U.S. investors is ostensibly in place.
But that deal with ByteDance, Ltd., the China-based parent company of TikTok, is far from the end of its U.S. legal entanglements. More than a dozen Attorneys General across the country have sued ByteDance under their states’ consumer protection laws, challenging what they allege is TikTok’s deliberate targeting and addicting of teen users. North Carolina’s suit, State of N.C. v. TikTok, Inc., 2025 NCBC 47, recently survived spirited motion-to-dismiss practice in the Business Court.
TikTok’s pervasive influence in the United States fueled the interests of Congress and the AGs. A recent Pew Research Center study showed that 43% of adults under 30 said they regularly got news on TikTok – up from only 9% five years ago. North Carolina’s complaint alleges that nearly a million North Carolina teens, and hundreds of thousands of younger children, used TikTok as of 2023. As Judge Conrad put it, “It would be hard to overstate TikTok’s popularity, especially with minors.” Id. ¶ 5.
North Carolina’s suit advances an unfair and deceptive trade practices claim with two principal thrusts: that ByteDance “unfairly designed TikTok to be addictive to minors despite knowledge that compulsive use harms them” and that it misrepresented alleged safety features and their benefits for young users. Id. ¶ 14. The State’s design claims center on the reduced capacity of minors to resist an algorithm that feeds videos based on past interactions, features like “infinite scroll” that direct a constant stream of videos to users, and various filters and features that allow users to augment their videos, share with others, and amass reactions from other users. Id. ¶¶ 7-8.
First Amendment: The Court rejected ByteDance’s claim that the operation of an algorithm reflected the kind of “editorial discretion” that is regularly protected as free speech. ByteDance argued that it had constitutional protection for its selection, organization, and display of videos to users. But the Court sided with the State’s view that “features that induce compulsive use” are not expressive activity protected under the First Amendment. Judge Conrad reasoned that it was “hard to discern any expressive activity” where an algorithm acted as a content-neutral delivery engine rather than one that steered videos based “on the semantic nature of the content itself.” Id. ¶¶ 49-50.

The Supreme Court didn’t reach the algorithm issue in its recent decision in Moody v. NetChoice, LLC, 603 U.S. 707, 716 (2024), which conceded that “some [social-media] platforms, in at least some functions, are indeed engaged in expression” protected by the First Amendment. Yet the Business Court observed that because the algorithm “does not convey a message” by TikTok but “simply bows to user preferences and propensities,” the complaint fairly pled a lack of expressive activity. The complaint, accepted as true, “supports an inference that a reasonable person would understand TikTok’s video feed to reflect a given user’s content choices as opposed to ByteDance’s own creative expression or editorial judgment.” Id. ¶¶ 47, 50. Perhaps discovery will show whether any “reasonable” people among TikTok’s legions of news-consuming users believe that TikTok exercises at least the same level of editorial judgment about the selection, placement, and delivery of that news as do news-aggregation sites that link to others’ content in ways designed to encourage clicks.
Unfair and Deceptive Trade Practices: The Court had little trouble in finding the State had adequately pled a Chapter 75 claim. It noted that numerous other states had alleged similar consumer protection claims, and that “one court after another has concluded that designing an app to induce addictive, compulsive use by minors comfortably qualifies as an unfair practice.” Id. ¶ 59.
Judge Conrad rejected ByteDance’s “free will” argument: that users could decide whether to use TikTok, and how often. The Court found that argument an ill fit for the complaint’s allegations that ByteDance targeted “the unique vulnerabilities that accompany youthful immaturity” and did so “allegedly knowing that addiction causes social and psychological harms to minors.” Id. ¶ 60.
Worth Noting
- The Business Court followed recent federal decisions that have limited the immunity that platforms like TikTok enjoy under 47 U.S.C. § 230(c)(1), which lets them avoid being treated as the “publisher” of the content their users post. The Court construed Section 230’s shield as not warding off claims that TikTok was designed to be addictive to minors. That, the Court held, permissibly treats ByteDance “as a product designer, not a publisher.” Id. ¶ 40.
Brad Risinger is a partner in the Raleigh office of Fox Rothschild LLP.
