FACTS: Epic Games, Inc. (Epic) (defendant) developed the successful video game Fortnite. From its launch in 2017, Fortnite was popular among children and young teens. Epic leaned into that popularity by, among other things, using cartoon-like graphics and colorful animation, avoiding blood or gore, and emphasizing building features that allowed players to construct forts and other creations within the game. Additionally, Epic partnered with musicians and celebrities popular among children and teens to promote the game and created a licensing plan focused on Fortnite-branded goods targeting youth, such as toys, clothes, and Halloween costumes.

Fortnite was an interactive game. Players, including children and teens, were linked with strangers around the world to play together. Players’ account names were publicly broadcast, and by default, real-time voice and text chat was turned on to encourage communication between players. As a result, some children and teens were bullied, threatened, and harassed within the game, sometimes sexually, and others were exposed to traumatizing topics like suicide and self-harm.

Although Epic targeted Fortnite at children and knew that most players were indeed children, Epic consistently deprioritized and delayed the implementation of privacy and parental controls. When Fortnite launched, it had no parental controls and minimal privacy settings. Parents did not receive any notice describing Epic’s collection, use, and disclosure of children’s personal information, nor did Epic obtain verifiable parental consent for that data collection.

Between 2017 and 2019, Epic made some improvements to privacy and parental controls. Not long after Fortnite launched, Epic added a toggle switch that allowed players to disable voice chat. However, Epic did not inform players of the addition and buried the toggle in the middle of a detailed settings page, and voice chat remained on by default. In 2019, Epic began requiring players in the United States under age 12 to provide a parent’s email address. Epic would send emails to those addresses explaining how Epic collected personal information and asking parents to provide verifiable parental consent. However, these changes did not apply to the millions of child players with existing accounts, and the default privacy settings remained the same, with players’ display names publicly broadcast and real-time direct communication between players turned on.

In 2022, the United States government (plaintiff), acting on behalf of the Federal Trade Commission, filed suit against Epic, alleging that Epic violated the Children’s Online Privacy Protection Act by collecting, using, and disclosing children’s personal information without providing notice to parents, obtaining parental consent, providing parents with a means to review collected information, or honoring parents’ requests that collected information be deleted. The government also alleged that Epic violated § 5 of the Federal Trade Commission Act, which prohibits unfair or deceptive practices in or affecting commerce. Specifically, the government alleged that publicly broadcasting children’s player names and putting children, by default, in real-time voice and text contact with strangers created a substantial risk of injury that the players could not reasonably avoid themselves and therefore constituted an unfair practice.

[Editor’s Note: The casebook excerpt is from the government’s complaint against Epic, not from a judicial decision.]