Chapter 15: Your Kids Are Being Watched
In Lawrence, Kansas, nine high school students walked into a federal courthouse in August 2025 and filed a lawsuit against their own school district. Their crime, according to the artificial intelligence software installed on every school-issued device in the district, was creating art. Gaggle, a monitoring program running silently on school laptops and tablets across 1,500 American school districts, had flagged student artwork as "child pornography." The software also seized journalism files from the student newspaper staff, preventing four editions of the paper from going to print. These were not delinquents. These were artists, writers, and student journalists whose school district had handed their digital lives over to an algorithm that branded them as criminals.
Halfway across the country in Minneapolis, a transgender teenager named Logsdon Wallace sat down to write a school assignment about a painful chapter in his life. He wrote about past suicidal thoughts. He was trying to be vulnerable with his teacher. Gaggle flagged the assignment. Two days later, school officials contacted his parent. "I was trying to be vulnerable with this teacher," he said. "Now the school is contacting my counselor and is freaking out." The system designed to protect him punished him for being honest.
These are not isolated stories. Twenty-seven million American students are monitored by a single company called GoGuardian, which tracks browsing histories, documents, search queries, and screen activity in real time, including outside school hours, including inside their own homes. In Baltimore, school police received GoGuardian alerts on nights, weekends, and holidays. Teachers viewed student webcam footage from bedrooms and living rooms without parental consent. Gaggle, meanwhile, scans billions of student communications annually across six million students, using a combination of artificial intelligence and hourly-wage human content moderators to review school-provided emails, documents, and chats around the clock.
And here is the part that should stop every parent reading this book in their tracks. A 2023 RAND Corporation study found only "scant evidence" that any of these surveillance tools effectively prevent youth suicide. That is the primary justification schools give for using them. The evidence says the justification does not hold up. Your children are being surveilled at a scale and intensity that would have been unthinkable a decade ago, and the system producing that surveillance has almost no accountability, almost no transparency, and almost no proof that the surveillance works.
This chapter is about what happens when an entire country decides to watch its children instead of protecting them.
The Data Extraction Starts at Birth
By the time an American child turns 13, an estimated 72,000 data points have been collected about that child. The extraction begins with connected baby monitors and smart toys. It accelerates through entertainment apps that harvest location data, browsing behavior, and voice recordings. And then the school system takes over, feeding academic records, behavioral assessments, disciplinary histories, disability information, and counselor notes into a sprawling network of technology vendors whose names most parents have never heard.
The average American school district now deploys 2,591 educational technology tools every year. In the 2016 to 2017 school year, that number was roughly 300. The pandemic drove an eightfold increase, and schools never scaled back. A staggering 96 percent of school-recommended apps share student personal information with third parties. The data flowing out of those apps extends far beyond report cards. Browsing histories, search queries, keystroke patterns, documents created, emails sent, calendar entries, and in some cases biometrics and precise geolocation all flow through vendor systems that parents never agreed to and rarely know about.
Parents post an average of 1,300 photos and videos of their children before those children turn 13. Barclays, the financial services firm, projects that this kind of oversharing by parents will account for two-thirds of identity theft targeting young people by 2030. Nine hundred fifteen thousand American children fall victim to identity fraud every year. Children are 51 times more likely than adults to have their identities stolen, partly because the theft goes undetected for an average of 12 years. A teenager applies for a first credit card or a student loan and discovers that a stranger has already destroyed their credit file.
One in every 50 American children falls victim to identity fraud each year. The rate surged 40 percent between 2021 and 2024.
The Law Finally Moves, a Decade Late
On January 16, 2025, the Federal Trade Commission voted unanimously, five to zero, to finalize the first major amendments to the Children's Online Privacy Protection Act Rule since 2013. The updated rule took effect on June 23, 2025, with a compliance deadline of April 22, 2026.
The biggest change expands the definition of "personal information" to include biometric identifiers like fingerprints, retina patterns, voiceprints, gait patterns, and facial templates, along with government-issued identifiers like Social Security numbers. The rule now requires separate verifiable parental consent before any company discloses children's personal information to third parties for targeted advertising. That means companies need your specific permission before they sell your child's data to advertisers. Companies must identify the specific third party recipients by name in their notices to parents. Three new consent verification methods were approved, including text-based mechanisms, knowledge-based authentication, and facial recognition matching against government identification.
Behind the scenes, the rule now requires every operator collecting children's data to maintain a written data retention policy that spells out why the data was collected, why the company needs to keep it, and when it will be deleted. Personal information cannot be retained indefinitely. Every operator must also maintain a written information security program with a designated coordinator, annual risk assessments, and regular testing. If you are a parent reading this, you now have the legal right to demand that any company collecting your child's data show you these written policies. Exercise that right.
The FTC also dropped several proposed provisions. Restrictions on push notifications to children were abandoned. A proposal to let educational technology providers obtain parental consent through schools was shelved. FTC Chair Andrew Ferguson identified the rule's failure to create a clear exception for age verification data collection as a significant gap. The FTC partially addressed that gap in February 2026 with a policy statement saying the agency would not pursue operators who collect data solely for age verification purposes, as long as those operators promptly delete the data and maintain reasonable security.
Congress Talks, Nothing Passes
No major federal children's privacy statute has been enacted since COPPA became law in 1998. The Kids Online Safety Act passed the Senate 91 to 3 in July 2024. Ninety-one to three. That is as close to unanimous as the United States Senate gets on anything. The bill stalled in the House. Reintroduced in May 2025, it was folded into the broader Kids Internet and Digital Safety Act, which the House Energy and Commerce Committee passed 28 to 24 on March 6, 2026. The bill requires platforms to put safeguards in place by default for known minors, provide parents with tools to manage usage, and allow minors to disable addictive features and opt out of algorithmic recommendations.
COPPA 2.0, the Children and Teens' Online Privacy Protection Act, would raise the protected age from under 13 to under 17, ban targeted advertising to children and teens, create an "eraser button" for data deletion, and establish a Youth Marketing and Privacy Division at the FTC. The Congressional Budget Office estimated 164 million dollars in additional enforcement revenues over the next decade. The bill advanced from the Senate Commerce Committee in June 2025. Forty state attorneys general urged Congress to preserve state authority over children's privacy. Neither KOSA nor COPPA 2.0 has reached the President's desk.
The ACLU and the Electronic Frontier Foundation have raised concerns that KOSA could be used to suppress LGBTQ content under politically motivated enforcement. GLAAD reversed its support of the bill in 2025 after changes in FTC leadership. The result is that the patchwork of state laws continues to grow while Congress debates.
The FTC Goes After the Biggest Names
The Federal Trade Commission under both the Biden and Trump administrations has made children's privacy its top enforcement priority, and the results have been striking.
The Epic Games and Fortnite case remains the landmark. In December 2022, Epic Games agreed to pay 520 million dollars. The 275 million dollar COPPA penalty was the largest in FTC history. An additional 245 million dollars went to consumer refunds. The case addressed Fortnite's collection of children's data without parental consent and the game's default settings, which turned on voice and text chat that matched children with adult strangers. Internal documents showed employees urged the company to change those default settings as early as 2017. The company refused. Children were bullied, threatened, and sexually harassed through the platform. By June 2025, the FTC had distributed more than 200 million dollars in refunds to over 969,000 players.
The pace picked up in 2025. Cognosphere, the maker of Genshin Impact, paid 20 million dollars in January 2025 for collecting children's data without consent and running deceptive loot box schemes. The FTC deemed the game "directed to children" based on its anime-style graphics and childlike characters. Disney paid 10 million dollars in September 2025 after failing to properly label child-directed YouTube videos featuring Frozen, Toy Story, and Pixar properties as "Made for Kids," which allowed targeted advertising without parental consent. YouTube had flagged over 300 Disney videos as early as mid-2020. Disney kept its original designation policy in place for years after those warnings.
Illuminate Education became the first educational technology company to face federal enforcement in December 2025 after a breach exposed 10.1 million student records, including disability information, disciplinary records, and health data. The company stored student data in plain text despite promising schools that data was encrypted. Third party auditors flagged security vulnerabilities as early as January 2020. The company ignored every warning. When breaches occurred, some school districts did not receive notification for nearly two years. The FTC's consent order permanently banned the company from misrepresenting data security practices and required deletion of unnecessary student data within 90 days.
The Department of Justice, on FTC referral, sued TikTok in August 2024 in what remains the most aggressive pending children's privacy case. The complaint alleges TikTok knowingly allowed millions of children under 13 on the platform despite a 2019 consent order that carried a 5.7 million dollar penalty. TikTok built back doors allowing children to bypass age gates using Google and Instagram credentials. Account reviewers spent an average of five to seven seconds per review when deciding whether a user was a child. TikTok's motion to dismiss was largely denied in November 2025. The case is proceeding through discovery. The penalties sought go up to 51,744 dollars per violation per day.
Smaller cases tell an equally important story. A Chinese robot toy company called Apitor Technology paid 500,000 dollars after its companion app sent children's geolocation data to servers in China. NGL Labs paid 5 million dollars and its founders were personally banned from offering anonymous messaging apps to children after the company sent fake messages to drive engagement. The Sendit app, which knew 116,000 of its users were under 13, faces a pending lawsuit after sending provocative fake messages to children, including messages asking "have you done drugs?" When you see your child playing with a connected toy or downloading an anonymous messaging app, these are the kinds of companies behind those products.
The Surveillance Machine Inside Your Child's School
Let's go back to those school monitoring tools, because this is where the story gets personal for every family with a child in public school.
GoGuardian monitors 27 million students across roughly half of all American K-12 public schools. The company tracks browsing histories, documents, search histories, and screen activity in real time. The monitoring does not stop when the school bell rings. In Baltimore City Public Schools, school police received GoGuardian alerts on nights, weekends, and holidays. Teachers viewed student webcam footage from inside students' homes without consent. Gaggle scans billions of student communications annually across approximately six million students in 1,500 districts, using AI and hourly-wage human content moderators to review school-provided emails, documents, and chats 24 hours a day, seven days a week. Bark, Securly, and Lightspeed Systems round out the monitoring ecosystem, collectively watching tens of millions of students around the clock.
The Electronic Frontier Foundation's October 2023 investigation, titled "Red Flag Machine," found that GoGuardian's false positives heavily outweigh accurate detections. The system flagged Bible verses containing the word "naked." The system flagged Texas legislature pages about cannabis bills. The system flagged college application websites. The system flagged LGBTQ information. Research from the Center for Democracy and Technology found that 29 to 30 percent of LGBTQ students reported being outed as a result of school activity monitoring. Think about that. Nearly a third of LGBTQ students in monitored schools had their sexual orientation or gender identity exposed because of software their school installed.
In Vancouver, Washington, nearly 2,200 students, representing 10 percent of enrollment, were flagged by Gaggle in a single school year. Forty-four percent of teachers reported that students had been contacted by police as a result of monitoring. In Austin, Texas, GoGuardian alerts for "sexual content" go directly to the school district's police department. A 2025 study from the Center for Democracy and Technology found students flagged by monitoring tools were contacted by immigration enforcement.
Turn off school-issued devices when your child finishes homework. Make sure your child never logs into personal accounts on a school device. Ask your school's administration for a written copy of the school's monitoring policy, and ask specifically which companies have access to your child's data. If the school cannot answer those questions, that silence tells you everything you need to know.
The Law That Protects Nothing
The Family Educational Rights and Privacy Act, signed into law in 1974, is supposed to protect student records. In practice, FERPA creates a false sense of security.
Through regulatory amendments in 2008 and 2011, the Department of Education, without a Congressional vote, expanded the "school official" exception to allow schools to designate virtually any educational technology vendor as a "school official" with a "legitimate educational interest." That designation allows schools to share your child's data with those vendors without your consent. The Electronic Privacy Information Center blamed these amendments for allowing educational data to flow nearly unrestricted from schools to third parties.
FERPA's enforcement mechanism has never worked because the only penalty available, withdrawal of all federal funding from a school, is so severe that the government has never imposed it in 50 years. Not once. There is no private right of action under FERPA. The Supreme Court ruled in 2002 that students and parents cannot sue schools for FERPA violations. Only 12 percent of school websites include any navigation to data privacy information. The Student Privacy Policy Office relies on voluntary compliance. And the executive order directing the closure of the Department of Education puts the primary FERPA enforcement body at risk of elimination entirely.
When educational technology companies fail or get acquired, student data follows the money. When ConnectEDU went bankrupt in 2014 with 20 million student records, the FTC had to step in to prevent the sale of that student data to a venture capital fund. When AllHere Education collapsed in 2024 after building an AI chatbot for the Los Angeles Unified School District, a whistleblower revealed student data had been processed on offshore servers in Japan, Sweden, and France. The CEO was arrested on fraud charges. Anthology, the parent company of Blackboard, filed for Chapter 11 bankruptcy in September 2025 with more than one billion dollars in debt, putting student data from hundreds of institutions into bankruptcy proceedings. If your child's school uses one of these platforms, ask the school what happens to your child's data if the company goes under. You deserve an answer, and right now, the law does not require the school to give you one.
Fifty States, Fifty Different Sets of Rules
The federal vacuum has produced a sprawling patchwork of state laws and an equally sprawling body of litigation challenging those laws. NetChoice, the technology industry trade group whose members include Google, Meta, Amazon, TikTok, and Snap, has filed lawsuits in at least 15 states and won most of them.
California's Age-Appropriate Design Code Act has been mostly blocked by courts since its passage in 2022. In March 2026, the Ninth Circuit affirmed injunctions against five of six challenged provisions, finding terms like "best interests of children" and "materially detrimental" unconstitutionally vague. Laws in Arkansas, Ohio, and Louisiana have been permanently struck down on First Amendment grounds. Georgia and Utah's laws are preliminarily blocked.
Florida's HB 3 is the sole survivor. After a district court blocked the law, the Eleventh Circuit stayed the injunction in November 2025 in a two-to-one decision, finding the law likely content neutral and likely to satisfy intermediate scrutiny. That makes Florida's law the only social media access law to survive appellate review as of March 2026.
A newer wave of laws aims to survive constitutional challenge by focusing on platform design rather than content restrictions. Maryland's Kids Code has been in effect since October 2024 and survived a motion to dismiss in November 2025. Nebraska, Vermont, and South Carolina enacted their own design codes in 2025 and 2026. New York's Child Data Protection Act took effect in June 2025, prohibiting operators from collecting or selling personal data of users under 18 unless strictly necessary. If you live in one of these states, learn the specific protections your state provides and use them. File complaints with your state attorney general when companies violate those protections.
The Age Verification Trap
The Supreme Court's six-to-three decision in Free Speech Coalition v. Paxton in June 2025 reshaped the legal terrain by upholding Texas's age verification requirement for adult content. The ruling applies narrowly to sexually explicit material that is obscene to minors. Courts continue to strike down age verification requirements for general social media platforms.
The deeper problem is what privacy advocates call the "age verification trap." To protect children's privacy, every system proposed so far requires collecting sensitive data about everyone, including adults. Facial age estimation technology from companies like Yoti processes over 850 million age checks worldwide, with a mean error of about 1.2 years for people around age 18. That technology requires a selfie. Government ID upload creates centralized databases that become targets for hackers. One vendor breach already exposed 70,000 government ID records. When Discord announced mandatory age verification in February 2026 requiring selfies or government identification, the backlash was severe enough to delay the rollout worldwide.
The FTC's February 2026 enforcement policy attempted to create a safe harbor for operators collecting data solely for age verification, as long as those operators promptly delete the data and maintain reasonable security. The technology industry remains deeply divided. Meta pushes for verification at the app store level, which would shift the burden to Apple and Google. Apple and Google prefer device-level approaches using age range signals that never expose a child's birthdate to app developers. Roughly half of American states now mandate age verification for adult content, and at least 17 states have enacted social media access laws for minors. Most of those laws face legal challenges.
The Rest of the World Figured This Out
The United Kingdom's Age-Appropriate Design Code, in force since September 2021, establishes 15 enforceable design standards for any online service likely to be accessed by children under 18. An independent assessment documented 44 platform changes to community standards enforcement, 43 changes to content and advertising safeguards, and 31 changes to privacy settings across major platforms as a direct result of the law. TikTok changed default privacy to private for users ages 13 to 15. Instagram made all accounts created by users under 18 private by default. YouTube disabled autoplay for minors. One major technology company told regulators that the UK Code had a greater effect than GDPR enforcement actions.
The European Union enforces children's protections with financial penalties that make American fines look like pocket change. Ireland's Data Protection Commission fined TikTok 345 million euros in 2023 for children's privacy violations, fined Instagram 405 million euros in 2022 for default public settings on children's accounts, and fined TikTok another 530 million euros in 2025 for illegal data transfers to China. The EU's Digital Services Act requires private-by-default settings for minors and prohibits profiling-based advertising targeted at children. Australia went the furthest in December 2025, enacting the world's first outright ban on social media for children under 16, with penalties up to 49.5 million Australian dollars.
The United States lacks every major protection that exists in those countries. No federal design-based regulation for children. No federal privacy-by-default requirement for minors. No anti-profiling rules for children. No data minimization requirements. No mandatory impact assessments for children's data. COPPA fines are modest compared to the EU's penalties of up to four percent of global annual turnover. When you hear people say "we do not know how to protect children online," remember that the United Kingdom, the European Union, and Australia have already shown exactly how.
62 Million Student Records and the Ransom That Solved Nothing
The PowerSchool breach of December 2024 stands as the defining data disaster of this era. Between December 19 and December 28, hackers accessed PowerSchool's customer support portal using a compromised employee credential. The system lacked mandatory multi-factor authentication. The hackers pulled 62 million student records and 9.5 million educator records, including names, dates of birth, Social Security numbers, medical alert information, disciplinary records, and individualized education plans. PowerSchool serves 75 percent of the American education market.
PowerSchool paid the ransom. The hackers provided a video showing the data being deleted. On May 7, 2025, extortion emails containing samples of the stolen data arrived at schools in Canada and North Carolina. The data had not been deleted. Stolen transcripts appeared on dark web marketplaces priced between 50 and 300 dollars per record. Matthew Lane, a 19-year-old Massachusetts college student who demanded 2.85 million dollars from PowerSchool, was sentenced on October 14, 2025 to four years in federal prison and ordered to pay 14 million dollars in restitution. Prosecutors acknowledged that the money would likely never be collected. More than 100 school districts sued PowerSchool.
The breach exposed a systemic truth that every parent needs to understand. Schools retain student records for decades. There is no federal law requiring schools to delete old data. That decades-long retention policy meant the breach captured historical records going back years, affecting students who had long since graduated. Education is now the most attacked sector globally, with 4,388 cyberattacks per week targeting schools in the second quarter of 2025, a 31 percent year-over-year increase. If your child attends a school that uses PowerSchool or any similar platform, freeze your child's credit report at all three bureaus today. Do not wait. The breach has already happened.
Teaching Your Children What No Law Will Teach Them
Since the legal protections are not keeping pace with the threats, families need to fill the gap themselves. Common Sense Media's Digital Citizenship Curriculum, accessed by 1.3 million educators in more than 88,000 schools, provides the most widely adopted framework for teaching children about privacy, digital footprints, and information literacy from kindergarten through 12th grade. Developed in collaboration with Harvard's Project Zero, the curriculum teaches decision making frameworks rather than rigid rules, because the specific platforms and threats change faster than any list of do's and don'ts.
Age-appropriate approaches matter. Children ages five to seven learn not to share personal information and to ask a parent before downloading anything. By ages 11 to 13, students start to understand how companies make money from their data, learn to evaluate terms of service, and practice managing privacy settings on platforms. High schoolers study algorithmic profiling, data broker ecosystems, and their legal rights. Leah Plunkett of Harvard Law School puts the point sharply. Privacy literacy needs to go beyond what children post. Children also need to understand what gets posted about them and what systems collect about them invisibly.
Here is what you and your family should be doing right now. Keep school-issued devices turned off when your child is not using them for schoolwork. This prevents surveillance tools from activating during personal time. Never let your child log into personal email, social media, or other personal accounts on a school device. Freeze your child's credit report at Equifax, Experian, and TransUnion. Conduct regular privacy audits by searching your child's name online and checking what comes up. Model good privacy behavior yourself.
Forty-one percent of parents say they would share less about their children online if they could start over. Start over now. Every photo, every check in, every proud post about a child's accomplishments adds another data point to a profile that will follow that child into adulthood.
The System Needs to Change, and You Need to Act
American children grow up inside a data extraction machine that operates at every stage of their lives. Entertainment apps collect behavioral data through addictive design. Schools funnel academic, behavioral, and emotional information through thousands of vendor relationships governed by a federal law that has never been enforced. Data brokers aggregate all of the data and offer lists of nearly six million high school students along with data on children as young as two. Monitoring software runs around the clock, flags LGBTQ students for who they are, sends police to children's homes for after hours internet searches, and produces no verifiable evidence of preventing the harms schools cite to justify the surveillance.
The 2025 COPPA amendments and the FTC's aggressive enforcement represent genuine progress. The core architecture of childhood surveillance remains intact. COPPA protects only children under 13. FERPA's enforcement mechanism has never been used. Neither KOSA nor COPPA 2.0 has become law. The states pushing hardest face systematic legal challenges from the industry trade groups whose members profit from the current system. The United Kingdom, the European Union, and Australia have already demonstrated that enforceable, design-based regulation works. Platforms changed their behavior when those countries confronted them with real penalties.
America knows how to protect its children's data. The models exist. The question is whether we will demand that our elected leaders act on what the rest of the world has already proven. Contact your members of Congress and tell them to pass COPPA 2.0 and the Kids Online Safety Act. File complaints with your state attorney general when companies violate your state's children's privacy laws. Show up at your next school board meeting and ask the superintendent, on the record, exactly which companies have access to your child's data and what happens to that data if those companies go bankrupt. Demand written answers. The only way this changes is if parents refuse to accept silence as an answer.
Your children deserve better than this. And deep down, you already know that.