Chapter 06: Once Someone Captures Your Face and Voice, No Password Reset on Earth Can Fix It

Here is a question I need you to sit with for a moment. If someone stole your credit card number tonight, how long would it take you to fix it? A phone call, maybe two. The bank cancels the old number and mails you a new card within a week. Now answer this one. If someone stole your face, your fingerprints, or your voice, what would you do? Call the bank and ask for a new face? Request a replacement set of fingerprints?

You cannot do that. You will never be able to do that. And that single fact is the reason the loss of biometric privacy is the most dangerous and least understood threat to your personal freedom in America today.

Right now, your face is stored in databases you have never heard of. A company called Clearview AI has scraped more than 50 billion photographs from Facebook, Instagram, YouTube, LinkedIn, and thousands of other websites, building a searchable facial recognition database that at least 3,100 law enforcement agencies have used to identify people. You were not asked. You were not told. Your face was taken and filed away like inventory in a warehouse.

At the same time, cameras at more than 250 airports, all 30 NFL stadiums, thousands of retail stores, and police departments from coast to coast are scanning faces and matching them against databases every single day. If the system gets it right, nobody notices. If the system gets it wrong, an innocent person goes to jail. And the system gets it wrong far more often when the face it is scanning belongs to a person of color, a woman, a child, or an elderly person.

I am going to walk you through exactly how this technology works, where it is watching you, why the people who designed it built bias into its bones, and what you can do starting today to protect yourself and your family. I am also going to tell you about a grandmother in Tennessee who was arrested at gunpoint while babysitting four children, charged with a crime in a state she had never visited, and locked in a jail cell for 108 days because a facial recognition algorithm said she was someone she was not. When you hear her story, you will understand why this chapter matters more than you think.

Your Face, Reduced to Math

The technology behind facial recognition is deceptively simple. A camera captures an image of your face. Software maps the unique geometry of that face across roughly 80 points, measuring things like the distance between your eyes, the contour of your jawline, the width of your nose, and the depth of your eye sockets. A neural network then converts those measurements into a compact numerical code called a faceprint. Think of it as a mathematical fingerprint for your face. That code is unique to you. It belongs to you. And once it exists in a database, it is there for good.
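
If you want to see how small a faceprint really is, here is a minimal Python sketch. Everything in it is an assumption made for illustration: real systems generate these vectors with deep neural networks, and the 128-number size, the random values, and the 0.6 threshold are stand-ins chosen only to make the idea concrete.

```python
import numpy as np

# A faceprint is just a fixed-length list of numbers. Real systems produce
# it with a neural network; the 128-value size here is illustrative.
EMBEDDING_SIZE = 128

def same_face(a: np.ndarray, b: np.ndarray, threshold: float = 0.6) -> bool:
    """Decide whether two faceprints belong to the same person by measuring
    the Euclidean distance between the vectors. The threshold is a made-up
    example value, not an industry standard."""
    return float(np.linalg.norm(a - b)) < threshold

# Random vectors stand in for faceprints extracted from two photos of you.
rng = np.random.default_rng(seed=0)
enrolled = rng.normal(size=EMBEDDING_SIZE)                          # photo on file
new_photo = enrolled + rng.normal(scale=0.01, size=EMBEDDING_SIZE)  # new capture

print(same_face(enrolled, new_photo))  # True: the two vectors are nearly identical
```

The entire identity decision comes down to a distance between two lists of numbers, which is exactly why a stolen faceprint is so dangerous. Whoever holds the numbers holds the match.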

From there, the system does one of two things. It can verify you against your own ID photo, which is called one-to-one matching. This is what TSA does at airport checkpoints when it compares your face to the photo on your license or passport. It can also search an entire database to figure out who you are, which is called one-to-many matching. This is what police do when they feed a surveillance camera image into a system and ask it to find a match among millions of stored faces. The FBI alone can search against more than 411 million photographs, including passport photos, visa photos, and military images.
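
Here is the same idea extended to show the difference between the two modes. The toy two-number faceprints and the 0.6 threshold are hypothetical; the structural point is real: identification must survive a comparison against every record in the gallery, so even a tiny false-match rate produces wrong hits once the database holds millions of faces.

```python
import numpy as np

def distance(a, b) -> float:
    # Euclidean distance between two faceprint vectors (illustrative).
    return float(np.linalg.norm(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)))

def verify(probe, enrolled, threshold: float = 0.6) -> bool:
    """One-to-one matching: is this face the person they claim to be?
    This is what an airport checkpoint does against your ID photo."""
    return distance(probe, enrolled) < threshold

def identify(probe, gallery: dict, threshold: float = 0.6):
    """One-to-many matching: who, out of everyone in the database, is this?
    Returns the closest match under the threshold, or None."""
    best_name, best_dist = None, threshold
    for name, faceprint in gallery.items():
        d = distance(probe, faceprint)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name

gallery = {"person_a": [0.10, 0.90], "person_b": [0.80, 0.20]}  # toy 2-number faceprints
print(verify([0.12, 0.88], gallery["person_a"]))  # True
print(identify([0.12, 0.88], gallery))            # "person_a"
```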

Two kinds of systems dominate the market right now. Traditional systems compare flat, two-dimensional photographs and struggle when lighting is bad or the face is turned at an angle. Three-dimensional systems, like the Face ID technology in your iPhone, use infrared light to map the depth and contours of your face even in the dark. Newer skin texture analysis can even distinguish identical twins by reading microscopic pore patterns invisible to the naked eye. The U.S. facial recognition market hit 1.75 billion dollars in 2025 and is projected to reach 3.89 billion dollars by 2030. This is not emerging technology. It is here, it is growing, and it is everywhere.

The Places That Are Already Scanning Your Face

Let me take you through a day in the life of an American who has no idea how many times their face is captured, analyzed, and stored.

You fly to visit your family. At the airport, TSA has deployed facial recognition scanners at more than 250 airports, covering roughly 58 percent of all commercial airports in the country. In summer 2025, TSA launched its Touchless ID program for PreCheck members, where no physical ID is required; a camera matches your face against a gallery of passport photos. In January 2026, TSA announced it would expand the program to 65 airports by spring 2026, with a goal of reaching more than 400 airports by the end of the year. Five major airlines are participating: Alaska, American, Delta, Southwest, and United. Customs and Border Protection runs a separate facial comparison system at 238 airports for international travelers.

You stop at the grocery store. A January 2026 CNN investigation found that Walmart, Kroger, and Home Depot acknowledge facial recognition capabilities in their privacy policies. Wegmans triggered controversy that same month when signs appeared at its New York City stores disclosing the technology. Madison Square Garden made national headlines for using facial recognition to identify and eject lawyers from firms that had sued its owner. The landmark retail case involved Rite Aid, which deployed facial recognition in hundreds of stores between 2012 and 2020 to flag suspected shoplifters using a watchlist built from low quality security camera images and employee cellphone photos. The FTC banned Rite Aid from using facial recognition for five years after finding thousands of false matches and a pattern of deploying the technology in predominantly Black, Asian, and Latino communities.

You take your kids to a game. The NFL completed league-wide deployment of facial recognition across all 30 stadiums during the 2024 to 2025 season. Major League Baseball runs a facial recognition entry system called Go-Ahead Entry at ten ballparks, including Dodger Stadium and Citizens Bank Park, with entry lanes moving 141 percent faster than traditional gates. A 2025 industry survey found that 47 percent of venue operators named biometrics as a top priority for the coming year.

Your kids go to school. New York's Lockport City School District invested four million dollars in a facial recognition security system in 2019, sparking a backlash that ultimately led New York to become the first state to ban facial recognition in schools. More than 60 colleges and universities have committed to refusing the technology.

And if you have ever posted a photo to social media, Clearview AI has almost certainly scraped it into a database of 50 billion images that police across the country search every day. That is where we are. Your face is being captured at the airport, at the grocery store, at the stadium, at your child's school, and from your own social media accounts.

Once It Is Stolen, It Is Stolen Forever

I want to make sure you understand why biometric data is fundamentally different from every other kind of personal information. When your credit card number gets stolen, the bank issues a new one. When your password gets hacked, you create a new password. When your Social Security number is compromised, you can freeze your credit and monitor for fraud. These are serious problems, and I do not want to minimize them. They cause real harm to real people. They also share one saving grace. You can take steps to contain the damage because the compromised information can be replaced or restricted.

Your face cannot be replaced. Your fingerprints cannot be replaced. Your iris patterns cannot be replaced. Your voice cannot be replaced. You have exactly one face, ten fingerprints, two irises, and one voice. Once a biometric template is stolen from a database, there is no reset button. The victim is exposed for life, on every system that uses biometric authentication.

The breach record tells you everything you need to know about how seriously companies and governments take this responsibility. In 2015, Chinese hackers breached the Office of Personnel Management and stole 5.6 million fingerprint records alongside personnel and background investigation files covering 22.1 million people. Biometrics experts warned that intelligence agents could now be identified by fingerprint even when operating under assumed names. The government spent over 130 million dollars on identity protection services. No service on earth can protect a fingerprint that has already been copied.

In 2019, a biometric security platform called BioStar 2, run by the company Suprema, exposed 27.8 million records, including raw fingerprint images from banks, defense contractors, and the UK's Metropolitan Police. The fingerprints had been stored in their original form, meaning attackers could fabricate physical replicas. Clearview AI suffered a data breach in February 2020 that exposed its entire client list. In October 2023, biometric records on 815 million Indian citizens from the Aadhaar database were offered for sale on the dark web for approximately 80,000 dollars.

Between 2018 and 2023, nearly 6 billion biometric records were compromised around the world. The average biometric breach now costs 5.22 million dollars, making it one of the most expensive categories of data to lose. Researchers have even demonstrated that encrypted biometric templates, once considered safe, can be reverse-engineered. A research team showed the complete attack chain in a 2025 paper: take a stolen fingerprint template, feed it through a generative AI model, reconstruct the fingerprint image, fabricate a silicone replica from it, and use that replica to pass commercial fingerprint scanners. This is not science fiction. This is happening now.

The Four Billion Dollar Law That Changed Everything

In 2008, Illinois passed a law called the Biometric Information Privacy Act, known as BIPA. What made BIPA different from every other privacy law in America was a single provision. It gave ordinary people the right to sue. If a company collected your fingerprint, your faceprint, your voiceprint, or your iris scan without your informed written consent, you could take them to court. The law set damages at 1,000 dollars per negligent violation and 5,000 dollars per intentional or reckless violation, plus attorneys' fees.
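
The arithmetic behind those two numbers is what made corporate lawyers panic, and a back-of-the-envelope sketch shows why. One caveat: how violations are counted, per person or per scan, has been fought over in court and later amended, so treat this as an illustration of scale, not legal advice.

```python
def bipa_exposure(class_size: int, intentional: bool = False) -> int:
    """Rough statutory damages under Illinois BIPA, assuming one violation
    per affected person. Courts have at times counted one violation per
    scan, which multiplies these figures dramatically."""
    per_violation = 5_000 if intentional else 1_000
    return class_size * per_violation

# A company that fingerprint-scans 10,000 employees without written consent:
print(f"${bipa_exposure(10_000):,}")                    # $10,000,000 if negligent
print(f"${bipa_exposure(10_000, intentional=True):,}")  # $50,000,000 if intentional
```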

That provision turned BIPA into the most consequential privacy law in the country. More than 2,000 lawsuits have been filed since 2017. Facebook settled for 650 million dollars in 2021 for scanning and tagging Illinois users' faces without consent. BNSF Railway faced a 228 million dollar jury verdict in 2022, the first BIPA jury trial in history, for requiring truck drivers to scan fingerprints at railyards without permission. Google paid 100 million dollars for extracting facial templates through Google Photos without consent. TikTok paid 92 million dollars for harvesting biometric data from users. Clearview AI settled in March 2025 for the equivalent of 51.75 million dollars, and because the company did not have enough cash, class members received a 23 percent equity stake in the company instead.

Texas joined the fight with even bigger numbers. The Texas attorney general secured a 1.4 billion dollar settlement from Meta in July 2024, the largest single state privacy settlement in American history, for running facial recognition on virtually every face uploaded to Facebook for over a decade. In May 2025, Texas followed that up with a 1.375 billion dollar settlement from Google for collecting voiceprints and facial geometry through Google Photos, Google Assistant, and Nest Hub Max without consent. Texas alone extracted 2.775 billion dollars from two companies.

Twenty states have now enacted comprehensive privacy laws that classify biometric data as sensitive and require heightened consent. No federal biometric privacy law exists. Congress has tried. The Facial Recognition and Biometric Technology Moratorium Act and the American Privacy Rights Act have gone nowhere. As of March 2026, the states are fighting this battle alone.

Voluntary in Name Only

The TSA says its facial recognition program is entirely voluntary. I want to tell you what voluntary looks like in practice.

The Algorithmic Justice League, led by researcher Joy Buolamwini, published a report in July 2025 called "Comply to Fly?" that examined 420 traveler experiences across 91 airports. The findings were devastating. Ninety-nine percent of travelers were not verbally told they could opt out. Half did not see any signage about the opt-out option. Three quarters were completely unaware that opting out was even possible. Among the small number of travelers who did opt out, 67 percent reported negative treatment. TSA officers made hostile comments, used aggressive body language, subjected travelers to increased scrutiny, and caused delays. At Seattle-Tacoma International Airport in December 2024, a TSA officer told a traveler who declined the face scan, "Really? That's ridiculous, you must be stupid."

A landmark 125-page staff report from the Privacy and Civil Liberties Oversight Board, published in May 2025 after a six-year investigation, confirmed these failures. The board found that TSA has never published a single comprehensive Privacy Impact Assessment for its facial recognition program. The DHS directive governing facial recognition use disappeared from the DHS website after the change in administration in January 2025, and DHS could not confirm whether it remains official policy.

A bipartisan bill called the Traveler Privacy Protection Act, introduced in May 2025 by Senators Kennedy and Merkley, would have made human ID checks the default, required affirmative consent before each facial scan, and prohibited negative treatment of passengers who opt out. The airline industry killed it. Airlines for America, the U.S. Travel Association, and major airport groups sent a joint letter arguing the bill would increase wait times. Senator Cruz pulled the bill from the committee agenda at the last moment. It remains stalled.

If you want to opt out today, tell the TSA officer "I would like to opt out of the face scan" before the photo is taken. You do not need to explain why. Children under 18 are not photographed. If an officer gives you trouble, note their name and file a complaint with the TSA Contact Center. As Joy Buolamwini said, giving up your face data should not be the price of getting on an airplane.

When the Algorithm Gets It Wrong, Innocent People Lose Everything

The stories I am about to share with you are the reason I am writing this book. These are real people whose lives were destroyed because a computer said they were someone they were not.

Robert Williams is a Black man who lives in Farmington Hills, Michigan. In January 2020, police arrested him in front of his wife and two young daughters for allegedly stealing watches from a store. The arrest was based on a match between grainy surveillance footage and his expired driver's license photo. He spent more than 30 hours in a filthy, overcrowded detention cell. During interrogation, police showed him the surveillance photo. He held it up next to his own face and said, "I hope you don't think all Black people look alike." The case was dismissed. Williams was later diagnosed with PTSD and suffered a series of strokes. His June 2024 settlement with Detroit included 300,000 dollars and landmark policy reforms requiring police to obtain independent evidence before any facial recognition based arrest.

Porcha Woodruff, a 32-year-old Black woman, was arrested in February 2023 for carjacking and robbery. She was eight months pregnant. Six officers arrived at her Detroit home while she was getting her children ready for school. She pleaded with police to check whether the actual suspect in the video was pregnant. They declined. She spent 11 hours in custody and began having contractions from stress and dehydration. The actual perpetrator was not pregnant. Prosecutors dismissed the case within a month.

Nijeer Parks, a 33-year-old Black man from Paterson, New Jersey, was accused of shoplifting and trying to hit an officer with a car in Woodbridge, a town he had never visited. Police had run a blurry image from a fake Tennessee driver's license through facial recognition. Parks spent 10 days in jail and faced charges for nearly 10 months. Police ignored DNA and fingerprint evidence pointing to a different person.

Angela Lipps may be the most devastating case of all. She is a 50-year-old grandmother from Tennessee. On July 14, 2025, U.S. Marshals arrested her at gunpoint while she was babysitting four children, charging her with bank fraud in Fargo, North Dakota, a state she had never visited. Bank records confirmed she was 1,200 miles away during every single transaction. She spent 108 days in jail before being transferred to North Dakota, where she first spoke with Fargo police on December 19, 2025, five full months after her arrest. Charges were dismissed on Christmas Eve. She was left stranded in Fargo without money or a coat. She lost her home, her car, and her dog. As of March 2026, Fargo police had not apologized.

A January 2025 Washington Post investigation documented at least eight wrongful arrest cases and found that across them, police failed to check alibis in six, ignored evidence pointing to someone else in two, and neglected to collect key evidence in five. In every single case, police arrested someone without independently confirming that person's connection to the crime. These are not glitches in a system that mostly works. These are the predictable consequences of a system that was built on biased data and deployed without adequate safeguards.

The Technology Fails the People Who Need Protection Most

The accuracy problems in facial recognition are not random. They fall along lines of race, gender, and age with disturbing consistency.

The most authoritative data comes from NIST, the National Institute of Standards and Technology, which evaluated 189 algorithms from 99 developers using 18 million images. The December 2019 report found that false positive rates, the rate at which the system incorrectly says two different people are the same person, were 10 to 100 times higher for Black and Asian faces than for white faces. American Indian faces had the highest false positive rates of any group tested. Women were misidentified more often than men across nearly all algorithms. Children and the elderly showed elevated error rates. In one-to-many searches, the type police rely on, most algorithms selected incorrect matches among Black women at significantly higher rates than for any other demographic group.
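
If "false positive rate" feels abstract, this small simulation shows the mechanism behind the disparity. The score distributions are invented, not NIST's data; the point they illustrate is that when one global match threshold is tuned for one group, a group whose impostor comparisons score systematically higher gets falsely matched far more often at that same threshold.

```python
import numpy as np

def false_match_rate(impostor_scores: np.ndarray, threshold: float) -> float:
    """Fraction of impostor comparisons (two DIFFERENT people) that the
    system wrongly accepts as the same person."""
    return float(np.mean(impostor_scores >= threshold))

# Hypothetical similarity scores for impostor pairs in two demographic groups.
# Group B's scores run higher, mimicking an algorithm trained mostly on Group A.
group_a = np.random.default_rng(1).normal(loc=0.35, scale=0.10, size=100_000)
group_b = np.random.default_rng(2).normal(loc=0.45, scale=0.10, size=100_000)

threshold = 0.65  # one global threshold, effectively tuned on Group A
print(f"Group A false match rate: {false_match_rate(group_a, threshold):.4f}")
print(f"Group B false match rate: {false_match_rate(group_b, threshold):.4f}")
# Roughly 0.0013 versus 0.0228: the same threshold fails one group about
# seventeen times more often, squarely inside NIST's 10x-to-100x range.
```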

The foundational research came from Joy Buolamwini, then a graduate student at MIT, who discovered the problem firsthand when facial recognition software could not detect her dark-skinned face. She literally had to put on a white mask to be seen by the system. Her 2018 Gender Shades study found error rates of up to 34.7 percent for darker-skinned women, compared to 0.8 percent for lighter-skinned men. When the ACLU tested Amazon's Rekognition against photos of every member of Congress in 2018, it falsely matched 28 lawmakers to mugshots of people who had been arrested, and the false matches fell disproportionately on lawmakers of color.

The bias exists for identifiable reasons. Training datasets are overwhelmingly composed of lighter-skinned male faces. Camera technology has been calibrated for lighter skin tones since the 1950s, when Kodak set its color balance standards with reference cards, known as Shirley cards, featuring exclusively white models. Darker skin reflects less light, giving algorithms fewer distinguishing details to work with. Mugshot databases used by police are themselves racially skewed by decades of disproportionate policing, meaning the systems are trained on and deployed against the same communities.

IBM exited facial recognition entirely in June 2020. Amazon imposed a moratorium on police use of its Rekognition product. Microsoft banned police sales pending a national law grounded in human rights. No such law has arrived, and the broader market continues to grow.

Your Voice Is the Next Biometric Under Attack

Your voice is a biometric identifier, analyzed and stored just like a fingerprint or a faceprint. Major banks, including JPMorgan Chase, Wells Fargo, TD Bank, Charles Schwab, and Bank of America, use voiceprint authentication to verify callers. The system analyzes more than 100 characteristics of your speech, including pitch, cadence, accent, and the shape of your vocal tract. Your voiceprint is stored as a mathematical template. It cannot be changed if compromised.
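
To demystify what a bank actually stores, here is a toy voiceprint built with the open source librosa audio library. Averaging 20 MFCC coefficients is a crude stand-in for the hundred-plus characteristics real systems model, and the filenames and the 25.0 threshold are hypothetical, but the lesson holds: the template is just numbers, and numbers can be stolen.

```python
import numpy as np
import librosa  # pip install librosa

def voiceprint(path: str) -> np.ndarray:
    """Reduce a recording to a small numeric template, loosely analogous to
    what voice-ID systems store. Real systems model pitch, cadence, vocal
    tract shape, and far more; 20 averaged MFCCs is a toy version."""
    audio, sample_rate = librosa.load(path, sr=16_000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=20)
    return mfcc.mean(axis=1)  # average each coefficient over time

def same_speaker(a: np.ndarray, b: np.ndarray, threshold: float = 25.0) -> bool:
    # Compare templates by distance; the threshold is an arbitrary example.
    return float(np.linalg.norm(a - b)) < threshold

enrolled = voiceprint("enrollment_call.wav")  # hypothetical filenames
incoming = voiceprint("incoming_call.wav")
print(same_speaker(enrolled, incoming))
```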

AI can now clone any voice with startling accuracy. Tools from Microsoft, OpenAI, and ElevenLabs can produce a functional clone from as little as five to ten seconds of recorded audio. A 2024 report identified more than 350 voice cloning tools on the market. McAfee Labs found that a convincing clone could be created for five dollars and ten minutes of setup time.

These cloned voices are defeating bank security systems. University of Waterloo researchers demonstrated in 2023 that they could bypass voice authentication with a 99 percent success rate within six attempts. A journalist used a free voice cloning tool to replicate his own voice and successfully breached a major bank's voice ID system. A survey found that 91 percent of U.S. banks are now reconsidering their voice authentication programs because of AI cloning.

The fraud is already causing real financial damage. In 2021, criminals used cloned voices to steal 35 million dollars from a bank in the UAE. In January 2024, finance workers in the Hong Kong office of a British engineering firm transferred 25.6 million dollars after a deepfake video call in which every participant, including the apparent CFO, was AI generated. Sharon Brightwell of Dover, Florida, wired 15,000 dollars in July 2025 after receiving a panicked call from what sounded exactly like her daughter. An estimated one in four Americans has already been targeted by a voice cloning scam.

The FCC ruled in February 2024 that AI generated voices qualify as artificial under the Telephone Consumer Protection Act, making unauthorized AI robocalls illegal. Tennessee became the first state to expressly protect against AI voice cloning through its ELVIS Act. Texas followed with its own disclosure and consent requirements.

The single best protection against voice cloning scams is old-fashioned and completely free. Pick a family code word, a secret phrase known only to your closest relatives, that must be spoken in any emergency call asking for money. If someone calls claiming to be your child, your spouse, or your parent in distress, ask for the code word. Hang up and call them back at a number you know. And limit the audio and video you post publicly on social media, because that content is the raw material cloning tools need to replicate your voice.

Where Things Stand Right Now

The biometric privacy situation shifted dramatically over the past year. At the federal level, the incoming administration revoked the previous AI Executive Order on its first day in office, removing the federal government's primary oversight framework. All three Democratic members of the Privacy and Civil Liberties Oversight Board were fired in January 2025, leaving the board without a quorum. The DHS directive governing facial recognition use by ICE and Customs and Border Protection was quietly removed from the government's website. A December 2025 executive order directed the Attorney General to create a task force to challenge state AI and privacy laws deemed burdensome.

Federal biometric surveillance expanded at the same time. ICE deployed an app called Mobile Fortify that enables field agents to perform facial recognition, fingerprint, and iris scans during street encounters, drawing from more than 200 million images in government databases. Internal documents confirm that ICE does not give individuals any opportunity to decline biometric collection. The 9.2 million dollar Clearview AI contract for ICE was the company's largest government deal. Pending federal legislation would allocate 673 million dollars for biometric entry-exit systems and 2.77 billion dollars for AI-powered surveillance infrastructure.

The states remain the strongest source of resistance. Texas's combined 2.775 billion dollars in settlements from Meta and Google proved that a single motivated attorney general can force accountability on the largest companies in the world. Colorado's biometric amendment took effect in July 2025. At least 23 states now regulate biometric data in some form. Internationally, the EU AI Act's biometric prohibitions took effect on February 2, 2025, banning real time facial recognition in public spaces and the untargeted scraping of facial images for databases, with penalties up to 35 million euros or 7 percent of global revenue.

What You Need to Do Right Now

I am not going to sugarcoat this. Most Americans have already lost control of their biometric data. Your face has been photographed thousands of times. Your voice exists in voicemails, social media videos, and customer service recordings. Clearview AI's database almost certainly includes your photos. The question is not whether your biometric data is out there, because it is. The question is what you do next.

At the airport, tell the TSA officer you want to opt out of the face scan before the photo is taken. You do not need to provide a reason. If an officer gives you a hard time, write down their name and file a complaint with the TSA Contact Center. When possible, choose biometric systems that keep your data on your device, like Apple's Face ID, which stores your faceprint in a secure chip on your phone and never sends it to a server. Pair any biometric login with a password or security key through multi-factor authentication so that even if one layer is compromised, the other still stands.

Before you hand over your biometric data to any company, ask four questions. Where is it stored? Who has access to it? How long do you keep it? Can I opt out? If they cannot give you clear answers, walk away.

Cut your social media footprint. Publicly posted photos and videos are the raw material that feeds facial recognition databases and voice cloning tools. Set your profiles to private. Be careful about unexpected phone calls where no one speaks on the other end, because those calls may be harvesting a sample of your voice. Watch for callers who try to get you to say "yes" on the phone. Pick a family code word for emergencies and make sure everyone in your household knows it.

If you live in Illinois, Texas, or one of the growing number of states with biometric privacy laws, know that you have legal options if a company collects your biometric data without consent. Find a lawyer who handles privacy cases and learn what your state law allows.

The faceprint captured at the grocery store today will still identify you in 50 years. The voiceprint recorded during a customer service call this week can be cloned by AI this afternoon. The fingerprint stolen in a data breach last year cannot be changed, cancelled, or reissued. These are not passwords. These are pieces of your body. They are permanent, they are irreplaceable, and they deserve the same fierce protection you would give to anything else you love and cannot replace.

Talk to your family about this. Talk to your friends. Share what you have learned. The companies and government agencies collecting your biometric data are counting on your silence and your ignorance. The moment you start paying attention, asking questions, and demanding answers, the equation starts to change.

Your face and your voice are not passwords you can change. Start treating them like the irreplaceable things they are.