Chapter 19: Algorithms Are Using Your Data to Overcharge You

Two shoppers walk into the same grocery store on the same Tuesday afternoon. They pick up the same brand of cereal, the same gallon of milk, the same pack of chicken thighs. They check out at the same register. One pays $114. The other pays $124. Neither one knows the other got a different price. Neither one was told.

This is not a hypothetical. In December 2025, a team of 437 volunteer shoppers spread across four American cities placed identical orders through Instacart, all at the same stores, all at the same time. Roughly 75 percent of the products they ordered were priced differently depending on who was buying. A 20-item basket at a Seattle Safeway ranged from $114.34 to $123.93, an 8.4 percent spread that, over the course of a year, would cost one family about $1,200 more than another family shopping in the exact same place.
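
If you want to check that annual figure yourself, the arithmetic is simple. Here is a quick sketch; the basket prices come from the investigation, but the orders-per-week frequencies are my assumption for illustration, not figures from the study:

```python
# Back-of-envelope check on the Instacart price spread. Basket prices
# come from the investigation; the shopping frequencies are assumptions.
low_basket = 114.34
high_basket = 123.93

spread = high_basket - low_basket          # $9.59 per identical basket
spread_pct = spread / low_basket * 100     # ~8.4 percent
print(f"Per-basket gap: ${spread:.2f} ({spread_pct:.1f}%)")

for orders_per_week in (2, 2.5, 3):
    annual_gap = spread * orders_per_week * 52
    print(f"{orders_per_week} orders/week -> ${annual_gap:,.0f} more per year")

# 2/week ~ $997, 2.5/week ~ $1,247, 3/week ~ $1,496. The reported $1,200
# gap is consistent with a family ordering a few times per week.
```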

The reason for the price gap had nothing to do with coupons, memberships, or sales. Instacart was running millions of secret pricing experiments on real customers using an AI tool called Eversight, a platform the company acquired in 2022. The algorithm looked at your personal data, estimated how much you were willing to spend, and charged you accordingly. You never saw the other price. You never knew a different price existed.

And Instacart is not the only one doing this.

Welcome to the world of surveillance pricing, where the store watches you before you walk in, calculates what you are worth, and adjusts its prices in real time. Welcome to the world of hidden dossiers, where companies you have never heard of maintain secret files about your banking history, prescription medications, insurance claims, rental records, and retail return habits, and those files determine whether you get an apartment, a job, or a fair insurance rate. One in five Americans has errors in these files. Most people have no idea the files exist.

This chapter is going to show you exactly how these systems work, who is profiting from them, and what you need to do about it right now.

The Federal Government Confirmed Your Suspicions, Then Stopped Investigating

On July 23, 2024, the Federal Trade Commission voted 5 to 0, including both Republican commissioners, to issue orders to eight companies demanding answers about how they use your personal data to set individualized prices. The targets were not the stores themselves. They were the middlemen who sell the surveillance pricing technology to the stores: Mastercard, Revionics, Bloomreach, JPMorgan Chase, Task Software, PROS, Accenture, and McKinsey and Co. These companies advertise the ability to track your behavior and calculate the highest price you will tolerate.

Six months later, on January 17, 2025, the FTC released a staff report confirming the scope of the problem. These eight intermediaries serve at least 250 retail clients, including grocery chains, apparel stores, beauty companies, home goods retailers, convenience stores, and hardware chains. Their tools collect precise geolocation, demographics, browsing patterns, shopping history, mouse movements on webpages, products abandoned in shopping carts, and search activity. One cosmetics company was targeting promotions to consumers based on their specific skin type and skin tone. Another company identified new parents through recent purchases and showed them higher-priced baby thermometers.

The report found something even more troubling. Several of these tools merge data from multiple sources into unique profiles for individual shoppers. They assess your personal price sensitivity. They predict whether you are an impulse buyer. Some of them determine whether you qualify for food assistance. These are not broad marketing tools. They are individualized extraction machines designed to find the maximum amount you will pay and charge you that amount.

Then the investigation went quiet. Andrew Ferguson took over as FTC Chair on January 20, 2025, and shut down the public comment period on surveillance pricing within days, despite an original deadline of April 17. Commissioner Alvaro Bedoya responded publicly: "Chairman Ferguson shut the American people out." No full final report has been published. The study appears dead.

Congressional pressure has continued. Senator Mark Warner led a bipartisan push in December 2025 urging the FTC to publish findings and act. Senator Ruben Gallego introduced the One Fair Price Act to ban the use of personal data in price setting. As of March 2026, no federal enforcement action targeting surveillance pricing has happened.

Twenty Years of Charging You More Because They Could

Surveillance pricing did not arrive overnight. Corporations have been testing how much they can get away with for more than two decades.

In 2000, Amazon ran one of the first documented experiments, varying prices on 68 DVD titles over five days. Customers in online discussion forums compared notes and discovered the differences. One customer found that deleting his browser cookies dropped his price immediately. Amazon refunded an average of $3.10 to 6,896 customers. CEO Jeff Bezos called the experiment a mistake and promised Amazon would never test prices based on customer demographics.

In 2012, Orbitz was steering Mac users toward pricier hotels. The company's own chief scientist confirmed the data: Mac users spent $20 to $30 more per night and were 40 percent more likely to book four- and five-star hotels. The underlying logic was income-based. Mac owners earned an average of $98,560 compared to $74,452 for PC owners. The algorithm was reading your device and using your income bracket against you.

That same year, the Wall Street Journal tested prices on the Staples website across more than 42,000 ZIP codes and found a pattern that should disturb every American. Areas located more than 20 miles from a competitor saw higher prices 67 percent of the time. Areas near rivals saw higher prices only 12 percent of the time. The most painful detail was this: the ZIP codes getting the discounts averaged $59,900 in household income. The ZIP codes paying full price averaged just $48,700. Lower-income communities paid more.

The Princeton Review charged higher prices for SAT prep courses in ZIP codes with high percentages of Asian American residents and stopped only after the practice was exposed by journalists.

The Instacart investigation in December 2025 brought this history into the present. Consumer Reports, the Groundwork Collaborative, and More Perfect Union documented a secret AI pricing experiment affecting shoppers at Albertsons, Costco, Kroger, Safeway, Sprouts, and Target. Seventy-five percent of grocery products were priced differently for different customers. Some products had five different price points at the same time. Instacart halted the practice on December 22, 2025, after the investigation became public, saying the company "missed the mark" for some customers. A survey of 2,240 adults found 72 percent of Instacart users did not want the company charging different prices for any reason.

Here is what you need to understand about the difference between regular dynamic pricing and surveillance pricing. When an airline raises ticket prices on a popular route because demand is high, that is supply and demand at work. The price responds to the market. Surveillance pricing is different. The price responds to you. The algorithm looks at your browsing history, your location, your purchase patterns, your device, and estimates what you personally will pay. Then the algorithm charges you that amount. The market does not set the price. Your data does.
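The difference is easy to see in code. The sketch below is a deliberately simplified illustration of the two logics, not any company's actual model; every feature name and weight in it is invented:

```python
def demand_based_price(base_price: float, demand_ratio: float) -> float:
    """Classic dynamic pricing: everyone sees the same price, and it
    moves with market-wide demand (e.g., seats sold vs. seats left)."""
    return round(base_price * (1 + 0.5 * max(0.0, demand_ratio - 1)), 2)

def surveillance_price(base_price: float, shopper: dict) -> float:
    """Personalized pricing: the price moves with who YOU are.
    The features and weights here are invented for illustration."""
    markup = 0.0
    if shopper.get("device") == "mac":
        markup += 0.04            # device as an income proxy
    if shopper.get("abandoned_cart_recently"):
        markup -= 0.03            # price-sensitive: discount to close the sale
    if shopper.get("new_parent"):
        markup += 0.06            # urgent need: charge more
    if shopper.get("zip_income_percentile", 50) > 75:
        markup += 0.05            # wealthy ZIP code: will tolerate more
    return round(base_price * (1 + markup), 2)

# High demand raises the price for everyone equally:
print(demand_based_price(10.00, demand_ratio=1.4))                      # 12.0

# Two shoppers, same product, same moment, different prices:
print(surveillance_price(10.00, {"device": "mac", "new_parent": True})) # 11.0
print(surveillance_price(10.00, {"abandoned_cart_recently": True}))     # 9.7
```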

A 2026 survey of 2,000 Americans found 62 percent are concerned about personalized pricing. Sixty-six percent said they would stop shopping at a retailer that charged them more based on personal data. Only 7 percent actively support the practice. The public does not want this. The public just does not know how widespread this already is.

The Algorithm That Raised Your Rent

Surveillance pricing is not limited to what you buy at a store. An algorithm has been setting your rent too.

On August 23, 2024, the Department of Justice filed an antitrust lawsuit against RealPage, Inc., a Texas company owned by private equity firm Thoma Bravo, alleging its AI Revenue Management software violated federal antitrust law. The theory was straightforward. Competing landlords fed their proprietary rental data, including rates, occupancy levels, and lease terms, into a shared algorithm. That algorithm generated pricing recommendations that aligned rents across competitors. Landlords accepted these recommendations 80 to 90 percent of the time. The software included auto-accept features and built-in discouragement for landlords who wanted to set lower prices.
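A toy model makes the antitrust theory concrete. In the sketch below, the pooling rule and all the numbers are invented for illustration; the point is the structure the DOJ described, where nonpublic data flows into one algorithm and aligned recommendations flow back out:

```python
# Toy illustration of the DOJ's theory: independent pricing vs. a shared
# algorithm. The pooling rule and numbers are invented for illustration.
competitors_private_rents = {      # nonpublic data each landlord feeds in
    "Landlord A": 1850,
    "Landlord B": 1790,
    "Landlord C": 1920,
}

def independent_price(my_rent: float, market_pressure: float = 0.97) -> int:
    """Acting alone, a landlord with vacant units cuts price to fill them."""
    return round(my_rent * market_pressure)

def shared_algorithm_price(my_rent: float, pool: dict) -> int:
    """The shared tool nudges every client toward the pooled average
    built from competitors' nonpublic rents."""
    pooled_avg = sum(pool.values()) / len(pool)
    return round(0.3 * my_rent + 0.7 * pooled_avg)

for name, rent in competitors_private_rents.items():
    print(name, independent_price(rent),
          shared_algorithm_price(rent, competitors_private_rents))

# Independently: 1794, 1736, 1862 (competition pushes prices down).
# Through the shared algorithm: 1852, 1834, 1873 (higher, and clustered
# near the pooled average of ~1853). Auto-accept then turns the
# recommendation into the actual rent.
```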

On January 7, 2025, the DOJ expanded the lawsuit to include six of the nation's largest landlords, joined by state attorneys general from California, North Carolina, Colorado, Connecticut, Minnesota, Oregon, Tennessee, and Washington. The named companies were Greystar Real Estate Partners, the nation's largest landlord with roughly 950,000 units; LivCor, a Blackstone subsidiary; Camden Property Trust; Cushman and Wakefield; Willow Bridge Property Company; and Cortland Management. Together these companies operate more than 1.3 million rental units across 43 states and Washington, D.C. The DOJ alleged renters in some markets paid 5 to 7 percent more than they would have in a competitive market. One landlord told RealPage that rents rose more than 25 percent within 11 months of adopting the software.

The DOJ reached a proposed settlement with RealPage on November 24, 2025, and consumer advocates called the terms deeply inadequate. No financial penalties. No admission of wrongdoing. RealPage agreed to stop using nonpublic, competitively sensitive data in daily rent recommendations, eliminate auto-accept features, and submit to a court-appointed monitor for three years. The settlement runs seven years.

The private litigation has produced larger results. A federal court in Tennessee granted preliminary approval of 26 settlements with 27 defendants totaling $141.8 million. Greystar contributed $50 million, plus a separate $7 million multistate settlement with nine state attorneys general. State attorneys general in New Jersey, Washington D.C., California, Maryland, Kentucky, and Arizona filed their own lawsuits. California Governor Newsom signed AB 325, the Preventing Algorithmic Price Fixing Act, on October 6, 2025, explicitly prohibiting algorithmic collusion under state antitrust law. San Francisco, Philadelphia, Minneapolis, Seattle, Jersey City, and other cities have banned algorithmic rent pricing.

One Wrong Name on a Screen and You Are Living in Your Car

The tenant screening industry generates approximately $1 billion annually and touches nearly every renter in America. Companies like SafeRent Solutions, TransUnion Rental Screening Solutions, and RealPage's LeasingDesk pull data from credit bureaus, criminal databases, eviction courts, and public records, then run that data through algorithms that produce a score or a simple accept or reject recommendation. A human being rarely looks at the results.

The error rates in this system are staggering. A 2024 criminology study compared official state criminal records against private sector background checks for 101 individuals. Sixty percent had at least one false-positive error on regulated background checks. Seventy-four percent of criminal charges listed on unregulated reports did not match official state records. One participant who had only two drug convictions from 30 years earlier found more than 50 erroneous charges attributed to him, including aggravated assault, robbery, gun possession, and child abuse. The problem is that these algorithms match records by names and aliases rather than by fingerprints or verified identifiers.
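A few lines of code show why name-only matching fails so often. The records below are invented, but the flawed logic mirrors what the studies describe: search by name, and ignore the identifiers that would rule out a false hit:

```python
# Why name-only matching produces false positives. The records here are
# invented; the matching logic mirrors what the studies describe.
criminal_records = [
    {"name": "Marco Fernandez", "dob": "1961-03-02", "charge": "drug trafficking"},
    {"name": "Jane Smith",      "dob": "1988-07-14", "charge": "burglary"},
]

def name_only_match(applicant_name: str, records: list) -> list:
    """Flawed matcher: compares names alone, exactly the practice
    the CFPB's withdrawn 2021 guidance warned against."""
    return [r for r in records
            if r["name"].lower() == applicant_name.lower()]

def verified_match(applicant_name: str, applicant_dob: str, records: list) -> list:
    """Safer matcher: a shared name is not enough; the date of birth
    must also agree before a record is attributed to the applicant."""
    return [r for r in name_only_match(applicant_name, records)
            if r["dob"] == applicant_dob]

applicant = {"name": "Marco Fernandez", "dob": "1990-11-05"}  # a different person
print(name_only_match(applicant["name"], criminal_records))   # false hit: flagged
print(verified_match(applicant["name"], applicant["dob"], criminal_records))  # []
```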

The FTC found in a landmark 2013 study that one in five consumers had at least one error on their credit reports. Five percent had errors severe enough to affect loan terms. That translates to roughly 42 million Americans carrying inaccurate information in their files.

The real stories behind these numbers will make your blood boil.

Marckus Williams is a Black man in Indianapolis who rebuilt his life after incarceration and founded a grocery store in a food desert. When he applied to rent a home from Tricon Residential in November 2022, his screening report showed three prior convictions. Two had been expunged. The third was not a conviction at all. Tricon applied a blanket ban with no individualized review. Williams ended up living in his car for about a month over Christmas and New Year's. He said it "kind of broke me a little bit." His class action lawsuit alleges that Tricon's policy disqualifies Black applicants at a rate 5.32 times greater than white applicants.

Carmen Arroyo's son Mikhail was severely injured in a 2015 accident, leaving him unable to speak, walk, or care for himself. When Carmen applied to move Mikhail from a nursing home into her apartment, an automated screening tool flagged a disqualifying criminal record: a dismissed shoplifting charge from 2014. The screening company refused to give the family a copy of the underlying information and gave the landlord only a bare accept or decline recommendation. Mikhail had to remain in the nursing home for approximately one additional year. The Department of Justice filed a supporting brief in the case.

Marco Antonio Fernandez, a U.S. Navy servicemember with a top-secret security clearance, returned from a yearlong deployment in South Korea in 2018 and applied for an apartment near Fort Meade, Maryland. The landlord rejected him. The screening algorithm had confused him with Mario Fernandez Santana, a resident of Mexico on a federal drug-trafficking watch list. A completely different person with a different date of birth. His lawyers noted that this inaccurate reporting will follow him for the rest of his career.

An Oregon woman found her screening report loaded with burglary, narcotics charges, and bail jumping. None of the charges were hers. The report combined criminal records from five different women who were different races and had different birthdates, including one who was an active inmate.

The CFPB identified the core problem: screening companies appear inclined to include negative information even when that information might be inaccurate. Name-only matching, where the system searches databases by first and last name alone, produces frequent false hits. The error risk falls disproportionately on Hispanic, Black, and Asian Americans because those groups have less surname diversity. Over 12 million Latinos share just 26 surnames.

Federal enforcement has produced some accountability. In October 2023, the FTC and CFPB required TransUnion to pay $15 million for inaccurate tenant screening reports. In 2018, the FTC fined RealPage $3 million for attributing false criminal records to tenants with similar names. In 2020, the FTC fined AppFolio $4.25 million for including records over seven years old. The Louis v. SafeRent Solutions case produced a $2.275 million settlement after a court found that the company's algorithm assigned disproportionately lower scores to Black and Hispanic applicants using housing vouchers.

The Secret Files Most Americans Do Not Know Exist

When most people think about their credit file, they think about Equifax, Experian, and TransUnion. Those are only the beginning. Dozens of specialty consumer reporting agencies compile detailed files about you that determine major outcomes in your life, and most Americans have never heard of them.

ChexSystems tracks your banking history and maintains a score that determines whether you are allowed to open a checking account. LexisNexis operates the CLUE database, which records every insurance claim you have ever filed and is consulted by virtually every insurer in America.

MIB Group collects medical condition data from applications for life, health, and disability insurance. Milliman IntelliScript purchases your prescription drug history from pharmacy benefit managers: every medication, every dosage, every refill, every dispensing pharmacy, and every prescribing doctor. The Retail Equation tracks your retail return patterns and flags you if the algorithm decides you return too many items. The Work Number, owned by Equifax, has current payroll data covering more than 136 million jobs and is routinely consulted by landlords, creditors, and government agencies.

The CFPB publishes an annual list of these companies. The most recent version, released January 30, 2025, expanded to include sports betting companies for the first time. The list also documented that employment screening reports now include social media data and that auto insurers collect driving behavior data through GPS and mobile phone telematics.

The errors in these files carry devastating consequences. A woman named Crawford had an excellent credit score of 788 out of 850 with no criminal history or evictions. A tenant screening company gave her a score of just 685 out of 1,000, roughly a D grade. Her apartment complex demanded an extra month's rent as a security deposit. Judy Ann Sego was listed as deceased by LexisNexis. She was alive. She disputed the notation. LexisNexis verified it and continued reporting her as dead, destroying her access to credit. The LexisNexis Accurint class action covered 200 million class members.

The Rights You Have and the Watchdog They Are Trying to Kill

The Fair Credit Reporting Act, enacted in 1970 and updated in 2003, gives you rights that most Americans do not know about. You are entitled to one free report per year from each nationwide credit bureau and each specialty consumer reporting agency. The Big Three bureaus have offered free weekly reports through AnnualCreditReport.com since the pandemic, and that benefit continues as of early 2026. Under the 2019 Equifax breach settlement, you are also entitled to up to six free Equifax reports per year through December 2026.

If any company denies you credit, housing, insurance, or employment based on a consumer report, that company must tell you and provide the name, address, and phone number of the agency that supplied the information. You have the right to dispute inaccurate information, and the reporting agency must investigate within 30 days. You have the right to sue for violations, with statutory damages of $100 to $1,000 per willful violation, plus actual damages, punitive damages, and attorney's fees. Negative information must generally be removed after seven years, with ten years for bankruptcies.

For specialty agencies, there is no central portal. You must identify each agency from the CFPB's published list at consumerfinance.gov, contact them individually, provide identification, and request your file. Key contacts include ChexSystems at 800-428-9623, LexisNexis at 866-897-8126, and MIB at 866-692-6901. When disputing errors, consumer attorneys universally recommend sending disputes by certified mail with return receipt rather than relying on online portals that limit documentation. You should also file a complaint with the CFPB at consumerfinance.gov/complaint. Credit reporting has consistently been the number one complaint category at the Bureau.

Recent litigation has produced real results. A Wells Fargo class action has a pending settlement of $56.85 million. TransUnion settled a dispute handling case for $23 million covering approximately 485,000 consumers. CoreLogic paid $5.695 million for incorrectly listing consumers as deceased. The J.B. Hunt employment background check class action settled for $5 million in August 2025.

The Supreme Court's 2021 decision in TransUnion LLC v. Ramirez narrowed the path for FCRA class actions. In a 5 to 4 ruling, the Court held that only plaintiffs who suffered a concrete injury have standing to sue in federal court. The case involved TransUnion's erroneous flagging of 8,185 consumers as potential matches to a terrorist watchlist. The jury awarded $60 million. The Court limited recovery to the 1,853 consumers whose reports were actually shared with third parties, leaving 6,332 consumers whose inaccurate files sat in a database without a federal remedy. Justice Thomas noted in dissent that state courts are not bound by the same standing requirements, opening a path for state-level FCRA cases.

The agency tasked with enforcing these rights is fighting for survival. After President Trump fired CFPB Director Rohit Chopra on February 1, 2025, Russell Vought took over as acting head. Within weeks, DOGE operatives gained access to CFPB systems. On February 8, 2025, Elon Musk posted "CFPB RIP" on social media, notably while his platform was launching a digital banking service the CFPB would regulate. On February 10, Vought issued a stop-work order and closed CFPB headquarters. The agency fired approximately 200 probationary employees. Vought canceled roughly $100 million in contracts and requested zero dollars from the Federal Reserve.

A federal judge intervened on March 28, 2025, reinstating employees and requiring continued operations. The administration responded by sending layoff notices to approximately 1,400 of the CFPB's 1,700 workers. On July 4, 2025, the One Big Beautiful Bill Act slashed the CFPB's budget by nearly 50 percent. A second federal judge ordered funding to continue on March 13, 2026. The Supreme Court had already ruled 7 to 2 in May 2024 that the CFPB's funding mechanism is constitutional.

The damage is already showing. Since January 2025, Experian's consumer dispute relief rate collapsed from approximately 20 percent to under 1 percent. TransUnion's dropped roughly 50 percent. Over 2.7 million credit reporting complaints remain unresolved. The CFPB dropped at least four pending enforcement cases and withdrew dozens of advisory opinions, including the critical 2021 guidance on name-only matching that had been protecting renters from false criminal record hits.

These Algorithms Hit Some Americans Harder Than Others

Algorithmic pricing and screening do not affect all Americans equally. When algorithms set prices using ZIP codes, credit scores, browsing patterns, and shopping history, they replicate and amplify existing economic disparities. Those disparities track closely with race.

The Staples case made the math visible. The ZIP codes getting the best prices averaged $59,900 in household income. The ZIP codes paying the most averaged just $48,700. Lower-income neighborhoods, disproportionately Black and Hispanic, paid more. The Consumer Federation of America found that property insurers charge homeowners with lower credit scores $1,996 more per year. Black homeowners carry an average credit score of 612 compared to 725 for white homeowners. Scholars call this digital redlining: using data-driven systems to enforce the same geographic patterns of exclusion that redlining created in the 1930s.

In tenant screening, the disparities are measurable. Black applicants are disqualified under blanket criminal history bans at 5.32 times the rate of white applicants. Black women are overrepresented in eviction filings by nearly 200 percent. The SafeRent settlement established that screening algorithms failing to account for the financial value of housing vouchers discriminate against Black and Hispanic tenants under the Fair Housing Act.

The European Union requires companies to explain algorithmic decisions that significantly affect individuals, mandates human review, and classifies AI used in insurance pricing as high risk with penalties reaching 35 million euros or 7 percent of global revenue. The United States has no federal AI law, no right to algorithmic explanation, and no mandatory transparency requirements for pricing algorithms.

Colorado passed an AI Act requiring impact assessments and consumer notification for consequential AI decisions, with an effective date pushed to June 30, 2026. Illinois became the first state to create a disparate impact standard for AI hiring tools effective January 1, 2026. Thirty-seven lawsuits were filed in the first month. New York became the first state to enact a surveillance pricing disclosure law, requiring businesses to post: "This price was set by an algorithm using your personal data."

In the first seven months of 2025, legislators in 24 states introduced 51 bills targeting algorithmic pricing, up from just 10 bills in all of 2024. States are moving. The question is whether the federal government will try to stop them. A December 2025 executive order directed the DOJ to create a task force to potentially sue states with AI laws the administration considers too aggressive. Whether an executive order has the power to override duly enacted state legislation without congressional authorization is a constitutional question headed for the courts.

What You Do Starting Today

You do not have to wait for Congress. You do not have to wait for the FTC. You do not have to wait for anyone. Here is your action plan.

First, pull your free credit reports from all three bureaus through AnnualCreditReport.com. Do this today, not next week. Review every line. Flag every error. Then go to consumerfinance.gov and download the CFPB's published list of specialty consumer reporting agencies. Contact ChexSystems, LexisNexis, MIB, The Work Number, and The Retail Equation. Request your file from each one. You are entitled to a free copy each year.

Second, when you find an error, and statistically one in five of you will, dispute the error by certified mail with return receipt requested. Do not rely on the online dispute portals. They limit what documentation you are allowed to submit. Send a letter with copies of supporting documents. Keep a record of everything. If the agency does not respond within 30 days, or responds inadequately, you have the right to sue.

Third, file a complaint with the CFPB at consumerfinance.gov/complaint and with your state attorney general. Even with the CFPB under assault, complaints create a record. State attorneys general in California, New York, Illinois, Colorado, and many other states are actively enforcing consumer protection laws. Your complaint adds to the evidence they need.

Fourth, pay attention to what your state legislature is doing. If you live in New York, surveillance pricing disclosure is already the law. If you live in California, the DELETE Act portal at https://privacy.ca.gov/drop/ lets you request deletion of your personal data from every registered data broker in one click. If your state has not passed similar laws, call your representatives and tell them you want the same protections. The 51 bills introduced across 24 states in 2025 happened because voters demanded them.

Fifth, make the algorithm's job harder. Use a VPN or a privacy-focused browser when shopping online. Clear your cookies before comparing prices. Check prices in a private or incognito window alongside your regular browser. If the prices differ, you are seeing surveillance pricing in action.
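
If you are comfortable with a little code, you can automate that comparison. The sketch below is a rough illustration only: the URL and cookie are placeholders, and many retail sites render prices with JavaScript, so a real check may require a full browser. But the idea, fetching the same page with and without your tracking cookies, is exactly the manual test described above:

```python
# Rough sketch of a two-session price check. The URL and cookie are
# placeholders; sites that render prices in JavaScript need a real browser.
import re
from typing import Optional

import requests

PRODUCT_URL = "https://example.com/product/12345"   # placeholder URL

def first_price(html: str) -> Optional[str]:
    """Grab the first dollar amount on the page. A crude heuristic,
    good enough to illustrate the comparison."""
    match = re.search(r"\$\d[\d,]*\.?\d*", html)
    return match.group(0) if match else None

clean = requests.Session()                    # no cookies: the "incognito" view
tracked = requests.Session()                  # your normal, tracked view
tracked.cookies.set("session_id", "your-real-cookie-here")  # placeholder

price_clean = first_price(clean.get(PRODUCT_URL, timeout=10).text)
price_tracked = first_price(tracked.get(PRODUCT_URL, timeout=10).text)

print("Clean session:  ", price_clean)
print("Tracked session:", price_tracked)
if price_clean != price_tracked:
    print("Different prices for the same page: surveillance pricing in action.")
```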

The algorithms sorting your life right now were built to extract maximum value from your data. They were not built to be fair. They were not built to be accurate. They were built to make money for the companies that deploy them. Your personal information is the raw material, and these systems convert that information into prices designed to take as much from you as the data says you will tolerate.

You deserve to know what is in your files. You deserve to pay the same price as the person standing next to you. You deserve a fair shot at the apartment, the job, and the insurance rate you have earned. These are not partisan issues. These are American issues. And every single one of you has the power to start fighting back today.