Chapter 12: HIPAA Won’t Save Your Health Data

In March 2023, the Federal Trade Commission announced a $7.8 million settlement against BetterHelp, the online therapy platform millions of Americans turned to during the loneliest stretch of the pandemic. People had filled out intake questionnaires describing their depression, their suicidal thoughts, their medication histories, and their struggles with addiction. They answered deeply personal questions because they believed they were talking to a medical provider. They believed their answers were protected.

They were wrong.

BetterHelp had taken every email address belonging to every current and former client, more than seven million people, and uploaded those addresses to Facebook. Facebook matched over four million of them to social media profiles and served them targeted advertisements. Intake questionnaire responses about mental health conditions went to Snapchat, Pinterest, and Criteo. The platform displayed seals on its website suggesting HIPAA compliance. Those seals were meaningless. BetterHelp never qualified as a HIPAA covered entity. The people who poured their hearts out to a service promising confidential therapy received, on average, about ten dollars each from the settlement.

If you are thinking, wait, my health data is supposed to be protected by federal law, you are not alone. Surveys show roughly eight out of ten Americans believe the information they share with health apps falls under HIPAA. And if your doctor records your blood pressure during an office visit, they are right. The reading is protected. If you check your blood pressure at home using a consumer cuff synced to an app on your phone, the identical reading has zero federal protection. Same data. Same numbers. Entirely different legal reality.

This chapter is going to show you exactly where the line is, who is profiting from the confusion, and what you need to do right now to protect yourself and the people you love.

The Law Written for a World That No Longer Exists

Congress passed the Health Insurance Portability and Accountability Act in 1996. Bill Clinton was in the White House. Google did not exist. The iPhone would not arrive for another eleven years. HIPAA was designed to protect your medical records as they moved between your doctor, your insurance company, and the billing services that processed claims. The law applies to three categories of organizations: healthcare providers who transmit health information electronically, health plans like insurance companies and HMOs, and healthcare clearinghouses that process billing data. A fourth group, called business associates, includes vendors who handle protected health information on behalf of those organizations.

If a company does not fit into one of those categories, HIPAA does not apply. Period.

Think about what falls outside those categories in 2026. The meditation app you open before bed. The fertility tracker logging your menstrual cycle. The wearable on your wrist recording your heart rate, your sleep patterns, your blood oxygen levels, and your GPS coordinates during your morning run. The prescription discount service you used to save money on medication. The online therapy platform you turned to during a crisis. The genetic testing kit you sent off to learn about your ancestry. None of these are HIPAA covered entities. None of them are required to follow HIPAA rules. Your wellness app knows things about your body that your doctor does not know, and no federal privacy law governs what happens to that information.

Here is the staggering part. More than 350,000 health related apps are available across app stores right now. Around forty percent of American adults use some form of healthcare app. Roughly a third of all American women use a period tracking app. Over 200 million Americans wear some type of health monitoring device. The market for these apps and devices reached nearly nineteen billion dollars in 2024 and is projected to grow past sixty seven billion dollars within the next decade. Almost all of this data sits outside HIPAA. Almost none of it has federal protection.

The Misconception That Puts You at Risk

A 2023 survey of more than two thousand American adults found that 81 percent incorrectly believed health data they share with digital health apps is covered by HIPAA. Sixty eight percent said they were familiar with HIPAA. Most of them did not understand what the law actually covers. Nearly six out of ten people who used health apps had never once considered how the information they entered would be used.

When a separate survey told respondents that federal privacy laws do not cover health data downloaded to apps, concern about health data privacy nearly doubled, jumping from 35 percent to 62 percent. People are not apathetic about their health privacy. They are misinformed. They believe a shield exists when there is no shield at all.

One data management consultant put the situation plainly when she told Consumer Reports that HIPAA does not actually protect medical data in all circumstances. People assume sensitive data is protected because the information feels like something the law should cover. A former American Bar Association e-health privacy chair added that the only thing covering you when you use a Fitbit or a Garmin is the terms of service, and frankly, no one reads those.

Your Therapist Sold Your Secrets

The BetterHelp case was not an isolated incident. It was the visible tip of an entire industry operating without meaningful privacy guardrails.

In February 2023, the FTC brought its first ever enforcement action under the Health Breach Notification Rule against GoodRx, a prescription discount platform used by more than 55 million Americans. GoodRx had shared users' prescription medications, health conditions, and advertising identifiers with Facebook, Google, and other companies through tracking pixels embedded in its website and app. In one documented instance, GoodRx compiled lists of users who had purchased medications for heart disease and blood pressure, uploaded their email addresses and phone numbers to Facebook, and let Facebook match them to social media accounts for targeted advertising. The penalty was $1.5 million. That works out to less than three cents per user whose medication history was shared.

Cerebral, an online mental health provider, drew a $7.1 million penalty in April 2024 after sharing the sensitive health data of 3.2 million consumers with LinkedIn, Snapchat, and TikTok through tracking pixels. The data included names, addresses, medical histories, prescription information, insurance details, birthdates, and even religious beliefs and sexual orientation. The former CEO was personally named in the complaint. The FTC permanently banned Cerebral from using any personal information for advertising with third parties.

Monument, a New York based online alcohol addiction treatment service, received a $2.5 million penalty for sharing the names, alcohol consumption data, and medical histories of more than 100,000 people with Meta and Google. The penalty was suspended because the company could not afford to pay.

The ovulation tracker Premom shared menstrual cycle dates, pregnancy symptoms, hormone results, and precise geolocation with Google and two companies based in China. The data sharing continued until a journalist contacted the company for comment.

These are not fringe services run out of someone's garage. These are platforms that millions of Americans trusted with their most sensitive information.

When Your Hospital Sends Your Data to Facebook

In June 2022, a team of investigative journalists revealed that 33 of the top 100 hospitals in America had installed Meta Pixel, a tracking tool, on their websites. Every time someone clicked to schedule a doctor's appointment, Facebook received a data packet. Seven hospital systems had the tracker inside their password protected patient portals, meaning Facebook was receiving data about real patients in real time. The data flowing to Facebook included IP addresses, doctor names and medical specialties, health conditions searched including terms like pregnancy termination and Alzheimer's, medication names, appointment details, and in some cases patient names, email addresses, phone numbers, and zip codes.

A University of Pennsylvania study examined 3,747 hospital websites and found that 98.6 percent had at least one type of tracking code sending data to outside companies. Google received data from 98.5 percent of those websites. Facebook received data from 55.6 percent. Hospital home pages had a median of sixteen separate third party data transfers happening simultaneously. A follow up study of 100 hospitals in 2024 found 96 of them still transferring user information to third parties. As of that year, a third of healthcare websites still used Meta Pixel tracking code.

The financial consequences have been severe. Healthcare organizations across the country have paid more than $100 million in fines and settlements specifically tied to pixel tracking violations. Advocate Aurora Health, a 26 hospital system in Wisconsin and Illinois, settled for $12.225 million after its patient portal sent data on three million patients to Facebook. Novant Health paid $6.6 million after 1.36 million patients were affected. Mass General Brigham settled for $18.4 million. The list goes on.

The federal government tried to step in. In December 2022, the HHS Office for Civil Rights issued guidance stating that tracking pixels on hospital websites could constitute a HIPAA violation when IP addresses were combined with visits to pages about specific health conditions. The American Hospital Association sued. In June 2024, a federal judge in Texas struck down the guidance, calling it an overreach of executive power. HHS chose not to appeal. The practical result is that hospitals face fewer federal restrictions on using tracking technologies on their public facing webpages.

A consolidated class action against Meta over hospital pixel tracking remains active in a California federal court. In April 2025, a judge ordered Mark Zuckerberg to sit for a deposition, finding he was the final decision maker on all consequential privacy decisions at the company. Meta appealed. The court also considered sanctions against Meta for deleting data relevant to the case.

Your Body on a Billboard: Wearables and the Data They Collect

Look at your wrist. If you are wearing a smartwatch or fitness tracker, the device knows your heart rate right now. It knows your heart rate variability, your blood oxygen saturation, your sleep stages from last night, your stress level, your skin temperature, your respiratory rate, and every place you went today, down to the GPS coordinate.

Modern wearables sample your heart rate data between ten and one hundred times per second. A Samsung Galaxy Watch study showed that photoplethysmography signals were sampled every 100 milliseconds and transmitted to a server every thirty minutes. One 2025 analysis found that the Fitbit companion app collects up to 21 different categories of data, nearly double the industry average.
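
To make those rates concrete, here is a minimal back-of-the-envelope sketch in Python. The sampling and upload intervals come from the studies cited above; the two-byte sample size and the assumption of a constant rate are mine, for illustration only, not a description of any vendor's actual pipeline.

```python
# Rough arithmetic on wearable data volume, using the rates cited above.
# Assumes a constant 10 Hz rate (one PPG sample every 100 milliseconds)
# and an illustrative 2 bytes per sample; real devices and encodings vary.

SAMPLES_PER_SECOND = 10            # one sample every 100 milliseconds
UPLOAD_INTERVAL_SECONDS = 30 * 60  # server upload every thirty minutes
BYTES_PER_SAMPLE = 2               # hypothetical; actual encoding differs

samples_per_upload = SAMPLES_PER_SECOND * UPLOAD_INTERVAL_SECONDS
samples_per_day = SAMPLES_PER_SECOND * 24 * 60 * 60

print(f"Samples in each 30-minute upload: {samples_per_upload:,}")   # 18,000
print(f"Samples per day: {samples_per_day:,}")                       # 864,000
print(f"Raw bytes per day: {samples_per_day * BYTES_PER_SAMPLE:,}")  # ~1.7 MB
```

Run the numbers and a single optical sensor produces close to a million readings a day, before the device logs a single sleep stage, stress score, or GPS point.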

Google acquired Fitbit for $2.1 billion in 2021. At the time, Fitbit had roughly 28 million active users. A Google executive promised the deal was about devices, not data. The European Commission imposed conditions: Google pledged not to use Fitbit health, fitness, or location data for Google ads in Europe, committed to maintaining a separate data silo, and agreed to preserve third party access for ten years. The United States imposed no comparable conditions. Google later paid nearly $400 million to settle investigations by state attorneys general that found the company kept tracking users' locations even after they had turned location tracking off.

A 2025 analysis from University College Dublin ranked the privacy policies of seventeen wearable manufacturers. Apple scored among the lowest risk, with the shortest policy at 4,408 words, an emphasis on processing data on the device itself, and a stated commitment not to use health data for advertising. On the other end, Xiaomi received the highest risk designation with sixteen high risk ratings across twenty four criteria. Whoop, a brand popular among fitness enthusiasts, had the longest privacy policy at 12,125 words and clustered with the highest risk group.

The FBI Knows Your Heart Rate, Too

Wearable data is not just a privacy concern. It is a forensic tool.

In Connecticut, a man named Richard Dabate told police that a masked intruder shot and killed his wife Connie in their home in December 2015. Connie's Fitbit told a different story. The device showed she was moving around the house for approximately one hour after the time Richard claimed she had been killed. Digital evidence from a laptop and Facebook Messenger timestamps confirmed the timeline. Richard Dabate was convicted of murder in May 2022 and sentenced to 65 years in prison. The Connecticut Supreme Court upheld the conviction in 2025, specifically ruling that Fitbit data was scientifically reliable and properly admitted as evidence.

In San Jose, California, a woman named Karen Navarra was found dead in 2018. Her Fitbit Alta HR showed a significant spike in heart rate at 3:20 PM on September 8, followed by a rapid decline. By 3:28 PM, the device registered no heartbeat at all. Ring camera footage placed her stepfather's car in the driveway during that eight minute window. He was arrested for murder.

Police obtain wearable data through warrants or subpoenas served directly on device manufacturers. Your device does not need to be physically in law enforcement's hands. They go straight to the company.

The FDA Clears the Device, Not Your Privacy

The Apple Watch received FDA clearance for ECG and atrial fibrillation detection in 2018. Fitbit received ECG clearance in 2020. Samsung followed the same year. In 2024, the Apple Watch received clearance for sleep apnea detection. These devices are performing medical grade functions. They are generating the kind of data that a hospital would guard under HIPAA.

FDA clearance does not trigger HIPAA protection. The same atrial fibrillation reading that your cardiologist would treat as protected health information has no federal privacy protection when your Apple Watch records it and sends it to a server. The device crosses into medical territory. The privacy law does not follow.

When Your Mental Health Becomes a Product

The Mozilla Foundation ran the most thorough audit of mental health app privacy practices to date. In 2022, its Privacy Not Included team reviewed 32 mental health and prayer apps. Twenty eight of the 32 received a warning label. Twenty five failed minimum security standards. Only two apps earned a clean bill of health: PTSD Coach, developed by the Department of Veterans Affairs, and Wysa, an AI chatbot. Mozilla's lead researcher described the category as exceptionally creepy, saying these apps track, share, and capitalize on users' most intimate personal thoughts and feelings.

A year later, the picture had gotten worse. Nineteen of those 32 apps still carried the warning label, and 40 percent of them had degraded in their privacy practices since the previous year. Talkspace, one of the largest online therapy providers, buried a clause in its privacy policy allowing the company to use inferences about gender identity, sexual orientation, and depression for marketing purposes.

One of the most disturbing cases involved Crisis Text Line, a nonprofit providing free 24/7 text based crisis support for people in moments of acute mental health distress. The organization had spun off a for profit company called Loris.ai, retaining a 53 percent stake, and shared crisis conversation data with the spinoff. Loris.ai used that data to build customer service chatbot software. Users reaching out during their darkest moments had to agree to more than 4,000 words of terms of service before they could get help. A former board chair acknowledged knowing full well that no one would read those terms. The data sharing arrangement ended in January 2022 after journalists exposed the practice.

$275 for Five Thousand People's Mental Health Records

A Duke University researcher contacted 37 data brokers in 2023, asking to buy health data in bulk. Twenty six responded. Eleven were ready and willing to sell mental health records. One broker advertised the names and addresses of individuals with depression, bipolar disorder, anxiety, panic disorder, cancer, PTSD, and personality disorders, sorted by race and ethnicity. The price tag for 5,000 people's mental health profiles was $275. That works out to about five and a half cents per person.

The pipeline works like this. Health apps embed software development kits and health websites embed tracking pixels, both of which automatically transmit user data to advertising platforms and data brokers. When you open a health app, data flows. When you visit a health website, data flows. When you complete a questionnaire about your symptoms, data flows. An advertising identifier links all of this activity to your profile. A 2018 study found that more than 60 percent of Android apps tested shared data with Facebook the moment a user opened the app, regardless of whether the user even had a Facebook account. An investigation of 50 telehealth websites found that 49 of them shared user data with advertising platforms. Thirteen of those sites sent answers to health intake questionnaires to companies including Meta, Google, TikTok, and Snapchat.
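
To see how little machinery this takes, here is a simplified sketch of the kind of request a tracking pixel fires when a page loads. The endpoint, parameter names, and identifier below are hypothetical stand-ins, not any real ad platform's API. The point is the shape: the page you were on, an event name, and a persistent advertising identifier travel together in one ordinary web request.

```python
# A simplified illustration of a tracking-pixel request. The endpoint
# and parameter names are hypothetical, not any real ad platform's API,
# but the structure is typical: page context plus a persistent
# advertising identifier, sent as a single GET request.
from urllib.parse import urlencode

def build_pixel_url(page_url: str, event: str, ad_id: str) -> str:
    params = {
        "ev": event,     # e.g. "PageView" or "ScheduleAppointment"
        "dl": page_url,  # the page you were on, search terms included
        "uid": ad_id,    # advertising identifier linking this to you
    }
    return "https://tracker.example.com/collect?" + urlencode(params)

# A visit to a condition-specific page becomes a labeled event tied to
# a stable identifier, ready to be matched to an advertising profile.
print(build_pixel_url(
    "https://hospital.example.org/appointments?specialty=oncology",
    "ScheduleAppointment",
    "a1b2c3d4-advertising-id",
))
```

Nothing in that request looks like a medical record, which is exactly why it escapes HIPAA. It is just web traffic. It also happens to say you scheduled an oncology appointment.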

Blue Shield of California disclosed in 2025 that a Google Analytics misconfiguration had shared the data of 4.7 million patients with Google Ads. This was not a hack. A configuration error at one of the largest insurers in the state exposed millions.

Your Employer Wants Your Steps, Too

Employer wellness programs represent another gap. When your company runs a standalone wellness program outside its group health plan, the data collected does not fall under HIPAA. These programs gather biometric screening results, blood pressure, cholesterol, BMI, health risk assessments, smoking status, and wearable activity data. The American Medical Association has warned that there are no regulations stopping companies from using data like calorie intake, blood pressure, and weight to penalize patients.

John Hancock, the insurance company, announced in 2018 that it would only sell life insurance policies that track fitness and health data. No more traditional policies without monitoring. The Vitality program, now celebrating its tenth anniversary, collects steps, exercise intensity, annual health screening results, flu shots, and even healthy food purchases. Compatible devices include Apple Watch, Fitbit, Oura Ring, Garmin, and Whoop. Policyholders earn premium savings of up to 25 percent. UnitedHealthcare's UHC Rewards program, available to three million members, offers up to $1,000 per year for daily steps, sleep tracking, biometric screenings, and health surveys.

John Hancock's own FAQ states the company will not use Vitality data to change a policyholder's risk classification. Consumer advocates are watching carefully. The infrastructure for insurance surveillance is being built one step at a time. Right now, these programs offer rewards. The data keeps accumulating. Patterns of declining activity, irregular heart rhythms, poor sleep, and elevated stress sit on company servers alongside your name and policy number.

Your Cycle, Your Messages, Your Criminal Case

After the Supreme Court's Dobbs decision in June 2022 overturning Roe v. Wade, privacy experts raised immediate alarms about period tracking apps. Missed periods, pregnancy test results, and fertility data logged in these apps could be subpoenaed by law enforcement in states that criminalized abortion.

The warnings proved prescient. In Nebraska, a 17 year old named Celeste Burgess and her mother Jessica were investigated by local police. Officers served Facebook with a warrant, and Meta complied, turning over private messages in which Jessica coached Celeste on taking abortion pills. Both mother and daughter were convicted. This case demonstrated in stark terms that digital communications about reproductive health, completely outside HIPAA, can become criminal evidence.

As of mid 2024, no prosecutor was known to have subpoenaed period tracker app data specifically, though experts caution that prosecution timelines move slowly. A 2023 academic analysis of 35 period tracking apps found that 16 of them had privacy policies explicitly stating they could disclose personal data to law enforcement in response to subpoenas.

Some apps responded to the post Dobbs reality. Flo introduced an Anonymous Mode in late 2022, built with an encryption protocol stripping names, email addresses, and technical identifiers from health data. The feature won an industry privacy award. The mode remains opt in, meaning you have to know about the feature and turn it on yourself. Clue, a Berlin based app governed by the European Union's privacy law, publicly stated the company would refuse to share data with anyone, even in response to a legal subpoena. Post Dobbs, Clue saw a 2,200 percent increase in downloads over a single weekend. Apple added end to end encryption for health data synced through its cycle tracking feature.

States Are Moving. Congress Is Not.

The most aggressive response to the HIPAA gap has come from individual states. Washington State passed the My Health My Data Act in April 2023, and the state attorney general's office called it the first privacy focused law in the country designed to protect health data that falls outside HIPAA.

The law defines consumer health data broadly. It covers past, present, and future physical or mental health status, including medication purchases, biometric data, reproductive and sexual health information, and, critically, health information that algorithms derive or extrapolate from data that is not itself health data. The law requires opt in consent before collecting health data, demands separate consent before sharing it, requires written authorization before selling it, and gives consumers a broad right to delete. It includes a private right of action, meaning individuals can sue directly. It bans geofencing around healthcare facilities entirely, with no exceptions, to prevent anyone from tracking who visits a doctor's office or clinic.
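
Geofencing can sound abstract, so here is a minimal sketch of what the banned technique actually is: checking whether a device's reported coordinates fall within a set radius of a target location. The clinic coordinates and fence radius below are made up for illustration.

```python
# A minimal sketch of geofencing, the technique Washington's law bans
# around healthcare facilities. A geofence is just a distance check
# between a device's coordinates and a target location.
from math import radians, sin, cos, asin, sqrt

def haversine_meters(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(a))

# Hypothetical clinic location and fence radius, for illustration only.
CLINIC = (47.6097, -122.3331)
FENCE_RADIUS_METERS = 500

def inside_fence(device_lat, device_lon):
    return haversine_meters(device_lat, device_lon, *CLINIC) <= FENCE_RADIUS_METERS

print(inside_fence(47.6099, -122.3340))  # True: this device is near the clinic
```

Once location pings flow to an ad platform or data broker, running this check at scale is trivial, which is why the law bans the practice outright rather than trying to regulate it.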

Nevada passed a similar law in 2023 with attorney general enforcement only and no private right of action. Connecticut added health data provisions to its existing privacy act the same year. Maryland went further in 2024, passing the first state law to ban the sale of sensitive health data entirely, regardless of whether the consumer consents. Virginia passed a law in 2025 specifically restricting the disclosure and sale of reproductive and sexual health information, with statutory damages of at least $500 per violation. California's attorney general reached a $1.55 million settlement with a health media company in July 2025, treating the sharing of health article titles with advertisers as an impermissible disclosure of sensitive health data.

More than twenty states now have some form of privacy law treating health data as a sensitive category requiring extra protections. An eight state enforcement consortium formed in 2025 to coordinate privacy actions across borders.

Federal legislation keeps stalling. The American Data Privacy and Protection Act passed a House committee 53 to 2 in 2022 and never received a full vote. The American Privacy Rights Act did not advance beyond subcommittee in 2024. In November 2025, Senator Bill Cassidy, chair of the Senate HELP Committee, introduced the Health Information Privacy Reform Act. The bill would create a new category called Applicable Health Information covering data from apps and wearables, require consent before selling health data, mandate a right to deletion, and require companies to tell consumers when HIPAA does not protect them. As of early 2026, the bill remains in committee.

The result is a country where your health data rights depend on your zip code. A resident of Washington State has legal protections that a resident of Alabama does not. A Californian has tools that someone in Ohio lacks entirely.

192.7 Million Americans Exposed in a Single Breach

The largest health data breach in American history struck on February 21, 2024, when hackers broke into Change Healthcare, a clearinghouse that processes fifteen billion healthcare transactions every year. The attackers used stolen credentials on a server that did not have multifactor authentication enabled. The breach affected 192.7 million people, nearly two thirds of the entire United States population. Healthcare operations across the country ground to a halt for weeks. The total cost exceeded $2.4 billion.

The breach involved a HIPAA covered entity. The event received federal scrutiny. Now imagine the breach affecting a health app used by ten million Americans. No HIPAA obligation to notify patients. No HHS investigation. No federal consequence unless the FTC decides to act, and the current FTC has signaled a narrower enforcement approach under new leadership installed in 2025. The agency's two Democratic commissioners were dismissed in March 2025. Privacy enforcement observers expect the commission to pull back from the aggressive health data enforcement that produced the BetterHelp, GoodRx, and Cerebral settlements.

Meanwhile, breaches keep accelerating. In 2024, 742 large health data breaches exposed a record 288.9 million individual records. IBM's 2025 report found that healthcare breaches averaged $7.42 million each, making healthcare the most expensive sector for breaches for the fifteenth consecutive year. In 2025, Yale New Haven Health System disclosed a breach affecting 5.5 million people and reached an $18 million settlement just seven months later. Blue Shield of California's Google Analytics error affected 4.7 million. Aflac reported 13.9 million records compromised. The DaVita kidney care company was hit with ransomware. Frederick Health lost a million records.

What You Need to Do Right Now

The gap between what you believe about your health data and what the law actually says is one of the most dangerous privacy illusions in America. Here is what you need to do to protect yourself.

Audit every health app on your phone. Open your settings and look at every app related to health, fitness, menstrual tracking, medication, sleep, or mental wellness. For each one, go into the app's settings and look for privacy controls, data sharing options, and account deletion features. Turn off everything that is not essential to the app's core function. If the app does not give you clear controls, delete it and find an alternative that does.

Check your wearable's privacy settings. If you wear a Fitbit, Apple Watch, Garmin, Oura Ring, Whoop, or any other device, open the companion app and review what data you are sharing and with whom. Turn off features like location tracking during workouts if you do not need them. If your device offers on device processing instead of cloud syncing, choose local storage whenever possible.

Read the first three paragraphs of every health app's privacy policy. You do not need to read all 12,000 words. The first few paragraphs typically tell you whether the company shares data with third parties, whether it sells your information, and what rights you have to delete your data. If those first paragraphs do not answer those questions clearly, that tells you something important. You can also paste the full policy into an AI chatbot and ask it three questions: Does the company share my data with third parties? Does it sell my information? What rights do I have to block that sharing and delete my data?
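
If you would rather not eyeball policies one by one, a few lines of code can do the first pass for you. This is a crude keyword scan, a minimal sketch and nothing more. The phrase list is my own guess at red-flag language, and a match means read that section closely, not proof of wrongdoing.

```python
# A crude first-pass triage of a privacy policy: flag phrases that
# suggest sharing, selling, or advertising use. The phrase list is
# illustrative; a hit means "read this closely," nothing more.

RED_FLAGS = [
    "third parties",
    "third-party",
    "advertising partners",
    "sell your",
    "share your personal",
    "marketing purposes",
    "affiliates",
]

def triage(policy_text: str) -> list[str]:
    lowered = policy_text.lower()
    return [phrase for phrase in RED_FLAGS if phrase in lowered]

# Paste the policy text into this file before running.
policy = open("privacy_policy.txt").read()
hits = triage(policy)
if hits:
    print("Worth a closer read. Flagged phrases:", ", ".join(hits))
else:
    print("No obvious red flags, but absence of a match proves nothing.")
```

Treat the output the same way you would treat a chatbot's summary: a pointer to what deserves your attention, not a verdict.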

Use anonymous or privacy modes when available. Flo's Anonymous Mode, Apple's end to end encryption for health data, and similar features exist because these companies know the risk. Turn them on. Do not assume the default settings protect you.

Know your state's law. If you live in Washington, Nevada, Connecticut, Maryland, Virginia, or California, you have specific health data rights. Use them. Request deletion of your data from companies that no longer serve you. File complaints with your state attorney general if a company ignores your request.

Separate your health identity from your advertising identity. Use a dedicated email address for health apps that you do not use anywhere else. This makes it harder for data brokers to connect your health profile to your broader digital identity.

Talk to your family, especially your kids. Young people share health data freely through fitness challenges, mood tracking apps, and AI chatbots. They deserve to know that the app promising to help them manage anxiety might be selling their emotional state to an advertising platform.

The Illusion Ends When You See the Truth

The law that most Americans believe protects their health data was written three decades ago for a world of paper charts and fax machines. It was never designed to cover the smartwatch on your wrist, the therapy app on your phone, or the genetic data you spit into a tube and mailed to a company that later went bankrupt.

Eight out of ten Americans believe their health app data is protected. It is not. Data brokers sell mental health profiles for five cents a person. Hospitals sent your appointment details to Facebook through invisible tracking pixels. Insurance companies now require fitness data for their policies. AI therapy bots collect your deepest fears with minimal oversight. And 15 million Americans watched their DNA go through a bankruptcy auction.

This is not a story about technology. This is a story about you. Your blood pressure, your menstrual cycle, your therapy sessions, your heart rate at three in the morning, your prescription for antidepressants, your genetic predisposition to breast cancer. Every piece of this information has value to someone. And right now, the law treats most of it as fair game.

You deserve better than a privacy illusion. You deserve actual protection. Until Congress acts, you are the last line of defense for your own health data. Start today.