Five universal categories anchored to U.S. federal law. Eight conditional overlays that snap on when the policy itself triggers them. One rule about disclaimers — borrowed from the largest COPPA penalty in FTC history.
Every privacy policy is graded against five universal categories. Each one is anchored to a specific federal statute that defines what good practice looks like. Truthfulness carries extra weight because it's the universal backbone — the FTC enforces every other privacy promise through it.
Weight · 25%. Are commitments specific and verifiable, or hedged behind "may," "such as," and "including but not limited to"? Vague hedging is a red flag because it makes deception claims harder to prove against the company.
Weight · 18.75%. Does the policy enumerate categories of data collected, specific purposes for each category, retention periods, and categories of recipients? Bundled or open-ended descriptions score low.
Weight · 18.75%. Does the policy give actionable rights — access, correction, deletion, portability, objection — with realistic timelines (≤30 days) and a working contact mechanism? "As required by law" doesn't count.
Weight · 18.75%. Are specific safeguards described — encryption standards, access controls, auditing? Is there a breach notification process matching the federal 60-day benchmark, with a named contact? "Industry-standard" by itself is generic.
Weight · 18.75%. Does the policy describe a verifiable parental consent (VPC) mechanism? Parental access and deletion rights? A ban on behavioral advertising to minors? Disclaimers don't earn full credit. See the Epic Rule below.
Truthfulness × 25% + each remaining category × 18.75% = composite score on a 0–10 scale. A: 8.5+ · B: 7.0+ · C: 5.5+ · D: 4.0+ · F: below 4.0.
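The weighted composite can be sketched in a few lines. This is an illustrative sketch, not Privacy Beat's implementation: the category keys and sample sub-scores are assumptions, while the weights and grade cutoffs come straight from the rubric.

```python
# Weights and letter-grade cutoffs as stated in the rubric.
WEIGHTS = {
    "truthfulness": 0.25,       # the universal backbone carries extra weight
    "data_collection": 0.1875,
    "user_rights": 0.1875,
    "security": 0.1875,
    "children": 0.1875,
}
GRADE_CUTOFFS = [(8.5, "A"), (7.0, "B"), (5.5, "C"), (4.0, "D")]

def composite(scores: dict[str, float]) -> float:
    """Weighted sum of 0-10 category sub-scores."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

def letter(score: float) -> str:
    """Map a composite score to a letter grade; below 4.0 is an F."""
    for cutoff, grade in GRADE_CUTOFFS:
        if score >= cutoff:
            return grade
    return "F"

# Hypothetical sub-scores for one policy.
scores = {"truthfulness": 9.0, "data_collection": 7.0,
          "user_rights": 6.0, "security": 8.0, "children": 5.0}
print(composite(scores), letter(composite(scores)))  # → 7.125 B
```

Because Truthfulness alone carries 25%, a strong or weak score there moves the composite more than any other single category.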
U.S. federal privacy law is sectoral — each statute covers a specific industry, data type, or harm. Eight overlays snap on when the policy itself triggers them. A banking app gets the GLBA overlay. A telehealth app gets HIPAA. A messaging app gets ECPA/SCA. We never invent triggers — if the policy doesn't mention SMS marketing, the TCPA overlay doesn't apply.
Each triggered overlay returns a status — compliant, partial, non-compliant, or unclear — plus one to three specific findings. Multiple overlays can apply to a single policy. A connected health-and-fitness app with SMS reminders and email marketing gets HIPAA, TCPA, and CAN-SPAM all at once.
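The trigger logic can be sketched as a keyword scan over the policy's own text. The overlay names mirror the examples above, but the trigger phrases here are hypothetical stand-ins, not Privacy Beat's actual trigger set:

```python
# Hypothetical trigger phrases: an overlay snaps on only if the policy's own
# text mentions one of them -- triggers are never invented.
OVERLAY_TRIGGERS = {
    "HIPAA": ["health", "medical", "telehealth"],
    "GLBA": ["bank", "loan", "financial account"],
    "TCPA": ["sms", "text message", "autodial"],
    "CAN-SPAM": ["email marketing", "newsletter"],
    "ECPA/SCA": ["messaging", "stored communications"],
}

def triggered_overlays(policy_text: str) -> list[str]:
    """Return every overlay whose trigger phrase appears in the policy."""
    text = policy_text.lower()
    return [name for name, terms in OVERLAY_TRIGGERS.items()
            if any(term in text for term in terms)]

policy = "We send SMS reminders and email marketing about your health plan."
print(triggered_overlays(policy))  # → ['HIPAA', 'TCPA', 'CAN-SPAM']
```

Multiple overlays firing at once is the expected case, as in the health-and-fitness example: one policy, three statutes, three separate status findings.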
In 2022 the FTC fined Epic Games $275 million over Fortnite — the largest COPPA penalty in U.S. history. Epic's privacy policy disclaimed responsibility for users under 13. Children played anyway. The FTC ruled that the disclaimer didn't matter; the actual user base did.
A blanket "we don't direct services to children" disclaimer is not enough when the actual user base includes minors. The policy has to do the work — verifiable parental consent, parental access and deletion, no behavioral ads to kids — or the score reflects what's missing.
Privacy Beat applies this to every analysis. If a service is plausibly used by minors and the policy disclaims child-directed status without a working VPC mechanism, the Children & Vulnerable Users score is capped — regardless of how confident the disclaimer sounds.
The corollary is also true: a policy that explicitly addresses COPPA and describes a working VPC mechanism never scores below 5 in this category, even if other parts of the policy are weak. The Epic Rule cuts both ways.
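The cap-and-floor behavior described above might be sketched as follows. The floor of 5 is stated in the rubric; the cap value and the parameter names are illustrative assumptions:

```python
def apply_epic_rule(raw_score: float, *, plausibly_used_by_minors: bool,
                    disclaims_child_directed: bool, addresses_coppa: bool,
                    has_working_vpc: bool) -> float:
    """Adjust a 0-10 Children & Vulnerable Users sub-score per the Epic Rule."""
    FLOOR = 5.0  # stated in the rubric: COPPA addressed + working VPC never scores below 5
    CAP = 3.0    # illustrative assumption; the rubric doesn't publish the cap value
    if addresses_coppa and has_working_vpc:
        return max(raw_score, FLOOR)  # the rule cuts both ways
    if plausibly_used_by_minors and disclaims_child_directed and not has_working_vpc:
        return min(raw_score, CAP)    # confident disclaimer, no VPC: capped
    return raw_score

# A confident disclaimer without VPC is capped; a weak policy with real VPC is floored.
print(apply_epic_rule(8.0, plausibly_used_by_minors=True,
                      disclaims_child_directed=True,
                      addresses_coppa=False, has_working_vpc=False))  # → 3.0
print(apply_epic_rule(2.0, plausibly_used_by_minors=False,
                      disclaims_child_directed=False,
                      addresses_coppa=True, has_working_vpc=True))    # → 5.0
```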
Any scoring system has limits. These are ours. We surface them up front so the score means what it says — and nothing more.
Almost every privacy-policy analyzer on the market grades against a blend of GDPR (Europe), CCPA (California), and various other state and international frameworks. Privacy Beat anchors specifically to U.S. federal law. There's a reason for that.
U.S. federal law is the floor, not the ceiling. European and state laws are often stricter, but they are also bounded — GDPR doesn't reach a U.S. company without EU users, and CCPA doesn't reach a Floridian. Every U.S. consumer-facing company is reachable by the FTC under Section 5 of the FTC Act. Every health-adjacent product is reachable by HIPAA. Every fintech is reachable by GLBA. The federal framework is the universal common denominator for any service operating in the United States.
Federal law has the deepest enforcement record. The FTC has brought hundreds of privacy and data-security actions under §5. HHS-OCR has issued more than $161 million in HIPAA penalties. The largest COPPA penalty was $275 million; the largest privacy penalty in FTC history was $5 billion. When we say a policy fails a particular check, there is concrete enforcement history showing what failure costs.
And federal law gives us a coherent narrative. The U.S. takes a sectoral approach — different statutes for different industries and harms. Reading them together as a system, instead of as a pile of separate rules, lets us tell users a clear story about what their privacy is and isn't worth in the eyes of American law.
For the deeper reference list, see our companion document tracking all fifteen-plus federal statutes and the case law behind each one. The rubric draws on five primary anchors and eight conditional overlays, with three more statutes mentioned for completeness.
The United States Constitution does not contain the word privacy. The Supreme Court has spent more than a century reading it into the document — through the First, Third, Fourth, and Fifth Amendments — case by case. Federal statutes are the patches Congress laid over the gap that doctrine left behind. We score against the patches.
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized. U.S. Const. amend. IV · Ratified December 15, 1791
The Fourth Amendment is the strongest privacy anchor in the Constitution. Originally a check on physical intrusion — soldiers entering homes, officers seizing papers — it has been expanded by case law to cover wherever a person has a reasonable expectation of privacy, the test announced in Katz v. United States (1967).
For most of American life, those amendments would be enough. In the digital economy, they aren't — because of a single doctrine the Supreme Court built between 1976 and 1979. The third-party doctrine says that when you voluntarily share information with a third party — your bank, your phone company, your email provider, your cloud — you forfeit Fourth Amendment protection over it. The cases: United States v. Miller (1976), covering bank records, and Smith v. Maryland (1979), covering dialed phone numbers.
Carpenter narrowed the doctrine in 2018 for cell-site location data, but didn't overturn it. The doctrine still controls almost everything else — your search history, your purchase records, your location data sold by data brokers, the messages your apps relay. In a world where nearly every action is mediated by a third-party service, the third-party doctrine reads the Fourth Amendment out of most of modern life.
Bailment is a centuries-old common-law concept. When you hand your car to a valet, the valet has the keys; you still own the car. When the post office carries your sealed letter, the letter is still yours. Ownership stays with the original owner — the bailor — and the bailee just has temporary possession for a limited purpose.
Applied to digital data: when you upload photos to a cloud service, the photos are still yours. When you send an email through a provider, the email is still yours. When a service stores your files, the files are still yours. Government seizure of that data should trigger Fourth Amendment scrutiny because the data is your "papers and effects" in the constitutional sense — regardless of who is physically holding it.
Justice Gorsuch made exactly this argument in his Carpenter dissent — which read more like a concurrence on different grounds. Property rights and bailment, he argued, were the proper basis for protecting Carpenter's data. Legal scholars at the American Enterprise Institute and litigators at the Institute for Justice have built on that framework. New Hampshire HB 1436 attempted to codify it. The federal Surveillance Accountability Act (H.R. 8470) attacks the data-broker loophole the doctrine created.
Privacy Beat exists because of the gap between what the Fourth Amendment promises and what federal sectoral statutes actually deliver. The rubric scores a privacy policy on whether it acknowledges, in operational terms, the responsibilities a company has when it holds someone else's data. A policy that treats user data as the company's own resource — to share, sell, or surrender on demand — denies bailment in everything but name. A policy that treats user data as belonging to the user, with specific limits on use, retention, and disclosure, lives closer to the constitutional principle the Founders described in 1791.
The grade is shorthand for that distance.