▣ The Privacy Beat · Methodology v1

How we score privacy.

Five universal categories anchored to U.S. federal law. Eight conditional overlays that snap on when the policy itself triggers them. One rule about disclaimers — borrowed from the largest COPPA penalty in FTC history.

▢ Contents
  1. What we score
  2. The conditional overlays
  3. The Epic Rule
  4. What we don't claim
  5. Why federal law
  6. The constitutional gap

§ 01

What we score

Every privacy policy is graded against five universal categories. Each one is anchored to a specific federal statute that defines what good practice looks like. Truthfulness carries extra weight because it's the universal backbone — the FTC enforces every other privacy promise through it.

Category 01

Truthfulness & Specificity

Anchor · 15 U.S.C. § 45 (FTC Act §5)

Are commitments specific and verifiable, or hedged behind "may," "such as," and "including but not limited to"? Vague hedging is a red flag because it makes deception claims harder to prove against the company.

Weight · 25%
Category 02

Data Minimization & Purpose Limitation

Anchor · 15 U.S.C. § 6801 (GLBA model)

Does the policy enumerate categories of data collected, specific purposes for each category, retention periods, and categories of recipients? Bundled or open-ended descriptions score low.

Weight · 18.75%
Category 03

Individual Rights

Anchor · 15 U.S.C. § 1681 + HIPAA

Does the policy give actionable rights — access, correction, deletion, portability, objection — with realistic timelines (≤30 days) and a working contact mechanism? "As required by law" doesn't count.

Weight · 18.75%
Category 04

Security & Breach Handling

Anchor · HITECH 60-day standard

Are specific safeguards described — encryption standards, access controls, audit logging? Is there a breach notification process matching the federal 60-day benchmark, with a named contact? "Industry-standard" by itself is generic.

Weight · 18.75%
Category 05

Children & Vulnerable Users

Anchor · 15 U.S.C. § 6501 (COPPA)

Does the policy describe a verifiable parental consent (VPC) mechanism? Parental access and deletion rights? A ban on behavioral advertising to minors? Disclaimers don't earn full credit. See the Epic Rule below.

Weight · 18.75%
The Composite

How the grade is computed

Letter Grade · A through F

Truthfulness × 25% + each remaining category × 18.75% = composite score on a 0–10 scale. A: 8.5+ · B: 7.0+ · C: 5.5+ · D: 4.0+ · F: below 4.0.
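The weighted composite above can be sketched in a few lines of Python. This is an illustration of the arithmetic, not the analyzer's actual implementation — the function and dictionary names are hypothetical, and category scores are assumed to arrive on the 0–10 scale the rubric uses.

```python
# Hypothetical sketch of the composite grade. Weights mirror the rubric:
# Truthfulness carries 25%, the other four categories 18.75% each.
WEIGHTS = {
    "truthfulness": 0.25,    # universal backbone, extra weight
    "minimization": 0.1875,
    "rights": 0.1875,
    "security": 0.1875,
    "children": 0.1875,
}

def composite(scores: dict[str, float]) -> float:
    """Weighted sum of category scores (each on a 0-10 scale)."""
    return sum(scores[cat] * w for cat, w in WEIGHTS.items())

def letter(score: float) -> str:
    """Map a 0-10 composite to the letter-grade bands."""
    for cutoff, grade in [(8.5, "A"), (7.0, "B"), (5.5, "C"), (4.0, "D")]:
        if score >= cutoff:
            return grade
    return "F"

example = {
    "truthfulness": 8.0,
    "minimization": 6.0,
    "rights": 7.0,
    "security": 5.0,
    "children": 4.0,
}
print(composite(example))          # 6.125
print(letter(composite(example)))  # C
```

Note how the extra weight on Truthfulness plays out: a policy that hedges everything drags the whole grade down even when the other categories are solid.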

§ 02

The conditional overlays

U.S. federal privacy law is sectoral — each statute covers a specific industry, data type, or harm. Eight overlays snap on when the policy itself triggers them. A banking app gets the GLBA overlay. A telehealth app gets HIPAA. A messaging app gets ECPA/SCA. We never invent triggers — if the policy doesn't mention SMS marketing, the TCPA overlay doesn't apply.

HIPAA / HITECH
42 U.S.C. § 1320d et seq.
Triggered by health, medical, wellness, fitness, telemedicine, pharmacy, mental health, biometric health data.
GLBA
15 U.S.C. § 6801 et seq.
Triggered by banking, lending, payments, mortgage, insurance, financial advice, credit, investment.
FCRA
15 U.S.C. § 1681 et seq.
Triggered by background checks, tenant screening, employment screening, credit reports, people search, identity verification.
COPPA Detail
15 U.S.C. §§ 6501-6506
Triggered by any minor user, schools, family content, teen-focused services. See the Epic Rule.
ECPA / SCA
18 U.S.C. §§ 2510, 2701
Triggered by messaging, email, content scanning, collaboration tools, document storage.
TCPA
47 U.S.C. § 227
Triggered by SMS marketing, automated calls, robocalls, push notifications via SMS.
CAN-SPAM
15 U.S.C. §§ 7701-7713
Triggered by email marketing, newsletters, promotional emails.
VPPA
18 U.S.C. § 2710
Triggered by video streaming, video hosting, video courses, video newsletters.

Each triggered overlay returns a status — compliant, partial, non-compliant, or unclear — plus one to three specific findings. Multiple overlays can apply to a single policy. A connected health-and-fitness app with SMS reminders and email marketing gets HIPAA, TCPA, and CAN-SPAM all at once.
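The trigger-then-apply flow can be sketched as a keyword check over the policy text itself, which also enforces the "we never invent triggers" rule by construction. A minimal illustration, not the analyzer's real implementation: the trigger lists are abridged from the table above, and the names are hypothetical.

```python
# Illustrative overlay triggering: an overlay applies only if the policy's
# own text mentions one of its trigger terms. Lists abridged for brevity.
OVERLAY_TRIGGERS = {
    "HIPAA/HITECH": ["health", "medical", "wellness", "telemedicine"],
    "GLBA": ["banking", "lending", "payments", "insurance"],
    "TCPA": ["sms marketing", "automated calls", "robocall"],
    "CAN-SPAM": ["email marketing", "newsletter", "promotional email"],
    "VPPA": ["video streaming", "video hosting"],
}

def triggered_overlays(policy_text: str) -> list[str]:
    """Return the overlays whose trigger terms appear in the policy."""
    text = policy_text.lower()
    return [
        overlay
        for overlay, terms in OVERLAY_TRIGGERS.items()
        if any(term in text for term in terms)
    ]

policy = "We send SMS marketing reminders and a weekly newsletter about wellness."
print(triggered_overlays(policy))
# ['HIPAA/HITECH', 'TCPA', 'CAN-SPAM']
```

A policy that never mentions SMS simply never enters the TCPA branch — the overlay list is driven entirely by the document being scored.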

§ 03

The Epic Rule

In 2022 the FTC fined Epic Games $275 million over Fortnite — the largest COPPA penalty in U.S. history. Epic's privacy policy disclaimed responsibility for users under 13. Children played anyway. The FTC's position was that the disclaimer didn't matter; the actual user base did.

A blanket "we don't direct services to children" disclaimer is not enough when the actual user base includes minors. The policy has to do the work — verifiable parental consent, parental access and deletion, no behavioral ads to kids — or the score reflects what's missing.

FTC v. Epic Games · December 2022 · $275M COPPA + $245M dark patterns

Privacy Beat applies this to every analysis. If a service is plausibly used by minors and the policy disclaims child-directed status without a working VPC mechanism, the Children & Vulnerable Users score is capped — regardless of how confident the disclaimer sounds.

The corollary is also true: a policy that explicitly addresses COPPA and describes a working VPC mechanism never scores below 5 in this category, even if other parts of the policy are weak. The Epic Rule cuts both ways.
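Both directions of the Epic Rule can be expressed as score post-processing. This is a sketch under stated assumptions — the Children & Vulnerable Users score is on the 0–10 scale, the floor of 5 is stated in the methodology, and the cap value and flag names are illustrative, not the analyzer's actual parameters.

```python
# Sketch of the Epic Rule applied to the Children & Vulnerable Users score.
EPIC_CAP = 3.0  # hypothetical ceiling when a disclaimer stands in for VPC

def apply_epic_rule(
    children_score: float,
    minors_plausible: bool,        # service is plausibly used by minors
    disclaims_child_directed: bool,  # "we don't direct services to children"
    has_working_vpc: bool,         # verifiable parental consent mechanism
    addresses_coppa: bool,         # policy explicitly addresses COPPA
) -> float:
    # Cap: a disclaimer without a working VPC mechanism cannot save the score.
    if minors_plausible and disclaims_child_directed and not has_working_vpc:
        return min(children_score, EPIC_CAP)
    # Floor: explicit COPPA treatment plus a working VPC never scores below 5.
    if addresses_coppa and has_working_vpc:
        return max(children_score, 5.0)
    return children_score

# Disclaimer-only policy: an otherwise-high 8.0 is capped.
print(apply_epic_rule(8.0, True, True, False, False))  # 3.0
# Working VPC: a weak 3.0 is floored at 5.0.
print(apply_epic_rule(3.0, True, False, True, True))   # 5.0
```

The cap branch is checked first, so a confident disclaimer can never route around the rule by also name-dropping COPPA.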

§ 04

What we don't claim

Any scoring system has limits. These are ours. We surface them up front so the score means what it says — and nothing more.

  1. We score documents, not companies. A great company with a thin policy still scores low. A bad actor with a beautifully drafted policy can score high. The grade reflects what the policy promises and how clearly it promises it — not what the company actually does behind the scenes. Enforcement actions tell a different story; we link to them where they exist.
  2. We don't tell you to stop using a service. A low score is information, not a recommendation. Many people will read a D-grade policy and decide the service is still worth it. That's a personal trade-off. Our job is to make the trade-off visible.
  3. We're not your lawyer. Every score and finding is a factual statement about the policy text, not a legal conclusion. We don't determine what violates federal law. We don't determine fair use, fiduciary duty, or contractual standing. If you need legal advice about a specific situation, talk to a lawyer.
  4. The methodology evolves. Federal privacy law is moving — the SECURE Data Act is pending, FISA Section 702 is in flux, and case law on the third-party doctrine is shifting. The rubric will change as the law does. We version every methodology update and keep prior versions readable.
  5. We can be wrong. AI analysis at scale produces edge cases. If a policy says something the analyzer missed, or scores something we got wrong, tell us — every report has a feedback link. Our own privacy policy is scored by the same rubric, with the same honest weaknesses surfaced. See the Transparency Report.
§ 05

Why federal law

Almost every privacy-policy analyzer on the market grades against a blend of GDPR (Europe), CCPA (California), and various other state and international frameworks. Privacy Beat anchors specifically to U.S. federal law. There's a reason for that.

U.S. federal law is the floor, not the ceiling. European and state laws are often stricter, but they are also bounded — GDPR doesn't reach a U.S. company without EU users, and CCPA doesn't reach a Floridian. Every U.S. consumer-facing company is reachable by the FTC under Section 5 of the FTC Act. Every health-adjacent product is reachable by HIPAA. Every fintech is reachable by GLBA. The federal framework is the universal common denominator for any service operating in the United States.

Federal law has the deepest enforcement record. The FTC has brought hundreds of privacy and data-security actions under §5. HHS-OCR has issued more than $161 million in HIPAA penalties. The largest COPPA penalty was $275 million; the largest privacy penalty in FTC history was $5 billion. When we say a policy fails a particular check, there is concrete enforcement history showing what failure costs.

And federal law gives us a coherent narrative. The U.S. takes a sectoral approach — different statutes for different industries and harms. Reading them together as a system, instead of as a pile of separate rules, lets us tell users a clear story about what their privacy is and isn't worth in the eyes of American law.

For the deeper reference list, see our companion document tracking all fifteen-plus federal statutes and the case law behind each one. The rubric draws on five primary anchors and eight conditional overlays.

§ 06 · Closing

The constitutional gap

The United States Constitution does not contain the word privacy. The Supreme Court has spent more than a century reading it into the document — through the First, Third, Fourth, and Fifth Amendments — case by case. Federal statutes are the patches Congress laid over the gap that doctrine left behind. We score against the patches.

The Fourth Amendment, in full

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized. U.S. Const. amend. IV · Ratified December 15, 1791

How the Court has read it

The Fourth Amendment is the strongest privacy anchor in the Constitution. Originally a check on physical intrusion — soldiers entering homes, officers seizing papers — it has been expanded by case law to cover wherever a person has a reasonable expectation of privacy, the test announced in Katz v. United States (1967) and carried into digital location records in Carpenter v. United States (2018).

The doctrine that hollowed it out

For most of American history, those amendments would be enough. In the digital economy, they aren't — because of a single doctrine the Supreme Court built between 1976 and 1979. The third-party doctrine says that when you voluntarily share information with a third party — your bank, your phone company, your email provider, your cloud — you forfeit Fourth Amendment protection over it. The foundational cases are United States v. Miller (1976), which held bank records unprotected, and Smith v. Maryland (1979), which extended the rule to dialed phone numbers.

Carpenter narrowed the doctrine in 2018 for cell-site location data, but didn't overturn it. The doctrine still controls almost everything else — your search history, your purchase records, your location data sold by data brokers, the messages your apps relay. In a world where nearly every action is mediated by a third-party service, the third-party doctrine reads the Fourth Amendment out of most of modern life.

The alternative: bailment

Your data is your data — even when someone else is holding it.

Bailment is a centuries-old common-law concept. When you hand your car to a valet, the valet has the keys; you still own the car. When the post office carries your sealed letter, the letter is still yours. Ownership stays with the original owner — the bailor — and the bailee just has temporary possession for a limited purpose.

Applied to digital data: when you upload photos to a cloud service, the photos are still yours. When you send an email through a provider, the email is still yours. When a service stores your files, the files are still yours. Government seizure of that data should trigger Fourth Amendment scrutiny because the data is your "papers and effects" in the constitutional sense — regardless of who is physically holding it.

Justice Gorsuch made exactly this argument in his Carpenter dissent — which read more like a concurrence on different grounds. Property rights and bailment, he argued, were the proper basis for protecting Carpenter's data. Legal scholars at the American Enterprise Institute and litigators at the Institute for Justice have built on that framework. New Hampshire HB 1436 attempted to codify it. The federal Surveillance Accountability Act (H.R. 8470) attacks the data-broker loophole the doctrine created.

Why this frames the rubric

Privacy Beat exists because of the gap between what the Fourth Amendment promises and what federal sectoral statutes actually deliver. The rubric scores a privacy policy on whether it acknowledges, in operational terms, the responsibilities a company has when it holds someone else's data. A policy that treats user data as the company's own resource — to share, sell, or surrender on demand — denies bailment in everything but name. A policy that treats user data as belonging to the user, with specific limits on use, retention, and disclosure, lives closer to the constitutional principle the Founders described in 1791.

The grade is shorthand for that distance.

Not legal advice

The Privacy Beat is not a law firm. The methodology described here is a research-and-product framework for analyzing publicly available privacy policies. Every score is a factual statement about policy text, not a legal conclusion. Case citations have been verified against publicly available Supreme Court opinions, but legal interpretation is the job of attorneys. If you have a specific legal question — about your company's compliance, your rights as a consumer, or the application of any statute mentioned here — consult qualified counsel.

The methodology evolves with the law and with feedback from users and experts. Send corrections, disagreements, and suggestions through the contact link on the analyzer.