The Federal Trade Commission has dramatically escalated privacy enforcement in 2024-2025, bringing landmark cases against health apps, data brokers, connected vehicles, and major tech companies. Understanding FTC enforcement is critical for CIPP/US exam success—it represents 20-24% of the exam and demonstrates how Section 5 of the FTC Act operates in practice. This comprehensive guide analyzes every major enforcement action, emerging enforcement theories, and the patterns you need to know for the exam.
FTC enforcement appears throughout the CIPP/US exam:
- Domain I: FTC authority and enforcement mechanisms
- Domain II: Sector-specific enforcement (20-24% of exam - largest domain)
- Domain IV: Workplace monitoring and AI in employment
- Domain V: Coordination with state AGs on state law enforcement
Exam tip: Know specific cases, penalties, and legal theories—not just general principles.
FTC Section 5 Authority: Foundation of Privacy Enforcement
Before diving into specific cases, understanding the FTC's legal authority is essential for both practice and the exam.
Section 5 of the FTC Act: The Privacy Workhorse
Legal Framework:
- Statute: 15 U.S.C. § 45(a) prohibits "unfair or deceptive acts or practices in or affecting commerce"
- Jurisdiction: Applies to most businesses (exceptions: banks, insurance companies, nonprofits, common carriers)
- No Need for Actual Harm: FTC can act on likelihood of harm, not just proven injury
- Broad Interpretation: FTC has expansively interpreted Section 5 to cover data privacy and security
Deceptive Practices Standard:
- Representation, omission, or practice that is likely to mislead
- Consumer interpretation is reasonable under the circumstances
- Misleading representation is material (likely to affect consumer decisions)
Unfair Practices Standard (3-prong test):
- Causes or is likely to cause substantial injury to consumers
- Injury is not reasonably avoidable by consumers
- Injury is not outweighed by countervailing benefits to consumers or competition
FTC Enforcement Tools
- Consent Orders: Settlements without admission of liability (most common outcome)
- Civil Penalties: Up to $50,120 per violation as of 2023 (the maximum is adjusted annually for inflation)
- Injunctive Relief: Orders to stop illegal practices
- Consumer Redress: Refunds to harmed consumers
- Data Deletion Orders: "Algorithmic disgorgement" - delete data and algorithms derived from it
- Monitoring: Compliance monitoring periods (typically 10-20 years)
- Privacy Program Requirements: Mandated comprehensive privacy programs
2024-2025 Enforcement Trends: The Big Picture
Five Priority Enforcement Areas (2024-2025):
- Health Privacy: Digital health apps, telehealth, health data brokers
- Location Data: Precise geolocation tracking, data brokers, connected vehicles
- Children's Privacy (COPPA): Social media, AI toys, teen-targeted apps
- Data Security: Unreasonable security practices, breach notification failures
- AI & Automated Decision-Making: Deceptive AI claims, discriminatory algorithms
Major Health Privacy Cases: Digital Health Under Scrutiny
- GoodRx: $1.5 million penalty (first HBNR enforcement)
- BetterHelp: $7.8 million (first consumer refunds for health data breach)
- Combined: ~$9.3 million in penalties and consumer redress
- Message: Health data sharing for advertising violates Section 5
Case Study 1: GoodRx Holdings ($1.5 Million - February 2023)
GoodRx Holdings, Inc.
Industry: Prescription drug discount provider, telehealth
Penalty: $1.5 million civil penalty
Date: February 1, 2023
Legal Basis: Section 5 (deceptive practices) + Health Breach Notification Rule
Allegations:
- Shared consumers' prescription medication and health condition data with Facebook, Google, and Criteo
- Used third-party tracking pixels and SDKs that transmitted sensitive health information
- Privacy policy promised to limit sharing of personal health information
- Failed to notify consumers of data breaches under Health Breach Notification Rule
- Shared data included: prescription medications, health conditions, phone numbers
Settlement Terms:
- $1.5 million civil penalty
- Permanent ban on sharing health data for advertising purposes
- Must obtain affirmative express consent before sharing health data
- Direct third parties to delete previously shared health data
- Implement comprehensive privacy program
- Cannot misrepresent data practices
Significance:
- First-ever FTC enforcement of the Health Breach Notification Rule, which had been on the books since 2009
- Established that sharing data with advertisers = "breach" under HBNR
- Set precedent for non-HIPAA entities handling health information
- Tracking pixels deemed data sharing mechanism subject to consent requirements
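To see why tracking pixels count as data sharing, it helps to look at what a pixel request actually transmits. The sketch below is purely illustrative (the host and parameter names are hypothetical, not GoodRx's or any real platform's implementation): a pixel is just a tiny image or script request, and the sensitive disclosure rides along in the query string or referrer, which often contains the page URL naming a drug or condition.

```python
from urllib.parse import urlencode

# Illustrative only: hypothetical tracker host and parameter names.
# The privacy risk lies in the request's query string, which carries
# page context to the ad platform.
def build_pixel_url(tracker_host: str, page_path: str, user_event: str) -> str:
    params = urlencode({
        "ev": user_event,   # e.g. "PageView" or a custom event name
        "dl": page_path,    # the page URL often names the drug or condition
    })
    return f"https://{tracker_host}/tr?{params}"

url = build_pixel_url(
    "tracker.example.com",
    "https://pharmacy.example.com/coupons/metformin",
    "PageView",
)
# The ad platform receives the medication name embedded in the page URL,
# even though the user never submitted a form.
print(url)
```

No form submission or explicit "share" action is needed, which is why the FTC treats pixel integrations as disclosures requiring consent.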
Case Study 2: BetterHelp, Inc. ($7.8 Million - March 2023)
BetterHelp, Inc.
Industry: Online mental health counseling
Penalty: $7.8 million (consumer refunds)
Date: March 2, 2023 (finalized July 14, 2023)
Legal Basis: Section 5 (deceptive practices)
Allegations:
- Shared sensitive mental health data with Facebook, Snapchat, Pinterest, and Criteo
- Used hashed email addresses tied to mental health questionnaire responses
- Promised consumers data would only be used for counseling services
- Shared data used for targeted advertising and lookalike audiences
- Deceptively displayed HIPAA compliance seals without actual compliance review
- Data sharing occurred 2013-2020 (7-year period)
Settlement Terms:
- $7.8 million payment for consumer refunds (first-ever health data consumer redress)
- Ban on sharing health data for advertising purposes
- Ban on sharing personal information for retargeting
- Obtain affirmative express consent before sharing data with third parties
- Implement comprehensive privacy program
- Direct third parties to delete previously shared data (90-day deadline)
- Limit data retention periods
- 10-year compliance monitoring
Significance:
- First FTC action returning funds to consumers for health data violations
- Established that mental health status is sensitive data requiring heightened protection
- Hashed data still personally identifiable under FTC framework
- False HIPAA compliance claims separately actionable
- Set template for digital health company obligations
Health Privacy Lessons for CIPP/US Exam
- Health Breach Notification Rule: Applies to vendors of personal health records NOT covered by HIPAA
- "Breach" Definition: Includes unauthorized disclosure (not just cyber intrusions)
- Tracking Pixels: Third-party pixels can constitute data sharing requiring consent
- Hashing Not Protection: Hashed data still considered personally identifiable if reversible
- Privacy Policy = Contract: Promises in privacy policies are enforceable by FTC
- Advertising Use: Sharing health data for advertising almost always violates Section 5
Location Data Crackdown: Data Broker Enforcement Wave
The FTC launched an unprecedented series of enforcement actions against data brokers collecting and selling precise geolocation data in 2024, establishing new standards for location data practices.
Case Study 3: X-Mode Social (Outlogic) - January 2024
X-Mode Social, Inc. (now Outlogic)
Industry: Location data broker
Date: January 9, 2024
Legal Basis: Section 5 (unfair and deceptive practices)
Key Innovation: First-ever BAN on use and disclosure of sensitive location data
Allegations:
- Collected location data through SDKs embedded in mobile apps
- Sold data revealing visits to sensitive locations (medical facilities, places of worship, protests)
- Failed to adequately disclose data collection to app developers and consumers
- Sold data to government contractors without consumer knowledge
- Made deceptive claims about de-identification and anonymization
Settlement Terms:
- Unprecedented: Complete BAN on collecting, using, or selling sensitive location data
- Must delete all previously collected location data
- Cannot use location data to create audience segments or user profiles
- If resumes location data collection, must obtain affirmative express consent
- Cannot misrepresent data practices
Case Study 4: InMarket Media - May 2024
InMarket Media, LLC
Industry: Data aggregator, location-based advertising
Date: May 1, 2024
Legal Basis: Section 5 (unfair practices)
Allegations:
- Failed to obtain informed consent for location data collection
- Privacy disclosures buried in lengthy privacy policies
- No clear notice that data would be used for targeted advertising
- Shared precise location data with third parties without disclosure
Settlement Terms:
- Comprehensive location data ban (same as X-Mode)
- Delete all previously collected location data
- Implement sensitive location data program if resumes collection
- Must provide clear, prominent notice
- Obtain affirmative express consent
Additional Location Data Cases (2024-2025)
Gravy Analytics (December 2024):
- Used location data to categorize consumers by sensitive characteristics
- Complete ban on sensitive location data
- Algorithmic disgorgement required
Mobilewalla (January 2025):
- Aggregated location data from multiple sources
- Created consumer segments based on visits to sensitive locations
- Same comprehensive ban structure
Case Study 5: General Motors & OnStar - January 2025 (LANDMARK)
General Motors LLC & OnStar LLC
Industry: Connected vehicles, telematics
Date: January 16, 2025
Legal Basis: Section 5 (unfair and deceptive practices)
Significance: FIRST FTC action on connected vehicle data
Allegations:
- Collected precise geolocation data every 3 seconds from millions of vehicles
- Collected driving behavior data (hard braking, speeding, late-night driving)
- Sold data to consumer reporting agencies without consumer knowledge
- Misleading enrollment process for OnStar Smart Driver feature
- Data used by insurance companies to set rates—consumers saw premiums spike
- Some consumers unaware they'd enrolled in Smart Driver
- Enrollment screens provided only "Accept" button (no clear decline option)
Settlement Terms:
- 5-year BAN on disclosing geolocation and driver behavior data to consumer reporting agencies
- Must obtain affirmative express consent for data collection
- Provide easy opt-out mechanism
- Allow consumers to access their collected data
- Enable data deletion requests
- Limit location tracking capabilities
- Cannot misrepresent data practices
- Comprehensive privacy program required
Consumer Impact:
- Insurance premiums increased 80%+ for some consumers
- One consumer reported "603 events" shared with insurer
- Many consumers discovered sharing only after rate increases
- High-performance vehicle owners penalized for using vehicle capabilities
Significance:
- Establishes framework for connected vehicle regulation
- Demonstrates FTC will regulate emerging technologies
- Highlights consent requirements for IoT devices
- Shows real consumer harm from data sharing
Location Data Enforcement Lessons
- Sensitive Locations: Medical facilities, places of worship, reproductive health clinics, addiction treatment, protests
- Complete Bans Possible: FTC can ban data practices entirely, not just require consent
- Precise Geolocation: Data pinpointing a device within roughly 1,850 feet (the radius used in recent FTC consent orders)
- Consumer Reporting Agencies: Sharing with CRAs requires heightened disclosures
- Affirmative Express Consent: Required for sensitive location data—no pre-checked boxes
- Connected Devices: IoT devices held to same standards as apps
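A back-of-the-envelope calculation shows how coordinate precision maps onto these thresholds. The sketch below (the exact radius figure varies by order and statute, so treat the ~1,850-foot number as an assumption) computes the grid size implied by truncating latitude to N decimal places, a common but weak de-identification step.

```python
# Rough sketch: grid resolution implied by truncating latitude to
# N decimal places, versus the precise-location radius used in
# FTC location-data orders (~1,850 ft; treat the figure as an assumption).
FEET_PER_DEGREE_LAT = 364_320  # ~69 miles; longitude shrinks toward the poles

def lat_resolution_feet(decimal_places: int) -> float:
    """Grid cell size, in feet, after truncating latitude to N decimals."""
    return FEET_PER_DEGREE_LAT * 10 ** -decimal_places

for d in (1, 2, 3, 4):
    print(d, round(lat_resolution_feet(d)))
# 3 decimals (~364 ft) is well inside the precise-location threshold;
# 2 decimals (~3,643 ft) falls outside it. Truncation alone remains weak,
# though: repeated coarse pings still reveal home and work patterns.
```

The takeaway for compliance review: coordinates carried to three or more decimal places are almost certainly "precise geolocation" under the FTC's framework.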
COPPA Enforcement: Protecting Children Online
Children's privacy enforcement remained a top FTC priority in 2024-2025, with major actions against social media platforms, toy makers, and teen-focused apps.
2025 COPPA Rule Updates (Finalized January 2025; Effective June 2025)
Major Changes:
- Separate opt-in consent required for targeted advertising
- Separate opt-in consent required for third-party disclosure
- Enhanced data minimization requirements
- Retention limited to "reasonably necessary" period
- Strengthened parental controls
Case Study 6: TikTok/ByteDance - August 2024
ByteDance Ltd. & TikTok Inc.
Date: August 2, 2024 (complaint filed by DOJ on the FTC's referral; litigation ongoing)
Legal Basis: COPPA violations
Allegations:
- Failed to obtain parental consent for children under 13
- Allowed children to create accounts without age verification
- Collected persistent identifiers from children
- Failed to honor parental deletion requests
- Violations dating back to Musical.ly acquisition (2017)
Significance: Reinforces COPPA applies to social media platforms, not just kids' sites
Case Study 7: Disney - September 2025
The Walt Disney Company
Penalty: $10 million
Date: September 2, 2025
Legal Basis: COPPA violations
Allegations:
- Improperly designated YouTube channel videos as "Not Made for Kids"
- Enabled targeted advertising on content directed to children
- Collected persistent identifiers from children without parental consent
- Failed to properly categorize content on Disney-branded channels
Settlement: $10 million penalty + improved content designation procedures
Case Study 8: Apitor Technology (Robot Toys) - September 2025
Apitor Technology Co.
Penalty: $500,000 (suspended based on inability to pay)
Date: September 3, 2025
Allegations:
- Programmable robot toys designed for children 6-14
- Required location permissions to use Android app
- Collected precise geolocation without parental consent
- Integrated third-party SDK that collected and shared geolocation data
- Gave third party broad latitude to use data for advertising
Significance: Extends COPPA to IoT toys; third-party SDK integration doesn't absolve liability
Case Study 9: Iconic Hearts/Sendit App - September 2025
Iconic Hearts Holdings, Inc. (Sendit App)
Date: September 29, 2025
Legal Basis: COPPA + ROSCA + Section 5
Allegations:
- Anonymous messaging app targeted to teens
- Sent fake, provocative messages to children to induce subscriptions
- Collected personal data from children without parental consent
- Required Diamond Membership subscription to reveal message senders
- Provided useless or fabricated information to subscribers
- Dark pattern: fake messages designed to appear sexual/provocative
Significance: Shows FTC will target deceptive practices even in "free" apps; fake engagement is unfair practice
Additional 2024-2025 COPPA Cases
- Microsoft ($20M - June 2023): Xbox violations, account creation without parental consent
- Epic Games ($275M - December 2022): Fortnite COPPA penalty for privacy-invasive default settings (a separate $245M resolved dark-pattern billing claims)
- Amazon/Alexa ($25M - June 2023): Voice recordings retention, failed to honor deletion requests
- Edmodo ($6M, suspended due to inability to pay - 2023): Ed-tech platform, inadequate parental consent procedures
Data Security Enforcement: Novel Theories Emerging
Case Study 10: Blackbaud - February 2024
Blackbaud, Inc.
Date: February 1, 2024
Legal Basis: Section 5 (unfair practices - novel theory)
First-Ever Enforcement Actions:
- FIRST standalone unfairness claim for unreasonable data retention
- FIRST unfairness and deception claims for inadequate breach notification
Allegations:
- Retained backup data unnecessarily long (data minimization failure)
- Unreasonable data retention practices
- Breach notifications understated scope and severity
- Failed to promptly notify affected customers
- Downplayed significance of compromised data
Significance:
- Establishes data retention as standalone violation
- Shows FTC will scrutinize breach notification quality
- Minimizing breach severity can be deceptive practice
Case Study 11: Avast - February 2024
Avast Limited
Penalty: $16.5 million
Date: February 22, 2024
Allegations:
- Avast antivirus software promised to protect privacy and prevent tracking
- Simultaneously collected and sold browsing data through subsidiary Jumpshot
- Browsing data sold for advertising purposes
- Deceptive claims about privacy protection
- Privacy-focused branding contradicted actual practices
Settlement:
- $16.5 million penalty
- Permanent ban on selling Avast-branded browsing data for advertising
- Must obtain explicit, unambiguous consent before selling browsing data from other products
- Delete all browsing data transferred to Jumpshot
- Notify affected consumers
- Algorithmic disgorgement: Delete products/algorithms derived from improperly collected data
Emerging Enforcement Areas
Artificial Intelligence & Automated Decision-Making
While full AI enforcement is still developing, the FTC has brought actions involving:
- Rite Aid (December 2023): Facial recognition system with discriminatory outcomes; banned from using the technology for 5 years
- Weight Watchers/Kurbo (2022): Weight-loss app illegally collected children's data; algorithmic disgorgement ordered
- Ring/Amazon (2023): Failed to restrict employee access to video recordings; required algorithm deletion
- Avast (2024): Delete algorithms derived from improperly collected browsing data
Algorithmic Disgorgement: Emerging remedy requiring deletion of not just data, but algorithms, models, and products derived from illegally obtained data.
Social Media & Dark Patterns
September 2024 FTC Report: "A Look Behind the Screens"
- Examined data practices of major social media platforms
- Found "vast surveillance" of users
- Inadequate protections for children and teens
- Lax privacy controls
- Extensive AI use with minimal transparency
Biometric Data
FTC Policy Statement on Biometrics (2023): Biometric information is sensitive data deserving heightened protection under Section 5.
2025 Enforcement Priorities Under New Leadership
With leadership changes in 2025 (Commissioner Ferguson becoming Chair), enforcement priorities are evolving while maintaining focus on core areas:
Expected Continuity:
- Children's privacy and online safety
- Location data privacy
- Data security
- Health privacy
Potential Shifts:
- Possible refinement of unfairness claims around sensitive data classification
- Continued focus on deceptive practices over novel unfairness theories
- Attention to how platforms moderate content and handle speech
Key Enforcement Patterns for CIPP/US Exam
Common Violations
| Violation Type | Examples | Typical Remedy |
|---|---|---|
| Deceptive Privacy Policies | GoodRx, BetterHelp, Avast | Monetary penalty + ban on practice + privacy program |
| Inadequate Consent | GM/OnStar, location data brokers | Require affirmative express consent + compliance monitoring |
| Sensitive Data Sharing | Health data, location data cases | Complete ban on practice + data deletion |
| COPPA Violations | TikTok, Disney, Apitor, Sendit | Penalties + parental consent mechanisms + data deletion |
| Data Security Failures | Blackbaud, Drizly, CafePress | Security program requirements + monitoring |
| Unreasonable Data Retention | Blackbaud (first standalone case) | Data minimization requirements + deletion schedules |
Standard Consent Order Provisions
Nearly Every FTC Consent Order Includes:
- Monetary Relief: Civil penalties and/or consumer redress
- Injunctive Prohibitions: Specific banned practices
- Affirmative Obligations: Required actions (consent mechanisms, notices)
- Privacy Program Requirements: Comprehensive privacy program with specific elements
- Data Deletion: Delete improperly collected data
- Third-Party Notification: Direct third parties to delete shared data
- Compliance Monitoring: Typically 10-20 years
- Reporting Requirements: Annual compliance reports to FTC
- Record Keeping: Maintain documentation of compliance
- Employee Training: Privacy and security training programs
Penalty Calculation Factors
- Nature of Violation: Deception vs. unfairness
- Sensitivity of Data: Health, children's, location data = higher penalties
- Number of Consumers Affected: More consumers = higher penalties
- Duration of Violations: Years of violations = higher penalties
- Company Size & Revenue: Ability to pay considerations
- Prior Violations: Repeat offenders face enhanced penalties
- Consumer Harm: Actual vs. potential harm
- Company Cooperation: Self-reporting may reduce penalties
Exam Study Strategy: FTC Enforcement
High-Yield Case Studies to Memorize
Health Privacy (Memorize These):
- GoodRx ($1.5M) - First HBNR enforcement, tracking pixels
- BetterHelp ($7.8M) - First consumer refunds for health data, mental health sensitivity
Location Data (Know the Pattern):
- X-Mode/Outlogic - First complete ban on sensitive location data
- GM/OnStar - First connected vehicle case, insurance implications
COPPA (Know Major Cases):
- Epic Games ($275M) - Largest COPPA penalty ever
- Disney ($10M) - YouTube designation issues
- Microsoft ($20M) - Xbox account creation
Data Security (Novel Theories):
- Blackbaud - First data retention unfairness claim, breach notification quality
Exam Question Patterns
Type 1: Case Identification
"Which FTC enforcement action was the first to enforce the Health Breach Notification Rule?"
Answer: GoodRx Holdings (2023)
Type 2: Legal Theory Application
"A company promises in its privacy policy not to share health data but shares it with advertisers. What is the likely FTC claim?"
Answer: Deceptive practice under Section 5 (privacy policy = enforceable promise)
Type 3: Remedy Identification
"What remedy did the FTC impose in location data broker cases like X-Mode?"
Answer: Complete ban on collecting, using, or selling sensitive location data + data deletion
Type 4: Threshold/Standard Questions
"What is required to establish an unfair practice under Section 5?"
Answer: (1) Substantial injury, (2) not reasonably avoidable, (3) not outweighed by benefits
Common Exam Traps
- Confusing HBNR with HIPAA: HBNR applies to non-HIPAA entities; know the difference
- Thinking FTC needs actual harm: FTC can act on likelihood of harm
- Forgetting about state AG coordination: Many cases involve both FTC and state AGs
- Not knowing recent cases: Exam includes 2024-2025 cases; stay current
- Ignoring algorithmic disgorgement: New remedy - delete algorithms derived from bad data
Practical Compliance Lessons
What the Cases Teach
1. Privacy Policies Are Contracts
- Every promise in privacy policy is enforceable
- Overpromising = deceptive practice
- Privacy policy must match actual practices
2. Tracking Pixels Are Data Sharing
- Third-party pixels constitute data disclosure
- Must disclose pixel use in privacy policy
- Health data + pixels = high FTC scrutiny
3. Sensitive Data Requires Special Care
- Health data, precise location, children's data = sensitive
- Affirmative express consent required
- Consider not collecting if not essential
4. Third-Party Liability Doesn't End Responsibility
- Sharing data with third parties doesn't transfer liability
- Must ensure third parties comply with your promises
- SDK integration = your responsibility
5. Data Minimization Is Enforceable
- Collecting or retaining unnecessary data = unfair practice
- Implement data retention schedules
- Delete data when no longer needed
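The retention-schedule idea in the lessons above can be sketched mechanically: assign each record class a maximum age and flag anything older for deletion. The categories and periods below are illustrative assumptions, not legal guidance; actual periods must come from counsel and applicable rules.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention schedule: category names and periods are
# hypothetical, chosen only to show the mechanism.
RETENTION_PERIODS = {
    "marketing_analytics": timedelta(days=180),
    "support_tickets": timedelta(days=365 * 2),
    "health_intake_forms": timedelta(days=365 * 7),
}

def records_due_for_deletion(records, now=None):
    """records: iterable of (record_id, category, created_at) tuples."""
    now = now or datetime.now(timezone.utc)
    due = []
    for record_id, category, created_at in records:
        max_age = RETENTION_PERIODS.get(category)
        if max_age is not None and now - created_at > max_age:
            due.append(record_id)
    return due

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
records = [
    ("r1", "marketing_analytics", datetime(2024, 1, 1, tzinfo=timezone.utc)),
    ("r2", "marketing_analytics", datetime(2025, 5, 1, tzinfo=timezone.utc)),
]
print(records_due_for_deletion(records, now))  # ['r1']
```

Running a sweep like this on a schedule, and documenting each run, is the kind of operational evidence the FTC's Blackbaud theory effectively demands.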
Conclusion
FTC privacy enforcement has reached unprecedented levels in 2024-2025, with landmark cases establishing new legal frameworks for health privacy, location data, connected vehicles, and children's online safety. Understanding these cases is essential for CIPP/US exam success—they represent how Section 5 operates in practice and demonstrate evolving enforcement priorities.
Key Trends to Remember:
- Health privacy is a top enforcement priority (GoodRx, BetterHelp)
- Location data faces unprecedented scrutiny (complete bans now possible)
- COPPA enforcement includes novel theories (fake engagement, dark patterns)
- Connected devices subject to same standards as apps (GM/OnStar)
- Data retention can be standalone violation (Blackbaud)
- Algorithmic disgorgement emerging as standard remedy
- Consumer redress increasingly common (BetterHelp first-ever for health data)
For the CIPP/US exam, focus on memorizing major case names, penalties, legal theories, and novel remedies. Understand how Section 5 unfairness and deception standards apply in different contexts. Know the difference between HBNR and HIPAA. Study the pattern of consent order provisions.
Most importantly, recognize that FTC enforcement creates the "common law" of US privacy—these cases define what practices are acceptable and what crosses the line. Master them, and you'll excel on 20-24% of your exam.