Shah v. Capital One: CCPA Private Right of Action Risk

By Vault JS | October 10, 2025

At a recent privacy event for Chief Privacy Officers in Los Angeles, California, it wasn’t the relatively nonexistent CCPA (California Consumer Privacy Act) enforcement that stirred concern. Instead, discussions heated up around private right of action letters, especially under CIPA (California Invasion of Privacy Act) and VPPA (Video Privacy Protection Act). Adding to the unease was an emerging risk: plaintiffs’ attorneys are beginning to extend the CCPA’s private right of action, originally meant for data breaches, to cases of excessive data collection by digital tracking technologies (DTTs).

No case better represents this emerging risk than Shah v. Capital One Financial Corporation. In March 2025, a federal court in the Northern District of California sent a jolt through the privacy world when it permitted the claim to move forward without a traditional “data breach,” based on allegations that DTTs (think Meta Pixel, Google Analytics, and similar tags) disclosed users’ personal information to third parties. The court denied Capital One’s motion to dismiss the CCPA private right of action claim, signaling a potentially broader path for consumers—and a bigger risk surface for businesses.

Under the CCPA’s private right of action provision:

“Any consumer whose nonencrypted or nonredacted personal information . . . is subject to an unauthorized access and exfiltration, theft, or disclosure as a result of the business’ violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information . . .”

Many courts read this as breach-only (see McCoy v. Alphabet, N.D. Cal. 2021). Shah suggests a different lens: unauthorized disclosure via embedded tracking technologies can plausibly fall within a “failure to implement reasonable security procedures” even when there’s no outside hacker. While this wasn’t the first time a court hinted that the breach provision could reach unauthorized disclosure (see Stasi v. Inmediata, S.D. Cal. 2020), Shah presents a case with data leakage but no clear security incident. At the pleading stage, that was enough to keep the CCPA claim alive.

What the court actually did (and didn’t) do

  • Did: Let the CCPA PRA claim proceed past the motion-to-dismiss stage based on alleged disclosures through pixels/cookies. It also let some related California Invasion of Privacy Act (CIPA) theories proceed, framing certain tracking as potential “pen register/trap-and-trace”-like activity. 
  • Didn’t: Decide the merits. This is an early procedural ruling—not a final liability finding. Evidence, class certification, and potential appeals all lie ahead.  

Risk to businesses:

Here’s how Shah could reshape the privacy landscape: 

  1. Beyond classic breaches. If a site’s trackers route user information (e.g., URLs visited, form inputs, account or financial data) to ad/analytics vendors without adequate notice, consent, or safeguards, that may support a CCPA PRA theory. 
  2. More leverage in litigation. Because courts sometimes treat pixel-based sharing as “unauthorized disclosure,” plaintiffs can potentially seek statutory damages ($100–$750 per consumer per incident) or actual damages, alongside injunctive relief. (Damages availability still depends on proof and how courts continue to read the statute.) 
  3. CIPA hooks. Separate from the CCPA, some tracking setups might be framed as intercepting communications “in transit,” adding another cause of action with its own remedies and penalties. 
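To make the statutory-damages point concrete, exposure scales linearly with class size. The sketch below is illustrative only: the $100–$750 range comes from the CCPA’s private right of action provision, while the class size is hypothetical, and actual recovery depends on proof and how courts continue to read the statute.

```python
# Illustrative only: CCPA statutory damages scale with class size.
# The $100-$750 per-consumer range is statutory; the class size is hypothetical.
STATUTORY_MIN = 100   # dollars per consumer per incident
STATUTORY_MAX = 750

def exposure_range(class_size: int, incidents: int = 1) -> tuple[int, int]:
    """Return (low, high) statutory-damages exposure in dollars."""
    return (class_size * incidents * STATUTORY_MIN,
            class_size * incidents * STATUTORY_MAX)

low, high = exposure_range(class_size=100_000)
print(f"${low:,} - ${high:,}")  # a 100,000-member class: $10,000,000 - $75,000,000
```

Even a modest class pushes exposure into eight figures before actual damages or injunctive relief enter the picture, which is why early settlement pressure follows these filings.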

Important caveat: Shah is one district court decision at the pleading stage. Other courts may disagree or cabin its reach, and the case could evolve on appeal or summary judgment. There is a history of cases similarly wrestling with the scope of CCPA’s private right of action:

M.G. v. TherapyMatch, Inc. (N.D. Cal. 2024): Court allowed a CCPA claim over pixel/analytics disclosures, holding plaintiffs didn’t need a traditional breach to allege “unauthorized disclosure.”
In re Google RTB Consumer Privacy Litigation (N.D. Cal. 2022): Court sustained claims that Google’s real-time bidding system unlawfully disclosed user data to ad-tech partners.
McCoy v. Alphabet, Inc. (N.D. Cal. 2021): Court dismissed a CCPA claim with prejudice, holding the PRA applies only to breach-type events. Shows the narrow reading still has traction.
Stasi v. Inmediata Health Grp. (S.D. Cal. 2020): Court allowed CCPA claims for non-medical data in a healthcare breach, while excluding medical data governed by CMIA. Illustrates careful parsing of statutory boundaries.
Torres v. Prudential Financial (C.D. Cal. 2025): Court rejected a CIPA claim where plaintiffs couldn’t show third-party “reading” of communications in transit. Highlights that CIPA theories may face evidentiary hurdles even when CCPA claims survive.

Together, these cases illustrate the tug-of-war: some courts are expanding the PRA beyond breaches, while others reaffirm its narrow scope. Shah is best read as momentum toward a broader private right of action, with substantial litigation ahead before the law is clarified.

That said, the decision materially increases risk for “ordinary” pixels and SDKs. Even if an organization has never suffered a breach, Shah highlights five pressure points:

  1. Pixels can equal “disclosure.” If the tags send identifiers, page paths, or sensitive inputs (e.g., search queries, account details, health/finance hints) to third parties, plaintiffs may argue a failure to use “reasonable security” to prevent exfiltration—without a malicious actor.  
  2. Policies vs. practice. Mismatches between privacy notices and actual data flows become litigation magnets. Plaintiffs in Shah pointed to policy language they say didn’t authorize the sharing at issue.
  3. Vendor sprawl. The more digital tracking technologies used, the more potential “disclosures”—and the harder it is to prove minimization, necessity, and contractual controls.
  4. CIPA exposure alongside CCPA. Plaintiffs may stack wiretap-style claims on top of CCPA counts when trackers capture session content, keystrokes, or headers in transit.
  5. Class action gravity. Shah is already being cited by plaintiff and defense firms alike as a roadmap case for pixel lawsuits, particularly in California. Expect more filings, bigger classes, and earlier settlement pressure.  
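The first pressure point is testable in practice: outbound tracker requests can be inspected for identifiers or sensitive inputs riding along in query parameters. A minimal sketch, assuming a hypothetical watchlist of parameter names and an example tracker URL (neither is from the case record):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical watchlist: query parameters that often carry identifiers or
# sensitive inputs when a pixel forwards them to ad/analytics endpoints.
SENSITIVE_PARAMS = {"email", "q", "search", "account_id", "ssn", "dob"}

def flag_outbound_request(url: str) -> set[str]:
    """Return the sensitive parameter names present in an outbound tracker URL."""
    params = parse_qs(urlparse(url).query)
    return SENSITIVE_PARAMS & set(params)

hits = flag_outbound_request(
    "https://tracker.example/collect?email=a%40b.com&page=/loans&q=bankruptcy")
# hits == {"email", "q"}
```

Running a check like this against captured network traffic is one way to surface the “disclosures” plaintiffs would later reconstruct from the same traffic.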

Practical playbook: how to reduce your pixel risk (now)

There are blogs that suggest things like “Remove nonessential pixels; disable parameters sending URL paths or form values. Block by default; allow by need.” That’s great unless you actually have a marketing team trying to serve the needs of your users and prospects.

In that case…

  1. Tighten contracts & DPAs. Carefully document the legal basis for how you categorize your digital tracking technologies. Require vendors to act as service providers/contractors (not third parties) where possible, ban secondary use, and document instructions, retention, and deletion. Validate subprocessor lists and any piggybacking digital tracking technologies.
  2. Consent + preference integrity. Implement region-aware consent that truly suppresses tracking until choice, and honor Global Privacy Control (GPC).
  3. Update notices to realities. Make sure your privacy policy and “Do Not Sell/Share” disclosures match the data you actually send to vendors (identifiers, online activity, precise geolocation, etc.).
  4. Gate sensitive contexts. On authenticated pages and in payment, financial, or health flows, minimize digital tracking technologies or route analytics through a vetted, self-hosted proxy with field-level redaction of PII/PHI/financial inputs.
  5. Prove “reasonable security.” Maintain written policies for tag governance, input redaction, log review, and break-glass procedures; run periodic red-team/QA tests to confirm no PII bleeds into analytics payloads. Document it—because in Shah, procedures matter.
  6. Litigation readiness. Preserve configs, tag versions, network traces, vendor contracts, and consent logs. If sued, you’ll need to show exactly what left the browser, when, to whom, and under what controls. Use a scanner that samples the site as a California user and analyzes and catalogs the user experience, with sampling weekly at a minimum and daily where feasible.
  7. Verify. Test your site for compliance and maintain audit runs to aid in defending against frivolous suits.
    • Ensure testing technology can identify all types of trackers (inside and outside iframes) and all types of cookies (set, transmitted, and observed).
    • Create an inventory of DTTs seen in testing across all digital properties (web, mobile, CTV as applicable).
    • Test GPC signals from a California user across properties.
    • Test opt-out flows as well as “Do Not Sell/Share” links.
    • Ensure testing covers the entire footprint at least quarterly so all templates are captured in regular testing.
    • Test secure/authenticated areas as well as public pages.
    • Test primary user flows in addition to single-page sessions.
    • Test sensitive flows such as checkout processes.
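Items 2 and 4 of the playbook can be combined into one server-side gate: suppress tracking unless the user has opted in, treat the Global Privacy Control signal (the `Sec-GPC: 1` request header, per the GPC specification) as an override, and redact sensitive fields before any payload is proxied to a vendor. A minimal sketch; the field list is a hypothetical example, not a complete taxonomy:

```python
# Sketch: consent + GPC gate, plus field-level redaction before proxying.
# "Sec-GPC" is the header defined by the Global Privacy Control spec;
# REDACT_FIELDS is a hypothetical, illustrative list of sensitive keys.

REDACT_FIELDS = {"email", "ssn", "card_number"}

def tracking_allowed(headers: dict, consented: bool) -> bool:
    """GPC is an opt-out signal: if present, suppress tracking regardless
    of any stored consent; otherwise defer to the user's recorded choice."""
    if headers.get("Sec-GPC") == "1":
        return False
    return consented

def redact(payload: dict) -> dict:
    """Field-level redaction applied before forwarding analytics to a vendor."""
    return {k: ("[REDACTED]" if k in REDACT_FIELDS else v)
            for k, v in payload.items()}
```

Putting the gate server-side (rather than relying on the vendor tag to behave) is also what makes the “reasonable security” story provable later: the suppression and redaction logic lives in code you control and can produce in discovery.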

What to watch next

  • Appeals and sister cases. Expect other courts to either follow or distinguish Shah. A split could invite California appellate attention or legislative clarification. 
  • Agency posture. The California Privacy Protection Agency (CPPA) has prioritized dark patterns, sensitive data, and adtech signals; Shah’s framing aligns with that emphasis even though this case is private litigation, not enforcement. Track rulemakings and enforcement summaries. (Inference based on agency priorities reported in practitioner analyses.) 
  • Contracting patterns. Look for vendors rolling out “service-provider mode,” proxying, and field-level hashing/redaction by default as a selling point to reduce CCPA/CIPA exposure.  

Bottom line

For businesses, the message is clear: treat pixel governance as security, not merely “marketing ops.” Lock down trackers in sensitive contexts, align notices with reality, upgrade vendor contracts, and be ready to prove your “reasonable security” story.

Sources