Front-Loading the Determination: A Response to EFF on ALPR Transparency
EFF proposes case-by-case balancing instead of exemptions. Front-loaded rulemaking and watchlist-based determinations would actually work.
Yesterday, the Electronic Frontier Foundation (EFF) published a post on automated license plate reader (ALPR) transparency opposing a wave of state bills that would categorically exempt ALPR data from public records laws. In its words, “EFF is alarmed by recent laws in several states that have blocked public access to data collected by ALPRs.” The post catalogs seven states (Connecticut, Arizona, Washington, Illinois, Georgia, Maryland, Oklahoma) moving in the same direction, and cites public-records work documenting racist ALPR use, surveillance of protestors, and the tracking of an abortion-seeking patient as the reason access matters.
This is a meaningful position from an organization I’ve previously disagreed with on ALPR transparency. The underlying disagreement was never moral — the general public should not have their data collected, catalogued, and published — but practical: the structures EFF and ACLU were endorsing produce, in the real world, the opposite of transparency.
Although both EFF and the ACLU have since expressed support for stronger transparency than their earlier positions suggested, the practical issues remain. In its new post, EFF opposes the worst version of these bills (categorical exemption) while still endorsing a framework that keeps records hidden in practice (case-by-case balancing).
# The Problem
Seven states (and counting) introducing hostile legislation is not a coincidence. It’s a direct result of Flock’s lobbyists[1] responding to public-records work that has produced policy outcomes like contract cancellations, declined renewals, and even criminal charges. EFF correctly identifies the records as “not just informational—they are leverage.”
The Washington example is instructive and worth dwelling on. A state court ruled last year that ALPR data are public records. The legislature responded by exempting them. This is the predictable endpoint of a balancing-test regime: when transparency wins on the merits in a forum that requires reasoned analysis, the response is to move the question to a forum that doesn’t.
Flock’s home state of Georgia goes further. Not content with exempting ALPR data, the state made it a misdemeanor to request or use plate data for non-law-enforcement purposes. That is the trajectory of denial-by-fee taken to its logical conclusion: when charging $5.4M for search logs (Dunwoody’s number) becomes inadequate to deter requesters, criminalization is next.
# Case-by-Case Balancing and Deidentification
EFF prescribes an unworkable framework that already exists in many states: a privacy exemption requiring case-by-case balancing of transparency benefits against privacy costs.
The framework is idealistic. It is also the primary mechanism by which logs get withheld. Most states lack specific exemptions for the kind of audit data published on haveibeenflocked.com. Even states with “ALPR data” exemptions have to contend with the fact that search terms entered by a user are not “ALPR data.” Agencies often route around this by treating each search as a separate record, then assessing per-record review fees. Five- and six-figure fee estimates are common. In practice, “balancing” and redaction mean the agency charges enough to make the request impossible. No balancing actually occurs.
The practical reality under these frameworks is that if you can collect enough records, you can ensure nobody ever gets to see them. That’s the opposite of the desired outcome.
The per-record framing is also a trap of agencies’ own making. If each search log is a separate record for fee purposes, it is a separate record for every other purpose too. Open records law generally requires a specific lawful basis for withholding each record, communicated to the requester, with each denial independently appealable. An agency that wants to charge per-record review fees on 10,000 “records” in a single Excel spreadsheet should be prepared to issue 10,000 individualized determinations and defend each one. Agencies want the fees, but not the obligations.
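To make the mechanism concrete, here is a back-of-envelope sketch with hypothetical numbers; the per-record review time and hourly rate are assumptions, not figures from any actual fee schedule:

```python
# Hypothetical illustration of how per-record framing turns one
# spreadsheet export into a prohibitive fee estimate.
ROWS = 10_000           # search-log rows in a single Excel export
MINUTES_PER_RECORD = 3  # assumed staff review time per "record"
HOURLY_RATE = 45.00     # assumed hourly rate used in the estimate

hours = ROWS * MINUTES_PER_RECORD / 60
fee = hours * HOURLY_RATE
print(f"{hours:,.0f} staff-hours, ${fee:,.2f} estimated fee")
# 500 staff-hours, $22,500.00 estimated fee: a five-figure quote
# for reviewing one file, before a single record is disclosed.
```

The same arithmetic that inflates the fee is the arithmetic that should obligate 10,000 individualized, appealable denials.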
EFF’s fourth recommendation (disclosing aggregated or deidentified data while withholding personally identifiable information, and treating that process as redaction rather than record creation) is closer to a real, workable solution. Half of it, anyway.
Aggregation is a dodge. Counts of scans by month, hit ratios in percentages, and total-records-shared figures only tell you that surveillance is happening at scale. We know that already. Whether specific searches are lawful, whether officers are stalking exes, and whether “investigation” is being used as a pretext all remain locked away in a filing cabinet in the basement of the police station. Aggregate data can’t surface the Milwaukee Ayala case or the Joplin firing. Pattern-of-misuse questions require record-level data.
Deidentification is the workable part. haveibeenflocked.com already takes this approach to an extent: the site publishes audit logs but maps plates to “identifiers” that obscure the underlying plate. The patterns stay visible (Officer X searched plate Y 124 times in two months) without exposing what plate Y is. Flock previously did the same thing with usernames in its transparency portals before stripping the IDs entirely. Reidentification risk has to be managed, but it is not an insurmountable problem.
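For illustration, here is a minimal sketch of one way consistent pseudonymization can work; this is an assumed implementation of the general technique, not a description of haveibeenflocked.com’s actual pipeline. A keyed hash maps each plate to a stable token, so per-plate patterns survive, but the token can’t be reversed without the key:

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-securely"  # held only by the publisher

def pseudonymize(plate: str) -> str:
    """Map a plate to a stable, non-reversible identifier.

    A keyed HMAC (rather than a bare hash) matters here: plates have
    so little entropy that an unkeyed SHA-256 of every possible plate
    could be precomputed, undoing the redaction.
    """
    digest = hmac.new(SECRET_KEY, plate.upper().encode(), hashlib.sha256)
    return "PLATE-" + digest.hexdigest()[:8].upper()

# The same plate always yields the same identifier, so a pattern like
# "Officer X searched PLATE-4C21A9F0 124 times" stays visible.
assert pseudonymize("abc1234") == pseudonymize("ABC1234")
```

Truncating the digest trades a small collision risk for readable identifiers, and rotating the key between releases limits linkability across datasets; both are knobs on the reidentification risk noted above.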
I’ve written about this approach before; haveibeenflocked.com’s “identifiers” are a partially working example of the disclosure-with-redaction structure EFF is asking legislatures to enact.
# Front-Loading Beats Balancing
The deeper problem with case-by-case balancing is that it does the work in the wrong place at the wrong time. Each request triggers an individualized analysis by an agency that has no incentive to perform it well, no consequences for performing it badly, and an asserted financial mechanism (per-record review or redaction fees) for converting the analysis itself into a denial.
There is a logical alternative: front-load the determination through rulemaking. My state, Iowa, has the structure largely on the books in its Fair Information Practices Act, even if implementation and enforcement are absent in practice.
Under FIPA, state agencies must promulgate rules describing what personally identifiable information they collect, why they collect it, the legal basis for collection, and which of their records are public, confidential, or mixed. The determination is made before any request arrives. The burden sits with the agency, ahead of time, rather than being shifted to the requester at the moment of request.[2]
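To see what a front-loaded determination could look like in machine-readable form, here is a hypothetical sketch; the field names and the example rule are mine, not FIPA’s statutory text or any agency’s actual rules:

```python
from dataclasses import dataclass, field
from enum import Enum

class Access(Enum):
    PUBLIC = "public"
    CONFIDENTIAL = "confidential"
    MIXED = "mixed"  # public after the named fields are redacted

@dataclass
class RecordRule:
    record_type: str          # e.g., "ALPR search audit log"
    pii_collected: list[str]  # what PII the record may contain
    purpose: str              # why the agency collects it
    legal_basis: str          # statute or rule authorizing collection
    access: Access
    redact_fields: list[str] = field(default_factory=list)

# A hypothetical promulgated rule, decided before any request arrives:
ALPR_AUDIT_RULE = RecordRule(
    record_type="ALPR search audit log",
    pii_collected=["license plate", "searching officer"],
    purpose="audit trail for database queries",
    legal_basis="<agency rule adopted under FIPA>",
    access=Access.MIXED,
    redact_fields=["license plate"],  # e.g., disclosed as a pseudonym
)
```

When a request arrives, the agency looks up the rule instead of improvising a balancing analysis.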
Front-loading also forces honest accounting. We do this elsewhere as a matter of course. If a record could contain confidential information, we treat it as if it does. Your doctor can’t store lab results in the same folder where she receives the office Christmas party invites. Yet agencies constantly argue, through public records responses, that they commingle confidential and non-confidential records and store them with third parties — Flock, email providers — that are not bound to keep those records confidential and that, in Flock’s case, will actively disseminate them to paying customers.
A strict reading of public records law makes that assumption-based structure untenable. Open records statutes generally, and correctly, turn on what a record does contain, not what it could. Front-loaded rulemaking forces agencies to make the determination at the outset: at minimum, by instructing employees not to enter PII into Flock; at maximum, with a manual confidentiality justification stored with each entry.
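At the “minimum” end of that range, the determination can even be partially automated. A hedged sketch, assuming a free-text search-reason field; the two patterns are illustrative, and real PII screening needs far more than a pair of regexes:

```python
import re

# Illustrative write-time screen: flag likely PII in a free-text search
# reason before it is stored, so confidentiality is decided at entry.
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def screen_search_reason(reason: str) -> dict:
    hits = [label for label, pattern in PII_PATTERNS.items()
            if pattern.search(reason)]
    return {
        "contains_pii": bool(hits),
        "pii_types": hits,
        # If PII is flagged, a confidentiality justification must be
        # stored with the entry; otherwise the record defaults to public.
        "needs_justification": bool(hits),
    }

print(screen_search_reason("stolen vehicle, case 24-00321"))
# {'contains_pii': False, 'pii_types': [], 'needs_justification': False}
```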
Agencies will argue that front-loading is too burdensome, that cops can’t be trusted to make these legal determinations. They will be right. The burden is the point. If an agency doesn’t know whether a search reason contains confidential information, it doesn’t know whether the search was lawful. Privacy, confidentiality, and oversight are all the same problem.
# Watchlists as a Solution
Front-loading exposes one residual problem: are license plates themselves confidential PII? A categorical answer is available through watchlists.
Properly constructed, watchlist inclusion would require an active police investigation; current practice requires no such thing, which is itself part of the problem. Assuming that rule is in place, plates on a watchlist can be exempted from disclosure as part of an investigation. Plates not on a watchlist were captured and stored without any existing investigative basis. That is the “just in case” form of mass surveillance that creates the biggest privacy problems.
Those historic location profiles should not be stored at all or, if they are not sensitive enough to prevent their storage, they should be subject to disclosure. The question of whether the public should see information collected without an investigatory nexus collapses into the question of whether it should be collected at all.
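In code, the collapse is a three-way disposition rather than a balancing test; this sketch is hypothetical, including the retain_unmatched policy knob:

```python
from enum import Enum

class Disposition(Enum):
    WITHHOLD = "withhold"          # watchlisted: active-investigation exemption
    DISCLOSE = "disclose"          # stored without investigative basis: public
    DO_NOT_STORE = "do not store"  # the stricter option for unmatched scans

def categorical_rule(on_watchlist: bool, retain_unmatched: bool) -> Disposition:
    """Front-loaded, per-category determination; no per-request balancing.

    Assumes watchlist entries exist only for active investigations,
    which is the construction argued for above.
    """
    if on_watchlist:
        return Disposition.WITHHOLD
    return Disposition.DISCLOSE if retain_unmatched else Disposition.DO_NOT_STORE
```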
The watchlist approach also makes aggregation meaningful. The unit shifts from “plates scanned” — which only confirms surveillance is happening at scale — to “watchlist entries vs. open investigations,” which tells you more about how the system is being used. An agency with thousands of watchlist entries and a few dozen open investigations is using the watchlist for something other than active investigations. An agency whose watchlist entries persist for years is operating differently than one whose entries turn over in days. Either pattern is more useful for oversight than a count of plates scanned or a number of “hits.”
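A short sketch of that oversight arithmetic, using invented numbers rather than any real agency’s figures:

```python
from statistics import median

# Illustrative aggregates an agency could publish under a watchlist regime.
watchlist_entries = 4_200      # plates currently on the watchlist
open_investigations = 35       # active cases that could justify entries
entry_ages_days = [3, 12, 45, 400, 731, 900]  # sampled entry ages

print(f"{watchlist_entries / open_investigations:.0f} entries per open investigation")
print(f"median entry age: {median(entry_ages_days):.0f} days")
# 120 entries per open investigation, entries persisting for months or
# years: numbers like these say far more than a count of plates scanned.
```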
The contradiction is unavoidable. Either license plates are not PII and carry no privacy interest on public roads (Flock’s position when defending the cameras), or they are sensitive PII exempt from disclosure (Flock’s position when resisting release of the audit logs).
That contradiction is theirs, not EFF’s, and not the law’s. EFF takes the coherent position that plate data should generally be withheld from third parties, while audit logs and aggregate scan data should be public.
Flock aggressively funds the narrative that total exemption is the only solution. It would probably be right, if it didn’t assign itself the exclusive right to collect, receive, and store the very data it argues is too sensitive for the public. “The public” includes Flock.
Flock is also right that current open records laws are flawed, but the solution isn’t to hollow them out. EFF’s proposed solution is an opaque balancing test with unpredictable outcomes. Mine is to require that governments make these determinations transparently, consistently, and in advance.
Both proposals are much better than Flock’s. Neither would be required if current laws on confidentiality and open records were enforced.
1. According to an IPVM investigation, Flock’s lobbying spending increased from $90,000 in 2024 to $1.02 million in 2025. ↩︎
2. Not that the agency won’t try to shift the burden anyway; that is the subject of my current litigation against the Iowa Department of Corrections. ↩︎