Flock's FreeForm Free-For-All
An analysis of 3,217 FreeForm search logs from 124 agencies reveals that Flock's "content moderation" blocks constitutionally sound searches while approving nationwide dragnets targeting military affiliation, political expression, and people wearing jeans.
Flock’s “FreeForm” search lets users search for more than license plates: it can filter for makes, models, dents, stickers, roof racks, and so on. Through its ethics page, Flock tells a story about the feature being safe and respectful of legal, constitutional, and ethical boundaries. The logs say otherwise.
After writing yesterday’s feature announcement about the new cost estimate feature, I ran a quick query as an afterthought to see how many agencies use “FreeForm” and how often. The result: 6,736 “FreeForm” searches in 2025 across 121 agencies. At a $50,000 annual subscription MSRP, that works out to roughly $900 per search.
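The back-of-the-envelope arithmetic, assuming the $50,000 MSRP applies to each of the 121 agencies (actual contract prices vary):

```python
# Cost per FreeForm search, using the figures from the text.
agencies = 121
searches_2025 = 6_736
annual_msrp = 50_000  # USD per agency, list price

total_spend = agencies * annual_msrp           # $6,050,000
cost_per_search = total_spend / searches_2025  # ~$898

print(f"${cost_per_search:,.0f} per search")   # → $898 per search
```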
Naturally, I wanted to know what, if anything, makes these searches so valuable.
Flock’s FreeForm
Flock writes on its ethics page that “Flock’s ALPR system cannot be used to search for human characteristics, like race or gender.” In another blog post, recently discussed here, Flock goes a step further:
Flock products do not identify race. They do not target neighborhoods based on demographics. They do not rely on subjective descriptions. They do not expand broad discretionary stops.
Instead, they narrow law enforcement action to vehicles that have been objectively linked to reported crimes.
The FreeForm product page even promises that “[m]oderation tools help prevent biased or inappropriate searches and support responsible, community-trusted policing.”
That narrative is echoed throughout Flock’s website and aggressively promoted by its 200-person sales staff.
In Q2 of 2025, Flock launched a new feature that “is all about one thing: speed. Speed to leads.”
In a move that will transform the largest network of LPR cameras in the nation, Flock announced that every existing Flock LPR camera can soon become video-enabled at no cost to the customer.
FreeForm, Flock’s AI-powered search tool, now works not only on owned LPR cameras but also on shared ones. It also supports video searches—meaning you can now search for characteristics on people* (e.g., “man in blue hoodie with backpack”) just like you would search for vehicles. You can even set alerts on these searches: think “green ATV on a trailer” or “person in orange vest,” so you’re notified in real time when there’s a match.
Plus, FreeForm is now compatible with third-party video feeds (e.g., Genetec, Milestone), so agencies can leverage its power without needing to switch platforms.
The announcement notes that “people characteristics cannot be searched on LPR feeds, only video feeds.”
The FreeForm report (formerly the “Moderation Report”) has been online for a while, but with few search entries and no documentation, I never paid much attention to it.
Now, almost a year after Flock’s Q2 2025 product announcement, we have a collection of searches from network logs provided by Flock LPR-system users — searches that show lookups for “objectClass:person” and “objectClass:people.”
The Constitution
The 2020 memo to Congress “Racial Profiling: Constitutional and Statutory Considerations for Congress,” written after the death of George Floyd, gives an overview of the boundaries of permissible searches.
The Equal Protection Clause “bars most law-enforcement decisions based on race,” and this prohibition holds “even if members of a given race are responsible for more crimes in a particular neighborhood.”
Courts have also held that “an officer cannot meet the Fourth Amendment standard by relying on a person’s racial appearance, alone, as grounds for reasonable suspicion.” But an officer may include race when “searching for a person matching a suspect’s description and part of that description is the suspect’s race.”
The Searches: Dragnets and Military Personnel
After analyzing 3,217 searches from 124 agencies — 3,184 of which Flock’s moderation allowed, 19 it blocked, and 14 it warned about — it’s clear that the “FreeForm” system as implemented is not the one Flock describes, nor the one the Constitution requires. Instead, it is a digital free-for-all where cops go on fishing expeditions based on protected characteristics. Flock even blocks the most obviously constitutional searches.
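The tally above is a simple count over moderation outcomes. A minimal sketch of that aggregation, assuming each log row carries a moderation-status field (the field names and values here are illustrative, not Flock’s actual log schema):

```python
from collections import Counter

# Hypothetical log rows; real Flock network logs use a different schema.
searches = [
    {"agency": "Houston PD", "query": "Marine Corps", "moderation": "allowed"},
    {"agency": "CHP", "query": "white male, 6ft 1in ...", "moderation": "blocked"},
    {"agency": "Spokane County SO", "query": "trump flag", "moderation": "warned"},
]

# Count searches by moderation outcome.
by_status = Counter(row["moderation"] for row in searches)
print(dict(by_status))  # → {'allowed': 1, 'blocked': 1, 'warned': 1}
```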
Houston PD searched 53,017 devices across 3,734 networks for “white car with black front bumper” (reason: murder investigation). That is a description so generic and a dragnet so wide that it would match tens of thousands of vehicles nationally.
Houston PD also searched that same 53,000-device scope for “Marine Corps” and “volkswagen jetta U.S. marine corps” — the first of which is a bare military affiliation search with no vehicle descriptor at all.
“Marine Corps” as a standalone search term, run across the entire Flock network, is functionally a request to identify every vehicle in America displaying USMC insignia — which would include many active service members and their families.
Since December 2025, Flock has redacted its network logs before providing them to the customers whose data is being searched. Those customers can’t see who ran the search. Flock, and many of its customers on the nationwide network, maintain no policies requiring background checks or prohibiting account sharing. That’s a “local decision,” says Flock.
We can’t say, or even begin to speculate, who searched the country for “Marine Corps” and for what purpose. All we know is that someone did, and that Flock’s AI-moderator approved it.
Louisville Metro PD regularly searched 39,000–42,000 devices across 2,600–2,800 networks. One search: “overloaded waste hauler” — a code enforcement query — hit 39,751 devices across 2,672 networks. Louisville is using Flock’s AI-powered search to run municipal waste-hauling compliance checks through a nationwide surveillance apparatus.
O’Fallon, Missouri PD — a city of about 90,000 people — searched 41,054 devices across 2,707 networks for the person descriptor “jeans.” No case number. Reason: “inv.” That search hit cameras in thousands of jurisdictions across the country, looking for Americans in blue jeans.
Corona, California PD consistently searched 11,400+ devices across 370+ networks for person searches including “a person,” “police badge,” and “fire” — the first of which is literally searching for the existence of a human being.
All of these are overbroad fishing expeditions run through a mass surveillance system. There is no valid investigative purpose in looking up “a person” or “jeans.” Retrieving the location history of every US Marine in the nation does not prevent crime; it harms national security.
The Moderation System: No on “white male” — Yes on “tweaker”
The most constitutionally defensible person search in the entire dataset was the California Highway Patrol’s prompt:
Looking for a white male about 6ft 1in tall, longer brown hair almost to his shoulders, slender build, will have been wearing blue jeans, boots with white paint stains on the toes and possibly carrying a black helmet
This was a search across only 91 devices and 3 networks. It is a textbook individualized suspect description: race as one of many physical identifiers, exactly as Fourth Amendment jurisprudence permits. It was run in a narrow area where this suspect was likely to be found.
Flock rejected the search. The most probable explanation, based on other searches, is that it saw “white male.”
Meanwhile, Florence, South Carolina PD searched for “all” (objectClass:people, reason: Robbery) — a search that matches literally every person on camera. Also allowed from Florence: “people,” “hoodie,” “jacket,” “jeans,” “Red.” These were searched across only 1 device and 1 network, suggesting Florence was early in deployment or testing, but the moderation system approved them regardless.
O’Fallon MO PD’s “jeans” search hit 41,054 devices; Florence’s identical search was allowed on a single one. Whatever the scope, the outcome is the same: there is no scale-based restriction either.
Hemet, California PD searched for “tweaker on bike” across 1,581 devices and 30 networks. No reason given. No case number. “Tweaker” is a slang pejorative for methamphetamine users. This is the definition of a “subjective and invasive search” — targeting people by perceived social status and assumed drug use.
Unlike the search for a highly specific white male, the moderation system allowed this search for any tweaker.
The First Amendment
A frequently raised objection concerns Flock’s (admitted) ability to search for bumper stickers and other characteristics. Flock regularly claims that only the existence of a bumper sticker can be queried, not its content. That is not what the logs show.
Spokane County WA SO searched for “american flag,” “coexist sticker,” and “trump flag” on vehicles. All three triggered a warn status. The reason fields — “freeform suspicious search test” — indicate Spokane was deliberately testing the moderation boundaries.
What happens when Flock’s AI-moderator issues a warning is not entirely clear. From earlier analysis of the frontend code, it appears to be a dialog that can be clicked through. Someone may also get a notification or an email. We don’t know.
Flock’s system knew these searches were problematic, and it flagged them, but it did not block them, as its product pages promised.
Corona CA PD searched for “american flag” on people and got blocked. The same agency searched for “american flag” on vehicles and got warned.
O’Fallon MO PD searched for “vehicle with flag” across 40,235 devices and 2,642 networks. Allowed. No warning. The generic “flag” search is arguably broader and more concerning than the specific “american flag” or “trump flag” searches that triggered warnings.
CHP searched for “Hells Angels” as a vehicle descriptor nine times (8 allowed, 1 warned from San Jose PD). The allowed searches used reasons like “Investigative Follow-up” and “Traffic Collision.” Searching for vehicles displaying Hells Angels insignia — rather than a specific vehicle involved in a specific incident — targets organizational membership.
If CHP wanted a specific motorcycle involved in a traffic collision, the search would describe the motorcycle, not the association. Seven of the nine Hells Angels searches hit only 190 devices and 1 network, suggesting a narrow local scope — but the moderation principle is the same regardless of scale.
Audit Logs and Objectivity
Of course, the majority of these searches do not have case numbers. We know by now that the claim that “every search made within the Flock platform is logged and auditable, creating a tamper-proof trail of accountability” is completely false. The sensitivity of the data being searched here — like “Marine Corps” — highlights how important it is to be able to audit a search’s full context.
Only 85 of 3,217 searches — 2.6% — contained a value in the plate field. None of the problematic searches discussed above were among them.
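The 2.6% figure is just the ratio of plate-bearing searches to the total:

```python
# Share of searches with a populated plate field (figures from the text).
with_plate = 85
total = 3_217

print(f"{with_plate / total:.1%}")  # → 2.6%
```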
The “objective criteria” Flock allows include a descriptor like “tweaker” but not a detailed description of a white male. It allows searching for every white car, or for every military member in the nation, and only lightly wags its finger when you search for protected political speech.
Flock’s AI-based moderation appears inconsistent and insufficient. It certainly won’t lead to “responsible, community-trusted policing.”
This is an insecure, unaccountable, and unrestricted dragnet that can be — and is — used to surveil Americans en masse based on their political, professional, and religious affiliations, their protected personal characteristics, and their protected speech. It is exactly what the Constitution prohibits.
For each of those searches, lawful or not, Flock collects $900.