About two weeks ago, millions of Google Chrome users were signed up for an experiment they never agreed to be a part of. Google had just launched a test run for Federated Learning of Cohorts—or FLoC—a new kind of ad-targeting tech meant to be less invasive than the average cookie. In a blog post announcing the trial, the company noted that it would only affect a “small percentage” of random users across ten countries, including the US, Mexico, and Canada, with plans to expand globally as the trial goes on.
These users probably won’t notice anything different when they click around in Chrome, but behind the scenes, that browser is quietly keeping a close eye on every site they visit and every ad they click. These users will have their browsing habits profiled, packaged up, and shared with countless advertisers for profit. Sometime this month, Chrome will give users an option to opt out of this experiment, according to Google’s blog post—but as of right now, their only recourse is to block all third-party cookies in the browser.
That is, if they even know these tests are happening in the first place. While I’ve written my fair share about FLoC up to this point, the loudest voices I’ve seen pipe up on the topic are either marketing nerds, policy nerds, or policy nerds who work in marketing. That might be because—aside from a few blog posts here and there—the only breadcrumbs Google’s given to people looking to learn more about FLoC are inscrutable pages of code, an inscrutable GitHub repo, and inscrutable mailing lists. Even if Google had bothered asking for consent before enrolling a random sample of its Chrome user base in this trial, there’s a good chance those users wouldn’t know what they were consenting to.
(For the record, you can check whether you’ve been opted into this initial test using this handy tool from the Electronic Frontier Foundation.)
The trackers that FLoC is meant to replace are known as “third-party cookies.” We have a pretty in-depth guide to the way this sort of tech works, but in a nutshell: these are snippets of code from adtech companies that websites can bake into the code underpinning their pages. Those bits of code monitor your on-site behavior—and sometimes other personal details—before the adtech org behind that cookie beams that data back to its own servers.
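For the technically curious, the cross-site mechanism described above can be boiled down to a few lines. This is a minimal sketch, not any real adtech company’s code, and all the domain names are made up: the point is just that when two unrelated sites embed a snippet from the same tracker domain, the cookie that domain set the first time lets it tie both visits to one ID.

```python
# Illustrative sketch only: how one third-party cookie lets a single
# tracker correlate your visits across otherwise unrelated websites.
import uuid


class Browser:
    """Stores cookies keyed by the domain that set them."""

    def __init__(self):
        self.cookies = {}  # tracker domain -> cookie value

    def visit(self, site, tracker):
        # The page at `site` embeds a snippet that phones home to the
        # tracker, sending along any cookie that domain previously set.
        cookie = self.cookies.get(tracker.domain)
        self.cookies[tracker.domain] = tracker.serve(site, cookie)


class Tracker:
    """The adtech server behind the embedded snippet."""

    def __init__(self, domain):
        self.domain = domain
        self.profiles = {}  # cookie ID -> list of sites seen

    def serve(self, site, cookie):
        if cookie is None:
            cookie = str(uuid.uuid4())  # first sighting: mint a new ID
        self.profiles.setdefault(cookie, []).append(site)
        return cookie


tracker = Tracker("ads.example")      # hypothetical adtech domain
you = Browser()
you.visit("news.example", tracker)    # two unrelated sites...
you.visit("shoes.example", tracker)   # ...embedding the same tracker

# The tracker now holds a cross-site browsing profile tied to one ID.
print(list(tracker.profiles.values()))  # [['news.example', 'shoes.example']]
```

Blocking third-party cookies breaks this scheme by refusing to send the tracker its cookie back on the second visit, which is exactly why it’s currently the only way to sit out Google’s trial.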
The catch is that Google still has all that juicy user-level data, because it controls Chrome. They’re also still free to keep doing what they’ve always done with that data: sharing it with federal agencies, accidentally leaking it, and, y’know, just being Google.
“Isn’t that kind of… anti-competitive?”
It depends on who you ask. Competition authorities in the UK certainly think so, as do trade groups here in the US. FLoC has also been wrapped up in a Congressional probe, at least one class-action suit, and a massive multi-state antitrust case spearheaded by Texas Attorney General Ken Paxton. Their qualms with FLoC are pretty easy to understand. Google already controls about 30% of the digital ad market in the US, slightly more than Facebook—the other half of the so-called Duopoly—which controls about 25% (for context, Microsoft controls about 4%).
While that dominance has netted Google billions upon billions of dollars per year, it’s recently drawn a mounting pile of antitrust investigations against the company, too. And those investigations have pretty universally painted a picture of Google as a blatant autocrat of the ad-based economy, and one that largely got away with abhorrent behavior because smaller rivals were too afraid—or unable—to speak up. That’s why many of them are speaking up about FLoC now.
“But at least it’s good for privacy, right?”
Again, it depends on who you ask! Google thinks so, but the EFF sure doesn’t. In March, the EFF put out a detailed piece breaking down some of the biggest gaps in FLoC’s privacy promises. If a particular website prompts you to give up some sort of first-party data—by having you sign up with your email or phone number, for example—your FLoC identifier isn’t really anonymous anymore.
Aside from that hiccup, the EFF points out that your FLoC cohort follows you everywhere you go across the web. This isn’t a big deal if my cohort is just “people who like to reupholster furniture,” but it gets really dicey if that cohort happens to inadvertently mold itself around a person’s mental health disorder or their sexuality based on the sites that person browses. While Google’s pledged to keep FLoC from creating cohorts based on these sorts of “sensitive categories,” the EFF again pointed out that Google’s approach was riddled with holes.