There’s a rising controversy right now — I’ve received a couple of dozen queries about this in the last few days — regarding Facebook’s permitting advertisers to block particular ads from being shown to specific “ethnic affinity” groups, e.g. African Americans.
Facebook insists that these aren’t actually racial categories per se, since they don’t directly ask users about their race. Rather, the company says it “merely” assigns a kind of racial “score” to users based on their activities.
That’s Facebook double-talk of course. Look at stuff that Facebook figures mainly interests whites, and Facebook sorts you into the white club. Look at materials that Facebook assumes mainly attract blacks, and Facebook relegates you to the black shack. Same idea for Hispanics, and so on.
These assumptions are naturally going to be wrong part of the time, but Facebook cares not, since they don’t make a point of explicitly telling you which racial categories — and that’s what these actually are, racial categories — they’ve slotted you into.
But they do tell advertisers, at least to the extent that they permit advertisers to exclude different racial groups (or, excuse me, I mean “ethnic affinity” groups) from seeing particular ads or even knowing that those ads exist.
Facebook insists that their rules prohibit using these “racial control” facilities in illegal ways — such as to foster housing or job discrimination against particular racial groups.
But this issue hit the fan now when it was demonstrated how simple it is to get clearly racially discriminatory and illegal ads approved via Facebook’s advertising portal.
Facebook (which, despite having put these racial categories in their “demographics” section, seems to assert that they’re not really demographic!) tries to explain away these problems with the usual excuse — blame the users (or in this case, blame the advertisers). This despite the fact that it’s Facebook’s creation of these racial filters that practically begs racist advertisers to use them to exclude what those advertisers deem to be “undesirable” persons.
This kind of “hey, it’s not our fault!” excuse would never fly with newspaper ads or other traditional advertising, but has become common with Internet darlings, including firms like Uber and Airbnb, who are increasingly facing government actions pushing back on their cavalier attitudes in a range of contexts.
This is not to say that there’s anything wrong with targeted advertising as a whole. In fact, it helps avoid wasting users’ time with ads for products or services that they probably don’t care about.
But once you step into the fire of racial classification on the Net, you’re letting yourself in for a world of pain.
Just as a thought experiment, imagine if Google permitted YouTube uploaders to specify which racial groups would be allowed to find and view particular videos. Google would be rightly crucified in short order.
Obviously, Google would never do this. Yet what Facebook is actually doing is far worse than this imaginary example, and they’ve been doing it under the radar of most users. People writing to me are expressing outrage that Facebook didn’t clearly inform them that they were being secretly stuffed into racial boxes and spoon-fed particular ads based on those racial classifications.
Ultimately, this sort of misbehavior by Facebook threatens to provide ammunition to politicians and their cronies who have long wished to impose draconian controls on users’ ability to post a wide range of completely legitimate materials on social media, video, and other sorts of sites. There’s nothing that these politicos would love more than to leverage racial discrimination into broad-based Internet censorship.
Facebook needs to clean up their act. Or the government is likely to clean it up for them, and in its overreaction do immense harm to everyone else in the process.
I have consulted to Google, but I am not currently doing so — my opinions expressed here are mine alone.
– – –
The correct term is “Internet” NOT “internet” — please don’t fall into the trap of using the latter. It’s just plain wrong!