By Sam Dean
Los Angeles Times
Facebook makes money by charging advertisers to reach just the right audience for their message, even when that audience is made up of people interested in the perpetrators of the Holocaust or explicitly neo-Nazi music.
Despite promises of greater oversight following past advertising scandals, a Times review shows that Facebook has continued to allow advertisers to target hundreds of thousands of users the social media firm believes are curious about topics such as “Joseph Goebbels,” “Josef Mengele,” “Heinrich Himmler,” the neo-Nazi punk band Skrewdriver and Benito Mussolini’s long-defunct National Fascist Party.
Experts say that this practice runs counter to the company’s stated principles and can help fuel radicalization online.
“What you’re describing, where a clear hateful idea or narrative can be amplified to reach more people, is exactly what they said they don’t want to do and what they need to be held accountable for,” said Oren Segal, director of the Anti-Defamation League’s Center on Extremism.
After being contacted by The Times, Facebook said that it would remove many of the audience groupings from its ad platform.
“Most of these targeting options are against our policies and should have been caught and removed sooner,” said Facebook spokesman Joe Osborne. “While we have an ongoing review of our targeting options, we clearly need to do more, so we’re taking a broader look at our policies and detection methods.”
Approved by Facebook
Facebook’s broad reach and sophisticated advertising tools brought in a record $55 billion in ad revenue in 2018.
Profit margins stayed above 40 percent, thanks to a high degree of automation, with algorithms sorting users into marketable subsets based on their behavior, then choosing which ads to show them.
But the lack of human oversight has also brought the company controversy.
In 2017, ProPublica found that the company sold ads based on any user-generated phrase, including “Jew hater” and “Hitler did nothing wrong.” Following the murder of 11 congregants at a synagogue in Pittsburgh in 2018, The Intercept found that Facebook gave advertisers the ability to target users interested in the anti-Semitic “white genocide conspiracy theory,” which the suspected killer cited as inspiration before the attack.
This month, the Guardian highlighted the ways that YouTube and Facebook boost anti-vaccine conspiracy theories, leading Rep. Adam Schiff, D-Calif., to question whether the company was promoting misinformation.
Facebook has promised since 2017 that humans review every ad targeting category, and last fall it announced the removal of 5,000 audience categories that risked enabling abuse or discrimination.
The Times decided to test the effectiveness of the company’s efforts by seeing if Facebook would allow the sale of ads directed to certain segments of users.
Facebook allowed The Times to target ads to users the company has determined are interested in Goebbels, the Third Reich’s chief propagandist; Himmler, the architect of the Holocaust and leader of the SS; and Mengele, the infamous concentration camp doctor who performed human experiments on prisoners. Each category included hundreds of thousands of users.
The company also approved an ad targeted to fans of Skrewdriver, a notorious white supremacist punk band, and automatically suggested a series of topics related to European far-right movements to bolster the ad’s reach.
Collectively, the ads were seen by 4,153 users in 24 hours with The Times paying only $25 to fuel the push.
Facebook admits its human moderators should have removed the Nazi-affiliated demographic categories. But it says the “ads” themselves, which consisted of the word “test” or The Times’ logo and linked back to the newspaper’s homepage, would not have raised red flags for the separate team that looks over ad content.
Upon review, the company said the ad categories were seldom used. The few ads purchased linked to historical content, Facebook said, but the company would not provide more detail on their origin.
‘Why is it my job to police their platform?’
The Times was tipped off by a Los Angeles musician who asked to remain anonymous for fear of retaliation from hate groups.
Earlier this year, he tried to promote a concert featuring his hardcore punk group and a black metal band on Facebook. When he typed “black metal” into Facebook’s ad portal, he said he was disturbed to discover that the company suggested he also pay to target users interested in “National Socialist black metal,” a potential audience numbering in the hundreds of thousands.
The punk and metal music scenes, and black metal in particular, have long grappled with white supremacist undercurrents.
Black metal grew out of the early Norwegian metal scene, which saw prominent members convicted of burning down churches, murdering fellow musicians and plotting bombings. Some bands and their fans have since combined anti-Semitism, neo-paganism, and the promotion of violence into the distinct subgenre of National Socialist black metal, which the Southern Poverty Law Center described as a dangerous white supremacist recruiting tool nearly 20 years ago.
But punk and metal fans have long pushed back against hate. In 1981, the Dead Kennedys released “Nazi Punks F— Off”; last month 15 metal bands played at an anti-fascist festival in Brooklyn.
The musician saw himself as a part of that same tradition.
“I grew up in a punk scene in Miami where there were Nazis, they would kind of invade the concerts as a place where they knew they could get away with violence,” he said.
So he saw it as his duty, he said, to contact Facebook and express his disgust.
Facebook subsequently removed the grouping from the platform, but the musician remains incredulous that “National Socialist black metal” was a category in the first place, let alone one the company specifically prompted him to pursue.
“Why is it my job to police their platform?” he said.
A rabbit hole of hate
After reviewing screenshots verifying the musician’s story, The Times investigated whether Facebook would allow advertisers to target explicitly neo-Nazi bands or other terms associated with hate groups.
We started with Skrewdriver, a British band with a song called “White Power” and an album named after a Hitler Youth motto.
Since the band only had 2,120 users identified as fans, Facebook informed us that we would need to add more target demographics to publish the ad.
The prompt led us down a rabbit hole of terms it thought were related to white supremacist ideology.
First, it recommended “Thor Steinar,” a clothing brand that has been banned in the German parliament because of its association with neo-Nazism.
Then, it recommended “NPD Group,” the name of both a prominent American market research firm and a far-right German political party associated with neo-Nazism. Among the next recommended terms were “Flüchtlinge,” the German word for “refugees,” and “Nationalism.”
Facebook said the categories “Flüchtlinge,” “Nationalism,” and “NPD Group” are in line with its policies and will not be removed despite appearing as auto-suggestions following neo-Nazi terms. (Facebook said it had found that the users interested in NPD Group were actually interested in the American market research firm.)
In the wake of past controversies, Facebook has blocked ads aimed at those interested in the most obvious terms affiliated with hate groups.
“Nazi,” “Hitler,” “white supremacy” and “Holocaust” all yield nothing in the ad platform. But advertisers could target more than a million users with interest in Goebbels or the National Fascist Party, which dissolved in 1943.
Himmler’s audience numbered nearly 95,000 users. Mengele’s had 117,150, a figure that grew to 127,010 over the course of our reporting.
Facebook said these categories were automatically generated based on user activity, such as liking or commenting on ads or joining certain groups. But it would not provide specific details about how it determined a user’s interest in topics linked to Nazis.
‘Expanding the orbit’
The ads ended up on the Facebook pages of a wide swath of media outlets, including the Daily Wire, CNN, HuffPost, Mother Jones, Breitbart, the BBC and ABC News.
They also went to viral sites with names like Pupper Doggo, I Love Movies and Right Health Today, a seemingly defunct media company whose only Facebook post was a link to a now-deleted article titled “What Is The Benefits Of Eating Apple Everyday.”
Segal, the ADL director, said Facebook might wind up fueling the recruitment of new extremists by serving up such ads on the types of pages an ordinary news reader might visit.
“Being able to reach so many people with extremist content, existing literally in the same space as legitimate news or non-hateful content, is the biggest danger,” he said. “What you’re doing is expanding the orbit.”
Some critics contend that the potential for exploitation is built into the fundamental workings of ad platforms like Facebook’s, regardless of whether the target demographics are explicitly extremist.
“Finely targeted digital advertising allows anonymous advertisers with who knows what political agenda to test messages that try to tap into some vulnerability and channel a grievance in some particular direction,” said Anthony Nadler, a professor at Ursinus College in Pennsylvania who researches how social networks and ad platforms can assist radicalization and spread disinformation. “I imagine that the more sophisticated white supremacists out there are trying to figure out how to expand their base.”