- Facebook's automatic ad targeting systems can still engage in forms of "discriminatory ad delivery" — including around race and gender — even when marketers on the platform cast a broad net with the audiences they want to reach, per a new study reported by Adweek. Facebook said in a statement to Adweek that it's considering enacting more changes in response to the findings.
- The study, available online, was conducted by Northeastern University, the University of Southern California and digital rights nonprofit Upturn. The team spent more than $8,500 to purchase and run actual Facebook campaigns to examine how the systems work. Researchers deliberately controlled for different aspects of the campaigns, such as targeting parameters and ad runtimes. Budgetary factors, ad copy and creative elements like imagery all contributed to placement in front of different demographics on the site, the study found.
- Facebook's systems appear to work in fairly subtle ways. The researchers embedded stereotypical image data that might appeal to male or female consumers — in this case, a football and a makeup brush, respectively — and hid it from plain sight in their ads, per Adweek. Facebook still delivered the ads to different audiences, which the study argues indicates that the ads were being automatically scanned and skewed in a particular fashion. In another example, the study saw ad delivery discriminate by race. A housing ad was treated differently depending on whether the house was labeled as for sale or for rent and whether it contained depictions of a black or a white family.
The new batch of research suggests that, despite a series of changes and sharper scrutiny into its advertising practices, Facebook is still engaging in forms of ad-targeting discrimination that could further degrade consumer trust in the already embattled platform. Late last month, the Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act for allowing housing discrimination via its advertising, according to CNN. The government body first filed a complaint against Facebook in August, but criticism of its practices around issues like how housing and employment ads are served to users has lingered for years.
Back in 2016, Facebook shut down a targeting feature that let advertisers exclude users whose behavior on the site supposedly linked them to particular racial and ethnic groups. The adjustment came, in part, as a response to findings from ProPublica that argued the feature violated both the Civil Rights Act of 1964 and the Fair Housing Act of 1968.
Then, in August 2018, Facebook killed 5,000 ad-targeting options in what it said was a bid to combat discriminatory and abusive practices. Finally, last month, the company axed age, gender and ZIP code-related targeting for advertising categories like housing and employment. That push was part of a settlement of five discrimination lawsuits filed by the National Fair Housing Alliance, the Communications Workers of America and other advocacy groups. Facebook additionally paid out about $5 million to settle several of the suits, CNN reported.
Now, the latest findings show that discrimination might still be a major problem when it comes to Facebook's ad targeting, including for continued pain points like housing. The study could read as particularly damning because the campaigns the researchers ran intentionally went broad in the audiences they wanted to reach, and yet Facebook's technology still funneled the ads to very different sets of users.
Facebook's pattern of introducing repeated fixes to system-wide problems, only to see those problems persist, has become a familiar story with both ad-targeting discrimination and data privacy. The company, which faces mounting regulatory scrutiny in the U.S. and abroad, earlier this year announced a significant pivot of its business model to focus on private, encrypted and ephemeral messaging across its entire suite of apps, which will all be linked on the back end.