“It’s very clear that the algorithm is doing exactly what Facebook intended it to do: That is, it relies on gender and age to decide who should receive ads,” Peter Romer-Friedman, one of the lawyers representing the group, said in an interview. “The problem of algorithmic discrimination is far worse than anyone ever thought it was, or studied, or found through studies.”
Romer-Friedman was previously involved in a discrimination case against Facebook that led to a 2019 settlement in which the social media company agreed to make sweeping changes to its ad platform.
The legal filing draws on publicly available data in the company’s ad library, which reveals a pattern of “algorithmic steering” that causes some job ads to be shown to an audience of 90 percent women or men. For instance, an employer seeking to hire truck drivers in North Carolina set the eligible audience for a job ad to all genders. But of the people Facebook showed the ad to, 94 percent were men and just 11 percent were 55 and older, according to the lawsuit.
The complaint is likely to add to the public and legal scrutiny facing Meta over whether the company’s automated ad system, which is known for offering marketers the ability to tailor ads to thin slices of the population, has discriminated against minorities and other vulnerable groups in the areas of employment, housing and finance.
In 2019, Facebook agreed to stop allowing advertisers to use gender, age and ZIP codes to market housing, credit and job openings to its users. That change came after a Washington state attorney general probe and a ProPublica report found that Facebook was letting advertisers conceal housing ads from African Americans and other minorities. Earlier this year, Meta agreed to build a new automated advertising system that the company says will help ensure that housing-related ads are delivered to a more equitable mix of the population.