Facebook settles with DOJ over discriminatory housing ads


Meta, the owner of Facebook, has agreed to overhaul the social network’s targeted advertising system under a sweeping settlement with the US Department of Justice, after the company was accused of allowing advertisers to market housing ads in a discriminatory manner.

The settlement, which stems from a 2019 Fair Housing Act lawsuit filed by the Trump administration, is the second in which the company has agreed to change its ad systems to avoid discrimination. But Tuesday’s settlement goes beyond the first, requiring Facebook to revamp its powerful internal ad-targeting tool, known as Lookalike Audiences. Government officials said that by allowing advertisers to target housing-related ads by race, sex, religion or other sensitive characteristics, the product enabled housing discrimination.

Under the deal, Facebook will build a new automated advertising system that the company says will help ensure that housing-related ads are delivered to a more equitable mix of the population. The agreement said that the social media giant would have to submit the system to a third party for review. Facebook, which last year renamed its parent company Meta, also agreed to pay a fee of $115,054, the maximum penalty available under the law.

“This settlement is historic, marking the first time Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

Advertisers will still be able to target their ads to users in specific locations, though not based solely on their zip codes, and to users with a limited set of interests, according to Facebook spokesman Joe Osborne.


Facebook vice president of civil rights Roy Austin said in a statement that the company will use machine learning technology to try to distribute who sees housing-related ads more equitably, regardless of how marketers have targeted those ads, taking into account the age, sex and likely race of users.

“Discrimination in housing, employment and credit is a deeply rooted issue with a long history in the US, and we are committed to expanding opportunities for marginalized communities in these spaces and beyond,” Austin said in a statement. “This type of work is unprecedented in the advertising industry and represents a significant technological advance in how machine learning is used to deliver personalized ads.”

Federal law prohibits housing discrimination based on race, religion, national origin, gender, disability or family status.

The settlement follows a series of legal complaints against Facebook from the Department of Justice, a state attorney general and civil rights groups, arguing that the company’s algorithm-based marketing tools – which specialize in giving advertisers the ability to target ads to thin slices of the population – discriminated against minorities and other vulnerable groups in the areas of housing, credit and employment.

In 2019, Facebook agreed to stop allowing advertisers to use gender, age and zip codes – which often act as proxies for race – to market housing, credit and job openings to its users. That change followed an investigation by the Washington state attorney general and a report by ProPublica that found Facebook was allowing advertisers to use its micro-targeting tools to hide housing ads from African-American and other minority users. Later, Facebook said it would no longer let advertisers use the “ethnic affinities” category for housing, credit and employment ads.


But since the company agreed to those deals, researchers have found that Facebook’s systems can continue to discriminate, even when advertisers are banned from checking specific boxes for sex, race or age. In some cases, its software detects that people of a certain race or gender frequently click on a specific ad, and then begins to reinforce those biases by showing the ad to “similar audiences,” said Peter Romer-Friedman, a director of the law firm Gupta Wessler PLLC.

The result could be that only men see a particular housing ad, even when the advertiser did not specifically try to show the ad only to men, said Romer-Friedman, who has filed several civil rights lawsuits against the company, including a 2018 suit that led to the company agreeing to limit its ad-targeting categories.

Romer-Friedman said the settlement was a “huge achievement” because it was the first time a platform was willing to make major changes to its algorithms in response to a civil rights lawsuit.

For years, Facebook has faced complaints from civil rights advocates and people of color, who argue that the company’s enforcement sometimes unfairly removes content in which people complain of discrimination. In 2020, the company underwent an independent civil rights audit, which found that the company’s policies were a “tremendous setback” for civil rights.
