Meta agrees to change ad technology in settlement with the US

SAN FRANCISCO – Meta agreed on Tuesday to change its ad technology and pay a $115,054 fine in a settlement with the Department of Justice over allegations that the company’s ad systems discriminated against Facebook users by restricting who could see housing ads on the platform based on their race, sex and zip code.

Under the deal, Meta, the company formerly known as Facebook, said it would change its technology and adopt a new computer-assisted method that aims to regularly check whether those who are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new method, known as the “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.

“Meta will – for the first time – change its ad delivery system to address algorithmic discrimination,” Damian Williams, the US attorney for the Southern District of New York, said in a statement. “But if Meta does not demonstrate that it has sufficiently changed its delivery system to protect against algorithmic bias, this office will proceed with litigation.”

Facebook, which has become a corporate colossus by collecting data from its users and allowing advertisers to target ads based on the characteristics of an audience, has faced complaints for years that some of these practices are biased and discriminatory. The company’s ad systems allowed marketers to choose who saw their ads from among thousands of different characteristics, which also let advertisers exclude people on the basis of protected characteristics such as race, gender and age.

The Justice Department filed its lawsuit against Meta, along with the proposed settlement, on Tuesday. In its action, the agency said it concluded that “Facebook could achieve its interests in maximizing its revenue and delivering relevant ads to users through less discriminatory means.”

While the settlement specifically refers to housing ads, Meta said it also plans to apply its new system to verify the targeting of employment and credit-related ads. The company has already faced criticism for allowing bias against women in job ads and excluding certain groups of people from seeing credit card ads.

The issue of biased ad targeting has been especially contentious in housing. In 2016, the discriminatory potential of Facebook ads was revealed in a ProPublica investigation, which showed that the company’s technology made it simple for marketers to exclude specific ethnic groups for advertising purposes.

In 2018, Ben Carson, then secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “illegally discriminated” based on categories such as race, religion and disability. In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even when an advertiser wanted the ad to be seen widely.

“Facebook is discriminating against people based on who they are and where they live,” Carson said at the time. “Using a computer to limit a person’s housing choices can be as discriminatory as slamming a door in someone’s face.”

The Justice Department’s lawsuit and settlement are based in part on HUD’s 2019 investigation and discrimination accusation against Facebook.

In its own related tests, the US Attorney for the Southern District of New York found that Meta’s ad systems directed housing ads away from certain categories of people, even when advertisers did not intend to do so. The ads were “disproportionately targeted at white users and away from black users, and vice versa,” according to the Justice Department’s complaint.

Many housing ads in neighborhoods where the majority of people were white were also targeted primarily at white users, while housing ads in areas that were mostly black were shown primarily to black users, the complaint added. As a result, according to the complaint, Facebook’s algorithms “actually and predictably reinforce or perpetuate patterns of racially segregated housing.”

In recent years, civil rights groups have also fought the vast and complicated advertising systems that underpin some of the Internet’s biggest platforms. The groups argued that these systems have inherent biases and that tech companies like Meta, Google and others should do more to combat these biases.

The area of study, known as “algorithmic fairness,” has been a major topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm about these biases for years.

In the years since, Facebook has restricted the types of categories marketers can choose from when purchasing housing ads, reducing the number to hundreds and eliminating targeting options based on race, age and zip code.

Chancela Al-Mansour, executive director of the Housing Rights Center in Los Angeles, said it was “essential” that “fair housing laws are aggressively enforced.”

“Housing ads have become tools for illegal behavior, including segregation and discrimination in housing, employment and credit,” she said. “Most users had no idea they were being targeted or denied housing ads based on their race and other characteristics.”

Meta’s new ad technology, which is still in development, will periodically check who is receiving ads for housing, employment and credit, and ensure that those audiences match the people marketers want to reach. If the ads being run start to skew heavily toward white men in their 20s, for example, the new system will theoretically recognize this and shift delivery so the ads run more equitably across broader and more varied audiences.

“Occasionally, we’ll take a snapshot of marketers’ audiences, see who they’re targeting, and remove as much variation from that audience as possible,” Roy L. Austin, Meta’s vice president of civil rights and deputy general counsel, said in an interview. He called it “a significant technological advance in how machine learning is used to deliver personalized ads.”
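The mechanism Austin describes amounts to periodically comparing the demographic makeup of the users who actually saw an ad against the makeup of the eligible audience, then correcting any skew. As a rough illustration only, here is a minimal Python sketch of that comparison step; the group labels, threshold and function names are hypothetical assumptions for this sketch, not details of Meta’s actual variance reduction system, which has not been published.

```python
# Hypothetical sketch of the "snapshot and compare" idea described above --
# NOT Meta's actual variance reduction system. All names, thresholds, and
# data structures here are illustrative assumptions.
from collections import Counter

def demographic_shares(user_groups):
    """Return each group's share of a list of coarse demographic labels."""
    counts = Counter(user_groups)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def delivery_variance(eligible, delivered):
    """Largest absolute gap between a group's share of the eligible
    audience and its share of the users who actually saw the ad."""
    e, d = demographic_shares(eligible), demographic_shares(delivered)
    return max(abs(e.get(g, 0.0) - d.get(g, 0.0)) for g in set(e) | set(d))

# Example: the eligible audience is evenly split, but delivery skews
# heavily toward group "A" -- the check flags the ad for rebalancing.
eligible = ["A"] * 500 + ["B"] * 500
delivered = ["A"] * 420 + ["B"] * 80

THRESHOLD = 0.10  # illustrative tolerance, not a figure from the settlement
if delivery_variance(eligible, delivered) > THRESHOLD:
    print("Delivery skew detected: adjust pacing toward under-served groups")
```

The point of the example is only the shape of the check the article describes: snapshot the delivered audience, compare it to the eligible one, and flag deliveries whose variance exceeds a tolerance so the system can rebalance.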

Meta said it would work with HUD over the next few months to incorporate the technology into Meta’s ad targeting systems and agreed to a third-party audit of the new system’s effectiveness.

The company also said it would no longer use a feature called “special ad audiences,” a tool designed to help advertisers expand the groups of people their ads would reach. The Justice Department said the tool also engaged in discriminatory practices. Meta said the tool was an early effort to combat bias and that its new methods would be more effective.

The $115,054 fine that Meta agreed to pay in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.

“The public should know that the latest Facebook abuse was worth the same amount of money as Meta earns in about 20 seconds,” said Jason Kint, chief executive of Digital Content Next, an association for premium publishers.

As part of the settlement, Meta did not admit to any wrongdoing.
