Facebook will eliminate tool that allows advertisers to discriminate — ProPublica

In a settlement announced by the Justice Department on Tuesday, Meta Platforms — formerly known as Facebook — agreed to eliminate features in its advertising business that allow landlords, employers and lenders to discriminate against groups of people protected by federal civil rights laws.

The settlement comes nearly six years after ProPublica first revealed that Facebook allowed real estate marketers to exclude African Americans and others from seeing some of its ads. Federal law prohibits housing, employment and credit discrimination based on race, religion, gender, familial status and disability.

For years, ProPublica and other researchers showed that problems persisted in the delivery of ads related to housing, employment and credit, even as Facebook committed to closing the loopholes we identified.

This week’s settlement was the result of a lawsuit filed three years ago by the Trump administration alleging that Meta’s ad targeting system violated the Fair Housing Act. The DOJ also argued that Facebook used machine learning algorithms to narrow and create ad audiences, which had the effect of steering ads toward or away from legally protected groups. It was the first time the federal government had challenged algorithmic bias under the Fair Housing Act.

As part of the settlement, Meta has agreed to deploy new advertising methods that will be scrutinized by a third-party reviewer and overseen by the court.

The company said in a statement that it will implement a “new use of machine learning technology that will work to ensure that the age, gender, and estimated race or ethnicity of a housing ad’s overall audience matches the estimated age, gender, and race or ethnicity mix of the population eligible to see that ad.”

The statement, from Roy L. Austin Jr., vice president of civil rights and deputy general counsel at Meta, noted that while the settlement only requires Facebook to use its new tool for housing-related ads, the company will also apply it to employment- and credit-related ads. (Facebook declined a request for additional comment.)

Civil rights attorney Peter Romer-Friedman, who has filed multiple lawsuits against the company, said previous negotiations had tried and failed to hold Facebook accountable for algorithmic bias. “Ultimately, what this shows is that it was never a matter of feasibility to eliminate algorithmic bias,” he told ProPublica. “It’s a matter of will.”

After we reported on the potential for advertising discrimination in 2016, Facebook quickly promised to create a system to catch and review ads that illegally discriminate. A year later, ProPublica found that it was still possible to exclude groups such as African Americans, mothers of high school students, people interested in wheelchair ramps and Muslims from seeing ads. It was also possible to target ads to people with an interest in anti-Semitic topics, including options like “How to burn Jews” and “Hitler did nothing wrong.”

We later found that companies were posting job ads that women and older workers could not see. In March 2019, Facebook resolved a lawsuit brought by civil rights groups by creating a “special ad portal” specifically for job, housing and credit ads. The company said the portal would restrict advertisers’ targeting options and also prevent its algorithm from considering gender and race when deciding who should see ads.

But when ProPublica worked with researchers at Northeastern University and Upturn to test Facebook’s new system, we found more examples of biased ad delivery. While Facebook’s modified algorithm prevented advertisers from discriminating overtly, delivery could still rely on “special ad audiences” or “lookalike” audiences, proxies that correlate with race or gender.

The research also found that Facebook skewed an ad’s audience depending on the content of the ad itself. How many women saw a listing for a janitorial job, for example, depended not only on what the advertiser told Facebook but also on how Facebook interpreted the ad’s image and text.

ProPublica also continued to find job advertisements that favored men or excluded older prospective candidates, potentially violating civil rights law. Some advertisers we interviewed were surprised to learn that they couldn’t reach a diverse audience even if they tried.

In a press release, the DOJ said Tuesday’s settlement requires Meta to stop using the “Special Ad Audience” tool by the end of the year. It also requires Meta to change its algorithm “to address the racial, ethnic, and gender disparities between the advertisers’ target audience and the Facebook user pool to whom Facebook’s personalization algorithms actually deliver the ads.” The company must share details with the DOJ and an independent reviewer before implementing the changes.
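The disparity the DOJ describes can be made concrete: it is the gap between the demographic mix of the users eligible to see an ad and the mix of those to whom the ad is actually delivered. The sketch below is purely illustrative, with made-up group labels and numbers rather than anything from Meta’s actual system (whose internals are not public); it measures that gap as the total variation distance between the two distributions.

```python
from collections import Counter

def demographic_distribution(users):
    """Return the fraction of users in each demographic group."""
    counts = Counter(users)
    total = len(users)
    return {group: n / total for group, n in counts.items()}

def delivery_disparity(eligible, delivered):
    """Total variation distance between the demographic mix of the
    eligible audience and of the users who actually saw the ad.
    0.0 means delivery mirrors the eligible pool; 1.0 means the two
    audiences share no demographic groups at all."""
    p = demographic_distribution(eligible)
    q = demographic_distribution(delivered)
    groups = set(p) | set(q)
    return 0.5 * sum(abs(p.get(g, 0.0) - q.get(g, 0.0)) for g in groups)

# Hypothetical numbers: an eligible pool split 50/50 by gender, but an
# ad that was actually delivered to an 80/20 male-skewed audience.
eligible = ["women"] * 500 + ["men"] * 500
delivered = ["women"] * 80 + ["men"] * 320
print(delivery_disparity(eligible, delivered))  # 0.3
```

Under the settlement’s logic, a system like Meta’s would be expected to drive a metric of this kind toward zero for housing ads, with the independent reviewer checking the results.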

As part of the settlement, Meta also agreed to pay a civil penalty of $115,054, the maximum allowed by law.

“Because of this groundbreaking lawsuit, Meta will, for the first time, change its ad delivery system to address algorithmic discrimination,” U.S. Attorney Damian Williams for the Southern District of New York said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
