In DOJ settlement, Facebook agrees to eliminate tool that enabled discriminatory advertising

This story was originally published by Ariana Tobin and Ava Kofman of ProPublica.

In a settlement announced by the Justice Department on Tuesday, Meta Platforms – formerly known as Facebook – agreed to eliminate features from its advertising business that allow landlords, employers and credit agencies to discriminate against groups of people protected by federal civil rights laws.

The deal comes nearly six years after ProPublica first revealed that Facebook was letting real estate marketers block African Americans and others from seeing some of their ads. Federal law prohibits discrimination in housing, employment, and credit based on race, religion, sex, marital status, and disability.

ProPublica and other researchers have shown that problems persist in the delivery of ads related to housing, jobs and credit, even though Facebook has committed to fixing the flaws we have identified.

This week’s settlement stems from a lawsuit filed three years ago by the Trump administration alleging that Meta’s ad targeting system violated the Fair Housing Act. The DOJ also alleged that Facebook used a machine learning algorithm to narrow and create ad audiences, which had the effect of skewing delivery toward or away from legally protected groups. It was the first time the federal government had challenged algorithmic bias under the Fair Housing Act.

As part of the settlement, Meta agreed to deploy a new ad delivery method that will be vetted by a third party and overseen by the court.

The company said in a statement that it will implement a “new use of machine learning technology that will work to ensure that the age, gender, and estimated race or ethnicity of the overall viewership of a real estate ad matches the age, gender, and race or ethnicity of the population eligible to view that ad.”
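In broad terms, the approach Meta describes means comparing the demographic makeup of the people who actually saw an ad with the makeup of the audience eligible to see it, and adjusting delivery when the two diverge. As a purely illustrative sketch (the data, names, and tolerance below are hypothetical, and Meta’s actual system is proprietary), here is one way such a divergence could be measured in Python, using total variation distance:

```python
from collections import Counter

def demographic_distribution(labels):
    """Turn raw demographic labels into a probability distribution."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def total_variation_distance(p, q):
    """Half the L1 distance between two distributions: 0 = identical, 1 = disjoint."""
    groups = set(p) | set(q)
    return 0.5 * sum(abs(p.get(g, 0.0) - q.get(g, 0.0)) for g in groups)

# Hypothetical data: gender makeup of the eligible audience vs. the
# audience the ad was actually shown to.
eligible = demographic_distribution(["woman"] * 520 + ["man"] * 480)
delivered = demographic_distribution(["woman"] * 310 + ["man"] * 690)

gap = total_variation_distance(eligible, delivered)
TOLERANCE = 0.05  # illustrative threshold, not a figure from the settlement
if gap > TOLERANCE:
    print(f"Delivery skew of {gap:.2f} exceeds tolerance; rebalance delivery.")
```

The settlement itself does not spell out the metric; as described below, Meta must share the details of its approach with the DOJ and an independent reviewer before rolling it out.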

The statement from Roy L. Austin Jr., vice president of civil rights and deputy general counsel at Meta, noted that while the settlement only requires Facebook to use its new tool for housing-related ads, the company will also apply it to employment and credit ads. (Facebook declined a request for additional comment.)

Civil rights attorney Peter Romer-Friedman, who has filed multiple lawsuits against the company, said previous settlements had tried, without success, to hold Facebook accountable for algorithmic bias. “Ultimately, what this shows is that it was never about the feasibility of eliminating algorithmic bias,” he told ProPublica. “It is a question of will.”

After ProPublica flagged the potential for ad discrimination in 2016, Facebook quickly promised to implement a system to detect and review unlawfully discriminatory ads. A year later, ProPublica found that it was still possible to exclude groups such as African Americans, mothers of high school kids, people interested in wheelchair ramps, and Muslims from seeing ads. It was also possible to target ads to people interested in anti-Semitism, including options such as “How to burn Jews” and “Hitler didn’t do anything wrong.”

We later discovered that companies were posting job ads that women and older workers couldn’t see. In March 2019, Facebook settled a lawsuit brought by civil rights groups by creating a “special ads portal” specifically for job, housing and credit ads. The company said the portal would limit advertisers’ targeting options and also prevent its algorithm from taking gender and race into account when deciding who should see ads.

But when ProPublica worked with researchers from Northeastern University and Upturn to test Facebook’s new system, we found other examples of biased ad delivery. Although Facebook’s modified algorithm prevented advertisers from explicitly discriminating, delivery could still skew along race or gender lines, because tools such as “special ad” audiences and “lookalike” audiences relied on proxy characteristics correlated with those attributes.
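The mechanism the researchers identified is straightforward to reproduce in miniature. In the hypothetical Python sketch below (all data and names are invented; this is not Facebook’s actual code), an audience-selection rule never sees gender, yet the audience it selects skews heavily male because the interest feature it keys on is correlated with gender in the synthetic data:

```python
import random
from collections import Counter

random.seed(0)

# Synthetic users: the selection rule never looks at gender, but the
# "interest" feature is correlated with it and so acts as a proxy.
users = []
for _ in range(10_000):
    gender = random.choice(["woman", "man"])
    p_construction = 0.8 if gender == "man" else 0.2
    interest = "construction" if random.random() < p_construction else "education"
    users.append({"gender": gender, "interest": interest})

# A "gender-blind" lookalike-style rule that targets by interest alone.
audience = [u for u in users if u["interest"] == "construction"]
print(Counter(u["gender"] for u in audience))
# Roughly four men for every woman, even though gender was never used directly.
```

Dropping a protected attribute from the inputs, in other words, does not remove it from the outcome as long as other features encode it.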

The research also found that Facebook skews ad audiences based on the content of the ad itself. How many women were likely to see a posting for an open janitorial job, for example, depended not only on what the advertiser told Facebook, but also on how Facebook itself interpreted the ad’s image and text.

ProPublica also continued to find job postings that favored men or excluded older potential applicants, potentially violating civil rights law. Some advertisers we interviewed were surprised to learn that they couldn’t reach a diverse audience even when they tried.


In a press release, the DOJ said Tuesday’s settlement requires Meta to stop using the “Special Ad Audience” tool by the end of the year. It also requires Meta to change its algorithm “to address race, ethnicity and gender disparities between the audiences targeted by advertisers and the group of Facebook users to whom Facebook’s personalization algorithms actually serve ads.” The company must share details with the DOJ and an independent reviewer before implementing the changes.

As part of the settlement, Meta also agreed to pay a civil penalty of $115,054, the maximum allowed by law.

“Because of this groundbreaking lawsuit, Meta will – for the first time – change its ad serving system to address algorithmic discrimination,” U.S. Attorney Damian Williams of the Southern District of New York said in a statement. “But if Meta fails to demonstrate that it has modified its delivery system sufficiently to guard against algorithmic bias, this office will pursue litigation.”
