Meta Agrees to Change Ad Algorithm in DOJ Settlement Over FHA Violations


Meta agreed to change its advertising algorithm to be less discriminatory, according to a government press release.

The company told the Department of Justice (DOJ) it would change its algorithm as a part of a settlement agreement resolving allegations that Meta violated the Fair Housing Act (FHA). According to the DOJ, Meta’s housing advertising system illegally discriminated against Facebook users based on their “race, color, religion, sex, disability, familial status, and national origin.” 

The settlement resolved the DOJ’s first lawsuit challenging algorithmic discrimination under the Fair Housing Act. The suit alleged that Meta used algorithms that relied, in part, on characteristics protected under the FHA to determine which Facebook users received ads for housing.


Under the new settlement, Meta agreed to “develop a new system over the next six months to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads,” but the new algorithm must be approved by the DOJ before it is implemented. If the DOJ determines the algorithm to be insufficient in resolving its complaint, the settlement agreement will be terminated.

The suit hinged on three key aspects of Meta’s ad targeting and delivery system, alleging that the company “enabled and encouraged” advertisers to decide whether users were eligible to receive housing ads by relying on protected identity attributes. The suit said Facebook then developed a tool called “Lookalike Audience” that used machine learning to find Facebook users who shared similarities with those deemed eligible for housing ads, also using FHA-protected characteristics. Finally, the suit alleged that Meta used those characteristics to determine which subset of an advertiser’s targeted audience would receive the ads. 


Meta was also fined $115,000.

The agreement states that Facebook must stop using the “Lookalike Audience” tool (now called “Special Ad Audience”) by January 2023, and sets the same timeline for the company to develop a new algorithm to determine housing ad selection. 

“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke. “The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”


Meta is not the only tech company to come under fire for violating the Fair Housing Act. In April, real-estate giant Redfin agreed to pay a $4 million fine and implement a new internal monitoring system as a result of a lawsuit from the National Fair Housing Alliance. Zillow, another real-estate tech behemoth, has been accused of hosting listings that violate the FHA several times.


First Published: Jun 22, 2022, 3:32 pm CDT

Jacob Seitz

Jacob Seitz is a freelance journalist originally from Columbus, Ohio, interested in the intersection of culture and politics.
