Is Facebook Responsible for Its Users’ Discriminatory Ads?

Facebook is free for its 2.2 billion active users and is sustained by ad revenue. Its worldwide ad revenue exceeded $33 billion in 2018, with almost $15 billion generated in the U.S. Advertisers love the extensive ad targeting options available on Facebook, which allow advertisers to zero in on target demographics. But those same options are getting Facebook into legal trouble.


In 2016 and 2017, ProPublica published stories reporting that Facebook let advertisers exclude users by race, including in ads for housing and employment.

Facebook promotes its ad targeting platform with “success stories” for finding “the perfect homeowners,” “reaching home buyers,” “attracting renters” and “personalizing property ads.”

However, Facebook enabled advertisers to restrict which Facebook users receive housing-related ads based on race, color, religion, sex, familial status, national origin and disability.

The Fair Housing Act of 1968 makes it illegal “to make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.”

The Civil Rights Act of 1964 also prohibits the “printing or publication of notices or advertisements indicating prohibited preference, limitation, specification or discrimination” in employment recruitment.


In August 2018, the Secretary of Housing and Urban Development filed a housing discrimination complaint against Facebook. HUD's complaint states that Facebook mines extensive user data, classifies its users based on protected characteristics, and permits advertisers to express unlawful preferences when advertising housing.

For example, Facebook enabled advertisers to discriminate based on familial status by withholding ads from users whom Facebook categorizes as interested in "child care" or "parenting," or by showing ads only to users with children above a specified age.

Similarly, Facebook allowed advertisers to withhold ads from users whom Facebook categorizes as interested in "Latin America," "Southeast Asia," "China," "Honduras," "Somalia," the "Hispanic National Bar Association" or "Mundo Hispanico."

Facebook advertisers could even draw a red line around majority-minority zip codes and exclude users who live in those zip codes from seeing their ads.

In August 2018, Facebook announced that it was doing away with 5,000 targeting options related to ethnicity, religion and other attributes.

Just this week, the American Civil Liberties Union filed a complaint against Facebook with the Equal Employment Opportunity Commission (EEOC), alleging that Facebook gives employers a powerful tool to discriminate against women seeking work. The plaintiffs alleged that they were denied certain job opportunities because they never saw the ads. Many of the postings, such as those for mechanics and truck drivers, were in male-dominated fields.

In response, Facebook said it plans to require all advertisers to comply with Facebook’s anti-discrimination policies and the law.

Is Facebook a "publisher" in the traditional sense, like a newspaper, or a neutral "platform"? The distinction matters because Section 230 of the Communications Decency Act immunizes online platforms from liability for their users' defamatory, fraudulent, or otherwise unlawful content. However, Facebook's lawyers have claimed in a court case that it is a "publisher." Facebook used that argument to defend its decision not to publish certain content on its site, which it said falls within its free speech rights.

What do you think? Should advertisers be allowed to target Facebook users using demographic filters like ethnicity, family status, gender, and so on?
