Thursday, 27 January 2022

ACLU: Holding Facebook Accountable for Digital Redlining

In today’s digital world, people rely on online advertising platforms for critical information such as job opportunities or available housing. Unfortunately, thanks to practices known as “digital redlining” (the use of technology to perpetuate discrimination), the housing or job ads you are likely to see (or not see) in your newsfeed can depend largely on who you are, including your gender, race, or age.

Digital redlining has become the new frontier of discrimination, as social media platforms like Facebook and online advertisers use personal data to target ads based on race, gender, and other protected traits. Research has shown time and again that online ad-targeting practices often reflect and replicate existing disparities in society, effectively locking historically marginalized groups out of housing, job, and credit opportunities. For example, these platforms give housing providers the ability to target potential renters or homeowners by zip code — a clear proxy for race in our still-segregated country. This type of online discrimination is just as harmful as discrimination offline, yet some courts have held that platforms like Facebook and online advertisers can’t be held accountable for withholding ads for jobs, housing, and credit from certain users based on their race, age, or gender — even though the same practices would be considered unlawful offline.

This is why the ACLU is working to hold social media platforms and online advertisers accountable for online discrimination. We recently joined a number of partners — including the Lawyers’ Committee for Civil Rights Under Law, the National Fair Housing Alliance, the Washington Lawyers’ Committee for Civil Rights and Urban Affairs, and Free Press — in filing separate amicus briefs in Vargas v. Facebook and Opiotennione v. Bozzuto Management Company, two lawsuits filed by individuals who were excluded from viewing housing ads on Facebook based on protected characteristics. In our amicus briefs, we urged the courts to recognize that digital redlining violates civil rights protections, and that companies like Facebook should not be shielded from liability merely because their discriminatory targeting took place online.

The ads that people see online are not chosen at random. Facebook has designed and marketed targeting tools that allow advertisers to select the categories of people that they do — and do not — want to see their ads. Facebook’s own ad-delivery algorithm then chooses which users matching those criteria will actually see the ads, based on predictions drawn from a vast trove of user data: who they are, where they live, what they “like” or post, and which groups they join. These practices may seem harmless or even beneficial in some cases. But when used to conceal economic opportunities — like ads for jobs, housing, or credit — from historically marginalized groups, this type of precision ad-targeting amounts to discrimination by another name.
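To make that two-stage process concrete, here is a minimal sketch in Python of how such a system could work. This is a hypothetical illustration, not Facebook’s actual code; the user attributes, engagement scores, and function names are all assumptions. The structure simply mirrors the paragraph above: an advertiser-chosen targeting filter, followed by a platform-side delivery step that ranks eligible users by predicted engagement.

from dataclasses import dataclass

@dataclass
class User:
    user_id: int
    zip_code: str
    interests: set
    predicted_engagement: float  # stand-in for a score a learned model would produce

def target_audience(users, allowed_zips, required_interest):
    # Stage 1: the advertiser-facing filter. Even without an explicit race
    # or gender option, criteria like zip code can act as proxies for
    # protected traits in a segregated housing market.
    return [u for u in users
            if u.zip_code in allowed_zips and required_interest in u.interests]

def deliver_ads(eligible_users, n_slots):
    # Stage 2: platform-side delivery. Ranking by predicted engagement
    # (typically learned from historical behavior) decides who actually
    # sees the ad, even when the advertiser's filter was neutral.
    ranked = sorted(eligible_users,
                    key=lambda u: u.predicted_engagement, reverse=True)
    return ranked[:n_slots]

users = [
    User(1, "10001", {"apartments"}, 0.9),
    User(2, "10456", {"apartments"}, 0.4),
    User(3, "10001", {"cooking"}, 0.8),
]

eligible = target_audience(users, allowed_zips={"10001", "10456"},
                           required_interest="apartments")
shown = deliver_ads(eligible, n_slots=1)
print([u.user_id for u in shown])  # -> [1]: user 2 was eligible but never sees the ad

In this toy run, two users meet the advertiser’s criteria, but only the one with the higher engagement score is ever shown the housing ad. If those scores track historical patterns of who clicked on similar ads in the past, the delivery stage can reproduce old disparities even when no one asked it to.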

That’s why, in 2018, the ACLU challenged Facebook’s digital redlining practices by filing charges with the Equal Employment Opportunity Commission (EEOC). We brought these charges on behalf of three women workers, the Communications Workers of America, and a class of millions of women and non-binary users who were excluded from learning about job opportunities in stereotypically “male-dominated” sectors because of Facebook’s gender-based ad-targeting practices.

Because of these efforts, in 2019, Facebook agreed to make sweeping changes to its ad platform and to stop offering certain tools that explicitly invited discriminatory ad-targeting. While these changes were a step in the right direction, digital redlining on Facebook persists. Even though advertisers for housing, jobs, and credit are no longer offered the option to exclude people from seeing their ads based on discriminatory criteria, Facebook still offers advertisers other ad-targeting tools that researchers have shown to be just as effective at discriminating.

Facebook’s own ad-delivery algorithm also continues to be biased. For example, a recent audit of Facebook’s ad-delivery system found that Facebook continues to withhold certain job ads from women in a way that perpetuates historical patterns of discrimination: ads for car sales associates were shown to more men than women, while ads for jewelry sales associates were shown to more women than men.

Courts must recognize that civil rights laws protect against online discrimination and hold tech companies and online advertisers accountable. First, courts must acknowledge the harm that discriminatory ad-targeting practices inflict on social media users, and open the courthouse doors to people who were excluded from seeing housing, job, or credit opportunities based on characteristics like race, age, and gender. Doing so will ensure that existing civil rights laws protect people from the modern tools of discrimination at a time when those tools are becoming ever more commonplace. Additionally, courts must make clear that Facebook is not shielded from liability under Section 230 of the Communications Decency Act — a law designed to protect internet publishers from liability for third-party content, like posts on message boards or in a user’s Facebook feed. That law offers no protection for Facebook’s own conduct, such as its design and sale of discriminatory ad-targeting tools and its use of a discriminatory ad-delivery algorithm.

The elimination of Facebook’s discriminatory ad-targeting practices is critical to advancing racial and gender equity in housing, employment, and credit. That can only happen if courts recognize the harms of online discrimination, and apply civil rights laws to the discriminatory tools of the digital age.


Published January 27, 2022 at 09:29PM
via ACLU https://ift.tt/3AER8a3
