Civil Rights Law and Digital Marketing


The evolution of digital marketing platforms gives advertisers increasingly sophisticated methods for finding their audience and serving relevant ads to it. At their best, these platforms can improve the user experience for consumers by introducing them to products and services they are likely to be interested in, and they can deliver stronger ROI for advertisers than traditional marketing channels. The sophistication of these tools and algorithms does, however, introduce a number of challenges for advertisers, one of which is ensuring that they comply with civil rights law and serve ads in a non-discriminatory manner.

Rachel Goodman, Staff Attorney at the ACLU Racial Justice Program, was kind enough to answer a few of our questions on that subject. Those looking for a short introduction to civil rights law and digital marketing should first read her excellent short article, “Algorithms and Civil Rights: Understanding the Issues.” Note: Neither that article nor the interview below is intended to convey or constitute legal advice, and neither is a substitute for obtaining legal advice from an attorney.

PPC Hero: In cases where an agency places ads for a client and the use of an algorithm is found to result in civil rights violations, which parties risk liability? The advertiser? The client? The platform that created the algorithm?

Rachel Goodman: Although it will differ somewhat depending on facts and context, generally speaking, both the advertiser and the client can be liable, so they should be very careful in selecting and implementing these tools. In contrast, merely creating the algorithmic tool would not, in and of itself, result in the platform incurring liability.

I should note here that this area of law is likely to remain fluid as it develops, and any judicial or regulatory decision will rest on specific facts. Agencies and advertisers making specific decisions or setting policies should consult a qualified attorney rather than assume that what I say in this interview covers all of the relevant legal considerations.

PPC Hero: In terms of potential civil rights violations and liabilities, does an advertiser’s intent matter? In other words, does it make a difference whether or not the advertiser knows that the algorithm it is employing is discriminatory?

RG: The federal civil rights laws prohibiting discrimination in housing, credit, and employment all prohibit intentional discrimination. But they also prohibit the use of neutral criteria that have a discriminatory effect, regardless of the intention—unless an entity can prove those criteria are justified by some business necessity. This is known as “disparate impact” discrimination.  As a result, if the outcome of an advertiser’s use of algorithmic targeting is that fewer women are hired, for example, and that kind of targeting isn’t necessary to meet the business goal, the advertiser can be liable regardless of its intent.

PPC Hero: Digital marketing platforms allow advertisers to direct ads to be shown more frequently or less frequently to different demographic groups. Is there settled law on cases where a protected class is not necessarily excluded, but also doesn’t see particular ads as frequently?

RG: None of this is settled law, and very little of it has been litigated at all. But the focus of the disparate impact standard is on the outcome: who gets the housing, the job, or the loan as a result of the advertising practices. The U.S. Equal Employment Opportunity Commission (EEOC) follows the general rule that a prohibited disparate impact has occurred if a person from a protected group (e.g., a person of color) is less than 80%, or four-fifths, as likely to be hired as a person from a different group. But the EEOC can also find an adverse impact wherever a disparity is statistically and practically significant, or even when a policy disproportionately discourages applicants on the grounds of race, sex, or ethnicity.

So, while slight variations in who sees ads are unlikely to trigger liability, substantial variation could. Finally, if an advertiser intends to limit the number of Black people who see an ad in hopes that fewer Black people apply, the advertiser would incur liability regardless of whether that strategy was successful.
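To make the four-fifths comparison concrete, here is a minimal sketch in Python of the ratio check it describes. The group labels, applicant counts, and hire counts below are hypothetical assumptions for illustration only; a real disparate-impact analysis turns on specific facts and requires legal and statistical review, not a simple ratio.

```python
# Illustrative sketch of the EEOC "four-fifths" (80%) rule of thumb.
# All numbers are made up for demonstration; this is not a legal test.

def selection_rate(hired: int, applicants: int) -> float:
    """Share of applicants from a group who were hired."""
    return hired / applicants

# Hypothetical outcomes after an ad campaign drove applications.
rate_group_a = selection_rate(hired=50, applicants=100)  # 0.50
rate_group_b = selection_rate(hired=30, applicants=100)  # 0.30

# Compare the lower selection rate to the higher one.
impact_ratio = min(rate_group_a, rate_group_b) / max(rate_group_a, rate_group_b)

# Under the four-fifths rule of thumb, a ratio below 0.8 suggests
# a prohibited disparate impact may have occurred.
print(f"Impact ratio: {impact_ratio:.2f}")
print("Possible disparate impact" if impact_ratio < 0.8
      else "Within the four-fifths guideline")
```

In this hypothetical, the ratio is 0.60, well below the 0.8 guideline, which is the kind of substantial variation in outcomes that could attract scrutiny.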

PPC Hero: Digital marketing platforms also allow advertisers to serve different ads to different demographic groups. Is there settled law on cases where different protected classes see ads at the same rate as the population as a whole, but see different ads?

RG: Generally, showing different ads to different demographic groups for the same housing, credit, or employment opportunities will not trigger liability. The question is whether that “difference” is one of quality – that is, whether members of those groups suffer an adverse impact because they are seeing different ads. But if they are being equally encouraged to apply, there should be no adverse impact.

PPC Hero: Do you predict that these issues can be settled conclusively by the judicial system? Or does the rate at which digital advertising technology evolves mean that the courts are likely to always be reacting to new potential civil rights infringements?

RG: There is much work remaining to be done in this space by courts and the regulatory agencies—the EEOC, the Department of Housing and Urban Development (HUD), and the Consumer Financial Protection Bureau (CFPB). These bodies will clarify the applicable standards, but it will always be up to the advertising industry to make sure that new practices comply with those standards as technology evolves.

PPC Hero: Your Civil Rights Insider article focuses particularly on advertisements for housing and employment. Are there other industries where advertisers risk civil rights infringements when utilizing demographic targeting?

RG: Housing, employment, and credit are the sectors explicitly covered under federal law for which disparate impact discrimination is prohibited. But intentional discrimination is prohibited much more broadly by the Constitution and by state and federal laws. Government actors – like police departments or schools – cannot intentionally discriminate in any context, and neither can providers of public accommodations like restaurants, hotels, and transportation. As a result, advertising that intentionally excluded members of protected groups in these areas could incur liability.

PPC Hero: Finally, what would you recommend to digital advertisers who wish to educate themselves further and avoid committing civil rights violations in their advertising efforts? Are there resources available that advertisers should be aware of?

RG: That’s why I’m glad to be doing this interview! I can point those interested in learning more to my short article laying out more of the legal background here. There’s also one great, detailed paper written by computer scientists and lawyers together laying out the way the digital advertising ecosystem interacts with the law.


Interested in learning more about the intersection of civil rights law and digital marketing? Learn more about Rachel Goodman’s work and get in touch with her here. You can also share feedback on Twitter – get in touch with PPC Hero @PPCHero and with Rachel Goodman @rachelegoodman1.


