DC AG Introduces “Landmark Legislation” to Issue Protections Against Discriminatory Algorithms

On December 9, 2021, Karl A. Racine, Attorney General of the District of Columbia (DC), announced the introduction of the Stop Discrimination by Algorithms Act of 2021 (the Act), landmark legislation intended to “prohibit companies and institutions from using algorithms that produce biased or discriminatory results.” The legislation would apply only to “covered entities”[1] operating in DC.

Specifically, the Act would “hold businesses accountable for preventing biases in their automated decision-making algorithms and require them to report and correct any bias that is detected.” The bill reportedly would “strengthen civil rights protections and protect marginalized communities from the harm caused by algorithmic bias by:

  • Prohibiting companies and organizations from using algorithms that produce biased and unfair results. The proposed legislation would make it illegal for companies and organizations to use discriminatory algorithms to make decisions about key areas of life opportunity, including education, employment, housing, and public accommodations and services like credit, health care, and insurance.

  • Requiring companies to audit their algorithms for discriminatory patterns. Companies would be required to perform an audit each year to ensure that algorithmic processing practices do not discriminate directly and to determine whether the results show a disparate impact on protected groups. Companies would also be required to document how their algorithms are built, how the algorithms make determinations, and all of the determinations made. Companies would be required to report audit results and any needed corrective steps to OAG.

  • Increasing transparency for consumers. Companies would be required to make easy-to-understand disclosures to all consumers about the companies' use of algorithms to reach decisions, what personal information they collect, and how their algorithms use information to reach decisions. If businesses or corporations make an unfavorable decision about an opportunity—like denying a housing application or charging a higher interest rate for a loan—based on an algorithmic determination, they must provide a more in-depth explanation. They must also give consumers an opportunity to submit corrections to prevent negative decisions based on inaccurate personal information.”

If passed, the legislation would subject a company that fails to comply with its requirements to a $10,000 fine.

Although this bill would apply only to covered entities operating in DC, organizations across the United States should monitor it closely: recent developments in technology and the push to adopt artificial intelligence in many sectors will continue to create a need for legislation governing such use. Other states could quickly follow suit with this type of legislation or some variation of it.

If you have any questions or concerns about this recent legislative action or how it could impact your organization, please contact Kennedy Sutherland.


[1] Under section (3) of the Act, a “covered entity” means

any individual, firm, corporation, partnership, cooperative, association, or any other organization, legal entity, or group of individuals however organized, including entities related by common ownership or corporate control, that either make algorithmic eligibility determinations or algorithmic information availability determinations, or relies on algorithmic eligibility determinations or algorithmic information availability determinations supplied by a service provider, and that meets one of the following criteria:

(A) Possesses or controls personal information on more than 25,000 District residents;

(B) Has greater than $15 million in average annualized gross receipts for the 3 years preceding the most recent fiscal year;

(C) Is a data broker, or other entity, that derives 50 percent or more of its annual revenue by collecting, assembling, selling, distributing, providing access to, or maintaining personal information, and some proportion of the personal information concerns a District resident who is not a customer or an employee of that entity; or

(D) Is a service provider.
