January 2023

Overcoming Small Sample Sizes When Identifying Bias

New York City has passed a new law, Local Law 144, requiring employers and employment agencies to commission independent, impartial bias audits of automated employment decision tools (AEDTs) used to evaluate candidates for employment or employees for promotion. The bias audits will be based on impact ratios, using the Equal Employment Opportunity Commission's four-fifths rule to determine whether a hiring procedure results in adverse or disparate impact. However, the rule can produce false positives when sample sizes are small, and the NYC legislation does not provide guidance on this issue. The enforcement date of Local Law 144 has been delayed to July 5, 2023, giving employers, employment agencies, and vendors more time to collect additional data and make the analysis more robust.
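To make the calculation concrete, the sketch below shows how impact ratios under the four-fifths rule might be computed from selection data. It is an illustration only: the group names and counts are hypothetical, and the small samples used here are exactly the setting in which the rule can flag adverse impact by chance.

```python
def impact_ratios(selected, total):
    """Selection rates and impact ratios per group.

    selected: dict mapping group -> number of candidates selected
    total:    dict mapping group -> number of candidates assessed
    """
    rates = {g: selected[g] / total[g] for g in total}
    best = max(rates.values())  # group with the highest selection rate is the reference
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical counts: with samples this small, a single extra hire in
# group_b would lift its ratio above the 0.8 threshold.
selected = {"group_a": 8, "group_b": 5}
total = {"group_a": 10, "group_b": 9}

for group, ratio in impact_ratios(selected, total).items():
    flag = "potential adverse impact" if ratio < 0.8 else "passes four-fifths rule"
    print(f"{group}: impact ratio = {ratio:.2f} ({flag})")
```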

EEOC Announces a Draft Strategic Enforcement Plan for 2023-2027

The US Equal Employment Opportunity Commission (EEOC) has published a draft Strategic Enforcement Plan (SEP) for the 2023-2027 fiscal years, which prioritizes the regulation of AI and automated employment tools to prevent discrimination against protected groups. The EEOC aims to ensure that these tools do not disproportionately impact protected subgroups and has launched initiatives to examine the impact of AI on employment decisions. The EEOC recently sued iTutorGroup for age discrimination over its use of software to automatically reject older applicants, highlighting the importance of regulation in preventing AI-related discrimination in employment.

Digital Markets Act: The EU Commission is Cracking Down

The Digital Markets Act (DMA) came into effect on 1 November 2022 and regulates fair competition and consumer choice in the digital economy. The legislation defines gatekeepers as providers of core platform services and imposes specific obligations, including an independent audit, on companies designated as gatekeepers. The DMA will be enforced from February/March 2024, and failure to comply could result in fines of up to 10% of a company's total worldwide annual turnover. The DMA, in combination with the Digital Services Act (DSA) and the EU AI Act, is set to make digital technologies safer for users, leaving companies little room to find loopholes that let them avoid their due diligence obligations. The DMA is anticipated to set a global precedent, and regulation like this will soon mean that AI around the world is deployed more transparently and with greater accountability.

December 2022

New York City’s DCWP Updates Its Proposed Rules for Local Law 144

New York City's Local Law 144 mandates independent, impartial bias audits for automated employment decision tools (AEDTs) used for employment or promotion decisions. The enforcement date has been pushed back to 2023 due to concerns about who qualifies as an independent auditor and the suitability of the impact ratio metrics. The updated rules clarify that bias audits must be conducted by a third party and specify impact ratio metrics calculated from either selection rates or average scores. The audit can be based on test data when historical data is not available. Additionally, employers must make their AEDT data retention policies available on their websites. Holistic AI offers auditing services for businesses seeking compliance.
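As an illustration of the average-score variant, the sketch below computes an impact ratio by comparing each group's mean score against the highest-scoring group. This is one reading of the metric rather than the regulator's exact formula, and the group names and scores are hypothetical test data of the kind the rules permit when historical data is unavailable.

```python
import statistics

def score_impact_ratios(scores_by_group):
    """Impact ratios for a scoring AEDT: each group's average score
    divided by the average score of the highest-scoring group."""
    averages = {g: statistics.mean(s) for g, s in scores_by_group.items()}
    best = max(averages.values())
    return {g: avg / best for g, avg in averages.items()}

# Hypothetical test scores for two groups
scores = {
    "group_a": [72, 85, 90, 66, 78],
    "group_b": [60, 74, 69, 71, 63],
}
print(score_impact_ratios(scores))  # e.g. {'group_a': 1.0, 'group_b': 0.86...}
```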

New York Insurance Circular Letter: Using Consumer Data and Information Sources

The New York Department of Financial Services (NYDFS) published a circular letter in January 2019 addressed to insurers authorized to write life insurance in the state. The letter warns insurers not to use external data sources, algorithms, or predictive models in underwriting or rating unless it has been determined that the system does not collect or use prohibited criteria. The burden and liability lie with the insurer, and the NYDFS reserves the right to audit and examine an insurer's underwriting criteria, programs, algorithms, and models, taking disciplinary action if necessary. The letter also highlights the obligation to comply with existing anti-discrimination and civil rights laws and regulations. Insurers should provide transparency to consumers regarding the reason or reasons for any adverse underwriting decisions made using external data sources or predictive models. Failure to comply may trigger an NYDFS examination and may constitute a breach of existing anti-discrimination laws.