January 2023

New York City passed Local Law 144 in November 2021 to mandate bias audits of automated employment decision tools (AEDTs) used in candidate screening and promotion. The Department of Consumer and Worker Protection (DCWP) proposed metrics to calculate impact ratios for regression systems, but these metrics have limitations: they can be distorted by unexpected score distributions and manipulated through data tweaking. The article suggests alternative metrics that consider fairness across the whole score distribution, statistical tests that compare distributions, or metrics that compare candidate rankings rather than raw scores; a sketch of the latter two appears below. Holistic AI offers an open-source library with metrics for both binary and regression systems, along with bias mitigation strategies.
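
For illustration only, the sketch below applies two such distribution-aware checks to hypothetical score data: a two-sample Kolmogorov-Smirnov test, which compares the full score distributions of two groups, and a Mann-Whitney U test, which compares candidate rankings rather than raw scores. The group labels, simulated scores, and choice of tests are assumptions for demonstration; they are not the metrics proposed by the DCWP.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores_group_a = rng.normal(loc=0.62, scale=0.10, size=500)  # hypothetical AEDT scores
scores_group_b = rng.normal(loc=0.58, scale=0.15, size=500)  # hypothetical AEDT scores

# Distribution comparison: does the score distribution differ anywhere,
# not just at a single selection cut-off?
ks_stat, ks_p = stats.ks_2samp(scores_group_a, scores_group_b)

# Rank-based comparison: are candidates from one group systematically
# ranked above candidates from the other?
u_stat, u_p = stats.mannwhitneyu(scores_group_a, scores_group_b, alternative="two-sided")

print(f"Kolmogorov-Smirnov: statistic={ks_stat:.3f}, p-value={ks_p:.3g}")
print(f"Mann-Whitney U:     statistic={u_stat:.0f}, p-value={u_p:.3g}")
```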

New York City has passed a new law, Local Law 144, requiring employers and employment agencies to commission independent, impartial bias audits of automated employment decision tools (AEDTs) used to evaluate candidates for employment or employees for promotion. The bias audits will be based on impact ratios using the Equal Employment Opportunity Commission's four-fifths rule to determine whether a hiring procedure results in adverse or disparate impact. However, the rule can produce false positives when sample sizes are small, and the NYC legislation does not provide guidance on this issue. The enforcement date of Local Law 144 has been delayed to July 5, 2023, giving employers, employment agencies, and vendors more time to collect additional data and make the analysis more robust.
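
As a minimal sketch of the four-fifths rule arithmetic, the example below computes selection rates per group, divides each by the rate of the most-selected group, and flags ratios below 0.8. The group names and counts are hypothetical; with small applicant counts, such a ratio can dip below 0.8 purely by chance, which is the false-positive concern noted above.

```python
# Hypothetical selection counts, chosen only to illustrate the calculation.
selections = {
    "group_a": {"selected": 48, "applicants": 100},
    "group_b": {"selected": 30, "applicants": 100},
}

# Selection rate = number selected / number of applicants in each group.
selection_rates = {
    group: counts["selected"] / counts["applicants"]
    for group, counts in selections.items()
}

# Reference rate is the selection rate of the most-selected group.
reference_rate = max(selection_rates.values())

for group, rate in selection_rates.items():
    impact_ratio = rate / reference_rate
    verdict = "potential adverse impact" if impact_ratio < 0.8 else "passes four-fifths rule"
    print(f"{group}: selection rate={rate:.2f}, impact ratio={impact_ratio:.2f} ({verdict})")
```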

The US Equal Employment Opportunity Commission (EEOC) has published a Strategic Enforcement Plan (SEP) for the 2023-2027 fiscal years, which prioritizes scrutiny of AI and automated employment tools to prevent discrimination against protected groups. The EEOC aims to ensure that these tools do not disproportionately impact protected subgroups and has launched initiatives to examine the impact of AI on employment decisions. The EEOC recently sued iTutorGroup for age discrimination over its use of software that automatically rejected older applicants, highlighting the importance of regulation in preventing AI-related discrimination in employment.

The Digital Markets Act (DMA) came into effect on 1 November 2022 and focuses on ensuring fair competition and consumer choice in the digital economy. The legislation defines gatekeepers as providers of core platform services and imposes specific obligations, including an independent audit, on companies designated as gatekeepers. The DMA will be enforced from February/March 2024, and failure to comply could result in fines of up to 10% of a company's total worldwide annual turnover. In combination with the Digital Services Act (DSA) and the EU AI Act, the DMA is set to make digital technologies safer for users, leaving companies little room to sidestep their due diligence obligations. The DMA is anticipated to set a global precedent, and regulation like this will soon mean that AI around the world is deployed more transparently and with greater accountability.

December 2022

New York City's Local Law 144 mandates independent, impartial bias audits of automated employment decision tools (AEDTs) used for employment or promotion decisions. The enforcement date has been pushed back to 2023 due to concerns about who qualifies as an independent auditor and the suitability of the impact ratio metrics. The updated rules clarify that bias audits must be conducted by a third party and specify impact ratio metrics calculated from selection rates or average scores. The audit can be based on test data when historical data is not available. Additionally, employers must make their AEDT data retention policies available on their website. Holistic AI offers auditing services for businesses seeking compliance.
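
For the score-based variant, a rough sketch is shown below: the impact ratio is taken as the ratio of each group's average score to that of the highest-scoring group. The group labels and scores are hypothetical, and the exact calculation required by the DCWP should be taken from the published rules rather than this example.

```python
import numpy as np

# Hypothetical AEDT scores per group, for illustration only.
scores = {
    "group_a": np.array([0.81, 0.74, 0.69, 0.77, 0.88]),
    "group_b": np.array([0.62, 0.58, 0.71, 0.55, 0.60]),
}

# Average score per group.
average_scores = {group: group_scores.mean() for group, group_scores in scores.items()}

# Reference is the average score of the highest-scoring group.
reference_score = max(average_scores.values())

for group, avg in average_scores.items():
    impact_ratio = avg / reference_score
    print(f"{group}: average score={avg:.2f}, impact ratio={impact_ratio:.2f}")
```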