May 2023

European Parliamentary Committees Adopt AI Act Text and Set Date for Plenary Adoption

The EU AI Act is a piece of legislation proposed by the European Commission to regulate AI systems placed on the EU market. The Act takes a risk-based approach, classifying systems as posing minimal, limited, high, or unacceptable risk. The December 2022 text defined AI as a machine-based system designed to operate with varying levels of autonomy that can generate outputs such as predictions, recommendations, or decisions influencing physical or virtual environments. The adopted text specifies eight broad high-risk applications of AI, with important updates to biometric and biometrics-based systems. It also introduces a new Fundamental Rights Impact Assessment obligation for users of high-risk AI systems, and expands the list of prohibited practices to include biometric categorization, predictive policing, and the collection of facial images to build databases. The EU AI Act will have important implications for the fairness and safety of AI systems available on the EU market, imposing a number of obligations on deployers and users of AI systems.

A Comparison of NYC and New Jersey’s Bias Audit Mandates

New York City and New Jersey are introducing legislation requiring bias audits of automated employment decision tools. New York City Local Law 144 will require audits of these tools from July 2023, while New Jersey Assembly Bill 4909, recently introduced, would impose similar requirements if passed. The legislation comes amid growing concern about the risks posed by technologies including artificial intelligence. Companies will need to take steps to manage these risks, including procuring interventions such as bias audits, to remain compliant.

Draft for Conducting Independent Audits under the Digital Services Act Released for Public Comment

The European Commission released a draft delegated regulation for conducting audits under the Digital Services Act (DSA) on May 6, 2023. It applies to the 17 designated Very Large Online Platforms (VLOPs, including Facebook, LinkedIn, and Twitter) and 2 Very Large Online Search Engines (VLOSEs, Bing and Google). The purpose of the delegated regulation is to promote transparency and public accountability for large platforms through annual independent audits, which cover algorithmic systems as well as the platforms' disclosures and risk assessments. The draft clarifies the relationship between Audited Providers and Auditing Organizations, and lays down provisions for selecting auditors, data sharing, and cooperation. Auditing Organizations must deliver Final Reports, including Risk Analyses and Audit Conclusions, within a year of the date the obligations begin to apply to the Audited Provider. The draft is open for public comment until June 2, 2023.

Who Needs to Comply with the NYC Bias Audit Mandate?

New York City's Local Law 144 requires bias audits of automated employment decision tools: employers and employment agencies that use these tools within the city must have a bias audit performed by an independent auditor. Vendors of these tools are not directly covered by the legislation, but their tools may still be subject to audits if their clients meet the above criteria. Vendors are therefore encouraged to get an audit proactively, both to mitigate potential issues and to assure clients and prospects that their software is compliant. A free consultation or quiz is available to determine whether an audit is necessary. This blog article is for informational purposes only and is not intended to provide legal advice.

NYC Bias Audits Protected Characteristics

The New York City Council has mandated bias audits of automated employment decision tools (AEDTs) used to evaluate candidates for employment or employees for promotion in New York City. The NYC Bias Audit Law requires employers to make a summary of the bias audit results publicly available on their website, increasing transparency in the hiring process. The law requires testing for disparate impact across the Component 1 categories that employers are required to report, namely sex and race/ethnicity categories. The delayed enforcement deadline provides an opportunity to collect the necessary data or to use test data for the bias audit, and employers are advised to prepare early to ensure compliance.
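The disparate impact testing described above centers on impact ratios: a category's selection rate divided by the selection rate of the most selected category. The sketch below illustrates that calculation under Local Law 144's rules for binary selection outcomes; the category names and counts are hypothetical, and a real audit would also cover intersectional and score-based analyses.

```python
# Minimal sketch of an impact-ratio calculation for a bias audit,
# assuming hypothetical applicant counts by sex category.

def impact_ratios(selected, total):
    """Return per-category impact ratios.

    selected, total: dicts mapping category -> counts.
    Impact ratio = category selection rate / highest selection rate.
    """
    rates = {g: selected[g] / total[g] for g in total}
    highest = max(rates.values())
    return {g: rates[g] / highest for g in rates}

# Hypothetical data: 40/100 male applicants selected, 30/100 female
selected = {"male": 40, "female": 30}
total = {"male": 100, "female": 100}

ratios = impact_ratios(selected, total)
# male selection rate 0.40 is highest, so male ratio is 1.0
# female ratio is 0.30 / 0.40 = 0.75
```

An impact ratio well below 1.0 for a category (0.80 is a commonly cited rule-of-thumb threshold from the four-fifths rule) would flag potential disparate impact warranting closer review.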