August 2022

What Is Bias and How Can It Be Mitigated?

Bias refers to unjustified differences in outcomes for different subgroups, and it can arise in both human decision-making and algorithmic systems. Sources of bias in algorithms include human biases, unbalanced training data, differential feature use, and proxy variables. Mitigation strategies include obtaining additional data, adjusting hyperparameters, and removing or reweighing features. Bias audits, which will soon be required for automated employment decision tools in New York City, can also contribute to the risk management of algorithmic systems. When dealing with bias in decision-making, it is important to seek professional legal advice.
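
As a rough illustration of the ideas above, the Python sketch below shows one common way to quantify an outcome disparity between subgroups and one common pre-processing mitigation: instance (sample) reweighing in the style of Kamiran and Calders, a technique related to, but distinct from, the feature reweighing mentioned above. The column names, toy data, and helper functions are hypothetical and are not taken from the article; treat this as a sketch, not a definitive implementation.

```python
import pandas as pd


def selection_rates(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Positive-outcome rate per subgroup (e.g., approval rate by group)."""
    return df.groupby(group_col)[outcome_col].mean()


def disparate_impact_ratio(rates: pd.Series) -> float:
    """Ratio of the lowest to the highest subgroup rate; values far below 1.0
    flag an outcome disparity worth investigating."""
    return float(rates.min() / rates.max())


def reweighing_weights(df: pd.DataFrame, group_col: str, label_col: str) -> pd.Series:
    """Per-instance weights in the style of Kamiran & Calders reweighing:
    each (group, label) cell gets weight P(group) * P(label) / P(group, label),
    so group membership and the label look independent in the weighted data."""
    p_group = df[group_col].value_counts(normalize=True)
    p_label = df[label_col].value_counts(normalize=True)
    p_joint = df.groupby([group_col, label_col]).size() / len(df)

    def weight(row: pd.Series) -> float:
        g, y = row[group_col], row[label_col]
        return p_group[g] * p_label[y] / p_joint[(g, y)]

    return df.apply(weight, axis=1)


if __name__ == "__main__":
    # Toy, hypothetical data for illustration only.
    data = pd.DataFrame({
        "group":   ["A", "A", "A", "A", "B", "B", "B", "B"],
        "outcome": [1,   1,   1,   0,   1,   0,   0,   0],
    })
    rates = selection_rates(data, "group", "outcome")
    print(rates)                          # A: 0.75, B: 0.25
    print(disparate_impact_ratio(rates))  # ~0.33
    weights = reweighing_weights(data, "group", "outcome")
    print(weights)  # can be passed as sample_weight to estimators that accept it
```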