Automating the Risk of Bias

Kristin N. Johnson
87 Geo. Wash. L. Rev. 1214

Artificial intelligence (“AI”) is a transformative technology that has radically altered decision-making processes. Evaluating the case for algorithmic or automated decision-making (“ADM”) platforms requires navigating tensions between two normative concerns. On the one hand, ADM platforms may lead to more efficient, accurate, and objective decisions. On the other hand, early and disturbing evidence suggests ADM platform results may demonstrate biases, undermining claims that this special class of algorithms will democratize markets and increase inclusion.

State law assigns decision-making authority to the boards of directors of corporations. State courts and lawmakers accord significant deference to the board in the execution of its duties. Among those duties, a board must employ effective oversight policies and procedures to manage known risks. Consequently, the boards of directors and senior management of firms integrating ADM platforms must monitor operations to mitigate the enterprise risks that arise from integrating algorithms, including litigation, reputational, compliance, and regulatory risks. After the recent financial crisis, firms adopted structural and procedural governance reforms to mitigate various enterprise risks; these approaches may prove valuable in mitigating the risk of algorithmic bias. Evidence demonstrates that heterogeneous teams may identify and mitigate risks more successfully than homogeneous teams. Heterogeneous teams are more likely to overcome cognitive biases such as confirmation, commitment, overconfidence, and relational biases. This Article argues that increasing gender inclusion in the development of AI technologies will introduce important and diverse perspectives, reduce the influence of cognitive biases in the design, training, and oversight of learning algorithms, and thereby mitigate bias-related risk management concerns.