Algorithms Acting Badly: A Solution from Corporate Law

Mihailis E. Diamantis
89 Geo. Wash. L. Rev. 801

Sometimes algorithms work against us. They offer many social benefits, but when they discriminate in lending, manipulate stock markets, or violate expectations of privacy, they can injure us on a massive scale. Only one-third of technologists predict that artificial intelligence will be a net positive for society.

Law can help ensure that algorithms work for us by imposing liability when they work against us. The problem is that algorithms fit poorly into existing conceptions of liability. Liability requires injurious acts, but what does it mean for an algorithm to act? Only people act, and algorithms are not people. Some scholars have argued that the law should recognize sophisticated algorithms as people. However, the philosophical puzzles (are algorithms really people?), practical obstacles (how do you punish an algorithm?), and unexpected consequences (could algorithmic “people” sue us back?) have proven insurmountable.

This Article proposes a more grounded approach to algorithmic liability. Corporations currently design and run the algorithms that have the most significant social impacts. Longstanding principles of corporate liability already recognize that corporations are “people” capable of acting injuriously. Corporate law stipulates that corporations act through their employees because corporations have control over and benefit from employee conduct. When employees misbehave, corporations are in the best position to discipline and correct them. This Article argues that the same control and benefit rationales extend to corporate algorithms. If the law were to recognize that algorithmic conduct could qualify as corporate action, the whole framework of corporate liability would kick in. By exercising the authority it already has over corporations, the law could help ensure that corporate algorithms work largely in our favor.