Symposium 2023 | Fireside Discussion: Putting Legally Disruptive Technologies into Practice

Summary

Fireside Discussion: Putting Legally Disruptive Technologies into Practice
at

The George Washington Law Review’s Vol. 92 Symposium:
Legally Disruptive Emerging Technologies

Summary authored by Noah Fisher.

The final installment of the 2023 George Washington Law Review Symposium was a Fireside Chat moderated by Aram Gavoor, Associate Dean for Academic Affairs of The George Washington University Law School. Joining Dean Gavoor were three esteemed public servants: Keith Sonderling, Commissioner, United States Equal Employment Opportunity Commission (“EEOC”); Hester Peirce, Commissioner, United States Securities and Exchange Commission (“SEC”); and Jonathan J. Smith, Deputy Assistant Attorney General, Civil Rights Division, United States Department of Justice (“DOJ”). Please note that all remarks made by the panelists are personally held views and should not be construed as representative of the opinion of their respective agencies.

Dean Gavoor opened by asking the panelists to reflect upon changes made to the rulemaking and enforcement process in response to legally disruptive technologies.

Commissioner Peirce responded by noting two general trends. First, the SEC is highly skeptical of technological advancements. This skepticism is heightened when securities originating from outside the established financial industry are brought before the Commission. A product of this skepticism is that legally disruptive technologies, like cryptocurrency, are subject to a lengthy approval process. The result: a significant number of enforcement actions against individuals who acted without approval from the Commission. The second trend Commissioner Peirce noted is the power of artificial intelligence (“A.I.”) in financial markets. Specifically, she believes that A.I. could have significant applications in helping Americans obtain cheap, yet effective, investment assistance. In light of this, the SEC initiated a rulemaking around the use of A.I. and other technologies in the work of investment banking.

Deputy Assistant Attorney General Smith continued the dialogue, remarking that a significant concern at the Civil Rights Division is that discrimination can be baked into the DNA of A.I. algorithms. This, he explained, prompts two important questions. How can existing civil rights laws be applied to A.I., and what remedies exist for those injured by discriminatory A.I. algorithms?

Commissioner Sonderling answered by emphasizing that A.I. is already widespread in the employment arena. For instance, many employers utilize A.I. to create job postings and job descriptions, parse through applicants’ resumes, evaluate employees’ job performances, and terminate employees’ positions. He continued by highlighting that existing employment laws are equally applicable to A.I.-made employment decisions as they are to human-made employment decisions. Considering this, Commissioner Sonderling argued that agencies need not reenvision their approaches to rulemaking and enforcement. Instead, he posited that agencies should stick to what they know and apply existing sources of law to A.I.

Dean Gavoor next asked the panelists how their agencies balance the regulation of legally disruptive technologies with enforcement against them.

Commissioner Peirce opened the dialogue, observing that while the laws governing the SEC are quite old, Congress safeguarded against the inevitability of technological change by providing the Commission with exemptive authority. This allows the Commission to permit conditioned deviation from SEC regulations. Commissioner Peirce explained that the Commission’s exemptive power provides a critical mechanism through which the SEC can test new ideas and determine when the promulgation of a new rule is needed.

Deputy Assistant Attorney General Smith was next to answer, noting that the primary role of the DOJ is enforcement. A major challenge in pursuing enforcement actions against legally disruptive technologies, the Deputy Assistant Attorney General explained, is that the DOJ is composed primarily of attorneys who lack the technical knowledge to fully understand the technologies they are prosecuting. Further, he explained that there remains great uncertainty regarding how existing law applies to new technologies. For instance, the Deputy Assistant Attorney General asked the audience to consider whether discriminatory A.I. algorithms should constitute disparate treatment or disparate impact. Finally, he noted that enforcement by itself cannot be expected to have a widespread impact in restraining discriminatory legally disruptive technologies.

Commissioner Sonderling began his remarks by stating that enforcement should always be a last resort. As such, he recommended not only that agencies engage in rulemaking, but also that they provide significant guidance on how existing law applies to new technology. Nevertheless, a significant limitation the EEOC faces is that the agency may only commence an enforcement action upon receiving a discrimination complaint from an employee. This poses a significant problem: many employees are unaware that they are being subjected to a legally disruptive technology. For instance, an interviewee is unlikely to know that an employer used facial expression tracking to determine whether she would be an effective salesperson. Unaware that she has been discriminated against, the employee will inevitably fail to lodge a complaint against that employer. Moreover, even if an employee knew that she was subject to A.I.-based discrimination, the EEOC complaint form lacks a section for A.I.-based discrimination.

Dean Gavoor’s penultimate question asked the panelists about the extent of their collaboration with other agencies in regulating legally disruptive technologies.

Commissioner Sonderling responded by explaining that the EEOC is a traditional independent agency headed by a bipartisan, multi-member Commission. He indicated that the level of collaboration is largely determined by the Chair of the Commission. Commissioner Sonderling further noted that there has been an increase in the number of memoranda of understanding (“MOUs”), formal agreements of collaboration between agencies, but he has not yet seen any on the issue of A.I.

Deputy Assistant Attorney General Smith highlighted that although the DOJ is a pure executive agency, there is a firewall between the DOJ and the White House. Further, he explained that the DOJ engages in frequent collaboration efforts with other agencies when initiating enforcement actions. In the context of A.I., Deputy Assistant Attorney General Smith noted, there has been less collaboration and enforcement. He hinted, however, that information sharing via MOUs could provide an effective pipeline of complaints and concerns that will allow the DOJ to better understand when enforcement may be sensible.

Commissioner Peirce remarked that the SEC takes quite seriously its position as an independent agency. With that said, she explained that the SEC works quite closely with the DOJ as both are frequently investigating the same misconduct.

Dean Gavoor’s final question asked the panelists how, if at all, their agencies have implemented emerging technologies in enforcement efforts.

Deputy Assistant Attorney General Smith opened by noting that the government is quite slow to adapt to new technology. This is especially true given that the DOJ is primarily driven by attorneys who lack technological expertise. In recent years, Deputy Assistant Attorney General Smith stated, the DOJ has done a better job hiring more technologists so that it can better adapt to the new era.

Commissioner Peirce shared that the SEC is always looking for ways to implement new technology, but with an important caveat. As data becomes more prevalent, she explained, it becomes very tempting to ask Americans for their data. As such, she expressed that the government should proceed with caution when adopting new technology.

Commissioner Sonderling expressed that the EEOC is adapting to the technical revolution by retaining more specialists, such as economists and psychologists, who bring different perspectives on how to interpret data that the agency has collected. Further, Commissioner Sonderling was quite hopeful that technology could help the EEOC be a more effective advocate for employees. For example, he noted the possibility of creating a map that would allow EEOC attorneys to zoom into a particular location and learn the composition of its workforce. Likewise, he discussed the possibility of using technology to track data on employment discrimination broadly and across industries.

The final segment of the Fireside Chat was a question-and-answer session. The first question, offered by a student, asked whether there is a risk that heavy regulation of A.I. may restrict societal progress. Deputy Assistant Attorney General Smith emphasized that it is not the job of his agency to make value judgments on whether technology is good or bad. Rather, the role of the DOJ, and more specifically, the Civil Rights Division, is to reduce discriminatory behavior. At times, this requires bringing an enforcement action against a discriminatory technology. Commissioner Sonderling explained that his concern with legally disruptive technologies is that they can exponentially scale existing sources of discrimination. A resume screening program, for instance, can sort through thousands of resumes in a matter of seconds. If discriminatory tendencies were baked into the code of the program, the consequences could be disastrous. Commissioner Peirce expressed that she believes A.I. and other technology can exist in harmony with human-produced work.

The second question asked the panelists whether the mass collection of Americans’ data constitutes an injury that gives rise to liability against the collector of that data. Commissioner Sonderling noted that employees have no right to privacy in any activity they engage in while in the workplace. He continued by explaining that the bigger privacy question in employment law is whether this monitoring violates the National Labor Relations Act by stifling employees’ ability to form a union. Deputy Assistant Attorney General Smith said that the collection of data can at times give rise to liability. For instance, the collection of healthcare data may violate the Health Insurance Portability and Accountability Act, and the collection of data to detect crime could have Fourth Amendment implications. Commissioner Peirce said that the SEC frequently tells employers to monitor their employees’ devices for misconduct. The question of privacy becomes quite murky, she explained, when workplace activity is done on private devices.

The final question asked the panel how they plan to commence enforcement actions against new technologies for both disparate treatment and disparate impact. Deputy Assistant Attorney General Smith remarked that the major question facing the DOJ is whom to hold responsible: is the discriminatory outcome the fault of the individual who bought the software, or of the individual who designed it? Commissioner Sonderling explained that the likely outcome of these suits depends on which expert witness is the most effective at trial.