Executive Order regarding Artificial Intelligence
On the heels of many tech industry leaders voicing concern about the increasing sophistication of artificial intelligence (AI) and the prospect of an AI singularity, President Biden recently signed an Executive Order creating some ground rules for the development, testing and use of AI.
The October 30, 2023, Executive Order creates reporting requirements, seeks to protect Americans’ privacy, and seeks to prevent AI from being used for discriminatory purposes. On the reporting side, for example, the Order requires American companies developing any foundation model that poses a serious risk to national security, national economic security, or national public health and safety to notify the federal government when training the model and to share the results of all red-team safety tests. Various federal agencies will be responsible for developing standards for red-team testing and for applying those standards to critical infrastructure systems, such as homeland security and energy distribution systems.
In what will likely be among the most difficult tasks to implement effectively, the Executive Order also seeks to protect Americans’ privacy online and prevent AI-generated cyber fraud against consumers. Among other measures, the Department of Commerce is charged with developing guidance for content authentication and watermarking to clearly label AI-generated content.
The Executive Order provides some, albeit limited, guidance on the use of AI in the judicial system. In the criminal justice system, President Biden calls for developing “best practices” on the use of AI in sentencing, parole and probation, pretrial release and detention, risk assessments, surveillance, crime forecasting and predictive policing, and forensic analysis. The Order is more oblique on the civil litigation side of the judicial system, stating more generally that different agencies should issue guidelines to prevent AI from exacerbating discrimination.
As AI becomes more sophisticated, it is being used more frequently in civil litigation. While AI-generated legal briefs citing fictitious cases have garnered attention and sanctions (see https://www.gstexlaw.com/chatgpt-sanctions/), AI systems are already in use in many other areas of civil litigation. Document review and summarization and the identification and preparation of exhibits for deposition and trial are just two examples. However, the use of AI in jury selection has the greatest potential to undermine confidence in the impartiality of our rules-based system of government.
Courts long ago prohibited the use of race as a factor in selecting jurors, yet the practice persists and is periodically exposed and condemned (see https://www.gstexlaw.com/racial-bias-nullifies-9-6-million-verdict/). AI can make race-based juror selection both easier to implement and more difficult to detect. By collecting and analyzing volumes of public (and private) data, AI juror selection programs can identify jurors more likely to sympathize with certain types of cases or to award large verdicts. While the data may appear objectively neutral, it may de facto categorize prospective jurors by race, religion or other protected characteristics, reinforcing the public’s already jaundiced view of the judicial process. It is unclear how, if at all, the Executive Order will prevent AI systems from being used for these purposes, or whether the task will fall to the numerous federal and state judicial systems to find their own solutions.
With the regulations required by the Executive Order taking effect between 90 and 365 days after its issuance, knowing what guidelines and processes are put in place is important for everyone developing, testing or using AI systems, no matter the purpose. If you have any questions about the AI Executive Order, please contact us at info@gstexlaw.com.
Legal Disclaimers
This blog is made available by Gerstle Snelson, LLP for educational purposes and to provide general information about the law only. Neither this document nor the information contained in it is intended to constitute legal advice on any specific matter or of a general nature. Use of the blog does not create an attorney-client relationship with Gerstle Snelson, LLP where one does not already exist with the firm. This blog should not be used as a substitute for competent legal advice from a licensed attorney.
©Gerstle Snelson, LLP 2023. All rights reserved. Any unauthorized reprint or use of this material is prohibited. No part of this blog may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or by any information storage or retrieval system without the express written permission of Gerstle Snelson, LLP.