Balancing the BOTS: How to Harness the Power of AI without Falling Down the Rabbit Hole

What is ChatGPT and How Will It Affect My Business?

Business owners should be aware of both the benefits and the legal pitfalls that lurk in the world of Artificial Intelligence (“AI”). OpenAI’s latest creation, ChatGPT, is a language-based AI chatbot that relies on user inputs to generate conversational, human-like responses. ChatGPT boasts a broad range of skills, including proofreading, translating, researching, and efficiently analyzing large bodies of data.

AI can assist companies in optimizing processes for customer support, data analysis, employee training and onboarding, and can even automate other processes such as the dissemination and execution of consent forms, disclosures and disclaimers. Automating some of these procedures may be enticing to a company looking to reduce overhead.

But which of AI’s skills should your company take advantage of? And which should it avoid?

1.  Be Careful What You (and Your Employees) Say to ChatGPT.

Unlike a typical chatbot, ChatGPT’s knowledge is based on user input. Any data disclosed to ChatGPT is retained by the AI model to 1) personalize future conversations and 2) expand its knowledge base.

In April of 2023, Samsung engineers were using ChatGPT to analyze bugs in the source code for a new program when they inadvertently leaked confidential source code and internal meeting notes to ChatGPT. Samsung subsequently banned its employees from using generative AI models for work-related purposes, citing a “growing concern about security risks presented by generative AI.”

When disclosing data to ChatGPT, take heed that any user input could be inadvertently disclosed to a third party.

2. Consider Adding a ChatGPT Provision to Your Employee Handbook and Business Contracts.

To combat potential exposure related to employee use (or misuse) of AI, consider adding a ChatGPT prohibition to your company’s employee handbook, particularly if numerous employees have access to sensitive data.

If an employee inadvertently submits a document to ChatGPT that contains confidential or sensitive data, such as trade secrets, price lists, client contacts, or budgets, the company could expose itself to liability, particularly if it is contractually obligated to keep that information confidential.

Depending on the terms of the agreement, a company that breaches a confidentiality provision could lose its contractual rights or face penalties. The agreement could even be terminated, resulting in lost revenue and reputational harm. Finally, the company in violation could face a negligence lawsuit based on the disclosure.

Adding an AI provision to an employee handbook can help a business defend itself should an employee misuse AI while on the clock. The handbook should warn employees not to disclose private, confidential or proprietary information to any generative AI platform, such as ChatGPT.

For the same reasons, companies should consider adding an AI provision to contracts with vendors and subcontractors to ensure the integrity and security of their data.

3.  Do Not Rely on AI in Employment Practices.

In the employment realm, ChatGPT can screen candidates and evaluate resumes. However, these tools have drawn criticism from diversity advocates and have become the subject of legislation. On July 5, 2023, enforcement of New York City’s Local Law 144 began, prohibiting employers from using automated employment decision tools (such as ChatGPT) unless the tool has undergone a bias audit within the prior year.

Since ChatGPT’s knowledge base is drawn from multiple sources, including thousands of users, it is subject to the same biases its users and sources may possess. In January of 2023, German researchers found that ChatGPT exhibits a political leaning, raising concerns about AI’s objectivity. In February of 2023, a Forbes editorial accused ChatGPT of political bias after it provided favorable commentary on some politicians but not others.

In 2018, Amazon was building an AI recruitment tool intended to streamline its hiring process. Amazon engineers abandoned the project after they determined that its selections were biased against women. But why would AI have a bias against women? An AI model is only as good as the data it has been provided, and Amazon’s algorithm relied on historical hiring data to screen and select candidates. Consider, for example, that in 2021 women made up only 15% of C-suite executives at S&P 500 companies. A model trained on data like that will conclude that men are statistically more likely to hold C-suite positions, and it may silently carry that bias into the hiring process. Best hiring practice still requires humans with hearts and minds to make thoughtful employment decisions.

4.  Do Not Rely on AI for Legal Advice.

Whatever your legal needs, it is critical to consult with an experienced attorney who is familiar with the laws in your jurisdiction. The law changes frequently and varies greatly between jurisdictions, such that AI should never be relied on for legal advice.

ChatGPT has been known to fabricate data to fill in gaps in its knowledge. In May of 2023, a New York City litigator asked ChatGPT to conduct legal research for a brief. ChatGPT generated false citations to fictional cases and then assured the unsuspecting attorney that the citations were real and could be “found in reputable legal databases such as LexisNexis and Westlaw.” Such responses from ChatGPT are now referred to as “hallucinations.”

ChatGPT is aware of its own limitations. When asked if it can provide legal advice, ChatGPT wisely responds with “[n]o, I cannot provide legal advice. I am an AI language model and not a licensed attorney. Legal matters require expertise and knowledge of specific laws and regulations that can vary by jurisdiction. It is always recommended to consult with a qualified attorney or legal professional for personalized legal advice pertaining to your specific situation.”

For more information, please contact Kelly Cronin at kelly.cronin@sfbbg.com or at 312-648-2300.
