
Hot topic: Legal, Regulatory & Compliance Considerations about ChatGPT

Not a day goes by without a headline about new capabilities, use cases and products in the generative AI space. Today, we are looking at ChatGPT.


In brief

  • This article elaborates on legal, regulatory and compliance considerations concerning ChatGPT.
  • It also discusses recent regulatory developments relating to AI and, in particular, data protection.

ChatGPT is the current hot topic being discussed across the globe. Garnering over 100 million active users in the first two months since its launch on 30 November 2022, it is disrupting industries worldwide. The hype around ChatGPT is understandable given its impressive capabilities; however, its widespread use has sparked debates concerning various legal, regulatory and compliance considerations.

Disrupting industries worldwide
100 million active users in the first two months since the ChatGPT launch.

ChatGPT is a powerful artificial intelligence technology developed by the American AI company, OpenAI. It is built on a large language model (LLM) that uses deep learning techniques to generate human-like responses to natural language input. ChatGPT has been trained on vast amounts of text data, which it uses to predict outputs in a text-based format.

The release of ChatGPT has highlighted the potential benefits of adopting generative AI in a variety of fields, including the legal sector. Within the legal industry, for example, ChatGPT can be useful for summarizing case studies and automating mundane administrative tasks.

Despite the impressive opportunities and capabilities of ChatGPT, there are also various risks. A significant challenge of ChatGPT is its potential to produce biased or discriminatory outputs. The use of ChatGPT also raises ethical concerns regarding its potential to manipulate or deceive others. ChatGPT is prone to hallucinations and can generate false or misleading information that could harm consumers. In a legal context, questions may arise regarding the degree to which a person may rely on information provided by ChatGPT or other generative AI. Additionally, while ChatGPT has certain safeguards that prevent it from outputting offensive and harmful content, these safeguards can be bypassed and exploited for purposes such as writing malicious code.

Of particular concern is the risk that ChatGPT poses to data privacy. The data collection method used to train ChatGPT may be unlawful if data was scraped from a source without the consent of the data subjects. The data collected may include personal or sensitive data for which consent is required under relevant data protection laws, notably the EU General Data Protection Regulation (GDPR) and the Swiss Federal Act on Data Protection (FADP). Transparency is another important aspect of data privacy in the context of ChatGPT. Users should be informed of how their personal data is being collected, processed and used, and should have the ability to access or control their personal data as required.


From a regulatory perspective, there have been growing calls to halt further releases of ChatGPT and investigate its developer, OpenAI, due to data privacy, misinformation and cybersecurity concerns. The Swiss Federal Data Protection and Information Commissioner released a statement providing cautionary advice for organizations intending to use ChatGPT. Additionally, the European Data Protection Board announced the creation of a task force established specifically to investigate ChatGPT, reiterating the need for proper regulation of such AI systems.

The rapid rise of ChatGPT has also disrupted legislative debates on the EU Artificial Intelligence Act (AI Act), which aims to regulate the development and use of AI in the European Union. Concerns have been raised about how generative AI such as ChatGPT will be regulated under the upcoming AI Act. Given the significant advances made by ChatGPT and its potential impact on society, EU legislators are pushing for stringent requirements on those operating such AI systems.

Summary

Organizations intending to utilize ChatGPT must ensure compliance with applicable legislation, particularly legislation pertaining to AI and data protection. At EY, we can support your organization on a broad range of topics and challenges regarding the use of ChatGPT, including governance, legal, regulatory and compliance matters.

Acknowledgement

We would like to thank Cathy O’Neill for her valuable contributions to this article.

Related articles

A new Era for Data Protection in Switzerland – Are you ready?