
Many Aren’t Comfortable With Legal Using Generative AI. But That Could Soon Change

As artificial intelligence reshapes industry after industry, the legal sector remains uncertain about its place in that shift. While tech experts debate how and when to deploy generative AI, a recent survey reveals that many consumers are not yet ready to embrace the technology in the legal realm.

The survey, conducted in June 2023 by e-discovery provider DISCO, involved 1,000 U.S.-based consumers over the age of 18. It shed light on lingering concerns surrounding generative AI, particularly when used in regulated industries like law.

Nearly 39% of respondents expressed discomfort with any highly regulated industry employing generative AI, including healthcare and finance. More striking still, only 21% said they were comfortable with the legal sector in particular using the technology.

One of the primary reasons behind this hesitancy seems to be related to privacy and security risks. Approximately 31% of respondents expressed significant concerns about their privacy and security when using generative AI, especially in financial and legal settings.

Tom Furr, DISCO’s Chief Marketing Officer, believes that consumers seek innovation but also demand reassurance through thorough testing and implementation. He remarked, “I’m ready to adopt something really new and cool, but once you’ve road tested it for [a number of] years.”

Specifically, consumers harbor anxiety about the legal industry’s use cases for generative AI. Almost a third of respondents opposed any application of the technology in the legal world. The prevailing narrative around generative AI often depicts it as intimidating, job-threatening, and potentially dangerous to humanity, which fuels these anxieties.


Nonetheless, certain legal use cases for generative AI could garner more acceptance. For instance, 39% of respondents expressed approval for the technology’s use in record keeping and documentation, while 35% endorsed citing past cases, and 28% supported drafting legal documents and contracts.

The key factor that could instill confidence in consumers is the assurance of human management. Approximately 65% of surveyed individuals indicated that knowing humans are still in control would increase their trust in industries leveraging generative AI. Following closely, around 48% sought regulatory oversight and transparency as comforting factors.

Furr urged businesses to be cautious and thorough when implementing generative AI solutions, emphasizing the importance of deploying technology that has been rigorously tested. A company's track record can also be telling: those with extensive experience in AI research may prove more trustworthy than newer entrants.

Looking ahead, Furr expects attitudes to change as consumers become more familiar with generative AI. Currently, 62% of respondents have not yet used any generative AI tools for work-related tasks. He believes that proving the technology’s safety and value will be essential to building consumer confidence.

In Furr’s words, “We’re going to end up with a kind of a checklist of, for you to be comfortable using this technology, it needs to be able to do this. And I think that’s what we’re going to be solving for over time versus just trying to keep up with the cool things that ChatGPT can do.”

While the legal industry grapples with public perceptions and concerns surrounding generative AI, it’s evident that transparency, regulatory oversight, and human involvement are crucial elements in winning consumer trust. As technology continues to evolve, it remains to be seen how these attitudes will shift, ultimately determining the future role of generative AI in the legal world.
