Wakeling was particularly impressed with Harvey’s skill at translation. It is strong in mainstream law but struggles in certain niches, where it is more prone to hallucinations. “We know the limits and people are very well informed about the risk of hallucinations,” he says. “At the firm, we have done a great job of developing an extensive training program.”
Other lawyers who spoke to WIRED were cautiously optimistic about the use of AI in their practice.
“This is certainly very exciting and definitely indicative of some of the fantastic innovation going on in the legal industry,” says Sian Ashton, Client Transformation Partner at law firm TLT. “However, it’s definitely a tool in its infancy and I’m wondering if it really does much more than provide case documents that are already available in business or subscription services.”
AI is likely to continue to be used for entry-level work, says Daniel Seredyuk, a data protection lawyer based in Paris, France. “Drafting legal documents can be a very time-consuming task, which AI seems to do quite well. Contracts, policies, and other legal documents tend to be regulatory in nature, so AI’s ability to collect and synthesize information can do much of the heavy lifting.”
But as Allen & Overy found, the results of the AI platform will require careful analysis, he says. “Part of practicing law is understanding your client’s specific circumstances, so the outcome will rarely be optimal.”
Seredyuk says that while the results of legal AI will require close monitoring, the inputs can be just as difficult to manage. “Data sent to the AI could become part of the data model and/or training data, and this would likely violate confidentiality obligations to clients, as well as individuals’ data protection and privacy rights,” he says.
This is especially true in Europe, where the use of this type of AI could violate the principles of the EU General Data Protection Regulation (GDPR), which governs how companies may collect and process data about individuals.
“Can you legally use a program built on that foundation [of mass data scraping]? In my opinion, this is an open question,” says data protection expert Robert Bateman.
Law firms will likely need a firm legal basis under the GDPR to pass any personal data about clients they control to a generative AI tool like Harvey, as well as contracts in place covering the processing of that data by the third parties operating the AI tools, Bateman says.
Wakeling says Allen & Overy is not using personal data in its deployment of Harvey, and will not do so unless it is confident that any data would be ring-fenced and protected from any other use. It will be up to the firm’s information security department to decide when that requirement is met. “We take client data very seriously,” Wakeling says. “At the moment, we’re using it as a non-personal-data, non-client-data system, to save time on research or drafting, or preparing a plan for slides and the like.”
Regulators internationally are getting tougher on feeding personal data to generative AI tools. In Europe, the EU’s Artificial Intelligence Act aims to regulate the use of AI more strictly. In early February, Italy’s data protection agency intervened to prevent the generative AI chatbot Replika from using the personal data of its users.
But Wakeling believes Allen & Overy can leverage AI while keeping client data secure and improving how the firm works. “This will have a significant impact on productivity and efficiency,” he says. Small tasks that would otherwise take a lawyer precious minutes can now be outsourced to AI. “If you multiply that across the more than 3,500 lawyers who have access to it now, that’s a lot,” he says. “Even if it’s not total disruption, it’s impressive.”