tl;dr: If you asked me today whether one should 1) learn to code, or 2) go to law school, and what the future of the profession looks like for both coders and lawyers, I would say the future is bright. AI has lowered the transactions costs of writing and running code. This means more code, more applications, more bugs, more deals, more lawsuits, more of everything we need readers, interpreters, and authors of code to do. It matters less and less which language the code is written in, or whether its audience is human or machine. At least until AGI makes all of us obsolete, I believe these are some of the best skills to develop in a post-AI economy.
Long version
I'm a lawyer who writes code. I have been a student of AI for over a decade now, having caught a glimpse of the future after ImageNet (the competition whose 2012 results demonstrated that deep neural networks could recognize objects in photos) and having worked with the incredible team at Yahoo Labs, whom I was fortunate to have had the opportunity to support as a lawyer. For the last two years, since recognizing how useful it could be in writing code, I've been completely immersed in AI. The first version of this patent valuation project (https://github.com/riemannzeta/xena) took me months to complete when I built it at Yahoo circa 2013, and an afternoon when I rebuilt it from scratch using ChatGPT in 2023.
Large Language Models (LLMs) based on the transformer architecture (the attention-based breakthrough that enabled ChatGPT) are fascinating by themselves, but not particularly useful without at least two additional ingredients: a) context and b) tools.
Context
Context is critical for responses to be useful because the entire body of knowledge on which LLMs are trained is incomprehensibly large and diverse. Context is necessary to identify the sparse regions of their high-dimensional space that are relevant to the specific question being asked. The importance of making context explicit for LLMs is at least part of the reason why only mediocre talents like myself were quick to appreciate their power: there's a barbell effect whereby novices get bad responses because they have no sense of what context is appropriate, and experts are frustrated because the LLMs have almost no relevant context for the frontiers of knowledge at which the experts operate. LLMs are useful for moving up the steepest part of the learning curve, not at the flat bottom or top edges.
Tools
Tools are critical because even with the ability to give a useful response and a sense of the relevant context, supplying LLMs that context is an unscalable manual process until the LLMs themselves can search for, fetch, and filter it. Tools are useful too if you want to actually do anything with a response, but the tools I mean here are those an LLM needs to construct its own context: think web search, database queries, API calls to bugbases (like Jira), and so on.
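The loop by which an LLM constructs its own context is simple enough to sketch. Below is a minimal, illustrative version in Python; the model call is a stub (no real LLM API is involved) and the tool names, `web_search` and `query_bugbase`, are placeholders of my own, not any vendor's interface:

```python
# Schematic tool-use loop: the "model" decides whether to call a tool or
# answer; tool results are appended to context and fed back to the model.

def web_search(query: str) -> str:
    """Illustrative tool: stands in for a real web search call."""
    return f"top result for {query!r}"

def query_bugbase(ticket: str) -> str:
    """Illustrative tool: stands in for a Jira-style ticket lookup."""
    return f"details of ticket {ticket}"

TOOLS = {"web_search": web_search, "query_bugbase": query_bugbase}

def fake_model(prompt: str, context: list[str]) -> dict:
    """Stub for an LLM call. A real model would decide, from the prompt and
    accumulated context, whether to request a tool or produce an answer."""
    if not context:
        return {"action": "tool", "name": "web_search", "args": [prompt]}
    return {"action": "answer",
            "text": f"answer using {len(context)} piece(s) of context"}

def agent_loop(prompt: str, max_steps: int = 5) -> str:
    """Run the model until it answers, executing any tools it requests."""
    context: list[str] = []
    for _ in range(max_steps):
        step = fake_model(prompt, context)
        if step["action"] == "answer":
            return step["text"]
        # The model asked for a tool: run it, add the result to context.
        result = TOOLS[step["name"]](*step["args"])
        context.append(result)
    return "gave up after max_steps"

print(agent_loop("prior art for widget patent"))
```

The point of the sketch is the shape of the loop, not the stubbed internals: once the model can request its own searches and lookups, gathering context stops being a manual bottleneck.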
But give even the current generation of LLMs the context they need and the appropriate string of prompts, and they are capable of fully automating workflows that until recently could be completed only by trained experts. In my field of patent law, hourly rates run in the $100s or $1,000s and turnaround takes weeks or months. A team of LLMs with tools can do the same work that would have been billed in the $1,000s for $1s, and in minutes.
The Coase Theorem and Transactions Costs
So why am I saying that it's a great time to learn to code and practice law? Because of Ronald Coase (and the professors I had at the University of Chicago who introduced me to his work). Coase won the Nobel Memorial Prize in Economics for work now known as the Coase Theorem, which posits that if “transactions costs” are sufficiently low, private parties can negotiate their way to an efficient reallocation of rights or resources. “Transactions costs” means anything and everything that prevents the parties from reaching a private agreement: the brokerage fee or bid-ask spread in a stock trade, the cost of identifying potential counterparties through search, the costs of due diligence in M&A and of discovery in litigation.
What I am seeing firsthand is that LLMs with tools are reducing the transactions costs associated with certain types of legal work by a factor of 100, and the turnaround time on projects that would otherwise be done by humans by a factor of 1,000,000 or more. The transactions costs of both traditional transactional work and traditional litigation work are not zero, but they're so much lower than they have been for the entirety of human history that nobody yet has a clear idea of what new private agreements are now possible. Tasks that were impossible to complete without a small army of legal experts are being automated: analyzing discovery, drafting filings including pleadings and briefs, drafting contracts, due diligence. You name it; with the right tools, an LLM can do it.
But if LLMs can do it, then how can I believe that there is going to be a need for more coders and lawyers? Because as Coase understood, when something you want to do anyway gets cheaper, there is going to be more of it — a lot more of it. More deals, more lawsuits, more of everything people might ask a lawyer to do. Way more than any of us are prepared for.
The only way out is through. We're in the quiet before the storm right now, like the lull as the shore recedes before a tsunami. Should you still learn to code, or learn to practice law, before the tsunami strikes? There has never been a better time than now. While LLMs with tools can do the work, we are still going to need humans to act on what those LLMs enable. Humans will remain the preferred interface for humans. Humans who know how to harness the power of code in all its forms will be in more demand, not less.
Learning to code and learning to practice law have more in common than many people realize. At bottom, the core skill in both is reasoning. There has never been a better or more important time for humans to develop their reasoning ability for the purpose of helping other humans.