Recently I noticed this article on Artificial Lawyer. The title is Generative Legal AI + “The Last Human Mile”, and it’s about the limits of applying AI to legal work. It says this:
The last mile problem is a well-developed theory that many systems fail because there are some key steps at the end that cannot be done properly and that ruins the whole thing. We can extend this to ‘the last human mile’, i.e. that you need a human in the loop when we get into areas of highly complex unstructured data and where you need to trust that there is an actual expert there who is alive to all the needs and risks and the very real human intricacies of the situation.
The article suggests that applying the notion of “the last human mile” to AI would help address current shortcomings in using AI for legal work. (I discuss in this recent blog post the limitations of using AI for contract drafting.)
Some are applying this approach. For example, this 2021 article on Artificial Lawyer describes how LawGeex, which offers an AI contract-review service, decided to supplement its AI offering with a human component. But I suggest that to apply last-mile thinking to AI is to misunderstand the nature of the problem.
When you present a human (whether it’s a customer or someone at a vendor) with AI results, you’re not inviting them to take additional steps that pick up where previous work left off. Instead, if they’re inclined to do anything other than rely on the results, the only meaningful option open to them would be to check the work. Instead of covering just the last mile, they would retrace some or all of the journey. That seems inefficient. Furthermore, checking AI results requires expertise—exactly the sort of expertise presumably lacking in, or expensive for, anyone who wants to use AI to tackle a given task. That seems ineffective.
So what do you do? You could do what LegalSifter is doing for its contract-review product, as discussed in this 2021 post on LegalSifter’s blog—combining its artificial intelligence with expertise, so instead of just replicating patterns the AI finds, you’re able to attribute significance to them. (I’m LegalSifter’s chief content officer.)
The best example of “the last human mile” in legal work that I’m aware of is a dreadfully old-fashioned one: inviting the user to make choices in a questionnaire-driven automated contract template. Too bad no one has yet done that properly—we’re always too busy looking for tech to save the legal profession from itself.
Note: This post also appears on Ken Adams's own blog.