I am all for making estate planning services more accessible, and in theory anything that lifts the psychological shadow that seems to lurk whenever someone asks 'Have you made a will?' would be a good thing.
But do I really want an AI chatbot drafting one of the most important documents I will ever make?
The fact is that anyone who values their loved ones and wants their hard-earned wealth preserved should put a will in place.
There is no doubt that technology now plays a vital role in legal services, although it has to be said that the legal profession in general lags behind many other service industries.
At Beswicks we want to make our services more accessible and cost-effective, while managing the quality of the service we deliver. We've approached this by investing in digital tools that help us identify clients who are exposed to risks (such as inheritance tax) and by automating processes that help the team manage communication with our clients.
Yes, the legal profession needs to cotton on to the fact that keeping our clients and professional partners up to date is a must, and technology can help us do this more effectively and efficiently.
We can draft documents using clever question-and-answer sessions, build letters and reports, and analyse our client database to direct news and updates to the right people rather than blitz everyone with the same message. However, at the heart of a legal service that manages risk for families, business owners and their personal estates remains the human being, whose intuition, expertise and ability to connect with the client on an emotional level are essential.
If we could ever instruct our virtual assistant 'Alexa, draft my will', it would need genuine artificial intelligence capable of identifying all manner of financial circumstances and complex family relationships in order to advise on the correct will structure before it could even start drafting the document. What could possibly go wrong!?
What if someone fakes your instructions? How does a chatbot assess mental capacity, which determines whether the final document is valid, and whether litigation might follow from a family member who has inadvertently been left unprovided for?
We want to make the process of protecting assets and family more straightforward. Helping clients interact with us through remote, technology-based fact-finding makes meetings more productive, which means we can offer advice and deliver solutions much more quickly, saving time and cost.
We can build client databases and intelligently educate clients about changes in the law and new thinking. Technology can deepen our knowledge of our clients' circumstances so we can identify the best-suited solutions.
But our human brains are still at the heart of our ability to identify interlaced and overlapping facts and circumstances, and to engender trust and confidence in the outcome of the service.
If that service happens to be a will or a lasting power of attorney, confidence, trust and the knowledge that the right outcome has been achieved are unlikely to follow if the only interaction has been with a robot.
‘Alexa, stop drafting.’