Legal AI is full of talk about 'explainability', but most of it is smoke and mirrors. If these systems are to be useful in law, they need more than plausible stories; they need legally sound reasoning and real-world rigour.
Black box algorithms and the rights of individuals: no easy solution to the “explainability” problem
The design of modern machine learning systems should account not only for how effectively they solve a given problem, but also for their impact on the rights of individuals. Achieving this may mean adopting technical measures already proven in the IT industry, such as event logs or certification frameworks.
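To make the event-log idea concrete, here is a minimal, hypothetical sketch of what an audit trail for automated decisions could look like: each decision is recorded with its inputs, outcome, model version and timestamp, so that an affected individual (or a reviewer) can later reconstruct what the system did. The names used (log_decision, model_version, the file name, the credit-scoring example) are illustrative assumptions, not part of any existing framework.

```python
import json
import logging
from datetime import datetime, timezone

# Hypothetical decision audit log: every automated decision is written out
# as a structured record with enough context to review it afterwards.
logger = logging.getLogger("decision_audit")
logger.setLevel(logging.INFO)
logger.addHandler(logging.FileHandler("decisions.log"))

def log_decision(case_id, features, prediction, model_version):
    """Append one structured audit record for a single automated decision."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "case_id": case_id,               # identifier of the affected case/individual
        "model_version": model_version,   # which model produced the decision
        "inputs": features,               # the data the decision was based on
        "prediction": prediction,         # the outcome communicated to the individual
    }
    logger.info(json.dumps(record))
    return record

# Illustrative use: recording a hypothetical credit-scoring decision.
log_decision(
    case_id="2024-0173",
    features={"income": 42000, "prior_defaults": 0},
    prediction="approved",
    model_version="risk-model-1.3.0",
)
```

A log of this kind does not by itself explain a decision, but it creates the evidentiary basis on which an explanation, or a challenge by the individual concerned, can later be built.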