The Hidden Dangers of Using AI to Prepare for Mediation
- James Shafman

- Oct 17, 2025
- 1 min read
As artificial intelligence tools become more accessible, it's tempting for lawyers and adjusters to lean on them for efficiency, from summarizing briefs to predicting case outcomes. But when it comes to mediation, AI's convenience can come at a cost.
Many AI platforms store and learn from user inputs. Uploading mediation briefs, medical records, or settlement positions into a public or cloud-based AI tool can expose sensitive information. Once data is entered, it may not be retrievable or erasable, and that can compromise both client confidentiality and mediation privilege.
AI thrives on patterns and averages, but mediation is anything but average. Tools that "predict" likely outcomes can't capture nuances like witness credibility, sympathy factors, or the unique dynamics between counsel. Overreliance on algorithms can narrow negotiation strategy and reduce creative problem-solving.
Mediation is as much about emotion and timing as it is about facts and law. No AI can replicate the instinct to read a room, pivot strategy, or sense when a client is ready to move. Delegating too much preparation to automation risks dulling the professional intuition that drives settlement success.
AI can be a helpful assistant, but it shouldn't be your strategist. Lawyers who use it thoughtfully, without outsourcing their judgment or breaching confidentiality, will remain the most effective advocates at mediation.
At Shafman Resolutions, I've seen first-hand that preparation grounded in human judgment leads to better outcomes.
Booking mediations in the near future? There's no substitute for real experience in the room.