
Clients Turning to AI

Evaluating Your Work Product and Professional Services

The author states that over the past year, he has seen clients use AI tools to “check” his work, draft their own valuation commentary, and even generate critiques of his reports. While he welcomes their curiosity, some encounters have been alarming.


Artificial Intelligence (AI) is no longer just a topic for webinars or conference panels. It is now part of daily client interactions. Over the past year, I have seen clients use AI tools to “check” my work, draft their own valuation commentary, and even generate critiques of my reports. While I welcome curiosity, some encounters have been alarming. Ready or not, our clients are using AI, too.

The Good: Curiosity and Engagement

On the positive side, AI has sparked a new level of curiosity among clients. It can explain complex valuation concepts at a basic level, helping clients engage more fully in the process, and it creates opportunities for me to explain methodology in ways that resonate.

AI can also improve the client review process. When a report clearly lays out assumptions, AI can highlight gaps or generate clarifying questions. In one collaborative divorce case, a client used ChatGPT to draft questions based on my report. Many of the questions were redundant or stemmed from rounding issues the report had already addressed. Instead of pushing back, I used the moment to educate, reinforcing trust rather than creating conflict.

The Bad: False Confidence and Hallucinations

One limitation is that AI-generated outputs may contain errors, incorrect formulas, fabricated sources, or oversimplified assumptions, all presented in an authoritative tone. Clients should also understand that AI tools process long documents by splitting them into smaller chunks and then combining the results. A tool may likewise interpret headers in the schedules differently from those in the report, so the DCF discussion in the narrative may never be connected with the corresponding DCF section in the schedules.
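
To make the chunking point concrete, here is a minimal sketch, assuming a generic fixed-size splitter rather than any particular vendor's pipeline; the section names and figures are hypothetical. It shows how a DCF narrative and the DCF schedule that supports it can land in different chunks, so a question answered from one chunk never "sees" the other.

  # A minimal, generic sketch of fixed-size chunking, the kind of preprocessing
  # many AI tools apply to long documents. The report text, section names, and
  # figures below are hypothetical and exist only to illustrate the point.

  report_text = (
      "SECTION 4 - INCOME APPROACH\n"
      "The discounted cash flow (DCF) analysis applies an 18.0% discount rate.\n"
      + "Intervening narrative about the subject company and its market.\n" * 30
      + "SCHEDULE 7 - DCF CALCULATION\n"
      "Year 1 free cash flow 1,250,000; present value factor 0.847; ...\n"
  )

  def chunk(text, size=500):
      """Split text into fixed-size character chunks with no overlap."""
      return [text[i:i + size] for i in range(0, len(text), size)]

  chunks = chunk(report_text)

  # Report which chunk contains the DCF narrative and which contains the schedule.
  for i, piece in enumerate(chunks):
      labels = []
      if "INCOME APPROACH" in piece:
          labels.append("DCF narrative")
      if "SCHEDULE 7" in piece:
          labels.append("DCF schedule")
      if labels:
          print(f"chunk {i}: {', '.join(labels)}")

  # Because the narrative and the schedule fall into different chunks, a model
  # answering from one chunk can miss the context supplied by the other.

Running the sketch prints the narrative in an early chunk and the schedule in a later one, which is exactly the disconnect a client's AI critique can inherit.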

Many platforms, including ChatGPT and Google Gemini, store interactions by default. Unless privacy settings are adjusted, anything entered into them, confidential client data, business records, or sensitive case details, may be retained, used to train future models, and potentially exposed in later model updates. These tools are not inherently private; it is the user's responsibility to configure them securely.

The Fear: Misplaced Trust

The real danger is not the AI critique itself, but its psychological effect on clients. What happens when a client believes AI’s version over yours? In litigation especially, where emotions run high and trust runs low, this misplaced confidence can be toxic. Unlike Excel, which reveals its formulas, AI conceals its logic. What appears precise is often just probabilities, patterns, or even pure invention.

The Professional Challenge: Guardrails and Judgment

These experiences raise important questions for practitioners: How do we respond when clients introduce AI into the valuation process? Do we embrace it, ignore it, or push back? My answer is that we must return to the fundamentals of professional judgment, standards, and ethics.

As I wrote previously in Should AI Be Disclosed?, the test is not whether AI was used, but whether our services meet the standards of knowledge, experience, education, training, and skills (KEETS) that underpin credible valuation work. AI can assist, but it cannot replace judgment. If we abandon those responsibilities, we risk letting tools dictate outcomes instead of guiding them.

Moving Forward

AI is not going away, and neither are the clients who will continue to experiment with it. The best path forward is leadership, not resistance. We must:

  • Educate clients on both the potential and pitfalls of AI.
  • Clarify misinterpretations when AI raises inaccurate critiques, showing clients how context changes the answer.
  • Reinforce trust by demonstrating how professional standards and judgment outweigh machine-generated shortcuts.

When AI-driven questions land on your desk, do not treat them as attacks. Stay calm, provide clear answers, and consider how AI may have misinterpreted your work. AI is neither friend nor foe. It is another input requiring context, skepticism, and above all, professional judgment.


Nick Mears, MBA, CVA, MAFF, is Managing Member of Caprock Business Consulting, LLC, a Lubbock, Texas business consulting, valuation, and litigation support firm. He is also a Member of NACVA’s Ethics Oversight Board as well as NACVA’s AI Machine Learning Commission.

Mr. Mears can be contacted at (806) 853-7832 or by e-mail to nick@caprockbc.com.

The National Association of Certified Valuators and Analysts (NACVA) supports the users of business and intangible asset valuation services and financial forensic services, including damages determinations of all kinds and fraud detection and prevention, by training and certifying financial professionals in these disciplines.
