
AI in Professional Services: How Firms Are Using It for Due Diligence and Research

Ger Perdisatt

Founder, Acuity AI Advisory

AI is changing due diligence and research in professional services — but the governance questions it raises are as important as the efficiency gains. Here is what is working and what requires careful management.

Professional services firms — legal, accounting, consulting, corporate finance — have been among the earliest adopters of AI tools for substantive work. The economics are clear: highly paid professional time spent on document review, entity research, and background investigation is an obvious candidate for AI assistance. The productivity gains in the right applications are real.

What has received less attention is the governance layer. When AI is assisting with due diligence outputs that inform significant commercial decisions, the obligations on the firm — and the risk if things go wrong — require more than a usage policy and a disclaimer.

What is actually working

Large-scale document review is the clearest current success. AI tools that can ingest large volumes of contracts, corporate records, and correspondence and surface material clauses, anomalies, or issues are delivering genuine value. The application is well-suited to AI: the task is pattern recognition across structured text at a volume that would be prohibitively expensive to review manually, and the output is a flagged set of items for human review rather than a final conclusion. The professional retains the analytical judgement. The AI does the scanning.
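The flag-for-review shape of this workflow can be illustrated with a deliberately trivial keyword scan. Real tools use language models rather than regular expressions, but the output contract is the same: flagged passages for a human to assess, never conclusions. The clause names and patterns below are hypothetical examples, not any vendor's taxonomy.

```python
import re

# Hypothetical clause patterns a reviewer might want surfaced.
FLAG_PATTERNS = {
    "change_of_control": re.compile(r"change\s+of\s+control", re.IGNORECASE),
    "indemnity": re.compile(r"\bindemnif(y|ies|ication)\b", re.IGNORECASE),
    "exclusivity": re.compile(r"\bexclusiv(e|ity)\b", re.IGNORECASE),
}

def flag_clauses(document: str) -> dict:
    """Return sentences containing flagged patterns, grouped by flag name.

    The output is a work queue for human review, not an analysis.
    """
    hits = {name: [] for name in FLAG_PATTERNS}
    # Crude sentence split on terminal punctuation; adequate for a sketch.
    for sentence in re.split(r"(?<=[.;])\s+", document):
        for name, pattern in FLAG_PATTERNS.items():
            if pattern.search(sentence):
                hits[name].append(sentence.strip())
    return {name: found for name, found in hits.items() if found}
```

The design point is the return type: the function hands back flagged text for a professional to read in context, and nothing in the pipeline turns a flag into a finding.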

Entity and ownership research is another productive application. Tracing corporate structures, identifying beneficial ownership, cross-referencing regulatory filings, and flagging sanctions exposure across multiple jurisdictions is work that AI tools can assist with significantly. The output quality depends heavily on the underlying data the tool accesses — a closed model without current data is less useful — but for structured research tasks, AI is accelerating work that previously required significant analyst time.

Red flag identification in financial due diligence — identifying transactions that look anomalous, related-party structures that require scrutiny, or inconsistencies between reported figures and underlying documentation — is an area where AI tools are being piloted by the larger accountancy practices. The results are promising, with the important caveat that AI-flagged items require experienced human review before they become conclusions.

What requires governance attention

The professional liability exposure is the first consideration. If an AI-assisted due diligence process misses a material issue that a thorough human review would have caught, the liability question is not simplified by the involvement of AI. In some respects it is more complex — was the AI tool used within its design parameters? Was the output reviewed by a competent professional? Was the client informed about the use of AI in the process?

Data confidentiality is a second concern. Client information uploaded to a cloud-based AI tool may be processed in ways that create confidentiality obligations or data protection issues. Many professional services firms are addressing this through private deployment of AI models on their own infrastructure, rather than using public commercial services. That approach carries its own cost and complexity, but is the correct answer for client data.

EU AI Act classification

Several AI applications in professional services fall into territory the EU AI Act regulates closely. AI systems used to evaluate or score individuals for creditworthiness, suitability, or risk may qualify as high-risk systems under the Act. Firms using AI to assist with KYC, credit analysis, or financial risk assessment should seek a proper classification of those tools against the Act's framework, rather than assuming that "professional advice with AI assistance" falls outside regulatory scope.

The Act applies to deployers, not just developers. A professional services firm that uses a vendor's AI tool for due diligence is a deployer under the Act and has obligations accordingly.

The quality assurance requirement

The fundamental governance requirement for AI-assisted professional work is that the output is reviewed by a competent professional who takes responsibility for it. AI can draft, flag, and surface. The professional approves, modifies, and owns. Any workflow that allows AI outputs to move directly to client deliverables without substantive professional review is not a responsible deployment, regardless of how impressive the AI tool's accuracy statistics look in controlled conditions.
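That review gate can be expressed as a simple structural rule: nothing reaches a client deliverable without a named professional's sign-off. The sketch below is a hypothetical illustration of that rule, not any firm's actual system; all names and fields are assumptions.

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class Finding:
    """An AI-flagged item awaiting professional review."""
    description: str
    ai_confidence: float            # the model's own score; never a substitute for review
    reviewed_by: Optional[str] = None  # named professional who takes responsibility
    approved: bool = False

def release_to_deliverable(findings: List[Finding]) -> List[Finding]:
    """Gate findings into a client deliverable.

    Raises if any finding lacks a named reviewer; returns only approved items.
    """
    unreviewed = [f for f in findings if f.reviewed_by is None]
    if unreviewed:
        raise ValueError(f"{len(unreviewed)} finding(s) lack professional sign-off")
    return [f for f in findings if f.approved]
```

The gate fails closed: a missing reviewer is an error, not a warning, so an AI output can never drift into a deliverable by default.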


If you want an independent view of how to deploy AI in professional services due diligence responsibly — including EU AI Act classification and governance framework design — get in touch.
