
AI Policy for Professional Services Firms: What to Include and Why

Ger Perdisatt

Founder, Acuity AI Advisory

Professional services firms — law, accountancy, consulting, financial advisory — face specific AI governance obligations. A generic AI acceptable use policy will not meet them. Here is what an effective policy actually needs to cover.

Professional services firms have a particular relationship with AI governance. They advise clients on regulatory compliance, they carry professional liability for the quality of their outputs, and they handle confidential client information under strict professional obligations. When AI is introduced into those workflows, the governance requirements are correspondingly specific.

The policies I see most often in professional services firms are adapted from general corporate AI policies — and they show it. They address general principles (responsible use, data privacy, transparency) without engaging with the specific professional obligations that make an accountancy, law, or financial advisory firm different from any other knowledge business.

This piece sets out what an AI policy for a professional services firm should actually contain.

The professional liability dimension

The most important distinction for professional services firms is professional liability. When a firm provides advice — legal opinion, audit sign-off, financial recommendation — it carries professional responsibility for that advice. The introduction of AI into the process that generates that advice does not transfer or dilute that responsibility.

An AI policy for a professional services firm must be explicit about this. AI tools may assist in research, drafting, analysis, or quality control, but the professional signing off on an output remains responsible for its accuracy and appropriateness. The policy needs to define what level of human review is required before AI-assisted work is delivered to a client, and who is responsible for that review.

This is not about slowing down AI adoption. It is about ensuring that the efficiency gains from AI do not come at the cost of professional standards.

Client confidentiality and data handling

Professional services firms operate under strict confidentiality obligations. In legal services, these are reinforced by legal professional privilege. In financial services, by regulatory data protection requirements. In accountancy, by audit independence rules.

The central question for any professional services AI policy is simple: what client data may be processed by which AI systems, and under what conditions?

Most commercial AI tools process data on infrastructure that is not under the firm's control. Even where data is not retained by the provider, the processing itself occurs outside the firm's security perimeter. This raises genuine questions about confidentiality obligations — questions that a generic data privacy section in an AI policy will not answer.

An effective policy will:

  • Define which client data classifications may be processed by external AI tools
  • Require data processing agreements with AI tool providers where client data is involved
  • Prohibit the use of external AI tools for privileged or highly confidential client materials unless specific safeguards are in place
  • Address the question of client notification — should clients be informed when AI tools are used in delivering their work?

On the last point, different regulatory bodies are arriving at different answers. Firms should anticipate that disclosure obligations will increase, and should design their AI policy to accommodate this rather than resist it.
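The data-handling rules above lend themselves to being expressed as an explicit decision rule rather than prose alone. The sketch below is purely illustrative — the classification names, tool registry, and thresholds are hypothetical, not a real firm's taxonomy — but it shows the shape of a default-deny check that gates client data against an approved-tool register.

```python
# Hypothetical sketch of a pre-submission check gating client data
# against an approved-tool registry. Classification names and tool
# entries are illustrative only.

from enum import IntEnum

class Classification(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    PRIVILEGED = 3  # legal professional privilege / highly confidential

# Each approved tool records the highest data classification it may
# process and whether a data processing agreement (DPA) is in place.
APPROVED_TOOLS = {
    "enterprise-llm": {"max_classification": Classification.CONFIDENTIAL, "dpa": True},
    "public-chatbot": {"max_classification": Classification.PUBLIC, "dpa": False},
}

def may_process(tool: str, classification: Classification) -> bool:
    """Return True only if the tool is approved for this data class."""
    entry = APPROVED_TOOLS.get(tool)
    if entry is None:
        return False  # unapproved tools are prohibited by default
    if classification >= Classification.CONFIDENTIAL and not entry["dpa"]:
        return False  # confidential client data requires a DPA
    return classification <= entry["max_classification"]
```

The design choice worth noting is the default: an unlisted tool is refused, so the burden sits with approval rather than prohibition — which is how the policy itself should read.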

Sector-specific regulatory considerations

Different professional services sectors face different regulatory requirements for AI governance.

Legal services. The Law Society of Ireland and the Bar of Ireland have both begun to address AI use in legal practice. While formal guidance is still developing, solicitors and barristers are subject to existing professional conduct obligations — including obligations around accuracy, competence, and disclosure — that apply to AI-assisted work regardless of whether specific AI guidance has been issued.

Financial services. Firms regulated by the Central Bank of Ireland face AI governance obligations under the Central Bank's operational resilience framework, as well as potentially under the EU AI Act where AI is used in credit assessment, investment advice, or other regulated activities. AI policies for regulated financial services firms need to be aligned with the specific requirements of their regulatory framework.

Accountancy. Audit firms need to consider the implications of AI tools for audit independence and audit quality. Where AI tools are used in audit procedures, the methodology needs to be documented and defensible. Firms should also consider whether AI tool providers constitute related parties for independence purposes.

Quality control and output review

Any AI policy for a professional services firm needs a section on quality control. This is not a generic data quality requirement — it is a professional obligation.

The policy should define:

  • Which workflows may use AI tools
  • What review is required before AI-assisted outputs are used or delivered
  • Who is responsible for that review
  • How AI assistance is documented in the firm's work records

The documentation requirement is worth emphasising. If a firm is later required to demonstrate the basis for advice it provided, the presence of AI in the workflow will be relevant — and the absence of documentation of how that AI was overseen will be a problem.
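What such documentation might capture can be sketched as a minimal record structure. The field names here are hypothetical illustrations of the oversight trail described above, not a prescribed schema.

```python
# Hypothetical sketch of a minimal AI-assistance record attached to a
# piece of client work. Field names are illustrative, not prescriptive.

from dataclasses import dataclass
from datetime import date

@dataclass
class AIAssistanceRecord:
    matter_ref: str         # firm's matter or engagement reference
    tool: str               # approved AI tool used
    purpose: str            # what the tool was used for
    reviewed_by: str        # professional responsible for the review
    review_date: date
    review_notes: str = ""  # what was checked and what was corrected

record = AIAssistanceRecord(
    matter_ref="M-2025-0143",
    tool="enterprise-llm",
    purpose="first draft of background research memo",
    reviewed_by="A. Partner",
    review_date=date(2025, 3, 12),
    review_notes="Citations verified against primary sources; two corrected.",
)
```

The point is not the format but the habit: a record like this, kept at the time, is what makes the firm's oversight demonstrable later.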

Training and competence

A policy document is only effective if the people subject to it understand it. Professional services firms should build a defined training requirement into their AI governance framework — not a one-time awareness session, but ongoing training that keeps pace with the AI tools the firm is using and the regulatory environment governing them.

Training should address: what tools are approved for use, what data may be processed, what review requirements apply, and how to escalate concerns about AI-generated outputs.


Acuity AI Advisory works with professional services firms to build AI governance frameworks that reflect both the opportunities of AI adoption and the specific obligations of professional practice. The starting point is always a diagnostic — understanding what is currently in use before prescribing what governance should look like.
