Financial Services · May 12, 2026 · 8 min read

AI Policy for Financial Services: What Regulators Are Now Asking About

Financial services firms handle some of the most sensitive data categories in any industry — client account information, non-public financial data, regulated investment advice, and personally identifiable information subject to multiple overlapping privacy frameworks. When employees use AI tools without governance, each of those categories becomes an exposure. Here's what a financial services AI policy needs to cover.

Note: This article is for informational purposes and does not constitute legal, compliance, or regulatory advice. Financial services firms should consult qualified compliance counsel regarding their specific regulatory obligations.

Financial services regulators are not waiting for federal AI legislation to start paying attention to how firms govern AI use. The SEC, FINRA, state banking regulators, and insurance commissioners are all actively developing expectations — and exam teams are beginning to ask questions about AI governance during routine examinations.

This isn't hypothetical future risk. It's the current regulatory environment. And it extends well beyond the largest firms. RIAs, broker-dealers, credit unions, community banks, insurance agencies, and financial planning practices at every scale face these questions.

The regulatory landscape as of 2026

SEC

The SEC has published guidance and brought enforcement actions related to AI use in investment advisory contexts — particularly around "AI washing" (claiming AI capabilities that don't exist) and around suitability and fiduciary considerations when AI tools influence investment recommendations. Exam priorities for 2026 explicitly include AI governance and controls.

FINRA

FINRA has published guidance on the use of AI in broker-dealer operations, including expectations for supervision of AI tools used by registered representatives. The guidance emphasizes that firms are responsible for AI-generated content communicated to customers, and that supervisory obligations apply to AI-assisted communications the same as any other.

State regulators

Multiple states — including New York, California, and Colorado — have enacted or proposed AI-specific rules affecting financial services firms operating in those states. Texas enacted AI-related employer obligations effective January 2026. The patchwork is growing and requires monitoring.

Gramm-Leach-Bliley Act (GLBA)

GLBA's Safeguards Rule requires financial institutions to protect customer financial information. The FTC updated the Safeguards Rule to explicitly address vendor and service provider risk — which applies to AI vendors. Sharing non-public customer financial information with AI tools that lack appropriate data processing agreements may create Safeguards Rule exposure.

93% of IT leaders in financial services report concerns about data security risks from AI tools — yet less than 40% have formal governance policies that address them. — 2025 SaaS Management Index

The specific data categories that matter most

Financial services firms handle data categories that require specific treatment in an AI policy — not just the general "confidential information" language that works for most industries:

- Material non-public information (MNPI)
- Non-public personal information (NPI) protected under GLBA
- Customer communications subject to supervision and recordkeeping requirements
- Personally identifiable information (PII) subject to multiple overlapping privacy frameworks

What a financial services AI policy needs to cover beyond the standard

Explicit MNPI prohibition

Material non-public information must never enter any AI tool regardless of tier designation. This needs to be a named, standalone rule — not implied by a general "confidential information" prohibition. The consequences of MNPI exposure through an AI channel are severe enough to warrant explicit, emphatic treatment. This rule applies to all employees, not just those in investment advisory roles.

Customer communication supervision

If AI tools are used to draft, assist with, or generate communications to customers — including email, letters, or social media content — those communications are subject to the same supervision and recordkeeping requirements as any other customer communication. The policy should require that AI-assisted customer communications be reviewed and approved by a person with appropriate supervisory authority before they are sent. This isn't a new obligation — it's an existing one that needs to be applied explicitly to the AI context.

Investment advice and suitability guardrails

AI tools should not generate investment recommendations, suitability assessments, or portfolio guidance without explicit human review and approval by a licensed professional. Even when AI is used as a research or drafting aid — not as the decision-maker — the output should be treated as a draft requiring professional review before it influences client advice in any form.

Vendor and third-party AI risk

GLBA's Safeguards Rule requires financial institutions to oversee service providers who access customer information. AI vendors that receive NPI are service providers under this framework — which means your firm needs to conduct due diligence on their security practices, document that due diligence, and have contractual protections in place. This requirement applies whether the AI tool is a standalone product or an AI feature embedded in existing software your firm already uses.

Recordkeeping for regulated activities

Many regulated financial activities have specific recordkeeping requirements. If AI tools are used in those activities — generating analysis that informs investment decisions, producing communications to customers, or creating documentation of advisory processes — the records of that AI use may need to be retained and producible in the same way as other business records. Your policy should address whether and how AI-assisted work in regulated activities will be documented and retained.

The exam question you need to be able to answer

Regulatory examiners asking about AI governance in financial services firms are focusing on a core set of questions. Being prepared to answer them — with documentation — is the practical goal of your AI governance program:

- Does the firm have a written AI policy?
- Have employees acknowledged it?
- How does the firm supervise AI-assisted customer communications?
- How does the firm conduct and document due diligence on AI vendors that receive customer information?
- How are records of AI-assisted work in regulated activities retained?

A written policy with documented employee acknowledgment answers the first two directly. The others require process and operational controls — but they all start with the policy.

"Regulatory exam priorities for 2026 explicitly include AI governance and controls for registered investment advisers and broker-dealers." — SEC Examination Priorities, 2026

Generate a financial services AI policy in 10 minutes.

Shadow AI Policy generates a tailored AI acceptable use policy with financial services-specific data handling rules — including NPI, MNPI, and customer communication guidance — plus a tool tier list, acknowledgment form, and manager FAQ.

Generate my financial services policy →