EU AI Act Obligations Are Live: What Financial Services Firms Must Do Now
February 2025 marked the first major enforcement milestone of the EU Artificial Intelligence Act, with the prohibition of unacceptable-risk AI systems coming into force. For financial institutions operating in or into the EU, the AI Act is no longer a future concern — it is a present obligation.
What Came Into Force in February 2025
The prohibitions that took effect in February 2025 target AI systems deemed to pose unacceptable risks — including systems that manipulate human behaviour through subliminal techniques, exploit vulnerabilities of specific groups, or enable real-time biometric identification in public spaces for law enforcement. Financial services firms deploying AI in customer-facing contexts, credit decisioning, or fraud detection need to ensure their systems do not fall into these categories.
High-Risk AI in Financial Services
The AI Act classifies several use cases prevalent in financial services as high-risk, including AI used for creditworthiness assessment, life and health insurance risk assessment, and employment-related decisions. High-risk systems must comply with requirements covering risk management, data governance, technical documentation, transparency, human oversight, accuracy, and robustness — with conformity assessments required before deployment.
The GPAI Model Obligations
General Purpose AI (GPAI) models — such as large language models — used by financial institutions also carry obligations from 2 August 2025. Providers must maintain technical documentation, comply with EU copyright law, and publish summaries of the content used to train their models. GPAI models classified as posing systemic risk face additional obligations, including adversarial testing and incident reporting to the European AI Office.
What Financial Firms Should Do Now
Firms should begin with an AI inventory — cataloguing every AI system in use, its risk classification under the Act, and whether the firm acts as provider or deployer for that system. For high-risk systems, a conformity assessment and registration in the EU AI database will be required. Governance frameworks need to be updated to include AI-specific oversight, accountability, and incident management processes.
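The inventory step above can be sketched as a simple data model. This is a minimal illustration, not a prescribed format: the `AISystemRecord` fields, the risk tiers, and the `outstanding_actions` helper are all hypothetical names chosen here to show how a firm might track classification and compliance status per system.

```python
from dataclasses import dataclass
from enum import Enum


class RiskTier(Enum):
    """Hypothetical mapping of the AI Act's risk categories."""
    PROHIBITED = "prohibited"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"


class Role(Enum):
    """Whether the firm is the provider or the deployer of the system."""
    PROVIDER = "provider"
    DEPLOYER = "deployer"


@dataclass
class AISystemRecord:
    name: str
    use_case: str
    risk_tier: RiskTier
    role: Role
    conformity_assessed: bool = False
    registered_in_eu_db: bool = False


def outstanding_actions(record: AISystemRecord) -> list:
    """Return the compliance gaps implied by a system's classification."""
    actions = []
    if record.risk_tier is RiskTier.PROHIBITED:
        actions.append("decommission or redesign: prohibited practice")
    if record.risk_tier is RiskTier.HIGH:
        if not record.conformity_assessed:
            actions.append("complete conformity assessment")
        if not record.registered_in_eu_db:
            actions.append("register in EU AI database")
    return actions


# Example: a credit-scoring model the firm built itself (provider role).
credit_model = AISystemRecord(
    name="retail-credit-scoring-v3",
    use_case="creditworthiness assessment",
    risk_tier=RiskTier.HIGH,
    role=Role.PROVIDER,
)
print(outstanding_actions(credit_model))
```

A record like this gives each system an owner-assignable checklist, which is the shape most governance frameworks need before conformity work can be scheduled.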
How Surety Helps
Surety's AI-powered compliance platform can support financial institutions in managing their EU AI Act obligations alongside existing regulatory frameworks like DORA, GDPR, and PSD2. Our obligation management workflow enables firms to map AI Act requirements to internal controls, assign ownership, and maintain the audit trail that regulators will expect. Contact us to see how Surety handles the EU AI Act in practice.