Financial services companies are rushing to integrate artificial intelligence (AI) into their operations, but many are doing so without adequate governance frameworks or testing procedures.
This gap creates significant compliance and information security risks, as detailed in the 2024 AI Benchmarking Survey. The survey, a collaborative effort by ACA Group's ACA Aponix and the National Society of Compliance Professionals (NSCP), was unveiled today at the NSCP National Conference.
The survey, conducted online in June and July of 2024, collected insights from over 200 compliance leaders within the financial services industry. It focused on the deployment of AI tools and technologies and the compliance measures in place to manage the associated risks.
Despite the growing use of AI, only a small fraction of firms have established adequate controls to mitigate its risks. The data shows a concerning gap: just 32% of firms have an AI committee or governance group, a mere 12% have an AI risk management framework, and only 18% have formalised testing programmes for their AI tools. Additionally, the vast majority (92%) lack policies governing the use of AI by third-party service providers, exposing them to heightened cybersecurity, privacy, and operational risks.
Lisa Crossley, executive director at NSCP, commented on the findings: “We’re seeing widespread interest in using AI across the financial sector, yet there’s a clear disconnect when it comes to establishing the necessary safeguards.
“Our survey shows that while many firms recognize the potential of AI, they lack the frameworks to manage it responsibly. This gap not only exposes firms to regulatory scrutiny, but also underscores the importance of building robust AI governance protocols as usage continues to grow.”
The survey further revealed that 75% of respondents are either exploring or already using AI internally. The most frequent applications are in research, marketing, compliance, risk management, and operational support. Despite high adoption rates, 68% of those already using AI tools report no significant impact on their compliance programs, highlighting a disconnect between potential benefits and actual outcomes.
Moreover, cybersecurity and privacy issues topped the list of concerns about integrating AI into compliance programs, noted by 45% of respondents. This was closely followed by regulatory uncertainty and a lack of skilled AI professionals.
Carlo di Florio, president at ACA Group, emphasised the importance of third-party risk management: “The survey’s most concerning finding is the lack of policies governing third-party AI use.
“Regulators are heavily emphasizing third-party risk management, as we saw with the SEC’s Reg S-P updates, the SEC Cyber Rule, and the EU’s Digital Operational Resilience Act. ACA is actively helping clients build robust AI governance frameworks that align with regulatory demands, ensuring compliance and improving program efficiency. Without these measures, leveraging AI’s potential while remaining compliant will be challenging.”
The full results of the 2024 AI Benchmarking Survey will be discussed in a detailed webcast hosted by ACA and NSCP on November 7, 2024.
Copyright © 2024 FinTech Global