Is your board ready for the AI revolution? The EU AI Act will put Irish businesses to the test

Guest post by Keith Fenner, SVP and GM EMEA, Diligent

Business leaders have been anticipating its arrival, and now the world’s first comprehensive legal framework for regulating AI has been passed in Europe. The EU AI Act (AIA) is expected to become EU law this year, and organisations that use AI tools must prepare for the responsibilities and obligations that will follow.

While much of the AIA covers the providers of AI systems, the companies that use the tools have responsibilities to meet too: they are required to operate the technology as intended, with human supervision throughout.

The AIA is likely to become the global standard for AI regulation, banning certain uses of AI and mandating risk assessments, which will have a substantial impact on businesses. It enforces a risk-based approach and will apply to providers, importers, distributors, and businesses deploying AI systems in the EU.

These systems will be regulated according to their potential to cause harm to people. Those classified as limited-risk will face transparency requirements, while those deemed high-risk will be subject to additional governance obligations to ensure regulatory compliance.

While the US AI Executive Order sets out standards and guidelines, the AIA takes a more prescriptive approach with legally binding regulations. For instance, it imposes strict data protection requirements and bans specific uses of AI deemed to pose an unacceptable risk, such as social scoring. Irish business leaders and boards operating within the EU must comply with these regulations to avoid significant financial penalties and reputational damage.

Implications for Irish directors

A 2023 survey by The Institute of Directors (IoD) Ireland found that over 75% of directors and senior executives were unaware of the extensive scope of the proposed landmark EU legislation on AI, and that more than half did not have a board-approved AI and cyber security strategy in place. This suggests that many Irish directors are not yet prepared for the changes ahead.

This is particularly worrying given the penalties for non-compliance: fines of up to €35 million or 7% of an organisation’s global annual turnover, whichever is higher. Careful oversight is therefore essential in preparing for the new rules. To keep pace with developing AI regulation and stay ahead of risks, organisations need a consolidated view of governance, risk and compliance across the whole business to avoid these hefty fines.

Ireland’s chief economist has warned that AI is now classed as a ‘severe’ risk to Ireland’s economy, with the potential for large job losses in some sectors, and the WEF Global Risks Report 2024 highlights adverse outcomes of AI as a key risk. Boards must strike a balance between elevating the organisation through new technologies and complying with new regulation, so that they can do so safely and successfully.

Steps to comply

In Ireland, the government has established a new task force to respond rapidly to emerging technologies and to advise on policy changes. With the AIA now endorsed by all member states, the onus is on GRC professionals to follow suit and work on their governance strategies to become compliant.

The first step should be to develop and implement an AI governance strategy. This should be followed by an assessment to map, classify and categorise the AI systems an organisation uses, or has under development, according to the risk levels set out in the AIA.

Where a business has high-risk AI systems, it should perform a conformity assessment to determine whether the AIA’s requirements have been addressed before placing the AI system on the EU market. The board should be mindful of integrating appropriate safeguards and of informing the stakeholders and investors involved.

Next, GRC professionals will need to perform gap assessments between current policies and the new requirements, and determine whether the privacy, security and risk regulations they are already tracking can be applied to AI as well. Irish businesses must establish a strong governance framework, whether developing AI systems in-house or adopting third-party AI solutions, with top-down buy-in from the board, senior leadership and other stakeholders, including data protection officers.

To ensure the board and leadership consistently make responsible decisions on AI for their organisation, they should consider external education or certifications on governing AI. This can help senior decision-makers navigate the ethical and technological issues that come with the application of AI, and ensure the organisation maintains honest and credible practices.

AI’s potential

Although the new regulation may instil caution in GRC professionals, the benefits of AI to business objectives are undeniable. Diligent’s survey reveals that 60% of European business leaders believe their current investment in technology, data or AI has improved their decision-making processes, and more than 70% say their organisations will invest in more technology to aid their business intelligence capability over the next five years. AI tools can be used to automate tasks such as risk assessments and documentation, and to analyse vast amounts of data to identify potential vulnerabilities in systems.

Although the AIA introduces new layers of regulation for GRC professionals to navigate, it also opens doors for them to leverage the unmatched power of AI while remaining compliant. However, 60% of business leaders currently believe their boards lack understanding of how data or AI can be applied to improve decision-making. Unless organisations solve their data management issues first, no amount of AI is going to help businesses make better GRC decisions.
