Risks relating to artificial intelligence (AI); diversity, equity, and inclusion (DEI); and shortfalls in staff, training, and expertise are set to be among the biggest challenges facing compliance officers in 2023 and the years ahead, say practitioners.
Speaking as part of a panel discussion on compliance readiness for 2023 and beyond at Compliance Week Europe in Edinburgh, Scotland, in late October, Yaara Alon-Redl, head of legal EMEA at tech firm Jellysmack, told attendees compliance functions will need to better understand the risks inherent in the AI systems they use and ensure their organizations have plans in place to mitigate them before they cause harm.
“The problem with AI is that it relies on historic data to inform decisions today,” said Alon-Redl. “This can produce results that are sexist and racist or potentially biased in some other way. Compliance officers need to be aware of the risks and how they can be mitigated.”
Incoming legislation such as the European Union’s AI Act will regulate AI technologies and place an increased onus on internet platforms and tech firms to design solutions with safety in mind, so that systems posing an “unacceptable” level of risk are banned from use and those posing a high risk of causing harm are deployed narrowly and with safeguards.
Alon-Redl believes “the EU AI Act is likely to become the global standard in the way that the General Data Protection Regulation (GDPR) is for data” and warned companies will still need to monitor for potential bias or harm caused by using AI technologies even if tech vendors say the AI-based solutions are technically compliant.
“How far will compliance officers go in terms of checking whether the AI systems they are using are compliant? Will they simply ask if they are compliant with the EU AI Act, or will they also ask questions about how the tools are being used and why and what kind of oversight is involved? In my opinion, it has to be the latter,” she said.
Alon-Redl added DEI will be another key issue likely to be on compliance officers’ agendas in the immediate future. But she warned “companies need to be very careful that (DEI) does not just become about fulfilling quotas,” adding they should take a more proactive approach and “make sure there is (DEI) by design in any new product or service that we offer. This is particularly important in terms of AI development.”
Kelly Gama, legal and compliance officer (Europe) at Alstom, a French train manufacturer, agreed DEI is “the issue of the moment, but companies still don’t know how to use all the talent at their disposal.”
She also cast doubt on how quickly DEI policies will be embedded in practice. “We have not even narrowed the gender pay gap yet, so what kind of progress can you expect with (DEI) that will be meaningful?” she said.
Attendees also raised concerns over companies’ progress. One said: “A lot of companies say they support (DEI) but look at who gets hired. That’s the best indicator of the level of commitment companies really have.”
The panelists also raised the problem of compliance functions being expected “to do more with less.”
“If compliance is going to do all the work that it is expected to do and wants to do, then it needs to have the team in place to achieve this,” said Gama. “Getting more responsibilities and getting more powers produces other problems—how do we do the work with the resources we have? How do we develop or learn new skills? Where do we get the experience and expertise from?”
“Compliance needs to make the business case to get more resources if it is going to get more involved and make more of an impact,” added Alon-Redl.