According to a recent Gartner poll, 54% of organisations have appointed a head of AI or an AI leader, but there is significant variation in the professional backgrounds and skill sets of such leaders, with few holding the title of “Chief AI Officer”. What is clear is that most organisations today acknowledge the need to identify an individual responsible for AI, specifically for AI strategy and AI governance.
AI governance is multi-disciplinary: good governance draws on knowledge and experience from a wealth of disciplines, including data privacy, information security, cybersecurity and human rights. It is vital that this cross-section of disciplines be centralised and driven by senior leadership to ensure accountability within an organisation. Senior, centralised management also helps foster a good governance culture and raise awareness throughout the organisation from the top down. Having an appointed leader in the C-suite is a good way of ensuring that adequate resources are allocated in a way that allows for meaningful accountability.
At a minimum, the appointed leader needs to be someone with a good understanding of the technology and how it is deployed within the business. There have, of course, been significant regulatory developments worldwide in the field of AI over the last 18 months, so the appointed leader would also have to stay on top of such developments.
To date, privacy laws have been a key source of AI regulation, particularly in the EU, UK and US. Both the EU and the UK have highly developed privacy rules: the EU General Data Protection Regulation (GDPR) and the UK GDPR, which incorporates the GDPR into UK law. In the US, there are 13 comprehensive state privacy laws, with more likely to follow. A legal background in data privacy would therefore be highly beneficial. However, the use of technology, particularly AI, within a business can involve multiple legal practice areas, such as IP and commercial contracts, to name just two. Any appointed leader in this field would need some experience of such legal matters.
There is no right or wrong location for an AI governance leader or team, but we are seeing clients lean towards their existing data privacy and cybersecurity teams, with many Chief Privacy Officers taking on additional titles, or simply additional responsibilities, to oversee AI governance programs. This makes a lot of sense, as there are great synergies in the approach to regulatory compliance.
Data privacy and cybersecurity teams have prior knowledge, skills and experience in developing governance programs, including ensuring that appropriate processes are developed and that the relevant policies and procedures are put in place. It is worth noting that the European Data Protection Board (“EDPB”) recently released a statement on the role of data protection regulators in the framework of the EU AI Act.
According to the EDPB, the data protection regulatory bodies already have experience and expertise when dealing with the impact of AI on fundamental rights, in particular the right to protection of personal data, and should therefore be designated as the key regulatory authorities responsible for enforcement (the “Market Surveillance Authorities”) under the Act.
Sarah Pearce is a partner at Hunton Andrews Kurth