Generative AI: Information Commissioner’s Office publishes latest consultation

The Information Commissioner’s Office (“ICO”) has published the latest in its series of calls for evidence on generative AI models, focusing this time on individual rights. Previous calls for evidence have been commented upon here, here, and here.

The ICO explains that the call for evidence addresses how organisations developing generative AI should enable people to exercise the following rights: (1) to be informed about whether their personal data is being processed; (2) to access a copy of their personal data; (3) to have information about them deleted; and (4) to restrict or stop the use of their information. The ICO also makes clear that these rights apply to the processing of personal data in all contexts, from the training and fine-tuning of AI models, to the inputting of information by individuals, to the outputs generated by the model.

Taking each of the relevant rights in turn, the call for evidence explains that the right of an individual to be informed about the processing of their data is a prerequisite for the exercise of other information rights. In the context of generative AI, it is expected not only that individuals are informed about the collection and use of data obtained directly from them (such as the prompts they submit), but also that organisations have appropriate measures in place to govern circumstances in which personal data is gathered from other sources, such as through web-scraping in the process of developing the model. In the latter circumstance, the ICO recognises that it could well be impossible or disproportionate to expect each individual to be informed about the use of their specific data within a vast dataset. Having said that, organisations will still be expected to: (a) publish “specific, accessible information on the sources, types and categories of personal data used to develop the model”; (b) publish “specific, accessible explanations of the purposes for which personal data is being processed and the lawful basis for the processing”; and (c) provide “prominent, accessible mechanisms for individuals to exercise their rights, including the right of access and rights to rectification, to erasure, to restriction of processing and to object to processing”. The call for evidence invites views on other measures that developers of generative AI should take to safeguard individuals’ rights, stressing that organisations are expected to apply a “data protection by design and by default approach that ensures data protection principles are implemented effectively – including transparency, fairness and accountability”.

The ICO also expects generative AI developers to have “accessible, clear, easy-to-use, documented and evidenced methods to facilitate and respond” to requests by individuals to access a copy of the personal data held about them. If developers cannot comply with such a request (for example, because the data cannot be identified within a larger training dataset), they are required to explain to the individual why this is so. Developers will also need to consider how they can comply in practice with an individual’s rights to erasure, to restriction of processing, and to object to processing. Since these rights apply to the processing of all personal data, and at all stages (from training data to model outputs), the ICO recognises that compliance could be a challenge. It cites the practice of some developers using ‘input and output filters’ both to mitigate the risk that a generative AI model includes personal data in its outputs and to detect and amend specific user prompts that may include personal data. It invites views on whether such filters are sufficient to protect individuals’ rights, or whether there might be alternative means to suppress or remove personal data. More generally, the ICO also calls for evidence on “verifiable and effective methods that organisations are developing or using to meet their legal obligations in this area, to support innovation and the protection of personal data in the development and use of generative AI”.
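By way of illustration only, the sketch below shows in simplified form how an input or output filter of the kind the ICO describes might operate: the same redaction routine is applied both to a user’s prompt before it reaches the model and to the model’s response before it is returned. The patterns and function names are hypothetical, and real deployments would rely on far more sophisticated techniques, such as trained named-entity recognition, rather than simple regular expressions.

```python
import re

# Hypothetical, illustrative patterns only: production systems would use
# trained named-entity recognition rather than simple regular expressions.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def redact_personal_data(text: str) -> str:
    """Replace any matched personal-data pattern with a placeholder tag."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

# Input filter: applied to the user's prompt before it reaches the model.
prompt = "Please email jane.doe@example.com or call +44 20 7946 0000."
print(redact_personal_data(prompt))
# -> Please email [EMAIL REDACTED] or call [PHONE REDACTED].

# Output filter: the same check applied to the model's response before
# it is returned to the user.
output = "You can reach John on 0207 946 0958 for more details."
print(redact_personal_data(output))
# -> You can reach John on [PHONE REDACTED] for more details.
```

As the example makes plain, pattern-based filtering catches only the most obvious identifiers (the name “John” survives untouched), which is precisely why the ICO asks whether such filters are sufficient or whether alternative means of suppressing or removing personal data are needed.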

To read the call for evidence in full, click here.