Five Practical Tips For Commercial Facial Recognition Technology
Commercial retailers with large warehouse-style outlets, health providers who are trying to guard against theft of medicines, or employers who just want to keep their people safe.
The range of applications for facial recognition in the commercial sector keeps growing. Yet most of the headlines are still about the well-publicised use of the technology by police or local authorities. As a result, the ICO’s recent opinion on the commercial use of facial recognition (part of a general move towards being more encouraging and supportive of new and emerging technologies) is particularly welcome.
Here we look at five practical tips for businesses who are thinking about adopting face recognition technology, based on some of the main elements of the ICO’s guidance:
1. The controller must identify a specified, explicit and legitimate purpose for using [face recognition] in a public place
Yes, it’s a legal requirement. But it also just makes good sense. This is technology that has the potential to be intrusive and should only be used where you have a really good reason to do so. Polling, referred to in the ICO’s opinion, shows that the public want face recognition schemes to have a defined purpose, and ideally one which is seen as having a social benefit.

2. The controller must identify a lawful basis and meet its requirements
Consent might be OK as the basis for processing face data when you are unlocking a phone, or for leisure applications. But for most purposes it is not realistic to expect to collect the informed individual consent of everyone whose face data might be captured. Unless the technology is being deployed to comply with a legal obligation, the most likely basis is therefore going to be legitimate interests. As a minimum, this is going to mean a separate legitimate interest assessment, on top of the privacy impact assessment that you should always be carrying out.
3. The controller must consider alternative measures and demonstrate that they cannot reasonably achieve their purpose by using a less intrusive measure
– this is a consideration that will be familiar to anyone who has deployed CCTV. Minimising the impact on individuals whose images might be caught by the system should always be a priority, even if that means using a different solution.
4. The [face recognition] system should be technically effective and sufficiently statistically accurate
– this is an enormously important factor, and all the more so if the system is going to be used for identification of wrong-doers or safeguarding of children, for example. Don’t be afraid to challenge your supplier to demonstrate that their system has a level of accuracy that you are comfortable with. Check what measures are in place to guard against the ingrained bias that can be a real issue in less sophisticated face recognition or AI systems.
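The accuracy question is ultimately a statistical one, so it helps to know what to ask a supplier for. As a purely illustrative sketch (the figures and group labels below are hypothetical, not drawn from the ICO’s opinion), the two headline error rates are the false match rate (an impostor wrongly accepted) and the false non-match rate (a genuine person wrongly rejected), and both should be broken down by demographic group, since a marked gap between groups is exactly the kind of ingrained bias worth challenging:

```python
# Minimal sketch of an accuracy check a deployer might ask a supplier to
# demonstrate. Each record is one hypothetical match attempt: the system's
# decision, the ground truth, and a demographic group label used to check
# for uneven error rates across groups.

def error_rates(attempts):
    """Return (false_match_rate, false_non_match_rate) for a list of
    (predicted_match, true_match) pairs."""
    impostors = [p for p, t in attempts if not t]   # should all be rejected
    genuine = [p for p, t in attempts if t]         # should all be accepted
    fmr = sum(impostors) / len(impostors) if impostors else 0.0
    fnmr = sum(1 for p in genuine if not p) / len(genuine) if genuine else 0.0
    return fmr, fnmr

# Hypothetical evaluation data: (predicted_match, true_match, group)
results = [
    (True, True, "A"), (False, True, "A"), (True, False, "A"),
    (True, True, "B"), (True, True, "B"), (False, False, "B"),
]

# Overall rates, then the same rates per demographic group: a large gap
# between groups is a warning sign of bias in the underlying model.
overall = error_rates([(p, t) for p, t, _ in results])
by_group = {
    g: error_rates([(p, t) for p, t, gg in results if gg == g])
    for g in {g for _, _, g in results}
}
print("overall FMR/FNMR:", overall)
print("per group:", by_group)
```

In practice you would expect a supplier to produce this kind of evidence from large-scale independent testing rather than a toy dataset, but even a simple breakdown like this makes clear whether the system’s error rates are uniform across the people it will actually encounter.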
5. The controller must be transparent and provide clear information about how they are processing personal data
– as with CCTV, it will almost never be appropriate to have this technology in use without being open and transparent about it. Understand that people will be concerned about the impact on them, and think about clear and straightforward explanations that can easily be provided (in pictorial signs, audio announcements or written notices) to address their concerns and reassure them about the security, accuracy and integrity of the system.
Of course there is a lot more to it than that. But if you are thinking about deploying face recognition technology and you, or your supplier, are not confident about how even these preliminary points would be addressed, you probably ought to pause and get some further guidance before proceeding.
If you would like to find out more, join us for a FREE webinar on 7th October at 11am.
Will Richmond-Coggan is a Director in the data protection team at Freeths LLP, with nearly two decades of experience in the field. He specialises in advising data controllers and technology developers (including those working with facial recognition) on a wide spectrum of contentious and non-contentious data protection and privacy matters.