What you need to know
- Automated recognition systems and facial recognition technology bring great benefits in the transport sector.
- However, if these systems are deployed in a non-compliant way, employment claims, privacy complaints and regulatory enquiries could follow, and a dispute could arise under the transport operator contract.
- The solution lies in balancing the public benefits of these systems with the potential data privacy risks by way of a comprehensive privacy impact assessment.
- Data privacy reform is underway and may result in further tightening of the rules.
Introduction
Automated recognition systems are a powerful tool for ensuring safety, preventing fraud and protecting the well-being of passengers across Australia’s wider transport networks. However, some of these systems collect biometric data, the use of which is regulated by federal and state laws. Despite the obvious benefits, a good reason for deploying such systems will not, on its own, satisfy the underlying compliance requirements.
About biometric data
Biometric data is any distinctive and measurable characteristic of the human body or pattern of behaviour, such as one’s fingerprint, facial features, voice or gait. It can be used to uniquely identify an individual in a range of different scenarios. Because it is intrinsically linked to the individual and cannot readily be changed, it is treated as ‘sensitive information’ under the Privacy Act 1988 (the Privacy Act).
The Office of the Australian Information Commissioner (OAIC) has taken action in respect of privacy breaches concerning biometric data. Jeremy Lee v Superior Wood Pty Ltd [2019] FWCFB 2946 is an example of how rushed innovation can result in lengthy employment proceedings.
Use cases
Some of the use cases for automated recognition in the transport sector include:
- Alerts about the conductor’s loss of focus and fatigue
- Automated fare deduction
- Incident detection and disclosure to law enforcement
- Identifying a blacklisted individual
- Property damage prevention
- Real-time passenger analytics combined with vehicle telematics
Not all use cases create a biometric template or result in biometric identification of individuals. However, most of these technologies may result in continuous location tracking and other surveillance of individuals. Where personal information is used, the Australian Privacy Principles (APPs) must be complied with, subject to certain exemptions.
The key to data privacy compliance
The APPs require organisations such as transport operators to actively adopt measures to protect personal information from misuse, interference and loss, and from unauthorised access, modification or disclosure. Without a clear use case and a good understanding of the underlying technology, it will be difficult to adopt appropriate measures.
This is where a privacy impact assessment (PIA) becomes relevant. The purpose of the PIA is to systematically assess the risks and impacts of the technology on the privacy of individuals. The PIA helps justify the deployment of the technology and identify mitigations for any adverse effect on individuals’ privacy.
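In practice, the output of a PIA is often recorded as a risk register mapping each use case to the data it captures, the risks it creates and the mitigations adopted. The sketch below is a rough, hypothetical illustration of such a register in code form; it is not an OAIC template, and every field name and value is an assumption made for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class PiaRiskEntry:
    """One row of a hypothetical PIA risk register (field names are illustrative)."""
    use_case: str          # e.g. "driver fatigue alerts"
    data_collected: str    # what personal or biometric information is captured
    purpose: str           # the clearly defined function or activity it serves
    privacy_risk: str      # the adverse effect on individuals being assessed
    likelihood: str        # e.g. "unlikely" / "possible" / "likely"
    impact: str            # e.g. "low" / "medium" / "high"
    mitigations: list[str] = field(default_factory=list)

# Example entry -- the content is invented for illustration only.
register = [
    PiaRiskEntry(
        use_case="driver fatigue alerts",
        data_collected="in-cab facial images, processed in real time",
        purpose="safety of drivers and passengers",
        privacy_risk="continuous in-cab monitoring perceived as covert surveillance",
        likelihood="likely",
        impact="medium",
        mitigations=[
            "on-device processing with no retained biometric template",
            "clear and prominent notice to drivers",
            "short retention period and strict access controls",
        ],
    ),
]
```

Whatever form the register takes, the value lies in forcing each use case to be justified and mitigated individually rather than approving the technology as a whole.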
What does the law require?
Among other things, the legal requirements include:
- Legal justification. A strong reason is required for the processing of biometric data, a so-called “permitted general situation”, such as a serious threat to safety or taking appropriate action in relation to serious misconduct or unlawful activity. An effort to save money alone may not justify the deployment. Without such a strong reason, consent will be required. Any personal information in automated recognition systems must only be used as reasonably necessary for the transport operator’s clearly defined functions or activities.
- Consent. Privacy consent can be express or implied. The difficulty with implied consent is that it often fails to give individuals a genuine choice where there is no alternative but to say yes. The OAIC warns against consent assumed purely on the basis of prior notice or a lack of objection. Individuals have the right to withhold consent.
- Transparency. Individuals must be adequately informed before collecting any sensitive or other personal information under the APPs. Intrusive technologies may increase the level of transparency needed to comply with the law.
- Fairness. Under the APPs, covert, excessive, unexpected or disproportionate monitoring might not be fair. Transport operators should consider if the individual would expect the information to be collected from them directly rather than through monitoring.
- Secondary purposes. Information collected for one purpose must not be used or disclosed for another purpose without consent, unless the individual would reasonably expect the secondary use or disclosure.
- Supply chain due diligence. The automated recognition service provider must be subject to a contract imposing appropriate data privacy obligations and requiring it to provide information and assistance to enable the transport operator to comply with the APPs and deal with enquiries and complaints.
- Data rights. Transport operators must retain the ability to give individuals access to, and correction of, their data on request. This will not be possible without a process in place and the technology provider’s assistance.
- Australia’s AI Ethics Principles. Automated recognition systems are powered by machine learning and artificial intelligence (AI) technologies. A system used in Australia should comply with the principles, particularly if the transport operator is bound by a state contract imposing obligations in relation to these technologies. The aim of the principles is to minimise errors, particularly false positives, which could result in discriminatory outcomes (see the illustration after this list). It will be essential for the transport operator to impose appropriate contractual obligations on the technology service provider.
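To see why false positives matter at the scale of a public transport network, a back-of-the-envelope calculation helps. Every figure below is an invented assumption for illustration, not the measured performance of any real system:

```python
# Hypothetical base-rate illustration: even an apparently accurate matcher
# can produce mostly false alerts when the watchlist is tiny relative to
# the passenger population. Every figure below is an assumption.
daily_comparisons = 1_000_000   # assumed face comparisons per day
watchlist_size = 100            # assumed individuals of interest, all assumed to travel
false_positive_rate = 0.001     # assumed: 0.1% of non-matches are wrongly flagged
true_positive_rate = 0.99       # assumed: 99% of genuine matches are flagged

true_alerts = watchlist_size * true_positive_rate                           # ~99
false_alerts = (daily_comparisons - watchlist_size) * false_positive_rate   # ~1,000

precision = true_alerts / (true_alerts + false_alerts)
print(f"Share of alerts that point to the right person: {precision:.0%}")   # roughly 9%
```

On these assumed figures, roughly nine in ten alerts would point at the wrong person, which is where the fairness and discrimination concerns arise and why contractual accuracy and testing obligations matter.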
What are the risks?
People’s perception of the technology may aggravate the risks, particularly if there is a lack of transparency, suspected function creep (ie data is used for unforeseen secondary purposes), incontestable automated decisions or doubts about proper information security. According to the OAIC’s research, 51% of Australians feel uncomfortable with the use of biometric analysis to identify (for example) an individual’s emotional state.
One-to-one facial verification will be less intrusive than one-to-many facial identification. Camera positioning, lighting and other capture conditions will also be important.
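The distinction can be illustrated in code. The sketch below assumes that face embeddings have already been produced by some recognition model; the function names, the gallery structure and the 0.6 similarity threshold are hypothetical choices for illustration only.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_one_to_one(probe: np.ndarray, enrolled: np.ndarray,
                      threshold: float = 0.6) -> bool:
    """1:1 verification: is the probe the same person as one enrolled template?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify_one_to_many(probe: np.ndarray, gallery: dict,
                         threshold: float = 0.6):
    """1:N identification: compare the probe against every enrolled template.
    More intrusive, because everyone in the gallery is searched every time."""
    best_id, best_score = None, threshold
    for person_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id  # None if nobody in the gallery matches
```

Verification only ever touches the single template the individual has enrolled, whereas identification necessarily processes the biometric templates of every person in the gallery, which is why the latter attracts closer scrutiny.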
Different risks will arise in relation to the transport operator’s employees and passengers. Generally, failure to comply with the law could give rise to a number of risks:
- Complaints to the OAIC, the Fair Work Commission or the Australian Human Rights Commission.
- Claims under employment law, breach of confidence, negligence, etc. The ‘employee record’ exemption under the Privacy Act will likely not apply to biometric data collected by the system.
- Regulatory scrutiny from the OAIC, state transport department and other relevant regulators.
- Contractual fallout with the state transport department.
- Cost of dealing with complaints, claims and inquiries.
- Unfavourable publicity, reputational damage and resulting loss of passengers or workers.
- Unwanted scrutiny from business partners, customers and suppliers.
Conclusion
Automated recognition drives innovation in the ground public transport sector. It helps to improve safety and security, increases convenience in payment of fares, streamlines workplace processes, aids in compliance with safety requirements under the operator contract and the law, and saves money.
However, new technologies also come with legal risks. Data privacy compliance is not difficult, but it does not happen by itself. A PIA might help identify and mitigate risks; taking your technology provider’s word for it may be a mistake. A lack of data privacy and cyber compliance could result in unexpected exposure to damages, fines, and costs. The cost of non-compliance will likely outweigh any cost of compliance.
The OAIC recently expressed its preference to introduce a more stringent legislated standard on data privacy, including the handling of sensitive biometric information. Regular compliance audits will likely prepare the transport operator for any future regulatory enquiries.