Natalie Bravo is an IPilogue Writer and a 2L JD Candidate at Osgoode Hall Law School.
Facial recognition technology (FRT) is an increasingly popular and controversial tool used by public authorities and commercial institutions. FRT expands the surveillance capabilities available for investigative and security work, and it collects vast quantities of biometric information with minimal cost or effort. Sensitive, identity-based data of this kind is particularly valuable.
These databases and data collection methods are not without risk. Reports of racial bias and Canadian privacy law violations weaken the argument for implementing FRT. On June 10, 2021, the Office of the Privacy Commissioner of Canada (“OPC”) issued a 29-page special report on the Royal Canadian Mounted Police’s (“RCMP”) FRT-based surveillance of the Canadian public. The special report specifically investigates the RCMP’s use of Clearview AI (with FRT), pursuant to section 39(1) of Canada’s Privacy Act.
What is Clearview?
Clearview AI (“Clearview”) is an American company that has amassed a vast catalogue of facial images with associated location information. Users with Clearview accounts can access these images for matching purposes. In October 2019, the RCMP confirmed that it had acquired Clearview licences and made use of free trials. The OPC subsequently received a complaint under the Privacy Act. Clearview has also had its fair share of legal troubles in the United States.
The OPC investigation and report engages the Privacy Act, specifically section 4: “No personal information shall be collected by a government institution unless it relates directly to an operating program or activity of the institution.”
Further, the Personal Information Protection and Electronic Documents Act (“PIPEDA”) applies to “private-sector organizations across Canada that collect, use or disclose personal information in the course of a commercial activity.” Commercial activity is defined by law as “any particular transaction, act, or conduct, or any regular course of conduct that is of a commercial character, including the selling, bartering or leasing of donor, membership or other fundraising lists.”
Alberta, British Columbia, and Quebec have their own privacy laws that may apply instead of PIPEDA. Most organizations within these provinces rely on provincial privacy laws, except for “transactions involving personal information transferred across borders,” or “federal works, undertakings or businesses (FWUBs) such as banks, telecommunications and transportation companies,” where PIPEDA applies.
2020 Investigation into Clearview
On February 21, 2020, the OPC, along with privacy authorities in Alberta, British Columbia, and Quebec (“the Offices”), began investigating Clearview, their FRT database, and database disclosures pursuant to section 11(2) of PIPEDA. Clearview’s collection practice was found to contravene privacy laws in all investigating jurisdictions. This investigation provided much of the backdrop for the subsequent RCMP investigation. As outlined in the PIPEDA Report of Findings #2021-001, the Offices set out to identify whether Clearview:
- “obtained requisite consent to collect, use and disclose personal information; and
- collected, used and disclosed personal information for an appropriate purpose.”
The Commission d’accès à l’information (CAI) also sought to determine if Clearview had:
- “reported the creation of a database of biometric characteristics or measurements.”
The OPC’s February 2021 report on Clearview’s facial recognition tool identified the following functions. The tool:
- “scrapes” images of faces and associated data from publicly accessible online sources (including social media), and stores that information in its database;
- creates biometric identifiers in the form of numerical representations for each image;
- allows users to upload an image, which is then assessed against those biometric identifiers and matched to images in its database; and
- provides a list of results, containing all matching images and metadata. If a user clicks on any of these results, they are directed to the original source page of the image.
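The matching step described above, reducing each face image to a numerical representation (an embedding) and comparing a probe image against a database of such representations, can be illustrated with a toy sketch. This is purely illustrative, assuming a generic embedding-similarity approach; it is not Clearview’s actual algorithm, and all names, vectors, and the threshold are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (hypothetical representations)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_matches(probe, database, threshold=0.9):
    """Return database entries whose embeddings are similar enough to the probe,
    sorted from most to least similar."""
    scored = [
        (name, cosine_similarity(probe, embedding))
        for name, embedding in database.items()
    ]
    return sorted(
        [(name, score) for name, score in scored if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

# Toy "database": each scraped face image reduced to a numerical representation.
database = {
    "photo_A": [0.90, 0.10, 0.30],
    "photo_B": [0.10, 0.80, 0.50],
    "photo_C": [0.88, 0.12, 0.31],
}

# Embedding of an image uploaded by a user, to be matched against the database.
probe = [0.89, 0.11, 0.30]
matches = best_matches(probe, database)
```

In this sketch, `photo_A` and `photo_C` are near-duplicates of the probe and are returned as matches, while the dissimilar `photo_B` is filtered out; a real system would link each result back to its original source page, as the report describes.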
The OPC found that Clearview’s database contains over three billion images of faces with biometric identifiers, including pictures of Canadian faces (including children) collected without their knowledge or consent. Clearview allows law enforcement and commercial entities to match people to online images within its database. The OPC found that the large image database “of biometric facial recognition arrays …[for] providing a service to law enforcement personnel, and use by others via trial accounts, represents the mass identification and surveillance of individuals by a private entity in the course of commercial activity.” The OPC stated that police authorities can “essentially” subject billions of people to a non-consensual “24/7 police line-up”.
The OPC concluded that Clearview’s operations harm Canadians because they may cause detriment to individuals whose photos are used without their explicit and informed consent. The method by which images were scraped from web pages, among other Clearview activities, was also found to be “unreasonable”.
The OPC provided three recommendations for Clearview to better comply with federal and provincial privacy laws: “(i) cease offering its facial recognition tool to clients in Canada; (ii) cease the collection, use and disclosure of images and biometric facial arrays collected from individuals in Canada; and (iii) delete images and biometric facial arrays collected from individuals in Canada in its possession.”
Clearview disagreed with the OPC’s findings, noting that it had withdrawn from Canada during the investigation, and did not commit to the recommendations. Clearview also suggested that the OPC suspend the investigation and refrain from publishing the report.
RCMP Investigation in the Special Report: Police Use of Facial Recognition Technology in Canada and the Way Forward
Since Clearview itself was found to contravene privacy laws, the RCMP’s use of Clearview’s technology was likewise found to contravene the Privacy Act’s collection provisions.
Notably, the RCMP initially told the OPC that it did not use Clearview; that claim proved false and concerned the OPC. According to Clearview’s records, the RCMP made hundreds of search requests through the database across at least 19 accounts. The OPC assessed the RCMP’s “controls to ensure it complies with Section 4 of the Act when it collects personal information in novel ways and from new sources.” It found that the RCMP failed to properly ensure that its use of Clearview’s technology complied with the Privacy Act. Further, the RCMP did not report any system implemented to “track, identify, assess, and control” Clearview’s data, which represented a serious lack of care regarding the sensitive information collected. Ultimately, the OPC recommended that the RCMP implement “systemic measures” and “pertinent training” within a year to handle (any) novel collections of data.
The RCMP did not agree that it violated section 4 of the Privacy Act. In fact, it argued that under the Act, it has no duty to ensure the legal compliance of private third parties like Clearview. The RCMP nonetheless agreed to the OPC’s recommendations in an effort to improve its operations.
Soon after the OPC launched its RCMP investigation, the RCMP worked internally to address some of the issues. It restricted its use of Clearview and started the “National Technology Onboarding Program” to examine how novel investigative techniques comply with the Privacy Act and the Canadian Charter of Rights and Freedoms. As of July 2020, Clearview stopped offering its services in Canada, and the RCMP stopped using it altogether.
In light of this report, the OPC published the “Draft privacy guidance on facial recognition for police agencies” (“the Draft”) to provide provincial, regional, federal, and municipal police agencies with more detailed privacy compliance information. The Draft offers a privacy framework, guidance on lawful authority, and direction on data management covering collection, retention, security, transparency, accountability, and more.
The OPC also offers up-to-date information on the accuracy of FRT and its algorithms, emphasizing the unreliability and prevalence of false results. Similarly, the Draft underlines “data minimization”: agencies should collect and retain only what is necessary, rather than cast a wide net. The Draft is a thorough document with many references to specific case law and related authority. It demonstrates the importance of privacy in Canadian society and the seriousness with which Canadian officials treat consent, surveillance, and novel technology. FRT may yet evolve into a useful tool, but until the OPC’s recommendations are met, the RCMP will need constructive and careful effort before using FRT again.