Facial Recognition Technology and the Retail Sector: Opportunity or Liability?

Facial recognition technology has recently come under greater scrutiny. In February 2020, the RCMP admitted to using Clearview AI’s facial recognition technology, prompting the Office of the Privacy Commissioner of Canada (OPC) to launch a full investigation into whether that use violates federal privacy law.

Far less attention has been paid to how retailers are using the same technology to improve customer loyalty and increase sales.  For example, Saks Fifth Avenue has been using facial recognition technology to identify VIPs in its Toronto store.

For brick-and-mortar retailers, facial recognition technology holds tremendous value. Marketing analysts have described how shoppers interact and behave within the physical retail environment as a virtual “black box”: unlike online shopping, the in-store experience generates little data about how customers browse and decide. Facial recognition technology could provide marketers with the insights they’ve been missing.

From an IP commercialization perspective, patents represent only a fraction of facial recognition technology’s value. The data extracted by facial recognition algorithms is extremely valuable in its own right. However, who can claim ownership rights over this data is not entirely clear-cut.

Who Owns the Data?

Under Canadian law, individuals do not own their personal information. Instead, personal information is protected by a number of different privacy laws at the federal and provincial levels. With few exceptions, the federal Personal Information Protection and Electronic Documents Act (PIPEDA) would apply to most retailers.

The economic importance of data has led to increased discussion about the need to create property rights in data. In this case, however, that may not be necessary: it is at least arguable that individuals could claim rights over the personal information collected by a facial recognition technology program by virtue of their personality rights.

What are Personality Rights?

Personality rights recognize that individuals have the right to protect their image, name, and voice from commercial exploitation. Four provinces (British Columbia, Manitoba, Saskatchewan, and Newfoundland and Labrador) protect this right through their privacy legislation, and Quebec provides statutory protection through its Civil Code. In Ontario and Alberta, the right is governed entirely by the common law.

Personality rights are recognized as property rights that are owned by individuals. Like other forms of property, they can be licensed and even inherited upon death.

The limited Canadian cases on wrongful appropriation of personality have involved celebrities or other well-known figures, though celebrity is not a requirement. This raises the question of whether ordinary individuals could license their personality rights to companies. Licensing personality rights could open a new revenue stream for individuals, and even for the data brokerages that already buy and sell personal data. For companies, however, it could create the burdensome task of ensuring that the appropriate permissions have been obtained.

Using someone’s personality for commercial purposes without their consent is a tort under the common law. If successful, a plaintiff may be entitled to an injunction and damages. Case law suggests that a successful cause of action requires that (i) the plaintiff can be identified; and (ii) their image was used for the purpose of commercial gain.

In Gould Estate v Stoddart Publishing Co, the court narrowed the scope of the tort to “endorsement-type situations.” This does not necessarily limit the cause of action to celebrities endorsing products. The flexible language leaves the door open for courts to consider whether tracking and analyzing a customer’s shopping preferences in order to tailor marketing to them amounts to an “endorsement-type situation.”

While some companies may be able to anonymize the data so that individuals cannot be identified, that will prove more difficult for retailers relying on facial recognition technology precisely to identify customers within their loyalty programs.

Moreover, using someone’s facial identity to increase sales is a primary objective behind the retail sector’s use of the technology. In New York, for example, athletic footwear giant Reebok uses an algorithm to snap photos of shoppers in-store and rapidly build a profile that tracks their emotional cues, such as how interested they are in a particular product. Reebok has expressed hope that these insights could be used to customize the ads customers see while they’re in the store.

Remaining Competitive Post-COVID-19

The value of facial recognition technology to retailers may be even more important in the aftermath of COVID-19. Although the full financial ramifications of COVID-19 for the global economy are not yet known, competition among retailers has always been tough, and the pressure to lower customer acquisition costs and retain existing customers is now even higher. Customer acquisition cost (what a business must spend on marketing to win a new customer) is a critical metric that can shed light on a company’s performance and future success. Anything companies can do to lower their marketing costs, while increasing their precision and enhancing loyalty, will likely help them rebound from the economic crisis.
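
As a rough illustration (the numbers here are hypothetical, and the exact formulation varies from business to business), customer acquisition cost is commonly calculated as:

CAC = total sales and marketing spend ÷ number of new customers acquired, over the same period

So a retailer that spends $50,000 on marketing in a quarter and wins 1,000 new customers has a CAC of $50, and any tool that attracts those customers for less spend lowers that figure.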

Facial recognition technology offers retailers the promise of improved customer service, lower costs, and more efficient marketing. While tempting, the technology comes with strings attached: it requires ongoing maintenance and vigilance on the retailer’s part to ensure compliance with privacy laws and broader public policy goals. Even if personality rights are not engaged, retailers must still be aware of their legal responsibility to safeguard individuals’ personal information under privacy legislation such as PIPEDA.

In a WIPO paper on trends in AI, Kay Firth-Butterfield, Head of AI and Machine Learning at the World Economic Forum (WEF), cautioned companies to be aware of the potential problems that AI can introduce, noting that substantial brand value can be lost if the wrong decisions are made about the use of AI. Firth-Butterfield stressed that the fast pace of change surrounding the technology requires companies to start thinking about regulatory and governance mechanisms now, not later.

The OPC is currently reviewing and revising its guidance on companies’ use of artificial intelligence. As more jurisdictions enact privacy legislation in response to the growing use of artificial intelligence in commercial settings, businesses may face additional hurdles before they can fully implement the technology. Companies could see new compliance requirements that reflect or closely align with legislation already in force in other jurisdictions, such as Europe’s General Data Protection Regulation (GDPR).

The added expense of safeguarding the information and training staff to use it properly may not justify taking on the risk, particularly for small- to mid-sized businesses that cannot absorb the costs and administrative burdens as easily.

If there is a silver lining for companies, it may be that courts have previously recognized that PIPEDA serves a distinct purpose that can be distinguished from other federal and provincial privacy legislation. Namely, PIPEDA must balance protecting individuals’ privacy with the need for commercial organizations to collect and use personal information. 

So while the OPC is unlikely to issue recommendations that stifle commercial activity, retailers should expect to take extra precautions to ensure that highly sensitive personal information is protected.

Ultimately, it will be for retailers to decide if the benefits outweigh the costs.   

Maggie Vourakes is a JD candidate at Osgoode Hall Law School.