Explainer: Are Australian retailers using facial recognition software on their unknowing customers?

Imagine that your every move, every expression, every purchase could be tracked without your knowledge, with this information shared or sold for marketing to make you spend more money. This is already happening to us when we shop online, use a streaming service – and also, as it turns out, when we shop in-store.

It has been revealed that some of Australia’s largest retailers, including Kmart, Bunnings and The Good Guys, are using facial recognition technology (FRT) on their customers.

Consumer group CHOICE has referred these companies to the Office of the Australian Information Commissioner (OAIC) for investigation into potential breaches of the Privacy Act.

So how is our data being collected?

While Amazon and Google are tracking us through clicks, searches and online purchases, these stores are using in-store video surveillance to capture images and videos of customers.

This includes images captured by camera and video surveillance in-store, as well as at store entrances and in car parks. This biometric data can also be shared with overseas third-party service providers for use and storage.

A survey from CHOICE found that 76% of Australians aren’t aware that their biometric and financial data are being captured and potentially mined in this way. In the Australian Community Attitudes to Privacy Survey, commissioned by the OAIC, 70% of Australians said they are concerned about their data privacy, and 87% want more control and choice over the collection and use of their data.

What could it be used for?

While Kmart, Bunnings and The Good Guys have all told CHOICE that their FRT is used for “loss prevention”, this data could also easily be used for targeted advertising – something retailers would frame as “enhancing” our shopping experience, while simultaneously being a major invasion and manipulation of our privacy.

For example, when searching for something on Google, or buying something from Amazon, those companies can use your browsing and purchase history to build a customer profile in order to market purchases more effectively to you. It’s possible that the Australian retailers could be using our biometric data gathered in-store to gain insight into customers – for example, looking at our facial expressions to gauge reactions to advertising, sales and new products in-store.

Major retailers like Kmart already factor “shrinkage” – the percentage of goods lost to mishandling and theft – into their profit margins, but how they intend to use FRT to mitigate this loss is unknown. Why wouldn’t they simply turn footage over to the police if illegal activity was happening in-store?

Facial recognition technology is already used by Australian federal agencies – including the Australian Federal Police, the Australian Security Intelligence Organisation and the Department of Home Affairs – to monitor and prevent criminal activity. But FRT is fraught with biases, in large part because it is often trained on data sets lacking diversity, and this has led to wrongful arrests.

How is our privacy currently protected?

Currently in Australia there is no dedicated law on the use of FRT, but we do have some protection through privacy laws.

Under the Privacy Act 1988, our biometric information can generally only be collected with our consent (with few exceptions), it must be given a high level of privacy protection, and it must be destroyed or de-identified when no longer needed. But how this translates into reality seems to be loosely interpreted.

In practice, “obtaining our consent” amounts to a small sign at the store entrance in Kmart’s case, and an online privacy policy in Bunnings’. Rather than being given an active choice about handing over our data, as online cookie prompts now offer, we are all automatically opted in. If you don’t like the sound of this, you can easily opt out of digital marketing by changing your browser settings, but to opt out in-store the only thing you can do is try emailing retailers directly.

Kmart Marrickville (NSW, Australia) entrance side view showing the privacy policy statement. Credit: CHOICE

Anonymising the data is another way to protect it, but this is unattractive to companies because they would lose the ability to personalise advertisements and content to individual consumers. Faces can be digitally altered to protect the anonymity of the person – but there is a trade-off between privacy and the usefulness of the data.
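As a rough illustration of the kind of face-blurring anonymisation described above, here is a minimal sketch using OpenCV’s bundled Haar cascade face detector. The file names, detector choice and blur strength are assumptions for illustration only, not how any retailer actually processes its footage.

```python
# Minimal sketch: anonymise faces in a frame of footage by blurring them.
# Assumes opencv-python is installed; "store_frame.jpg" is a hypothetical file name.
import cv2

# Load OpenCV's bundled Haar cascade face detector (a basic, widely available model).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("store_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect face bounding boxes in the greyscale frame.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Heavily blur each detected face region so individuals can't be identified,
    # while the rest of the frame stays usable for, say, counting shoppers.
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)

cv2.imwrite("store_frame_anonymised.jpg", frame)
```

The trade-off mentioned above is visible here: the blurred footage still supports aggregate uses such as counting shoppers, but the ability to recognise – or re-market to – an individual is lost.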

In tests comparing FRT algorithms trained on computer-generated faces with those trained on real people, anonymity was protected but individual face-recognition performance was degraded. It’s also almost impossible to avoid using real data entirely: even computer-generated faces are based on real biometric data that could potentially be linked back to an individual. This includes the generation of deep-fakes, some of which look so real that even we cannot tell them apart.

Where to from here?

There are calls for the federal government to adopt the Australian Human Rights Commission’s 2021 recommendation for a moratorium (temporary pause) on FRT until laws are more clearly defined and safeguards are put in place. Hopefully, the OAIC will respond to CHOICE’s referral and investigate whether these major Australian retailers have breached our data privacy.

Until then, you could follow the example of American-born artist Jennifer Lyn Morone, who registered herself as a corporation in an attempt (and protest) to realise the market value of her private data. Facebook, meanwhile, made $167 billion in profit last year from our data – and all we got in return was the use of Facebook for “free”.
