As guests arrive at the Warilla Hotel in eastern Australia, a small camera equipped with facial recognition software scans their faces as part of a plan to tackle problem gambling.
The technology, which uses artificial intelligence (AI) to identify addicts who have applied to be excluded from gambling venues, will be rolled out across venues in the state of New South Wales next year.
Supporters say it will help curb problem gambling in a country where addiction affects around 1 percent of the population and annual losses run into the billions of dollars.
But the technology is “invasive, dangerous and undermines our most basic and fundamental rights,” said Samantha Floreani, program leader for the nonprofit group Digital Rights Watch.
“We should be exceptionally cautious about introducing it to more areas of our lives and it should not be seen as just a quick fix for complex social problems,” she said.
The Warilla Hotel did not respond to requests for comment. Its website states that it supports “responsible” gaming.
The organizers of the AI scheme, industry bodies ClubsNSW and the Australian Hotels Association NSW (AHA NSW), said “strict privacy protections” were in place.
‘The best chance’
Facial recognition systems use AI to compare live images of a person with an image database; in this case, a gallery of people who have voluntarily signed up for a “self-exclusion” scheme for problem gamblers.
If the camera matches someone in the state database, a staff member is alerted so the person can be denied entry to casinos or escorted away from slot machines in hotels and bars.
“We think this is the best chance we have to stop people who have self-excluded from entering venues,” said John Green, director of AHA NSW.
The data collected will be protected and encrypted and will not be accessible to third parties, including the police and even gambling venues, Green said.
However, digital rights groups said the technology would be ineffective at stopping problem gambling and could be used for broader surveillance, adding that such projects underscore the need for tougher privacy and data rights laws to protect citizens.
“People who opt into self-exclusion programs deserve meaningful support, rather than have punitive surveillance technology foisted on them,” said Floreani of Digital Rights Watch.
“And those who have not opted into these programs should be able to go to the pub without having their faces scanned and their privacy undermined.”
Digital rights advocates want Australia’s Privacy Act 1988 to be amended to better address the use of facial recognition technology and clarify when and how it can be used.
Facial recognition technology is increasingly used around the world for everything from unlocking mobile phones to checking in for flights. It has also been adopted by some police forces.
Advocates say it helps maintain public order, solve crimes, and even find missing people.
Critics say there is little evidence that it reduces crime and that it carries an inherent risk of bias and misidentification, especially for darker-skinned people and women.
Gaming industry bodies have said facial recognition cameras will only be used to enforce the self-exclusion scheme.
But a bill introduced in the NSW parliament last month, which would formally legalize the technology in clubs and pubs, includes language that would permit other uses, such as banning patrons for being too drunk.
“There’s a scope-extension capability, the capability of this facilitating more uses,” said Jake Goldenfein, a senior lecturer at Melbourne Law School who studies technology.
He called for more regulation on facial recognition due to the sensitivity of the data captured and the increased risks of data leaks.
“Face templates are…something we can’t change. If we lose control over our biometric information, it becomes particularly dangerous,” he said.
Advocates of gambling reform have pushed for measures such as reduced opening hours at gambling venues and limits on the value of bets.
The use of facial recognition technology is the industry’s way of delaying such reforms and is unlikely to have a “practical effect” on problem gambling, said Tim Costello, chief advocate of the Alliance for Gambling Reform, a pressure group.
“Clubs are trying to appear proactive… it’s a complete window dressing to stop real reform,” he said.
Green at AHA NSW said a survey of self-excluded gamblers found more than eight in 10 respondents felt using facial recognition would be effective.
There is a growing backlash against facial recognition in Europe, the United States and elsewhere, with companies like Microsoft and Amazon ending or reducing sales of the technology to police.
In Australia, retail giants Bunnings and Kmart stopped using facial recognition technology to monitor customers in their stores earlier this year after the country’s privacy watchdog opened an investigation into whether they had broken the law.
Consumer rights group CHOICE, which referred the brands to the regulator, said the technology was “unreasonably intrusive” and “customers’ silence cannot be taken as consent” to its use.
The Australian Human Rights Commission last year called for facial recognition technology to be banned until it is better regulated with “stronger, clearer and more specific” human rights protections.
“There are questions that the existing law doesn’t have very good answers to,” said law professor Goldenfein.
“There are so many ways to help problem gamblers that the idea that facial recognition technology is the solution is, frankly, ludicrous.”