Facewatch supports the BSIA as a full member and is working with its team to publish an industry guide on the use of automated facial recognition (AFR) and the data protection laws that currently exist in the UK and Europe. In a recent round-table discussion, many of the opportunities and challenges for the technology were aired; Nick Fisher was one of the key speakers.

 

Watch the broadcast here:

BRC unveils Shopworkers’ Protection Pledge to protect retail workers against crime

Over 400 incidents of violence and abuse against shopworkers occur every day

11 cross-party MPs put their name to the pledge for the launch


The BRC has launched the Shopworkers’ Protection Pledge in an effort to support the legislation necessary to protect retail workers against crime and violence.

On Wednesday, a total of 11 cross-party MPs put their names to the pledge at the launch, and the BRC is calling on MPs from all parties to add their names to the cause.

The pledge aims to improve legislation after the BRC Crime Survey found that over 400 incidents of violence and abuse against shopworkers occur every day.

http://brc.org.uk/news/corporate-affairs/shopworkers-protection-pledge/ 

THE PLEDGE

Over 400 retail workers face violence and abuse in the workplace every single day. The British Retail Consortium Crime Survey shows an increasing problem of abuse, threats and violence facing the millions of people who work in our shops, serving our local communities. These incidents are often the result of challenging shoplifters, enforcing age restricted sales and recently, implementing coronavirus safety measures.

These victims of abuse carry their experiences with them for a lifetime. It affects them, their colleagues, and the families they go home to. Retail workers don’t just serve the community, they are the community and have been ‘Hidden Heroes’ during the coronavirus pandemic, working tirelessly to keep the nation fed and supplied with the items we have all needed.

As elected Members of Parliament, we have a duty to protect retail workers, ensuring that those who needlessly assault shop staff face the full force of the law. No one should have to face violence or threats in their workplace.

I pledge to champion shopworkers in my constituency and:

  • Recognise the serious impact that violence and abuse has on shopworkers and the local communities they serve.
  • Stand with retail workers to support legislation to better protect them.

To sign the pledge, email chantelle.devilliers@brc.org.uk.

Facing down shop crime

Big Interview with Asian Trader magazine: Nick Fisher, MD Facewatch

After starting his career at the RAC and moving on to Dixons and Phones4U, Nick Fisher has found his vocation in making shops safe for staff again. Here is the story of Facewatch as told to Andy Marino

Facewatch HQ is in London, but when I call Nick Fisher he is at home in Derbyshire, enjoying the countryside despite grey skies, and happy that it’s easy to go for a bike ride any time he wants.

It seems fitting that we are speaking just now. Demonstrations have turned into riots in London and other UK cities, and shop crime in the convenience sector has almost doubled during lockdown – criminals have found their other stores of choice closed (except to looting), so they have descended on local shops instead.

Facewatch, after all, is a gotcha technology that shop thieves and thugs will be powerless against, or at least that’s the theory. I want to know all about it: can retailers at last have access to a powerful tool to fight back against the tide of criminality that seems to be overtaking us – a tide Nick Fisher actually describes as a tsunami?

Nick has been vocal, not least in this magazine, about the need to thwart the rise of crime, but he has decided that the best way he can intervene is to leave the streets to the police.

“The police use facial recognition and there’s been a number of legal cases (all funded and supported by civil liberties groups) where the police have been taken to court for not securing permission or failing to notify – there was an instance in King’s Cross where the Met put up cameras, probably inappropriately, without telling anybody and no signage,” he says.

“The fundamental difference with public CCTV is that Facewatch is private, for businesses,” he says, as we talk about the toppling of statues, “so the rioters are observed by the police and we have nothing to do with that.”

With all this hullabaloo over privacy rights and GDPR, it’s a minefield that he doesn’t want to step into. He sees the police and government suffering terribly from laws and red tape and has found a genius way to stay above the fray.

Private property, says Nick, is fundamentally different from the streets, and the attitude of Facewatch is that when you are in my shop, it’s my rules – and it’s all legal.

“I say to these civil liberty groups, who often seem to want to defend criminals, that I will arrange for you to go and work in a store so you can experience it for yourself and it might change your view,” he says, adding that they never take him up on the offer.

The beginning

Nick Fisher CEO Facewatch

“I come from a retailer’s background,” says Nick, who spent 30 years in the industry. “I helped out Carphone Warehouse for six months, because I had the experience of ten years at Dixons, and during that period a friend of mine introduced me to a guy called Simon Gordon who owns Gordon’s Wine Bar.”

Gordon’s is in the passage beside London’s Charing Cross Station that leads down to the Thames Embankment. Its vaulted cellars are evocative and lit with candles. Plenty of city folk and tourists drink there, and so plenty of pickpockets and bag-snatchers would also turn up. Gordon had founded Facewatch because he was sick of the police giving up on crimes that were happening in his bar.

“It started off as a digital crime-reporting platform, a bit like Pubwatch, and he’d been at it for five-plus years. He’d made a lot of progress but wasn’t making any money.”

Nick told him what he was doing was commendable but not commercial. First because he didn’t charge his subscribers enough and second because his principal model was reporting crime to the police, who weren’t interested in those types of crimes because they didn’t have the resources.

“The police are very public about that,” opines Nick. “In the five years since then it’s got even worse and they’ve said they’re not coming out for less than £100 or £200.”

He told Simon that from a retailer’s perspective, what they want is just for the crimes not to happen in their store.

“The challenge with CCTV – and I speak from experience, having run an estate with 700 shops – is that it’s observational technology designed to be mounted in the roof. They have HD recorders because you are supposed to record somebody pinching something and then intervene,” says Nick.

“But if you do, it’s a really complex situation. And if you miss the incident, then notice later on you’ve lost £60-worth of meat and sausages, there’s nothing you can do about it except spool through footage looking for when it happened.”

The step after that is the hardest: to get the law interested. “The police will tell you to burn it on a disk and you’ll never hear from them again.” You’ve wasted an hour of your life and lost £60.

“We needed to come up with something proactive not reactive,” Nick concludes.

The available facial recognition tech was good enough if the subject stared into the camera – as you do at passport control – but wouldn’t help with a hoodie slinking into a store with his head down.

Nick was working on a new kind of digital algorithm that could get past this problem.

“[Gordon] liked my idea and asked me to write a business plan, so I did and we raised a first round of funding to develop the technology and the platform, and examine the lawfulness of holding stored and shared data.”

The next round, which was bigger and successfully completed last October, was to commercialise the product and take it to market: “We raised a substantial amount of money,” says Nick, indicating that investors were impressed.

Since then, marketing has got under way. Facewatch now has agreements with two key distributors, the established CCTV companies Vista and DVS: “So we don’t have to go out and build our own sales force – we now have distribution partners who have their own CCTV installers and massive customer bases, and they take the Facewatch proposition to their client.”

What comes next, the big campaign?

“That’s where we are now, the early days of going live,” says Nick. “We’ve got a number of licences already out there and the amount of interest is incredible. Obviously with retailers having just been through Covid-19, even with customers having to be two metres apart they are still seeing crime at a ridiculous level.”

Watching the Defectives

I ask Nick to explain how it works – not just the technical side but the legal side as well. How are both made criminal-proof?

“It’s about identifying individuals,” he answers, “low-level criminals, not murderers and rapists but people who come into your store and just verbally or violently abuse you, and/or steal from the store.” Paradoxically, the everyday nature of most of the crimes – and the fact that they happen on private premises – is the magic that makes Facewatch work.

Nick tells me about his new software – “periocular” software – that can recognise a covered face. “We’ve just developed it, and it will just take sections of the eyes, brows, cheekbones – and it’ll work with hats, sunglasses and masks on.”

The fact that crime is also mostly local can help, too. “If you speak to anyone in retail, anyone in the convenience sector, they’ll know the top five who come into their store and they probably know three of them by name.”

Facewatch is a bit like email, in that it becomes more effective and efficient the more users it has. “So Facewatch will tell you whenever they enter the shop, but it will also help you identify the others that aren’t regular but do operate within your geographical area.” This means that if a face is on the watch-list of a shop even eight miles away, the first time he enters your store, you will get an alert.

If the thief wraps up carefully it won’t help him if he has been photographed previously or somewhere nearby, says Nick: “The quality of algorithmic technology has gone through the roof in the last couple of years. There’s new releases of software coming out every ten weeks from the developers: it’s pretty advanced stuff now.”

Whether this is allowed all comes down to the legal small-print, which is watertight because Facewatch has gone over it with a fine-tooth comb at the highest levels.

“Part of our licencing contract is signing an information-sharing agreement with Facewatch, and that means we hold the data for the client and become their data controller. This is fantastic for the retailer because it means they don’t have to worry about anything. And then Facewatch shares the data proportionately with other retailers in the area who are Facewatch subscribers,” Nick explains.

“The object of Facewatch is to build this big, powerful database that we hold, store and share on behalf of our clients. We mitigate their risk, we have data-sharing rules, we have the relationship with the ICO [Information Commissioner’s Office] and other required bodies, to own and manage this data on our clients’ behalf.”

And in practice? “If there are three separate retailers on Bromley High Street and they all have Facewatch licences, and have each put ten thieves into the watch-list, they share between them a database of 30 in that area,” says Nick.
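
To make the Bromley example concrete, here is a minimal sketch in Python. It is purely illustrative – the data model, names and structure are assumptions, not Facewatch’s actual schema – and it simply shows how three stores’ individual watch-lists could be pooled into one shared list for an area.

```python
# Illustrative only: a toy model of pooling per-store watch-lists for an area.
# Names and structure are assumptions, not Facewatch's real schema.
from dataclasses import dataclass

@dataclass(frozen=True)
class WatchlistEntry:
    subject_ref: str   # pseudonymous reference to the stored facial data
    reported_by: str   # store that logged the incident
    incident: str      # e.g. "theft", "verbal abuse"

def pooled_watchlist(per_store_lists):
    """Union of every subscribing store's watch-list in the sharing area."""
    shared = set()
    for entries in per_store_lists.values():
        shared |= entries
    return shared

# Three stores on the same high street, ten entries each -> a shared list of 30.
bromley = {
    store: {WatchlistEntry(f"{store}-{i}", store, "theft") for i in range(10)}
    for store in ("store_a", "store_b", "store_c")
}
print(len(pooled_watchlist(bromley)))   # 30 -- each subscriber sees all of them
```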

Retailers like this because next to crime, their biggest fear – quite correctly – is data. They don’t know whether they should see it, whether they should be sharing it, or whether they should hold it or store it. Biometric data, such as faces, is classed as “special-category data”, and you are not allowed to possess it without explicit permission of the subject.

But Facewatch can.

“An employee might give you permission so he can open the office door but a thief certainly won’t give you permission to put them on a watch list,” Nick says. “And there’s a set of rules around GDPR and the proportionality of sharing data you have to satisfy to hold on to stuff. We take care of that.”

He says Facewatch has worked with the Home Office and the Ministry of Justice, the Joint Security and Resilience Centre (JSaRC) and the ICO, to make sure they are doing everything above board.

“We’ve had our GDPR information-sharing-DPA [Data Protection Act] compliance all signed off by the leading QC on cyber law in the UK, Dean Armstrong [of the 36 Group]. He marked our homework, so to speak. We’ve ticked all the right boxes and we are super-transparent in everything we do.”

It sounds as if the criminals’ biggest nightmare is about to become real. Rights are still protected, and you can ask Facewatch whether you are on the watch-list and they have to tell you – but if a thief wants to be removed from it, he would have to come up with better evidence than what is stored on the database.

“If that thief does a personal access request we can tell the day, date, time and camera number and the individual who put that report on the watch-list,” laughs Nick. “And then we’ll use the evidence to challenge the request to be removed from the list: We have evidence of you right here stealing £60 of meat and sausages on such a date, and if you want to contest it, go to the police and we can have a discussion about it.”

At the same time, the innocent have nothing to fear – and for once, it’s true. “You as an individual can make a subject-access request to Facewatch any time you want,” says Nick. “We’ve only had two requests at Gordon’s Wine Bar, and funnily enough they tend to be from older blokes who have been there with younger ladies!”

In reality, a person would only ever be on the watch-list if they had committed a crime or an assault, because it is illegal to store just any old data. Most gets immediately deleted.

Putting it into practice

It strikes me that the genius part of the Facewatch offer is not the tech, even though it is cutting-edge. As Nick says, many companies can offer the hardware needed to take good pictures of ugly mugs.

Rather, it is the legal and data ju-jitsu that Facewatch has accomplished that is key. It seems to have managed to cut through the tangle of permissions and loopholes exploited by criminals and their professional helpers to thwart traders and owners.

“All biometric companies in the UK sell the tech, but Facewatch are essentially selling you the data management service,” says Nick. “The tech is included, of course, because the tech sends the alert but the difference is this: If I, as a tech company, were to sell you a facial recognition system then I don’t want to look after the data. I just want you to buy the tech from me. The problem with that is, you can only use that tech legally with the permission of the people you want to put on the database. That works fine with membership stuff like gyms, where you look into a camera for admittance, for example. But every member has to consent to supply a picture of their face for comparison. The data would be held by that gym, that business. But they can’t use the data for anything else, such as crime prevention, because permission hasn’t been given by the individual for that.”

That’s the reason shop crime until now has been rampant: every incident has to be dealt with as a new thing rather than spotted and avoided at the outset. It is slow and inefficient and soul-destroying. There has been no legal means of automating pre-emptive detection – until now.

There are three rules of using facial recognition, Nick explains. One is necessity – you need to demonstrate your need for it. You can’t just collect it if you haven’t any crimes. Second, you have to be able to show it’s for crime prevention purposes if you are not getting permission to collect images from the people whose pictures you are taking. Third and most important, you can only hold that data if it is in “the substantial public interest” – so a business can’t do it just for themselves as that is not a substantial public interest.

“But Facewatch is an aggregator of information across businesses and geographic areas which acts as a crime prevention service and we share it in the public interest,” Nick says. “So you need to be a data aggregator, but a camera company can’t be that. Likewise a facial recognition software company can’t be a data aggregator because each company would have a different algorithm and database.”

But Facewatch sits right in the middle. “You just have to make sure all roads lead to Facewatch because that’s how the data becomes lawful to use.”

Big Brother is kept in his cage: there won’t be any nationwide watch-list. Proportionality rules keep it local – a database in London would have a radius of about five miles, but maybe 30 miles in Norfolk. “Guys who go into your readers’ stores and nick booze, deodorant, cheese and meat – they don’t travel more than five miles,” Nick confides. “They only start to travel when you start dispersing them. We put facial recognition into every store in Portsmouth and all the thieves went to Southampton – or to Tesco!”
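
A rough sketch of how such a proportionality radius might be applied in code, assuming nothing more than two store locations and a configurable radius. The haversine helper, the coordinates and the five-mile and thirty-mile figures are illustrative assumptions; Facewatch’s own rules will differ.

```python
# Hedged illustration of a radius rule: two stores only share watch-list data
# if they sit within a configurable distance of each other. Not Facewatch code.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3959.0

def haversine_miles(a, b):
    """Great-circle distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(h))

def in_sharing_zone(store_a, store_b, radius_miles):
    """True if the two stores are close enough to share watch-list alerts."""
    return haversine_miles(store_a, store_b) <= radius_miles

# Two central-London sites against an urban five-mile radius,
# and two Norfolk towns against a rural thirty-mile radius.
print(in_sharing_zone((51.507, -0.128), (51.513, -0.089), radius_miles=5))
print(in_sharing_zone((52.628, 1.297), (52.608, 1.730), radius_miles=30))
```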

I want it now!

 

“You could put it up pretty much instantly,” Nick explains when I ask exactly what a retailer would need to do to have Facewatch installed.

“They could either phone us or one of our distribution partners. The distribution partner would then introduce them to one of their re-sellers or installers who would then make an appointment with the retailer to see the site. They’ve got all the tools with them – you have to install the camera on a certain angle, at a certain pitch. It tells them what camera they need and measures the distance. It’s really simple and for a CCTV installer it’s a doddle of a job.”

Facewatch is sold on a per-camera, per-licence basis of £2,400 per annum (£200 a month, essentially), which includes all the hardware and licensing and all the sharing of data: everything they’ll need other than the installation and the camera itself. “A lot of retailers already have their preferred installer,” says Nick. “He has to be accredited by Facewatch, and if not, we’ll train him, and if a retailer has no installer, we’ll recommend one.”

It’s pretty straightforward, he adds. “In the same way as they would add more CCTV cameras to their store. Cameras are cheap, £150 or whatever. A standard HD camera is fine. It’s got to be of a certain lens-type and we’ve got an offer at the moment where the camera is free.”

The proposition is that with the average store leaking thousands of pounds a year in theft, Facewatch will easily pay for itself. “Everywhere we have deployed facial recognition, in the first 90 days, from a standing start, even without an initial watch-list, we have seen a greater-than 25 per cent reduction in crime with every deployment,” says Nick.

He adds that it even works with repeat offenders that you might not have put on the watch-list, because the thief is known by some other shop participating in Facewatch and they would see the sign in your window, effectively warning them off.

“We’ve even got a little [digital] tool that we give to our installers, and it works out the ROI,” Nick tells me. “So any store that loses more than £8,000 per year – which is at least half of them – will have an ROI in as little as 11 months, your money back inside a year. I’m just talking about the savings against crime, that’s not to mention all the time you would have had to commit preparing evidence to give the police, and all the hassle and stress you avoid, and managing the emotions of your employees who have been abused or assaulted.”
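
For readers who want to run the sums themselves, here is a back-of-the-envelope version of that ROI calculation. All the inputs are assumptions chosen for illustration (the article quotes a £2,400 annual licence, losses of £8,000 or more and a crime reduction of at least 25 per cent); Facewatch’s own tool will use its own figures.

```python
# Back-of-the-envelope payback calculation; illustrative assumptions only.
def payback_months(annual_loss_gbp, reduction_rate, annual_cost_gbp):
    """Months until the crime savings cover the annual licence cost."""
    monthly_saving = annual_loss_gbp * reduction_rate / 12
    return float("inf") if monthly_saving <= 0 else annual_cost_gbp / monthly_saving

# Example: GBP 8,000 of annual theft, a 40% reduction, and the GBP 2,400 licence.
print(round(payback_months(8_000, 0.40, 2_400), 1))   # 9.0 months
```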

I ask Nick for his message to the nation’s retailers.

“I can say this,” he answers. “I am not some bystander in an ivory tower who just has a viewpoint. I’ve been a retailer for nearly 30 years. What typically happens in a recession is that money gets tight and crime goes up. What we are about to see is incredible.

“The police are saying they just can’t spare the time to deal with it. That whole, ‘Your problem, not ours’ approach is going to be significantly worse after Covid-19 than before, and it was bad enough then. Facewatch is a really simple deterrent proposition and it’s easy to install. We look after all the data on behalf of our clients – there is a tsunami of crime coming our way.”

Retail Week, 18 March 2020

As the government takes increasingly stringent action to combat the ongoing coronavirus outbreak, retail staff have found themselves on the front line in the face of panic buying and contagion fears.

  • Labour MP Alex Norris calls on government to pass legislation protecting shopworkers from violence, abuse and assault
  • Morrisons CEO David Potts introduces numerous measures to protect staff, including statutory sick pay
  • Several retailers have called on consumers to treat colleagues with respect in the face of growing concerns around stockpiling

The number of confirmed cases of coronavirus in the UK is rising daily. The government by its own admission has begun imposing “draconian” measures and many consumers have ignored pleas to the contrary and cleared shelves of some products in a stockpiling frenzy.

While panic buying has mostly taken place in supermarkets and grocery convenience stores, instances of consumers hoarding hand sanitiser, soap and over-the-counter medicines have also hit health and beauty retailers.

Social media, Twitter in particular, has started to fill up over the last few days with stories of frontline retail staff working longer hours and coming face to face with fraught and sometimes abusive members of the public, all the while trying to do their best to keep shelves stocked and consumers happy.

On Monday, Labour MP Alex Norris stood up in Parliament and put forward legislation to protect shopworkers from rising levels of violence, abuse and assault.

Norris said the shopworkers were “on the front line” of abuse and crime, and those worries were likely to be exacerbated amid the growing panic about coronavirus, given the “significance retail workers have in our lives, particularly during this period”.

“With the current coronavirus crisis we would argue that retail staff are essential workers”

Paddy Lillis, Usdaw

Shopworkers’ union Usdaw has backed the call and said “retail staff are essential to our communities, particularly during the coronavirus crisis”.

Usdaw general secretary Paddy Lillis says: “We have always made the case that retail staff are at the heart of our communities, but with the current coronavirus crisis we would argue that they are essential workers.

“Usdaw members across the retail supply chain and in stores are working hard to keep shelves filled and serve customers. We understand this is a stressful time and remind customers that shopworkers deserve respect and that no level of abuse is ever acceptable. It should never be a part of the job.”

The BRC says it is working with police and partners to “keep retail sites running as smoothly as possible” and that “when circumstances are difficult, retailers are well-versed in providing effective security measures”.

In a statement issued to all Morrisons’ stakeholders yesterday, chief executive David Potts agreed and called on consumers to “treat our colleagues on the front line with the greatest respect”.

Morrisons boss David Potts asked customers to ‘treat our colleagues on the front line with the greatest respect’

Potts also called on customers to “please consider others even more so everyone can buy what they need, especially those who are most vulnerable in our society”.

A spokesman for another national grocer said there had been a handful of incidents across its estate, but it had not needed to take on extra security guards.

While some retail staff have faced abuse from consumers, others are also struggling with worries about getting the disease themselves – either from customers or from colleagues.

It is becoming a growing concern for businesses that frontline staff, as well as those working in key supply chain roles such as warehouse workers and delivery drivers, will fall sick or be forced to self-isolate as the virus continues to spread.

Earlier this week, in a call between representatives of Defra, supermarket chains and the wider food industry, the possibility of taking thousands of hospitality workers on secondment to work in food supply chains was raised, according to BuzzFeed News. While this could even increase the risk of spreading the virus, it will at least safeguard vital jobs and supply lines in the sector.

‘Amazing group of people’

Protecting staff from spreading the disease is becoming a top priority. The managing director of one high street food and beverage operator told Retail Week his staff are deep cleaning premises three times a day. Under normal circumstances, they would be deep cleaned twice in a month.

A spokeswoman from the Co-op says it has taken “immediate steps” to safeguard staff including building in “additional working hours for store colleagues to undertake more frequent handwashing throughout the day”.

Morrisons and Boots are among those to have implemented measures designed to enhance hygiene and staff safety.

A Boots spokeswoman says it has been “making sure that our stores, pharmacies, opticians and hearing care facilities are all clean and hygienic”. She also says teams in-store “have access to handwashing facilities and sanitiser”.

Boots has ensured staff have access to handwashing facilities and sanitiser

Morrisons yesterday announced a slew of measures designed to protect staff. In order to reduce the handling of cash by shopworkers, the grocer has asked customers to pay by card or smartphone “where possible”.

The grocer has “been issuing hand sanitiser” to all checkout workers in-store, significantly increased cleaning on surfaces that consumers and staff touch and redeployed staff “who are vulnerable to the virus”.

The retailer has also taken measures to protect staff who either fall ill from the virus or are forced to self-isolate and therefore can’t work by creating a ‘colleague hardship fund’. This will ensure all staff affected by the virus receive sick pay “whether or not they would normally be eligible”.

As the retail sector waits to see what measures will be brought in by the government next, many in the industry are rallying around frontline workers in these uncertain times.

Timpson chief executive James Timpson today described employees as “an amazing group of people who I’m going to need to lean on heavily over the coming weeks and months to keep the show on the road”. Other retail leaders will heartily agree and be doing their best on behalf of their people.

Less than a month ago, Vista CCTV became the newest Facewatch partner, distributing this game-changing technology to Vista Priority Partners (VPPs). In this series of Face to Face videos, Nick Fisher, CEO of Facewatch, and Dean Kernot, Sales and Marketing Manager at Vista CCTV, go face to face in conversation to explore the opportunity.

In a wide-ranging discussion, with probing questions from Dean and straight-talking answers from Nick, this huge change to the security landscape is explored. Fundamentally, Facewatch facial recognition is fast becoming the acceptable, affordable and compliant solution for any business wanting to deter crime and anti-social behaviour while providing a safer, more customer-oriented environment for visitors and staff.

The full Face to Face film:

The Facewatch story

The Problem Facewatch Solves

The Vista VPP program

The Accredited Partner Program

How do watch lists work?

Why Facewatch?


In an open debate at The Temple in London, the issues around facial recognition and its use by the police were examined by Fiona Barton QC.

Facewatch was invited to speak so that the attending barristers could learn more about this important crime deterrent. Presentations were given by Nick Fisher and our Data Protection Officer, Dave Sumner. The video gives the edited highlights of this wide-ranging presentation.

 

Event Video:

 

Presentations from:

Fiona Barton QC, 5 Essex Court

Nick Fisher, CEO, Facewatch

Dave Sumner, DPO, Facewatch

Fiona Barton QC

5 Essex Court Breakfast presentation

https://5essexcourt.co.uk/

The event:

https://5essexcourt.co.uk/resources/events

 

By Charles Hymas

Home Affairs Editor (The Daily Telegraph)

https://www.telegraph.co.uk/authors/charles-hymas/

When you enter Paul Wilks’ supermarket in Aylesbury, a facial recognition camera by the door snaps your image and then checks it against a “watchlist” of people previously caught shoplifting or abusing staff.

If you are on it, managers receive a “ping” alert on their mobile phones from Facewatch, the private firm that holds the watchlists of suspects, and you will be asked to leave or monitored if you decide you want to walk around the store.

This is not some Big Brother vision of the future but Budgens in Buckinghamshire in Britain 2019.

It is also stark evidence of the way that Artificial Intelligence (AI) technology is spreading without regulation, potentially intruding on our personal privacy.

For Mr Wilks, it has been a success. Since he introduced it at his 3,000 square foot store a year ago, he says shoplifting has fallen from ten incidents a week to two. The thousands he has saved have more than paid for the technology.

“As well as stopping people, it’s a deterrent. Shoplifters know about it,” says Mr Wilks, who has a prominent poster warning customers they will be subject to facial recognition. “As retailers, we have to find ways to counteract what is going on.”

As the retail sector loses £700 million a year to thefts, Facewatch gives store owners a “self-help” solution to the reluctance of police to investigate petty shoplifting.

It is the brainchild of businessman Simon Gordon, whose own London wine bar, Gordon’s, was plagued by pickpockets. Using AI technology provided by Intel, a US multinational, he has bold ambitions to have 5,000 high-resolution facial recognition cameras in operation across the UK by 2022.

His firm is close to a deal with a major UK supermarket chain and already has cameras being used or trialled in 100 stores, garages and other retail outlets.

The lenses are mounted by entry doors to catch a full clean facial image, which is sent to a computer that extracts biometric information to compare it to faces in a database of suspects.

Facewatch says there must be a 99 per cent match before the alert is sent to store staff and, in consultation with the Information Commissioner, Elizabeth Denham, it has introduced privacy safeguards, including the immediate automatic deletion of images of “innocent” faces.
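
In outline, the decision rule described here is simple. A minimal sketch might look like the following, where the function names, the way the threshold is encoded and the surrounding plumbing are all assumptions rather than Facewatch’s actual software.

```python
# Sketch of the alert-or-delete rule: alert the store only on a >=99% match,
# otherwise discard the captured image straight away. Hypothetical names.
MATCH_THRESHOLD = 0.99

def handle_capture(face_image, match_against_watchlist, send_alert, delete_image):
    """Compare a captured face with the watch-list and act on the result."""
    subject_ref, confidence = match_against_watchlist(face_image)
    if subject_ref is not None and confidence >= MATCH_THRESHOLD:
        send_alert(subject_ref, confidence)      # the "ping" to managers' phones
    else:
        delete_image(face_image)                 # non-matching faces are deleted

# Toy usage with stand-in components:
handle_capture(
    b"jpeg-bytes",
    lambda img: ("subject-42", 0.995),                     # pretend matcher
    lambda ref, conf: print(f"ALERT {ref} ({conf:.1%})"),  # pretend notification
    lambda img: print("image deleted"),
)
```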

“When CCTV first came out 25 or 30 years ago, people thought it was the end of the world, Big Brother and 1984,” says Stuart Greenfield, a Facewatch executive. “Now there are six million cameras in London. People either think they are not working or are there to stop terrorists. No-one really worries about it. Facial recognition is the same. Facebook, Instagram and the airports are using it. It is here to stay but it has to be regulated. Everything needs to be controlled because every technology can be used negatively.”

And there’s the rub. MPs, experts and watchdogs, like the Information Commissioner Ms Elizabeth Denham and Paul Wiles, the biometrics commissioner, are concerned that facial recognition technology is becoming established, if not widespread, with little public debate or regulatory scrutiny. They point to critical questions yet to be resolved.

When should facial technology surveillance be used, in what circumstances and under what conditions? And should consent be required before it is deployed?

Judges in a test case against its use by South Wales Police ruled taking a biometric image of a face is as intrusive as a fingerprint or DNA swab. More significantly, unlike with fingerprints or a swab, people have no choice about whether, where or when their biometric image is snapped.

South Wales Police are thought to have scanned more than 500,000 faces since first deploying facial recognition cameras during the Champions League Final at Cardiff’s Millennium Stadium in June 2017. The Met Police and Leicestershire police have scanned thousands more in their “trials.”

The test case in South Wales was brought by Ed Bridges, a former LibDem councillor, who noticed the cameras when he went out to buy a lunchtime sandwich. He said taking “biometric information without his consent” was a breach of his privacy when he was acting lawfully in a public place.

The judges, however, ruled use of the technology was “proportionate.” They said it was deployed in an “open and transparent” way, for a limited time to identify particular individuals of “justifiable interest” to the police and with publicity campaigns to alert the public in advance.

However, Mr Wiles, the biometrics commissioner, is not convinced that this test case alone should be taken as sufficient justification for a roll-out of police use of facial recognition. “I am not disagreeing with the South Wales Police judgement. What South Wales Police did was lawful,” he says.

“Some uses of Automated Face Recognition in public places when highly targeted – for example scanning the faces of people going into a football match against watchlists of those known to cause trouble in football matches in the past – that is arguably in the public interest.

“However, scanning everyone walking down the street against a watchlist of people you would like to arrest seems to be a bit more difficult because it gets near mass surveillance. I don’t think in this country we have ever really wanted to see police using mass surveillance. That’s what the Chinese are doing with facial recognition. There are some lines between legitimate use to protect people who have committed crimes against a rather different use. It is not for me to say where the line is. Nor should it be the police who say where it is. That’s the debate we are not having. I feel it is frustrating that ministers are not talking about it. And before we ask Parliament to decide, we need to have a public debate.”

Cases have already emerged where Mr Wiles’s line appears to have been crossed. Last year, the Trafford Centre in Manchester had to stop using live facial recognition cameras after the Surveillance Camera Commissioner intervened. Up to 15 million people were scanned during the operation.

At Sheffield’s Meadowhall shopping centre, some two million people are thought to have been scanned in secret police trials last year, according to campaign group Big Brother Watch.

The privately owned King’s Cross estate in London has also had to switch off its facial recognition cameras after their use became public. It later emerged the Met Police had shared images of suspects with the property firm without individuals’ consent and, apparently, without senior officers or the mayor’s office knowing.

Liverpool’s World Museum scanned visitors with facial recognition cameras during its exhibition, “China’s First Emperor and the Terracotta Warriors” in 2018, while Millennium Point conference centre in Birmingham – a scene of demonstrations by trade unionists, football fans and anti-racism campaigners – has deployed it “at the request of law enforcement.”

The Daily Telegraph has revealed that Waltham Forest council in London has trialled facial recognition cameras on its streets without residents’ consent, and even that AI and facial expression technology is being used for the first time in job interviews to identify the best UK candidates.

As its use widens, one key issue is its reliability and accuracy. The success of the technology’s algorithms in matching a face is improving and is good when there is a high-quality image – as at UK passport control – but less effective with CCTV images that do not always give a clear view.

The US National Institute of Standards and Technology, which assesses the ability of algorithms to match a new image to a face in a database, estimates that performance improved 20-fold between 2014 and 2018.

However, it is by no means infallible. In South Wales, police started in 2017 at a Champions League match with a “true positive” rate – where it got an accurate match with a “suspect” on its database – of just three per cent. This rose to 46 per cent when deployed at the Six Nations rugby last year.

Across all events where it was deployed, there were 2,900 possible matches of suspects generated by the facial recognition system, of which only 144 were confirmed “true positives” by operators; 2,755 were “false positives,” according to the analysis by Cardiff University. Eighteen people were arrested.

The researchers found performance fell as light faded and was less accurate if faces were obscured by clothing, glasses or jewellery. They concluded it should be viewed as “assisted” rather than “automated” facial recognition, as the decision on whether there was a match was a police officer’s.

Professor Peter Fussey, from Essex University, who reviewed the Met Police trials of the technology, said only eight of the 42 “matches” that they saw thrown up by the technology were accurate.

Sixteen were instantly rejected in the van as it was clear they were the “wrong ethnicity, wrong sex or ten years younger,” he said. Four were then lost in the crowd, which left 22 suspects who were approached by a police officer in the street to show their ID or be mobile-fingerprinted.

Of these, 14 were inaccurate and just eight were correct.

In the febrile world of facial recognition, how you measure success is a source of debate. It could be argued the system has accuracy in the high 90s, given the cameras scan thousands of faces to pick out the 42. Or you can measure it according to the ratio of “false positives” to accurate matches.
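
The arithmetic behind those two framings is worth spelling out. Using the Met trial figures quoted above (42 matches flagged, eight accurate) and assuming, purely for illustration, that 10,000 faces were scanned – the article says only “thousands” – the two measures diverge sharply:

```python
# The two framings of "accuracy", using the Met trial figures quoted above.
# The 10,000-scan figure is an assumption; the article says only "thousands".
matches_flagged = 42
matches_correct = 8
faces_scanned = 10_000   # assumed

precision = matches_correct / matches_flagged
false_alert_rate = (matches_flagged - matches_correct) / faces_scanned

print(f"alerts that were right: {precision:.0%}")                  # ~19%
print(f"scans producing a wrong alert: {false_alert_rate:.2%}")    # ~0.34%
print(f"'accuracy' over all scans: {1 - false_alert_rate:.1%}")    # ~99.7%
```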

On human rights grounds, Professor Fussey said his concern was consent and legitimacy. In Berlin and Nice, trials have been conducted with volunteers who signed up to act as “members of the public” to test the facial recognition technology.

By contrast, in the Met police trial in Romford, he saw one young man targeted after being seen to avoid the facial recognition cameras which were signposted as in operation. “He was stopped and searched and had a small quantity of cannabis on him and was arrested,” he said.

He may have acted suspiciously by trying to avoid the camera but he was not a suspect on any “watch list,” nor had he consented to take part in the “trial.”

“For me, one of the big issues is around the definition of ‘wanted’,” said Professor Fussey. “There is ‘wanted by the courts’ where someone has absconded and there is judicial oversight. Then there is ‘wanted by the police’ which is wanted for questioning or wanted on the basis of suspicion.”

South Wales Police was careful to prepare four watchlists: red – those who posed a risk to public safety; amber – offenders with previous convictions; green – those whose presence did not pose any risk; and blue – police officers’ faces to test the system.

However, human rights campaigners cite as an example police databases that hold the images of 21 million innocent people who have been arrested or in custody but never convicted.

It has been ruled such databases are illegal but so great is the task of processing and deleting them that progress on doing so has stalled.

So concerned is the Commons science and technology committee that in its recent report on face recognition it called for a moratorium on its use until rules on its deployment are agreed.

Professor Wiles sums it up:

“There are some uses that are not in the public interest. What that raises is who should make that decision about those uses. The one thing I am clear about is that the people who want to use facial recognition technology should not be the people who make that decision. It ought to be decided by a body that represents the public interest and the most obvious one is Parliament. There should be governance backed by legislation. Parliament should decide, yes, this is in the public interest provided these conditions are met. We have done that with DNA. Parliament said it’s in the public interest that the police can derive profiles of individuals from DNA but it’s not in the public interest that police could keep the samples. You can tell a lot more about a person from a sample. It’s important because if you get it wrong, the police will lose public trust and in Britain, we have a historic tradition of policing by consent.”

Ms Denham, the Information Commissioner, is to publish the results of her investigation into facial recognition, which she says should only be deployed where there is “demonstrable evidence” that it is “necessary, proportionate and effective.”

She has demanded police and other organisations using facial technology must ensure safeguards are in place including assessments of how it will affect people before each deployment and a clear public policy on why it is being used.

“There remain significant privacy and data protection issues that must be addressed and I remain deeply concerned about the rollout of this technology,” she says.

27 August 2019

Facial recognition technology burst into the headlines this month following an exposé in the Financial Times about its use in London’s King’s Cross.

The Information Commissioner’s Office has launched an investigation into the use of the technology, which scanned pedestrians’ faces across the 67-acre site comprising King’s Cross and St Pancras stations and nearby shopping areas, without their knowledge.

It is the latest controversy to embroil the technology. Manchester’s Trafford Centre was ordered to stop using it by the Surveillance Camera Commissioner, whose office works for the Home Office.

Information commissioner Elizabeth Denham said after details of the King’s Cross scheme emerged that she was “deeply concerned about the growing use of facial recognition technology in public spaces”.

“Scanning people’s faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all”

Elizabeth Denham, information commissioner

“Scanning people’s faces as they lawfully go about their daily lives in order to identify them is a potential threat to privacy that should concern us all,” she maintained.

“That is especially the case if it is done without people’s knowledge or understanding. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”

The European Commission is also understood to be planning new regulation that will give EU citizens explicit rights over the use of their facial recognition data as part of an update of artificial intelligence laws.

What’s it for?

So what does that mean for retailers that are either already deploying or are considering a roll-out of facial recognition technology in their stores?

Given the level of concern and scrutiny from regulators and the public alike about how such technology is used, can retailers deploy it in a way that adds value to their business without risking alienating customers?

Innovation agency Somo’s senior vice-president of product management Tim Johnson says: “There’s a very wide range of things [facial recognition] could potentially be used for. It is a very significant technology and a really seamless process that provides a strong form of identification, so it is undeniably big news.

“But at the moment it is a big muddle in terms of what it is for, whether it is useful or too risky and in what ways. We’ll look back on where we are now as an early stage of this technology.”

One area where facial recognition technology has been piloted by retailers is in-store to crack down on shoplifting and staff harassment.

“The only information held is on those who are known to have already committed a crime in the store previously”

Stuart Greenfield, Facewatch

According to the BRC, customer theft cost UK retailers £700m last year, up 31% year on year, while 70% of retail staff surveyed described police response to retail crime as poor or very poor.

Against that backdrop, retailers such as Budgens have rolled out tech from facial recognition provider Facewatch to stores across the South and Southeast, after a trial in an Aylesbury shop resulted in a 25% crime reduction.

Facewatch marketing and communications director Stuart Greenfield explains that clear signage is displayed throughout any store where the platform’s technology is used, and any data is held in Facewatch’s own cloud platform, not by the retailers.

“The only information held is on those who are known to have already committed a crime in the store previously; anyone whose face is scanned by the system and does not correspond to our existing watchlist is deleted immediately,” says Greenfield.

He believes it is the “combination of marketing, in-store signage and the system itself” which acts as a deterrent to shoplifting and staff harassment in stores where Facewatch’s technology is used.

Shopping centre operator Westfield has teamed up with digital signage firm Quividi, which analyses passersby’s facial data based on their age, gender and mood to determine which adverts are displayed as a means of driving customer engagement and sales. Shoe specialist Aldo and jeweller Pandora also work with Quividi overseas.

Quividi chief marketing officer Denis Gaumondie argues that the platform’s technology is not facial recognition – rather it is facial analysis, because it does not store any data on passersby and would therefore not recognise a repeat customer, or link their data to purchases.

He adds that it is the responsibility of Quividi’s retail partners to inform shoppers that the technology is in use.

Hot potato

However, DWF partner Ben McLeod, who specialises in commercial and technology law, says even using facial recognition or analysis technology in-store as described above could land retailers in hot water.

“There is a general prohibition on processing special category data [which may, for instance, include racial or ethnic origin] unless a specific exception applies,” he points out. “Many of the exceptions relate to the public interest which doesn’t really apply to retailers, particularly where the primary purpose for the use of the technology is marketing or to prevent stock loss.”

“Processing is possible where the data subject [the customer] has given explicit consent, but in practice, this will be difficult to demonstrate, as merely alerting customers to the use of facial recognition technology will not suffice.”

“Given that the basis on which the police are using surveillance technology is also currently subject to legal challenge, retailers are advised to tread carefully,” he cautions.

Opting in

Facial recognition technology is prompting controversy

Facial recognition has also been tried out by the Co-op to verify the purchase of age-restricted products such as alcohol at self-service checkouts. Customers found to be over 30 were allowed to complete the purchase without the need for verification by a member of staff.
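
The logic of such an age-estimation gate is straightforward. A minimal sketch might look like the following, where the over-30 threshold comes from the trial described above and everything else (function names, return values) is illustrative rather than the Co-op’s actual system.

```python
# Minimal sketch of an age-estimation gate at a self-service checkout.
# Threshold from the trial described above; everything else is illustrative.
AGE_THRESHOLD = 30   # customers judged to be over 30 complete the purchase unaided

def age_restricted_sale(estimated_age):
    if estimated_age > AGE_THRESHOLD:
        return "approved"          # no staff verification needed
    return "refer to staff"        # a colleague checks ID at the till

print(age_restricted_sale(34))     # approved
print(age_restricted_sale(24))     # refer to staff
```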

Johnson believes such use of facial recognition technology would be welcomed by many customers because it requires their specific consent, as was the case with the Co-op; the same would apply to verifying the purchase of a whole shopping basket using biometric data.

“People are comfortable with using facial identification on their own device [such as Apple’s Face ID], so using it as a means of verifying purchases in-store feels like a logical next step. It would speed up the check-out experience.”

Capgemini principal consultant Bhavesh Unadkat also points to the roll-out of Amazon Go stores in the US, which verify shoppers’ purchases and link them to their Amazon account using biometric data including facial recognition technology.

He explains that shoppers who download the Amazon Go app and then go into one of the checkout-free stores understand what technology is being used, and how it is benefiting them by providing an efficient shopping experience. The trade-off is clear and there is an “opt-in” to use the technology.

“I don’t think [retailers] can ask customers to opt out of facial recognition technology being used in-store, or just alert them to it being there,” he says.

“They need to ask shoppers to opt in and sell them the benefits they would get, such as a cashless checkout, more rewards, personalised offers to your mobile as you enter the store. Don’t go down the route of assuming people will never opt in and not communicating effectively, because if you get it wrong then the trust is broken.

“Right now we are making a mess of [facial recognition technology] because people are already paranoid about sharing information online and now feel like they are being victimised in a bricks-and-mortar environment as well.”

McLeod concurs with that view.

He says: “Amazon Go is the kind of thing where people are making a choice upfront by downloading the app. That is different from walking into a shopping centre or having the technology foisted upon you in a way that isn’t transparent.

“It becomes far more pervasive in that setting, but the more fundamental issue is there isn’t a strong legal grounding for the use of the technology.”

Right side of the law

Greenfield emphasises that Facewatch is working with the ICO to ensure its technology remains compliant with current and incoming regulations.

“We are pushing like mad for legislation as quickly as possible,” he says. “We want to do everything that is good for the technology because the reality is we cannot put the genie back in the bottle; [facial recognition] is out there and it will be used by someone, so we should have legislation to ensure it is used properly.”

Johnson advises retailers to collaborate closely with engaged suppliers and legislators, and tread carefully when deploying facial recognition technology, but does not believe that current controversies should deter retailers from using it for good.

He says: “I absolutely think [retailers] should still be exploring it. The current environment should make them fully aware of the risks, but it isn’t going away and the potential rewards are large, from crime prevention to age verification and flagging relevant products to customers.

“We’ll hopefully see a period of innovation which shows people what [facial recognition] is useful for.”

As the world’s first case against the controversial technology concluded, two leading judges dismissed the case brought by human rights campaign group Liberty on behalf of Ed Bridges, a Cardiff resident whose face was scanned by South Wales Police during a trial of facial recognition.

Lord Justice Haddon-Cave, sitting with Mr Justice Swift, concluded that South Wales Police’s use of live facial recognition “met the requirements of the Human Rights Act”. In a three-day hearing in May, Mr Bridges’ lawyers had argued that South Wales Police violated his human right to privacy by capturing and processing an image taken of him in public.

The judges also ruled that existing data protection law offered sufficient safeguards for members of the public whose faces were scanned by facial recognition cameras, and that South Wales Police had considered the implications.

Liberty lawyer Megan Goulding said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms.

“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.

“It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”

A spokesperson for the Information Commissioner’s Office said: “We will be reviewing the judgment carefully.

“We welcome the court’s finding that the police use of Live Facial Recognition (LFR) systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.

“Any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”