San Francisco is first US city to ban police use of facial recognition tech

Started by garbon, May 15, 2019, 03:47:34 AM


garbon

https://www.theguardian.com/us-news/2019/may/14/san-francisco-facial-recognition-police-ban

Quote: San Francisco supervisors voted to make the city the first in the United States to ban police and other government agencies from using facial recognition technology.

Supervisors voted eight to one in favor of the "Stop Secret Surveillance Ordinance", which will also strengthen existing oversight measures and will require city agencies to disclose current inventories of surveillance technology.

"This is really about saying: 'We can have security without being a security state. We can have good policing without being a police state.' And part of that is building trust with the community based on good community information, not on Big Brother technology," the supervisor Aaron Peskin, who championed the legislation, said on Tuesday.

Two supervisors were absent for Tuesday's vote. The board of supervisors is expected to vote on the new rules a second time next week, when they are expected to pass again.

Critics argued on Tuesday that police needed all the help they could get, especially in a city with high-profile events and high rates of property crime. The expectation of privacy in public space is unreasonable given the proliferation of cellphones and surveillance cameras, said Meredith Serra, a member of the resident public safety group Stop Crime SF.

But those who support the ban say facial recognition technology is flawed and a serious threat to civil liberties.

Matt Cagle, a technology and civil liberties attorney at the ACLU of Northern California, argued the legislation was a positive step towards slowing the rise of technologies that may infringe on the rights of communities of color and immigrants. "Face surveillance won't make us safer, but it will make us less free," Cagle told the Guardian after the proposal passed a committee vote last week.

The ordinance applies to a wider range of technology, including automated license plate reading and gunshot-detection tools. It also expands a 2018 law requiring the San Francisco public transportation system Bart to outline how it surveils passengers.

Speaking to the Guardian last week, Peskin said the new regulations were meant to address concerns about the accuracy of technology and put a stop to creeping surveillance culture.

"We are all for good community policing but we don't want to live in a police state," Peskin said. "At the end of the day it's not just about a flawed technology, it's about the invasive surveillance of the public commons."
"I've never been quite sure what the point of a eunuch is, if truth be told. It seems to me they're only men with the useful bits cut off."
I drank because I wanted to drown my sorrows, but now the damned things have learned to swim.


DGuller

Good that someone finally understood that changing technology requires changing legal norms.  Being able to map someone's whereabouts with facial recognition may in the most stupidly literal sense not involve any new invasion of privacy, but in the real sense it does.

Barrister

Look, facial recognition tech is a tool.  It's not proof positive.  It can say "it strongly looks like person X is also person Y".  It's not strong enough to constitute proof beyond a reasonable doubt in court.  As I understand the science, it can never be that conclusive.  I had a long conversation with some experts on this topic a couple of months ago when I had a case about someone applying for an ID under a fake name.

As such, I cannot understand why you would ban it from the law enforcement toolbox.  All it does is say "hey, this person looks a lot like this other person who committed a crime".  Without more, it cannot support a conviction.

Criminal justice is supposed to be a search for the truth.  It is not a game where police are supposed to fight with one hand tied behind their back.  If the police cannot prove an offence beyond a reasonable doubt, so be it.  But let them use every tool in the tool box.
Posts here are my own private opinions.  I do not speak for my employer.

garbon

I think someone in this thread is a tool. ;)

"Let them use every tool in the tool box." :lmfao:


dps

I don't think that BB gets the concerns about loss of privacy when tech like this is used for general surveillance.  OTOH, an outright, blanket ban seems a bit over the top.  For example, if the police have surveillance footage of an armed robbery, I can't see why they shouldn't be able to use facial recognition technology to search for possible matches with known robbers.  They're going to do that anyway by looking through mug shots using their own eyes, so why not let them use technology to speed up the process? 

EDIT:  and it just occurred to me that if Ide still posted here, he'd agree with BB.

The Brain

Quote from: Barrister on May 17, 2019, 01:00:02 AM
Criminal justice is supposed to be a search for the truth.  It is not a game where police are supposed to fight with one hand tied behind their back.  If the police cannot prove an offence beyond a reasonable doubt, so be it.  But let them use every tool in the tool box.

lolwut
Women want me. Men want to be with me.

crazy canuck

Quote from: dps on May 17, 2019, 05:29:52 AM
I don't think that BB gets the concerns about loss of privacy when tech like this is used for general surveillance.  OTOH, an outright, blanket ban seems a bit over the top.  For example, if the police have surveillance footage of an armed robbery, I can't see why they shouldn't be able to use facial recognition technology to search for possible matches with known robbers.  They're going to do that anyway by looking through mug shots using their own eyes, so why not let them use technology to speed up the process? 

EDIT:  and it just occurred to me that if Ide still posted here, he'd agree with BB.

The chances of wrongly accusing someone go up substantially.  BB puts all his faith in the system working.

dps

Quote from: crazy canuck on May 17, 2019, 01:54:14 PM
Quote from: dps on May 17, 2019, 05:29:52 AM
I don't think that BB gets the concerns about loss of privacy when tech like this is used for general surveillance.  OTOH, an outright, blanket ban seems a bit over the top.  For example, if the police have surveillance footage of an armed robbery, I can't see why they shouldn't be able to use facial recognition technology to search for possible matches with known robbers.  They're going to do that anyway by looking through mug shots using their own eyes, so why not let them use technology to speed up the process? 

EDIT:  and it just occurred to me that if Ide still posted here, he'd agree with BB.

The chances of wrongly accusing someone go up substantially.  BB puts all his faith in the system working.

Are you saying that the use of facial recognition software is more likely to result in someone being falsely accused due to a mistaken match than the same thing happening with someone in the police department eyeballing surveillance footage and looking through mugshots for a match, and by a substantial margin?  That's what I think you're saying here.  Do I understand you correctly?  I'm not saying that you're wrong (though intuitively, that doesn't seem right), just trying to make sure I'm not misunderstanding your point.

crazy canuck

Quote from: dps on May 17, 2019, 02:24:03 PM
Quote from: crazy canuck on May 17, 2019, 01:54:14 PM
Quote from: dps on May 17, 2019, 05:29:52 AM
I don't think that BB gets the concerns about loss of privacy when tech like this is used for general surveillance.  OTOH, an outright, blanket ban seems a bit over the top.  For example, if the police have surveillance footage of an armed robbery, I can't see why they shouldn't be able to use facial recognition technology to search for possible matches with known robbers.  They're going to do that anyway by looking through mug shots using their own eyes, so why not let them use technology to speed up the process? 

EDIT:  and it just occurred to me that if Ide still posted here, he'd agree with BB.

The chances of wrongly accusing someone go up substantially.  BB puts all his faith in the system working.

Are you saying that the use of facial recognition software is more likely to result in someone being falsely accused due to a mistaken match than the same thing happening with someone in the police department eyeballing surveillance footage and looking through mugshots for a match, and by a substantial margin?  That's what I think you're saying here.  Do I understand you correctly?  I'm not saying that you're wrong (though intuitively, that doesn't seem right), just trying to make sure I'm not misunderstanding your point.

Yes, that is exactly what I am saying.  The technology gives a false sense of certainty.  The fact that the person has been "identified" is what sticks, not that the identification itself is uncertain.  There are a lot of case studies which demonstrate that once an investigator identifies the likely suspect, they begin to build the case against them even if contrary evidence pointing to a more likely suspect is found in the course of the investigation.

dps

Quote from: crazy canuck on May 17, 2019, 02:51:05 PM
There are a lot of case studies which demonstrate that once an investigator identifies the likely suspect, they begin to build the case against them even if contrary evidence pointing to a more likely suspect is found in the course of the investigation.

Oh, there's no doubt about that.  I'm just not sure the technology is more or less prone to error than a cop looking through a book of mug shots.  If it's not less prone to error, then it's more-or-less useless at best, I'd think.

crazy canuck

Quote from: dps on May 17, 2019, 03:55:50 PM
Quote from: crazy canuck on May 17, 2019, 02:51:05 PM
There are a lot of case studies which demonstrate that once an investigator identifies the likely suspect, they begin to build the case against them even if contrary evidence pointing to a more likely suspect is found in the course of the investigation.

Oh, there's no doubt about that.  I'm just not sure the technology is more or less prone to error than a cop looking through a book of mug shots.  If it's not less prone to error, then it's more-or-less useless at best, I'd think.

The difficulty is that we all intuitively understand that mistakes can be made by a cop looking through mug shots.  There are two problems with this technology.  First, it dramatically increases the number of potential faces to be matched, and therefore dramatically increases the number of people who could be falsely accused.  But more problematic is that it gives the impression of certainty.  It is no longer a human making a very human mistake.  No matter how many times someone is told the percentage chance of error in facial recognition, the human who is supposed to weigh that chance of error will almost certainly ascribe a degree of certainty which is not warranted.  We have seen that with a number of technologies.  For example, lie detector tests are still admissible evidence for some purposes even though they have been demonstrated to be unreliable.  There is something about technology that causes us to give it deference we would never give to human judgment.
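To put illustrative numbers on that first problem, here's a quick base-rate sketch.  All the figures below are hypothetical assumptions for the sake of argument, not measurements of any real system: even a matcher that is wrong only a tenth of a percent of the time produces mostly false alarms when the watch list is a tiny fraction of the faces scanned.

```python
# Hedged illustration of the base-rate problem with face matching.
# Every number here is a made-up assumption, not data from any deployment.

scanned = 1_000_000           # faces scanned in public over some period
on_watchlist = 100            # of those, how many are genuinely wanted
false_positive_rate = 0.001   # matcher wrongly flags 0.1% of innocent faces
true_positive_rate = 0.99     # matcher correctly flags 99% of wanted faces

true_alarms = true_positive_rate * on_watchlist
false_alarms = false_positive_rate * (scanned - on_watchlist)

# Probability that a flagged person is actually on the watch list:
precision = true_alarms / (true_alarms + false_alarms)
print(f"{false_alarms:.0f} false alarms vs {true_alarms:.0f} true ones")
print(f"P(actually wanted | flagged) = {precision:.1%}")  # about 9%
```

With these assumed rates, roughly nine out of ten alarms point at an innocent person, which is the "false sense of certainty" problem expressed as arithmetic.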

dps

Quote from: crazy canuck on May 17, 2019, 04:04:07 PM
Quote from: dps on May 17, 2019, 03:55:50 PM
Quote from: crazy canuck on May 17, 2019, 02:51:05 PM
There are a lot of case studies which demonstrate that once an investigator identifies the likely suspect, they begin to build the case against them even if contrary evidence pointing to a more likely suspect is found in the course of the investigation.

Oh, there's no doubt about that.  I'm just not sure the technology is more or less prone to error than a cop looking through a book of mug shots.  If it's not less prone to error, then it's more-or-less useless at best, I'd think.

The difficulty is that we all intuitively understand that mistakes can be made by a cop looking through mug shots.  There are two problems with this technology.  First, it dramatically increases the number of potential faces to be matched, and therefore dramatically increases the number of people who could be falsely accused.  But more problematic is that it gives the impression of certainty.  It is no longer a human making a very human mistake.  No matter how many times someone is told the percentage chance of error in facial recognition, the human who is supposed to weigh that chance of error will almost certainly ascribe a degree of certainty which is not warranted.  We have seen that with a number of technologies.  For example, lie detector tests are still admissible evidence for some purposes even though they have been demonstrated to be unreliable.  There is something about technology that causes us to give it deference we would never give to human judgment.

Fair enough.  Heck, I basically proved your point--I had simply assumed that the technology would be more accurate than a cop looking at mugshots.


garbon

https://www.bbc.co.uk/news/uk-48315979

Quote: Police facial recognition surveillance court case starts

Ed Bridges, whose image was taken while he was shopping, says weak regulation means automated facial recognition (AFR) breaches human rights.

The civil rights group Liberty says current use of the tool is equivalent to the unregulated taking of DNA or fingerprints without consent.

South Wales Police defends the tool but has not commented on the case.

In December 2017, Mr Bridges was having a perfectly normal day.

"I popped out of the office to do a bit of Christmas shopping and on the main pedestrian shopping street in Cardiff, there was a police van," he told BBC News.

"By the time I was close enough to see the words 'automatic facial recognition' on the van, I had already had my data captured by it.

"That struck me as quite a fundamental invasion of my privacy."

The case could provide crucial guidance on the lawful use of facial technology.

It is a far more powerful policing tool than traditional CCTV, as the cameras take a biometric map, creating a numerical code of the face of each person who passes the camera.

These biometric maps are uniquely identifiable to the individual.

"It is just like taking people's DNA or fingerprints, without their knowledge or their consent," said Megan Goulding, a lawyer from the civil liberties group Liberty which is supporting Mr Bridges.

However, unlike DNA or fingerprints, there is no specific regulation governing how police use facial recognition or manage the data gathered.

Liberty argues that even if there were regulations, facial recognition breaches human rights and should not be used.

The tool allows the facial images of vast numbers of people to be scanned in public places such as streets, shopping centres, football crowds and music events.

The captured images are then compared with images on police "watch lists" to see if they match.

"If there are hundreds of people walking the streets who should be in prison because there are outstanding warrants for their arrest, or dangerous criminals bent on harming others in public places, the proper use of AFR has a vital policing role," said Chris Phillips, former head of the National Counter Terrorism Security Office.

"The police need guidance to ensure this vital anti-crime tool is used lawfully."

...

Civil liberties groups say studies have shown facial recognition discriminates against women and those from ethnic minorities, because it disproportionately misidentifies those people.

"If you are a woman or from an ethnic minority and you walk past the camera, you are more likely to be identified as someone on a watch list, even if you are not," said Ms Goulding.

"That means you are more likely to be stopped and interrogated by the police.

"This is another tool by which social bias will be entrenched and communities who are already over-policed simply get over-policed further."

Liberty says the risk of false-positive matches of women and ethnic minorities has the potential to change the nature of public spaces.

...
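The "numerical code" matching the article describes can be sketched, very loosely, as comparing face embeddings against a watch list.  The vectors, names, and threshold below are all invented for illustration, not any vendor's actual pipeline; the point is that a "match" is just a similarity score crossing an arbitrary cut-off, never a certain identification.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical 4-dimensional embeddings; real systems use hundreds of dimensions.
captured = [0.9, 0.1, 0.4, 0.2]
watchlist = {
    "suspect_A": [0.88, 0.12, 0.42, 0.19],  # close to the captured face
    "suspect_B": [0.1, 0.9, 0.2, 0.7],      # clearly different
}

THRESHOLD = 0.95  # arbitrary cut-off; where it sits drives the error rates
for name, emb in watchlist.items():
    score = cosine_similarity(captured, emb)
    flagged = score >= THRESHOLD
    print(name, round(score, 3), "FLAGGED" if flagged else "no match")
# -> suspect_A is flagged, suspect_B is not
```

Lowering the threshold catches more genuine suspects but flags more innocent passers-by, which is exactly the false-positive trade-off the civil liberties groups object to.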