The citymeetings.nyc logo showing a pigeon at a podium with a microphone.

citymeetings.nyc

Your guide to NYC's public proceedings.

Q&A

Facial recognition technology and concerns about algorithmic bias

1:11:22

·

3 min

Council Member Yusef Salaam raises concerns about facial recognition technology and its potential for misidentifying people of color. NYPD officials respond to these concerns and explain their use of the technology.

  • NYPD acknowledges past studies showing disproportionate misidentification rates but argues the technology has improved
  • Facial recognition is used only as a lead, not as sole basis for arrest
  • Multiple human reviews are involved in the process, including frontline and supervisory reviews
  • The facial recognition algorithm uses mathematical measurements, not color, according to NYPD
  • NYPD uses DataWorks Plus, which is recognized under the NIST evaluation system
Yusef Salaam
1:11:22
So they just have, like, if if if someone, for instance, with the FOIA request, they would have to make sure that they would ask for or request that information within that time period.
1:11:34
Yes.
1:11:36
I wanna move to facial recognition and algorithmic bias.
1:11:41
Studies have shown that facial recognition misidentifies people of color at disproportionate rates.
1:11:47
What steps has the NYPD taken to ensure that its facial recognition does not contribute to wrongful arrest?
1:11:54
So
Michael Gerber
1:11:55
I I'm aware of some of the, you know, studies or reports that you're referring to.
1:11:59
Obviously, it's a very serious matter.
1:12:00
I do think that some of those studies were in the earlier days of facial recognition.
1:12:05
I think that the technology has actually gotten much better.
1:12:09
So I think that actually lessens the concerns on that front.
1:12:11
Now that said, I think this is critical, facial recognition is only a lead.
1:12:19
It is only a lead.
1:12:21
Right?
1:12:21
No one is getting arrested on the basis of a facial recognition match standing alone.
1:12:26
Right?
1:12:27
We we can't do that.
1:12:29
We don't do that.
1:12:30
It is a lead which then sends the detectives to do additional work to try to develop probable cause.
1:12:36
I also so so part of this is it's a lead but it's not in of itself gonna be the basis for an arrest.
1:12:41
I think that's critical.
1:12:41
It's very, very important.
1:12:45
Also I think critical is that this is not some fully automated process that takes human beings out of it.
1:12:51
On the contrary, and this is described in our facial recognition policy which is public, it's also described in the IUP, there are human beings involved here as well.
1:13:00
Right?
1:13:00
And in fact, when a when there's when there's a potential match, you're gonna actually have two separate individuals looking at that.
1:13:08
So the frontline review and then a second supervisory review before the match is even sent back to a detective and again they're only as a lead.
1:13:17
So there there are a lot of there are a lot of checks in place.
Yusef Salaam
1:13:22
What is the false positive rate?
1:13:24
Oh, sorry.
Jason Savino
1:13:25
I'm sorry.
1:13:26
I just wanna add to that in a sense that, you know, we need to know how the technology works.
1:13:32
Right?
1:13:32
It works off an algorithm.
1:13:34
And that algorithm doesn't even see color.
1:13:37
In a sense that it works off a mathematical measurement.
1:13:40
So it'll take quadrants of your face and then measure each variable.
1:13:46
So go nose to ear and and what have you.
1:13:49
So it really doesn't even see color at all.
1:13:52
In fact, we've used black and white photos and had the exact same replicas from inputting it into this database as we have with color photos.
1:14:03
So it doesn't see gender, it doesn't see race, what it does see is mathematical measurements.
Michael Gerber
1:14:11
I also wanna add one other thing on this which is that we, the algorithm we use, it's DataWorks Plus.
1:14:17
A NIST, it's sort of an evaluation, separate from the NYPD, it's sort of an industry standard kind of evaluation system.
1:14:25
And DataWorks Plus is sort of one of the sort of recognized algorithms under that NIST evaluation system.
