PUBLIC TESTIMONY

Testimony by Jerome Greco, Digital Forensic Director of the Legal Aid Society, on NYPD Surveillance Technology and POST Act Compliance

2:35:42 · 4 min

Jerome Greco, Digital Forensic Director at the Legal Aid Society, provided testimony challenging NYPD statements on surveillance technology use and POST Act compliance. He highlighted issues with drone usage documentation, facial recognition technology misuse, and the process of facial recognition leading to arrests.

  • Greco reported cases where drone deployment reports were not completed and video not preserved, contrary to NYPD claims.
  • He revealed an instance where FDNY used Clearview AI for facial recognition, which is prohibited by NYPD policy.
  • Greco detailed the problematic process of how NYPD conducts facial recognition, including photo manipulation and potentially biased identification procedures.
Jerome Greco
2:35:42
We try.
2:35:44
Good morning, good afternoon.
2:35:46
I'm Jerome Greco, I'm the digital forensic director at the Legal Aid Society.
2:35:50
Thank you for holding this hearing and for allowing us to speak here.
2:35:54
I had prepared oral testimony, but I think I'd better spend my time replying to some of the things the NYPD said, particularly things that I know to either not be true or not be the full picture.
2:36:08
Regarding drones and the drone-as-a-first-responder program, we are aware of cases in which the DA's offices have told us that deployment report forms were not completed, that the video was not preserved, and the only way we actually learned that a drone was used was over the radio run and the iCAD report mentioning drones.
2:36:35
And we had to actually provide that information to the prosecutor for the prosecutor to even be made aware that a drone was used in their case.
2:36:45
Related to facial recognition, we're aware of at least one case in which the FDNY provided facial recognition results to the NYPD, and not only did they do that, but they used a program, or a company, that the NYPD prohibits its own officers from using, which is Clearview AI.
2:37:02
If you read the NYPD's IUP, it prohibits them from using a software or program that compares against anything outside of the NYPD's database. Clearview AI is pulling its data from social media and internet scraping.
2:37:19
The NYPD had actually done a trial with them many years ago and had not continued it.
2:37:25
But we are aware of at least one situation in which that happened, which would seem to me to be a violation of the NYPD's own policies.
2:37:34
It's also very concerning how willing they were to get around the question of how many false positives, false negatives, and false arrests there have been.
2:37:47
They very clearly said false convictions.
2:37:49
We're not aware of any false convictions.
2:37:51
That's not the question.
2:37:52
As any of us would know, if you've been falsely arrested, that still upends your life, and I'm aware of cases in which that has happened.
2:38:01
Unfortunately, because my clients do not want their names on the front page of the New York Times, they are choosing not to come forward with it, and I have to comply with that.
2:38:10
But the NYPD is aware of that as well.
2:38:14
I'd also like to talk about the way they actually do facial recognition, because they were very cagey about that.
2:38:20
I'll do my best to be as fast as possible.
2:38:23
A detective gets a still photo and sends it to FIS, the Facial Identification Section.
2:38:29
They actually photoshop that photo to make it more likely to get a result.
2:38:33
They then get up to 250 possible candidates, ranked in order of what the system believes most looks like the photo they submitted to it.
2:38:43
Then a detective from FIS looks at it and says, well I think this one is the one that looks most like it.
2:38:50
Doesn't matter if it's number one, number 50, number 200.
2:38:54
They then present that to the supervisor and say, do you agree?
2:38:57
Oh, I forgot a step.
2:38:58
They also will check to see if that person was incarcerated, is still living, or was hospitalized at the time, in order to make sure that, oh, you know, we can't be wrong here.
2:39:09
They then provide that to the supervisor, who says, oh yeah, they look alike.
2:39:15
The next step, which is most frequently happening now, is they will find an officer who previously arrested that person.
2:39:23
And they will send the photo or the video to that officer and say, do you recognize who this is?
2:39:29
Do you recognize who this photo depicts?
2:39:32
Often they will say yes.
2:39:34
The problem with that is that it's highly prejudicial.
2:39:37
It is an improper ID procedure because that officer has nothing to do with the case.
2:39:42
So he knows the only reason you are reaching out to him to see if he knows who this person is is because you already assumed that he does.
2:39:48
And so he thinks about who do I know who looks like this?
2:39:52
Right?
2:39:52
Who have I previously arrested who looks like this, or interrogated who looks like this?
2:39:56
Oh, that's who this is.
2:39:58
The person I think most looks like it.
2:40:00
They consider that enough for probable cause and then they make the arrest.
2:40:03
So this whole thing about, oh, use all these different tools, that's how the process actually works.
Yusef Salaam
2:40:08
Thank you.
2:40:08
If you can wrap up, that'd be perfect.
Jerome Greco
2:40:11
So I support these bills to at least update the POST Act and make it better, and I say that on behalf of the Legal Aid Society.
2:40:19
Sorry for taking up too much time.
Yusef Salaam
2:40:20
Thank you.