
Q&A

Council Member Holden questions OTI on AI complaint mechanisms and public transparency

0:59:13

·

3 min

Council Member Robert Holden engages in a Q&A session with Alex Foard from the Office of Technology and Innovation (OTI) regarding the use of AI and automated decision systems in city agencies. The discussion focuses on public complaint mechanisms, transparency, and the identification of AI-assisted decisions.

  • Holden inquires about the existence of a platform for public complaints about AI used by city agencies.
  • Foard explains OTI's approach to transparency and the need to consider existing processes for public engagement.
  • The conversation touches on Local Law 35 and its role in reporting AI tools with material impact on the public.
Robert Holden
0:59:13
Thank you, Chair.
0:59:13
I'm sorry.
0:59:14
I was late. I'm in Environmental, and they're hearing my bill today.
0:59:20
So I gotta run back there.
0:59:21
But if I do ask a question that was already answered, forgive me.
0:59:26
So is there a place where people can file a complaint about ADS or AI used by a city agency?
0:59:34
And, you know, there is currently no public-facing platform that provides a mechanism for receiving public comments and questions about specific ADS used by city agencies.
0:59:47
Do you agree that such a platform is necessary to ensure transparency and public trust?
Alex Foard
0:59:54
We very much support whatever we can do to make sure that public trust and transparency are paramount.
1:00:00
Right?
1:00:00
That's why we have Local Law 35.
1:00:02
That's why we report above and beyond what's prescribed in the bill.
1:00:07
In terms of thinking through, you know, opportunities for redress, etcetera, what we need to be aware of is where processes exist that are not unique to AI: where agencies may have processes for how members of the public can get in touch with them to talk about decisions that have been made, whether or not those decisions involved automated decision-making.
1:00:29
So as we think about our policy landscape, what we wanna make sure is that, as we think about what is particular to AI, we're also accounting for what already exists
1:00:39
that addresses some of those needs, but maybe without the AI label on it.
Robert Holden
1:00:43
So your office will establish protocols for investigating complaints or inquiries.
1:00:50
I mean, that has to be... you're working on that?
Alex Foard
1:00:55
So, again, when we think about what a complaint could be, it could take a lot of different forms.
1:00:58
Right?
1:00:58
It could be that somebody doesn't like an output that has happened, or, of course, in the worst case, somebody could feel that they were discriminated against, for example.
1:01:07
Right?
1:01:07
Each of those is different from the others.
1:01:11
And so when we talk about what it means to think through policies that address the risks of AI, again, we have to be mindful of what else is there to account for those risks
1:01:23
that isn't unique to AI.
1:01:24
Right?
1:01:24
So again, if somebody feels that they've been discriminated against, the City's Human Rights Law is there to protect them against discrimination, and there's an avenue for complaining about that.
1:01:34
But when it comes to, say, disagreeing with a decision, etcetera, that's where we wanna make sure that the agencies have their processes, and that whatever processes they have are accounted for, before we try and do something duplicative.
Robert Holden
1:01:49
Yeah.
1:01:49
So how would the public know if a decision was made with the assistance of AI?
1:01:53
I mean, is there gonna be a
Alex Foard
1:01:55
So most of that will be through Local Law 35, which does require the reporting of those tools that have a material impact.
1:02:01
So, you know, the presumption is that if there's something involving an individual directly, that's likely to be a material impact, in which case that tool will be reported under Local Law 35.
1:02:11
Good.
Robert Holden
1:02:11
Alright.
1:02:12
Thank you.
1:02:12
Thank you, Chair.