
New York will tackle unfair biases in automated city services

“Automated decision systems” determine outcomes on a wide range of matters between the city and its residents, such as eligibility for bail. The training data used to build these algorithms may encode biases that unjustly favour one group over another. The task force would examine how certain groups, such as the elderly, immigrants, people with disabilities and minorities, are affected by these automated processes.
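The idea is easy to make concrete. Below is a minimal, hypothetical sketch in Python of the simplest kind of check such a task force might run: comparing favourable-outcome rates across groups in a decision system's output. The group names and records here are invented for illustration; a real audit would involve far more careful methodology.

```python
# Illustrative sketch only: auditing a hypothetical automated decision
# for disparate outcome rates across groups. The data is invented.

from collections import defaultdict

# Hypothetical records: (group, decision), where decision 1 = favourable.
records = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 0), ("group_b", 0), ("group_b", 1), ("group_b", 0),
]

totals = defaultdict(int)
favourable = defaultdict(int)
for group, decision in records:
    totals[group] += 1
    favourable[group] += decision

# Favourable-outcome rate per group.
rates = {g: favourable[g] / totals[g] for g in totals}
print(rates)  # {'group_a': 0.75, 'group_b': 0.25}

# Demographic-parity ratio: values far below 1.0 flag a potential
# disparity worth deeper investigation (not proof of unfair bias).
ratio = min(rates.values()) / max(rates.values())
print(f"parity ratio: {ratio:.2f}")  # parity ratio: 0.33
```

Demographic parity is only one of several competing fairness definitions; a real review would also weigh error rates, base rates and the context of each decision.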

The bill, Intro 1696-A, is not as wide-reaching as advocates had initially hoped. An earlier version would have required every agency that makes decisions with algorithms to publish its source code; the passed version merely requires the task force to study the feasibility of doing so.

If signed, the task force would need to be formed within three months, though its report wouldn't be due for 18 months, a reasonable timeline given the size of such a data-intensive task and, of course, its importance. Weeding out algorithmic biases and challenging the systems that allow them to exist in the first place would have a massive civic impact and set vital precedents for the rest of the country.
