New NYC law restricts hiring based on artificial intelligence
When a new law in New York City takes effect at the start of 2023, employers won’t be allowed to use artificial intelligence to screen job candidates unless the tech has gone through an audit to check for bias.
The potential for algorithmic discrimination in hiring has been the target of state laws in Illinois and Maryland. The federal Equal Employment Opportunity Commission also recently formed a working group to study the issue.
The internet has made applying for jobs easier than ever, but it’s also made the process less human, said Joseph Fuller at Harvard Business School.
“When you open the faucet, all of a sudden a lot of applications started coming in, and no one’s gonna hit print 250 times,” he said.
So most big companies use some sort of automated recruiting system, which narrows down the candidate pool using algorithmic filters. “If you don’t have this, you’re out. If you don’t have that, you’re out,” Fuller said.
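The hard filters Fuller describes can be imagined as a simple pass/fail checklist. Here is a purely illustrative sketch in Python; the field names, required skills, and experience cutoff are hypothetical, not any vendor's actual criteria.

```python
# Toy version of the kind of keyword filter an automated recruiting
# system might apply. All rules here are hypothetical examples.

def passes_screen(candidate: dict) -> bool:
    """Return True only if a candidate clears every hard filter."""
    required_keywords = {"python", "sql"}  # hypothetical required skills
    min_years = 3                          # hypothetical experience cutoff

    has_keywords = required_keywords <= set(candidate.get("skills", []))
    has_experience = candidate.get("years_experience", 0) >= min_years
    # "If you don't have this, you're out": every rule must pass
    return has_keywords and has_experience

applicants = [
    {"name": "A", "skills": ["python", "sql"], "years_experience": 5},
    {"name": "B", "skills": ["python"], "years_experience": 10},
]
shortlist = [a["name"] for a in applicants if passes_screen(a)]
print(shortlist)
```

Note that candidate B is rejected outright despite ten years of experience, because one keyword is missing; that all-or-nothing quality is what makes such filters both efficient and blunt.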
It can be anything, from years of experience to your choice of words. Companies are also increasingly using automated video interviews, per Lindsey Cameron at the Wharton School.
“And it’s sort of monitoring your tone and your facial expressions and, you know, the depth and quality of your responses as best as they can,” she said.
Which, though maybe a bit creepy, isn’t necessarily bad, she said. Automated systems have the potential to bypass some human biases, but too often bias is just built into the tech, said Nicol Turner Lee, director of the Center for Technology Innovation at the Brookings Institution.
“Computers are programmed by humans, so they come with the same values, norms and assumptions that humans hold,” she said.
Amazon reportedly scrapped the AI recruiting system it was using a few years ago because of concerns about gender bias. Turner Lee said the algorithm was trained on historical data about successful candidates.
“Because the data was trained on men, it kicked out any resume that suggested a woman’s name, a woman’s college or woman’s extracurricular activity, like the women’s lacrosse team,” she said.
Likewise, facial recognition software can disadvantage people with darker skin when algorithms are trained on white faces. There needs to be greater oversight, Turner Lee said, to make sure AI complies with civil rights law.