AI used for hiring and recruitment can be biased. But that’s changing.
This episode originally aired on Nov. 23, 2022.
Artificial intelligence is commonly used in automated recruitment programs. It helps narrow down large pools of applicants using algorithms to match job seekers to open positions. But there are growing concerns that this technology is disproportionately excluding certain groups, like women, people of color or those who don’t have college degrees, even when they’re perfectly qualified.
Growing up in Michigan, with a dad who spent four decades working for General Motors, Ciare Primus had an example of what a “career” was. But, Primus said, for most of her working life, all she had were jobs. “You don’t really go further in the position that you’re in,” she said, “so you’re just there to do that job, go home, do it all over again.”
Primus has made it to manager, sometimes in two or three jobs at once: at a car dealership, a fast food restaurant and a clothing store. But she never made it to the next level up, she said, because she didn’t have a four-year degree.
An increasing number of employers have vowed to do away with that requirement, but degree bias is often embedded in the screening software companies use, said Shad Ahmed with the nonprofit Opportunity@Work.
“How do you create a job description that doesn’t fall back on degree requirements? Our systems haven’t caught up with that,” said Ahmed.
The artificial intelligence typically used to filter resumes looks for degrees and specific schools, previous employers and job titles, which likely had degree requirements. “Those algorithms are trained based on history and trained by humans,” Ahmed added. “And, unfortunately, our labor market historically has used degree requirements as a proxy for skills.”
So Ahmed’s organization created its own hiring platform called Stellarworx, which aims to do away with biased “proxies” and home in on skills. “So a job is a collection of skills. A person is a collection of skills,” said Mohan Reddy, the co-founder of SkyHive, which built the algorithms that power the platform.
SkyHive uses artificial intelligence, which it says is constantly tested for bias, to find patterns in those skills more objectively than humans can. “When people write job descriptions, they’re very biased,” Reddy added.
They often include coded language that ends up being exclusionary and wish lists of skills that could be learned on the job. Meanwhile, nontraditional candidates often don’t highlight the transferable experience they already have. Stellarworx uses SkyHive to make those connections.
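The skills-matching idea described above can be sketched in a few lines of code. This is a toy illustration only: the skill names, the candidate profile, and the overlap score are invented for this example and are not SkyHive’s or Stellarworx’s actual data model or algorithm.

```python
def skill_overlap(candidate_skills, job_skills):
    """Return the fraction of required job skills the candidate already has."""
    candidate = {s.lower() for s in candidate_skills}
    required = {s.lower() for s in job_skills}
    return len(candidate & required) / len(required) if required else 0.0

# A hypothetical candidate with retail and customer service experience...
candidate = ["communication", "team leadership", "scheduling", "inventory"]

# ...matched against a supervisor role described by skills rather than degrees.
supervisor_role = ["communication", "team leadership", "scheduling", "safety compliance"]

print(skill_overlap(candidate, supervisor_role))  # → 0.75
```

The point of the sketch: matching on skills directly surfaces a strong candidate that a degree filter would have rejected outright, since no field in the comparison mentions education at all.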
“That opens up new career pathways, new opportunities,” Reddy said. For instance, the opportunity for someone with a lot of retail and customer service experience to use their communication and team leadership skills as a supervisor at an automaker.
About 10 months ago, Ciare Primus used Stellarworx to get a full-time position as a group leader at General Motors. It’s one of more than 100 companies using the platform to fill jobs that otherwise might have required degrees.
“I went home one day and I went in my room, and I actually kind of cried because my dad passed away and I was able to look up in the heavens and tell my dad, ‘I have a career, Dad. I did it,’” Primus recalled.
Now that she’s not working three jobs, she’s using her free time to pursue a college degree in business.