
Tech Matters: First steps to regulate AI in hiring process

By Leslie Meredith - Special to the Standard-Examiner | Jul 26, 2023


Artificial intelligence-based tools for recruiters have gained great popularity over the past several years, along with a growing concern that they perpetuate bias in the hiring process and, in some instances, just plain don’t work. Federal legislators are looking to regulate AI tools used for hiring, while two states — Illinois and Maryland — have already passed legislation.

Earlier this month, a New York City law said to be the first of its kind in the world took effect, requiring companies that hire in the city to have their AI tools audited by an independent third party to show the tools do not produce biased results. This is just the tip of the iceberg; you can expect more states to grapple with this issue to better protect their job-seeking citizens. But they will have to do it in an AI landscape that is constantly expanding and becoming more sophisticated. What seems like reasonable legislation today may be hopelessly outdated tomorrow.

The public remains relatively unaware of AI’s use in hiring, according to a Pew Research Center survey conducted in the spring. The majority of Americans (61%) have heard nothing at all about AI being used by employers in the hiring process, while 39% say they have heard at least a little about this, including 7% who have heard a lot.

AI tools can be used across the hiring process, starting with writing job descriptions. If you’ve used ChatGPT, you can see how creating descriptions would work: upload a current description, add any modifications, specify variations that you might need based on location or job level, and ask the program to write the descriptions in a specified format. The quality of the output will depend on your prompt.
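For readers who want to picture the mechanics, here is a rough sketch of how such a prompt might be assembled and sent programmatically using the OpenAI Python library. The file name, model name and prompt wording are placeholders for illustration, not any particular recruiting product's workflow.

```python
# Hypothetical sketch: generating job-description variants with a chat model.
# File name, model name and prompt wording are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

current_description = open("cashier_job_description.txt").read()
modifications = "Add a requirement for weekend availability."
variations = ["entry level, Salt Lake City", "shift lead, Ogden"]

prompt = (
    "Rewrite the job description below in a bulleted format.\n"
    f"Apply these changes: {modifications}\n"
    f"Produce one variant for each of: {', '.join(variations)}.\n\n"
    f"Current description:\n{current_description}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)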

Next is posting ads to social platforms, and that’s where bias can creep in. Ads delivered by AI systems can reinforce both gender and racial stereotypes. In a 2019 study reported in Harvard Business Review, researchers found that Facebook ads for supermarket cashier positions were shown to an audience that was 85% women, while taxi driver jobs were served to an audience that was around 75% Black. Similarly, job board recommendation systems, such as ZipRecruiter, learn the preferences of employers: If a recruiter interacts more frequently with white male candidates, the system is likely to surface more white male candidates.
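To see how that feedback loop can form, consider a toy simulation — not any vendor's actual system — in which a recommender shows candidates in proportion to how often the recruiter has clicked on each group in the past. Even a small initial difference in click behavior gets amplified.

```python
# Toy simulation of a click-driven recommender feedback loop (illustrative only).
import random

clicks = {"A": 1, "B": 1}           # stand-ins for two demographic groups
click_rate = {"A": 0.4, "B": 0.3}   # the recruiter clicks group A slightly more often

def recommend():
    # Naive "learned preference": show groups in proportion to past clicks.
    total = sum(clicks.values())
    return random.choices(list(clicks), weights=[c / total for c in clicks.values()])[0]

shown_counts = {"A": 0, "B": 0}
for _ in range(5000):
    group = recommend()
    shown_counts[group] += 1
    if random.random() < click_rate[group]:
        clicks[group] += 1

# Group A typically ends up shown far more often than the modest click-rate gap alone would suggest.
print(shown_counts)
```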

AI tools can be a significant timesaver during the resume review process, but they must be used properly to avoid bias. Industry statistics report that recruiters spend just seven seconds reviewing a resume — not much time to analyze an applicant’s work history. In contrast, a scanning program can quickly process thousands of resumes, looking for matching keywords, baseline accomplishments, periods of employment and career progression. With a valid list of criteria, an applicant shortlist can be generated quickly. An AI tool can also be set to disregard names, which can reveal gender and ethnicity, and dates, which can indicate age, avoiding biases that might sway a human reviewer.
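A minimal sketch of that idea follows — hypothetical keywords and field names, not any vendor's screener: redact identifying details first, then score each resume against a keyword list and keep those above a threshold.

```python
# Hypothetical sketch of a keyword screener that redacts names and dates first.
import re

REQUIRED_KEYWORDS = {"python", "sql", "inventory management"}  # illustrative criteria

def redact(resume_text, applicant_name):
    # Remove the applicant's name (can suggest gender/ethnicity) and years (can suggest age).
    text = resume_text.replace(applicant_name, "[REDACTED]")
    return re.sub(r"\b(19|20)\d{2}\b", "[YEAR]", text)

def keyword_score(resume_text):
    text = resume_text.lower()
    return sum(1 for kw in REQUIRED_KEYWORDS if kw in text)

def shortlist(resumes, threshold=2):
    # resumes: list of (applicant_id, name, text); returns ids meeting the keyword threshold.
    scored = [(rid, keyword_score(redact(text, name))) for rid, name, text in resumes]
    return [rid for rid, score in sorted(scored, key=lambda x: -x[1]) if score >= threshold]
```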

Some companies are using AI tools to conduct interviews, once the sole responsibility of recruiters and employers. In February of this year, AI interviewer tool developer Sapia announced it had added a new feature to its product “enabling companies to gauge candidate communication skills without resumes.”

Essentially, it is a text-based interview chat in the same vein as ChatGPT. The communication score update was tested on 2 million candidates from around the world to benchmark it across different demographic groups. “It is critical that our algorithm does not unfairly disadvantage candidates who belong to different demographic groups and various job families,” Sapia’s lead data scientist Madhura Jayaratne said in a company press release.

The communication score algorithm is based on the length of responses, readability, word usage and choice, and the development and organization of ideas, among other factors. A Talent Insights report is generated at the end of each chat interview. If communication is an important part of the role, tools like this one — based on research and thorough testing — can provide an efficient way to identify top candidates.
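Sapia has not published its model, but a crude stand-in shows the kinds of signals such a score might combine — here just response length, average sentence length as a readability proxy, and vocabulary variety. The weights and thresholds are invented for illustration.

```python
# Crude illustration of a text-based "communication score"; not Sapia's actual algorithm.
import re

def communication_score(answer: str) -> float:
    words = re.findall(r"[a-zA-Z']+", answer.lower())
    sentences = [s for s in re.split(r"[.!?]+", answer) if s.strip()]
    if not words or not sentences:
        return 0.0
    length_signal = min(len(words) / 150, 1.0)             # reward fuller answers, capped
    avg_sentence_len = len(words) / len(sentences)
    readability_signal = 1.0 if 10 <= avg_sentence_len <= 25 else 0.5  # rough readability proxy
    variety_signal = len(set(words)) / len(words)           # vocabulary variety (type/token ratio)
    return round(100 * (0.4 * length_signal + 0.3 * readability_signal + 0.3 * variety_signal), 1)

print(communication_score("I led a team of five and streamlined our weekly reporting process."))
```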

To circle back to sensible legislation that will grow with the industry, lawmakers should focus on several key factors. First, applicants should be made aware when recruiters and employers use AI tools in the hiring process. For that matter, it’s only fair that applicants disclose that they, too, have used an AI-powered tool to refine their resumes. As with the New York City regulation, these tools should be subject to regular independent audits to show they are not biased and do not develop bias over time.

Leslie Meredith has been writing about technology for more than a decade. As a mom of four, value, usefulness and online safety take priority. Have a question? Email Leslie at asklesliemeredith@gmail.com.
