TECHNOLOGY

The End Of The Resume? Hiring Is In The Midst Of A Technological Revolution With Algorithms, Chatbots

By Alexia Elejalde-Ruiz
Chicago Tribune

WWR Article Summary (tl;dr) As columnist Alexia Elejalde-Ruiz points out, advocates of AI-enhanced hiring claim it reduces turnover by bringing on candidates who are a better fit.


The last time Chuck Blatt searched for a job, about 10 years ago, he relied on a thoughtful cover letter, a resume printed on nice paper and good rapport during a face-to-face interview.

Now, he said, “that is all out the window.”

Since Blatt, 50, left his job as vice president of a painting and construction company in March, he’s spent nearly every day in front of the computer in his Chicago home applying for jobs via automated processes.

He uploads his job history with the click of a button. He records videos of himself answering automated interview questions. He takes the lengthy online personality tests employers use to screen candidates.

Blatt, who is seeking a marketing position, says technology makes it easier to apply for more jobs. But other parts of the high-tech hiring process leave him uneasy.

“I have been turned down for positions that I thought I would be perfect for,” Blatt said, and it is often impossible to know why. “There is no feedback because there is no one to talk to.”

Technology is transforming hiring, as employers inundated with applications turn to sophisticated tools to recruit and screen job candidates. Many companies save time with video interviews or resume filters that scan for keywords, and those at the leading edge are using artificial intelligence in a variety of ways: chatbots that schedule interviews and answer applicant questions; web crawlers that scour mountains of data to find candidates who aren’t actively job hunting; and algorithms that analyze existing employee data to predict an applicant’s future success.

Advocates of AI-enhanced hiring claim it reduces turnover by bringing on candidates who are a better fit. They also say a data-driven approach removes bias inherent in human decision-makers who, for example, might favor candidates who graduated from their alma mater.

But critics warn of the opposite effect: that some applicants could be unfairly weeded out.

Cathy O’Neil, a mathematician and author of the 2016 book “Weapons of Math Destruction,” worries that algorithms developed to predict whether an applicant will be a good fit, based on the types of employees who have been successful before, could perpetuate implicit biases.

“If in the past you promoted tall white men or people who came from Harvard, that will come through in the algorithm,” O’Neil said. “Algorithms just look for patterns.”

The scoring is invisible, so even human resources departments don’t know why an applicant might have been rejected, making it difficult for anyone to challenge the process, she said.

There is also concern that algorithms and filters could quietly screen older people out, although that’s a concern with human recruiters as well. Blatt said that he has removed his college graduation date from his LinkedIn profile, plus all of his experience from the 1990s, so as not to advertise his age.

Blatt said he has landed a number of interviews, thanks to the volume of jobs he has applied to.

“A lot of it is a numbers game,” he said.

But Blatt, who is part of a networking group for executives run by JVS Chicago, a career counseling agency, said some of his older peers are so uncomfortable with automated systems that they refuse to go through with them.

Cutting costs, turnover
Much of the technology used in the hiring process shows great promise for helping employers cut costs associated with high turnover, said Natalie Pierce, co-chair of the Robotics, AI and Automation Industry Group at Littler Mendelson, a law firm that represents management.

One client, a department store that couldn’t retain cosmetics department employees, discovered through analytics that it had mistakenly assumed that hiring gregarious employees would lead to greater sales, when in fact the best salespeople were problem-solvers who invested time helping customers.

By changing the type of person it hired, the store was “greatly able to reduce training costs and attrition and increase the amount of commissions going to employees,” Pierce said.

But employers have to be careful. Algorithms designed to identify candidates similar to current high performers could screen out groups of people who are protected by anti-discrimination laws.

At a public meeting held by the Equal Employment Opportunity Commission to discuss the issue in 2016, a chief analyst at the federal agency described how an algorithm might find patterns of absences among employees with disabilities.

Even if the algorithm does not intentionally screen out people with disabilities, the impact could be discriminatory and therefore violate federal law, said Barry Hartstein, co-chair of Littler’s diversity practice.

“This is an area that the regulators are recognizing is the wave of the future,” he said. Littler’s growing AI team tests hiring algorithms to ensure they are having the intended outcomes.

The government has not filed any lawsuits based on an employer’s use of high-tech screening tools or algorithms, said Carol Miaskoff, associate legal counsel at the EEOC. But the agency is watching the trend, and employers need to be aware if the tech tools they use to hire and promote are prone to discrimination, she said.

Proving hiring discrimination is difficult because applicants rarely know for sure why they didn’t get the job, and deconstructing an algorithm presents an additional challenge, Miaskoff said. But an indicator could be the composition of the employee group used to train the algorithm, she said.

“It should be carefully constructed so that it is diverse by gender, race, age and disability,” she said.

The potential legal issues echo concerns about the growing popularity of personality tests, which have come under fire for potentially disadvantaging people with mental health issues.

Attorney Roland Behm, who filed charges with the EEOC against several national retailers that denied his son, Kyle, jobs after he completed their personality tests, said the tests are part of the same trend of using analytics to make hiring more efficient.

“More and more goes on behind the curtain,” Behm said. “From an employee perspective, you don’t know if what’s happening is appropriate or legal because you don’t know those things are happening.”

Kyle Behm, then an engineering student who had been diagnosed with bipolar disorder, learned he had failed a personality test at a grocery store in 2012 only because a friend who worked there told him so.

Roland Behm then had his son apply to several other retailers that use personality tests. In November, home improvement retailer Lowe’s, one target of the experiment, agreed to modify its online testing process to ensure it does not prevent people with mental health disabilities from finding jobs.

Eliminating bias
Some tech firms offering AI-enhanced recruiting services say they explicitly clean their data of bias.

Take Pymetrics, which counts Unilever and Tesla among its 60 large clients.

Pymetrics creates custom recruitment algorithms based on how top employees at each client company score on online games that measure 90 different traits, such as attention or altruism. Applicants play the online games and are evaluated based on how they score on the desired qualities.

The biggest reason companies use Pymetrics is to improve the fit and diversity of their teams, CEO Frida Polli said.

To guard against bias, Pymetrics tests each algorithm against a candidate pool of about 50,000 people for whom it has demographic data.

“We have to be really cautious when applying AI to hiring because untested and unchecked it can actually make diversity a lot worse,” Polli said.

Another example is HireVue, which launched 13 years ago to help companies conduct video interviews with talent around the world.

As employers struggled to sort through the mounting video submissions, HireVue three years ago introduced assessment algorithms, which are used by nearly 100 of its 700 video clients, said CEO Kevin Parker.

The algorithm picks up on more than 20,000 visual and audio cues (a smile, a furrowed brow, a tone of voice, word choice) that are compared with similar data collected from existing top employees.

The technology aims to mimic how a human interviewer might evaluate a candidate, but with greater consistency.

Hilton Hotels and Resorts, which uses HireVue for call center positions, has reduced time-to-hire from six weeks to seven days, Parker said. Unilever, which uses it to hire new grads, has seen a 16 percent increase in new-hire diversity, HireVue said.

HireVue tests the algorithms for bias, and modifications are available for people with special needs, Parker said.

The algorithms target skills specific to each role, so applicants for analytical roles wouldn’t be judged on empathetic facial expressions the way applicants for customer service roles would, he said.

Optimism in human resources
Some human resources leaders say they are excited by the prospects of AI.

“I think ultimately it will help democratize and equalize and … significantly reduce the amount of (hiring) bias that we’re having,” Johnny Taylor, CEO of the Society for Human Resource Management, said during a news conference at his association’s meeting last month in Chicago.

A January report from Bersin, the human resources research arm of Deloitte, found that the talent acquisition teams in high-performing companies are six times more likely than low performers to use AI and predictive data analytics, in part because that technology prevents possible misjudgments caused by bias or false logic.

Executed properly, artificial intelligence can help employers cast a wider net to “find the hidden gems you wouldn’t find otherwise” and free up time to spend with high-value candidates, said Ravin Jesuthasan, managing director in the Chicago office of advisory firm Willis Towers Watson. He believes resumes could be obsolete within a decade because so much information about people can be easily collected from the public domain.

His big worry, however, is that old stereotypes could slip into algorithms.

“The use of AI and predictive analytics needs to be continuously monitored and refreshed,” Jesuthasan said. “An auditing mindset needs to be brought to bear.”

Despite the optimism, high-tech hiring remains in the early stages. Most human resources departments are wary of automation, and few are using it in a way that saves time because humans are heavily involved in managing it, said Katrina Kibben, CEO of Three Ears Media, a recruitment consulting firm.

Adoption will accelerate as “people who grew up with iPhones in their hands” take over and technology advances, she said. For now, though, employers risk alienating applicants with chatbots that spit out canned responses, or wasting time with resume filters that can’t distinguish between apple the fruit and Apple the company, Kibben said.

EY, the professional services firm, is dipping its toe into using machine learning for recruitment with a chat feature on its site that will answer a range of job seekers’ questions, from salary to what kinds of jobs they might be qualified for, said Larry Nash, U.S. director of recruiting.

“The potential benefits seem really great and obvious if it works right,” Nash said, “but the devil’s in the details.”

Mixed reactions
The technological transformation of hiring has been positive for some candidates.

Andrea Tobias, 43, who spent five months hunting for a job last year, said an automated video interview required her to find a room with perfect lighting, but she felt she was able to convey her skill set well.

Ultimately, though, Tobias got a job as an administrative assistant at Chicago-based insurer Health Care Service Corp. through Skills for Chicagoland’s Future, an organization that helps match unemployed and underemployed job seekers with employers. That process was based almost entirely on phone and in-person interviews.

For Blatt, the Chicago man seeking a marketing position, the shift away from human interaction with recruiters has been frustrating.

He second-guesses how his answers will be scored on the multiple-choice personality tests. And he feels awkward when, during automated video interviews, the computer asks follow-up questions that don’t relate to his previous answer.

That’s not to say human screeners are all they’re cracked up to be. Blatt recalls a recent phone interview with a woman who took the call at a noisy Starbucks, making for a distracted exchange.

“Compared to the woman at Starbucks, it might have been better to have a robot,” he said.
