Businesses have never done as much hiring as they do today. They’ve never spent as much money doing it. And they’ve never done a worse job of it.
For most of the post–World War II era, large corporations went about hiring this way: Human resources experts prepared a detailed job analysis to determine what tasks the job required and what attributes a good candidate should have. Next they did a job evaluation to determine how the job fit into the organizational chart and how much it should pay, especially compared with other jobs. Ads were posted, and applicants applied. Then came the task of sorting through the applicants. That included skills tests, reference checks, maybe personality and IQ tests, and extensive interviews to learn more about them as people. William H. Whyte, in The Organization Man, described this process as going on for as long as a week before the winning candidate was offered the job. The vast majority of non-entry-level openings were filled from within.
Today’s approach couldn’t be more different. Census data shows, for example, that the majority of people who took a new job last year weren’t searching for one: Somebody came and got them. Companies seek to fill their recruiting funnel with as many candidates as possible, especially “passive candidates,” who aren’t looking to move. Often employers advertise jobs that don’t exist, hoping to find people who might be useful later on or in a different context.
The recruiting and hiring function has been eviscerated. Many U.S. companies—about 40%, according to research by Korn Ferry—have outsourced much if not all of the hiring process to “recruitment process outsourcers,” which in turn often use subcontractors, typically in India and the Philippines. The subcontractors scour LinkedIn and social media to find potential candidates. They sometimes contact them directly to see whether they can be persuaded to apply for a position and negotiate the salary they’re willing to accept. (The recruiters get incentive pay if they negotiate the amount down.) To hire programmers, for example, these subcontractors can scan websites that programmers might visit, trace their “digital exhaust” from cookies and other user-tracking measures to identify who they are, and then examine their curricula vitae.
At companies that still do their own recruitment and hiring, managers trying to fill open positions are largely left to figure out what the jobs require and what the ads should say. When applications come—always electronically—applicant-tracking software sifts through them for key words that the hiring managers want to see. Then the process moves into the Wild West, where a new industry of vendors offers an astonishing array of smart-sounding tools that claim to predict who will be a good hire. They use voice recognition, body language, clues on social media, and especially machine learning algorithms—everything but tea leaves. Entire publications are devoted to what these vendors are doing.
The big problem with all these new practices is that we don’t know whether they actually produce satisfactory hires. Only about a third of U.S. companies report that they monitor whether their hiring practices lead to good employees; few of them do so carefully, and only a minority even track cost per hire and time to hire. Imagine if the CEO asked how an advertising campaign had gone, and the response was “We have a good idea how long it took to roll out and what it cost, but we haven’t looked to see whether we’re selling more.”
Hiring talent remains the number one concern of CEOs in the most recent Conference Board Annual Survey; it’s also the top concern of the entire executive suite. PwC’s 2017 CEO survey reports that chief executives view the unavailability of talent and skills as the biggest threat to their business. Employers also spend an enormous amount on hiring—an average of $4,129 per job in the United States, according to Society for Human Resource Management estimates, and many times that amount for managerial roles—and the United States fills a staggering 66 million jobs a year. Most of the $20 billion that companies spend on human resources vendors goes to hiring.
Why do employers spend so much on something so important while knowing so little about whether it works?