Definition of Applicant-to-Hire Ratio:
In recruiting, the applicant-to-hire ratio is the number of applicants for a position divided by the number of people hired for that position. For instance, if a company wants to hire two sales representatives and 100 applicants apply, the applicant-to-hire ratio is 100 to 2, which reduces to 50 to 1. In other words, for every 50 applicants the job receives, 1 individual is hired.
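The reduction from 100 to 2 down to 50 to 1 is just dividing both sides by their greatest common divisor. A minimal sketch in Python (the function name is illustrative, not a standard API):

```python
from math import gcd

def applicant_to_hire_ratio(applicants: int, hires: int) -> str:
    """Reduce applicants:hires to simplest terms, e.g. 100 and 2 -> '50 to 1'."""
    if hires <= 0:
        raise ValueError("hires must be positive")
    d = gcd(applicants, hires)  # largest number dividing both counts evenly
    return f"{applicants // d} to {hires // d}"

# Example from the text: 100 applicants for 2 sales-representative openings
print(applicant_to_hire_ratio(100, 2))  # 50 to 1
```

The same function handles ratios that do not reduce to "N to 1", such as 75 applicants for 2 hires, which stays 75 to 2.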
What impacts the Applicant-to-Hire Ratio?
Applicant-to-hire ratios can vary widely depending on how the job is advertised, how attractive an employer brand may be to applicants, the level of compensation, the general health of the labor market, and other factors.
Where the job opening is advertised also has a significant impact on the number of candidates who apply. Online job postings often draw in a large applicant pool, resulting in an elevated applicant-to-hire ratio.
Why is the Applicant-to-Hire Ratio important?
The applicant-to-hire ratio is a useful hiring metric because it helps shape the employer’s expectations of how intensive the hiring process will be. Knowing this ratio gives insight into the likely time to hire and the resources, in both budget and staff hours, required to reach a hiring decision.
Companies with consistently high applicant-to-hire ratios often turn to recruitment management tools such as applicant tracking systems (ATS). They may also implement additional hiring criteria to filter candidates, such as pre-employment tests. When applicant-to-hire ratios are extremely high (more than 100 to 1), using tests and other data-driven selection criteria at the front end of the hiring process can save employers valuable time and money: it lets them focus on the most qualified candidates and identify those most likely to succeed in the role.