Is AI the Solution to Acquiring Accounting Talent?
Artificial intelligence could reduce the time and costs of hiring, but what risks accompany this new approach to acquiring accounting talent?
By Annie Mueller | Summer 2023
Recruiting in any industry is a bit of a numbers game. Finding the best candidate for a role means sorting through hundreds or maybe thousands of resumes to screen potential candidates—and that’s only the beginning of the hiring process. For each short-listed candidate, hours of interviews, multiple discussions, follow-ups, and documents may be required before a formal offer ever materializes.
Meanwhile, the accounting and finance profession has been in a hiring crisis for years: there are fewer accounting majors, and fewer accounting graduates are becoming CPAs, according to the AICPA’s “2021 Trends” report. The shrinking talent pool is further strained by a growing wave of retirements: an estimated 75% of accountants will be eligible to retire in the next two to three years.
Of course, these compounded issues only increase the pressure on hiring managers, recruiters, and HR professionals who are already maxed out. A recent survey from KarmaCheck and Findem found that 73% of HR leaders experience burnout directly related to the hiring process. The recruiting process is tough for candidates, too. They’re worn out by application requirements and stressed out by uncertainty. When a candidate experiences poor communication or follow-up delays, there’s no guarantee they’ll wait instead of walking. Therefore, it’s not surprising that organizations are keenly interested in tools to improve the laborious aspects of identifying and hiring the top candidates.
One attractive and increasingly popular solution to this challenge is artificial intelligence (AI), technology that mimics human intelligence to perform, automate, and improve tasks (i.e., a computerized decision-making model). According to a March 2023 Forbes article, “How AI-Powered Tech Can Help Recruiters and Hiring Managers Find Candidates Quicker and More Efficiently,” many businesses today are using AI to:
- Identify potential candidates who may not have even applied for the job.
- Screen large volumes of resumes, matching job requirements with candidates’ qualifications and experience.
- Analyze candidate data, including resumes, social media profiles, and online behavior.
- Engage with candidates via chatbots that answer questions about the job or application process.
- Conduct pre-screening interviews.
Despite AI’s overwhelming potential, however, experts warn it’s not all good. “There are several considerations around AI, and people are wary about it,” says Elizabeth Pittelkow Kittner, CPA, CGMA, CITP, DTM, vice president of finance at GigaOm. “We need to figure out how people can use AI resources in ways that are beneficial.”
When considering AI’s potential use for hiring practices, a good place to start is to understand where the possibilities of AI meet their limits.
A TOOL, NOT AN EMPLOYEE
Chris Denver, CPA, director of accounting and reporting advisory for Stout, says the use of AI in business isn’t anything new. What’s new is the interface that makes it very easy for anyone to use. For example, the conversational format with today’s AI programs, such as ChatGPT, makes it seem like you’re chatting with another person, albeit one who occasionally makes bizarre statements and doesn’t mind answering the most inane queries. Denver cautions that AI isn’t and shouldn’t be treated as a staff replacement. “People may assume when they’re interacting with AI that it’s omniscient, but it’s not—it’s built by fallible humans based on fallible data sets.”
Kittner also warns that sometimes the responses we receive from AI are inaccurate or the data is no longer relevant, which is a special concern when using AI to help craft job descriptions or other hiring documents with compliance considerations. “We know that tools like ChatGPT aren’t up to date. ChatGPT currently has limited knowledge of events after 2021, and the laws are changing all the time.”
Beyond potential accuracy issues, users need to remember that AI can’t be tasked with decision-making, subjective assessments, or nuanced communication. While AI can process data faster than any human, it’s important to remember that faster isn’t the same thing as smarter. “AI tools are still evolving. They are certainly helpful; however, they are not perfect,” Kittner says. “Human oversight is incredibly crucial. Anytime you are using AI, ensure people are reviewing the work.”
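The oversight Kittner calls for can be wired into the workflow itself, so that no AI output becomes a hiring decision on its own. Here is a minimal sketch of that idea, assuming a hypothetical upstream screening model that emits a 0–1 score; the thresholds and routing labels are invented for illustration:

```python
# Human-in-the-loop triage: the AI score never decides alone; it only
# sets the order in which humans review candidates. The score source,
# thresholds, and labels below are all hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    ai_score: float  # 0.0-1.0 from an upstream screening model

def triage(candidate: Candidate) -> str:
    """Route every candidate to a human; the score only sets priority."""
    if candidate.ai_score >= 0.8:
        return "priority human review"
    if candidate.ai_score >= 0.4:
        return "standard human review"
    return "secondary human review"  # low scorers still get human eyes

print(triage(Candidate("A. Jones", 0.92)))  # priority human review
```

Note that every branch ends with a person: the design choice is that AI reorders the queue, while rejection remains a human judgment.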
As organizations identify which parts of the hiring process can be shifted safely to AI, it’s important to examine AI’s ethical limits, as well.
DATA ISN’T NEUTRAL
The appeal of AI is its ability to process vast amounts of data in very little time; the processing power far exceeds anything a human brain can accomplish. But it’s important to note that data isn’t neutral. Even if AI is trained on factual information, the selection of those facts can be biased. AI has no built-in ethical barometer by which to measure the data it’s given—bias in means bias out. “AI takes a huge data set and distills it to the next logical thing, the next expected expression in the conversation,” Denver explains. “But it’s not able to necessarily improve it or be objective about itself.” For hiring practices, these biases might result in small mistakes, such as rejecting an applicant who’s actually a great fit because a few key words were missing from the resume, or much more serious mistakes, such as exhibiting gender or racial bias. To avoid these mistakes, Kittner advises that those “using the tool need to know the potential biases and build a human review component into the process of using the AI tool.”
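The keyword-miss mistake described above is easy to reproduce. Below is a deliberately naive sketch of term-matching screening; the required terms and resume snippets are hypothetical and do not reflect any vendor's actual algorithm:

```python
# Naive keyword screener: counts required terms found verbatim in a
# resume. Term list and resume text are hypothetical examples.
REQUIRED_TERMS = ["cpa", "gaap", "reconciliation", "sec reporting"]

def keyword_score(resume_text: str) -> float:
    """Fraction of required terms appearing verbatim in the resume."""
    text = resume_text.lower()
    return sum(term in text for term in REQUIRED_TERMS) / len(REQUIRED_TERMS)

# A strong candidate who writes out "Certified Public Accountant"
# instead of "CPA" misses most of the exact-match terms...
strong_fit = "Certified Public Accountant, 10 years preparing US GAAP statements"
# ...while a keyword-stuffed resume matches all of them.
weak_fit = "CPA. GAAP. Reconciliation. SEC reporting."

print(keyword_score(strong_fit))  # 0.25
print(keyword_score(weak_fit))    # 1.0
```

The keyword-stuffed resume scores a perfect 1.0 while the genuinely qualified candidate scores 0.25, which is exactly the kind of small-but-costly mistake a human review component is there to catch.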
Countering bias is just one important consideration; another is protecting confidential data. “You have to keep in mind that AI is taking your information and contributing it to a big pot of other information,” Denver says. “So, the security issues are pretty broad.” He cautions that it might be better for firms to wait on AI implementation another year or so, until there’s been more development and options to keep data secure. In the meantime, operate under the assumption that anything shared with AI also gets shared with a much wider audience.
THE COSTS OF IMPLEMENTATION
Accounting professionals know the danger of hidden costs. When implementing AI, leaders need to consider everything from the price of the software to the ongoing cost of training, governance, and mitigating security risk. “Governance is important for any process you have, and to ensure ethical behavior, policies and procedures should be written and followed,” Kittner says. “Set up documentation and disclosures so that people know that AI is being used, and that those who are using it understand the tool, are well trained on it, and know the pitfalls.”
Here are four steps to take when implementing AI in your organization:
- Evaluate your organization’s recruitment process from beginning to end and identify the areas where it needs to improve. Not sure? Ask the last hire—they’ll have an opinion.
- Assess the scale of each identified need. Denver stresses that organizations will get more value out of AI when using it to automate and streamline high-volume, rather than one-off, tasks. “As you would with other tools or software, do a cost-benefit analysis for these AI products.”
- Consider the nature of the information involved in each task you’d want to automate. Then, determine the security measures needed to protect that information.
- Establish clear responsibility for who’s overseeing AI use and get input from all the key players before implementation. Kittner says to make sure there’s copious communication around AI: who’s using it, why it’s being used, and whether it fits the corporate strategy. “These are conversations organizations need to have as they are introducing AI into their processes.”
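Denver's cost-benefit advice in the second step above can be made concrete with a back-of-the-envelope model. Every figure below is hypothetical; the point is the shape of the calculation, not the numbers:

```python
# Back-of-envelope cost-benefit: hours saved on screening versus the
# tool's total cost of ownership. All inputs are hypothetical.
def annual_net_benefit(resumes_per_year: int,
                       minutes_saved_per_resume: float,
                       recruiter_hourly_rate: float,
                       tool_annual_cost: float,
                       training_and_governance_cost: float) -> float:
    """Dollar value of recruiter time saved, minus software, training,
    and governance costs, over one year."""
    hours_saved = resumes_per_year * minutes_saved_per_resume / 60
    return (hours_saved * recruiter_hourly_rate
            - tool_annual_cost
            - training_and_governance_cost)

# High-volume screening (5,000 resumes/year) clears the cost hurdle...
print(annual_net_benefit(5000, 6, 50.0, 12000, 5000))  # 8000.0
# ...while the same tool applied to a one-off task does not.
print(annual_net_benefit(200, 6, 50.0, 12000, 5000))   # -16000.0
```

The same tool that nets a positive return on 5,000 resumes loses money on 200, which illustrates Denver's point that high-volume tasks are where AI pays off.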
Like any process requiring high-volume task completion, Denver says a little efficiency goes a long way. “We’re seeing interesting opportunities with AI to leverage time better, and that’s only the beginning.” But, he cautions, as organizations move forward with AI, it’s critically important that they stay aware of the risks. AI and its potential rewards don’t exist in a bubble; they need to take their place in a hiring process with clear responsibility and governance.
“Ultimately, you should not replace human discernment and context with AI,” Kittner says. “But AI can be used for efficiency. Try it out and see what it can do. Just make sure to incorporate AI in ways that are appropriate, ethical, and create positive experiences.”
Annie Mueller is an experienced financial writer and principal of Prolifica Co. She works with clients from individuals to large financial companies and is a frequent contributor to various financial and business publications.