insight magazine

What’s Really Behind AI Aversion in Auditing?

Experienced auditors don’t necessarily resist AI. Research says they resist how it makes them feel. By Mark E. Peecher, Ph.D.; Christian P.R. Pietsch, Ph.D.; Sebastian Stirnkorb, Ph.D.; and Isaac L. Yamoah, Ph.D. | Digital Exclusive – 2026

 

As artificial intelligence (AI) becomes embedded in auditing practices, firms are discovering an unexpected barrier: experienced auditors’ aversive reactions to AI-supported advice.

A recent Caseware survey found that many auditors are uncomfortable using AI for decision-support analysis or client-facing preparation. According to the survey’s findings, 88% of auditors believe AI tools risk undermining professional judgment, and nearly half of auditors worry that AI could erode public trust in the profession. While it’s easy to interpret those findings as “technophobia” or as a rational reaction to immature tools, our research suggests a different explanation.

Why AI in Auditing Triggers Professional Identity Threat

In our peer-reviewed study of highly experienced auditors, “Coping With Changing Skill Requirements: Does Disaffirmation Versus Affirmation Affect Auditors’ Reliance on AI-Supported Advice From Specialists?,” we expected to find skepticism about AI algorithms or uncertainty around the “black box” effect (i.e., when a system’s internal reasoning is opaque even when its outputs are clear). What we didn’t expect was how deeply the issue would connect to something much more human. We found that aversive reactions to AI-supported advice weren’t about the technology itself—instead, experienced auditors who lack the digital skills newly demanded of them react defensively as a coping mechanism.

Audit firms are increasingly trumpeting upskilling, reskilling, and digital transformation. At the same time, experienced auditors—many of whom built careers on financial reporting, technical accounting, and tacit knowledge—may perceive a widening gap between their traditional skills and newly emphasized AI competencies.

Research Overview: Understanding Auditors’ Aversion

In our experiments with highly experienced auditors, we first asked participants to rank their professional skills. They largely ranked analytical thinking and problem solving among their strongest skills. Conversely, they ranked advanced data analytics and IT skills among their weakest areas. In other words, the very skills firms are elevating are the ones many seasoned auditors feel least confident about.

In the next part of the experiment, we asked them to write about the importance of either their weakest or strongest skills. We then examined how auditors assessed client management’s proposed discount rate for fair value estimation purposes. Management’s proposed discount rate was aggressive, and the valuation specialist’s advice pointed out this aggressiveness equally in all conditions. The specialist always reached the same conclusion. The only thing we varied was whether the specialist’s reliance on AI when providing advice to auditors was higher or lower.

When auditors were first prompted to reflect on their weakest skills, which often included digital competencies, they became significantly more likely to discount quality advice that relied heavily on AI. They perceived the specialist as less competent and the advice as lower quality. This happened even though the specialist’s qualifications and conclusions were identical across conditions.

But when we asked auditors to reflect on their strongest professional skills before receiving the advice, something changed: The defensive discounting disappeared! In a second experiment, affirmed auditors actually weighed AI-reliant advice more heavily than those in a neutral condition. For us, this was the key insight. What looks like algorithm aversion may actually be self-protection.

When auditors feel that the profession is transforming in ways that place less spotlight on their traditional accounting expertise, they may experience what psychologists call “disaffirmation,” a threat to one’s self-regard. In that state, advice that leans heavily on AI triggers unhelpful defensiveness and is perceived as a signal that one’s own professional expertise is becoming obsolete. For experienced auditors, discounting AI-supported advice becomes a way to cope with the profession’s digital transformation.

This interpretation doesn’t dismiss legitimate concerns about AI: algorithmic bias, transparency and security, and the need for human oversight. We share these concerns.

But our findings suggest that some of the resistance to AI has deeper roots. It’s not simply distrust of the output—it’s discomfort with the idea that highly extolled digital expertise may outrank experienced auditors’ traditional accounting expertise.

How AI Aversion Can Undermine Audit Quality

Our results have important implications for audit quality. In our experimental setting, discounting high-quality specialist advice made auditors less skeptical and led to greater acceptance of aggressive management estimates. Defensive reactions didn’t strengthen professional skepticism—they weakened it.

If public trust in auditing is at risk, it may not be because auditors are too cautious about AI-supported expert advice. It may be because audit firms’ continued touting of newly relevant digital skills creates a state of disaffirmation in auditors, causing them to be less skeptical, denigrate the quality of experts’ AI-reliant advice, and perceive the experts as less competent. As a result, auditors are less inclined to incorporate AI-supported advice to reach an objective conclusion on complex accounting estimates.

How Audit Firms Can Mitigate AI Aversion

The encouraging news is that our study identified a simple intervention that helped: A brief, job-related self-affirmation exercise reduced auditors’ defensive reactions. When auditors reflected on their strongest professional skills, such as analytical reasoning or financial reporting expertise, they felt less threatened by AI-supported specialist advice and were more open to incorporating it when assessing their client’s estimate.

However, this doesn’t mean affirmation replaces training. The digital transformation of auditing is real, and auditors need ongoing development in AI-related skills to be successful. Our self-affirmation intervention isn’t a panacea for all of firms’ upskilling initiatives, but it’s an important first step toward addressing algorithm aversion.

If firms focus exclusively on what auditors lack, they risk amplifying feelings of inadequacy. If they pair skill development with reinforcement of the enduring value of traditional audit competencies, they may create a more receptive environment in which auditors are less defensive toward expert advice that relies heavily on AI.

In our view, the conversation about AI in auditing needs to broaden. It’s not just about standards, controls, and frameworks. It’s also about professionals’ liminal identity. Experienced auditors built their careers on professional judgment, skepticism, and tacit knowledge. Those strengths remain essential, as AI doesn’t replace these acquired skills—it extends them. When we recognize that reality and communicate it clearly, we reduce the threat that fuels auditors’ defensive behaviors.

Auditors don’t have to choose between their traditional accounting expertise and the emerging digital transformation. The future of auditing is the duality of both types of expertise. If we can attend to the human and psychological side of technological change, AI integration in auditing will likely be more successful.

 


Mark E. Peecher, Ph.D., is executive associate dean of faculty and research at the University of Illinois Urbana-Champaign. Christian P.R. Pietsch, Ph.D., is an assistant professor at Erasmus University Rotterdam. Sebastian Stirnkorb, Ph.D., is an assistant professor of accountancy at the University of Illinois Urbana-Champaign. Isaac L. Yamoah, Ph.D., is a Ph.D. student in accountancy at the University of Illinois Urbana-Champaign.

 


