Most accountants, especially CPAs, seem smart on paper. Some even live up to it in person. Others are less, um, fortunate… So, it doesn’t come as a big surprise that even our profession’s own publication called us out for being dumb, or at least naïve when it comes to data analytics:
Accountants need to increase their competence in [predictive analytics and prescriptive analytics] to provide value to their organizations.
Sure, since it’s the Journal of Accountancy, you can expect it to be phrased in the nicest possible way. We all know what they were really implying.
And I agree, lots of accountants are on the struggle bus when it comes to keeping up with this type of technology. I think it starts with a general apathy toward learning IT topics in the first place, which I’ve written about previously when I talked about verifying the completeness and accuracy of system-generated reports:
In college, so many students write it off because “it’s not relevant” and, you know, that belief is only true if you end up working in a large firm and have the luxury of passing off the technical stuff to the IT assurance department. Not everyone’s that lucky!
Fear not, help is on the way, brought to you by your trusty standard setters.
Both the AICPA and PCAOB have been dabbling in data analytics for the last couple of years. The AICPA is working on a new Audit Data Analytics Guide, which will supersede the current Analytical Procedures guide. According to the AICPA:
The new guide will:
• replace the current AICPA Analytical Procedures Guide, but will carry forward much of the content from that guide
• discuss audit data analytics at a foundational level
• provide examples of how these tools and techniques can be integrated into the audit process
In addition, they have enlisted the assistance of Rutgers Business School, which might know what it’s doing, since it offers a variety of data analytics classes and certificates, including a Master of Business and Science in Analytics.
And earlier this month, Accounting Today quoted PCAOB member Jeanette Franzel about issuing standards on several blossoming areas of interest from data analytics to blockchain. Franzel seems on board with not only trying to nix old standards that might stifle audit innovation but also issuing new guidance.
She implied data analytics is the low-hanging fruit, maybe because it’s less technical on its face than blockchain and distributed ledger systems, and will likely be first out of the gate. And she left us with what we can expect a PCAOB standard to cover:
“When we say data analytics, what are we talking about? For different types of data analytics, that can mean different things in audit,” said Franzel. “What are outliers, and how much work do you have to do if you find outliers when you’re doing an analysis? How much analysis do you have to do for it to count as substantive testing versus a planning procedure or a risk assessment procedure? How much credit can an auditor take if they push a button on a computer and the computer spits out the results? Are you done with your audit? Or what else do you need to do?”
And if I may go out on a limb: what about also covering the requirements for testing the completeness and accuracy of the system-generated data used in the analysis? Just one misplaced WHERE clause in the query, and no amount of data wizardry will tell you what’s missing. For example, I’d like a list of all the journal entries, except the sketchy ones Bob was trying to sneak in without anyone noticing. I don’t need those.
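To make the misplaced-WHERE-clause point concrete, here’s a minimal sketch in Python of what a basic completeness check might look like. The journal entries, field names, and amounts are all made up for illustration; the idea is simply that you reconcile the extract back to its source before running any analytics on it:

```python
# Hypothetical general-ledger data: every journal entry for the period.
gl_entries = [
    {"je_id": 1, "amount": 500.00, "posted_by": "alice"},
    {"je_id": 2, "amount": -125.50, "posted_by": "bob"},
    {"je_id": 3, "amount": 980.25, "posted_by": "carol"},
    {"je_id": 4, "amount": 42.00, "posted_by": "bob"},
]

# The "misplaced WHERE clause": a filter that quietly drops Bob's entries.
extract = [je for je in gl_entries if je["posted_by"] != "bob"]

def reconcile(source, pulled):
    """Completeness check: compare record counts and dollar totals
    between the source population and the extract actually analyzed."""
    count_diff = len(source) - len(pulled)
    amount_diff = sum(je["amount"] for je in source) - sum(
        je["amount"] for je in pulled
    )
    return count_diff, amount_diff

count_diff, amount_diff = reconcile(gl_entries, extract)
print(f"Missing entries: {count_diff}, unexplained amount: {amount_diff:.2f}")
# Missing entries: 2, unexplained amount: -83.50
```

Two entries and -83.50 go unaccounted for, and the downstream analysis can’t flag what it never saw, which is why the reconciliation has to come before any of the clever stuff.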
The huge challenge is that rapid change will force our beloved standard setters to issue either generic rules that are not super helpful or rules that need to be updated every year or two. That isn’t lost on Franzel; she said, “[T]he problem with setting standards is it takes a very long time and—guess what—it might be outdated by the time we finally get around to it.”
So what are we left with? Nebulous standards that try to walk a fine line and keep it light: nothing too technical to weigh them down, a focus on standardizing terminology, and guidance on the bare minimum level of testing you have to do to still call it “clean” in your audit opinion.
We, as a profession, have to step it up and stop being lazy when it comes to learning about data analytics. We might even improve audit quality if we do it right.