Is Your Audit Analytics Program Actually Good?
Tax & Accounting | Finance | Compliance | December 05, 2019

Is your internal audit department's analytics program actually good?

Do you ever wonder if your audit analytics program is any good? And, if so, how do you know? Do you even know what a “good” audit analytics program looks like? And how would you know if you are improving?

If you’ve read much on this topic you’ve probably seen some variation of an analytics maturity model, perhaps like this one from ISACA (2011), where the first level is often labelled “Ad Hoc”. At the other end, these models often use “Continuous Monitoring” to describe mature audit analytics programs.

Is that really what maturity models are for?

These maturity models usually have four or five levels. While I agree that most organizations tend to follow this path as their analytics program matures, I don’t think it is a valid mechanism for measuring how “good” your analytics program is. Nor do I think the original creators of these models really intended them to be used that way.

As I talk to some of our customers who are really trying to improve, they often tell me that they measure two things: the depth of analytics coverage and the breadth of analytics coverage. As one customer told me, “That which gets measured gets done, right?”

Breadth of coverage

So, let’s start with the easy one: breadth of coverage. Measure it simply by counting the number of audits that contain analytics (and, no, sampling doesn’t count!). Most people’s first instinct is to count only the audits that used a dedicated analytics tool. However, for an audit to count as including analytics, you don’t have to have used your analytics tool, or done anything especially complex. If your audit requires you to sort a spreadsheet and look at the ten largest transactions (something every auditor can do using just Excel), it counts.
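The counting described above can be sketched in a few lines. This is a minimal illustration, not a prescribed method; the audit names and the `used_analytics` field are hypothetical stand-ins for however your department tracks its audits.

```python
# Hypothetical audit records; in practice these would come from your
# audit management system. The field names here are illustrative only.
audits = [
    {"name": "AP Audit", "used_analytics": True},
    {"name": "Payroll Audit", "used_analytics": False},
    {"name": "T&E Audit", "used_analytics": True},
]

# Breadth of coverage: the share of audits that included any analytics,
# whether from a dedicated tool or a simple Excel sort.
with_analytics = sum(1 for a in audits if a["used_analytics"])
breadth_pct = 100 * with_analytics / len(audits)
print(f"Breadth of coverage: {breadth_pct:.0f}% "
      f"({with_analytics} of {len(audits)} audits)")
```

Tracking this percentage period over period gives you a simple trend line for whether analytics use is spreading across the audit plan.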

Depth of coverage

That takes us right into depth of coverage, which is more difficult to calculate. Ideally, for each audit you would document the analytics you would like to do to address the risk or the audit objective, and then count how many of them you are actually doing. Creating this list is a difficult task for most organizations.

John Fitzpatrick, Second Vice President at Guardian Life Insurance Company, recognizes this challenge but still wants to measure the depth of Guardian’s analytics on their audits. They’ve come up with a scale against which they evaluate all their audits. They evaluate depth using the following.

At Guardian Life, data analytics were used to:

  • Develop financial/statistical parameters related to the process and transactions being audited
  • Derive insight into the population of activity to develop targeted sampling and testing approaches
  • Test 100% of the population or test for specific exception criteria
  • Perform automated comparisons or reconciliations of multiple datasets for completeness and accuracy or to identify exceptions
  • Develop recommendations specifically based on trends or indicators in the data

This evaluation method gives them a measure that they can then use to evaluate areas where they may want to go deeper, or where they can evaluate if they are improving over time.
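A scale like this lends itself to a simple per-audit score: count how many of the five uses applied. The sketch below is one possible way to operationalize such a scale, assuming each audit is tagged with flags for the criteria it satisfied; the criterion keys and the sample audit are hypothetical, not Guardian’s actual data.

```python
# Short labels paraphrasing the five depth criteria from the scale above.
DEPTH_CRITERIA = [
    "parameters",       # financial/statistical parameters for the process
    "insight",          # population insight for targeted sampling/testing
    "full_population",  # 100% population or specific exception testing
    "reconciliation",   # automated comparisons/reconciliations of datasets
    "recommendations",  # recommendations based on trends in the data
]

def depth_score(audit: dict) -> int:
    """Count how many of the five depth criteria an audit satisfied (0-5)."""
    return sum(1 for c in DEPTH_CRITERIA if audit.get(c, False))

# Illustrative example: an audit that used targeted sampling and
# full-population testing scores 2 out of 5.
audit = {"name": "Claims Audit", "insight": True, "full_population": True}
print(depth_score(audit))
```

Averaging these scores across the audit plan, or comparing them year over year, gives the kind of measure described above for spotting where you may want to go deeper.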

If you are serious about improving your use of analytics in your audits, start by figuring out how to measure your depth and breadth of coverage.


Ken Petersen
Associate Director, Product Management
Ken has over 25 years of experience in developing and implementing systems and working with data in a variety of capacities while working for both Fortune 500 and entrepreneurial software development companies. Since 2002 Ken’s focus has been on the Governance, Risk, and Compliance space helping numerous customers across multiple industries implement software solutions to satisfy various compliance needs including audit and SOX.

