Periodically, we will highlight some of the methods used in a recently released audit. Every performance audit is unique and can require creative thinking and methodologies to answer our audit objective. Some of these methods could be replicated or present valuable lessons for future projects.
Given some of Oregon’s high-profile computer system failures, global IT security risks, and the age of some Oregon agencies’ legacy computer systems, it is easy to see the importance of the Secretary of State’s team of Information Technology (IT) auditors. But what exactly do IT auditors do?
Here are some lessons learned and basic steps taken in IT auditing that I learned from Erika Ungern, Principal Auditor, and Matthew Owens, Senior Auditor, in a conversation about their recently released IT audit, which found that computer programs for unemployment tax returns and claims at the Oregon Employment Department need attention.
When doing an IT audit, always test the data
In the Oregon Employment Department audit, the audit team followed a typical process for IT audits, including identifying the computer systems to evaluate, examining the process and expected controls of those systems, and testing the data to make sure that the systems were operating as intended.
When I asked the team if they always do the final step of testing the data, their faces lit up. (I’m not sure if it was due to the excitement of thinking about data or shock that I would even ask such a question). They replied in near unison that yes, you always have to test the data. Even if everything looks good on paper, the only way you can know if a system is working is to test it.
Compared to an ideal world, the Department’s computer systems fell short
COBIT and FISCAM are two criteria frameworks that describe an ideal world for government IT systems. IT auditors can measure a computer system against these frameworks to identify areas for improvement.
When IT auditors do this, they look at different points in the system and the controls that they would expect to find at each point. They look at the inputs. What is supposed to get into the system? They look at what the system does. How does it process or manipulate the data? And they look at the output. What happens at the end? Is there a report? Is the data transferred to another system? Or, as is the case here, is the output hundreds of millions of dollars in payments for unemployment claims?
At each point, they look for controls, or processes and checks built into the system or staff operations, that can prevent, detect or correct errors and ensure accuracy. For example, system “edits” are intended to ensure that unemployment insurance claims are not paid to recipients whose claim applications were denied.
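A preventive edit like the one described above can be pictured as a simple status check before payment. The sketch below is purely illustrative; the function name, claim fields, and statuses are my own assumptions, not the Department's actual system logic.

```python
# Hypothetical sketch of a preventive system "edit": refuse to pay any
# claim whose application was not approved. All names and statuses here
# are illustrative assumptions, not the Department's actual code.
def can_pay_claim(claim):
    """Return True only when the claim application was approved."""
    return claim.get("application_status") == "approved"

claims = [
    {"id": 1, "application_status": "approved"},
    {"id": 2, "application_status": "denied"},
]

# Only approved claims make it through to payment.
payable_ids = [c["id"] for c in claims if can_pay_claim(c)]
```

A check like this is "preventive" because it stops the error before money goes out; the audit frameworks also look for detective controls (reports that flag errors after the fact) and corrective ones.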
The audit team looked at two of the Department’s systems and found that they were set up to handle routine claims and to process most employer tax payments automatically. However, the systems were old. Changes were not well documented and workarounds had been developed. Sometimes the team had to look at the computer code to understand what was going on. Uncorrected system problems could lead to some tax returns bypassing automated checks or requiring manual verification. The team proceeded to the next step to test the data and find examples of cases that were bypassing the system.
Data testing created an example for the Department to replicate
Employers submit unemployment insurance tax return data in two ways, one at the detailed employee wage level and one at the summary payroll level. The audit team took these two data sources and performed various analyses. In one instance, the audit team recalculated taxable wages to identify employers who may have under-reported (or over-reported) taxable wages, which in turn led to under or overpaying unemployment taxes. This analysis was so useful that the Department asked the audit team for a step-by-step explanation (see below) so that they could replicate it.
Finding million dollar issues now could save even more during a busy recession
Based on this analysis, the team found that nearly 2,000 employers had overpaid taxes by approximately $850,000 in 2014 and had not been notified. One non-profit overpaid by $17,000. They also found potentially $2.9 million in underpayments that had not been collected. While these amounts are a small portion of the overall tax collections, they could increase dramatically when unemployment increases, such as during a recession. Additionally, as evidenced by the non-profit example, missing these errors could have a large impact on small employers.
The Employment Department was not catching these discrepancies because it was not reviewing generated reports that may have helped it identify these issues.
Lessons learned: document as you go along
When I asked the team what lessons they had learned, they told me to document the steps you are taking as you do your data analysis. Hm, I think I have heard that advice before.
Breaking down the methodology
Here is a step-by-step look at how the team analyzed the data for incorrect unemployment insurance tax payments:
- The team took individual wage data and created a calculated field that capped wages at the taxable limit: any amount over $35,000 was recorded as $35,000 (since $35,000 was the taxable limit), and any value under $35,000 retained its original value.
- They summarized the data to get a total of calculated taxable wages for each employer.
- They filtered the table to show only taxable employers.
- The team then compared the taxable wages field with another field of payroll reported by employers. To do this, they created a new field that subtracted the taxable wages from the payroll field.
- They followed up on the results for any employer where the difference was greater than one dollar.
- They calculated a potential overpayment or underpayment of taxes using the employer’s assigned tax rate.
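The steps above can be sketched in a few lines of code. This is my own illustrative reconstruction, not the audit team's actual analysis: the employer records, field names, and tax rate are made up, while the $35,000 taxable limit and the one-dollar threshold come from the steps described.

```python
# Illustrative sketch of the recalculation, using made-up employer records.
# The $35,000 taxable wage limit and the >$1 difference threshold are from
# the methodology above; everything else is a hypothetical example.
TAXABLE_WAGE_LIMIT = 35_000

# Step 1 input: individual wage records as (employer_id, employee_wages)
wage_records = [
    ("E1", 50_000), ("E1", 20_000),   # E1 has two employees
    ("E2", 30_000),                    # E2 has one employee
]

# Summary payroll each employer reported, plus taxable status and tax rate
employers = {
    "E1": {"reported_taxable_payroll": 60_000, "taxable": True, "tax_rate": 0.024},
    "E2": {"reported_taxable_payroll": 30_000, "taxable": True, "tax_rate": 0.024},
}

# Steps 1-2: cap each employee's wages at the limit, then total by employer
calculated_taxable = {}
for employer_id, wages in wage_records:
    capped = min(wages, TAXABLE_WAGE_LIMIT)
    calculated_taxable[employer_id] = calculated_taxable.get(employer_id, 0) + capped

# Steps 3-6: for taxable employers only, subtract the recalculated taxable
# wages from the reported payroll, flag differences over one dollar, and
# estimate the tax over- or underpayment at the employer's tax rate
flagged = {}
for employer_id, info in employers.items():
    if not info["taxable"]:
        continue
    difference = info["reported_taxable_payroll"] - calculated_taxable[employer_id]
    if abs(difference) > 1:
        # Positive difference: employer over-reported (potential overpayment);
        # negative: under-reported (potential underpayment)
        flagged[employer_id] = difference * info["tax_rate"]
```

In this toy data, E1 reported $60,000 of taxable payroll but the recalculation (capping one employee's $50,000 at $35,000) yields $55,000, so E1 is flagged with a potential overpayment, while E2's figures match and it is not flagged.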