Methods (to our Madness): How IT audits help keep your $$$ safe

Recently, the Secretary of State's Oregon Audits Division released an IT audit of GenTax, the software system that Oregon's Department of Revenue uses to process tax payments and returns. This month, I sat down to talk with Erika Ungern, an 18-year veteran of the Audits Division and the lead for the audit.

Why was the GenTax system selected for an audit?

A lot of the work we do on the IT team supports financial auditors. They need to know that the information they use for their audits is reliable. GenTax is a fairly new system – the Department of Revenue completed the last of four rollouts in November 2017 – so it was a good time to take a look.

What was the goal of this audit?

We were auditing to answer the question: Does the system do what it needs to do? That meant primarily looking to see if there are application controls in place so data remains complete, accurate, and valid during input, processing and output. In this case, GenTax is the software DOR uses to process tax returns and payments – which is something all taxpayers may be interested in.

What sort of criteria do you use to assess how well the controls are in place?

We currently use the Federal Information System Controls Audit Manual, or FISCAM. It’s a standard methodology for auditing information system controls in federal and other governmental entities. It provides guidance for evaluating the confidentiality, integrity, and availability of information systems. The information included in FISCAM ties back to National Institute of Standards and Technology (NIST) publications.

How did you go about gathering information?

This audit, like all IT audits, started with interviews and a review of agency policies and procedures. We need to know how agencies have implemented the technology and how staff are using it. We test different pieces of the technology depending on the answers we get. For instance, if we hear that the agency has specific controls in place, we'll test those controls. If they tell us they don't have controls, then that's our finding. For example, a lot of agencies don't have strong disaster recovery controls in place for IT systems. That was the case for this one. We check back on their progress in follow-up audits.

Was there anything unique about this audit?

It was somewhat unique in that we were looking at a system that DOR purchased, and both DOR and the vendor are actively involved in supporting the software. Agencies used to build their systems all in-house, and when we would do an audit, we would only talk to agency personnel. When we do an audit of purchased software, system changes are sometimes made exclusively by the vendor, and our audit questions focus on how the agency makes sure those changes are correct, since we are not auditing the vendor’s change management procedures. In this case, DOR and the vendor both make changes to the system, so we asked both agency and vendor personnel about their processes to ensure the changes were correct.

Another new thing was reporting some results that didn't hit the materiality threshold. This audit reported on a few things that affect only a small percentage of the returns the software processes, such as the fact that the software doesn't currently provide notification when a taxpayer's mistake in reporting withholding causes them to overpay taxes. These results may end up going hand in hand with the performance audit of DOR's culture that's going on right now.

Any other thoughts on auditing for IT auditors, or auditors in general?

You know, IT audits are like a lot of other audits. Getting good results is all about asking the right questions. You don’t always know what they are when you start, but do your best to figure them out!

Read the full audit HERE

Members of the audit team included:
Will Garber, CGFM, MPA, Deputy Director
Teresa Furnish, CISA, Audit Manager
Erika Ungern, CISSP, CISA, Principal Auditor
Sherry Kurk, CISA, Staff Auditor
Sheila Faulkner, Staff Auditor


Methods (to our Madness): Leveraging Administrative Data to Understand a Management Issue

Periodically, we will highlight some of the methods used in a recently released audit. Every performance audit is unique and can require creative thinking and methodologies to answer our audit objective. Some of these methods could be replicated or present valuable lessons for future projects.

Anyone who pays attention to the news or lives near a fire-prone area knows that Oregon's fire seasons have been extreme the past few years. But I sat down with Amelia Eveland, Senior Auditor, and Luis Sandoval, Staff Auditor, to learn about more than Oregon's formidable wildfires: how the team used data to understand workforce issues at the Department of Forestry, as described in the recently released audit, Department of Forestry: Actions Needed to Address Strain on Workforce and Programs from Wildfires.

Department of Forestry staff had described fire seasons in terms of acres burned and other fire activity measures, but hadn't put numbers to what they intuitively knew: large and frequent fires were affecting all of their programs. The team was able to quantify some of the impact of fires on department programs and staff by analyzing the actual hours worked by employees.

Don’t Overlook Administrative Data Sources

One of the things that I found most interesting was their data source: payroll data. Payroll data is collected for administrative purposes. But administrative data should not be overlooked as a source of information for management analysis. Payroll data provided the information that the team needed and was possibly more accurate than other data sources, since accuracy is important when people are being paid.

Understand Your Data and Its Limitations

Using a data source that is collected for another purpose can have downsides, though. The payroll data only went back six years and showed hours billed, not hours worked. The hours worked by some staff who weren't eligible for overtime weren't captured.

The team also had to understand the data and its parameters. To do this, they worked with the department's financial staff, who were familiar with it. They asked the department staff to pull the data and to check the team's methodology. In the course of this process, they eliminated pay codes that would double count hours. For example, if someone was working a night shift on a fire line, they could receive a pay differential (a supplemental payment) on top of their regular salary. Pay differential hours were logged separately from the hours logged for regular pay, despite applying to the same shift. Initially the team had been counting these hours twice, but working closely with the agency helped them pinpoint and correct potential methodological errors.
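
As a rough illustration of that kind of cleanup (not the team's actual code), here is a minimal sketch in Python with pandas, using made-up column names and pay codes:

```python
import pandas as pd

# Hypothetical payroll extract: one row per employee, shift, and pay code.
payroll = pd.DataFrame({
    "employee_id": [101, 101, 102],
    "pay_code":    ["REG", "NIGHT_DIFF", "OT"],  # made-up pay codes
    "hours":       [12.0, 12.0, 8.0],
})

# Differential codes duplicate hours already logged under regular or overtime
# pay for the same shift, so exclude them before totaling hours worked.
DUPLICATE_CODES = {"NIGHT_DIFF"}  # assumption: codes identified with agency finance staff

hours_worked = (
    payroll[~payroll["pay_code"].isin(DUPLICATE_CODES)]
    .groupby("employee_id")["hours"]
    .sum()
)
print(hours_worked)  # employee 101: 12.0 hours, not 24.0
```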

Putting Numbers to the Impacts on Staff and Programs

The team overcame these minor obstacles to conduct some pretty interesting analyses. They found that the past three fire seasons had been particularly demanding in terms of staff time, mostly due to regular and overtime hours from permanent employees (as shown in the graph below). This suggests that these employees may be pulled from other activities, and may also feel overworked.

 

[Chart: Payroll Hours Billed to Fire Protection by All Oregon Dept. of Forestry Employees]

 

The team was also able to get a more accurate picture of which programs were contributing to fighting fire through specialized incident management teams. Because many Forestry employees split their regular time between different programs (for example, someone may split their time 80/20 between Private Forests and Fire Protection), it can be hard to track which programs are being affected when that person goes out to fight a fire. The audit team totaled the regular hours billed to each program and used each program's share of that total to estimate how much each program was contributing.
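
One plausible way to read that allocation, sketched in Python with entirely hypothetical hours and program names:

```python
import pandas as pd

# Hypothetical regular (non-fire) hours one employee billed to each program.
regular_hours = pd.Series({"Private Forests": 800.0, "Fire Protection": 200.0})

# Share of the employee's regular time spent in each program.
program_share = regular_hours / regular_hours.sum()

# Hours that employee billed to fighting fire, allocated back to the
# programs that "lent" the person, in proportion to their regular time.
fire_hours = 120.0
allocated = program_share * fire_hours
print(allocated)  # Private Forests: 96.0, Fire Protection: 24.0
```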

Get the Power Pivot Add-in (so cool)

I asked the team for advice on using payroll data. They suggested manipulating the data as much as possible in the data query tool before exporting it for analysis. The team used Excel for the analysis, with the Power Pivot add-in to summarize the large quantity of data.
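
Outside of Excel, the same kind of summarization can be sketched with a pandas pivot table; the file name and column names below are assumptions, not the team's actual fields:

```python
import pandas as pd

# Hypothetical export from the payroll query tool.
payroll = pd.read_csv("payroll_extract.csv")

# Total hours by fiscal year and pay type -- the kind of summary the team
# built with Power Pivot to handle a large volume of rows.
summary = payroll.pivot_table(
    index="fiscal_year",
    columns="pay_type",   # e.g., regular vs. overtime
    values="hours",
    aggfunc="sum",
    fill_value=0,
)
print(summary)
```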


Methods (to our Madness): A 2 Minute Primer on IT Auditing, Through the Lens of an Employment Audit

Periodically, we will highlight some of the methods used in a recently released audit. Every performance audit is unique and can require creative thinking and methodologies to answer our audit objective. Some of these methods could be replicated or present valuable lessons for future projects.

Given some of Oregon's high-profile computer system failures, global IT security risks, and the age of some Oregon agencies' legacy computer systems, it is easy to see the importance of the Secretary of State's team of Information Technology (IT) auditors. But what exactly do IT auditors do?

Here are some lessons learned and basic steps taken in IT auditing that I learned from Erika Ungern, Principal Auditor, and Matthew Owens, Senior Auditor, in a conversation about their recently released IT audit, which found that computer programs for unemployment tax returns and claims at the Oregon Employment Department need attention.

When doing an IT audit, always test the data

In the Oregon Employment Department audit, the audit team followed a typical process for IT audits, including identifying the computer systems to evaluate, examining the process and expected controls of those systems, and testing the data to make sure that the systems were operating as intended.

When I asked the team if they always do the final step of testing the data, their faces lit up. (I’m not sure if it was due to the excitement of thinking about data or shock that I would even ask such a question). They replied in near unison that yes, you always have to test the data. Even if everything looks good on paper, the only way you can know if a system is working is to test it.

Compared to an ideal world, the Department’s computer systems fell short

COBIT and FISCAM are two criteria frameworks that describe an ideal world for government IT systems. IT auditors can measure a computer system against these frameworks to identify areas for improvement.

When IT auditors do this, they look at different points in the system and the controls that they would expect to find at each point. They look at the inputs. What is supposed to get into the system? They look at what the system does. How does it process or manipulate the data? And they look at the output. What happens at the end? Is there a report? Is the data transferred to another system? Or, as is the case here, is the output hundreds of millions of dollars in payments for unemployment claims?

At each point, they look for controls, or processes and checks built into the system or staff operations, that can prevent, detect or correct errors and ensure accuracy. For example, system “edits” are intended to ensure that unemployment insurance claims are not paid to recipients whose claim applications were denied.
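
To make the idea of an edit concrete, here is a hedged sketch, with hypothetical table and column names, of the kind of cross-check that would flag payments made on denied claims:

```python
import pandas as pd

# Hypothetical extracts: claim applications with their status, and payments issued.
applications = pd.DataFrame({
    "claim_id": [1, 2, 3],
    "status":   ["approved", "denied", "approved"],
})
payments = pd.DataFrame({
    "claim_id": [1, 2],
    "amount":   [450.00, 450.00],
})

# A preventive edit would block these payments up front; a detective check
# like this one flags any payment whose claim application was denied.
flagged = payments.merge(applications, on="claim_id")
flagged = flagged[flagged["status"] == "denied"]
print(flagged)  # the payment on claim 2 should not have gone out
```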

The audit team looked at two of the Department’s systems and found that they were set up to handle routine claims and to process most employer tax payments automatically. However, the systems were old. Changes were not well documented and workarounds had been developed. Sometimes the team had to look at the computer code to understand what was going on. Uncorrected system problems could lead to some tax returns bypassing automated checks or requiring manual verification. The team proceeded to the next step to test the data and find examples of cases that were bypassing the system.

Data testing created an example for the Department to replicate

Employers submit unemployment insurance tax return data in two ways: one at the detailed employee wage level and one at the summary payroll level. The audit team took these two data sources and performed various analyses. In one instance, the audit team recalculated taxable wages to identify employers who may have under-reported (or over-reported) taxable wages, which in turn led to under- or overpaying unemployment taxes. This analysis was so useful that the Department asked the audit team for a step-by-step explanation (see below) so that they could replicate it.

Finding million dollar issues now could save even more during a busy recession

Based on this analysis, the team found that nearly 2,000 employers had overpaid taxes by approximately $850,000 in 2014 and had not been notified. One non-profit overpaid by $17,000. They also found potentially $2.9 million in underpayments that had not been collected. While these amounts are a small portion of the overall tax collections, they could increase dramatically when unemployment increases, such as during a recession. Additionally, as evidenced by the non-profit example, missing these errors could have a large impact on small employers.

The Employment Department was not catching these discrepancies because staff were not reviewing the generated reports that may have helped them identify these issues.

Lessons learned: document as you go along

When I asked the team what lessons they had learned, they told me to document the steps you are taking as you do your data analysis. Hm, I think I have heard that advice before.

Breaking down the methodology

Here is a step-by-step look at how the team analyzed the data for incorrect unemployment insurance tax payments (a rough code sketch follows the list):

  1. The team took individual wage data and created a calculated field that recoded any amount of wages over $35,000 as $35,000 (since $35,000 was the taxable limit). Any value under $35,000 retained its original value.
  2. They summarized the data to get a total of calculated taxable wages for each employer.
  3. They filtered the table to show only taxable employers.
  4. The team then compared the taxable wages field with another field of payroll reported by employers. To do this, they created a new field that subtracted the taxable wages from the payroll field.
  5. They followed up on the results for any employer where the difference was greater than one dollar.
  6. They calculated a potential overpayment or underpayment of taxes using the employer’s assigned tax rate.
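
A minimal sketch of those steps in Python with pandas, using made-up wage records and an assumed tax rate, might look like this:

```python
import pandas as pd

TAXABLE_LIMIT = 35_000  # per-employee taxable wage limit cited in the audit

# Hypothetical detail-level wage records (one row per employee per employer)
# and summary-level payroll reported by taxable employers (step 3 -- filtering
# to taxable employers -- is assumed already done here).
wages = pd.DataFrame({
    "employer_id": ["A", "A", "B"],
    "wages":       [40_000.0, 20_000.0, 50_000.0],
})
reported = pd.DataFrame({
    "employer_id":      ["A", "B"],
    "reported_taxable": [55_000.0, 30_000.0],
    "tax_rate":         [0.024, 0.024],  # assumed rates, for illustration only
})

# Steps 1-2: cap each employee's wages at the taxable limit, then total by employer.
wages["taxable"] = wages["wages"].clip(upper=TAXABLE_LIMIT)
calculated = wages.groupby("employer_id", as_index=False)["taxable"].sum()

# Steps 4-6: compare calculated taxable wages with what employers reported,
# follow up on differences greater than a dollar, and estimate the tax impact.
merged = calculated.merge(reported, on="employer_id")
merged["difference"] = merged["reported_taxable"] - merged["taxable"]
followup = merged[merged["difference"].abs() > 1].copy()
followup["tax_difference"] = followup["difference"] * followup["tax_rate"]
print(followup)  # employer B under-reported $5,000 of taxable wages
```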

 

 


Caroline Zavitkovski, OAD Senior Performance Auditor, MPA


Methods (to our Madness): Using Data to Tell the Story of a Debt Problem

Periodically, we will highlight some of the methods used in a recently released audit. Every performance audit is unique and can require creative thinking and methodologies to answer our audit objective. Some of these methods could be replicated or present valuable lessons for future projects.

Sometimes it takes a number to get the point across. And sometimes it takes actually doing the work to show that the work can be done.

These were two of the big takeaways from a recent conversation I had with Jamie Ralls, Principal Auditor at the OAD and project lead for a recently released performance audit on debt collection: Oregon Needs Stronger Leadership, Sustained Focus to Improve Delinquent Debt Collection.

Vendor offset analysis showed potential savings of at least $750,000 a year

Jamie conducted an analysis for the audit on vendor offset. Vendor offset is when a state matches a list of debtors that owe the state money against a list of vendors that the state pays for services. Then, instead of paying the vendors, the state intercepts those payments and applies them to the debt. This is something that 40 other states do, but Oregon did not do at the time of the audit.

Jamie looked at what Oregon could have collected had it used vendor offset. The result: At least $750,000 a year.

Limitations in the data resulted in a cautious estimate

The $750,000 a year estimate was likely low considering that the list of debtors was incomplete from a statewide perspective. The Department of Revenue maintained the list and it did not include debt held at other agencies. Additionally, due to the complexity of the analysis, the team only calculated debt and payments by year. An ongoing monthly calculation would have produced a greater collection amount.

Lessons learned: document along the way

Jamie said that if she could go back she would have been better about documenting all of the steps she took in the analysis as she went along. She was so caught up in the excitement of the work that she did not always stop to document everything. She then had to go back and retrace some of her work.

Using data to tell the story of a debt problem

When I asked Jamie why the audit team did this specific analysis, she said that paying vendors who owe money to the state has been a long-standing problem. The Audits Division had first recommended vendor offset in 1997. However, in the past our office had only talked about it anecdotally.

Being able to show the extent of the problem through data analysis had a big impact. Actually going through the methodology also demonstrated that doing vendor offset was technically possible. During the course of the audit, in part due to testimony from the audit team showing this analysis, the Oregon Legislature passed Senate Bill 55. SB 55 requires the Oregon Department of Revenue to do a vendor offset.

Breaking down the methodology

Here is a step-by-step look at how Jamie analyzed vendor offset (a simplified code sketch follows the list):

  1. She took a list of approved vendors from the Oregon Department of Administrative Services and a list of debtors from the Oregon Department of Revenue. She matched the lists based on tax ID numbers. She found 9,140 debtors who were approved as vendors to receive payment from the state. These vendors owed a total of $67 million in debt.
  2. Next, she pulled queries in the Statewide Financial Management Application (SFMA) to find and export records of all payments to these vendors for the time period of 2011 to 2014.
  3. She then summarized the debt by each year and summarized the payments each year.
  4. She took the debt for the first year (2011) and subtracted the payments for the following year (2012). If a balance of debt remained, it was rolled over to the next year (2012) to create a rolling balance of debt.
  5. For each year, the amount of debt that could have been collected through payments in the following year was also calculated and rolled forward, to create a rolling balance of what the state could have collected.
  6. She computed $3 million in debt that could have been collected, or an average of $750,000 a year.
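
Here is a simplified sketch of that rolling-balance logic in Python; the yearly totals are invented for illustration and are not the audit's figures:

```python
# Hypothetical totals, in millions of dollars: new debt owed each year by
# debtors who were also approved vendors, and state payments made to those
# same vendors the following year.
new_debt = {2011: 20.0, 2012: 18.0, 2013: 15.0, 2014: 14.0}
payments = {2012: 0.8, 2013: 1.1, 2014: 0.7}

debt_balance = 0.0    # rolling balance of uncollected debt
collectible = 0.0     # rolling total the state could have offset

for year in sorted(new_debt):
    debt_balance += new_debt[year]
    # Debt still outstanding could have been offset against the payments
    # the state made to these vendors in the following year.
    offset = min(debt_balance, payments.get(year + 1, 0.0))
    collectible += offset
    debt_balance -= offset

print(f"Potential collections: ${collectible:.1f} million")
```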

 


Caroline Zavitkovski, OAD Senior Performance Auditor, MPA
