Recently, the Oregon Secretary of State’s Audits Division released an IT audit of GenTax, the software system that Oregon’s Department of Revenue uses to process tax payments and returns. This month, I sat down to talk with Erika Ungern, an 18-year veteran of the Audits Division and the lead for the audit.

Why was the GenTax system selected for an audit?

A lot of the work we do on the IT team supports financial auditors. They need to know that the information they use for their audits is reliable. GenTax is a fairly new system – the Department of Revenue completed the last of four rollouts in November 2017 – so it was a good time to take a look.

What was the goal of this audit?

We were auditing to answer the question: does the system do what it needs to do? That meant primarily checking whether application controls are in place so that data remains complete, accurate, and valid during input, processing, and output. In this case, GenTax is the software DOR uses to process tax returns and payments – which is something all taxpayers may be interested in.

What sort of criteria do you use to assess how well the controls are in place?

We currently use the Federal Information System Controls Audit Manual, or FISCAM. It’s a standard methodology for auditing information system controls in federal and other governmental entities. It provides guidance for evaluating the confidentiality, integrity, and availability of information systems. The information included in FISCAM ties back to National Institute of Standards and Technology (NIST) publications.

How did you go about gathering information?

This audit, like all IT audits, started with interviews and a review of agency policies and procedures. We need to know how agencies have implemented the technology and how staff are using it. We test different pieces of the technology depending on the answers we get. For instance, if we hear that the agency has specific controls in place, we’ll test those controls. If they tell us they don’t have controls, then that’s our finding. For example, a lot of agencies don’t have strong disaster recovery controls in place for their IT systems. That was the case here. We check back on their progress in follow-up audits.

Was there anything unique about this audit?

It was somewhat unusual in that we were looking at a system DOR purchased, and both DOR and the vendor are actively involved in supporting the software. Agencies used to build their systems entirely in-house, and when we did an audit, we would only talk to agency personnel. When we audit purchased software, system changes are sometimes made exclusively by the vendor, and our audit questions focus on how the agency makes sure those changes are correct, since we are not auditing the vendor’s change management procedures. In this case, DOR and the vendor both make changes to the system, so we asked both agency and vendor personnel about their processes for ensuring the changes were correct.

Another new thing was reporting some results that didn’t hit the materiality threshold. This audit reported on a few issues that only affect a small percentage of the returns the software processes, like the fact that the software doesn’t currently provide notification when taxpayers make a mistake in reporting withholding on their returns that causes them to overpay taxes. These results may end up going hand in hand with the performance audit of DOR’s culture that’s going on right now.

Any other thoughts on auditing for IT auditors, or auditors in general?

You know, IT audits are like a lot of other audits. Getting good results is all about asking the right questions. You don’t always know what they are when you start, but do your best to figure them out!

Read the full audit HERE

Members of the audit team included:
Will Garber, CGFM, MPA, Deputy Director
Teresa Furnish, CISA, Audit Manager
Erika Ungern, CISSP, CISA, Principal Auditor
Sherry Kurk, CISA, Staff Auditor
Sheila Faulkner, Staff Auditor