Methods (to our Madness): Leveraging Administrative Data to Understand a Management Issue

Periodically, we will highlight some of the methods used in a recently released audit. Every performance audit is unique and can require creative thinking and methodologies to answer our audit objective. Some of these methods could be replicated or present valuable lessons for future projects.

Anyone who pays attention to the news or lives near a fire-prone area knows that Oregon's fire seasons have been extreme the past few years. But I sat down with Amelia Eveland, Senior Auditor, and Luis Sandoval, Staff Auditor, to learn about more than Oregon's formidable wildfires: how the team used data to understand workforce issues at the Department of Forestry, as described in the recently released audit, Department of Forestry: Actions Needed to Address Strain on Workforce and Programs from Wildfires.

Department of Forestry staff had described fire seasons in terms of acres burned and other fire activity measures, but hadn't put numbers to what they intuitively knew: that large and frequent fires were affecting all of their programs. The team was able to quantify some of the impact of fires on department programs and staff by analyzing the actual hours worked by employees.

Don’t Overlook Administrative Data Sources

One of the things that I found most interesting was their data source: payroll data. Payroll data is collected for administrative purposes. But administrative data should not be overlooked as a source of information for management analysis. Payroll data provided the information that the team needed and was possibly more accurate than other data sources, since accuracy is important when people are being paid.

Understand Your Data and Its Limitations

Using a data source that was collected for another purpose can have downsides, though. The payroll data only went back six years and showed hours billed, not hours worked; as a result, the hours worked by some staff who weren't eligible for overtime weren't captured.

The team also had to understand the data and parameters. To do this they worked with the department’s financial staff who were familiar with it. They asked the department staff to pull the data and to check the team’s methodology. In the course of this process, they eliminated pay codes that would double count hours. For example, if someone was working a night shift on a fire line, they could receive pay differential (a supplemental payment) on top of their regular salary. Pay differential hours were logged separately from the hours logged for regular pay, despite applying to the same shift. Initially the team had been counting these hours twice, but working closely with the agency helped them pinpoint and correct potential methodological errors.
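As a rough illustration of that cleanup step, here is a minimal sketch in Python. The pay codes and column names are hypothetical; the team did this work in the department's query tool, not in code.

```python
import pandas as pd

# Hypothetical payroll extract: one row per employee, pay period, and pay code.
payroll = pd.DataFrame({
    "employee_id": [101, 101, 102],
    "pay_code": ["REG", "DIFF", "OT"],  # DIFF = pay differential (supplemental)
    "hours": [40, 8, 12],
})

# Pay differential hours duplicate hours already logged under regular pay,
# so drop those codes before summing to avoid double counting.
supplemental_codes = {"DIFF"}
deduplicated = payroll[~payroll["pay_code"].isin(supplemental_codes)]

total_hours = deduplicated.groupby("employee_id")["hours"].sum()
print(total_hours)
```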

Putting Numbers to the Impacts on Staff and Programs

The team overcame these minor obstacles to conduct some pretty interesting analyses. They found that the past three fire seasons had been particularly demanding in terms of staff time, mostly due to regular and overtime hours from permanent employees (as shown in the graph below). This suggests that these employees may be pulled from other activities, and may also feel overworked.

 

[Figure: Payroll Hours Billed to Fire Protection by All Oregon Dept. of Forestry Employees]

 

The team was also able to get a more accurate picture of which programs were contributing to fighting fire through specialized incident management teams. Because many Forestry employees split their regular time between different programs (for example, someone may split their time 80/20 between Private Forests and Fire Protection), it can be hard to track which programs are affected when that person goes out to fight a fire. The audit team totaled the regular hours billed to each program and used each program's share of that total to estimate the proportion of firefighting contributed by each program.
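Here is a minimal sketch of that allocation logic in Python, using the 80/20 example above. The numbers are made up, and the team worked in Excel rather than code.

```python
import pandas as pd

# Hypothetical regular (non-fire) hours billed to each program.
regular_hours = pd.Series({"Private Forests": 800, "Fire Protection": 200})

# Each program's share of regular hours...
shares = regular_hours / regular_hours.sum()  # 0.8 and 0.2

# ...is used to attribute fire hours back to the contributing programs.
fire_hours = 150
attributed = shares * fire_hours
print(attributed)  # Private Forests: 120.0, Fire Protection: 30.0
```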

Get the Power Pivot Add-in (so cool)

I asked the team for advice on using payroll data. They suggested manipulating the data as much as possible in the data query tool before exporting it for analysis. The team did its analysis in Excel, using the Power Pivot add-in to summarize the large quantity of data.
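For readers who work outside Excel, the same kind of summarization can be sketched in Python with a pivot table. The file name and column names below are assumptions, not the team's actual data.

```python
import pandas as pd

# Hypothetical extract, already trimmed down as far as possible in the query tool.
payroll = pd.read_csv("payroll_extract.csv")  # assumed columns: year, program, hours

# Pivot-table-style summary: total hours by program and year.
summary = payroll.pivot_table(index="program", columns="year",
                              values="hours", aggfunc="sum")
print(summary)
```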


Stephanie Evergreen ReBlog: Timelines, 4 Ways

 

The least helpful timelines I’ve ever seen are these:

[Image: examples of timelines formatted as plain bulleted lists]

where time is basically bulleted, as if each of these intervals is equidistant and as if a bunch of text is the best way to communicate something inherently not based in narrative. You are basically saying, this journey is going to be really confusing because I’m not willing to help you see what’s really going to happen when. Buckle up! And that’s probably not the vibe you’re trying to give off.

Read more about interesting alternatives to developing timelines (useful in reports! useful in project management! useful, all around!) over at Stephanie Evergreen’s humorous and enlightening blog.


TeamMate Solutions ReBlog: 5 critical steps to applying data analytics

Consistently seen across available literature are five common steps to applying data analytics:

  1. Define Your Objective
  2. Understand Your Data Source
  3. Prepare Your Data
  4. Analyze Data
  5. Report on Results

Each of the steps is critical, and each step has challenges. Understanding and overcoming the challenges requires a deeper look into each step…

Toby DeRoche, a Market Consultant with TeamMate, outlines how auditors can think more deeply about the five steps to effectively analyzing and reporting on data. Read more here!


Harvard Business Review RePost: The data-driven case for vacation

We love reading blogs on productivity research, but one of our running jokes is how each article gives you a list of three more things you have to be doing to be successful. You read three articles and you now have nine more “to-dos” each day. But the conclusion of this article and all of our research isn’t complicated: Go on vacation. If you take all your vacation days and plan ahead for trips, you will increase your happiness, success rate, and likelihood of promotion, and you’ll lower your stress level to boot.

So here’s one easy takeaway for you: Go away.

Break out the sunscreen, plan out your road trip, decide between a relaxing beach holiday or an invigorating week hiking through one of our gorgeous national parks; it doesn't matter! It's time to get out of the office. It's summertime, and that means it's time to take a vacation!

Or rather, that's what it meant in 1978. Unfortunately for many working Americans, vacation time seems to be slowly going the way of the dodo and the passenger pigeon. American workers are taking about one-third less vacation time than they did in the 70s, 80s, and 90s, and the decline shows few signs of reversing course. Our more technologically advanced and uber-efficient workplaces have not translated into more free time.

Shawn Achor and Michelle Gielan over at Harvard Business Review lay out the case for taking that free time, for your own personal good as well as the good of your business or place of work. Of course, not all vacations are created equal. To really get the full benefit of time off, HBR recommends:

…if you plan ahead, create social connections on the trip, go far from your work, and feel safe, 94% of vacations have a good ROI in terms of your energy and outlook upon returning to work. Just make sure you plan the trip at least a month in advance, as one of the key predictors of vacation ROI is the amount of stress caused by not planning ahead….

Read more here!

Interested in other Harvard Business Review offerings? Check it out here!


Stats in Auditing: Retirement, income replacement rates, and non-linear relationships

Are you saving enough for retirement? Maybe you have given this question some serious thought. Then again, maybe not… According to research cited in a recent GAO audit report on retirement security, a large portion of workers in the United States have not done a whole lot in terms of retirement planning and many people don’t know how much they should be saving.


Methods (to our Madness): A 2 Minute Primer on IT Auditing, Through the Lens of an Employment Audit

Periodically, we will highlight some of the methods used in a recently released audit. Every performance audit is unique and can require creative thinking and methodologies to answer our audit objective. Some of these methods could be replicated or present valuable lessons for future projects.

Given some of Oregon's high-profile computer system failures, global IT security risks, and the age of some Oregon agencies' legacy computer systems, it is easy to see the importance of the Secretary of State's team of Information Technology (IT) auditors. But what exactly do IT auditors do?

Here are some lessons learned and basic steps taken in IT auditing that I learned from Erika Ungern, Principal Auditor, and Matthew Owens, Senior Auditor, in a conversation about their recently released IT audit, which found that computer programs for unemployment tax returns and claims at the Oregon Employment Department need attention.

When doing an IT audit, always test the data

In the Oregon Employment Department audit, the audit team followed a typical process for IT audits, including identifying the computer systems to evaluate, examining the process and expected controls of those systems, and testing the data to make sure that the systems were operating as intended.

When I asked the team if they always do the final step of testing the data, their faces lit up. (I’m not sure if it was due to the excitement of thinking about data or shock that I would even ask such a question). They replied in near unison that yes, you always have to test the data. Even if everything looks good on paper, the only way you can know if a system is working is to test it.

Compared to an ideal world, the Department’s computer systems fell short

COBIT and FISCAM are two criteria frameworks that describe an ideal world for government IT systems. IT auditors can measure a computer system against these frameworks to identify areas for improvement.

When IT auditors do this, they look at different points in the system and the controls that they would expect to find at each point. They look at the inputs. What is supposed to get into the system? They look at what the system does. How does it process or manipulate the data? And they look at the output. What happens at the end? Is there a report? Is the data transferred to another system? Or, as is the case here, is the output hundreds of millions of dollars in payments for unemployment claims?

At each point, they look for controls, or processes and checks built into the system or staff operations, that can prevent, detect or correct errors and ensure accuracy. For example, system “edits” are intended to ensure that unemployment insurance claims are not paid to recipients whose claim applications were denied.
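In code terms, an edit of that kind boils down to a conditional check applied before payment. Here is a minimal, hypothetical sketch in Python; the department's actual systems are older and far more complex than this.

```python
# Hypothetical claim records; the "edit" checks status before paying.
claims = [
    {"claim_id": 1, "status": "approved", "weekly_benefit": 450.00},
    {"claim_id": 2, "status": "denied", "weekly_benefit": 450.00},
]

def payable(claim):
    # Preventive control: never pay a claim whose application was denied.
    return claim["status"] == "approved"

for claim in claims:
    if payable(claim):
        print(f"Pay claim {claim['claim_id']}: ${claim['weekly_benefit']:.2f}")
    else:
        print(f"Block claim {claim['claim_id']}: application denied")
```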

The audit team looked at two of the Department’s systems and found that they were set up to handle routine claims and to process most employer tax payments automatically. However, the systems were old. Changes were not well documented and workarounds had been developed. Sometimes the team had to look at the computer code to understand what was going on. Uncorrected system problems could lead to some tax returns bypassing automated checks or requiring manual verification. The team proceeded to the next step to test the data and find examples of cases that were bypassing the system.

Data testing created an example for the Department to replicate

Employers submit unemployment insurance tax return data in two ways, one at the detailed employee wage level and one at the summary payroll level. The audit team took these two data sources and performed various analyses. In one instance, the audit team recalculated taxable wages to identify employers who may have under-reported (or over-reported) taxable wages, which in turn led to under or overpaying unemployment taxes. This analysis was so useful that the Department asked the audit team for a step-by-step explanation (see below) so that they could replicate it.

Finding million dollar issues now could save even more during a busy recession

Based on this analysis, the team found that nearly 2,000 employers had overpaid taxes by approximately $850,000 in 2014 and had not been notified. One non-profit overpaid by $17,000. They also found potentially $2.9 million in underpayments that had not been collected. While these amounts are a small portion of the overall tax collections, they could increase dramatically when unemployment increases, such as during a recession. Additionally, as evidenced by the non-profit example, missing these errors could have a large impact on small employers.

The Employment Department was not catching these discrepancies because staff were not reviewing the system-generated reports that could have helped them identify these issues.

Lessons learned: document as you go along

When I asked the team what lessons they had learned, they told me to document the steps you are taking as you do your data analysis. Hm, I think I have heard that advice before.

Breaking down the methodology

Here is a step-by-step look at how the team analyzed the data for incorrect unemployment insurance tax payments (a code sketch of these steps follows the list):

  1. The team took individual wage data and created a calculated field that recoded any wages over $35,000 as $35,000 (since $35,000 was the taxable limit). Any value under $35,000 retained its original value.
  2. They summarized the data to get a total of calculated taxable wages for each employer.
  3. They filtered the table to show only taxable employers.
  4. The team then compared the taxable wages field with another field of payroll reported by employers. To do this, they created a new field that subtracted the taxable wages from the payroll field.
  5. They followed up on the results for any employer where the difference was greater than one dollar.
  6. They calculated a potential overpayment or underpayment of taxes using the employer’s assigned tax rate.
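
For those who prefer code to prose, here is a rough sketch of those six steps in Python. The file layouts, column names, and the `is_taxable` flag are assumptions; the team's actual analysis was done in Excel.

```python
import pandas as pd

TAXABLE_LIMIT = 35_000  # the taxable wage limit cited in the audit

# Hypothetical extracts; real layouts and column names are assumptions.
wages = pd.read_csv("employee_wages.csv")        # employer_id, wages
employers = pd.read_csv("employer_reports.csv")  # employer_id, reported_taxable_payroll, tax_rate, is_taxable

# Step 1: cap each employee's wages at the taxable limit.
wages["taxable_wages"] = wages["wages"].clip(upper=TAXABLE_LIMIT)

# Step 2: total the calculated taxable wages for each employer.
calc = wages.groupby("employer_id", as_index=False)["taxable_wages"].sum()

# Step 3: keep only taxable employers.
merged = calc.merge(employers, on="employer_id")
merged = merged[merged["is_taxable"]]

# Step 4: subtract calculated taxable wages from the payroll employers reported.
merged["difference"] = merged["reported_taxable_payroll"] - merged["taxable_wages"]

# Step 5: follow up wherever the difference exceeds one dollar in either direction.
flagged = merged[merged["difference"].abs() > 1].copy()

# Step 6: estimate the potential over- or underpayment at each employer's tax rate.
flagged["potential_tax_error"] = flagged["difference"] * flagged["tax_rate"]
print(flagged[["employer_id", "difference", "potential_tax_error"]])
```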

 

 


Caroline Zavitkovski, OAD Senior Performance Auditor, MPA


How To: Use Excel’s Named Cell Range Feature

Developing formulas can be tricky when you are dealing with Excel's built-in cell references (e.g. tabname!$J$5:$J$356). You may know where in the spreadsheet this information is and what it describes, but you always need to spend time going back and forth between tabs to pull the reference in, or to copy it from another cell. That can take a lot of valuable time.

Wouldn't it be great if you could just call data by its own name, such as "PayrollHours" or "AR" for accounts receivable? Well, you can. Formulas don't need to be a tangle of raw references like tabname!$J$5:$J$356.

The first step is accessing Excel's named range feature, and as with most things in Excel, there are a couple of ways to do this. The easiest is to highlight the cell or range you want to name. Then look to the top left of the spreadsheet and you should see a little box with a cell reference in it. It is just to the left of your formula bar and looks like this:

[Screenshot: the Name Box, just left of the formula bar, showing the reference V8]

Click in the box (shown as V8 above) and rename that to anything you want. For example, I renamed this reference “Apple”. Now in the formula bar you can type “Apple” anywhere you would normally put a cell reference and Excel will know exactly what data to pull in there.

Say you are a financial auditor and want to conduct a ratio analysis. You can go through the balance sheets and various financial documents and rename all the important cells to the terms you are used to. For example, call the cell that contains current assets “Assets” and current liabilities “Liabilities”. Now in Excel you can type in the current ratio by just entering “=assets/liabilities”.

The other way of naming references is to use the Name Manager on the Formulas tab of the ribbon.

[Screenshot: the Name Manager on the Formulas tab]

Click on New to create a new name. Enter the name and whether it applies to the whole workbook or just a single tab. You can also add a comment to document what it is you are defining.

[Screenshot: the New Name dialog]

At the bottom, you can adjust which individual cell or range you are referencing. Click the little box with the arrow to select it manually.

Now you have the tools available to change your formulas to look like this:

[Screenshot: a formula rewritten with named ranges]
The formula is now much easier to understand. It is summing (adding) all payroll hours. The formula breaks out the totals by location, classification, and pay code. To learn more, check out this website.

Ian Green, CGAP and OAD Senior Auditor


Enjoy reading our ‘how to’ posts? Subscribe to the Oregon Audits blog through the ‘Follow’ link in the right hand sidebar and keep tabs on the latest data wonk tips and tricks, audit releases, interviews, and more!
