Periodically, we will highlight some of the methods used in a recently released audit. Every performance audit is unique and can require creative thinking and methodologies to answer our audit objective. Some of these methods could be replicated or present valuable lessons for future projects.

Anyone who pays attention to the news or lives near a fire-prone area knows that Oregon’s fire seasons have been extreme the past few years. But I sat down with Amelia Eveland, Senior Auditor, and Luis Sandoval, Staff Auditor, to learn about more than Oregon’s formidable wildfires: how the team used data to understand workforce issues at the Department of Forestry, as described in the recently released audit, Department of Forestry: Actions Needed to Address Strain on Workforce and Programs from Wildfires.

Department of Forestry staff had described fire seasons in terms of acres burned and other fire activity measures, but hadn’t put numbers to what they intuitively knew: that large and frequent fires were affecting all of their programs. The team was able to quantify some of the impact of fires on department programs and staff by analyzing the actual hours worked by employees.

Don’t Overlook Administrative Data Sources

One of the things I found most interesting was their data source: payroll data. Payroll data is collected for administrative purposes, but administrative data shouldn’t be overlooked as a source of information for management analysis. Payroll data provided the information the team needed and was possibly more accurate than other data sources, since accuracy matters when people are being paid.

Understand Your Data and Its Limitations

Using a data source collected for another purpose can have downsides, though. The payroll data went back only six years and showed hours billed, not hours worked; hours worked by some staff who weren’t eligible for overtime weren’t captured.

The team also had to understand the data and its parameters. To do this, they worked with the department’s financial staff, who were familiar with the data; they asked those staff to pull the data and to check the team’s methodology. In the course of this process, they eliminated pay codes that would have double counted hours. For example, someone working a night shift on a fire line could receive a pay differential (a supplemental payment) on top of their regular salary. Pay differential hours were logged separately from the hours logged for regular pay, even though both applied to the same shift. Initially the team had counted these hours twice, but working closely with the agency helped them pinpoint and correct potential methodological errors.
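To make that de-duplication step concrete, here is a minimal sketch in Python with pandas. The file name, pay codes, and column names (employee_id, pay_code, hours) are hypothetical stand-ins rather than the department’s actual fields, and pandas itself is an assumption, since the team worked in the department’s query tool and Excel.

```python
import pandas as pd

# Minimal sketch, assuming one row per employee, pay period, and pay code.
# File and column names are hypothetical.
payroll = pd.read_csv("payroll_extract.csv")

# Hypothetical pay codes that supplement regular pay for the same shift
# (e.g., a night-shift differential) rather than representing separate hours.
supplemental_codes = {"NIGHT_DIFF", "HAZARD_DIFF"}

# Drop the supplemental rows so a single shift's hours are counted only once.
deduplicated = payroll[~payroll["pay_code"].isin(supplemental_codes)]

total_hours = deduplicated.groupby("employee_id")["hours"].sum()
print(total_hours.head())
```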

Putting Numbers to the Impacts on Staff and Programs

The team overcame these minor obstacles to conduct some pretty interesting analyses. They found that the past three fire seasons had been particularly demanding in terms of staff time, mostly due to regular and overtime hours from permanent employees (as shown in the graph below). This suggests that these employees may be pulled from other activities, and may also feel overworked.

 

[Chart: Payroll Hours Billed to Fire Protection by All Oregon Dept. of Forestry Employees]
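As a rough illustration of the kind of summary behind a chart like this, here is a hedged pandas sketch that totals fire-protection hours by fiscal year and pay type. The column and program names (fiscal_year, program, pay_type, hours) are assumptions; the team’s actual work was done in Excel with Power Pivot.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical extract: one row per employee, pay period, program, and pay type.
payroll = pd.read_csv("payroll_extract.csv")

# Keep only hours billed to the Fire Protection program.
fire_hours = payroll[payroll["program"] == "Fire Protection"]

# Total hours by fiscal year and pay type (e.g., regular vs. overtime).
by_season = (
    fire_hours.groupby(["fiscal_year", "pay_type"])["hours"]
    .sum()
    .unstack("pay_type")
)

by_season.plot(kind="bar", stacked=True,
               title="Payroll hours billed to Fire Protection by fiscal year")
plt.tight_layout()
plt.show()
```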

The team was also able to get a more accurate picture of which programs were contributing to fighting fire through specialized incident management teams. Because many Forestry employees split their regular time between programs (for example, someone may split their time 80/20 between Private Forests and Fire Protection), it can be hard to track which programs are affected when that person goes out to fight a fire. The audit team totaled the regular hours billed to each program and used each program’s share of that total to estimate the proportion of the firefighting effort each program was contributing.
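A minimal sketch of that proportional approach, reusing the same hypothetical column names as above. The exact filters the team applied aren’t spelled out in the audit summary, so this simply totals regular hours by program and computes each program’s share.

```python
import pandas as pd

# Hypothetical extract; column and program names are illustrative only.
payroll = pd.read_csv("payroll_extract.csv")

# Regular (non-overtime) hours billed to each program.
regular = payroll[payroll["pay_type"] == "regular"]
program_hours = regular.groupby("program")["hours"].sum()

# Each program's share of regular hours, used as a rough proxy for how much
# of the firefighting workload that program's staff absorb.
program_share = (program_hours / program_hours.sum()).sort_values(ascending=False)
print(program_share)
```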

Get the Power Pivot Add-in (so cool)

I asked the team for advice on using payroll data. They suggested manipulating the data as much as possible in the data query tool before exporting it for analysis. The team used Excel for analysis, with the Power Pivot add-in to summarize the large quantity of data.
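The team did this in the department’s query tool and Excel, but the same “aggregate before you export” idea can be sketched in Python. The database, table, and column names below are hypothetical, and SQLite stands in for whatever query tool is actually available.

```python
import sqlite3
import pandas as pd

# Hypothetical database and schema, used only to illustrate pre-aggregating
# in the query so the export is small enough to analyze comfortably.
conn = sqlite3.connect("payroll.db")

summary = pd.read_sql_query(
    """
    SELECT fiscal_year, program, pay_type, SUM(hours) AS total_hours
    FROM payroll_detail
    GROUP BY fiscal_year, program, pay_type
    """,
    conn,
)

# Hand the much smaller summary off to Excel / Power Pivot for further slicing.
summary.to_csv("payroll_summary.csv", index=False)
```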