Benchmarking financial reporting

28th November 2013

It may seem obvious, but in order to bring about process improvement an organisation needs to set objective benchmarks against which progress, or the lack of it, can be measured. But as Gary Simon (FSN managing editor) highlights in this executive briefing, many businesses are surprisingly reluctant to establish benchmarks for financial reporting and consolidation. And then there is the thorny question of what sort of benchmarks are appropriate.




Introduction – external measures

External observers have relied for many years on relatively rudimentary measures, such as the time taken after the year end to announce earnings results to the market, or the days taken for the auditors to sign the financial statements. Such measures are often deemed to be a good proxy for the quality of management and have the advantage that they are easily assembled from publicly available documentation. But are such measures really appropriate?

What does the speed of delivery of the accounts really tell us about financial governance and the quality of the people?  Deeper consideration suggests that simple measures such as these merely provide a glimpse of management capability and say very little about the efficiency of the reporting process and how it can be improved.

Measures such as “days to report after the year end” are too blunt a tool because they do not take account of the resources deployed in reporting, i.e. they are velocity measures, not efficiency measures. So how can organisations better measure reporting productivity and data quality?

Productivity measures

A good place to start is the number of “man-days per entity” because, contrary to popular opinion, it is not necessarily group revenue that drives financial reporting complexity (and hence timescales) but the number of discrete statutory and management entities that need to be consolidated, although other factors play a part as well. The benefit of this KPI is that it gives an immediate indication of the finance function effort expended in the reporting cycle and, once recorded, forms the benchmark against which the success of all performance improvement initiatives can be measured.
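The KPI itself is simple arithmetic. As a minimal sketch (the helper function and all figures below are hypothetical, invented purely for illustration):

```python
# Hypothetical illustration of the "man-days per entity" KPI.
# All figures are invented for the example.

def man_days_per_entity(total_man_days: float, entity_count: int) -> float:
    """Finance-function effort expended per consolidated entity."""
    if entity_count <= 0:
        raise ValueError("at least one reporting entity is required")
    return total_man_days / entity_count

# A group consolidating 40 statutory and management entities,
# with 220 man-days of reporting effort in the cycle:
baseline = man_days_per_entity(220, 40)   # 5.5 man-days per entity

# After a process-improvement initiative the same close takes 180 man-days:
improved = man_days_per_entity(180, 40)   # 4.5 man-days per entity
print(f"baseline {baseline:.1f}, improved {improved:.1f}")
```

Recording the baseline figure once, then recomputing it each cycle, is what turns the raw effort data into a benchmark.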

Modern centralised cloud-based solutions, such as Adaptive Planning, which employ a single consolidation application for the whole organisation, lend themselves more readily to this kind of benchmarking since statutory and management entities can be accommodated in a single model, i.e. there is better visibility of the performance of the process in its entirety.

Data quality measures

Data capture from ERP and other operational systems marks the commencement of the financial reporting process, and logically this is where initial benchmarking efforts should be directed. In fact, the method and conduct of data capture can have a profound bearing on productivity, data quality and speed, so early attention in this area is usually well rewarded.

Interfaces between local ERP systems and the corporate reporting pack represent a hidden ‘bear trap’ for the unwary. Significant risk of error arises when information requirements change but the consequences for the integrity of the interface are not recognised in time.

For example, a minor change, such as the addition of an expense line to the chart of accounts in a subsidiary’s ERP system, will usually require a change to the mapping tables that lie between the subsidiary’s accounts and the corporate reporting pack. Similarly, additional information (perhaps a new statutory disclosure) requested by corporate headquarters has to be mapped to the relevant operational system.
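The mapping risk described above can be sketched as a simple completeness check: every local GL code must map to a line in the corporate reporting pack. The account codes, mapping table and helper function here are all hypothetical:

```python
# Minimal sketch of an interface completeness check: every local GL code
# must map to a line in the corporate reporting pack. Codes, names and
# the mapping table are invented for illustration.

local_trial_balance = {
    "6100": 12_500.0,   # travel expenses
    "6200": 8_300.0,    # office costs
    "6250": 1_200.0,    # newly added expense line
}

mapping_table = {
    "6100": "PACK_TRAVEL",
    "6200": "PACK_OFFICE",
    # "6250" was added locally but never mapped -- a source of error
}

def unmapped_accounts(trial_balance: dict, mapping: dict) -> list:
    """Return local GL codes with no corporate reporting pack mapping."""
    return sorted(code for code in trial_balance if code not in mapping)

missing = unmapped_accounts(local_trial_balance, mapping_table)
print(missing)  # ['6250'] -- flagged before submission rather than after
```

Running a check of this kind at the point of data capture catches the new expense line before it reaches the consolidation.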

A failure to amend the interface completely and accurately is an instant source of error and delay.  Internal benchmarks should report on the frequency and nature of interface errors in order to isolate and repair recurring control weaknesses.

Data complexity adds profoundly to the risk of failure in interfaces.  For example, take segmental reporting, which usually involves analysing general ledger data in multiple dimensions (segments) to reflect the analysis required in the statutory accounts.  Mapping general ledger account code segments from operational systems to the corporate reporting pack has its hazards, but a more subtle form of error occurs when the GL codes are correctly mapped but the segmental analysis is not. 

This kind of error can be difficult to trap and can remain undiscovered for a considerable period of time, because the core data appears at first glance to have been transferred completely and accurately. It is only at a much later date that errors begin to surface in the segmental analysis, by which time they can be very difficult to fix because of the distance the data has travelled up the organisational hierarchy.
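One defence against this subtler failure mode is a reconciliation check that the mapped segmental breakdown still ties back to the general ledger total. A hedged sketch, with all names and figures invented:

```python
# Hypothetical check that a segmental analysis still reconciles to the
# general ledger total after mapping -- the subtle error described above,
# where the GL codes map correctly but the segmental split does not.

def segments_reconcile(gl_total: float, segment_values: dict,
                       tolerance: float = 0.01) -> bool:
    """True if the segmental breakdown ties back to the GL total."""
    return abs(sum(segment_values.values()) - gl_total) <= tolerance

revenue_gl_total = 1_000_000.0
segmental_split = {"EMEA": 600_000.0, "Americas": 250_000.0, "APAC": 100_000.0}

# The split is 50,000 short: every code mapped, but a segment dropped out.
print(segments_reconcile(revenue_gl_total, segmental_split))  # False
```

Because the check compares totals rather than individual codes, it catches exactly the case where the core data looks complete but the dimensional analysis is not.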

So benchmarking the interface involves more than counting how many data resubmissions were necessary during the period under review, since there are different degrees of failure. Ideally, organisations should develop benchmarks that reflect their specific needs and idiosyncrasies.
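One possible way to capture “degrees of failure” rather than a raw resubmission count is to weight each logged interface error by severity. The error categories and weights below are purely illustrative, not a standard:

```python
# Illustrative interface-quality benchmark: weight logged errors by
# severity instead of counting resubmissions. Categories and weights
# are invented for the example.
from collections import Counter

SEVERITY = {
    "late_resubmission": 1,   # nuisance
    "unmapped_account": 3,    # control weakness
    "segment_mismatch": 5,    # hard-to-detect reporting error
}

def interface_error_score(errors: list) -> int:
    """Severity-weighted score for a period's logged interface errors."""
    counts = Counter(errors)
    return sum(SEVERITY[kind] * n for kind, n in counts.items())

period_errors = ["late_resubmission", "unmapped_account",
                 "late_resubmission", "segment_mismatch"]
print(interface_error_score(period_errors))  # 1 + 3 + 1 + 5 = 10
```

Tracking the score period on period, alongside the breakdown by category, is what lets recurring control weaknesses be isolated and repaired.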

The group perspective

Simple benchmarking measures applied across the group are valuable because they provide a means of comparing one reporting entity or subsidiary against another. This not only helps to identify the ‘delinquent’ subsidiaries that habitually hold up the overall consolidation, but can also help to isolate specific staffing and systems issues, as well as providing the basis for collaborative problem solving and the sharing of best practice across the group. For these reasons, once benchmarking is established it should form a regular part of monthly management reporting.

The impact of cloud

Finally, consolidation in the cloud could have a profound impact on the ease with which companies respond to the challenge of process improvement. For example, Adaptive Planning, which is based on a uniform architecture shared by all reporting units, provides a much easier foundation for gathering performance data, such as status reporting and workflow timings. And as a by-product, the ability to deploy data capture through a web browser helps to eliminate spreadsheet interfaces and complexity. Over time, web-based consolidation systems such as Adaptive Planning should become a rich source of data for companies seeking to raise the bar, and by carefully designing internal benchmarks it is possible to provide a framework within which to bring about continuous improvement in the performance of the consolidation process.