Using Benchmarking Data to Track How Much Infrastructure Costs Over Time

Hop Dao
June 25, 2021

Ten years ago, Australia had its first transport infrastructure project worth at least $5 billion. Today there are nine such projects under construction.

A Grattan Institute study last year identified cost overruns totalling $24 billion so far on just six current projects.

The rapid rate of globalisation is making supply chains longer and more complex than ever. At the same time, climate change is increasing the incidence of natural disasters around the globe. These factors are all making it harder to plan and manage capital portfolios so that they fulfil the expectations of the asset owners.

Governments post-COVID-19 are more willing to spend money quickly to create jobs and stimulate the economy. The Federal Government has committed to invest an additional $14 billion in new and accelerated infrastructure projects over the next four years. These projects will support a further 40,000 jobs during their construction.

All of these factors can push construction costs well beyond budget. But how can we identify how much, and to what extent, each factor has contributed to any increase? Benchmarking data.

In this article, we’ll walk you through the power of benchmarking from past project data in minimising the risk of cost overruns in major projects.  

Let’s get started.

What is Benchmarking Data?

Reference class forecasting, often referred to as benchmarking or validation, is the process of comparing the cost estimate for one project with the outcomes of similar projects that have already been built.

Project managers can use the average cost overrun recorded across a sample of past projects to estimate the expected overrun on future ones. The variance of outcomes on the comparison projects indicates the range within which a cost estimate is likely to vary.
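The article doesn't spell out the arithmetic, but the idea above can be sketched with simple mean and standard-deviation statistics. The overrun ratios and the `benchmark_estimate` helper below are illustrative assumptions, not real project data:

```python
from statistics import mean, stdev

# Hypothetical reference class: cost overrun ratios (actual / estimated)
# recorded on comparable past projects. These figures are made up.
past_overruns = [1.05, 1.20, 0.98, 1.35, 1.10, 1.22, 1.08]

def benchmark_estimate(base_estimate, reference_overruns):
    """Adjust a raw cost estimate using the reference class's history.

    Returns the expected cost and a one-standard-deviation range,
    giving a sense of how far the estimate is likely to vary.
    """
    avg = mean(reference_overruns)        # expected overrun ratio
    spread = stdev(reference_overruns)    # variability across the sample
    expected = base_estimate * avg
    low = base_estimate * (avg - spread)
    high = base_estimate * (avg + spread)
    return expected, (low, high)

expected, (low, high) = benchmark_estimate(500_000_000, past_overruns)
print(f"Expected cost: ${expected:,.0f} (range ${low:,.0f} to ${high:,.0f})")
```

On this hypothetical sample the average overrun is 14 percent, so a $500 million raw estimate would be benchmarked up to roughly $570 million, with the spread of past outcomes defining the plausible range.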

Why is benchmarking worth doing?

Bigger projects carry bigger unknowns. A project is announced prematurely when its proposed cost does not yet have the necessary financial approvals or a technical commitment behind it. This means the initial costs announced are based on someone's gut feel rather than a sound basis. In Australia, only one-third of projects are announced prematurely, but they account for more than three-quarters of the cost overruns.

Benchmarking data, on the other hand, does not suffer from optimism bias, because it relies on objective historical information. Such data can help project specialists develop better cost estimates at the early stages of a project and avoid unexpected increases at later stages.

The data available in Australia, however, is far from adequate.  

In Australia, the federal government has called for better data to assist cost estimation. The only action taken was a pilot study, published in 2015, of 45 road projects and an update of 32 more in 2017. But there has been nothing since then. This exercise will be useful only if it is repeated regularly over a much longer period.

Sydney Metro City & Southwest, scheduled to be completed in 2024.

Future-proofed technology that will soon become indispensable

Capital portfolio management teams need not only more thorough and timely data but also better software for cost estimation and risk management. Software systems such as Mastt have gone further, developing new methods of predicting, rather than reacting to, changes in capital portfolio data.

Mastt has developed a global-first Project Anomaly Detector that allows organisations to automate the identification of risks and issues and foresee problems before they occur. The key enabler is the ability to extract useful information from vast amounts of available data, making it possible to reduce costs, increase efficiency and spot risks before they materialise.

With your data collection system set up through a project controls technology like Mastt, built on the latest Microsoft Azure products and services, machine-learning technology, not project managers, has the infrastructure to rapidly sort through millions of data points to find answers.

With this new, enriched capability, companies can compare the impact of hundreds of performance drivers on project or business outcomes. They can also identify the obstacles that raise costs and extend timelines. In some areas, advanced analytics may produce savings of up to 25 percent.  

This is an AI service that ingests time-series data from all your organisation's projects and, using machine learning, selects the best-fitting detection model for your data to ensure the highest possible accuracy.

Put simply, the machine-learning engine receives millions of project and user data points and identifies patterns and benchmarks for what it has determined to be a successful project.

How strict or relaxed the machine-learning engine should be is up to the organisation; we'll refer to the resulting patterns generically as 'best practice'.

Mastt Anomaly Detector showing a live project staying within the pattern of best practice.

With a data collection system in place, we can track and compare a live project against the 'best practice' pattern in real time. If the trend of the live project steps outside the boundaries that best practice allows, project managers know a potential risk is emerging and attention is required.
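Mastt's actual detection models aren't described here, but the boundary check above can be illustrated with a minimal sketch: a 'best practice' band built from the mean and standard deviation of past successful projects, with a tolerance parameter controlling how strict or relaxed the check is. The milestone data and the `check_live_project` helper are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical 'best practice' pattern: cumulative spend as a fraction of
# budget at each milestone, taken from past successful projects (made up).
best_practice = {
    "month_3": [0.10, 0.12, 0.11, 0.09],
    "month_6": [0.28, 0.30, 0.27, 0.31],
    "month_9": [0.55, 0.52, 0.58, 0.54],
}

def check_live_project(milestone, observed, tolerance=2.0):
    """Flag an observed value that falls outside the best-practice band.

    The band is mean +/- tolerance * standard deviation; a larger
    tolerance relaxes the check, a smaller one tightens it.
    """
    history = best_practice[milestone]
    centre, spread = mean(history), stdev(history)
    low, high = centre - tolerance * spread, centre + tolerance * spread
    within = low <= observed <= high
    return within, (low, high)

ok, band = check_live_project("month_6", 0.29)       # inside the band
flagged, _ = check_live_project("month_6", 0.45)     # outside: needs attention
```

A live project reporting 29 percent of budget spent at month six sits comfortably inside the band, while one reporting 45 percent falls outside it and would be flagged for attention.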

Take control of every step in your Capital Project lifecycle