Data management: improving project performance

Beyond big data

15 March 2016

Kim van Rooyen investigates how industry-agreed standards on data use are essential to improving project performance


Expectations about what ‘big data’ can achieve are rocketing, but if the built environment sector is to maximise the benefits, we must take collective action to define, standardise and share its data more widely.

Change has to happen at both a micro and a macro level, within organisations and project teams, across industry groups and supply chains. It will not be easy; success requires a non-adversarial culture and a high level of trust. But data maturity is essential if we are to drive the next level of improvement in the way we build and manage assets.

The built environment sector needs its own route map, and professional organisations should take the lead on this by developing certification, accreditation and training.

Setting priorities

With such a vast array of data potentially available on a typical project, measuring the right things is key to driving better performance.

So it is important to define your priorities and corresponding data needs, resisting the temptation to place too great an importance on the things that are easy to measure but less useful to your objectives.

Technology should not become a distraction from the bigger picture. Too much data can overwhelm and slow down decision-making; it could be as damaging to the progress of a project as too little.

This principle should also be applied to how senior management collect and look at data. Traditionally, project managers have filed progress reports at agreed intervals. Now, with real-time visibility, an international programme director can instantly access project data from any location across the world.

But while this can be beneficial, particularly in terms of the assurance aspects of a project, senior directors need to be disciplined about how they exercise this tool. Dipping in and out of the information risks getting too close to one aspect of a project, becoming obsessed with minutiae and failing to see the bigger picture.

Common standards

Without common standards, the quality of data could be at risk at any point in a project, from the moment it is collected to the occasions when it is transferred, stored or aggregated.

Interpretations of standards can vary from project to project within a single organisation, and this can lead to a lack of confidence in the data itself. For example, the anticipated final cost (AFC) at a certain milestone of a project is critical for forecasting outcomes and making decisions. But if the AFC does not conform with a standardised measurement, it cannot be meaningfully compared with the other repeat projects that an organisation is undertaking around the world.
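The AFC point can be sketched in a few lines. This assumes one agreed definition (actual costs to date plus committed costs plus the estimate to complete); the field names and figures are illustrative, not a published standard:

```python
# Minimal sketch: one standardised AFC definition applied to two projects.
# Field names (actuals, committed, estimate_to_complete) are illustrative
# assumptions, not an industry standard.

def anticipated_final_cost(actuals, committed, estimate_to_complete):
    """One agreed definition: costs incurred + committed + forecast to complete."""
    return actuals + committed + estimate_to_complete

project_a = {"actuals": 4.2, "committed": 1.1, "estimate_to_complete": 2.7}  # in £m
project_b = {"actuals": 3.8, "committed": 0.9, "estimate_to_complete": 3.2}  # in £m

afc_a = anticipated_final_cost(**project_a)
afc_b = anticipated_final_cost(**project_b)

# Because both figures follow the same definition, the comparison is meaningful.
print(f"Project A AFC: £{afc_a:.1f}m, Project B AFC: £{afc_b:.1f}m")
```

The value lies not in the arithmetic, which is trivial, but in both projects committing to the same definition: if one project's AFC excluded committed costs, the two numbers could not be compared.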

Adopting industry-agreed definitions for data would not only help companies improve their internal processes but could also, ultimately, drive up standards across the sector as a whole, as data could be shared more frequently.

Agreeing common definitions for data is an important step. However, if data is to have longevity, it also needs to fit into a hierarchy or architecture that is recognised throughout the industry. This common coding structure would standardise the way that data on every project is captured and stored, from an oil rig to a retail outlet. It would enable organisations to compare performance on areas such as cost and schedule against their peers, or even other industries.
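How such a common coding structure enables cross-project comparison can be sketched as follows. The codes and figures are invented for illustration and do not follow any published classification:

```python
# Minimal sketch of a hierarchical coding structure. Each cost record carries
# an illustrative code of the form "sector.asset.element", so data from any
# project, from an oil rig to a retail unit, rolls up the same way.

from collections import defaultdict

records = [
    {"code": "02.01.03", "project": "Oil rig A",     "cost": 1.40},  # in £m
    {"code": "02.01.04", "project": "Oil rig A",     "cost": 0.65},
    {"code": "05.03.03", "project": "Retail unit B", "cost": 0.20},
    {"code": "05.03.04", "project": "Retail unit B", "cost": 0.12},
]

def roll_up(records, level):
    """Aggregate costs at a level of the hierarchy (1 = sector, 2 = asset...)."""
    totals = defaultdict(float)
    for r in records:
        prefix = ".".join(r["code"].split(".")[:level])
        totals[prefix] += r["cost"]
    return dict(totals)

print(roll_up(records, level=1))  # totals by sector
print(roll_up(records, level=2))  # totals by asset type
```

Because every project codes its data against the same hierarchy, the roll-up logic never needs to know which project, or even which industry, a record came from.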

Improve data management capabilities

Other sectors, notably retail, employ armies of data specialists to track and predict trends. The built environment sector is slowly catching up, with a noticeable rise in the number of data analysts and information managers being embedded in organisations and project teams. However, data management tasks are often bolted on to an existing role, such as that of a quantity surveyor. Alternatively, they are carried out by an IT specialist who may have little understanding of construction.

In coming years, data management will become so central to the success of a built environment project that it should be a recognised discipline in its own right. Data analysts within the sector should have the same status as chartered surveyors, construction managers, engineers or architects.

As an industry, we should start defining what the role of construction data manager or analyst would entail. Accredited qualifications in this area are urgently needed.

Collaborate across supply chains

Sharing data more widely is essential if we are to drive improvements in the way we build and manage assets. But at present, data transfer between parties can be inefficient and ineffective even within a single project.

The lack of common standards and protocols is only part of this problem. There are also commercial and cultural barriers to transparency, rooted in distrust and adversarial ways of working.

Parties in a supply chain can be reluctant to share more than the minimum of data with each other. Organisations may justify this by citing commercial confidentiality, but the real reason behind the secrecy is probably lack of trust: they fear that the data will be used against them to substantiate a claim.

To counteract this problem, common protocols should be established in the earliest phases of a project, long before work gets under way on site. Data should be shared centrally with all parties, including the client, in a common, collaborative environment.

With a critical mass of projects sharing data efficiently, the sector can move on to the next phase: sharing information horizontally, on a peer-to-peer basis. This is already happening in some sectors. Over the past two decades, for instance, oil and gas companies – which have well-established data protocols and hierarchies – have been benchmarking project performance data anonymously through a joint industry performance forum. The initiative has raised awareness of rising costs in the sector.

This practice should spread to other built environment peer groups as they begin to gather and share more consistent data.

Five stages of data maturity

  1. Little useful data: the company cannot generate useful metrics and analytics, and does not understand, let alone anticipate, a project’s performance. It has few useful insights based on robust data with which to make confident decisions.
  2. Big data: the company may be inundated with a flow of data from many sources, but does not yet have the approaches or capability to turn that data into useful information. Project teams can spend too much time trying to make sense of large volumes of data, rather than driving the project forward – often referred to as paralysis by analysis.
  3. Incisive information: the company uses high-quality information, backed by robust data, and applies context and relevance, using the right models to drive the project forward. Common taxonomies and coding structures help standardise and explain data and its use in a meaningful way, making the links and interdependencies across the data clear.
  4. Predictive: the company moves beyond the simple analysis of historical or current status – it is also able to carry out predictive analysis. By using data to anticipate what is likely to happen, it can better plan for how projects are driven forward successfully.
  5. Data-centred: the company’s entire project delivery model is built around its analytical data-led approach. Data and its use sit at the centre of driving project performance and improvement on a sustained basis. Both predictive and historical analysis models play a key role, and more advanced data analysis techniques – such as data mining and benchmarking – become increasingly important for leading performance.

* Adapted from Big Data: Turning Data into Knowledge and Putting Knowledge to Work by Markus Sprenger (BeyeNETWORK, 15 April 2011)
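The step from stage 3 to stage 4 can be sketched as a simple forecast: fitting a trend to historical cumulative spend and projecting it forward. The figures are invented, and a real predictive model would be far richer, but the principle is the same: using what has happened to anticipate what is likely to happen.

```python
# Minimal sketch of moving from historical to predictive analysis: an
# ordinary least-squares trend fitted to cumulative monthly spend and
# projected forward. All figures are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares fit, returning (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

months = [1, 2, 3, 4, 5, 6]
cumulative_spend = [0.9, 1.8, 2.6, 3.5, 4.5, 5.3]  # in £m to date

slope, intercept = fit_line(months, cumulative_spend)
forecast_month_12 = slope * 12 + intercept
print(f"Projected cumulative spend at month 12: £{forecast_month_12:.1f}m")
```

Even a crude projection like this lets a team compare the forecast against the budget months in advance, which is the essence of the predictive stage.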

Integrated technology

At present there is no standard model to define how technology should be integrated into a construction project: the systems used by project and cost managers, contractors, architects and engineers are all isolated from one another and they do not exchange information efficiently.

We need to develop best practice models for fitting this jigsaw of technology together so that data can flow efficiently between parties. This will be essential for maintaining a healthy ecosystem of interaction between the multitudes of contractors and suppliers, particularly on complex projects.

Admittedly, some built environment organisations are further ahead than others on the data journey, but the progress is too slow and these pockets of best practice are not benefiting the industry as a whole. Clients and industry groups alike need to act strategically, and with more vision, if they are to drive genuine efficiencies.

Technology is presenting a once-in-a-generation opportunity. Failing to act and muddling along with inconsistent data will lead to loss of trust, worsening relationships and stagnating productivity. Therefore, we need to step up the pace of change. Moving faster to secure the quality of our data will accelerate improvement throughout the sector as a whole.

Kim van Rooyen is a Director at Turner and Townsend

Further information

  • Related competencies include Data management
  • This feature is taken from the RICS Construction journal (February/March 2016)