It is certainly an interesting time to be a data scientist.
Daily, we are bombarded with data from every sector, whether it’s COVID infection rates or economic statistics detailing the ‘demise’ of the economy. The explosion of data is inescapable. What has been sorely lacking is meaningful analysis of such data to provide actionable insights and steps towards genuine innovation.
‘Innovation’ has become something of a buzzword in our field, much stated but rarely demonstrated. This is particularly true of data innovation in the infrastructure sector. The industry is awash with data – efficiency measurements, carbon emissions, return on investment figures – yet while we have more data than ever before, we lack enough data of the right quality to measure what we want to manage. To add value to what we measure, we need to move towards a model of gaining maximum insight from the minimum quantity of data.
Genuine innovation in the world of data could, and should, mean the creation of new value. This can be realised through either data-driven design at the planning stages of infrastructure projects or data-driven decision-making during operational activity. Data underpins everything that we do and yet we currently have systems in place that do not make best use of data’s potential in shaping decision-making or design.
For example, whether you take the Town & Country Planning or Development Consent Order route, our system sets a high bar for developers when it comes to providing supporting evidence. This can be time-consuming and expensive, and transport assessments are often months out of date by the time applications are considered. This has only been exacerbated by the COVID-19 crisis which has challenged many pre-pandemic assumptions about the mobility behaviours which inform transport planning.
The industry already has at its disposal, through technology partners, a means of providing up-to-the-minute views of current travel patterns – including mode of transport, geographical data and time – giving developers the opportunity to back up their proposals with confidence and to sense-check them when challenged.
Delve deeper with a digital twin
But the infrastructure sector can delve deeper and unleash the potential of data in development by creating a human-in-the-loop simulator for design and operations – in effect, a digital twin for infrastructure. Digital twins have already been built by early adopters in industries such as advanced manufacturing and Formula One, but their potential has not been fully realised in infrastructure. Developers have often looked to building information modelling (BIM), but BIM in itself is not a digital twin.
A digital twin in the infrastructure sector would advance our understanding of ‘what if’ moments in the planning and operation stages of development. By integrating modelling, real-time systems analysis and digital processes we can create a ‘complete’ model of future scenarios across an asset’s entire lifecycle. This would revolutionise how we plan and deliver major infrastructure projects and create assets that are fit for the future by accounting for circumstances such as changes in end-user demand.
To deliver an effective digital twin we need to move beyond a model of ‘data for data’s sake’ and towards a system that maximises insight from minimal data.
The future is now
The coming decade is going to be a game-changer in how we deliver infrastructure across the UK. Data systems will evolve and undoubtedly become more intelligent. This will be achieved by embedding intelligence through cost-efficient sensors and ubiquitous communications, enabling infrastructure planners and developers to measure and manage a broad swathe of variables. The models that we create from sensor data will help us make data-led decisions, ensuring that operational decisions are based on knowledge rather than on out-of-date data.
By unleashing the power of predictive analytics we can begin to understand the impact any future interventions might have on final outcomes. Simulations run within the decision cycle-time would allow us to choose the optimal decision across a number of possible scenarios. While it currently takes an experienced operator to undertake such a feat, machine learning is changing the way in which we work and make decisions.
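The idea of running simulations within the decision cycle-time can be sketched in a few lines: evaluate each candidate intervention against a set of demand scenarios and pick the one with the best average outcome. This is a toy illustration – the plan names, scenarios and scores below are invented, and a real digital twin would replace the lookup tables with a proper simulation model.

```python
def simulate(action, scenario):
    # Toy stand-in for a digital-twin run: score an action (e.g. a signal-timing
    # plan) under one demand scenario. Names and numbers are purely illustrative.
    base = {"plan_a": 0.60, "plan_b": 0.70, "plan_c": 0.50}[action]
    shift = {"low": +0.10, "expected": 0.00, "high": -0.25}[scenario]
    return base + shift

def best_action(actions, scenarios):
    # Choose the action with the highest mean score across all scenarios,
    # i.e. the decision that holds up best whatever demand turns out to be.
    return max(actions, key=lambda a: sum(simulate(a, s) for s in scenarios) / len(scenarios))

print(best_action(["plan_a", "plan_b", "plan_c"], ["low", "expected", "high"]))
```

Averaging across scenarios is the simplest possible decision rule; in practice an operator (or a risk-weighted criterion) would stay in the loop to judge which scenarios matter most.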
Machine learning tools can, in effect, enhance the cognitive capacity of the operator and automate ‘obvious’ decisions. This is not to say that intelligent design will supersede the presence of a person. I believe it is wise to keep the human-in-the-loop along this learning journey and focus on helping them to make the most positive decisions, more of the time, particularly under extreme time pressure.
This is as true during the design phase for planning as it is for operations. I am still taken aback by how often those functions are separated out into planning and operations. Over time it is likely that we will see the two areas converge into one, backed up by intelligent systems.
Knowing your limits
The limits of predictive analytics depend on the complexity of the system. For a closed system with few variables, for example, it is possible to train models on historical data and predict with relative accuracy. Narrow Artificial Intelligence (AI) such as this is already widely deployed across a number of industries. More complex systems, which can combine physics-based and behavioural-based modelling, are beginning to be adopted and trialled across a diverse range of sectors.
Data-reliant systems are bound by the quality of the available data. Since all models are only as good as the data used to train them, when a shock to the system occurs or a seismic transition in behaviour happens – such as a mass lockdown scenario which necessitates the shuttering of the majority of the economy – the models need to be retrained so that the machine can learn from the new data. Under such transient, unprecedented circumstances the accuracy of predictive analytics is significantly impaired.
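The retraining point can be made concrete with a deliberately minimal sketch. Here the "model" is just the mean of past observations, and the numbers (pre- and post-lockdown daily trips) are invented for illustration; the same logic applies to any model fitted to historical data.

```python
def train(observations):
    # "Training" here is simply fitting the mean of the observed data.
    return sum(observations) / len(observations)

def error(model, actuals):
    # Mean absolute error of the constant prediction against new actuals.
    return sum(abs(model - x) for x in actuals) / len(actuals)

pre_shift = [100, 102, 98, 101]   # e.g. daily trips before a lockdown
post_shift = [20, 22, 18, 21]     # demand collapses after the shock

old_model = train(pre_shift)
print(error(old_model, post_shift))   # large error: the old regime no longer applies

new_model = train(post_shift)
print(error(new_model, post_shift))   # retraining on new data restores accuracy
```

A model trained only on pre-shift data is confidently wrong about the post-shift world; it is the retraining step, not the original model, that recovers predictive value.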
These constraints can be overcome, but rather unsurprisingly, those first movers who develop new models from quality data will gain a competitive advantage. The more quality data that can be sourced, the greater the momentum of innovation from model-based design and operations.
Beginning the recovery
The UK’s economy is under assault on all fronts. 2020 has seen a global pandemic wreak havoc on economic growth, while the threat of a ‘no-deal’ Brexit continues to loom over the economy. However, we have an opportunity to craft a ‘new normal’ that uses the available data to navigate the tumultuous waters ahead.
We need to look at the data that is currently available and spot the weak signals in the cacophony of noise to identify the early signs of recovery. Once these early signs have been noted we can then begin to act decisively based on the insights provided by predictive analytics.
There will undoubtedly be an element of risk involved as it is beyond the realms of possibility to predict the future. It is however a far better approach than trying to apply old models (using outdated data) to the ‘new normal’.
About the author
Geoff McGrath is an entrepreneur, strategist, innovator and technologist. For over eight years he was chief innovation officer at McLaren Applied, where he took insights from the world of Formula 1 to drive change in the wider transport sector and beyond. Geoff is now Managing Director of the data science business CKDelta. To find out more, please visit: https://www.ckdelta.ie/