How the UK public sector can overcome data sharing challenges

Data has the power to transform all aspects of our lives, and this is particularly evident in the public sector

With the government’s response to the National Data Strategy published last year, the value of data and data sharing has entered the spotlight in the UK as a key to helping the country ‘build back better’.

In the UK, as around the globe, the pandemic has created a pressing need to harness data and technology to tackle modern-day challenges. The insights drawn from data help inform public policy, identify challenges and implement informed solutions that benefit citizens. However, data sharing between government agencies can be hindered by three main challenges: data architecture, the skills gap and inconsistent data governance frameworks.

Modernising data architecture

Both government agencies and public sector bodies create and collect enormous amounts of data every day, ranging from real-time intelligence about freight crossing the border, to details about local health services, to lists of which children receive free school meals. The data that is stored can make a monumental difference to the lives of citizens. To be used effectively, however, it must be shared with trusted parties securely, quickly and accurately, and that requires a secure, standardised, interoperable and open data architecture.

Many government departments currently rely on data warehouses or other legacy storage systems that create information silos and prevent information from being distributed readily. If a strong foundation is not laid from the outset, inaccurate data sets containing duplicated or outdated information can be stored and shared, causing bigger issues down the line that may be difficult to rectify.

A modern data foundation is the best way to ensure a timely flow of accurate data that can be used for analysis as well as for building AI and ML models. Open approaches to data storage, such as a lakehouse platform, simplify the process and allow teams to work with both flexibility and governance.
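To make this concrete, the sketch below shows what an open, lakehouse-style foundation can look like in practice: a data set is written once to an open table format, and later corrections are merged in place so duplicated or outdated copies do not accumulate. It is a minimal illustration only, assuming a Spark environment with the open-source delta-spark package installed; the table path, column names and sample records are hypothetical.

```python
# A minimal sketch, assuming a local Spark environment with the open-source
# delta-spark package installed (pip install delta-spark). The path, table
# columns and records below are hypothetical, for illustration only.
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable

# Configure Spark to use the open Delta Lake table format.
builder = (
    SparkSession.builder.appName("open-data-foundation-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Initial load: a (hypothetical) extract of local service records, stored
# once in an open, columnar format that any compatible engine can read.
records = spark.createDataFrame(
    [(1, "Leeds", "2021-03-01"), (2, "Bristol", "2021-03-01")],
    ["service_id", "region", "last_updated"],
)
records.write.format("delta").mode("overwrite").save("/tmp/service_records")

# Later refresh: merge corrected and new rows in place, so consumers see a
# single up-to-date copy rather than duplicated or outdated extracts.
updates = spark.createDataFrame(
    [(2, "Bristol", "2021-06-01"), (3, "Dundee", "2021-06-01")],
    ["service_id", "region", "last_updated"],
)
table = DeltaTable.forPath(spark, "/tmp/service_records")
(
    table.alias("current")
    .merge(updates.alias("incoming"), "current.service_id = incoming.service_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

Because the table sits in an open format, the same copy can be read by notebooks, BI tools or R-based workflows without a proprietary export step, and stale extracts do not need to be passed between departments.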

Addressing the skills gap  

Some might see switching to new data systems as time-consuming or risky. So how do we change that mindset and enable government bodies to make the shift? It starts with skills. Public sector bodies, like many large enterprises, often still operate on legacy IT systems and lack the skills needed to make a smooth transition to a modern data architecture.

There is a strong heritage of analytics and statistical skills in the public sector with great potential to be modernised. Bridging these traditionally valuable skills into modern analytical and data science roles is not far-fetched. Open architectures mean that data teams can keep working with well-established tools such as RStudio while gaining the security and scalability of the cloud.

Contributions to learning will also be key. Collaborating with universities to develop the next generation of talent, while making learning accessible to all ages, is core to building modern analytical skills in the public sector and the wider data community. The power of community to drive peer-to-peer upskilling in technology cannot be forgotten; indeed, it is what has made open source such a driving force.

Building strong and resilient governance frameworks  

The data that is stored by public sector bodies is most valuable when it can be shared with other agencies. This has been evident throughout the pandemic, when the government has needed access to ever-changing information to mobilise frontline services.  

While the capacity for innovation is clearly present within the public sector, responding to COVID-19 with the necessary agility exposed structural barriers and silos of closed data in government. Responding with agility has become imperative, and a move to governance frameworks that are strong yet open and flexible is a shift many are seeking to achieve.

Work is therefore needed to improve public and interagency trust and capability in data sharing. Hence the demand for framework standards around sharing, owning and storing data across the public sector and at all levels of government.

Government constantly needs to adapt to a changing political and social landscape, and it can do so more effectively through technological innovation. To future-proof the UK against the world’s toughest problems, in which data sharing plays a key role, accountability and transparency are critical. COVID-19 has left a lasting legacy, proving that data and analytical maturity are both important parts of government decision making. When data is used to better serve citizens, the impact on those in our communities who need help the most can be enormous.


About the Author

Pritesh Patel is Public Sector Lead at Databricks, working to further the mission of helping organisations realise the potential of a lakehouse architecture. Pritesh collaborates with public sector organisations to deliver machine learning, artificial intelligence and data solutions with the goal of increasing efficiency and improving public services in the UK and internationally. Pritesh is a former consultant with expertise in public sector and government outsourcing. His previous experience includes roles in technology transformation at Capgemini and Cloudreach, where he deployed some of the first hyperscale cloud implementations in industry and the public sector.

Featured image: ©IRStone