Three steps for conquering the last mile of analytics

Becoming insights-driven is now the ultimate prize of digital transformation, and many organisations are making significant progress toward this goal.

However, putting insights into action – the “last mile” of analytics – is still a challenge for many organisations.

With continued investments in data, analytics and AI, as well as the broader availability of machine-learning tools and applications, organisations have an abundance of analytical assets. Yet creating these assets should not be the only measure of success. In reality, deploying, operationalising and putting analytical assets into production is what determines how much value organisations get from their AI and data science efforts.

In a traditional data and analytics continuum, data is transformed into insights to support decision-making. If organisations want to break out of experimentation mode, avoid analytics assets becoming shelfware, and empower front lines to make analytics-powered decisions, they must start with decisions. From there, they can work backwards: decide how to find, integrate and deliver the insights, and then identify the data needed to enable them.

These days, I suspect many organisations would argue they’re doing just that – they’ve hired analytics talent and appointed chief data officers (CDOs) or chief analytics officers (CAOs) to collaborate with business leaders to become more data- and analytics-driven. But many organisations are not seeing the desired impact and value from their data and analytics initiatives and are not able to quickly put their pilot projects into production.

According to IDC, only 35% of organisations indicate that their analytical models are fully deployed in production. Difficulty in deploying and operationalising analytics into systems or applications – where they can be consumed by downstream processes or people – is a key barrier to achieving business value.

Some might argue that the main focus within analytics projects has been on developing analytical recipes (e.g., data engineering, building models, the merits of individual algorithms), while little attention, priority or investment has gone to operationalising these assets. This is easier said than fixed. Data does not provide differentiation; decisions at scale do. Applying insights consistently to turn data into decisions will let organisations build a true software-led system of insights to grow and break away from competitors.

How can organisations put analytics into action in a systematic, scalable manner and conquer the last mile? Here are the three key areas where organisations need to pay consistent attention.

Understanding Technology Components

The need to streamline and operationalise model management processes requires users to register, deploy, monitor and retrain analytical models. More specifically:

Register. A centralised model repository, life cycle templates and version control capabilities provide visibility into commercial and open-source analytical models, ensuring complete traceability and governance. A repository also promotes collaboration among different stakeholders and helps manage the analytics workflow effectively. Storing the data, code, properties and metadata associated with each model enables transparency and shows the real value of analytical assets.
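To make the registration step concrete, here is a minimal sketch of a model registry in plain Python. The record fields, class names and the auto-incrementing version scheme are illustrative assumptions, not a description of any particular vendor's product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModelRecord:
    """One entry in a (hypothetical) centralised model repository."""
    name: str
    version: int
    framework: str                  # e.g. "sklearn", "sas", "xgboost"
    artifact_uri: str               # where the serialised model lives
    metadata: dict = field(default_factory=dict)
    registered_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class ModelRegistry:
    """In-memory registry: keeps every version for traceability."""
    def __init__(self):
        self._models = {}           # name -> list of ModelRecord, oldest first

    def register(self, record: ModelRecord) -> ModelRecord:
        versions = self._models.setdefault(record.name, [])
        record.version = len(versions) + 1   # simple auto-incrementing version
        versions.append(record)
        return record

    def latest(self, name: str) -> ModelRecord:
        return self._models[name][-1]

registry = ModelRegistry()
registry.register(ModelRecord("churn-model", 0, "sklearn",
                              "s3://models/churn/v1.pkl",
                              {"owner": "data-science", "target": "churned"}))
```

Because every version is retained rather than overwritten, the registry gives the traceability and governance the article calls for: any deployed score can be traced back to the exact code, data and metadata that produced it.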

Deploy. The deployment step is all about integrating analytical models into a production environment and using them to make predictions. It is often the most cumbersome step for IT or DevOps teams to handle, but it is essential to delivering value. Ideally, organisations should be able to combine commercial and open-source models in the same project to compare them and select the champion model to deploy. Depending on the use case, models can be published to batch operational systems (e.g., in-database, in-Hadoop or Spark), on-demand systems (e.g., web applications), cloud, or a real-time system using streaming data.
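The champion-selection idea above can be sketched in a few lines: evaluate candidate models on a hold-out set, promote the best one, and expose it behind a single scoring entry point so callers never bind to a specific model. The models, hold-out records and accuracy metric below are all illustrative assumptions:

```python
# Two candidate scoring functions, standing in for a commercial model
# and an open-source challenger (both hypothetical).
def model_a(x):
    return 1 if x["tenure"] < 12 else 0

def model_b(x):
    return 1 if x["tenure"] < 12 and x["complaints"] > 0 else 0

# Small illustrative hold-out set: (record, actual outcome) pairs.
holdout = [
    ({"tenure": 6,  "complaints": 2}, 1),
    ({"tenure": 30, "complaints": 0}, 0),
    ({"tenure": 8,  "complaints": 0}, 0),
]

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

# Compare candidates and promote the champion.
candidates = {"model_a": model_a, "model_b": model_b}
champion_name = max(candidates, key=lambda n: accuracy(candidates[n], holdout))
champion = candidates[champion_name]

def score(record):
    """Production entry point: decouples callers from the chosen model."""
    return champion(record)
```

The design point is the indirection: batch jobs, web applications or streaming systems all call `score()`, so swapping the champion later does not require changing any downstream consumer.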

Monitor. Once organisations start realising value from analytics, the real world does not stop. Scores need to be analysed and monitored for ongoing performance, and models regularly evaluated to confirm they are behaving as they should as market conditions and business requirements change and new data is added. Performance reports can be produced for champion and challenger models using a variety of fit statistics.
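A performance report of this kind can be as simple as tracking one fit statistic per scoring period and flagging when the latest period breaches a threshold. The metric (misclassification rate), the threshold and the batch data below are illustrative assumptions:

```python
def misclassification_rate(predictions, actuals):
    """Fraction of scored records where prediction and outcome disagree."""
    errors = sum(p != a for p, a in zip(predictions, actuals))
    return errors / len(actuals)

def performance_report(batches, threshold=0.2):
    """batches: {model_name: [(predictions, actuals), ...]} per period.
    Flags a model as degraded when its latest rate exceeds the threshold."""
    report = {}
    for name, periods in batches.items():
        rates = [misclassification_rate(p, a) for p, a in periods]
        report[name] = {
            "rates": rates,
            "degraded": rates[-1] > threshold,
        }
    return report

# Illustrative champion/challenger history over two scoring periods.
batches = {
    "champion":   [([1, 0, 1, 0], [1, 0, 1, 0]),   # period 1: perfect
                   ([1, 0, 1, 0], [0, 0, 1, 1])],  # period 2: drifting
    "challenger": [([1, 0, 1, 0], [1, 0, 1, 0]),
                   ([1, 0, 1, 1], [1, 0, 1, 1])],
}
report = performance_report(batches)
```

In practice the same pattern applies whatever fit statistic is chosen (lift, AUC, RMSE, and so on): compute it per period for champion and challenger alike, and let a breach trigger the retraining decision described next.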

Retrain. If model performance degrades, organisations should take one of three approaches:

•    Retrain the existing model on new data.

•    Revise the model with new techniques (such as feature engineering or new data elements).

•    Replace the model entirely with a better one.

This requires agreement among stakeholders on which metrics to measure and which will deliver business impact.
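Once the metrics are agreed, the choice among the three options above can be sketched as a simple decision rule keyed to how far performance has drifted from its baseline. The thresholds and action names here are illustrative assumptions, not a prescription:

```python
def remediation(baseline_error, current_error,
                mild_drift=0.05, severe_drift=0.15):
    """Map the size of performance degradation to one of three actions.

    Thresholds are hypothetical; in practice they come from the
    stakeholder agreement on metrics and business impact.
    """
    drift = current_error - baseline_error
    if drift <= mild_drift:
        return "retrain"   # same model, refreshed on new data
    if drift <= severe_drift:
        return "revise"    # new features or data elements, same approach
    return "replace"       # build a better model entirely
```

Encoding the rule this way makes the handoff explicit: monitoring produces the error figures, and the agreed thresholds, not ad hoc judgement, decide which remediation path the team takes.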

Embracing Roles and Behaviours of Different Stakeholders

To succeed in the last mile of analytics, close collaboration between stakeholders with the right skill sets – data scientists, business units, IT and DevOps – is critical. Treating the deployment and management of analytics in production with indifference, leaving it solely to one team (e.g., IT or DevOps), or failing to give all stakeholders the right incentives to communicate will not create value from your analytics or AI initiatives.

For data scientists, development of analytical assets should begin with deployment in mind, while IT or DevOps teams must understand the integration requirements, operational data flows and data preparation needed for model deployment and retraining. The role of business stakeholders is equally important: they must clearly define the benefits expected from the analytical models, collaborate with data scientists to understand the results after models are put into production, and monitor those results on a continuous basis.

Establishing a Systematic Operationalisation Process

Finally, the only way to ensure the value, integrity and transparency of analytical models is to establish a process for operationalising analytics. Many organisations have a well-defined process for the analytics development phase of the analytics life cycle. But a lack of process-centric understanding around the model deployment and management phase of the life cycle is an important barrier that needs to be overcome.

A well-defined process, with proper templates and workflow, needs to validate that a model developed on training data still performs as intended in the real world, and to integrate and execute that same model against operational systems or processes. Some organisations make the mistake of stopping here. In fact, to fully realise value, the performance of models in production needs to be monitored continually.

It’s no surprise that this last mile of analytics – bringing models into deployment – is the hardest part of digital transformation initiatives for organisations to master, yet it’s critical if they’re going to experience real benefits from AI and analytics investment. To systematically realise full potential from data and analytics initiatives, organisations must involve IT and DevOps early on within the data science project such that operationalising analytics is not an afterthought; agree on the quantifiable outcomes before building analytical models; and have a clear understanding of the steps, roles, processes and handoffs involved, from data preparation and model development to putting analytics into action.


About the Author

Dr. Laurie Miles is Director of Analytics at SAS UK & Ireland. SAS is the leader in business analytics software and services, and the largest independent vendor in the business intelligence market. Through innovative solutions, SAS helps customers at more than 70,000 sites improve performance and deliver value by making better decisions faster.

