To say 2023 has been a blockbuster-worthy start for the tech sector would be something of an understatement.
Business leaders have been faced with two extremes. The first is economic uncertainty and high costs, forcing some difficult decisions when navigating the path ahead. The second is the rapid surge in popularity of large language models (LLMs), which has unleashed unprecedented interest in artificial intelligence (AI) and its commercial potential.
This year has led a significant number of organisations to consider whether they are fully taking advantage of the wide array of commercial and operational benefits that AI offers, such as increased efficiency and reduced costs. However, many considerations remain for businesses as they contemplate embarking on their data and AI journey. When adopting new technologies, leaders often overlook the bigger picture and think only about short-term costs. Understanding the total cost of ownership (TCO) is a key piece of this puzzle: looking past what is initially being spent and instead weighing the cost-saving potential further down the line.
Consolidate, simplify, standardise
When beginning the AI journey, leaders need to look at the total picture and consider costs beyond the immediate, up-front cost of ownership. Hidden costs can include cloud compute, purchasing on-premises hardware, unplanned downtime for repairs, and more. The picture becomes even more complex when operating multiple, disparate systems to store and manage data. This may seem like a lot to consider, but it is simpler than many think: it starts with building a strong, modern data foundation at the beginning of the process.
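As an illustration, the back-of-the-envelope comparison below contrasts a fragmented stack with a consolidated one over three years. Every figure in it is hypothetical, chosen purely to show the shape of the calculation rather than to represent real costs.

```python
# Back-of-the-envelope TCO comparison. All figures below are hypothetical,
# invented for illustration only; real inputs vary widely by organisation.

def three_year_tco(upfront, annual_compute, annual_ops, annual_downtime):
    """Up-front spend plus three years of recurring compute, operations
    and unplanned-downtime costs."""
    return upfront + 3 * (annual_compute + annual_ops + annual_downtime)

# Option A: keep several disparate systems (low up-front, high running costs).
fragmented = three_year_tco(upfront=100_000, annual_compute=250_000,
                            annual_ops=180_000, annual_downtime=60_000)

# Option B: consolidate onto one platform (higher up-front migration cost,
# lower recurring costs once duplicated systems are retired).
consolidated = three_year_tco(upfront=300_000, annual_compute=180_000,
                              annual_ops=90_000, annual_downtime=20_000)

print(f"Fragmented stack, 3-year TCO:   £{fragmented:,}")    # £1,570,000
print(f"Consolidated stack, 3-year TCO: £{consolidated:,}")  # £1,170,000
# The option that is cheaper up front is not the cheaper option overall.
```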
Legacy data architectures are often complex enough that information silos form and data cannot be easily distributed. Furthermore, inaccurate datasets containing duplicated or outdated information may be shared, causing larger and more expensive problems down the line. Without good-quality data, there is no way for AI to reach its full potential. This is where modern data foundations, such as a data lakehouse, come into play. A lakehouse standardises how data is processed, formatted and stored. Organisations can reduce the number of different platforms they need, simplifying the management of data and making it readily available for analysis as well as AI and machine learning (ML) use cases. This approach can be scaled and standardised across the organisation to ensure teams are up to speed and able to realise the full potential of their data. In the long run, it is a much more cost-effective way to begin the AI journey.
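To make this concrete, the sketch below shows a minimal lakehouse-style workflow: one open table format holds the data once, and both analytics queries and ML pipelines read the same copy. It assumes PySpark with the open-source delta-spark package; the path, schema and sample rows are illustrative, not a prescribed setup.

```python
# Minimal lakehouse-style workflow using the open-source Delta Lake format.
# Assumes pyspark and delta-spark are installed; the path, schema and sample
# rows are illustrative only.
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

builder = (
    SparkSession.builder.appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Land raw records once, in a single standardised format.
purchases = spark.createDataFrame(
    [("c001", "tea", 2.50), ("c002", "biscuits", 1.80)],
    ["customer_id", "item", "price"],
)
purchases.write.format("delta").mode("append").save("/tmp/lakehouse/purchases")

# Analysts and ML pipelines read the same governed table, so there is no
# need to maintain duplicated copies in separate silos.
df = spark.read.format("delta").load("/tmp/lakehouse/purchases")
df.groupBy("customer_id").sum("price").show()
```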
There are many additional advantages to migrating to a new data management platform. Once the initial migration is finished, data teams often report an improved experience, finding a streamlined approach to managing data far more user-friendly than juggling disparate systems. There are environmental benefits too: operating on one unified platform reduces cloud compute needs and thus lowers the organisation's carbon footprint. Simplifying and consolidating data management therefore benefits not only the IT department but the whole organisation.
The value of AI for increasing efficiencies
Armed with the necessary insights, leaders can decide which technology costs are crucial, which can be reduced, and how to take full advantage of everything AI has to offer. For leaders considering a data and AI strategy, there are a few important things to bear in mind. Most business leaders are currently focused on doing more with less; however, this should not come at the expense of properly managing the data that guides business decisions.
But what, exactly, is the value that AI can provide to businesses? Beyond the benefits to cost and revenue, AI has the power to help businesses revolutionise their practices and create better value for customers. Retailer Marks & Spencer (M&S) is a case in point. M&S launched its data and AI strategy in 2019, focusing on data-led innovation to give the company better insights into customer behaviour. The move was transformative: the data gathered allowed M&S to understand customer preferences from past purchases and to personalise interactions with them. The strategy has enabled the business to apply AI and ML in a number of further ways to drive growth, including an Intelligent Sales Probe that recommends in-store activities for store colleagues. As a whole, the data and AI strategy enabled the company to increase efficiencies and leverage data to predict customers' needs, in turn increasing sales and reducing waste.
Building resilience during volatile times
During volatile times, the natural response can be to cut spending, pause investments and focus on short-term gains. However, a key way to bolster resilience is to think about the bigger picture and focus on long-term strategy. AI's potential for creating efficiencies, reducing costs and delivering value for customers is vast. There will only be more opportunities on the horizon for AI to help organisations meet their objectives, and it all starts with future-proofing their data in anticipation of whatever uncertainty lies ahead.
About the Author
Michael Green is Vice President, Head of Northern Europe at Databricks. Databricks is the data and AI company. More than 9,000 organizations worldwide — including Comcast, Condé Nast, H&M, and over 40% of the Fortune 500 — rely on the Databricks Lakehouse Platform to unify their data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe. Founded by the original creators of Apache Spark™, Delta Lake and MLflow, Databricks is on a mission to help data teams solve the world’s toughest problems.