In the last decade, we have seen organisations embrace the cloud as a way to gain access to automated, elastic and on-demand infrastructure.
What used to take weeks now takes minutes. At the same time, the elasticity of the cloud has also allowed businesses to move towards new models of development.
Together, DevOps and the cloud have torn down the barriers between people and infrastructure.
DevOps and continuous delivery initiatives have become commonplace within businesses, allowing organisations to further improve the frequency, quality and output of software. Organisations have raced towards the latest and greatest technology in the rush to get ahead. The infrastructure needed to embrace new developments – everything from robotics to artificial intelligence (AI) and automation – is readily available. However, in a world where software has become light, agile and streamlined, there is one thing holding organisations back – data.
To move fast, development teams need continuous access to high-quality data, and lots of it. If it takes days to refresh data in a test system, teams are caught between a rock and a hard place: move slower, or compromise on quality at the expense of users.
Data, Data Everywhere…
Data is needed everywhere across an organisation. AI and machine learning applications, for example, require ready access to clean, secure datasets for training and execution. But much of the most important data remains stuck in enterprise systems, stymied by legacy tools and processes. Even organisations that have automated their development and machine learning infrastructure often find they have failed to provide similar capabilities for their data.
Data has grown exponentially in size, complexity, and cost, all with escalating security and privacy concerns. IT teams are forced to limit data access and availability because moving, copying, and securing large amounts of data is simply too cumbersome and costly. Yet those that are driving speed in DevOps, cloud, and other initiatives need data to be everywhere and available on-demand. There’s a very real tension that exists when constraints on data prevent people from meeting the ever-growing demands of the business. And this problem is growing by the second.
But there is hope. DataOps is an emerging cultural movement that aligns people, process, and technology to support high-velocity collaboration and innovation through data. Just as DevOps focuses on developing and operating applications, DataOps focuses on securing, managing, and delivering data. Data, and access for those who need it, is a competitive advantage. According to IBM, 2.5 quintillion bytes of data are created every day – those that can leverage data to discover insights and drive innovation will win; those that can't will lose.
Data needs its seat at the table
We need to move away from organising our teams and technology around the functions through which we manage and consume data, such as application development, information security, analytics and data science. Instead, we need to recognise that data is the critical asset, and bring together everyone who uses or manages data to take a data-centric view of the enterprise. When teams move beyond the mechanics of data delivery and focus instead on the policies and constraints that govern data in their enterprise, they can better align their infrastructure to let data flow to those who need it.
To get there, DataOps requires that teams embrace the complexity of today’s technology landscape and think creatively about common solutions to data challenges in their enterprise. For example, you may have information about specific users and their roles, attributes of the data and what is required to secure it for different audiences, and knowledge of the resources required to deliver that data where it is needed. Bringing those together in one place with novel solutions allows the organisation to move faster. Instead of waiting hours, days or even months for data, environments need to be provisioned in minutes and at the speed required to enable rapid development and delivery of applications and solutions. At the same time, organisations don’t have to choose between access and security; they can operate with confidence that their data is appropriately secured for all environments and users without cumbersome manual reviews and authorisations.
When done right, DataOps offers a cultural transformation that facilitates collaboration between everyone involved with data. Today, data management can be a joint responsibility of operations, database administrators, application teams, security and compliance officers. And while Chief Data Officers control the governance and quality of data, they rarely take any interest in non-production needs. With no-one taking ownership of cross-functional data management, innovation stalls. With powerful collaborative data platforms, however, businesses can ensure that sensitive data is secure and the right data is made available to the right people, when and where they need it – from the engineers who provide the data to the data scientists who interpret it and the developers who test against it.
The next decade will reshape the face of computing through IoT devices, machine learning, augmented reality, voice computing, and more. And with that change will come more data, more security concerns, and more regulation. This will put incredible pressure on organisations, and whoever cracks the problem first will win. With DataOps, IT can overcome the cost, complexity, and risk of managing data to become an enabler for the business, while users get the data they need to unleash their capacity for innovation.
If DevOps and cloud were key enablers of today’s digital economy, DataOps is poised to be the engine of the future data economy.
About the Author
Eric Schrock is CTO at Delphix. Delphix frees companies from data friction to accelerate innovation. The company believes your data should ascend to its rightful place as a pillar of your cloud, digital and governance strategy.
