“De-silofication”: what is it?
Many successful companies today have found their own ways of connecting data, people, and ideas. What sets them apart is how they are taking advantage of an unstoppable force — the increased fragmentation of data, computing, and usage.
Undergoing this data-driven digital transformation — leveraging data as a strategic asset to better inform business decisions — is more popular than ever, and is therefore increasing the need for data literacy across an organization. But adjusting to this new reality can be challenging. With information chaos making headlines in recent months, the data landscape is difficult to navigate, prompting an increased need for governance, security, and data quality.
The question is: how do we balance the move into an analytics economy while maintaining privacy? It’s about taking fragmented data, people, and ideas out of their silos and connecting them in agile, innovative, and governed ways — known as the “de-silofication” of data.
We’ve identified 11 emerging trends that will start to make this possible for organizations, helping them to transform their business. Read on to find out what they are, and how you can help your business make this move.
Trend #1: Data literacy will gain company-wide and societal priority
Data literacy, defined as the ability to read, work with, analyze, and argue with data, is becoming more important in today’s analytics economy. In fact, Gartner1 predicts 80% of organizations will work to increase data literacy across their workforces by 2020. To begin making this change, leading software companies will start offering data literacy programs, and forward-thinking organizations will take a structured approach to increasing data literacy.
Trend #2: Hybrid multi-cloud will emerge to connect the dots
Cloud services will proliferate even faster than IT leaders expect.1 But some data will need to be moved out of the cloud for regulatory, security, cost, and performance reasons. This, in addition to more computing at “the edge,” will lead to fragmented data and application domains. As a result, analytical architectures that can handle multi-cloud, multi-platform, and hybrid environments will become the new norm.
Trend #3: Data gets edgy
There is a growing number of use cases, especially around IoT, offline mobile, and immersive analytics, where it’s more beneficial for organizations to run workloads locally instead of through public data centers. As a result, we will see a dramatic increase in workloads1 run directly on a variety of devices — the optimal approach for latency, bandwidth, autonomy, and privacy.
Trend #4: Big Data, data discovery, and data science will converge
Typically, these three areas are separate because their users have different tools and skill sets. And while this should still be the case sometimes (e.g., data scientists and engineers should be the ones working with algorithms and data models), there are now many more ways to share their work with a broader audience. Promising progress in machine intelligence, big data indexing, and engine-to-engine integration is opening new opportunities for users to fully explore many big, complex, and varied data sets.
Trend #5: Data catalogs will become the next frontier for self-service
For a person to be truly data literate, it’s important that they not only be able to analyze data, but also be able to read, work with, and argue with it. As a result, in recent years it’s become easier to go beyond self-service analysis into self-service data preparation in a more visually compelling way. Recently, we’ve seen the same self-service trend emerging around data catalogs. But these have still largely been for experts, applied on top of data lakes. In the future, new ways of cataloging data will be more deeply integrated with the data preparation and analysis experience. This will bring data cataloging to a broader audience that can easily combine governed corporate data, data lakes, and external data as a service.
Trend #6: Need for interoperability and new business models puts focus on APIs
As data, computing, and usage become more distributed, so do the technology environments of corporations. Companies are no longer looking for end-to-end solutions and single stacks, as these no longer reflect their actual architectures. Rather, they look for parts that can easily be stitched together, because it’s more important that different software systems talk to each other. This means that analytics platforms in this new environment need to be open and interoperable, with extensibility, embeddability, and modern APIs. This interoperability will shift analytics from being a destination to being embedded in workflows, blurring the line between BI applications as we know them today and the data-driven apps that fuel the analytics economy.
Trend #7: Blockchain hype will drive experimental applications beyond cryptocurrencies
New techniques are emerging for processing, managing, and integrating distributed data, making the location of data an increasingly smaller factor in information strategies. This means ideas can be sourced from blockchain and peer-to-peer technologies. While this is still in the beginning stages, 2018 will see innovation move beyond cryptocurrencies to experimental applications for analytics and data management.
Additionally, connectivity and analytics will extend to blockchain ledgers themselves. But ultimately, the bigger benefit may lie in the ability to verify the lineage and authenticity of data using blockchain technology.
Trend #8: Analytics become conversational
The consumption of and interaction with analytics have long been focused on drag-and-drop style dashboards, list boxes, and visualizations. While there continues to be value in that, approaches are increasingly available for what can be categorized as “conversational analytics,” which simplifies analysis, findings, and storytelling so that users can more easily get to the one data point they are after. This includes techniques such as natural language query, processing, and generation, augmented by search and voice. This technology, helped by virtual assistants and chatbots through API integration, provides a new means of interaction. But it’s not one-size-fits-all. While out-of-the-box functionality may seem novel, the real value is in contextualizing it for a particular use case and business process.
Trend #9: Reporting redefined. This time highly contextualized
We realize that not everyone will want to, or have the time to, explore their data in detail every time. Instead, we will see different users with varying skill sets. This means that reporting will be redefined by providing not just analysts, but also participants, with highly contextualized information — inverting analytics as we know it today. Rather than users having to go to a destination to perform an analysis, the analysis will come to them, embedded into the workspace where people already are. This means getting the right information to the right people, at the right time, in the right place, and in the right context. And in that process, many more people will be empowered with data and analytics than ever before.
Trend #10: Analytics become immersive
Given that the price of virtual reality devices remains too steep for mainstream adoption, mainstream augmented reality is likely still several years away. The breakthroughs will most likely happen in enterprise use cases, with analytics playing a role. But immersive experiences can also take other forms, engaging users from a sensory and social standpoint. Through better user interfaces, large-scale displays in digital situation rooms, better storytelling with data, and collaborative features, more people will be drawn to using analytics.
Trend #11: Augmented intelligence systems turn users into participants and facilitators
In its current state, the most effective use of Artificial Intelligence (AI) is applying it to a diverse but specific set of problems. Blending AI with technologies such as intelligent agents, bots, and automated activities, along with traditional analytical tools such as data sets, visualizations, dashboards, and reports, will make data more useful. That alone, however, isn’t enough. What’s needed instead is a system in which machine intelligence and humans participate in a broader ecosystem, exchanging with and learning from each other — this is known as augmented intelligence.
About the Author
Dan Sommer is the Senior Director, Global Market Intelligence Lead at Qlik, responsible for the supply, demand, macro, and micro picture. He is a former Gartner analyst specializing in markets, trends, competitive landscape evaluations, and go-to market strategies.