Global businesses need global campaigns built on global data – but that is impossible to achieve if the data within each market is siloed away, unable to be collated and examined on a global scale.
Many organisations are wary of leveraging data from one market in another – justifiably so, given their obligation to protect users’ data as mandated by each market’s regulatory authority. It’s right to be concerned, and to hold to the highest standards of data governance and accountability – but data protection regulations do allow you to make use of sensitive customer or corporate personally identifiable data, provided it’s handled responsibly and in an auditable way that proves best practices are being followed.
Something data analysts soon realise – whether they are fully accredited data scientists or what Gartner calls ‘citizen data scientists’, i.e. line-of-business employees who need to use corporate data to do their jobs better – is that a good process is key to a good result.
There are a few areas where change is needed if your business has access to large quantities of good data but fails to use it efficiently and effectively because of uncertainty or apprehension over its processes and capabilities.
Governance: Safely share data
This might not be language that makes anyone other than a committed data scientist fall head over heels, but a shared data hub built on modern data cataloguing should make every person in any data role jump for joy.
A trusted, shared business and data catalogue empowers everyone – everyday analysts in the line of business, data scientists, and casual users – because they go to one central place to create and share data-derived actions. The catalogue itself is where vetted, prepared, safe data is stored. It’s part of a strong business process in which true data is sourced from the business and accessed correctly by those with permission.
A mature catalogue will have extensive information categorisation so that analysts can focus their work by data domain (e.g., look only at customer data), compliance risk (e.g., avoid data about EU residents if not based there), and trustworthiness or usefulness (as rated by other users).
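To make that concrete, here’s a minimal sketch of how catalogue metadata might support that kind of filtering. It’s purely illustrative – the field names, compliance tags and ratings are assumptions for the example, not the schema of any particular catalogue product.

```python
from dataclasses import dataclass

# Illustrative catalogue entry: fields are assumptions, not a vendor schema.
@dataclass
class CatalogueEntry:
    name: str
    domain: str            # e.g. "customer", "finance", "marketing"
    compliance_tags: set   # e.g. {"GDPR", "HIPAA"}
    trust_rating: float    # average rating from other users, 0-5
    owner: str

def find_datasets(catalogue, domain=None, exclude_tags=frozenset(), min_trust=0.0):
    """Filter entries by data domain, compliance risk, and trustworthiness."""
    return [
        entry for entry in catalogue
        if (domain is None or entry.domain == domain)
        and not (entry.compliance_tags & exclude_tags)
        and entry.trust_rating >= min_trust
    ]

# Example: an analyst outside the EU looks for trusted customer data
# while steering clear of GDPR-scoped datasets.
catalogue = [
    CatalogueEntry("uk_customers", "customer", {"GDPR"}, 4.5, "data.governance"),
    CatalogueEntry("us_customers", "customer", set(), 4.1, "data.governance"),
]
print(find_datasets(catalogue, domain="customer", exclude_tags={"GDPR"}, min_trust=3.0))
```

The point is simply that once datasets carry domain, compliance and trust metadata, narrowing a search to “trusted customer data with no GDPR exposure” becomes a quick query rather than a manual hunt.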
It’s not just that adhering to the alphabet soup of regulations – GDPR, HIPAA, Sarbanes-Oxley and so on (and there are many more in regulated industries that touch on data protection and privacy) – is legally required. Beyond the fines and other legal ramifications, it makes a great deal of sense for organisations of all types to upgrade the way they manage customer and other corporate data.
Over the past few years, many businesses have discovered that their big data lakes have become big data swamps. Errors, inconsistent categorisation and poor governance snowball, making it harder year on year to extract insight from bad data – and that’s before factoring in the global big data and analytics skills shortage that has persisted for years. Nonetheless, organisations need to be able to use the big data they collect, because even marginal gains across product lines, departments and business performance will likewise build and snowball – this time towards a positive outcome.
Moving to a catalogue-based process brings many virtuous precursor and second-order effects. To vouch for data, it must be kept up to date and cleansed. The organisation gains a clearer view of all its customers, partners and so on, and of the relationships it has with them. Customer experience and communication then improve as errors are removed and the customer journey becomes more efficient. The end result is genuine accountability, credibility, and trust in the business processes that allow the enterprise to take truly decisive action.
Privacy and security
‘Data democratisation’, enabled by analytics solutions, has made sophisticated analytics accessible to the average business user. It needn’t come at the expense of privacy, though.
The cloud has smoothed over traditional organisational barriers and made it much easier for businesses to streamline and open up enterprise data while still retaining control of access rights and security – yet this remains a little-known secret. The organisation needs to make data accessible to all who need it – and not to those who don’t.
It’s also related to the quality component mentioned earlier. There’s no use in Finance trying to gauge how effectively Marketing budgets are being spent if the two teams aren’t working from the same metrics or underlying data. And just because Finance has rights to view certain data doesn’t mean they need to see all the data Marketing uses.
Whilst the whole business should be working to common standards and from a shared view of the truth, each team should only be able to access data based on business need.
A data catalogue should allow the governance team to set owners and access rights, ensuring that sensitive data is processed only by the right personnel. Furthermore, such data should only be shared in accordance with policy, so that an authorised user can’t make an honest mistake.
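As a rough illustration of that last point, the sketch below shows how a policy check might restrict who can open a dataset and which columns they see. The dataset, roles and column names are hypothetical assumptions for the example, not the API of any real governance tool.

```python
# Hypothetical policy definition – names and rules are illustrative assumptions.
POLICIES = {
    "marketing_spend": {
        "owner": "marketing.governance",
        "allowed_roles": {"marketing_analyst", "finance_analyst"},
        # Finance can see spend figures, but not the customer-level columns.
        "column_access": {
            "finance_analyst": {"campaign_id", "spend", "period"},
            "marketing_analyst": {"campaign_id", "spend", "period", "customer_segment"},
        },
    },
}

def authorised_columns(dataset: str, role: str) -> set:
    """Return the columns a role may see, or raise if access isn't permitted."""
    policy = POLICIES[dataset]
    if role not in policy["allowed_roles"]:
        raise PermissionError(f"{role} has no access to {dataset}")
    return policy["column_access"][role]

# A finance analyst checking marketing spend sees only what policy allows.
print(authorised_columns("marketing_spend", "finance_analyst"))
```

Encoding the policy once, rather than relying on each individual user’s judgement, is what prevents the honest mistake.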
Remember!
Governance or flexibility isn’t a binary choice: you don’t have to sacrifice one for the other. From data access and data management to authorisation and authentication, a good analytics platform should ensure analytic governance, data quality and data security alike. At the same time, it should make the lives of all data users (as well as IT) easier and more enjoyable. This article only introduces the data-sharing topic; those who want to know more should investigate the power of an analytics platform that enables all these concepts and serves the needs of analytic talent and their business goals, rather than the growth of uncontrolled data for its own sake.
About the Author
Nick Jewell is Director, Product Solutions, Alteryx. Nick started his career with a Ph.D. in Chemoinformatics (Data Science for Drug Design), which helped develop his life-long passion for applied analytics. As a long-serving analyst, architect and innovation leader for Analytics & BI at a large international bank, Nick helped define and implement big data, analytics and informatics solutions at petabyte scale for global audiences. Nick has a passion for open data, connectivity and emerging technologies in the data and analytics space, and is a strong supporter of agile-led and user-centric design.