Historically, many businesses have relied heavily on single, monolithic business software platforms to manage all their key functions, from accounting and finance to human resources, manufacturing and supply chains.
While these systems offer a range of advantages, including integrated business processes and simplified IT management, they are also expensive to buy and operate, complicated to implement and inflexible.
More recently, however, technology trends have changed, and businesses have shifted to using specialised software across a wide variety of different functions. This ‘best of breed’ approach has driven the growth of more focused software tools, such as Salesforce for CRM, Zendesk for service delivery and Microsoft Teams for communication, among many other alternatives.
In this context, integrating a suite of software solutions from different vendors, both internally and externally, fuels a hugely successful and growing segment of the software industry. Yet the most common approach to technical integration hasn’t moved on at anything like the same pace.
For example, traditional integrations were often based on packaging data into CSV or XML files and then moving those files from a location on one server to a location on another. While this approach is useful, it has some important limitations: the sending system often has little or no visibility of whether the destination system received the data and was able to process it successfully.
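To make that limitation concrete, here is a minimal sketch of a file-drop integration in Python. The file names, folder paths and order data are purely illustrative assumptions, not taken from any particular product.

```python
import csv
import shutil
from datetime import date

# Hypothetical example: export today's orders as CSV and drop the file in a
# shared folder that the destination system is expected to poll.
orders = [
    {"order_id": "1001", "customer": "Acme Ltd", "total": "249.99"},
    {"order_id": "1002", "customer": "Globex", "total": "89.50"},
]

local_file = f"orders_{date.today().isoformat()}.csv"
with open(local_file, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["order_id", "customer", "total"])
    writer.writeheader()
    writer.writerows(orders)

# "Send" the data by copying the file to the agreed drop location.
shutil.copy(local_file, "/mnt/erp_inbox/" + local_file)

# And this is where the sending system's knowledge ends: there is no
# acknowledgement that the destination ever picked the file up, parsed it
# correctly or loaded every row successfully.
```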
To address these issues, the industry developed APIs that allowed a system to establish whether a data transfer succeeded and, if not, what the errors were. Unfortunately, however, the issues didn’t end there. Traditionally, integrations were custom-coded within the system itself: code would be written either to generate a file and place it on the required server or to call the destination system’s API.
The problem with this approach is that each integration is custom-made, so it not only has to work perfectly under normal conditions but also has to handle anything that goes wrong. For example, if the API rejects the data, should the sending system try again? If so, how many times? Should it keep a log of all sent messages, and should that log be cleared periodically? These questions matter because every one of these integration challenges has to be solved again for each project.
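As a rough illustration of what every custom-coded integration ends up re-implementing, the sketch below hard-codes one possible set of answers to those questions: retry up to three times, wait between attempts and log every send. The endpoint, payload and policy choices are assumptions made for the example, not a reference implementation.

```python
import logging
import time

import requests

# Illustrative assumptions: the endpoint URL, the 3-attempt limit and the
# 5-second delay are arbitrary choices made for this sketch.
DESTINATION_API = "https://destination.example.com/api/orders"
MAX_ATTEMPTS = 3
RETRY_DELAY_SECONDS = 5

logging.basicConfig(filename="integration.log", level=logging.INFO)


def send_order(order: dict) -> bool:
    """Post one record to the destination API, retrying on failure."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            response = requests.post(DESTINATION_API, json=order, timeout=10)
            response.raise_for_status()
            logging.info("Order %s accepted on attempt %d", order["order_id"], attempt)
            return True
        except requests.RequestException as exc:
            logging.warning("Order %s failed on attempt %d: %s", order["order_id"], attempt, exc)
            time.sleep(RETRY_DELAY_SECONDS)
    logging.error("Order %s abandoned after %d attempts", order["order_id"], MAX_ATTEMPTS)
    return False


send_order({"order_id": "1001", "customer": "Acme Ltd", "total": 249.99})
```

Multiply those decisions across every point-to-point integration in an estate and the maintenance burden becomes clear.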
Addressing the challenges
So where does that leave modern organisations that need their business applications to work together as part of a unified approach?
In many cases, businesses are replacing their legacy software solutions with a modular selection of applications hosted within a public cloud environment. Given the increasing maturity of this market, there is now a range of application stores and marketplaces from the likes of AWS, Microsoft and Google. These have made it much easier for IT teams to identify, purchase and integrate proven applications as part of a bespoke, enterprise-wide ERP strategy.
Take the experiences of mid-market organisations, for example, where this choice and flexibility is helping them not only to improve efficiency and performance but also to achieve significant cost savings. In this context, public cloud adoption is going hand-in-hand with digital transformation.
Clearly, the process isn’t quite as simple as selecting applications in a marketplace and switching them on. Organisations still need to ensure software components operate securely within their environments and can communicate effectively, which can generally be addressed via API technologies and software gateways. In addition, Microsoft offers a range of Azure Integration Services, including API Management (APIM) and Service Bus, to speed integration and prevent data silos.
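As a hedged sketch of how a service bus decouples two applications, the snippet below uses the azure-servicebus Python SDK. The connection string and queue name are placeholders, and in practice the sending and receiving applications would run these two halves independently.

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Placeholders: supply your own Service Bus connection string and queue name.
CONNECTION_STR = "<service-bus-connection-string>"
QUEUE_NAME = "erp-orders"

# The sending application publishes a message to the queue and moves on;
# it does not need to know which system consumes it, or when.
with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_sender(queue_name=QUEUE_NAME) as sender:
        sender.send_messages(ServiceBusMessage('{"order_id": "1001", "total": 249.99}'))

# Elsewhere, a consuming application drains the queue at its own pace and
# completes each message only once it has been processed successfully.
with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        for message in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            print(str(message))
            receiver.complete_message(message)
```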
Focusing on data strategy
Once IT teams have selected and integrated the right business applications within their environment, the next step is to focus on data strategy. The main objective here should be to ensure that data is of the highest quality and can be used to address a diverse range of key business objectives, from driving profit, efficiency and innovation to improving customer service.
This process can be complex and challenging, but there are a number of steps organisations can take to fully exploit their data assets. These include optimising the performance and availability of an existing data environment and prioritising the migration of data systems. From a data infrastructure perspective, building or expanding data warehouse or data lake environments can play a major role in ensuring an organisation can cope with current and future data volumes. In other situations, the best course of action could be to modernise legacy data systems and expand current data analysis capabilities to improve business intelligence.
According to recent mid-market research, many companies are already going down this route, with over a third planning to build or expand their data warehouse or data lake environments in the next 12 months.
This forms part of a wider picture where businesses are increasingly focusing on the benefits they can deliver by integrating modern, cloud-based business applications. Those that do will be ideally placed to succeed in an environment where data-driven decision-making and agile operations are key to remaining competitive and driving growth.
About the Author
Paul Cartwright is Technology & Innovation Director, ERP at Node4. Node4 empowers public and private sector organisations across the UK to adopt technology and infrastructure that helps them drive positive outcomes and create a lasting impact. With a varied portfolio of tech solutions and managed services on offer, including Business Applications, Modern Workplace, cloud hosting, network, and data and security, our clients can turn to us knowing that they’ll be supported to achieve their strategic goals. Our fully owned data centre network and partnerships with market-leading vendors including Microsoft, Cisco and Fortinet help us tailor the best platforms, applications and infrastructure to meet our clients’ needs.