Traditional Active-Passive High Availability Practices are Dead

Today’s always-on businesses require true active-active capabilities to keep up with modern real-time data demands

In today's on-demand, always-on economy, data availability has become a vital component of businesses' IT strategies. Data must always be both accessible and available, no matter what is happening behind the scenes. Users expect immediate access to relevant data wherever they are and on whatever device they are using. In this context, high availability will, at the very least, protect companies from revenue losses when access to their data resources is disrupted. But in sectors such as autonomous driving or healthcare, people's lives depend on systems being available and functioning at all times. A failure in the operating system of an autonomous vehicle, for example, could cause an accident, endangering its passengers as well as other vehicles and pedestrians. As everybody's physical and digital lives merge, enterprises serve as data stewards, helping to navigate information and provide value.

Most systems today rely on traditional active-passive High Availability (HA) technologies to maintain data availability and failover. An active-passive configuration consists of at least two nodes, only one of which is active at a time: the primary system handles the workload, while a backup system remains on standby, passive, ready to take over if the primary becomes disconnected or unavailable. While this has been sufficient in the past, it no longer keeps up with today's pace of business, because this failover configuration requires time and often manual intervention, and therefore downtime.
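To make the failover mechanics concrete, here is a minimal sketch in Python of the active-passive pattern described above. All names (Node, monitor_and_failover, the heartbeat check) are illustrative rather than taken from any particular HA product; a real deployment would probe over the network and repoint clients via DNS or a virtual IP.

```python
import time

class Node:
    """A storage or application node that can hold either role."""
    def __init__(self, name: str, role: str):
        self.name = name
        self.role = role        # "active" or "passive"
        self.healthy = True

    def heartbeat(self) -> bool:
        """Stand-in for a real health check (ping, HTTP probe, etc.)."""
        return self.healthy

def monitor_and_failover(primary: Node, standby: Node, interval: float = 1.0) -> Node:
    """Poll the primary; promote the standby when the primary stops responding.

    This illustrates the key weakness of active-passive HA: between the
    primary's failure and the standby's promotion there is a window of
    downtime, and in many real deployments the promotion step is manual.
    """
    while True:
        if not primary.heartbeat():
            print(f"{primary.name} failed; promoting {standby.name}")
            standby.role = "active"
            primary.role = "passive"
            return standby      # clients must now be repointed here
        time.sleep(interval)

# Simulated failover: the primary dies, the standby is promoted.
primary = Node("node-a", "active")
standby = Node("node-b", "passive")
primary.healthy = False             # simulate a crash
new_active = monitor_and_failover(primary, standby)
print(f"new active node: {new_active.name}")
```

The gap between the crash and the promotion, plus the repointing of clients, is precisely the downtime the article argues is no longer acceptable.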

In contrast, active-active systems distribute their workloads across multiple active servers, and often multiple sites, through load balancing. When one node fails, users are automatically rerouted to the next active node without any manual or IT administrator intervention, so the user experiences continued productivity without interruption. An active-active cluster is usually made up of at least two nodes, both actively running simultaneously, which enables load balancing. And because the workload is distributed across all nodes, no single node becomes overloaded. This configuration also delivers faster throughput and response times, as there are more nodes available to serve requests.
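The rerouting behaviour can be sketched in a few lines. The following Python example is a simplified illustration, assuming a round-robin balancer in front of nodes that are all live at once; the class and node names are hypothetical, and production systems would use a real load balancer or cluster-aware client instead.

```python
import itertools

class Node:
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

    def handle(self, request: str) -> str:
        if not self.healthy:
            raise ConnectionError(f"{self.name} is down")
        return f"{self.name} served {request}"

class ActiveActiveCluster:
    """Round-robin load balancer over nodes that are all active at once."""
    def __init__(self, nodes: list[Node]):
        self.nodes = nodes
        self._ring = itertools.cycle(nodes)

    def route(self, request: str) -> str:
        # Try each node at most once per request; unhealthy nodes are
        # skipped automatically, so a single node failure never
        # interrupts the client.
        for _ in range(len(self.nodes)):
            node = next(self._ring)
            if node.healthy:
                return node.handle(request)
        raise RuntimeError("no healthy nodes available")

cluster = ActiveActiveCluster([Node("edge-1"), Node("edge-2"), Node("dc-1")])
print(cluster.route("req-1"))        # served by edge-1
cluster.nodes[1].healthy = False     # edge-2 fails mid-operation
print(cluster.route("req-2"))        # edge-2 is skipped transparently
print(cluster.route("req-3"))
```

Note there is no promotion step at all: every node is already live, so a failure simply removes one destination from the rotation.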

Ultimately, the main difference between the active-passive and active-active architectures is performance. By distributing the workload across several nodes, an active-active cluster gives you access to the resources of all your servers, whereas an active-passive cluster only activates the backup server at failover. Engines such as NetApp SnapMirror, Dell EMC SyncIQ and Nutanix Smart DR are examples of active-passive technologies: data gets duplicated either within the same data centre, from one system to another, or asynchronously between separate sites. While technically solid, the passive nature of the backup copy and the manual intervention required to make it live are less than ideal.
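The pattern those engines share, stripped of vendor specifics, is periodic one-way replication to a copy that is never itself written. The sketch below is a toy illustration of that pattern in Python, not the actual API of SnapMirror, SyncIQ or Smart DR; the Volume class and the replicate function are invented for the example.

```python
from copy import deepcopy

class Volume:
    """Toy stand-in for a storage volume: a dict of file -> contents."""
    def __init__(self, name: str):
        self.name = name
        self.files: dict[str, str] = {}

def replicate(source: Volume, target: Volume) -> None:
    """One-way, point-in-time copy. The target mirrors the source's
    last snapshot but never takes writes directly: it is passive."""
    target.files = deepcopy(source.files)   # point-in-time view

# The primary takes writes; the replica only ever receives snapshots.
primary = Volume("prod-vol")
replica = Volume("dr-vol")

primary.files["report.txt"] = "v1"
replicate(primary, replica)

primary.files["report.txt"] = "v2"      # written after the last sync...
print(replica.files["report.txt"])      # ...so the replica still holds "v1"
```

Anything written since the last replication cycle is at risk, and promoting the replica to live service remains a separate step, which is exactly the "passive copy plus manual intervention" limitation described above.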

Traditional active-passive technologies have certainly evolved, and there is a current trend towards true active-active systems to provide what today's users expect. The main reason is that consumers have come to depend on real-time data. Indeed, according to an IDC whitepaper, "today, more than 5 billion consumers interact with data every day – by 2025, that number will be 6 billion, or 75% of the world's population. In 2025, each connected person will have at least one data interaction every 18 seconds." But let's take a step back and look at the macro picture: today's economy relies hugely on data, and this will only increase as companies collect and analyse every step of their production and supply chains. The global datasphere is constantly growing as consumers live in a digital world, creating content through nearly every aspect of their lives. These on-demand expectations place great emphasis on the edge and the core being able to deliver the data consumers require in real time, which is why the active-active model is of the utmost importance.

The global datasphere is growing fast, and companies need to ensure they are not left behind when it comes to collecting, analysing, and processing data in real time to meet the expectations of today's economy. In this context, and to seize the opportunities of the always-on world, organisations should ensure that data is actionable wherever it resides (active-active), that it sits as close as possible to the end user for performance reasons, and that the load of data processing is spread across all compute and storage nodes, whether at the edge, in the data centre, or in the cloud.


About the Author

Wasima Khan is Director Technical Programme Management at Peer Software. Peer Software develops data management solutions addressing the unique challenges related to data migration, backup, replication and collaboration in a WAN environment. Since 1993, Peer solutions have been in use globally by over 10,000 corporate, government, and education customers, including half of the US Fortune 100.

