This article shines a spotlight on data backups, exploring how this vital practice has evolved over the years, and how ever-more sophisticated data loss threats drive the continual push for innovative new capabilities and procedures to keep organisations ahead of the game.
NAKIVO’s Sergei Serdyuk provides best practice tips to help prevent data loss, and highlights the biggest mistake companies make when it comes to planning data backups.
Over the past 20 years, data volumes have multiplied many times over, as organisations continue to create and store countless terabytes of data every day. With larger datasets comes demand for more efficient backup techniques than legacy solutions can provide. As the limitations of legacy tools became more evident, the data industry saw a large-scale switch from legacy backup technologies to more contemporary approaches. One of the main results of this shift is a dramatic reduction in the average time it takes to create a backup.
The evolution of backup technologies and the continuous data protection (CDP) approach have made it possible to cut backup windows from days or weeks down to mere seconds. The global virtualisation trend that started in the early 2000s has also boosted the efficiency of backup administration, bringing about faster agentless, image-based backup processes.
Consequently, it is now possible to restore critical workloads to the state they were in just seconds before a disruptive event, at a much lower downtime cost. This is a remarkable improvement, especially for organisations bound by strict recovery objectives, such as financial institutions and e-commerce companies.
Data protection tactics for data breaches
Regardless of how reliable a storage platform is, keeping all critical data stored in one place is a disaster waiting to happen for any organisation. To avoid the pains of security breaches, ransom payments, and data leaks, companies should aim to create and distribute backup copies across multiple onsite and offsite storage destinations.
Another way to truly keep ransomware at bay is to apply immutability to backup data. Immutability means data is stored in such a way that it cannot be altered, deleted, or encrypted by ransomware.
The ideal data backup solution should have a well-rounded set of ransomware protection and recovery features, allowing customers to achieve near-zero downtime and avoid paying ransom in return for access to their data. For example, the solution might store backups in ransomware-resilient Amazon S3 buckets or in hardened Linux-based local repositories to prevent data deletion or encryption by ransomware. Ideally, IT admin teams would also be able to use backup-to-tape functionality to create air-gapped backups on tape, further reducing the chance of ransomware encryption.
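To make the immutability idea concrete, here is a minimal sketch using Amazon S3 Object Lock through the boto3 SDK; the bucket name, file name, and 30-day retention period are placeholders, not recommendations:

```python
# Minimal sketch: writing an immutable backup object to Amazon S3
# using Object Lock in compliance mode. Bucket name, key, and the
# 30-day retention period are illustrative placeholders.
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3")

# Object Lock must be enabled when the bucket is created.
s3.create_bucket(
    Bucket="example-backup-bucket",
    ObjectLockEnabledForBucket=True,
)

# Upload a backup file with a compliance-mode retention date.
# Until that date, the object cannot be altered or deleted,
# which is exactly the property ransomware cannot defeat.
retain_until = datetime.now(timezone.utc) + timedelta(days=30)
with open("backup-2024-01-01.bak", "rb") as backup:
    s3.put_object(
        Bucket="example-backup-bucket",
        Key="backups/backup-2024-01-01.bak",
        Body=backup,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=retain_until,
    )
```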
With this support for a wide range of storage platforms, backup distribution should be streamlined according to the golden rule of backup: the 3-2-1 rule. This methodology involves keeping at least three copies of the data on two different storage media, with one copy stored offsite. For instance, IT administrators might keep their primary backup in a local folder, a backup copy on a NAS-based file share, and another in Amazon S3. This way, should a data breach occur, the solution can swiftly perform full or instant granular recovery and restore workloads to their pre-disaster state with minimal downtime.
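A minimal sketch of that 3-2-1 layout, assuming a hypothetical local backup file, a NAS share mounted at /mnt/nas, and an illustrative S3 bucket name:

```python
# Minimal 3-2-1 sketch: one primary backup plus two copies on
# different media, one of them offsite. Paths and the bucket
# name are hypothetical placeholders.
import shutil

import boto3

primary = "/var/backups/db-2024-01-01.bak"       # copy 1: local folder
nas_copy = "/mnt/nas/backups/db-2024-01-01.bak"  # copy 2: NAS file share

# Copy 2: second storage medium (NAS mounted locally).
shutil.copy2(primary, nas_copy)

# Copy 3: offsite copy in Amazon S3.
boto3.client("s3").upload_file(
    primary,
    "example-offsite-backups",     # bucket
    "backups/db-2024-01-01.bak",   # key
)
```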
Innovative backup capabilities
Organisations require reliable, fast, and affordable data protection for virtual, physical, cloud, and SaaS environments from a single pane of glass. IT administrators seek to create and automate incremental, image-based, application-aware backups of their critical workloads using the chosen solution. In the event of a disruption, the instant recovery of entire VMs or individual files and application objects should be just a few clicks away. It is important that the backup and data recovery solution be easy to use, with performance-boosting features. The latest innovations in this area include Global Deduplication, Network Acceleration, and LAN-free data transfer, which boost backup speed and optimise storage space usage.
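To illustrate the principle behind deduplication (not any particular vendor's implementation), the sketch below splits a backup into fixed-size chunks and stores each unique chunk only once, keyed by its SHA-256 hash:

```python
# Simplified illustration of block-level deduplication: split a
# backup into fixed-size chunks, store each unique chunk once
# (keyed by its SHA-256 hash), and keep a per-backup recipe of
# hashes for reassembly at restore time.
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB fixed-size chunks

chunk_store: dict[str, bytes] = {}  # hash -> chunk (an object store in practice)

def deduplicate(path: str) -> list[str]:
    """Return the list of chunk hashes that reconstructs the file."""
    recipe = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            # Identical chunks across backups are stored only once.
            chunk_store.setdefault(digest, chunk)
            recipe.append(digest)
    return recipe

def restore(recipe: list[str], path: str) -> None:
    """Reassemble a file from its chunk recipe."""
    with open(path, "wb") as f:
        for digest in recipe:
            f.write(chunk_store[digest])
```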
It would be advantageous for the chosen backup solution to offer an extensive feature set. This could include tools that streamline backup automation, reduce the chance of human error, and save resources, such as Policy-Based Data Protection, Job Chaining, pre- and post-job scripts, and an HTTP API (Application Programming Interface) that enables communication between systems.
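As a purely hypothetical illustration of job automation over an HTTP API, the sketch below triggers a job and polls its status; the host, endpoints, token, and payload fields are all invented for illustration, so consult the actual product's API reference for real paths:

```python
# Hypothetical sketch of backup job automation over an HTTP API.
# Host, endpoints, token, and payload fields are invented for
# illustration; real products define their own API surface.
import time

import requests

BASE = "https://backup.example.com/api/v1"
HEADERS = {"Authorization": "Bearer <api-token>"}  # placeholder token

# Trigger a (hypothetical) backup job by ID.
requests.post(f"{BASE}/jobs/42/run", headers=HEADERS, timeout=30)

# Poll the job until it reaches a terminal state.
state = "running"
while state not in ("completed", "failed"):
    time.sleep(10)
    state = requests.get(
        f"{BASE}/jobs/42", headers=HEADERS, timeout=30
    ).json()["state"]

print(f"Backup job finished with state: {state}")
```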
Furthermore, a Site Recovery functionality can help automate disaster recovery workflows and smoothly resume operations at a secondary site after a disruptive event. Ultimately, with these advanced capabilities, businesses can ensure full preparedness for various incidents ranging from natural disasters to cyberattacks, all while maintaining 24/7 data availability.
Data backup planning best practice
Creating regular backups ensures that critical data is always available when the need arises. However, many companies fail to recover their backups when they need them most because they skip a vital stage of backup administration: testing. This is one of the biggest mistakes companies make when planning backups. IT administrators might find the routine verification of backup data integrity tedious, but the testing process is essential to ensure a successful recovery. After all, backup data is not immune to corruption and loss either.
Setting up a schedule for periodic backup verification is a healthy IT practice that can save an organisation the surprise of finding out a backup is unrecoverable during an outage. For optimal results, IT administrators should aim to run a test right after a backup job concludes. A test should verify both that the targeted backup can be restored and that the restored data is functional. Businesses should also re-test existing backups as their infrastructures change. This ensures data recoverability over time and helps set more accurate recovery objectives.
A reliable automation system for backup verification can save considerable resources and eliminate the risk of human error associated with manual backup administration and testing.
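A minimal sketch of what such automated verification might look like, assuming checksums recorded at backup time and placeholder paths; a production job would also boot-test the restored workload and raise alerts on failure:

```python
# Minimal verification sketch: compare the checksum of a restored
# file against the checksum recorded when the backup was created.
# Paths and the digest are hypothetical placeholders.
import hashlib
import os
import shutil

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1024 * 1024), b""):
            h.update(block)
    return h.hexdigest()

def verify_backup(backup_path: str, recorded_digest: str) -> bool:
    """Restore to a scratch location, then compare checksums."""
    os.makedirs("/tmp/verify", exist_ok=True)
    restored = "/tmp/verify/restored.bak"
    shutil.copy(backup_path, restored)  # stand-in for a real restore step
    return sha256_of(restored) == recorded_digest

if not verify_backup("/mnt/nas/backups/db-2024-01-01.bak",
                     "<digest recorded at backup time>"):
    print("ALERT: backup failed verification; do not rely on it for recovery")
```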
Top tips for securing data
The best path to more robust data security is a proactive data storage strategy that leverages proven data protection practices. For a start, the Zero Trust model is a highly effective framework that can transform an organisation's security profile through comprehensive measures for user, application and device authentication. Assigning least-privilege user rights, implementing role-based access control and using two-factor authentication are key elements in dramatically reducing the risks of unauthorised access and identity theft.
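As a toy illustration of the role-based access control element, the sketch below grants permissions per role and denies everything else by default, in the Zero Trust spirit; the roles and permission names are invented:

```python
# Toy role-based access control check in the spirit of Zero Trust:
# every request is denied unless the user's role explicitly grants
# the permission. Roles and permissions are illustrative only.
ROLE_PERMISSIONS = {
    "backup_admin": {"backup.create", "backup.restore", "backup.delete"},
    "operator":     {"backup.create", "backup.restore"},
    "auditor":      {"backup.read"},
}

def is_allowed(role: str, permission: str) -> bool:
    # Default deny: unknown roles and unlisted permissions are refused.
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("operator", "backup.restore")
assert not is_allowed("operator", "backup.delete")
assert not is_allowed("intern", "backup.read")
```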
It is also important to point out that data is particularly exposed to cyber threats during network transmission. As such, data encryption plays a crucial role in protecting data on its way to storage by transforming it into unreadable ciphertext.
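As a simple sketch of client-side encryption before transmission, the example below uses Fernet authenticated encryption from the widely used Python cryptography package; the file names are placeholders, and in practice keys would live in a key management system rather than beside the data:

```python
# Sketch: encrypt a backup client-side before sending it to storage,
# using Fernet (symmetric authenticated encryption) from the
# `cryptography` package. In production the key would be held in a
# key management system, never stored next to the backups.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # keep this secret and safe
cipher = Fernet(key)

with open("backup-2024-01-01.bak", "rb") as f:
    ciphertext = cipher.encrypt(f.read())  # unreadable without the key

with open("backup-2024-01-01.bak.enc", "wb") as f:
    f.write(ciphertext)  # this ciphertext is what travels to storage

# On retrieval, the same key recovers the original bytes.
plaintext = cipher.decrypt(ciphertext)
```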
While data loss incidents will not disappear any time soon, with proper education about current threats and a combination of comprehensive data protection tactics and procedures, organisations can place themselves in the best possible position to stay ahead of the game.
About the Author
Sergei Serdyuk has worked in the IT industry for over 15 years and has extensive experience in software project management, product management, virtualization, cloud, and data protection. As Vice President of Product Management at NAKIVO, Sergei is responsible for the company’s global product portfolio. On a mission to deliver the ultimate data protection solution, Sergei is passionate about building customer-centric products that save time, money, and effort for real humans. In addition to product management, Sergei is also involved in NAKIVO’s marketing and customer support operations. Outside of the office, Sergei enjoys science fiction, music, and traveling.