The Resurgence of Magnetic Tape

Magnetic tape has been used for data storage since the 1950s.

Though it has been around for decades and has long been overshadowed by technologies focused on high-performance needs, we continue to see emerging use cases where tape offers the right solution to some vexing storage challenges.

Surging Volumes of Data

The reasons are both operational and economic. Large data sets continue to grow, and companies and institutions face the ongoing challenge of managing rapid growth in unstructured data while keeping a firm hold on costs.

It’s a burgeoning challenge. The amount of data in the world will soon surpass 40 zettabytes, according to the World Economic Forum, and is expected to reach 175 zettabytes by 2025, according to International Data Corporation (IDC). Limiting the scope to corporate data only, meaning data under the management of information technology (IT) organizations, the number is lower. But that volume is doubling every two to three years, according to IDC, and is projected to reach 7.5 zettabytes by 2025.

As smart sensors proliferate, corporate use of artificial intelligence matures, and more organizations become truly data-centric, the amount of data under management will only continue to grow. 

Factors Driving the Resurgence of Magnetic Tape

Organizations are revising their storage strategies for several reasons, contributing to the resurgence of tape.

Value of Data

In every industry sector today, customer experience and innovation are central to success. In a survey conducted by Salesforce, more than 76% of consumers said they expect companies to understand their needs and expectations (two things a company must know to deliver personalized experiences), and 63% said they expect companies to provide new products and services more frequently than ever before.

More than ever, companies rely on data. Often, it’s a company’s data—and how well it is used—that separates one brand or institution from another.

Today, data has value like never before, and it’s often referred to as “the currency of the digital economy.” As such a vital asset, it is critical that organizations have workable, economically sound strategies in place for data backup, disaster recovery, and archiving.

Growth in Hyper-scale Environments

Hyper-scale environments are no longer as rare as they once were. A decade ago, the term hyper-scale was reserved for large service providers—companies like Amazon, Google, and Microsoft—that offered cloud computing services on an extremely large scale. To deliver the services they offered, these companies had to find a way to manage massive infrastructure environments to accommodate the demand for compute and storage resources. And they did. 

Large-scale service providers operate complex environments, including exabyte-level storage infrastructures. These providers have done extensive research and testing on storage technologies and concluded that the best way to deliver affordable storage at scale is by using object-based storage techniques and magnetic tape.

Few companies have exabyte-scale needs. But the increase in the use of raw data in complex workflows and research has led to an increase in the number of petabyte-level environments.

Consider the automotive industry and the pursuit of self-driving vehicles. Setting aside the news about successes and failures in this area for a moment, it’s worth noting that there are six levels of automation defined by SAE International—from no automation (Level 0) to full automation (Level 5). The goal is to reach full automation. When that will happen is unknown.

But investments in developing this capability continue. Some automakers are outfitting fleets of vehicles with sensors, generating large amounts of data as they design, test, and refine their work. In this environment, a single vehicle can generate 10 to 20 terabytes of data per day for analysis. It’s not uncommon for developers in this field to need hundreds of petabytes of storage to keep the workflow going.
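
A rough, back-of-the-envelope calculation shows how quickly such a fleet reaches that scale. The fleet size and test-day count below are illustrative assumptions, not figures from any specific program:

    # Back-of-the-envelope estimate of test-fleet storage needs.
    # Fleet size and operating days are illustrative assumptions.
    vehicles = 50                # assumed test vehicles in the fleet
    tb_per_vehicle_per_day = 15  # mid-range of the 10-20 TB/day figure
    test_days_per_year = 250     # assumed operating days per year

    total_tb = vehicles * tb_per_vehicle_per_day * test_days_per_year
    print(f"Raw test data per year: {total_tb:,} TB (~{total_tb / 1000:.0f} PB)")

Even this modest assumed fleet produces roughly 188 petabytes of raw data per year, before any derived or duplicated copies are counted.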

Life science is another industry that has felt the effects of growing volumes of data. Fifteen years ago, it cost $3 billion to sequence a human genome. In 2015, that cost had been reduced to $1,500. Today, the cost is lower still—below $1,000 for most commercial applications.

The reduction in cost has made it possible for nearly every research institution to purchase its own genomic sequencer. Petabytes of data can now be generated and used for research, helping scientists make new discoveries and achieve breakthroughs in the treatment of disease.

These “lower-tier” hyper-scale environments have similar characteristics to the environments of large service providers in that they need to keep large volumes of data for a long time. And many are employing similar methods to address the need. 

Ransomware Threats

Ransomware attacks have fluctuated in recent years. But the threat is still very real. According to a recent report from McAfee, ransomware attacks increased 118% in 2018, and over 2 billion stolen account credentials were found on underground sites used by criminals.

The threat is too serious to ignore. Data backup and security processes must address the risk, and companies are being forced to put safeguards in place. Best-practice data protection follows a 3-2-1-1 strategy: three copies of data, on two different types of media, with one copy kept offsite and one kept offline. The best protection against a ransomware attack is to store a copy of critical data offline on magnetic tape, where hackers and cybercriminals cannot corrupt it.
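
As a sketch of how the rule can be checked programmatically (the data model below is hypothetical, not part of any particular backup product), a backup plan can be validated against 3-2-1-1 like this:

    # Hypothetical sketch: validate a backup plan against the 3-2-1-1 rule.
    # The Copy record is illustrative; it does not model a real product.
    from dataclasses import dataclass

    @dataclass
    class Copy:
        media: str       # e.g. "disk", "tape", "cloud"
        offsite: bool    # stored at a different location?
        offline: bool    # air-gapped, e.g. a tape cartridge on a shelf?

    def meets_3_2_1_1(copies):
        return (
            len(copies) >= 3                          # three copies
            and len({c.media for c in copies}) >= 2   # two media types
            and any(c.offsite for c in copies)        # one copy offsite
            and any(c.offline for c in copies)        # one copy offline
        )

    plan = [
        Copy("disk", offsite=False, offline=False),   # primary copy
        Copy("cloud", offsite=True, offline=False),   # replicated copy
        Copy("tape", offsite=True, offline=True),     # vaulted cartridge
    ]
    print(meets_3_2_1_1(plan))  # True

The offline tape cartridge is what satisfies the final “1”: a copy no network intruder can reach.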

Advancements in Tape Technology

As mentioned earlier, magnetic tape has been around since the early days of computing. This long tenure has impacted tape’s image over the years. So has the emergence and aggressive marketing of new storage technologies. But tape technology has advanced, and substantial investment continues to be made in tape storage technologies. 

Linear Tape-Open (LTO) is a magnetic tape format developed by the LTO Consortium, a joint development organization consisting of members from IBM, HP, and Quantum, and first released in 2000. Since then, LTO has become the de facto standard for tape storage technology.

LTO-8 is the current generation on the market and provides significant advances over previous generations. An LTO-8 cartridge can store up to 12 TB of data natively and 30 TB compressed (at a 2.5:1 compression ratio). Data transfer rates are much faster as well: 360 MB per second natively and 750 MB per second for compressed data.
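
The compressed figure follows directly from the 2.5:1 ratio, and the transfer rates determine how long a full cartridge takes to write. A quick calculation (assuming perfectly sustained streaming, which real workloads rarely achieve):

    # LTO-8 capacity and fill-time arithmetic from the published specs.
    # Assumes uninterrupted streaming; real-world rates will be lower.
    native_tb, native_mb_s = 12, 360
    compressed_mb_s = 750
    ratio = 2.5  # assumes 2.5:1 compressible data

    compressed_tb = native_tb * ratio                          # 12 * 2.5 = 30 TB
    hours_native = native_tb * 1e6 / native_mb_s / 3600
    hours_compressed = compressed_tb * 1e6 / compressed_mb_s / 3600

    print(f"Compressed capacity: {compressed_tb:.0f} TB")
    print(f"Fill time (native):     {hours_native:.1f} h")      # ~9.3 h
    print(f"Fill time (compressed): {hours_compressed:.1f} h")  # ~11.1 h

In other words, a single cartridge absorbs roughly a full workday of continuous streaming before it is full.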

While LTO-8 is high-performing, investment in the technology continues. The LTO Consortium has a development roadmap that goes out four more generations to LTO-12, with the objective of reaching storage capacities over 150 TB per tape native and 480 TB per tape compressed.

Two additional capabilities have contributed to the resurgence of tape: object/file tape archives built on the Linear Tape File System (LTFS), and FLAPE (flash plus tape).

LTFS first became available in 2010 with the introduction of LTO-5 tapes. It enables an LTO tape to be indexed, much the way a USB drive is, giving users easy access to the tape’s contents. As a result, users can manage and share data on tape much as they would on disk.
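
Because a mounted LTFS tape presents itself as an ordinary filesystem, standard file tooling works against it. A minimal sketch (the mount point /mnt/ltfs and the source directory are assumptions; the mount itself is performed beforehand with the vendor’s LTFS utility):

    # Sketch: working with a mounted LTFS tape as a normal filesystem.
    # Assumes the tape is already mounted at /mnt/ltfs by an LTFS utility.
    import os
    import shutil

    TAPE = "/mnt/ltfs"  # assumed mount point

    # Copy an archive set onto the tape just like copying to disk.
    shutil.copytree("/data/project_archive",
                    os.path.join(TAPE, "project_archive"))

    # Browse the tape's index without any special tape commands.
    for root, dirs, files in os.walk(TAPE):
        for name in files:
            print(os.path.join(root, name))

No tape-specific API is needed; the file system abstraction is the point of LTFS.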

Looking ahead, we expect LTFS to endure as the primary format for media data interchange and long-term digital media archiving. Digital media workflows and the backup and archive storage management applications enabled by LTFS will continue to be enhanced to support new LTO generations and the latest LTFS Specification features. Solutions that integrate LTO/LTFS-based libraries with flash storage and AI show potential as low-latency, cost-effective, work-in-process active archive repositories.

FLAPE is a tiered-storage method that combines flash storage and tape. When data is written to flash, a copy is also written to tape. Frequently accessed data remains in flash for optimum performance; once data is no longer being used, it is purged from flash to make room for other data. This lets users optimize their flash storage while using a lower-cost form of media for archiving.
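
A minimal sketch of that write-and-evict flow (the in-memory tiers and idle-time eviction policy below are illustrative, not any specific product’s implementation):

    # Illustrative FLAPE flow: dual-write to flash and tape, then evict
    # cold data from flash. Tiers are modeled as dicts for clarity.
    import time

    flash, tape = {}, {}            # object name -> (data, last_access)

    def write(name, data):
        now = time.time()
        flash[name] = (data, now)   # hot copy for fast reads
        tape[name] = (data, now)    # archival copy written at ingest time

    def read(name):
        if name in flash:           # flash hit: the fast path
            data, _ = flash[name]
            flash[name] = (data, time.time())
            return data
        return tape[name][0]        # cold read falls back to tape

    def evict(max_idle_seconds):
        # Purge from flash anything idle past the threshold.
        # The tape copy remains, so no data is lost.
        cutoff = time.time() - max_idle_seconds
        for name in [n for n, (_, t) in flash.items() if t < cutoff]:
            del flash[name]

The design choice is that tape is written at ingest, not at eviction, so purging flash is always safe and instantaneous.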

Storage Costs

Every enterprise, regardless of industry, struggles with data management. Average data growth runs between 35% and 65% per year, compounding annually. The LTO Consortium’s LTO-Tape TCO Calculator shows users the hidden costs of storing data in the cloud and how to store smarter and save: https://www.lto.org/resources/tcotool/
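
The compounding is what makes those percentages bite: even at the low end of that range, capacity requirements roughly quadruple in five years. A quick projection (the 1 PB starting point is an arbitrary example):

    # Compound data growth: capacity after n years = start * (1 + rate) ** n.
    start_pb = 1.0  # illustrative starting point: 1 PB under management

    for rate in (0.35, 0.65):
        five_year = start_pb * (1 + rate) ** 5
        print(f"{rate:.0%} annual growth: {five_year:.1f} PB after 5 years")
    # 35% annual growth: 4.4 PB after 5 years
    # 65% annual growth: 12.2 PB after 5 years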

As data under management continues to grow, so will the challenge of managing it. Fortunately, with continued investment in research and new breakthroughs in media density, magnetic tape continues to find innovative uses and remains a vital component in a wide range of storage environments.


About the Author

Diana Salazar has over 19 years’ experience in the storage industry, working for companies ranging from multinationals to cloud start-ups. She is in charge of product marketing for the backup and disaster recovery portfolio at Quantum. Diana also represents Quantum with the LTO Consortium.
