
Data isn’t immortal. Why are we still backing up?


Chris “Gonzo” Gondek

It’s 2022: we have the cloud, we have software and infrastructure as a service, we have 5G networks and optical fiber to most premises. Data is everywhere, in all shapes, sizes, and formats, and it’s constantly whizzing around. Surely by now we shouldn’t have to worry about the safety and integrity of our data? It seems that we’re “outsourcing” all of our data activities to subscription services so that we don’t have to own or take responsibility for any infrastructure. That should include the responsibility for our data too, right?

I have asked these types of questions before: Who should care about backup? Why do we still back up? Will we have to back up forever? I have also written about the threats to data in the past, discussing the ways in which damage can be done to data, and I still arrive at the same conclusion:

As long as data with value or importance exists, we need to protect it.

Data isn’t immortal—yet

This is one of those constant laws, like the laws of physics, because data isn’t immortal.

But… can it be made immortal? I can dream, can’t I?

I’ve been thinking about this subject for a very long time, longer than my 25-year career in all things data and data protection and recovery. Backup was supposed to be one of those dying conversations: “No one cares about backup anymore.” I hear that every few years, until another, previously unheard of and unanticipated data loss scenario comes along.

Data loss hurts. For me, that realization came back in the late ’90s when I discovered an emerging technology known as MP3. I used one command-line utility to copy music data from CDs to my hard drive, and another command-line utility to convert that data into MP3 files. This was no small task on my 100MHz Pentium PC; completing a single album took about 24 hours.

I had a fair number of CDs to get through. Finally, my entire CD collection would be instantly accessible and playable, turned into a digital music library with this new file format. I even went to the computer market to spend my hard-earned “flipping burgers at McDonald’s” cash on a brand new 3GB hard drive, because I was running out of space for my collection. (By today’s standards, that felt like all the storage in iCloud. An exaggeration, I know, but that’s what it seemed like to me back then.) Then, in one fateful moment using the FDISK command, I instantly and irreversibly blew away the entire collection I had been slaving over for almost 7 months. I knew then what the “F” in FDISK meant: my data on that disk was well and truly F’d. (Physical data recovery technologies and services were not yet available to consumers.) That experience effectively launched my lifelong commitment to data protection and recovery assurance.

To say that I was devastated would be an understatement. Now I wasn’t going to lose my livelihood or my brand reputation, but multiply my situation by a few thousand and you can appreciate what some businesses have experienced in commercial data loss scenarios. Now multiply that again by the number of locations you keep data in, and the constantly growing list of cyberthreats to that data, and you can see that data loss scenarios are increasing in likelihood and volume.

Can data be made immortal? (Theoretically, anyway)

So what is theoretically needed to make data “immortal”?

From a data protection and recovery perspective, we would need a mechanism that tracks every single change ever made and can revert to any point in time without ever losing any created data. Every change would also need to be kept forever, because data from any moment may need to be retrieved at any moment. And because that data must remain accessible and recoverable, we need to throw immutability and constant integrity verification into the mix.
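To make those requirements a little more concrete, here’s a minimal sketch in Python of the behavior described above: an append-only store in which every version is kept forever, addressed by a content hash, and verified on every read. The class and method names are hypothetical and purely for illustration; this is a thought experiment, not a product design.

```python
import hashlib
import time


class ImmortalStore:
    """Append-only version history for one object: every write is kept, addressed by its content hash."""

    def __init__(self):
        self._versions = []  # (timestamp, sha256 hex digest, data); never pruned

    def write(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self._versions.append((time.time(), digest, data))
        return digest  # handle that identifies exactly this version, forever

    def restore(self, digest: str) -> bytes:
        for _, stored_digest, data in self._versions:
            if stored_digest == digest:
                if hashlib.sha256(data).hexdigest() != digest:
                    raise ValueError("integrity check failed")  # verification on every read
                return data
        raise KeyError("version not found")


store = ImmortalStore()
v1 = store.write(b"first draft")
v2 = store.write(b"second draft")
assert store.restore(v1) == b"first draft"   # nothing ever created is lost
assert store.restore(v2) == b"second draft"
```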

Suppose that this “immortal” data solution takes the form of a powerful “universal file system,” one that can service any and every workload in every environment. To achieve the theoretical requirements of immortality with immutability, and to make it commercially viable, this universal file system would need standardized and perfected efficiency factors. We have some of those factors today in data storage technologies such as compression, compaction, and deduplication. The current challenge is that these technologies are not standardized; each is proprietary to a specific provider. Imagine that we had a globally agreed-upon “deduplication standard,” for example, endorsed by a standards body such as the IETF with an associated RFC, as we have for DNS and HTTP. That would revolutionize data storage and all data traversing networks, including the internet. Your 256GB iPhone could suddenly become a multiterabyte device. Your mobile network link could go from 5G to a theoretical 7, 8, or 9G and beyond.
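To give a rough feel for what deduplication does under the hood, here’s a toy Python sketch (an illustration only, not any vendor’s implementation or a proposed standard): identical fixed-size chunks are stored once, and files are reassembled from a recipe of chunk hashes.

```python
import hashlib

CHUNK_SIZE = 4096  # hypothetical fixed chunk size; real systems often use variable-size chunking


def dedup_store(data: bytes, chunk_store: dict) -> list:
    """Split data into chunks, keep each unique chunk once, return a recipe of chunk hashes."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(digest, chunk)  # a given chunk is physically stored only once
        recipe.append(digest)
    return recipe


def rehydrate(recipe: list, chunk_store: dict) -> bytes:
    """Rebuild the original data from its recipe of chunk hashes."""
    return b"".join(chunk_store[digest] for digest in recipe)


chunks = {}
recipe_a = dedup_store(b"hello world " * 10_000, chunks)
recipe_b = dedup_store(b"hello world " * 10_000, chunks)  # a second identical copy adds no new chunks
assert rehydrate(recipe_a, chunks) == rehydrate(recipe_b, chunks)
```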

Now this is really starting to sound more like a mad dream than something realistically achievable. But I think that this theoretical, omnipresent, multiprotocol, universal file system, with immutability and security built in without compromising efficiencies, can be achieved one day. That’s especially true with the massive amount of infrastructure available today, let alone forecast for the future, to house this data. When that universal file system is achieved, we finally won’t have to worry about backup anymore, simply because the data is effectively immortal.

On the road to data immortality

So what do we do in the meantime, especially given that we’re focused on data protection and recovery and that March 31 is World Backup Day?

I believe that we here at NetApp are in a unique position to help and that we’re well on our way toward the data nirvana of eventual immortality.

We start with an omnipresent, multiprotocol, secure, efficient, and immutable-when-necessary storage capability that leverages the highly efficient “redirect-on-write” NetApp® Snapshot™ mechanism for massive scalability. We can’t do unlimited Snapshot copies yet, but we can keep thousands of them per volume.
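To give a sense of why pointer-based snapshots scale so well, here’s a simplified Python sketch of the general idea (a toy model, not ONTAP’s actual implementation): taking a snapshot copies only block pointers, and every write goes to a fresh block, so older versions stay readable without duplicating data.

```python
from typing import Optional


class Volume:
    """Toy volume: data blocks plus a pointer map from logical block numbers to physical blocks."""

    def __init__(self):
        self._blocks = {}      # physical blocks, keyed by an ever-increasing id; never overwritten
        self._next_id = 0
        self._active = {}      # active file system: logical block number -> physical block id
        self._snapshots = {}   # snapshot name -> frozen copy of the pointer map

    def write(self, lbn: int, data: bytes) -> None:
        self._blocks[self._next_id] = data   # every write lands in a fresh block
        self._active[lbn] = self._next_id    # only the active pointer is redirected
        self._next_id += 1

    def snapshot(self, name: str) -> None:
        self._snapshots[name] = dict(self._active)  # copies pointers only, no data

    def read(self, lbn: int, snapshot: Optional[str] = None) -> bytes:
        pointers = self._snapshots[snapshot] if snapshot else self._active
        return self._blocks[pointers[lbn]]


vol = Volume()
vol.write(0, b"version 1")
vol.snapshot("hourly.0")
vol.write(0, b"version 2")                         # does not disturb the snapshot
assert vol.read(0) == b"version 2"
assert vol.read(0, snapshot="hourly.0") == b"version 1"
```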

This feature-rich, futuristic data service capability was actually created 30 years ago. You see, all of this was front of mind when NetApp’s founders started on the journey to provide data “on tap” without compromise of any sort for the foreseeable future and beyond.

I am of course talking about NetApp® ONTAP® data management software, which is already ticking a lot of the boxes on the road to data immortality.

When you partner with NetApp and use the combination of all these capabilities (and much, much more under the surface), your workloads, wherever you want to run them, are in the safest possible hands. Get the next best thing to data immortality today!

Chris “Gonzo” Gondek

Data Driven Technology Evangelist, NetApp ANZ

Techie with Table Manners

My mission is to enable data champions everywhere. I have always been passionate about technology, with a career spanning more than two decades specializing in data and information management, storage, high availability, and disaster recovery solutions, including virtualization and cloud computing.

I have a long history with data solutions, having gained global experience in the United Kingdom and Australia, where I was involved in creating technology and business solutions for some of Europe’s and APAC’s largest and most complex IT environments.

An industry thought leader and passionate technology evangelist, I blog frequently about all things data and am active in the technology community, speaking at high-profile events such as Gartner Symposium, IDC events, AWS Summits, and Microsoft Ignite, to name a few. I translate business value from technology and demystify complex concepts into easy-to-consume and easy-to-procure solutions. A proven, highly skilled, and internationally experienced sales engineer and enterprise architect who has worked for leading technology vendors, I have built experience and skills across almost all enterprise platforms, operating systems, databases, and applications.


