The case for implementing an archive strategy is compelling: move the inactive 80% of data that hasn’t been accessed in the last year off of production storage and onto an archive store. An archive solution is less expensive, consumes less power, occupies less data center floor space and does not need to be protected by the backup process. The cost savings are substantial, with many deployments recovering their full investment in a matter of months. So what, then, is killing archive?
The only legitimate concern that could kill an archive project is how the solution will respond when a user or application needs data from the archive. In reality, the chances of a recall are so minuscule that you could quite literally delete the data instead of archiving it and not be impacted. Of course, no IT professional is going to delete 80% of their data, and organizations are increasingly asking IT to keep all data forever. Data retrieval is a legitimate concern, but what usually kills an archive project are the concerns that archive vendors themselves create in the delivery of their solutions. These solutions are either overly complex, proprietary, or both.
Complexity
The number one cause of death for an archive project is complexity. Some archive “solutions” are a hodgepodge of software and hardware from a variety of vendors, and their integration requires a team of archiving specialists. First, a software application has to be purchased to analyze the data and identify archive-worthy candidates. Second, another software application needs to be bought to manage writing to the archive storage target (disk, tape or cloud). While vendors argue over which target is best suited for archive, the truth is there is no “best”; what is best depends on the organization’s needs. The problem is that most “target managers” force the choice upon you. To some extent, though, the target choice does not matter; the target is just one small piece of a very complicated puzzle.
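To make the first step concrete, here is a minimal sketch, not any vendor’s product, of what the candidate-analysis piece boils down to: walk the filesystem and flag files whose last-access time is older than a year, mirroring the “80% not accessed in the last year” figure above. It assumes the filesystem actually records access times; volumes mounted with noatime or relatime will under-report activity, so treat the output as a starting point rather than a definitive candidate list.

#!/usr/bin/env python3
# Sketch: list files not accessed in the last year as archive candidates.
import os
import sys
import time

CUTOFF_SECONDS = 365 * 24 * 60 * 60  # "inactive" = no access in the last year

def find_archive_candidates(root):
    """Yield (path, size_bytes) for files whose last access is older than the cutoff."""
    now = time.time()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files we cannot stat (permissions, broken links)
            if now - st.st_atime > CUTOFF_SECONDS:
                yield path, st.st_size

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    total = 0
    for path, size in find_archive_candidates(root):
        total += size
        print(path)
    print(f"Total candidate data: {total / 1e9:.1f} GB", file=sys.stderr)

A dedicated analysis application layers policy, reporting and scheduling on top of this basic scan, which is exactly where the extra software purchases, and the complexity, come from.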