Once you've made the decision to back up to the cloud, you need to make sure you're doing it as efficiently as possible.
The amount of data organisations back up is exploding. According to ‘Big Data: A Revolution That Will Transform How We Live, Work, and Think’, a book published earlier this year, the amount of stored information in the world is estimated at around 1,200 exabytes this year, of which less than two per cent is non-digital.
It’s easy to see that all this data needs to be stored somewhere, but it also needs to be backed up, and backup technology and the cloud are a natural fit. With such a huge amount of data needing backup, how can organisations optimise backups in order to keep costs and times down?
Optimising your backups in the cloud means devising a new strategy for how that data is managed. While the cost of cloud storage continues to fall, simply buying more of it isn’t a cost-effective approach, and it won’t protect a firm against ballooning data storage requirements.
Check your bandwidth
Tony Thompson, WAN optimisation specialist at Silver Peak, says that organisations that invest in the cloud for data backup often overlook one big hurdle: the network.
“Cloud computing often involves, at the highest level, the delivery of hosted services over the internet, which presents bandwidth, distance and quality challenges,” he says. “This wide area network (WAN) is unable to scale with increasing volumes of data required for backups to the cloud.”
He adds that if organisations want to enjoy an optimal cloud service, they must recognise that stabilising the network needs to be at the top of the agenda. “Ignore this and any cloud investment will lose its value over time,” he says.
WAN optimisation software offers a solution: it can move data up to 20 times faster and reduce redundant data transmission by more than 50 per cent. “This gives customers the scalability needed to support any backup application and reduces cloud costs,” says Thompson.
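Vendor figures aside, the underlying idea, that backup traffic is full of redundancy which never needs to cross the WAN, is easy to demonstrate with ordinary compression. A minimal Python sketch using the standard zlib module (an illustration only, not any particular WAN optimisation product):

```python
import zlib

# A deliberately redundant "full backup" stream: 10,000 copies of one record.
payload = b"customer-record;" * 10_000

# Compress before the data crosses the WAN, as a backup client or optimiser might.
compressed = zlib.compress(payload, level=6)

print(len(payload))     # 160000 bytes raw
print(len(compressed))  # a few hundred bytes after compression
```

Real WAN optimisers go further, caching previously seen byte patterns at each end of the link so repeats are never retransmitted at all.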
Compress and Dedupe
De-duplication makes a significant improvement in backup, allowing data to be stored more efficiently and cost-effectively. Most full backups are highly redundant, and even daily backups tend to contain significant redundancy. “Backup applications deem a database or Exchange store an entirely new file each day,” says Kalyan Kumar, chief technology architect at HCL Technologies. “A de-duplication device will only store the changes between the two files. This results in efficiency gains even on a daily basis.”
De-duplication has other benefits too. Lower storage space requirements reduce cloud storage costs, and less data has to be sent across WAN links in a cloud environment. Such a system detects and removes redundant blocks of data, considerably decreasing the quantity of cloud storage required. True de-duplication goes to the sub-file level, identifying blocks in common between different versions of the same file.
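To make the sub-file idea concrete, here is a minimal sketch of block-level de-duplication. The fixed 4KB block size and in-memory dictionary are simplifying assumptions; real products typically use variable-size chunking and a persistent block store:

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for simplicity

def dedupe_store(data, store):
    """Store each unique block once; return the list of block hashes
    (the 'recipe') needed to rebuild this backup from the shared store."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # skip blocks already seen
        recipe.append(digest)
    return recipe

def restore(recipe, store):
    return b"".join(store[d] for d in recipe)

store = {}
monday = b"A" * 8192 + b"B" * 4096   # full backup: three 4KB blocks
tuesday = b"A" * 8192 + b"C" * 4096  # only the last block changed overnight

r1 = dedupe_store(monday, store)
r2 = dedupe_store(tuesday, store)

print(len(store))                   # 3 unique blocks stored for two 12KB backups
print(restore(r2, store) == tuesday)  # True
```

Two full backups totalling 24KB occupy only three unique blocks, which is exactly the “only store changes between the two files” behaviour Kumar describes.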
Claire Galbois-Alcaix, senior marketing manager EMEA at Mozy, says that using a provider that enables data reduction techniques such as single-instance storage and incremental backups also helps keep bandwidth to a minimum.
“Minimising the amount of data to send to the cloud by only backing up information once makes a substantial difference to the efficiency of your backup,” she says.
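The incremental approach can be sketched in a few lines: keep a manifest of content hashes from the last run and send only files whose hash has changed. This is an illustrative sketch, not any provider’s actual client:

```python
import hashlib
import os
import tempfile

def file_digest(path):
    """Hash a file's contents in chunks so large files don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def incremental_backup(root, manifest):
    """Return files under root whose content changed since the last run,
    updating the manifest (path -> content hash) in place."""
    changed = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = file_digest(path)
            if manifest.get(path) != digest:
                changed.append(path)
                manifest[path] = digest
    return changed

# Demo: the first run sends everything, the second run sends nothing.
root = tempfile.mkdtemp()
with open(os.path.join(root, "report.txt"), "w") as f:
    f.write("quarterly figures")
manifest = {}
print(len(incremental_backup(root, manifest)))  # 1
print(len(incremental_backup(root, manifest)))  # 0
```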
Only back up what’s necessary
Galbois-Alcaix says that only files that belong to the business should be backed up. “Too often, bandwidth and storage are wasted backing up holiday snaps and iTunes playlists. Setting rules to control who backs up what, and when, will make backups and restores faster and more affordable,” she says.
This is a point echoed by Jason Rieger, senior network architect at secure cloud hosting provider FireHost. He says that when using a cloud provider for off-site backups only, “there’s no need to transfer anything beyond the essential data.”
“This can be a simple yet effective way to improve performance as many companies’ backups are bloated with unnecessary content. Less data means faster transfer times and less storage,” he says.
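Those rules can be as simple as a pattern-based exclusion list applied before anything is sent. A hypothetical sketch using Python’s fnmatch module (the patterns and file names here are invented for illustration):

```python
import fnmatch

# Hypothetical policy: patterns for files that should never reach the cloud.
EXCLUDE_PATTERNS = ["*.mp3", "*.jpg", "*/iTunes/*", "*.tmp"]

def should_back_up(path):
    """Return True unless the path matches any exclusion pattern."""
    return not any(fnmatch.fnmatch(path, pat) for pat in EXCLUDE_PATTERNS)

files = [
    "docs/contract.pdf",
    "music/iTunes/playlist.m3u",
    "photos/holiday.jpg",
    "finance/q3-report.xlsx",
]
to_send = [f for f in files if should_back_up(f)]
print(to_send)  # ['docs/contract.pdf', 'finance/q3-report.xlsx']
```

Holiday snaps and media files are filtered out before transfer, so only business data consumes bandwidth and storage.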
Avoiding the pitfalls
There can be dangers to look out for when optimising cloud backups. According to Will Wetton, cloud services product manager at Colt Technology Services, there are sometimes hidden or complicated costs associated with deploying a backup service.
“A raft of metrics, like the amount of data you store, the amount you take and list-oriented requests, can incur additional charges that aren’t always obvious,” he says. “When a business is in the process of estimating budgets, these costs need to be included. Is client support and management part of the package, or is that an additional fee?”
He says it is also worth questioning the integrity of a provider’s network to ascertain whether they are using public network links and whether this data is constantly accessible.
Doug Hazelman, vice president of product strategy at Veeam, says that businesses should expect their cloud provider to adopt a modern approach to data protection.
“Essentially, a specialist provider should be able to take advantage of technologies such as virtualisation to provide the best service possible,” he says.
He says that organisations should beware of providers that are not offering this modern approach: for example, by still relying on backup techniques designed around physical environments that are ill-suited to the increasingly virtual cloud. “This can result in capabilities that are no better than those available in-house,” he says.
Galbois-Alcaix says that getting the best from your backups means taking an all-round view of what your organisation really needs.
“End users tend to make backup decisions that support their own short-term preferences,” she says. “Optimal performance will come from taking a holistic view of backup with the best interests of the business at heart.”
Before you can talk about controlling backups, you have to know where they are. It may sound simple, but it can be trickier than it appears.
“It’s certainly worth the effort of locating all back-up data though, and will ensure a business remains in compliance with any data sovereignty laws,” says Rieger. “Once this data is located, it also needs to be encrypted. Not doing so can leave a business vulnerable to hackers, leaks and regulatory fines should it be of a confidential or sensitive nature.”
While there are plenty of pitfalls that can stop an organisation from getting the most out of its cloud backups, doing the simple things right will keep them at bay. Increasing complexity is seldom the answer and, more often than not, works against the organisation.
This article originally appeared on www.cloudpro.co.uk