
Cloud Backup – Gigaom


While the COVID-19 pandemic abruptly accelerated the migration of more workloads to the public cloud, practically all enterprises were already using cloud computing in one way or another. There are many reasons for this. From a technical point of view, the flexibility and agility provided by the cloud cannot be matched by on-premises infrastructure, while the OpEx model allows organizations to adjust their budgets in near real time to match the actual needs of the business.

Most organizations that started with a simple lift-and-shift approach have already learned that this method is particularly inefficient, pushing them to rely more on services available natively in the public cloud. In fact, to take full advantage of the public cloud and its TCO, you must fully embrace the ecosystem.

For example, if you are using AWS and need database services, you can avoid all the complexity of building an entire stack on basic EC2 instances: purchasing database licenses, installing the database, configuring it for high availability, and tuning and maintaining it. Instead, you simply use Amazon RDS, which is not only easier to use and manage but also optimized for the platform, with a total cost of ownership much lower than a DIY solution. This is obviously the right approach, but there is one obstacle: data protection.
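As a rough illustration of the managed-service route, the sketch below provisions a small Multi-AZ RDS instance with automated backups through boto3. The identifier, engine, credentials, and sizing are placeholder assumptions for illustration, not values taken from the article.

    import boto3

    # Minimal sketch: provisioning a managed database with Amazon RDS via boto3.
    # All identifiers, credentials, and sizes below are illustrative placeholders.
    rds = boto3.client("rds", region_name="us-east-1")

    rds.create_db_instance(
        DBInstanceIdentifier="example-app-db",   # hypothetical name
        Engine="postgres",
        DBInstanceClass="db.t3.medium",
        AllocatedStorage=100,                    # GB
        MasterUsername="appadmin",
        MasterUserPassword="change-me-please",   # use a secrets manager in practice
        MultiAZ=True,                            # high availability handled by the service
        BackupRetentionPeriod=7,                 # automated backups kept for 7 days
    )

Licensing, installation, failover, and patching all disappear behind this single call, which is exactly the TCO argument made above.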

Protect applications and data in the cloud

Most backup solutions were designed before the cloud. They handle physical and virtual environments well, but they struggle with cloud services. There are many reasons for this, including fundamental product architecture. Typically there is a backup server, a media server, an agent installed on each machine, and a connector for virtualized environments. Applications such as databases are then handled through specific integrations or additional agents. This type of architecture, whether installed on premises or in the cloud, is particularly inefficient, and it becomes more expensive and less effective over time, eroding any early savings.

Traditional backup products usually rely on agents installed on cloud VMs (such as AWS EC2 instances) to make backup copies. Users find a very familiar operating environment, but:

  • Most backup servers still store backups on file volumes, relegating object storage to long-term retention only, which adds complexity and cost.
  • If the backup target is not within the chosen cloud (for example, on premises), users incur egress charges, adding unforeseen and unpredictable costs over the long run.
  • RTO and RPO figures are unattractive: backup windows are very long, and recovery times are even longer.

This method does have its advantages, including the ability to index and search backups, perform partial restores, and properly manage retention.

Some solutions take a different approach and build wrappers around the standard snapshot APIs available on the cloud platform. In most cases this amounts to a nicer user interface over operations that would otherwise be performed through the API or CLI. It works, but it does not scale: as time goes by, it becomes hard to find the right snapshot to restore and to decide what to keep and what to delete. Recovery times may be faster, but there are risks that affect cost, efficiency, and overall operations. In addition, snapshots are usually stored alongside the primary storage system, so they are not disaster-proof.
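To make the comparison concrete, here is a minimal sketch of what such a wrapper ultimately calls under the hood: a tagged EBS snapshot taken through boto3. The volume ID and tag values are hypothetical.

    import boto3

    # Minimal sketch: what a snapshot-API wrapper ultimately does for an EBS volume.
    # The volume ID and tags are hypothetical placeholders.
    ec2 = boto3.client("ec2", region_name="us-east-1")

    snapshot = ec2.create_snapshot(
        VolumeId="vol-0123456789abcdef0",
        Description="nightly snapshot taken by backup wrapper",
        TagSpecifications=[{
            "ResourceType": "snapshot",
            "Tags": [
                {"Key": "backup-policy", "Value": "nightly"},
                {"Key": "retention-days", "Value": "30"},
            ],
        }],
    )
    print("Created", snapshot["SnapshotId"])

Everything beyond this call, such as retention, cataloging, and choosing the right point to restore, is left to the operator, which is precisely where this approach stops scaling.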

The third option is to rely on solutions designed specifically for cloud operations. This approach usually offers the best of both worlds while minimizing cost and risk. The process is simple: the backup software takes a snapshot, then indexes it and stores it efficiently elsewhere. This lets users build a consistent data protection strategy and gain full visibility into what is protected and how. Users can search backups to quickly find the data to retrieve, organize their schedules, and even create the air gaps needed to protect applications from worst-case scenarios. Table 1 shows a comparison of the three options, and a short sketch of the cross-region copy step follows it.

Table 1. Evaluation of cloud data protection methods
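The following sketch illustrates the "store it elsewhere" step in the simplest possible terms: copying an existing EBS snapshot to a second region so that a disaster in the primary region does not take the backup with it. The snapshot ID and regions are placeholders, and a purpose-built product would layer indexing, cataloging, and access isolation on top of this.

    import boto3

    # Minimal sketch: copy an EBS snapshot to another region so the backup copy
    # does not share the fate of the primary region. IDs and regions are placeholders.
    SOURCE_REGION = "us-east-1"
    DEST_REGION = "us-west-2"
    SOURCE_SNAPSHOT_ID = "snap-0123456789abcdef0"

    # copy_snapshot is issued against the destination region.
    dest_ec2 = boto3.client("ec2", region_name=DEST_REGION)

    copy = dest_ec2.copy_snapshot(
        SourceRegion=SOURCE_REGION,
        SourceSnapshotId=SOURCE_SNAPSHOT_ID,
        Description="cross-region copy for disaster recovery",
    )
    print("Copy started:", copy["SnapshotId"])

A true air gap would also place the copy in a separate account with independent credentials; the cross-region copy shown here only addresses regional failure.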

An example can be found in Clumio’s recently released product, Discover. The product is more than just cloud-native backup: it combines the snapshot mechanisms of AWS services with Clumio’s own backup mechanism, bringing the two together to provide users with a seamless experience and the best overall TCO.

The solution stands out because it lets users manage AWS snapshots through the Clumio dashboard independently of the backup solution in use (including AWS Backup). This provides full visibility into protected compute and storage instances, while adding the option to use Clumio’s advanced backup features through Clumio Protect to enable indexing, search, file-level restore, and more. Clumio stores data in a separate location, creating the air gap needed to protect it in the event of a major system failure, disaster, or cyber attack. One of my favorite features in Clumio Discover is the analytics, especially cost analysis, which lets users simulate combinations of native AWS snapshot policies and Clumio advanced backups over time.
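As a back-of-the-envelope illustration of that kind of cost simulation (not Clumio’s actual model), the sketch below compares keeping every recovery point as a snapshot against keeping a short snapshot window plus cheaper long-term backup storage. All rates, retention values, and data sizes are caller-supplied placeholders, and the model deliberately ignores incremental-snapshot savings.

    # Illustrative cost simulation only; not Clumio's actual model.
    def cost_snapshots_only(data_gb, points_per_month, months_retained,
                            snapshot_rate_gb_month):
        # Every recovery point kept as a full snapshot for the whole retention period.
        return data_gb * points_per_month * months_retained * snapshot_rate_gb_month

    def cost_tiered(data_gb, points_per_month, months_retained,
                    snapshot_rate_gb_month, backup_rate_gb_month,
                    snapshot_window_months=1):
        # Recent points stay as snapshots; older points move to cheaper backup storage.
        recent = data_gb * points_per_month * snapshot_window_months * snapshot_rate_gb_month
        archived = data_gb * points_per_month * (months_retained - snapshot_window_months) * backup_rate_gb_month
        return recent + archived

    if __name__ == "__main__":
        # Placeholder inputs: 500 GB protected, 4 recovery points per month,
        # 12-month retention, and made-up per-GB-month rates purely for illustration.
        print(cost_snapshots_only(500, 4, 12, 0.05))
        print(cost_tiered(500, 4, 12, 0.05, 0.01))

Even this crude model shows why simulating policy mixes over time matters: retention length and storage tier dominate the bill long before backup frequency does.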

Closing the loop

Traditional data protection does not work in the cloud. From the cloud provider’s perspective, snapshots are sufficient to ensure continuity of operations; if you want to protect mission-critical data and applications properly, you must find solutions specifically designed to run efficiently in the cloud.

In this context, efficiency also means lower costs and operational scalability. Traditional backup solutions were not designed to cope with the pace of change in the cloud, and managing snapshots by hand is genuinely time-consuming for operators. Snapshots also create cost issues around orphaned snapshots, retention management, replication, and the recovery time of individual files, aspects of snapshot management that are often underestimated.
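To give one concrete example of the orphaned-snapshot problem mentioned above, here is a small sketch that lists snapshots whose source volume no longer exists, a hygiene check that is easy to forget when snapshots are managed by hand. It is a simplified illustration with a placeholder region, not part of any product described here.

    import boto3

    # Minimal sketch: find snapshots whose source EBS volume no longer exists
    # (a common form of "orphaned" snapshot). Region is a placeholder.
    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Collect the IDs of volumes that still exist.
    existing_volumes = set()
    for page in ec2.get_paginator("describe_volumes").paginate():
        existing_volumes.update(v["VolumeId"] for v in page["Volumes"])

    # Walk our own snapshots and flag those pointing at deleted volumes.
    for page in ec2.get_paginator("describe_snapshots").paginate(OwnerIds=["self"]):
        for snap in page["Snapshots"]:
            if snap.get("VolumeId") not in existing_volumes:
                print(f"Orphaned: {snap['SnapshotId']} (created {snap['StartTime']:%Y-%m-%d})")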

Maintaining control of your data in the cloud is fundamental, but using the right tools to do so matters even more, reducing costs while simplifying operations. Here, products such as Clumio Discover strike a convincing balance between availability, integration with the cloud platform, and cost control, the foundations of a sustainable modern cloud strategy.
