MORE THAN MIGRATION: DEVELOPING A HOLISTIC CLOUD STRATEGY by George Crump



CHAPTER 1 — CREATING A CLOUD DATA FABRIC

Most organizations are considering how to take advantage of the tremendous resources made available by the cloud. The first step for many of these organizations is to create a cloud migration strategy. As part of this strategy, the organization decides which solution is best suited to copy or move its data to a cloud provider. The problem is that cloud migration implies a one-way, one-time event, which leaves the full potential of the cloud untapped.

The reality is that most organizations want to leverage a hybrid-cloud architecture where some applications stay on-premises, others move to the cloud and still others move back and forth. Another critical factor is that cloud providers are developing unique areas of expertise. Organizations need to move data between clouds to take advantage of unique cloud capabilities. IT needs to develop a holistic cloud strategy that embraces the hybrid and multi-cloud models.

The challenge in creating a more comprehensive cloud strategy is the time it takes to design and implement it. Organizations need a solution that enables them to start small with a single use case, but which scales to cover a wide variety of use cases.

Most organizations chip around the edges of the cloud’s potential, using it for a variety of initial use cases. Each of these initiatives typically leverages a different product, and those products are seldom compatible with each other.

A holistic cloud strategy starts with a solid foundation. The organization should build that foundation on a data structure more powerful than a file system, instead using a data structure that acts as a fabric, bringing together on-premises storage with cloud storage, and even multiple cloud providers. The fabric’s seamless integration of multiple storage points enables the organization to start its cloud journey with small steps like backup and grow to fully native cloud applications.

REQUIREMENTS FOR A CLOUD DATA FABRIC

Requirement 1 – Broad Range of Protocol Support

The Cloud Data Fabric (CDF) needs to enable the organization to continue to use its existing storage protocols. Most organizations’ applications and services use file protocols like NFS, SMB and Apple File Protocol (AFP) and block protocols like iSCSI. The problem is that the cloud, by default, does not. A lack of compatible cloud-based protocols makes leveraging the cloud more difficult and more limited for the organization.

The CDF needs to provide the same protocols across both on-premises and cloud-based storage resources. Compatible protocols make the movement of data between on-premises storage and cloud storage seamless. The fabric requires no data conversion, so legacy applications and services run seamlessly. Using compatible protocols also better enables the movement of data back on-premises and creates a true hybrid IT model.
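Requirement 1 is easiest to see from the application’s side: code written against ordinary POSIX file APIs should not need to change when its target directory becomes a fabric-presented NFS or SMB share. A minimal sketch (the function and paths are illustrative, not any vendor’s mount layout):

```python
import tempfile
from pathlib import Path

def archive_report(base: Path, name: str, data: bytes) -> Path:
    """Write a report using ordinary POSIX file I/O.

    `base` may be a local directory or, unchanged, an NFS/SMB share
    presented by a cloud data fabric; the application code is identical.
    """
    base.mkdir(parents=True, exist_ok=True)
    target = base / name
    target.write_bytes(data)
    return target

# Local disk today; a fabric mount point tomorrow. Only the base path
# changes -- an illustrative sketch, not a specific vendor's layout.
with tempfile.TemporaryDirectory() as tmp:
    written = archive_report(Path(tmp), "q3.csv", b"region,revenue\n")
    assert written.read_bytes() == b"region,revenue\n"
```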


Requirement 2 – Accelerate Cloud Transfers

The seamless transfer of data between on-premises and the cloud provides great benefit to the organization. However, the time it takes to transfer data to the cloud or back on-premises may make the organization more resistant to the change. The transfer time may eliminate any potential performance gains from moving the workload.

Its fabric nature means that the CDF controls both ends of the cloud connection (on-premises and cloud). The CDF should optimize data transfers between the storage points instead of counting on basic IP transfers. Without breaking compatibility, the CDF provides a custom packet transfer method, and customers should expect better link efficiency and reduced packet loss. The resulting 5X or better transfer speed justifies data movement even for a small performance gain.

Requirement 3 – Optimize Cloud Data

Cloud storage provides the advantage of reduced upfront costs. It does not necessarily assure reduced long-term costs. The CDF needs to help the organization optimize its cloud storage spend. On-premises storage systems typically leverage data efficiency techniques like deduplication and compression, but cloud-based storage solutions seldom provide data reduction capabilities.

The CDF needs to provide deduplication and compression for the cloud storage resources it manages. Providing data efficiency enables CDF customers to enjoy the upfront cost savings of the cloud while also reducing long-term costs.
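As a rough sketch of why Requirement 3 matters, the toy block store below applies the two techniques named above, deduplication (store each unique block once) and compression, and shows that a duplicate copy of a data set adds no new stored bytes. It is illustrative only and assumes nothing about how an actual CDF implements data reduction:

```python
import hashlib
import zlib

BLOCK = 4096  # fixed-size blocks; real systems often use variable-size chunking

class DedupeStore:
    """Toy block store sketching deduplication plus compression."""

    def __init__(self):
        self.blocks = {}  # fingerprint -> compressed block

    def put(self, data: bytes) -> list:
        recipe = []
        for i in range(0, len(data), BLOCK):
            chunk = data[i:i + BLOCK]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in self.blocks:      # store each unique block only once
                self.blocks[fp] = zlib.compress(chunk)
            recipe.append(fp)
        return recipe                      # enough to reassemble the file

    def get(self, recipe: list) -> bytes:
        return b"".join(zlib.decompress(self.blocks[fp]) for fp in recipe)

    def stored_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())

store = DedupeStore()
payload = b"log line\n" * 10_000          # highly redundant data
r1 = store.put(payload)
before = store.stored_bytes()
store.put(payload)                        # a duplicate copy adds no new blocks
assert store.stored_bytes() == before
assert store.get(r1) == payload
assert before < len(payload)              # compression shrinks what is stored
```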


Requirement 4 – Leverage All of the Cloud

Every cloud provider has multiple storage tiers. The provider typically uses a performance tier for applications and active data, and a moderately performing tier for less frequently accessed or less performance-sensitive data sets. Providers now offer cold storage tiers for data retention. Each tier is less expensive than the tier above it. If the organization intelligently moves data between these tiers, it further drives down cloud storage costs.

Typical cloud storage products lack support for these various tiers. A CDF should support multiple cloud tiers by automatically moving data between tiers of storage based on data access parameters or administrator-established policies. The CDF could even span tiers between on-premises and cloud storage, moving inactive data from on-premises storage to cloud storage.

USING THE FOUNDATION

The CDF enables many use cases and makes the organization truly hybrid IT ready. With a cloud-based file fabric, the organization can easily copy data to the cloud and direct backup copies to the cloud. It is also easily able to implement interactive initiatives that use the cloud for more than just a digital dumping ground, like file services consolidation.

As we’ll detail in chapter 4, file services consolidation uses the cloud as the central storage repository for data typically stored on multiple NAS systems spread throughout the organization. The CDF creates a global SMB and NFS mount point, accessible by all the organization’s locations, eliminating the need for on-premises file servers and multi-site coordination of data.
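The tier-movement policy described under Requirement 4 can be sketched as a simple age-based rule. The tier names and day thresholds below are assumptions for illustration, not any provider’s actual tiers or pricing:

```python
from dataclasses import dataclass

# Illustrative tier ladder: (tier name, minimum days since last access).
TIERS = [
    ("performance", 0),    # active data
    ("standard", 30),      # not touched for 30+ days
    ("cold", 180),         # retention / archive
]

def choose_tier(days_since_access: int) -> str:
    """Pick the cheapest tier whose age threshold the data has crossed."""
    selected = TIERS[0][0]
    for name, min_age in TIERS:
        if days_since_access >= min_age:
            selected = name
    return selected

@dataclass
class FileRecord:
    path: str
    days_since_access: int
    tier: str = "performance"

def apply_policy(files):
    """Retier every file; a real fabric would also physically move the data."""
    moves = []
    for f in files:
        target = choose_tier(f.days_since_access)
        if target != f.tier:
            moves.append((f.path, f.tier, target))
            f.tier = target
    return moves

files = [FileRecord("/proj/model.bin", 2),
         FileRecord("/proj/old_logs.tgz", 45),
         FileRecord("/proj/2019_audit.zip", 400)]
apply_policy(files)
assert [f.tier for f in files] == ["performance", "standard", "cold"]
```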


CHAPTER 2 — USING THE CLOUD DATA FABRIC FOR DATA PROTECTION

Most organizations use data protection as their first step into the cloud. Unfortunately, that first step, like many first steps, often results in a stumble. If organizations don’t create a cloud data fabric first, they often use an extension to their backup application, which can’t fully leverage the cloud and starts a series of siloed approaches to cloud adoption.

THE PROBLEMS WITH BUILT-IN FUNCTIONALITY

Many on-premises data protection solutions simply add cloud extensions to their solutions. In most cases, on-premises backup solutions only use cloud storage to store an extra copy of backup data. A few on-premises data protection solutions use cloud storage to archive older backups, but very few take any advantage of cloud compute.

No matter the level of cloud support, all on-premises data protection solutions only provide cloud connectivity to their own application. Most organizations use three or four backup applications. This product specificity means the organization ends up with three or four different cloud data protection strategies.

Finally, most on-premises data protection solutions store data in a proprietary format, even in the cloud. Proprietary storage means that cloud compute resources can’t access the backup data set.


THE ADVANTAGES OF A CLOUD DATA FABRIC FOR DATA PROTECTION

A cloud data fabric presents cloud storage as an SMB or NFS share. Any on-premises data protection solution that writes data to SMB or NFS shares can send data to the cloud. Unlike the siloed, vendor-specific solutions, the cloud data fabric enables all applications to send data to the same storage area on the same cloud, greatly simplifying data protection management.

The cloud data fabric also automatically moves older data to less expensive forms of cloud storage, so as backups age, the cost to store those backups decreases. At this point, no on-premises data protection solutions support multiple cloud storage tiers.

The cloud data fabric also makes data available to cloud compute. Cloud compute enables the organization to leverage data for disaster recovery, test/dev or data analytics.

THE CLOUD-NATIVE ADVANTAGE OF THE CLOUD DATA FABRIC

Most on-premises data protection solutions provide no support for cloud-native applications. Those that do usually do so through a separate application. A cloud data fabric enables on-premises backup applications to better leverage the cloud, and it provides protection of cloud-native applications.

Data protection requirements are different in the cloud. Most cloud providers deliver excellent storage availability. Losing access to an application because of a natural disaster or storage system failure is unlikely. However, cloud-native applications still need point-in-time protection, so they can recover from application corruption. A cloud data fabric provides point-in-time protection by using integrated snapshots and cloning.

IT can trigger snapshots of its cloud-native applications either manually or on a schedule. It can use snapshots to recover an entire volume or a single file. Recoveries are easy to execute and happen almost instantly.

A snapshot, however, is totally dependent on the primary storage system. If the primary storage system fails, the snapshot is lost. The cloud data fabric protects against snapshot loss with clones. The fabric creates clones of volumes from the snapshots; those clones are independent of the original volume. IT can also instruct the cloud data fabric to replicate clones to another cloud region or even another cloud for more complete protection. The cloud data fabric can also archive clones to a cold tier of cloud storage for long-term data retention at reduced cost.

Snapshots and clones also bring value to on-premises backups. Snapshots and clones of data sent to the cloud via a backup application add another layer of protection.
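The snapshot/clone distinction above can be modeled in a few lines: a snapshot is a point-in-time record that lives (and dies) with its volume, while a clone is an independent copy. A toy model for illustration, not any fabric’s actual implementation:

```python
import copy

class Volume:
    """Toy volume modeling snapshots (tied to the volume) vs. clones."""

    def __init__(self):
        self.files = {}
        self.snapshots = {}

    def write(self, path, data):
        self.files[path] = data

    def snapshot(self, name):
        # Point-in-time view; it lives inside (and dies with) this volume.
        self.snapshots[name] = dict(self.files)

    def restore(self, name):
        # Near-instant point-in-time recovery from a snapshot.
        self.files = dict(self.snapshots[name])

    def clone(self, name):
        # Independent volume built from a snapshot; survives loss of the original.
        vol = Volume()
        vol.files = copy.deepcopy(self.snapshots[name])
        return vol

vol = Volume()
vol.write("db.cfg", "v1")
vol.snapshot("before-upgrade")
vol.write("db.cfg", "v2-corrupt")        # simulated application corruption
clone = vol.clone("before-upgrade")

vol.restore("before-upgrade")
assert vol.files["db.cfg"] == "v1"
assert clone.files["db.cfg"] == "v1"     # the clone is unaffected by the original
```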


CHAPTER 3 — USING THE CLOUD DATA FABRIC FOR MIGRATING APPLICATIONS TO THE CLOUD

A key component of most organizations’ cloud strategies is migrating applications to a cloud provider to reduce dependence on on-premises infrastructure. The organization may want to host the application entirely in the cloud, use the cloud as a failover point for the application, or use the cloud to bring additional compute resources to the application when demand exceeds data center capacity, a situation known as cloud bursting.

THE CLOUD MIGRATION PROBLEM

While many organizations will see some level of success using the cloud for data protection, most organizations fail completely at the application migration part of their cloud strategy. Those that do make it through application migration find it a much more time-consuming process than originally planned.

The primary difficulty is how to lift and shift applications that are already running in the data center. Most businesses see rewriting their existing data center applications to be cloud native as their only option. Rewriting the application is time-consuming and very expensive. It also means that the application must be completely requalified for stability and functionality.

Another challenge with rewriting applications is that it makes creating a hybrid strategy more difficult. A hybrid approach is necessary to use the cloud for disaster recovery and for cloud bursting. With a rewrite strategy, the organization needs to maintain a legacy on-premises version of the application and a cloud-native version of the application. The organization could also create a cloud-like infrastructure in its data center, but that is even more costly, requiring investment in compute, cloud management software and cloud-like storage.

MIGRATION IS JUST THE BEGINNING

Migrating applications to the cloud is just the beginning of an organization’s challenges. Next, the organization needs to manage how applications consume cloud capacity, and it needs to make sure the cloud consistently delivers the right level of performance and cloud storage. Instead of managing data, most organizations put all their application data on the highest performing (and most expensive) tier of cloud storage available. The reality, just as it is in the data center, is that this strategy is costly and inefficient compared to moving aging data to appropriate, less expensive cloud storage tiers when access to that data subsides.


FOUR SIMPLE STEPS TO LIFT AND SHIFT WITH A CLOUD DATA FABRIC

In the same way that IT can use the Cloud Data Fabric (CDF) to ease and enhance data protection with the cloud, it can also use the CDF to improve an organization’s success rate and flexibility when migrating applications to the cloud. The CDF creates a POSIX-compliant file system in the cloud that can provide both file and block storage access (via storage protocols like NFS, SMB/CIFS and iSCSI). With the CDF as a foundation, IT can move on to the next step of migrating the applications, unchanged, to the cloud.

In the migration step, whether rewriting the applications or leveraging the CDF, IT has to deal with the reality that bandwidth is not limitless. Unlike the rewrite method, which provides no assistance, the right CDF solution can assist with intelligent high-speed bulk transfers. Since a CDF is in place both on-premises and in the cloud, it owns the transfer method. It can optimize transfers over the WAN and move data at a much faster rate than traditional TCP/IP transfers.

After migration, the CDF also supplies continuous sync of the data between the data source and the data in the cloud, which is vital for the disaster recovery and cloud bursting use cases. The feature ensures rapid application start, since the latest copy of data is already in place.

Once the application is in the cloud, IT needs to ensure acceptable performance for its users. At the same time, IT needs to make sure it consumes cloud resources wisely and cost-effectively. It is critical that the CDF can support automatic storage tiering of data between the different cloud storage types. The CDF needs to ensure that the most active data is on the best performing storage tier while also making sure that less active or older data is accessible but on cost-effective, high-capacity storage tiers.
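The continuous synchronization described above boils down to repeatedly copying only what changed. A minimal one-way sketch using content hashes (a real fabric’s replication protocol would differ):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def sync(source: dict, target: dict) -> list:
    """One pass of a (simplified) continuous sync loop.

    Copies only files that are new or changed, and removes deleted ones,
    so the remote copy stays current between passes. A sketch of the idea,
    not any vendor's replication protocol.
    """
    changed = []
    for path, data in source.items():
        if path not in target or fingerprint(target[path]) != fingerprint(data):
            target[path] = data
            changed.append(("copy", path))
    for path in list(target):
        if path not in source:
            del target[path]
            changed.append(("delete", path))
    return changed

on_prem = {"app.db": b"rows-v1", "static/logo.png": b"\x89PNG"}
cloud = {}
sync(on_prem, cloud)                    # initial seeding copies everything
on_prem["app.db"] = b"rows-v2"          # only the delta moves on the next pass
assert sync(on_prem, cloud) == [("copy", "app.db")]
assert cloud == on_prem
```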

The final step is to secure the data, both on-premises and in the cloud. Most organizations use either Active Directory or LDAP for authentication; the public cloud, by itself, does not. The lack of common authentication controls, and of management of those controls, means that IT administrators need to learn a new authentication method and two ways to manage access to data. The CDF uses Active Directory or LDAP natively, meaning that the move to the cloud requires no special cloud authentication; it just uses the same model that IT is already using in the data center.

Another aspect of control and security is making sure that data is unreadable if there is a breach (like the past AWS data breaches). With a cloud rewrite, the application needs to either implement its own data encryption algorithm and key management or add a third-party solution and manage that. A CDF, however, includes built-in encryption for data both at rest and in flight.

CHAPTER 4 — A STEP-BY-STEP PATH TO FULL CLOUD UTILIZATION

The cloud data fabric (CDF) allows organizations without a cloud pedigree to take full advantage of cloud storage and compute resources. The CDF also enables the organization to move to the cloud at its own pace without ending up with dozens of siloed solutions.

THE USE CASES FOR A CLOUD DATA FABRIC

Step 1 – Data Protection

The first step for many organizations is using the cloud for data protection storage. Since the CDF presents both on-premises S3 object storage and cloud storage as a POSIX-compliant file system, backup software that writes backup data to an NFS or SMB share can leverage the CDF to migrate backup copies to the cloud.
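Presenting object storage as a POSIX-compliant file system amounts to layering a directory view over flat object keys. A minimal sketch of that mapping, assuming “/” as the separator (not a real CDF’s metadata layout):

```python
def list_dir(keys, prefix=""):
    """Present flat object-store keys as one level of a POSIX-style directory.

    Object stores hold flat keys like 'backups/2020/mon.vbk'; a fabric layers
    a file-system view on top by grouping keys under a common prefix.
    """
    entries = set()
    for key in keys:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        head, sep, _ = rest.partition("/")
        # A trailing '/' marks a subdirectory; otherwise it's a plain file.
        entries.add(head + "/" if sep else head)
    return sorted(entries)

keys = ["backups/2020/mon.vbk", "backups/2020/tue.vib", "backups/restore.log"]
assert list_dir(keys) == ["backups/"]
assert list_dir(keys, "backups/") == ["2020/", "restore.log"]
assert list_dir(keys, "backups/2020/") == ["mon.vbk", "tue.vib"]
```

The same prefix/delimiter grouping is what object-store listing APIs themselves expose; a fabric adds the POSIX semantics (permissions, locking, rename) on top.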


Tape Migration

Some organizations may want to use this capability to archive old backup jobs to the cloud instead of to tape. The cloud archive has a few advantages over tape archives. First, it is easier to ensure data durability. While tape media is reliable, scanning those media for potential weak spots is time-consuming and labor-intensive. Cloud storage automatically verifies data integrity and automatically takes corrective action. Second, the access times of cloud storage archives are significantly faster than tape archives.

The CDF helps by presenting an NFS/SMB share to which the backup software can write data. Additionally, some CDF solutions provide performance enhancements both in data transfer time and in cloud storage performance. To mitigate cost, the CDF should support all available cloud tiers, especially the cloud provider’s “cold” tier.

Store Veeam Backup Files in the Cloud

Backup solutions like Veeam provide technology to deliver highly efficient block-level incremental backups. The problem is that the creation of new synthetic fulls from those backups is time-consuming, and traditional backup storage is too slow for the task. The CDF’s object storage acceleration technology improves performance by as much as 400%, making cloud-based synthetic fulls possible.
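The synthetic full operation mentioned above can be sketched as a merge of a full backup with its block-level incrementals; what makes it slow on traditional backup storage is the volume of reads this merge implies. A conceptual sketch, not Veeam’s actual on-disk format:

```python
def synthetic_full(full: dict, incrementals: list) -> dict:
    """Build a synthetic full from a full backup plus incremental backups.

    Each backup maps block number -> block data; an incremental holds only
    the blocks that changed. Applying increments oldest-first yields a new
    full backup without re-reading the source system.
    """
    merged = dict(full)               # start from the last full backup
    for inc in incrementals:          # oldest first
        merged.update(inc)            # changed blocks overwrite older ones
    return merged

full = {0: b"boot", 1: b"os-v1", 2: b"data-v1"}
monday = {2: b"data-v2"}              # only block 2 changed on Monday
tuesday = {1: b"os-v2", 3: b"data-new"}

merged = synthetic_full(full, [monday, tuesday])
assert merged == {0: b"boot", 1: b"os-v2", 2: b"data-v2", 3: b"data-new"}
```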

Step 2 – File Server Consolidation

Another early-stage use case for the CDF is file server consolidation. Today’s organizations have data distributed across dozens of locations: multiple data centers, remote office/branch office data centers and multiple cloud providers. The CDF enables the organization to consolidate all file servers to the cloud. The organization can create one NFS/SMB mount point and store all file data there. The acceleration of both file transfer times and object storage helps again by ensuring excellent performance without the hassle of syncing data.

For an organization that must have local on-premises storage, the CDF enables the organization to turn any object storage system into a fast SMB/NFS file server. The CDF can also archive from the on-premises object storage system to a cloud archive, limiting the growth of the on-premises investment. The result is that the organization has a single file server that consolidates all data for all offices and is location independent.


Step 3 – Application Migration

The next step is application migration. Without the CDF, application migration is the most challenging step. Organizations have to lift and shift their applications to the cloud. The process includes transferring the application and all its data, as well as modifying the application to work in the cloud. Since the CDF provides NFS/SMB and iSCSI block storage access, there are no changes required to the application. Again, thanks to the CDF’s transfer acceleration and object storage acceleration, the organization can migrate the application much faster, and the organization may also find that the application performs adequately on lower cost cloud object storage.

Step 4 – Empower a Hybrid Strategy

Most organizations don’t want to eliminate their data centers with the cloud, only augment them. The challenge is that using the traditional migration and transformation methods to get the application running in the cloud means that moving those same applications back on-premises becomes more difficult. Since the CDF requires no changes going to the cloud, it also requires no changes coming back from the cloud. The transfer back and forth is seamless.

With the CDF, the organization can move workloads to the cloud temporarily when it runs out of data center resources because of a spike in workloads. It can also use the data center as a DR site for cloud-hosted applications; if there is a cloud failure, IT can then move the application back on-premises. These capabilities require another essential capability of the CDF: continuous synchronization of data to and from the cloud. Most migration utilities are one-way, single-use products. The CDF is a continual-use, bidirectional solution.


ABOUT US

SoftNAS®, Inc. has pioneered cloud data control and management with its SoftNAS Cloud® data platform. The company began six years ago as the global leader in software-defined cloud NAS and has matured into an enterprise software company. The SoftNAS Cloud data platform provides customers a unified, integrated way to aggregate, transform, accelerate, protect and store data, and to easily create hybrid cloud solutions that bridge islands of data across SaaS, legacy systems, remote offices, factories, IoT, analytics, AI and machine learning, web services, SQL, NoSQL and the cloud – any kind of data. SoftNAS Cloud works with the most popular public, private, hybrid and premises-based virtual cloud operating systems, including Amazon Web Services, Microsoft Azure and VMware vSphere.

George Crump is President and Founder of Storage Switzerland. With over 25 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS and SAN. Prior to founding Storage Switzerland, he was CTO at one of the nation’s largest storage integrators, where he was in charge of technology testing, integration and product selection.

Storage Switzerland is an analyst firm focused on the storage, virtualization and cloud marketplaces. Our goal is to educate IT professionals on the various technologies and techniques available to help their applications scale further, perform better and be better protected. The results of this research can be found in the articles, videos, webinars, product analyses and case studies on our website, storageswiss.com.