Hybrid Cloud Scenarios: Control the Data

If there’s anything that scares the daylights out of regulated, security-sensitive organizations, it’s losing control of sensitive data and getting hacked or falling out of compliance. According to a global survey of IT decision makers conducted by Dimensional Research, data security is the highest hurdle to jumping into the public cloud. IT concerns include not only the type of security and controls applied by the cloud provider, but also the possibility that sensitive data might end up in a cloud data center abroad, in a region with different regulations and bigger security risks than the home country.

Crafting the right hybrid architecture is mostly about deciding what to keep private, which depends on the relative importance of cost, performance, security and compliance, and the parts of the infrastructure they apply to. Our first hybrid cloud scenario blog, Own the Base, Rent the Spike, was about controlling costs and performance. This second set of scenarios, Control the Data, is about better managing security and compliance.

Good News

The good news is that you can retain complete control of your data and take advantage of public cloud scalability, flexibility and fast ramp-up at the same time. How? Put applications, servers and anything else you want in the public cloud, but store the data in a private cloud where you have complete control. This is of utmost importance for compliance purposes, where auditors need to pinpoint the location of data down to the exact data center, cage, rack and rack unit of the storage device.

Performance is the obvious concern when data and applications are in two different locations. A typical public Internet connection is too slow and unreliable to make this scenario work. A direct, high-performance, low-latency connection between the public cloud service and the private cloud is a must, and it offers superior security as well. In addition, locating your private infrastructure within a colocation facility close to public cloud and network providers reduces latency and lowers communication costs.

Aside from maintaining complete control over data security and compliance, keeping your data private lets you customize data storage for the performance needs of your particular workload and accommodate any software limitations that prohibit moving data into the public cloud. There are several ways to slice the data:

  • You can put all of your data onto private clouds (on- or off-premise in a multitenant data center) and most of your compute resources onto public clouds. This might be the simplest, most straightforward approach.

[Image: Protecting Sensitive Data, Replicating to/from the Enterprise]

  • The other alternative is to slice data based on its sensitivity. Put your compute resources and less sensitive data in the public cloud, but separate sensitive customer and proprietary data, such as medical records or financial information, for private storage. To maximize performance, place your stateful and chatty application tier in the private cloud close to your data, and put your stateless application tiers (such as the Web tier) in the public cloud where they can scale quickly. You can leverage high-performance, secure direct connections to the public cloud and remove many of the security barriers to cloud adoption. (A minimal routing sketch follows this list.)
  • Slice data based on performance characteristics. Deploy cloud storage in a tiered fashion across public and private clouds using solutions such as StorSimple from Microsoft. This technique lets you tier your storage in cost-effective ways with ‘pay as you grow’ pricing. For example, you can use tier 1 for local copies of rapidly accessed data, tier 2 for cloud “on-line” storage and tier 3 for cloud “deep” storage for backups. (A tier-selection sketch also follows below.)
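
Below is a minimal sketch of the sensitivity-based routing described in the first two bullets. The endpoint names and the field-based classifier are illustrative assumptions, not part of any specific product; a real deployment would apply its own classification rules and storage APIs.

```python
# Hypothetical sketch: route records to private or public storage based on
# sensitivity. Endpoint names and the classification rules are illustrative
# assumptions, not part of any specific product.

# Assumed endpoints: a private array reached over a direct interconnect, and
# a public object store reached inside the cloud provider's network.
PRIVATE_STORE = "https://storage.colo.example.internal"
PUBLIC_STORE = "https://objects.public-cloud.example.com"

# Fields treated as regulated/sensitive for this example.
SENSITIVE_FIELDS = {"ssn", "medical_record", "account_number"}

def is_sensitive(record: dict) -> bool:
    """Rough check: flag any record that carries a regulated field."""
    return bool(SENSITIVE_FIELDS & record.keys())

def storage_target(record: dict) -> str:
    """Sensitive data stays on the private side; everything else can go public."""
    return PRIVATE_STORE if is_sensitive(record) else PUBLIC_STORE

if __name__ == "__main__":
    print(storage_target({"ssn": "123-45-6789", "name": "A. Patient"}))   # private
    print(storage_target({"page_view": "/pricing", "ts": 1700000000}))    # public
```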

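Here is a similarly minimal sketch of tier selection for the third bullet, choosing a tier from how recently data was last accessed. The tier names and age thresholds are illustrative assumptions, not StorSimple’s actual policy engine.

```python
# Hypothetical sketch: pick a storage tier from how recently data was last
# accessed. Tier names and thresholds are illustrative assumptions, not the
# policy engine of StorSimple or any other product.
from datetime import datetime, timedelta
from typing import Optional

TIERS = [
    # (maximum age since last access, tier label)
    (timedelta(days=7), "tier1-local"),          # hot: local copies of rapidly accessed data
    (timedelta(days=90), "tier2-cloud-online"),  # warm: cloud "on-line" storage
    (timedelta.max, "tier3-cloud-deep"),         # cold: cloud "deep" storage for backups
]

def choose_tier(last_access: datetime, now: Optional[datetime] = None) -> str:
    """Return the first tier whose age threshold still covers this data."""
    age = (now or datetime.utcnow()) - last_access
    for max_age, tier in TIERS:
        if age <= max_age:
            return tier
    return TIERS[-1][1]

if __name__ == "__main__":
    now = datetime.utcnow()
    print(choose_tier(now - timedelta(days=2), now))    # tier1-local
    print(choose_tier(now - timedelta(days=30), now))   # tier2-cloud-online
    print(choose_tier(now - timedelta(days=400), now))  # tier3-cloud-deep
```
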
The private cloud can also act as a repository for production data, with the public cloud serving as a second source of data storage for backup and disaster recovery.

[Image: Tiered Cloud Storage Solutions for the Enterprise]

The Best of Three Worlds: Scalability, Security and High Availability

The private storage-public compute solution at Equinix is a perfect example of a highly secure, highly available and high-performance hybrid deployment. This model has been adopted by major storage providers, as can be seen in NetApp’s Private Storage for Microsoft Azure and AWS, as well as EMC’s private storage for Azure. The solution colocates private storage in the same Equinix IBX facility as the edge node of the compute cloud service provider, interconnecting them with a direct, high-speed, low-latency connection for fast performance. The cloud provider offers significant compute scalability, while private storage gives you maximum security and compliance. Putting the storage and compute as close together as possible yields high performance with low communications costs.

An even more compelling example is a cloud-to-cloud failover demonstration proven at one of our Equinix Solution Validation Centers in this NetApp video. The demonstration shows a high-speed failover via a cluster that spans Microsoft Azure and AWS, with storage in the NetApp Private Storage cloud. When service stops on the AWS servers, Microsoft Failover Cluster Manager fails the SQL database application over to Azure instantly. This is the best of not only two worlds, but three: public cloud scalability, multicloud cluster failover reliability and highly secure private data storage.

[Image: Enterprise Replication and Disaster Recovery from the Cloud]

If security and compliance are your priorities, these Control the Data scenarios could be the ideal way for your organization to move its valuable data assets into a hybrid cloud environment.
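
To make the backup and disaster recovery replication described above more concrete, here is a minimal sketch that copies snapshots from a private array to a public object store. It assumes boto3, a hypothetical bucket name and a hypothetical snapshot mount point; it is an illustration, not the NetApp or EMC implementation.

```python
# Hypothetical sketch: push snapshots from the private storage array to a
# public cloud object store as a second copy for backup and disaster recovery.
# The bucket name, snapshot path and naming scheme are illustrative assumptions.
import pathlib
import boto3  # assumes an AWS-style object store; credentials come from the environment

BUCKET = "example-dr-backups"                                  # hypothetical bucket
SNAPSHOT_DIR = pathlib.Path("/mnt/private-array/snapshots")    # hypothetical mount point

def replicate_snapshots() -> None:
    """Copy every local snapshot file into the public object store."""
    s3 = boto3.client("s3")  # traffic would ideally ride the direct interconnect
    for snap in sorted(SNAPSHOT_DIR.glob("*.snap")):
        key = f"snapshots/{snap.name}"   # key by name so restores can pick a point in time
        s3.upload_file(str(snap), BUCKET, key)
        print(f"replicated {snap.name} -> s3://{BUCKET}/{key}")

if __name__ == "__main__":
    replicate_snapshots()
```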

Contact our GSA Team to learn more.

Follow our Hybrid Cloud Scenarios Series