Over the years, the use of public cloud storage by enterprises has been growing steadily, but many IT departments are still in the early stages of the journey.

IT managers are well aware of the potential benefits of public cloud storage, including lower total cost of ownership, agility and unlimited on-demand capacity, and many face cloud directives from above. But enterprises that have invested heavily in on-premises infrastructure are still working out where public cloud storage delivers the most value beyond backup, archiving and disaster recovery (DR).

Ken Lamb, who oversees cloud elasticity at JPMorgan Chase, believes the cloud is a good fit for the firm, especially when the financial services company needs to bring applications to market quickly. Lamb said JPMorgan uses public cloud storage from multiple providers for development and testing, production applications and disaster recovery, and runs workloads internally in “production parallel mode.”

At JPMorgan Chase, cloud storage accounts for a relatively small share of overall storage capacity, but Lamb said the company is planning a large-scale migration to Amazon Web Services (AWS). “The biggest issue is how applications interact. When you put something in the cloud, you have to ask: Does it need to connect back to anything on premises? Is it chatty with other applications? Is it tightly coupled? Is it latency sensitive? Does it have compliance requirements? Those are the key determining factors,” Lamb said.

Scott Sinclair, senior analyst at Enterprise Strategy Group (ESG), said ESG research shows the number of enterprises running production applications in the public cloud has grown, whereas a few years ago most enterprises used the public cloud only for backup or archiving. Sinclair said ESG also sees more enterprises describing themselves as “cloud first” in their overall IT strategy, although many “have just begun their journey.”

Sinclair said: “If you are an established company with decades of history, you have your own data centers, and you may already have a petabyte-scale environment. Even if you didn’t have to worry about the pain of moving the data, you can’t move petabytes to the cloud overnight. Unless there is a pressing need, that data stays put. The same is true of analytics.”

Stephen Whitlock, who is responsible for cloud operations in compute and storage at The Hartford, an insurance and financial services company, said it has only a small amount of data in the public cloud. However, the company plans to move the terabytes, if not petabytes, of data in its Hadoop analytics environment to Amazon’s Simple Storage Service (S3).
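
Bulk transfers of that sort are usually handled with dedicated tools such as S3DistCp or the AWS CLI, but a minimal sketch of the underlying idea, using the boto3 SDK with placeholder bucket and file names rather than anything from The Hartford’s actual environment, might look like this:

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Multipart settings help when pushing large analytics files over the WAN.
config = TransferConfig(multipart_threshold=64 * 1024 * 1024, max_concurrency=8)

s3 = boto3.client("s3")

# Hypothetical local export from an on-premises Hadoop cluster.
s3.upload_file(
    Filename="/exports/claims/part-00000.parquet",
    Bucket="example-analytics-bucket",
    Key="claims/2018/part-00000.parquet",
    Config=config,
)
```

Whether a transfer like this runs over the network or through an offline service is ultimately a question of the bandwidth and data transfer costs discussed below.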

Whitlock said one of the challenges The Hartford faces is mapping permissions on its big data sets from its internally deployed Hortonworks Hadoop distribution to Amazon Elastic MapReduce (EMR). The company plans to migrate its compute instances to the cloud, while the data, which today resides in-house on the Hadoop Distributed File System (HDFS), will move to the EMR File System (EMRFS), Amazon’s implementation of HDFS.
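
The practical effect of moving from HDFS to EMRFS is that jobs address data through s3:// URIs instead of hdfs:// paths, while the analytics code itself changes little; access control shifts from HDFS permissions to S3 bucket policies and IAM roles, which is where the permission-mapping challenge Whitlock describes comes in. A hypothetical PySpark sketch, with invented bucket, path and column names, shows how small the code-level change is:

```python
from pyspark.sql import SparkSession

# Illustrative paths only. On EMR, the s3:// scheme is served by EMRFS,
# so a job that used to read from on-premises HDFS can point at S3 instead.
HDFS_PATH = "hdfs:///data/claims/2018/"
S3_PATH = "s3://example-analytics-bucket/claims/2018/"

spark = SparkSession.builder.appName("claims-summary").getOrCreate()

# The DataFrame logic is unchanged; only the storage URI differs.
claims = spark.read.parquet(S3_PATH)
claims.groupBy("region").count().show()
```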

Whitlock said The Hartford’s top priority is to complete the Hadoop project before pursuing other public cloud storage use cases, including “cutting edge” and “edge” workloads. He knows that network connectivity, bandwidth and data transfer costs can add up, so the team plans to focus on the applications that gain the most from the cloud. The Hartford’s on-premises private cloud generally suits smaller applications, while the public cloud makes more sense for data-driven workloads, such as analytics engines that “we can’t keep up with.”

“We never thought about moving everything to the cloud. We analyze the metrics. The cloud isn’t cheaper; it’s like a convenience store. You go there when you’re missing something and you don’t want to drive 10 miles to a big supermarket,” Whitlock said.

Capital District Physicians’ Health Plan (CDPHP), a non-profit organization based in Albany, New York, has learned from experience that the cloud may not suit every application. CDPHP launched its cloud initiative in 2014 with AWS for disaster recovery and soon adopted a cloud-first strategy. However, Howard Fingeroth, CDPHP’s director of infrastructure and data engineering, said the organization plans to move two or three administrative and finance applications back to its local data center for cost reasons.

Fingeroth said: “We initially did a lot of straight lift-and-shift migrations, which in some cases did not prove to be a wise choice. We have since revised our cloud strategy into what we call ‘smart cloud,’ which involves a good deal of analysis to determine when it makes more sense to move something to the cloud.”

Fingeroth said the cloud helps improve agility, affordability, flexibility and recoverability. CDPHP mainly uses Amazon’s Elastic Block Store (EBS) for the production applications it runs in the cloud, and the cheaper S3 object storage, combined with a commercial backup product, for backup and DR, he said.
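
The split Fingeroth describes, block storage behind running applications and cheaper object storage for protection copies, can be illustrated with a brief AWS SDK sketch; the volume ID, bucket and file names below are placeholders, not CDPHP’s actual setup:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

# Production data sits on an EBS volume; a point-in-time snapshot
# covers recoverability for the application tier.
ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",  # placeholder volume ID
    Description="Nightly snapshot of production application volume",
)

# Backup artifacts go to cheaper S3 storage, here with an
# infrequent-access storage class to hold down costs.
s3.upload_file(
    Filename="/backups/app-db-full.bak",  # placeholder backup file
    Bucket="example-backup-bucket",
    Key="app-db/app-db-full.bak",
    ExtraArgs={"StorageClass": "STANDARD_IA"},
)
```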

John Webster, senior partner and analyst at Evaluator Group, said: “Over time, people become more sophisticated in their use of the cloud. They start with disaster recovery or simple use cases, and once they understand how it works, they start to expand further.” Webster said Evaluator Group’s 2018 hybrid cloud storage survey showed disaster recovery was the top use case, followed by data sharing/content repository, test and development, archive storage and data protection. About a quarter of respondents use the public cloud for analytics and tier 1 applications, he said.
