A recurring theme is emerging with AWS use: The more customers use the service, the more they want to use it in new and different ways.
This consumption cycle around deeper use of cloud services is echoed by the Financial Industry Regulatory Authority (FINRA) and The New York Public Library. Both organizations turned to the cloud to cut costs and manage spiky infrastructure demands, but found themselves reshaping their entire IT operations to take advantage of the growing number of services on AWS that go beyond basic infrastructure.
FINRA is a non-governmental organization that regulates brokerage firms and exchange markets and handles 75 billion transactions a day. It stores petabytes of data and millions of objects in AWS, and in July, it completed a two-and-a-half-year transition to the platform.
FINRA had a sizable investment in on-premises data warehousing and initially viewed AWS as a place to store additional data using Amazon Simple Storage Service (S3). At first, it relied mostly on lift-and-shift migrations, before completely re-platforming one of its major applications to Elastic MapReduce (EMR), said John Hitchingham, director of data and analytic services at FINRA. The switch resulted in improved resiliency and failover, and cut the cost of operating the application in half.
In a traditional data center, the hardware refresh cycle is typically between three and five years, while application refreshes can take five to 10 years, Hitchingham said. But the technology refresh can happen much faster on AWS because of an evolving catalog of features and the fact that apps are untethered from specific hardware.
"We evolved with Amazon's product stack as we went," Hitchingham said. "The cases where we were in more of a lift-and-shift for some of our systems to begin with, we started to take advantage of the Amazon services as they matured."
Jay Haque, director of DevOps and enterprise computing at The New York Public Library, joined Hitchingham in a breakout session at the AWS Summit in New York last week. The two men discussed their organizations' journeys to AWS and the lessons they learned during the transition.
Cloud computing continues to make up a small fraction of overall IT spending -- estimated by industry analysts at 5% to 10% of budgets -- but that share is rapidly expanding. Organizations such as The New York Public Library and FINRA represent a growing number of AWS users that have moved beyond low-hanging fruit such as backup and disaster recovery, and are looking for more ways to use the cloud to move off legacy systems.
Thus, cloud providers are viewed as an integral part of their changing IT strategy, not just a means to cut costs. They often integrate baseline features with new higher-level services to gain even greater efficiencies. Vendor lock-in -- an early bugaboo about public cloud -- is downplayed by those that go all-in on AWS as the perceived benefits of these new models outweigh the risk.
The New York Public Library, which has 20 million items in its collection, started with a small proof-of-concept project with a website on AWS. There was little appetite to take on another major IT project because the library was already in the process of adopting Workday and ServiceNow, but the cost and time savings were considerable enough that the organization soon shifted its mindset to integrate AWS into everything it did.
An unintended result is that the library can foster and encourage experimentation, and say "no" much less often when requests for new projects come in, Haque said. "We move so fast everyone wants it faster, so we have to figure out how to do that."
Now the organization is looking at things it didn't consider possible three years ago as it adopts DevOps and looks to move to NoOps, providing a self-service production environment for its developers that the operations team is OK with running. That involves using services such as Elastic Transcoder, Glacier and Lambda as it tries to figure out how to safely move hundreds of petabytes to a digital medium that would provide improved access for the public. It's also taking advantage of open-source tools and APIs so other organizations can help build capabilities alongside them.
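The article doesn't show the library's actual pipeline, but an S3-triggered Lambda handler of the kind such a self-service transcoding workflow implies might look like the following sketch. The bucket names, pipeline ID and preset ID are hypothetical placeholders, not values from the library's setup, and the final API call is left as a comment rather than executed:

```python
import json
import urllib.parse

# Hypothetical Elastic Transcoder pipeline and preset IDs -- placeholders
# for illustration only, not the library's real configuration.
PIPELINE_ID = "1111111111111-abcde1"
WEB_PRESET_ID = "1351620000001-100070"

def build_transcode_job(bucket, key):
    """Build a job spec for a newly uploaded media object.

    In a real deployment this dict would be passed to
    boto3.client("elastictranscoder").create_job(...)."""
    base_name = key.rsplit("/", 1)[-1].rsplit(".", 1)[0]
    return {
        "PipelineId": PIPELINE_ID,
        "Input": {"Key": key},
        "Output": {
            "Key": f"transcoded/{base_name}.mp4",
            "PresetId": WEB_PRESET_ID,
        },
    }

def handler(event, context):
    """Lambda entry point for S3 ObjectCreated events: one job per record."""
    jobs = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys arrive URL-encoded, so decode before use.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        jobs.append(build_transcode_job(bucket, key))
    return {"statusCode": 200, "body": json.dumps(jobs)}
```

Because the handler only builds job specs, it can be exercised locally with a fake S3 event before wiring it to a real bucket notification.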
Going all-in isn't always realistic in a multicloud world
Of course, going fully all-in on a single cloud platform isn't really possible for most companies, despite the efforts of AWS and others to constantly extend their capabilities into new areas. Even with their full-throated endorsement of AWS, The New York Public Library has applications it either can't or won't move to the cloud, while FINRA has a large database workload still on-premises -- though it aims to either move it on top of AWS or transfer the data to an AWS-native database service.
Enterprises rely on a wide range of providers, whether it's using Microsoft or Google for email and documents, or Oracle and SAP for business applications, said Holger Mueller, principal analyst and vice president at Constellation Research in Cupertino, Calif. There are also different cloud platforms that are better suited for certain workloads, so while it remains beneficial to put as many eggs in one basket as possible, the future will be multicloud, he added.
"The reality is from execution perspective nobody offers everything, if you think of everything an enterprise needs to run," Mueller said.
And while those who go all-in on AWS are less concerned with lock-in, there are other companies that remain hesitant about permanently hitching their strategic applications to a single cloud infrastructure. Some of the largest corporations in the world have backups in AWS, Microsoft Azure and Google Cloud Platform, while platform as a service offerings, such as Cloud Foundry, are finding success with enterprises as infrastructure-neutral software.
Tips on moving to the cloud
The first step in getting the most out of a cloud transition is building in as much automation as possible, Haque said. However, as most organizations lack the internal skill set to go all-in on public cloud, and because AWS comes out with hundreds of new features each year, he highly recommends staff participate in training programs and digest all vendor documentation.
"It's important that you're providing enough air cover so you can actually rebuild and retool," Haque said. "There's a learning curve -- it's not huge, but it takes time."
The New York library system is still figuring out how to budget for the cloud, in part because the scale is different and costs can be higher than expected when there's a surge in use, Haque said. The key to selling it internally is to promote an entrepreneurial spirit among developers, and encourage them to go out and do their own homework on a cost model that can easily be wrapped into the business case.
FINRA built a data registry to know where all the data was, which helped considerably in the transition, Hitchingham said. It also found considerable savings by moving away from a fixed set of nodes to run Hadoop, and instead integrating EMR and other open-source tools with S3 to query the data natively. The organization still sometimes loads data into Redshift for performance reasons, but hopes to see better integration with S3 as the product matures to avoid having to make those transitions.
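The article doesn't detail FINRA's registry, but the core idea -- a catalog mapping each dataset to its storage location so a migration can be tracked -- can be sketched in a few lines. The dataset names, paths and S3 URIs below are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """One catalog record: where a dataset lives and in what format."""
    name: str
    location: str  # e.g. an S3 URI or an on-premises path (invented examples)
    format: str    # e.g. "parquet", "orc", "csv"

@dataclass
class DataRegistry:
    """Minimal registry: register datasets, look them up, and report
    which ones still live on-premises versus in S3."""
    entries: dict = field(default_factory=dict)

    def register(self, entry: DatasetEntry):
        self.entries[entry.name] = entry

    def lookup(self, name: str) -> DatasetEntry:
        return self.entries[name]

    def still_on_premises(self):
        # Anything whose location isn't an S3 URI is not yet migrated.
        return [e.name for e in self.entries.values()
                if not e.location.startswith("s3://")]

# Example usage with invented dataset names and locations.
registry = DataRegistry()
registry.register(
    DatasetEntry("trades_2016", "s3://finra-example/trades/2016/", "parquet"))
registry.register(
    DatasetEntry("legacy_warehouse", "/mnt/dw/legacy", "orc"))
```

A registry like this gives a migration team a single place to answer "where is this dataset now?" -- the kind of visibility Hitchingham credits with easing the transition.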
Hitchingham recommended newer users look at higher-level services when possible to reduce total cost of ownership and focus on the parts of their workloads that make a difference for the business.
"One of our big lessons ... is trying to leave that data center mentality behind," Hitchingham said. "[It's about] not thinking of people operating boxes and getting hands-on, it's really infrastructure becomes an extension of your application code base."
Trevor Jones is a news writer with TechTarget's Data Center and Virtualization media group. Contact him at firstname.lastname@example.org.