Pros of cloud computing: Costs, automation
With public cloud, a business avoids large upfront capital costs associated with on-premises alternatives. Rather than fork over hundreds of thousands of dollars for servers all at once, public cloud enables companies to spread their IT expenses into ongoing, monthly payments.
Not only are in-house servers expensive, but they also have firm capacity limits.
“Many businesses overprovision their data center resources because they want to avoid potential performance issues,” said Jay Lyman, principal analyst at 451 Research. Public cloud’s bursting capabilities enable a business to pay only for the resources it needs.
Cloud services also include more automation features than a typical on-premises data center. Rather than retain multiple layers of employees for tasks such as allocating new infrastructure, a cloud service provides application-centric automation functions for self-service provisioning, said Torsten Volk, managing research director at EMA.
Keeping a data center current requires manpower as well as money. Staff needs to manage system performance and secure network connections. With technology evolving at a rapid pace, corporations find themselves scrambling to find and hire qualified workers. By using public cloud services, businesses offload staffing challenges and reduce their own personnel expenses. “We have a client that supports 20,000 virtual machines on a global network with less than 100 IT workers,” Christiansen said.
Another key driver behind the push to the public cloud is the CFO’s demand to get out of the business of operating IT infrastructure, Volk said. Many business leaders do not view infrastructure management as strategic to the business and instead want to hand that task over to a third party.
Not even cloud is perfect
But there are cons of cloud computing that enterprises need to understand. Some workloads fit better on site. In fact, Uptime Institute found that despite the recent cloud hullabaloo, the percentage of applications running in corporate data centers has remained constant at 65% since 2014.
One reason is that cloud computing isn’t always cheaper than on-premises systems.
To help organizations determine how much services will cost, public cloud vendors offer pricing calculators. Some organizations have found such tools helpful.
Hightail, a file-sharing and collaboration platform vendor, began researching a move to Amazon Web Services (AWS) in 2014. The business, which has about 100 employees, operates data centers in the United States and the United Kingdom. Initially, Hightail was leery of relying on AWS tools to price its service because costs have many variables, said Shiva Paranandi, senior vice president of technology at Hightail. “The AWS solution has proven to be quite accurate in its estimates,” Paranandi said.
Since these tools come from vendors, they may not present businesses with the most efficient options for cloud services.
“The public cloud financial tools only examine basic functions, like average and peak system utilization,” said Andrew Hillier, CTO at Densify, a cloud predictive analytics supplier that until recently went by the name Cirba.
In terms of actual costs, the details matter. AWS, for example, charges customers to move data out of its cloud. A workload with heavy outbound traffic, such as an e-commerce application, may not be a good fit for public cloud. In response, third-party software, including Densify, CA Unified Infrastructure Management, CloudFabrix AppDimensions and VMware vRealize Operations, offers vendor-agnostic help so businesses can spot potential financial gotchas.
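To make the point concrete, the transfer charge alone can be sketched with simple arithmetic. The daily traffic volume and the per-gigabyte rate below are illustrative assumptions, not published AWS prices:

```python
def monthly_egress_cost(gb_out_per_day: float, rate_per_gb: float) -> float:
    """Estimate the monthly data-transfer-out charge for a 30-day month."""
    return gb_out_per_day * 30 * rate_per_gb

# An e-commerce app pushing 200 GB of outbound traffic per day,
# at an assumed $0.09/GB egress rate:
cost = monthly_egress_cost(200, 0.09)
print(f"${cost:,.2f} per month")  # $540.00 per month
```

At scale, a charge like this recurs every month for the life of the workload, which is why it surprises teams that only priced compute and storage.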
Performance is always a worry when weighing the pros and cons of cloud computing. Cloud moves applications and information out of central data centers, which are often located at the corporate office, to remote locations. The change often affects performance.
“You need to know how much database latency your application can endure while still delivering acceptable performance,” EMA’s Volk said.
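Volk's point can be made concrete with a little arithmetic: in a chatty application, each sequential database query pays the full round trip, so latency multiplies. The query counts and round-trip times below are illustrative assumptions, not measurements:

```python
def request_db_time_ms(queries_per_request: int, rtt_ms: float) -> float:
    """Total database wait per user request if queries run sequentially."""
    return queries_per_request * rtt_ms

# 50 sequential queries per page load:
on_prem = request_db_time_ms(50, 1.0)    # ~1 ms round trip inside the data center
cloud = request_db_time_ms(50, 20.0)     # ~20 ms round trip to a remote cloud region
print(f"on-prem: {on_prem} ms, cloud: {cloud} ms")
```

The same application that felt instant next to its database can cross a full second of pure network wait once the database moves to a distant region, which is why chatty workloads often need rearchitecting, not just relocating.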
The enterprise network may become a bottleneck. With public cloud, traffic moves out of the data center and over the WAN. Necessary upgrades to those connections can be tedious and expensive.
In the spring of 2016, PrimeSource Building Products, a building materials manufacturer, decided to migrate its SAP ERP application to the public cloud with the help of Virtustream, a cloud provider and Dell Technologies business unit. “Initially, we had a 20 Mbps [multiprotocol label switching] connection but needed more bandwidth,” said Tony Caesar, PrimeSource’s CIO. The company then upgraded to a 100 Mbps burstable connection.
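The bandwidth question lends itself to a back-of-the-envelope estimate. As a sketch, assuming a hypothetical 2 TB migration payload and 80% effective link utilization (neither figure comes from PrimeSource), the transfer time at each link speed works out to:

```python
def transfer_hours(gigabytes: float, mbps: float, efficiency: float = 0.8) -> float:
    """Hours to move `gigabytes` over an `mbps` link at partial utilization."""
    megabits = gigabytes * 8 * 1000  # GB -> megabits (decimal units)
    return megabits / (mbps * efficiency) / 3600

# A 2 TB payload over the two link speeds PrimeSource cited:
print(f"20 Mbps:  {transfer_hours(2000, 20):.0f} hours")
print(f"100 Mbps: {transfer_hours(2000, 100):.0f} hours")
```

Roughly 278 hours shrinks to about 56, which illustrates why an initial bulk migration, not just steady-state traffic, often drives the WAN upgrade decision.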
Public cloud providers often offer only basic network protocol support: HTTP/HTTPS and TCP. Customers may require more. The provider’s data center could be geographically far from users, which increases latency and diminishes performance. The public cloud provider may not support techniques — such as caching, compression and TCP optimizations — typically used to boost network performance.
Taking the first step
Once a company decides to move to public cloud, another consideration is a cloud migration tool. Cloud vendors have taken the lead here, too.
The AWS Server Migration Service, for example, automatically replicates live server volumes to AWS and installs machine images as needed. The system creates custom migration schedules and tracks their progress.
The migration process requires a detailed workload analysis. Here a business identifies application dependencies, sometimes discovering unwelcome system surprises. “We found that business units had installed applications without input from the IT department,” Caesar said.
Third-party migration tools are available to address such problems. Racemi sells the DynaCenter migration tool. RiverMeadow Software offers live workload cloning for workload testing. Virtustream designs new workflows and consumption models.
While the various cloud planning tools are helpful, they also have shortcomings. These products focus on the most common application migration scenarios. On-premises and cloud storage systems often manage data in fundamentally different ways, so custom development may be needed to bridge the gaps.
In addition, public cloud services are dynamic. AWS, for one, rolls out new features and service updates continually. Third-party vendors find it difficult to keep their tools current with such rapidly changing system designs.