Some organizations rush into a cloud migration, assuming cost savings are guaranteed. But not all applications belong in the public cloud, and moving them can end up costing more.
Consuming resources only when you need them seems like the most obvious way to increase efficiency. You can shut down an on-premises server to save pennies on power and cooling when it's not in use, but you can't recoup any of the capital costs, and most OS and software licensing models don't care how often you actually run the application. Paying for the bundled resource, delivered as a service, only when you need it should therefore save money. Except when it doesn't.
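To make that trade-off concrete, here is a rough back-of-the-envelope sketch. The capital, licensing, power and hourly cloud figures are hypothetical placeholders chosen for illustration, not numbers from the article; the point is only how the comparison shifts with utilization.

```python
# Back-of-the-envelope comparison of owning a server vs. paying for cloud
# capacity by the hour. All figures below are hypothetical placeholders.

HOURS_PER_YEAR = 8760

def on_prem_annual_cost(capex, years_amortized, annual_license,
                        power_cooling_per_hour, utilization):
    """Owning: capital and licensing are fixed regardless of utilization;
    only power and cooling scale with the hours actually used."""
    fixed = capex / years_amortized + annual_license
    variable = power_cooling_per_hour * HOURS_PER_YEAR * utilization
    return fixed + variable

def cloud_annual_cost(hourly_rate, utilization):
    """Pay-as-you-go: the bundled resource is billed only for hours used."""
    return hourly_rate * HOURS_PER_YEAR * utilization

# Hypothetical workload at three utilization levels.
for utilization in (0.10, 0.40, 0.80):
    own = on_prem_annual_cost(capex=12000, years_amortized=4,
                              annual_license=2000,
                              power_cooling_per_hour=0.05,
                              utilization=utilization)
    rent = cloud_annual_cost(hourly_rate=0.90, utilization=utilization)
    cheaper = "cloud" if rent < own else "on-premises"
    print(f"utilization {utilization:.0%}: "
          f"on-prem ${own:,.0f}/yr vs cloud ${rent:,.0f}/yr -> {cheaper}")
```

With these placeholder numbers, the lightly used workload favors pay-as-you-go while the busy one does not: the "except when you don't" case.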
Many applications just aren’t suited to run in a public cloud, for either technological or financial reasons, said David Linthicum, senior VP at Cloud Technology Partners based in Boston. To avoid paying more than they need to, organizations should carefully consider their application costs in an on-premises vs. cloud environment.
“It could be as many as 50% of applications in a traditional enterprise, and the average is about 30 to 40%,” Linthicum said. “You have to do the triage and understand the application portfolio — otherwise you will end up making dumb decisions and moving workloads to the cloud that will end up costing you more money.”
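That triage can be pictured as a simple filter over per-application cost estimates. The sketch below is a hypothetical illustration; the application names, cost figures and the keep-on-premises rule are invented for the example, not drawn from Linthicum's comments.

```python
# Hypothetical application-portfolio triage: flag apps whose estimated annual
# cloud cost exceeds their current on-premises cost. All data is illustrative.

portfolio = [
    # (app name, estimated on-prem $/yr, estimated cloud $/yr)
    ("batch-reporting",  18000,   9000),   # bursty, idle most of the time
    ("legacy-erp",       40000,  95000),   # licensing and I/O make cloud pricey
    ("customer-portal",  25000,  22000),
    ("data-warehouse",   60000, 110000),
]

move, keep = [], []
for name, on_prem, cloud in portfolio:
    (move if cloud < on_prem else keep).append(name)

print("candidates to move:", move)
print("keep on premises:  ", keep)
print(f"share staying put: {len(keep) / len(portfolio):.0%}")
```

In this made-up portfolio, half the applications stay put, echoing the upper end of the range Linthicum describes.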
Read the full SearchCloudComputing article by Nick Martin.