Since it hit the scene in earnest a few years ago, I’ve despised the word ‘cloud’ in the context of what I do for a living. I’ve warned people prior to using it in presentations, proclaimed my joy for having not used the word and bashed it most every time it was mentioned. I’m here to say that my position on ‘the cloud’ has matured now. I don’t hate the word. But I do hate how most people in the world are defining it.
My own definition has taken a long time to develop. I’ve known it was a powerful concept for some time. I’ve also known most of the IT world has been talking out of the side of their necks while they made their salaries by talking about it. I particularly enjoy people proclaiming they know all about the ‘next generation of cloud’ when we were all still defining the first one. At any rate, to define ‘cloud’, I first have to define ‘PaaS’ to my own satisfaction.
PaaS (noun) – short for Platform As A Service. It provides a complete application platform for users or developers (or both) by sufficiently abstracting that platform’s services away from the underlying technology they run on.
A little wordy, I know, but we need to be specific here. Not only must a PaaS provide a platform, but it must do it in a way that abstracts developers and users from administration of the platform. It must also handle all of the tertiary services (DNS, port forwarding, scalability, inter-connectivity, etc.) that administrators usually have to handle after the fact.
PaaS lets developers develop, and lets admins admin.
So what is cloud?
cloud (noun) – an implementation of a PaaS solution that is seamlessly and automatically scalable to handle load demands as they grow and shrink.
So you start with a PaaS and you build it out so it will grow and shrink automatically as needed for its workload.
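That “grow and shrink automatically” property boils down to a control loop somewhere watching load and adjusting capacity. Here’s a toy sketch of that idea; the thresholds and function names are entirely hypothetical, not any real provider’s API:

```python
# Toy autoscaling decision: grow when instances run hot, shrink when idle.
# Thresholds (75% / 25%) and the min/max bounds are illustrative only.
def desired_instances(current: int, load_per_instance: float,
                      high: float = 0.75, low: float = 0.25,
                      min_n: int = 1, max_n: int = 100) -> int:
    """Return how many instances the platform should be running."""
    if load_per_instance > high:
        current += 1          # scale out under load
    elif load_per_instance < low and current > min_n:
        current -= 1          # scale in when idle
    return max(min_n, min(max_n, current))

print(desired_instances(4, 0.90))  # 5 -> grow under load
print(desired_instances(4, 0.10))  # 3 -> shrink when idle
print(desired_instances(1, 0.10))  # 1 -> never below the floor
```

The point isn’t the loop itself — it’s that the PaaS runs it for you, invisibly, which is exactly the abstraction the definition above demands.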
What is NOT a cloud?
- provisioning virtual machines really quickly
- setting up a PaaS that is brittle and confining for developers and users
- writing 3 or 4 scripts to help automate your virtualization infrastructure
- almost everything being marketed as a cloud today
To define a cloud you have to define PaaS. PaaS is defined as that slick layer of magic that abstracts the application away from everything the application runs on or in. A cloud is a seamlessly scalable instance of a good PaaS. Easy, isn’t it? Step 3, profit!
A few months ago I wrote up a series of thoughts (and here, here, and here) about tools that you really shouldn’t wait on when starting up a new project. I totally stand by them.
Today I had a really great conversation with a proven IT/web heavyweight, and he’s starting up an IT project with as close to a non-existent IT budget as I’ve ever heard. Their codebase is Python (lots of Django love), so it’s living in Google Project’s realm. Most of their other services (CI is what we primarily talked about) were running on Heroku (http://www.heroku.com/), which at the levels they currently need, is free.
So they’re pretty effectively leveraging IaaS and PaaS offerings at the low, free levels to get their company started. But are they really saving money or are they just deferring the cost with interest like all of our student loans?
With the math I’ve done in the past and recently, IaaS and PaaS are most cost-effective for high-intensity, low-duration operations. At 5AM one of the biggest uses we’ve found for AWS is processing large chunks of scientific data. When you’re dealing with a service you want up 24/7/36[5-6], unless it’s AWFULLY lightweight, it doesn’t end up being a cost-effective candidate for pushing to your “cloud” provider. I think there are still a few years of truth in these statements. Eventually it will become too cheap to not do it, but that’s a little ways off, still. A company still needs its “40 Acres” somewhere.
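The back-of-the-envelope math behind that claim is simple enough to show. All of the rates below are made-up placeholders, not real AWS (or anyone’s) pricing — the point is the shape of the comparison, not the numbers:

```python
# Hypothetical rates for illustration only -- not real provider pricing.
HOURLY_RATE = 0.50          # rented cost per instance-hour
OWNED_SERVER_MONTHLY = 250  # amortized monthly cost of owned hardware

def iaas_monthly_cost(instances: int, hours_per_month: float) -> float:
    """Cost of renting `instances` machines for `hours_per_month` each."""
    return instances * hours_per_month * HOURLY_RATE

# High-intensity, low-duration: 20 instances crunching data for 10 hours.
burst = iaas_monthly_cost(instances=20, hours_per_month=10)

# Always-on service: 1 instance running 24/7 for a 30-day month.
always_on = iaas_monthly_cost(instances=1, hours_per_month=24 * 30)

print(f"burst job:     ${burst:.2f}")      # $100.00
print(f"24/7 instance: ${always_on:.2f}")  # $360.00
print(f"owned server:  ${OWNED_SERVER_MONTHLY:.2f}")
```

Two hundred instance-hours of burst work come out cheaper than a single rented machine left on all month, and the always-on rental loses to the owned box. Swap in your own real rates and the crossover point moves, but the pattern holds.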
So what about these guys? I’m entering into a little bit of speculation here, as I’ve never used Heroku, and I’ve only lightly used Google Project, but both of these are pretty highly customized environments. It would seem to me that the longer you live exclusively in these environments, the more your code will become dependent on the idiosyncrasies of those environments. I don’t think these environments are bad or undesirable. I’m just saying that I think they will diverge from a “vanilla” build by their very nature, and bringing an application back from one of these environments to run on bare metal or a “regular” virtual machine will be increasingly difficult.
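One way to hedge against that lock-in is to keep a thin adapter layer between your application and the platform’s services, so the platform-specific part stays in one place. A minimal sketch, assuming a hypothetical key-value service — the class names and the `STORE_BACKEND` environment variable are mine, not any real platform’s API:

```python
# Sketch: isolate platform idiosyncrasies behind an adapter so the app
# can move between a PaaS and bare metal. Everything here is illustrative.
import os

class KeyValueStore:
    """The minimal interface the application codes against."""
    def get(self, key): raise NotImplementedError
    def set(self, key, value): raise NotImplementedError

class InMemoryStore(KeyValueStore):
    """Stand-in backend for a vanilla VM, bare metal, or tests."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def set(self, key, value):
        self._data[key] = value

def make_store() -> KeyValueStore:
    # On a PaaS, you'd select the platform-native backend here, driven by
    # configuration the environment injects -- this is the only spot that
    # knows which platform you're on.
    backend = os.environ.get("STORE_BACKEND", "memory")
    if backend == "memory":
        return InMemoryStore()
    raise ValueError(f"unknown backend: {backend}")

store = make_store()
store.set("greeting", "hello")
print(store.get("greeting"))  # hello
```

It doesn’t make the divergence disappear, but it shrinks the surface you’d have to rewrite if you ever bring the application back in-house.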
So, provided that cost exists, the longer you defer the cost, the higher it will be. Is it worth it? I don’t know.
If someone has experience either way, I’d love to hear about it. I know I’m going to continue thinking about it…