Archive for November, 2011
Hello Little Printer, the fun gadget that brings the web to you http://t.co/MBUgOtK3 Ambient Devices should have done this 5 years ago
When I started Sonian in 2007, one of the driving forces for beginning what would become my third start-up journey was the allure of all-you-can-consume “ten-cent-per-hour” cloud computing. Amazon Web Services was the new IT game changer in town, and the on-demand compute platform they launched in August 2006 brought cloud computing to the masses overnight. Over these past five years I have studied “cloud costs” in different ways, and this weekend I looked back at compute pricing history and uncovered some interesting trends.
Before I continue with this post, here’s a brief history of my experience with previous “clouds,” which illustrates why in 2006 I was ready to take a big leap into the AWS cloud as an early adopter.
In 2004 I was involved with another SaaS information archiving project, and I worked with a team at SUN Microsystems to create a reference architecture for our archive software stack to live on the “SUN Utility Compute Grid.” At the time we were hosting the archiving software on dedicated co-located hardware racks, and planning a large capital expenditure to increase capacity. In the spirit of “there has to be a better way!” we entertained the idea of moving our software to SUN’s “cloud.” (In 2004 the term cloud computing was pretty alien; the common term for this type of shared virtual computing was “utility computing.”) SUN offered the promise of true utility computing, but at the end of six months of effort, we could not make the underlying cost structures work. SUN was charging one dollar per CPU hour and one dollar per gigabyte per month for storage. We ended up adding more capacity to our existing co-located hardware plant because our “all-in” internal unit costs were less than what SUN was willing to sell their compute grid for.
Now back to the purpose of this post: a historical analysis of cloud CPU costs from 2006 through 2011.
Beginning in August 2006, Amazon Web Services’ new Elastic Compute Cloud (EC2) service introduced the concept of the EC2 Compute Unit (ECU): a standardized way to define a unit of cloud computing, the associated characteristics of that unit (processor speed and memory), and a revolutionary hourly cost model requiring no up-front expense. Amazon achieved what SUN, IBM and others had been talking about for years but could never bring to market. In 2006, one EC2 Compute Unit could be rented for ten cents per hour with no up-front costs. A single ECU was defined as equivalent to a 1.0–1.2 GHz 2007 Opteron or 2007 Xeon processor with 1.7 GB of RAM. This 1 ECU reference is still in effect today.
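To put the two pricing models side by side, here is a quick back-of-the-envelope sketch. The $1.00 per CPU-hour rate (SUN, 2004) and the $0.10 per ECU-hour rate (EC2 at launch, 2006) come from this post; the hours-per-month figure is my own simplifying assumption for an always-on workload.

```python
# Illustrative comparison of the two per-hour pricing models discussed above.
# Rates are from the post; HOURS_PER_MONTH is an assumed average (8760 / 12).

SUN_RATE = 1.00        # dollars per CPU-hour (SUN Utility Compute Grid, 2004)
AWS_RATE = 0.10        # dollars per ECU-hour (EC2 at launch, 2006)
HOURS_PER_MONTH = 730  # average hours in a month

def monthly_compute_cost(rate_per_hour: float, units: int = 1) -> float:
    """Cost of running `units` compute units around the clock for one month."""
    return rate_per_hour * units * HOURS_PER_MONTH

print(f"SUN grid, 1 CPU: ${monthly_compute_cost(SUN_RATE):,.2f}/month")
print(f"EC2, 1 ECU:      ${monthly_compute_cost(AWS_RATE):,.2f}/month")
```

Running one unit continuously comes out to roughly $730/month on the SUN grid versus $73/month on EC2, a 10x difference that helps explain why the 2004 numbers never penciled out while the 2006 ones did.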
@jrichards Inspiration… going to get a fire going tonight. First of the season.
I have just returned from a week in the United Kingdom meeting Sonian customers and business partners. The purpose of the trip was to expand Sonian relationships, but an added benefit was the opportunity to glean perspectives and adoption attitudes toward cloud computing in the greater EU market. Sonian is in a unique position to observe cloud adoption trends since our SaaS service is powered by true cloud computing infrastructure, and conversations with EU business and technology leaders revealed their honest thoughts about the state of cloud computing and indicators of adoption curves in 2012 and beyond.
The Buying Market
The EU market is not one single cohesive market, but rather smaller subsets that share some ideas and diverge on others. The UK and Ireland are (as you would expect) similarly aligned, and appear to share more in common with the Scandinavian countries than with Germany and France, which have their own country-centric views of the cloud. The French language institute can’t even come to agreement on what to call “cloud computing” in France, settling on “informatique en nuage” as a placeholder but still searching for a unique French term that doesn’t break their rules on language purity and consistency (another example: software companies are called “éditeurs de logiciels,” literally software “editors,” though éditer in this sense means “to publish”). From my observations, the UK, Ireland and the Scandinavian countries share more in common with US thinking about the cloud than Germany, France, Italy and Spain, which diverge on a number of key issues around data locality.
The total addressable information technology market in the EU is roughly equal to that of the US, except that instead of a single national set of business rules, the EU market is fractured into separate countries, languages, tax systems and local business customs. This fragmentation dramatically reduces the business efficiency of technology providers attempting to serve the EU community. The cloud could be seen as an antidote to this inefficiency: imagine an “EU Cloud” operating in a locality that pleases all consumers and is the trusted provider. But with the current barriers to a cohesive EU business strategy, expecting a single EU cloud to be accepted feels like a stretch goal.
The Role of Government
The role of government in EU countries is more pronounced than in the United States, but there is no evidence yet that EU governments are pushing cloud computing onto the private sector as a general trend. Just recently the UK government established its “G-Cloud” initiative, which looks similar to the US government’s Cloud.gov and Data.gov initiatives. This trend could be described as a “lead by example” scenario, with central government adopting cloud computing as proof that it is safe, cost effective and viable for the private sector. A myriad of data-handling regulations seek to enforce “privacy” and “resiliency” to ensure citizens are protected from unauthorized access to personal information.