I attended a Cloud Interoperability Forum in Mountain View yesterday, hosted by Stephen O'Grady from Redmonk and David Berlind from InformationWeek. I counted roughly 50-60 in attendance, with a moderate drop-off after lunch.
The Twitter stream is available under #cloudinterop.
Here are my takeaways, the day after....
Cloud Taxonomy: aka "What we have here is, failure to communicate"
TL;DR version of this post: We think we know what we're talking about when we discuss "cloud computing". We really don't know what we're talking about - there's a lot of confusion, and it's rapidly becoming a marketing term. Thus, a taxonomy would be useful, if we're ever going to foster interoperability or portability.
Out of everything, I think the desire and will to build a taxonomy was the main outcome of the meeting.
Diversity of Clouds
Clouds come in many shapes and sizes. Infrastructure, developer platforms, storage services, etc.
There's a groundswell of "me too" infrastructure-as-a-service cloud plays, and they're the ones that want/need interoperability the most. I worry that this tends to drown out the conversation, and I'm not sure that this is what customers really are after (more on this later). The two Google App Engine guys (Architect & PM) in the room left after lunch, from what I could tell.
Interoperability at a platform level like Google App Engine or Salesforce becomes just like good old data integration - ETL, EAI, SOA, REST, etc. Some in the audience seemed to want to solve this latter problem (which is, to put it politely, a high hill to climb).
I spoke up and noted that we should try to understand the areas where there is broad agreement, and the areas where there is no broad agreement, and focus on the former. Because otherwise we're just going to wind up with a messy niche. This was echoed by several participants.
Even in areas of broad agreement, we're going to have a lot of work to do, weighing existing standards against the old assumptions baked into them, which may (or may not) still apply. For example, "Cloud Storage" was brought up as an area in need of standardization. But at what level? Management, provisioning, monitoring? Should it be a high-level API? Or something more like the specs the SNIA has put out? All of this requires a lot of thought as to the intended audience and the scope of use cases.
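To make the "high-level API" option concrete: a standard at that level would look something like the minimal sketch below. This is purely illustrative - the interface, method names, and in-memory backing are my own invention, not drawn from any actual spec - but it shows the shape of the question: a standard this thin says nothing about management, provisioning, or monitoring, which is exactly the scoping debate above.

```python
from abc import ABC, abstractmethod


class CloudStorage(ABC):
    """Hypothetical high-level storage interface a standard *might* define.

    Deliberately narrow: just object put/get/delete, with no opinion on
    provisioning, billing, monitoring, or access control.
    """

    @abstractmethod
    def put(self, bucket: str, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, bucket: str, key: str) -> bytes: ...

    @abstractmethod
    def delete(self, bucket: str, key: str) -> None: ...


class InMemoryStorage(CloudStorage):
    """Toy implementation standing in for any provider's backend."""

    def __init__(self) -> None:
        self._buckets: dict[str, dict[str, bytes]] = {}

    def put(self, bucket: str, key: str, data: bytes) -> None:
        self._buckets.setdefault(bucket, {})[key] = data

    def get(self, bucket: str, key: str) -> bytes:
        return self._buckets[bucket][key]

    def delete(self, bucket: str, key: str) -> None:
        del self._buckets[bucket][key]
```

A client written against `CloudStorage` could, in principle, swap providers freely - which is the portability argument. Everything the interface omits is where the interoperability work (and disagreement) actually lives.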
Openness, Ideology, and Standards
Bob Sutor, of IBM, stood up to speak to his experience of previous standards efforts. Two points struck me as debatable:
1. "The days of making boatloads of money on locked in technology are gone -- you're not going to get a patent and sit on it."
I agree with this, to some degree, but I think it may be misleading. It's easy to say that "nuclear weapons are no longer effective" when you sit on the largest stockpile of them. IBM has (and continues to collect) the world's largest patent library. And most of their software portfolio is proprietary, and will likely remain so.
No question, open standards and open source implementations are essential, but the hard part is balancing collaboration, adoption, and the desire to make money by (in part) excluding competitors. "Commercial open source" companies do this by offering proprietary add-ons. Even Red Hat does it, by barring third-party distributors from using its trademark. You'll also notice most ISVs certify their software on RHEL, not CentOS... as intended.
2. Bob held up the tale of REST vs. WS-* as a caution against ideology in developing cloud standards. Despite misgivings, "A lot of people made a lot of money on WS-*".
First, I respectfully think this misunderstands the role of ideology in standards-making. That sword cuts both ways, is all I'll say.
Second, I think it shortchanges the importance of architecture when defining interoperability standards. Do you build a Cloud API? Or a hypermedia format? Or a document exchange protocol? Or a data schema?
These things lead to drastically different market and business results, and hinge on decisions made on day one - the so-called "ideological" decisions, like "what's your architecture?". If all you want is for cloud providers to share the same API, I'm not even sure that's the main problem. Sure, it helps small providers in a small, burgeoning ecosystem, but I don't think that's what enterprise IT primarily cares about yet.
IMO, standards bodies are dangerous affairs for small companies. It's rare that they get a real seat at the table.
Open Implementations vs. Open Standards
A minor bun fight ensued over the tension between market dynamics and building software one can rely on beyond the lifespan of a company - or past a change in company policy one doesn't like.
Tim Bray noted that there is a visceral fear of lock-in among many of the companies he talks to. "Substitutability is everything". A senior tech executive from IBM noted that "substitutability focuses on a very narrow set of problems though - enterprise IT and CIOs have an integration problem to deal with".
Followed by various comments from the audience:
"The cost of Oracle maintenance is too much to deal with."
"Yet few are switching away from Oracle for new deployments."
I suspect Sun's acquisition of MySQL likely has something to do with the above discussion.
I don't really think there will ever be a resolution to substitutability vs. lock-in: it's a fundamental market dynamic that will be played out repeatedly in different ways.
Anyway, it seems we're back to the old nugget of standardizing for interoperability vs. portability, which I recall was the argument for WS-* over EJB back in the late 1990s. EJB supposedly gave you portability, and RMI/IIOP gave you interoperability, but it wasn't good enough because it (realistically) preferred Java on both ends. SOAP/XML was language agnostic and, better yet, supposedly "ideologically agnostic", so that VB developers could play on equal footing with C++ and Java developers.
At best, both have been "modest" successes. I lean towards believing that interoperability is the one with actual lasting business impact -- reducing transaction costs. Portability can do that too, but it's much more case-by-case. We really should be careful about which we prioritize, and in what area.
A second thread of discussion was how hard it is to build an open standard, and how difficult it is for one to actually gain traction and succeed. One participant suggested that open source implementations are a more effective means of interoperability: because an implementation is mechanism, it just works - it doesn't have to be (badly) interpreted by several organizations.
But this too has problems, which I and several others pointed out:
a) You CAN get locked into open-source software - switching costs are still pretty high, depending on how dependent you are. What happens if the project is taken in a direction you don't like? What happens if it doesn't address your needs? Well, you fork... which leads us to:
b) If there realistically can't be ONE open source project for an area of cloud computing, there will likely be several - ones that don't interoperate, and aren't portable.
Which leads us back to the need for open standards with (at best) reference implementations.
Interestingly both the Chairman and President of the DMTF were in attendance and were actively trying to foster dialogue, particularly around the need for a cloud taxonomy.
"Identity" and "Trust" are rat holes of epic proportions.
A significant chunk of the meeting was spent discussing the ability to carry federated identity across cloud providers. I chimed in that what's more important, I think, is carrying identity from location to location within one's own application.
I know this topic is near and dear to James Urquhart, and I agree that it's crucial for long-run adoption of a multi-provider marketplace. I unfortunately think that reality is quite a long way off.
But, this is a problem that goes beyond clouds, and I'm not sure this audience was the right one to wrangle with it.
We have plenty of answers, but we aren't asking the right questions, yet.
The audience largely fell into the trap of technologists rushing to solutions without thinking through the problems, or the audience they're targeting.
Many were focused on "scratching personal itches". Which is all well and good, but that's arguably what open source projects are for, not standards bodies.
There were a few comments expressing dislike of, or disinterest in, "academic standards" that try to do too much. I'd caution that what is academic in one person's eyes is essential in another's. And sometimes people mistake "academic" for "breadth" or "ambition". Are SNMP MIBs academic? They certainly look pointy-headed, until you realize how pervasive they are. How about all the CIM schemas at the DMTF? Aren't they useful? What about OVF? Certainly it doesn't do much today, but I bet they have broader plans for it.
Finally, there wasn't much discussion about what enterprises or CIOs want, despite the attempts of some audience members. To me, that's the biggest concern - bigger than the needs of the "cloud ecosystem" of small vendors, or the frustrations developers have with today's cloud platforms. We need to focus on what businesses actually want out of this technology.