"Open" is a controversial term in technology. It sounds like common sense to prefer "open" to "closed," but some technology professionals will say open is a buzzword and there is no real value for customers behind it. We need to differentiate between marketing jargon and reality when it comes to an open cloud, which requires that we first define the term.
Let's begin by understanding what open means. Open cloud is not the same as open source cloud. Open source defines a software licensing model. The fact that it is free is not what's most relevant for customers, and while other aspects like community-based development can be valuable, the most important aspect of openness is not about cost or how code is written.
Being open is about embracing the technology choices that organizations have made and giving them freedom to move across technologies, models and cloud providers. The value of openness is in the flexibility it gives customers. Open is not a licensing model or a consortium; it is a philosophy that encourages customers to do business with you because they want to, not because they have to.
Embracing an open cloud means there is no technology lock-in, no contractual lock-in and no services lock-in. It means providers don't dictate technologies and that competition is embraced. But what does all this mean for cloud providers in the real world?
For customers building cloud-aware applications -- those that use a cloud application programming interface (API) to control the infrastructure and provision resources dynamically -- fear of lock-in is a very compelling reason to use open technologies. Without that openness, customers that have built applications against a cloud provider's proprietary API may run into problems if they later need to change providers, particularly once an application has grown large and complex. They may discover that switching providers requires them not only to change the API calls but also to rethink or re-architect the entire application, which is often prohibitively expensive and impractical.
It is important to understand that an API is more than an application interface: It serves as an abstraction of the underlying model and the technology choices of the cloud provider. Building to a specific API implies adopting an architecture style, best practices, rules and designing an application around a specific implementation. That's the lock-in.
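The lock-in described above can be made concrete with a small sketch. The provider and function names below are invented for illustration: an application that codes against a thin, provider-agnostic contract can swap clouds with a configuration change, while one written directly against a proprietary API cannot.

```python
# Minimal sketch (hypothetical names) of how an API choice becomes an
# architectural commitment. "AcmeCloud" is an invented proprietary provider;
# "OpenStackCloud" is a stand-in for an OpenStack-compatible endpoint.

from dataclasses import dataclass, field
from typing import Protocol


class Compute(Protocol):
    """Provider-agnostic contract the application codes against."""
    def create_server(self, name: str, flavor: str, image: str) -> str: ...


@dataclass
class AcmeCloud:
    servers: list = field(default_factory=list)

    def create_server(self, name: str, flavor: str, image: str) -> str:
        # Proprietary provider: records the request, returns its own ID style.
        self.servers.append((name, flavor, image))
        return f"acme-{name}"


@dataclass
class OpenStackCloud:
    servers: list = field(default_factory=list)

    def create_server(self, name: str, flavor: str, image: str) -> str:
        self.servers.append((name, flavor, image))
        return f"os-{name}"


def provision_web_tier(cloud: Compute, count: int) -> list[str]:
    # The application depends only on the Compute contract, so changing
    # providers is a configuration decision rather than a rewrite.
    return [cloud.create_server(f"web-{i}", "m1.small", "ubuntu")
            for i in range(count)]
```

If `provision_web_tier` had instead been written against Acme-specific calls, replacing Acme would mean revisiting every call site -- which is exactly the architectural lock-in an open API avoids.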
OpenStack enables portability, federation and multi-cloud
Whereas 12 or 18 months ago there might have been some doubts about the readiness of OpenStack, today the industry recognizes it as the open alternative for the cloud. Organizations that embrace the idea of openness are looking to avoid lock-in and to enjoy portability, federation and the ability to do multi-cloud deployments.
Portability does not mean customers want to move workloads across providers on a day's notice. I can't imagine a customer wanting to run its ERP system with one provider one day and a different one the next. The value of a federation of clouds that share a similar architecture, interfaces (APIs) and management tools is the flexibility to move workloads if or when it is needed. It does not need to be magical or even automated. It just needs to be straightforward.
A good analogy is how J2EE applications are relatively easy to port from one application server to another, even if differences in implementations exist. The old Java slogan of "write once, run anywhere" was not true in the literal sense, but it was certainly possible to build an application that would run on more than one architecture, or to move an app to a different J2EE technology.
OpenStack portability is like an insurance policy for the future. Although there are a few differences among OpenStack-based clouds, customers generally can build applications to the OpenStack API that provision resources across multiple cloud providers, OpenStack private clouds (wherever they are hosted) and even dedicated infrastructure. This enables them to move applications from one open cloud to another and, just as importantly, to provision resources in multiple clouds simultaneously.
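In practice, OpenStack client tools such as the openstacksdk library typically read provider credentials from a clouds.yaml file, so the same codebase targets any listed cloud by name. A hedged sketch, with every endpoint and credential invented for illustration:

```yaml
# clouds.yaml -- one application, multiple OpenStack targets (names invented).
clouds:
  public-provider-a:
    auth:
      auth_url: https://identity.provider-a.example/v3
      username: demo
      password: secret
      project_name: demo
    region_name: RegionOne
  private-onprem:
    auth:
      auth_url: https://keystone.internal.example/v3
      username: demo
      password: secret
      project_name: demo
    region_name: RegionOne
```

With a configuration like this, switching the target from the public provider to the on-premises private cloud is a one-word change in the connection call; the provisioning code itself stays the same.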
Multi-cloud deployments and cloud federation are immediate benefits of using OpenStack.
A common API that enables a single codebase is only the most visible element of this multi-cloud technology. Anyone who understands the value of private clouds, and the power of being able to host them on- and off-premises while connecting them with public clouds, will appreciate having a single set of APIs to code for. Just as important, the same set of skills applies across clouds that share not only the same APIs but also the same foundational architecture.
Another benefit of embracing open technologies is that a community can innovate faster, draw on a broader talent pool and provide more options than a single vendor or cloud provider. With most proprietary technologies, all innovation happens inside one organization, and it is hard to imagine that all of the innovation for an entire industry should come from one place. Open communities are built on the wisdom of many. No single vendor, service provider, person or organization defines (or controls) the future; instead, it is a meritocracy where the best ideas become reality.
In the end, there is always room for both open and proprietary models. Many professionals are OK with a long-term commitment to a specific model and provider, and many of them will do great. But there are many others who believe in the power of open innovation, who will look for a platform that gives them flexibility to run the clouds they need on their own terms, where they want and with the provider they choose.
About the author:
Gerardo A. Dada is a product marketing leader at Rackspace, a cloud and managed hosting provider based in San Antonio, Texas. Dada is an experienced technologist with more than 15 years of experience in enterprise software and Web technologies.
In his current role, he is responsible for all product marketing initiatives at Rackspace, including defining the company's go-to-market product portfolio strategy, global launch readiness, market analysis and segmentation.