Bernard Golden, May 1, 2014
If you work in technology, you’d have to have been living under a rock not to have heard about Docker. In a nutshell, Docker provides a lightweight container for code that can be installed on a Linux system, providing both an execution environment for applications and partitioning that securely segregates sets of application code from one another. While this high-level description doesn’t sound that exciting, Docker addresses three key issues confronting application developers:
- Efficient resource use: One of the problems confronting IT organizations is how to get the most benefit from computing resources -- in other words, how to raise server utilization so that a server’s cost and power draw go toward useful computing rather than toward a machine that is running but performing no useful work. The previous solution to this issue was virtualization, which enabled a single server to support multiple virtual machines, each containing an operating system and a software payload. While virtualization helps address utilization, running multiple virtual machines, each with its own operating system, means that a large share of the server’s resources is tied up running operating systems rather than application code, which is where all the value resides. Said another way, the operating system is a necessary evil, but it’s not where business value lives. A solution that reduces the proportion of a server’s overall processing capacity devoted to running operating systems would be extremely valuable. Docker is that solution -- it requires only one operating system per server and uses containers to provide the segregated execution environments that individual virtual machines previously provided. My colleague Phil Whelan used the analogy of a server as a jar, which is most efficiently filled with sand rather than marbles; just so, containers are better at optimizing overall server use and waste less computing capacity (i.e., leave less “wasted space in the jar”) than virtualization.
- Workload encapsulation: A container offers exactly what it sounds like -- an environment to hold something. In Docker’s case, it holds a set of executable code that runs inside the container. Because the container encapsulates the execution code, it can be transferred from one location to another. This simplifies the application lifecycle: containers can be passed from one group to another, with no need for each group to recreate the same application in a different environment through recompiling and repeated configuration.
- Workload portability: It’s a fact of life that businesses use a variety of application deployment environments -- a single company may deploy applications into an on-premises VMware vSphere environment, a virtual private cloud run by an OpenStack-based provider, and Amazon Web Services. Each uses a different hypervisor and a different set of operational controls, which presents a challenge to organizations that want greater flexibility and choice in workload deployment. The previous vendor solution to this issue was OVF -- the Open Virtualization Format -- which promised workload portability but in practice ended up being a mechanism for transporting proprietary virtual machine images along with operational metadata. That reduced the vision of true workload portability to vendor-constrained islands of technology homogeneity, which didn’t really address end users’ objectives at all. By contrast, Docker containers are easily transported and run in any hypervisor environment that supports Linux -- which is all of them. Docker is therefore a much better solution to workload portability, and it addresses a key user desire. You’ll hear much more about how Docker enables workload portability over the coming months and years.
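The “one operating system per server” point above can be seen directly from the command line. In this sketch (assuming Docker is installed and the `busybox` image is available), the kernel version reported inside a container matches the host’s, because a container shares the host kernel rather than booting its own operating system the way a virtual machine does:

```shell
# On the host: report the running kernel version.
uname -r

# Inside a throwaway container: the same kernel version is reported,
# because the container shares the host's kernel instead of booting
# a guest operating system of its own.
docker run --rm busybox uname -r
```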
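As a concrete illustration of the encapsulation described above, here is a minimal, hypothetical Dockerfile (the application layout, `requirements.txt`, and `app.py` are invented for the example) that packages a Python application together with its dependencies. The resulting image is the container payload that can be handed from team to team without anyone recompiling or reconfiguring the app:

```dockerfile
# Start from an official Python base image.
FROM python:2.7

# Copy the application code into the image.
COPY . /app
WORKDIR /app

# Install the app's dependencies inside the image,
# so they travel with it wherever it runs.
RUN pip install -r requirements.txt

# Define what runs when a container is started from this image.
CMD ["python", "app.py"]
```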
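A sketch of what that portability looks like in practice (the image tag and file name are placeholders): an image built on one machine can be exported as a plain tar archive and loaded onto any other Linux host running Docker, regardless of the hypervisor or cloud underneath:

```shell
# On the source host: build the image and export it as a tar archive.
docker build -t myapp:1.0 .
docker save myapp:1.0 > myapp-1.0.tar

# Copy the archive to any other Linux host running Docker --
# a vSphere VM, an OpenStack instance, or AWS -- and load it there.
docker load < myapp-1.0.tar
docker run -d myapp:1.0
```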
Given the advantages Docker offers, it’s easy to understand why it has been so avidly embraced by the vendor and user communities. It makes resource use more efficient, encapsulates workloads, and provides far better workload portability.
On the other hand, Docker does not solve all application problems. In fact, its benefits expose a significant issue: if it’s easier to run and distribute workloads, then efficient creation and management of application workloads is all the more important. And Docker does nothing to ease application creation and management -- it merely does a fantastic job of deploying workloads once they are created.
And application creation and management is where Stackato shines. Its Cloud Foundry-based framework accelerates application development and management by providing easy-to-use code deployment inside a Docker container, as well as predefined and managed application data storage (i.e., a database). Moreover, Stackato makes it easy to grow and shrink the pool of Docker containers within which an application operates.
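To make that concrete, a deployment to a Stackato cluster looks roughly like the following sketch using the `stackato` client (the endpoint URL, user, and application name are placeholders, and the exact command set varies by client version, so treat this as illustrative): `push` uploads the code and stages it into a managed container, and the instance count can then be raised to grow the pool of containers serving the app:

```shell
# Point the client at your Stackato endpoint and log in
# (the URL, user, and app name here are placeholders).
stackato target https://api.stackato.example.com
stackato login user@example.com

# Push the application from the current directory; Stackato
# stages it into a managed container and wires up services.
stackato push myapp

# Grow the pool of containers running the app.
stackato instances myapp 4
```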
For organizations looking to ease application development and deployment while retaining maximum flexibility in deployment location, combining Docker and Stackato is the perfect solution. In fact, ActiveState believes this so strongly that it integrates Docker into its Stackato product.
So if you’re a company or IT organization looking to address the issue of workload portability, the combination of Docker and Stackato is a good place to start your search.
Use Stackato to obtain the workload portability you need. Build your own Stackato Cluster using up to 20GB of RAM running on your own infrastructure or public cloud. The cluster is free to use for internal testing or in production.