"So let me get this straight. You want to build an external version of the Borg task scheduler. One of our most important competitive advantages. The one we don’t even talk about externally. And, on top of that, you want to open source it?"
Craig McLuckie, co-founder of Kubernetes and Senior Product Manager at Google
Cloud computing has become progressively more relevant over the past decade and continues to be fundamental to businesses far and wide. Before the shift to the cloud, server administrators hosted web services on physical hardware that they owned or rented. This approach worked, but it did not scale well and had a very high barrier to entry. Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. The term is generally used to describe data centers made available to many users over the Internet. The large clouds that are predominant today often distribute these functions across multiple locations; when such a location is relatively close to the user, it may be designated an edge server.
Running a data center requires purchasing and maintaining hardware, staffing the site to keep it running smoothly, and meeting many other operational requirements. When utilizing cloud services, on the other hand, we pay to use a slice of the cloud provider’s hardware, and the provider attends to data center maintenance and upgrades. That maintenance cost is distributed across all of the provider’s clients, which makes the final cost much lower for any individual customer.
Whether we are anticipating increased traffic or seeing unexpected spikes, cloud architecture allows us to easily scale the number of servers we use. Instead of purchasing more hardware or upgrading specifications, we use the administration console to scale up, and within a few minutes additional servers are available. We can scale back down just as quickly once the demand passes.
Instead of running a single server for every application or colocating countless applications on a single server, we can run a pool of servers and isolate each application in its own container. A container offers benefits similar to a virtual machine (VM), such as separation of concerns and isolation of each server process, while being considerably lighter weight. Because a container carries its own configuration, the same container image can run on a production server and on a developer’s local machine, streamlining the development process.
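As a minimal sketch of the idea (the base image, port, and file names here are hypothetical), a Dockerfile describes exactly how an application is packaged, so the resulting image behaves the same on a laptop and on a production server:

```dockerfile
# Hypothetical example: package a Node.js web service as a container image.
FROM node:20-alpine
WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY package*.json ./
RUN npm ci --omit=dev

# Copy the application source and declare the port the service listens on.
COPY . .
EXPOSE 8080

CMD ["node", "server.js"]
```

An image built from a file like this can be started locally with `docker run` or scheduled onto a cluster unchanged, which is what makes the development and production environments match.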
Kubernetes was originally designed by Google, drawing on more than 15 years of its internal deployment practices; today it is maintained by the Cloud Native Computing Foundation. Kubernetes is an open-source container orchestration platform that eliminates many of the manual processes involved in deploying and scaling containerized applications. Containers have proven to help with predictability, scalability, and security, but they also create the need for an intricate infrastructure with per-container configuration that can be updated automatically across environments; Kubernetes is a popular orchestrator for that infrastructure. It helps with container configuration, persistent data volumes, networking, and security through isolated namespaces and secret management.
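To make the namespace and secret-management point concrete, here is a hedged sketch (the namespace name, secret name, and values are all assumptions, not real credentials) of how configuration can be isolated per team:

```yaml
# Hypothetical example: an isolated namespace with a secret scoped to it.
apiVersion: v1
kind: Namespace
metadata:
  name: web-team            # assumption: one namespace per team
---
apiVersion: v1
kind: Secret
metadata:
  name: db-credentials
  namespace: web-team       # only workloads in this namespace can mount it
type: Opaque
stringData:                 # stored by Kubernetes, not baked into images
  DB_USER: app
  DB_PASSWORD: example-only # placeholder; real values come from a secret store
```

Containers in the `web-team` namespace can then reference `db-credentials` as environment variables or mounted files, keeping credentials out of the container image and out of the deployment manifest itself.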
Kubernetes allows us to change our mindset: instead of pushing changes up to a specific server, we define a deployment, and the cluster finds the best place within the pool of servers to run it. It constantly runs health checks, so when we make a code change and push a new version, the cluster ensures the new version is ready before users are routed to it. This allows zero-downtime deployments: if a deployment fails, Kubernetes notifies us, and users never see an error page.
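The workflow above can be sketched as a Deployment manifest. This is an illustration, not our production configuration; the image name, port, and health-check path are assumptions:

```yaml
# Hypothetical example: a Deployment that rolls out new versions with
# zero downtime by replacing pods only after they pass a readiness check.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0     # never take a healthy pod down early
      maxSurge: 1           # bring up one new pod at a time
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: registry.example.com/web:1.2.0   # assumed image
          ports:
            - containerPort: 8080
          readinessProbe:   # traffic is routed only after this passes
            httpGet:
              path: /healthz
              port: 8080
            initialDelaySeconds: 5
            periodSeconds: 10
```

With a manifest like this, `kubectl apply -f deployment.yaml` starts the rollout and `kubectl rollout status deployment/web` watches it; if the new pods never become ready, `kubectl rollout undo deployment/web` returns to the previous version while users continue to be served by the old pods.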
Over the years, top companies from around the world have embarked on a Kubernetes journey; Spotify, Bose, eBay, Adform, and Pokémon GO all have a story to tell. For more information about Kubernetes, our deployment infrastructure, or how to integrate containers into your development process, contact us.