4 Ways Docker Makes Engineers More Efficient

Stephen Grider

Udemy for Business instructor

September 12, 2019

Docker containers. If you’ve attended any tech conferences or read industry publications in the last few years, you’ll know that the terms ‘Docker’ and ‘container’ have seen growing buzz. While the words are often used interchangeably in the software engineering and IT space, they refer to two different things: Docker is an open-source platform used for the creation of software packages called containers, while a container is the packaged, runnable unit of software itself.

Docker use has surged in recent years. For the first time in its history, the 2019 annual Stack Overflow Developer Survey included questions about Docker adoption, and respondents named it the third most commonly used platform after Linux and Windows. Though container adoption is growing, I find that the technology behind Docker is often misunderstood. In this blog, I will explain some common misconceptions about Docker containers, their benefits, and how Docker can be used with Kubernetes. You can also learn more in my Udemy course, Docker and Kubernetes: The Complete Guide.

What is Docker used for?

Docker has become especially useful in the DevOps process as a way to automate some of the manual tasks of a DevOps Engineer. The DevOps role typically involves troubleshooting complex problems such as provisioning servers and configuring them to run the software in your company’s tech stack. A DevOps Engineer might once have written a unique configuration file for a single server, but with Docker the engineer can write one configuration and use it across many instances, avoiding cumbersome manual configuration.
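
As a rough sketch of what such a single, reusable configuration can look like, here is a minimal Dockerfile for a hypothetical Node.js service; the base image, port, and start command are illustrative assumptions rather than a recommendation:

    # Illustrative Dockerfile for a hypothetical Node.js service.
    # The same file builds an identical image on a laptop, a CI server, or a production host.
    FROM node:12-alpine
    WORKDIR /app

    # Install dependencies first so this layer is cached between builds.
    COPY package*.json ./
    RUN npm install --production

    # Copy the application source and declare how the container starts.
    COPY . .
    EXPOSE 3000
    CMD ["node", "server.js"]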

An example of Docker’s use in industry is the music-streaming service Spotify, which runs a microservices architecture with nearly 300 servers for every engineer on staff. This large number of microservices put a strain on the team’s deployment pipeline. By implementing Docker, the teams could move a single container across their CI/CD pipeline, and developers could guarantee that the container that passed the build and test process was the same one used in production.
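
A rough sketch of that “build once, promote everywhere” idea with the plain docker CLI might look like the following; the registry, image name, tag, and test command are hypothetical:

    # In CI: build and tag the image exactly once.
    docker build -t registry.example.com/myapp:1.4.2 .

    # Test the exact image that was just built, in a throwaway container.
    docker run --rm registry.example.com/myapp:1.4.2 ./run_tests.sh

    # Push the tested image; staging and production pull this same immutable tag.
    docker push registry.example.com/myapp:1.4.2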

What are Docker containers?

Containers package everything required to build and ship an application via something called a container image. The container image is the blueprint for the container, specifying its operating system, language libraries, environment variables, and more. Multiple containers can be created from the same container image. The container itself is a live computing environment, while the image is the set of instructions for setting up that environment. You might think of the container image as a cake recipe and the container as the cake baked from it.
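
To make the recipe-and-cake analogy concrete, here is a minimal docker CLI sketch; the image and container names are made up purely for illustration:

    # "Write the recipe": build an image from the Dockerfile in the current directory.
    docker build -t demo-image .

    # "Bake two cakes": start two independent containers from the same image.
    docker run -d --name demo-one demo-image
    docker run -d --name demo-two demo-image

    # Both containers show up as separate running environments built from one image.
    docker ps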

Docker is the sandbox where developers configure the very specific dependencies, operating systems, and libraries their applications and projects need in a container. The use of containers allows these configured packages to be used across computing devices: from your laptop to your coworker’s laptop to a cloud server, a single container image ensures they are all running the same operating system and programs.

4 benefits of Docker for developers and IT

Across the board, the use of Docker and containers drives efficiencies for both development and IT teams. When developing applications on their local machines, developers need to ensure they have very specific versions of the software and tools required for their projects, and Docker helps with this. Specifically, those benefits include:

  1. Onboard teams fast—For engineers starting at a new company, it’s not unusual to spend your first few days on the job working with IT to set up new environments on your system. Docker helps get new team members up to speed quickly by easily replicating a full environment. Teammates can run a Docker configuration file on their system and start contributing to company projects the same day. 
  2. Straightforward, consistent collaboration—In addition to fast onboarding for new employees, Docker simplifies how engineers collaborate on projects without worrying about a common setup. Everything needed to work with a fellow developer or a DevOps engineer in a different office can be found in the container. 
  3. Cost-efficient—In pre-Docker and container times, you’d typically deploy to a virtual machine (VM): one computer running one piece of software. Even software that might never be executed still claimed space on its own VM. With Docker, however, multiple containers fit on a single instance. A team can run many software containers on a single VM, including those that will never be executed in the final product, which is a more budget-friendly way for development teams to scale. 
  4. Secure, fast deployment—Security threats are everywhere, and one way to mitigate the possible effects of running untrusted code from a third party is to create a container and run the code there. Say there was something harmful in that untrusted code: once I find it, I can simply delete the container without having compromised my entire system. Because a container is an isolated environment, it won’t affect the rest of the computer; nothing it does is persisted outside the container (see the sketch after this list). 
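
As a minimal sketch of the isolation idea in the last point, you might run suspect code in a throwaway container; the image and workflow here are only an assumed example:

    # Start a disposable container with no host directories mounted.
    # --rm deletes the container, and everything it wrote, as soon as the shell exits.
    docker run --rm -it ubuntu:18.04 bash

    # Inside the container: fetch and run the untrusted code, inspect what it did,
    # then simply exit; the container's filesystem changes are discarded.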

Want your development team to start building containers with Docker? Request a Udemy for Business Demo to learn more about our development courses. 

How Kubernetes is used with Docker 

A term often used in conjunction with Docker is Kubernetes, an open-source platform originally developed by Google for the management and orchestration of containers. The most widely used implementations of Kubernetes are Google Kubernetes Engine, which runs on Google Cloud, and Red Hat’s OpenShift, popular for hybrid cloud uses.

Why use Kubernetes with Docker containers? 

  • Permits multiple Docker containers to work together—With small applications, there’s likely one container running on a server and nothing else; it’s simple to manage. With larger applications, many containers need to run correctly together. Docker by itself won’t solve the problem of getting different containers to work together. That’s where Kubernetes comes into play; it allows developers to run different containers that communicate with each other. 
  • Self-healing—Kubernetes has built-in features to help teams administer large numbers of containers across many servers. If anything goes wrong when running a container in a Kubernetes environment (say, a bug in the code crashes the entire server), Kubernetes will automatically detect the failure and bring the container back online. 
  • Easy horizontal scaling—Kubernetes provides an easy solution for scaling an app. It can monitor the resources a container is using, and if the container uses too much RAM or CPU for some time, Kubernetes will automatically launch additional containers to handle the load. When those extra containers are no longer needed, Kubernetes shuts them down again (a minimal manifest sketch follows this list). 
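
As a rough sketch of what self-healing and scaling look like in practice, here is a minimal Kubernetes Deployment manifest; the names, image, and resource figures are illustrative assumptions:

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: myapp
    spec:
      replicas: 3              # Kubernetes keeps three copies running and restarts any that crash.
      selector:
        matchLabels:
          app: myapp
      template:
        metadata:
          labels:
            app: myapp
        spec:
          containers:
          - name: myapp
            image: registry.example.com/myapp:1.4.2
            resources:
              requests:
                cpu: 250m      # Resource requests let the autoscaler reason about load.
                memory: 256Mi

Pairing a Deployment like this with a HorizontalPodAutoscaler, for example via kubectl autoscale deployment myapp --cpu-percent=80 --min=3 --max=10, is what provides the automatic scale-out and scale-in described above.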

Docker ≠ Virtual Machine

Finally, Docker shouldn’t be used as the single answer to every platform question. Many developers know that Docker is useful for deployment and managing servers, but it’s not a one-stop shop for every case. Docker should be used for specific problems. Likewise, some of the benefits of Docker for development teams won’t always translate to the project at hand or your company’s specific needs. Like all technical solutions, it’s one of many to consider.

The most notable misconception is the idea that a Docker container is a 1:1 alternative to a virtual machine (VM). It’s not. The team at Docker does a good job of explaining the difference between the two by noting that Docker is not a virtualization technology; it’s an application delivery technology. 

Ready to utilize Docker’s many efficiencies across your development, IT, and DevOps teams? Join over 70,000 students in my course Docker and Kubernetes: The Complete Guide and learn Docker fundamentals from the ground up—no experience required!

About the author:

Stephen Grider has been building complex JavaScript front ends for top corporations in the San Francisco Bay Area. With an innate ability to simplify complex topics, Stephen has been mentoring engineers beginning their careers in software development for years, and has now brought that experience to Udemy, authoring the highest-rated React course. He teaches on Udemy to share the knowledge he has gained with other software engineers. Invest in yourself by learning from Stephen's published courses.

About Udemy for Business:

Udemy for Business is a learning platform that helps companies stay competitive in today’s rapidly changing workplace by offering fresh, relevant on-demand learning content, curated from the Udemy marketplace. Our mission is to help employees do whatever comes next—whether that’s the next project to do, skill to learn, or role to master. We’d love to partner with you on your employee development needs. Get in touch with us at business@udemy.com
