Born out of open source collaboration, Docker helped revolutionize the software development world. By packaging software into units of code called containers, which include all the resources the software needs to run on a server—tools, runtime, system libraries, and so on—Docker lets that software perform the same way across multiple hosting platforms. Docker’s container technology is at the forefront of portable, scalable development.
Today developers are using Docker to build modules called microservices, which decentralize packages and divide tasks into separate, stand-alone apps that collaborate with each other. Developers for a nationwide pizza chain might build microservice applications for taking an order, processing a payment, creating a ‘make’ ticket for the cooks, and a delivery ticket for the drivers. These microservices would then operate together to get pizzas cooked and delivered all over the country.
Designing with microservices in Docker requires new thinking and approaches, but it also creates unparalleled abilities for building stable, scalable apps. Here’s a look at the ins-and-outs of microservices and how to make them work for you.
From monolithic to microservices: planned decomposition.
What Are Microservices?
Developing in microservices is the art of breaking down the old model of building one large, “monolithic” application and replacing it with specialized, cloud-hosted sub-applications—each charged with a very specific task—that work together. Microservices distribute application load and can help ensure stability with replicable, scalable services interacting with each other. Learn more about monolithic vs. microservices here.
But what’s the right approach for breaking a monolithic app apart? When deconstructing an application into modules, engineers tend to follow planned decomposition patterns, sorting the new software modules into logical working groups.
For example, a grocery chain’s shipping and tracking software that currently uses one application for fruit might decompose into modules that process bananas, oranges, and so on. This may improve aspects of tracking, but decomposing software along logical subdomains—fruit types in this instance—can have unforeseen consequences for the business.
Martin Fowler, author and highly regarded software development expert, examines the trap of hyperfocus on decomposition by subdomain:
“When looking to split a large application into parts, often management focuses on the technology layer, leading to UI teams, server-side logic teams, and database teams. When teams are separated along these lines, even simple changes can lead to a cross-team project taking time and budgetary approval.”
Microservice architecture takes a different approach to organizing modules. It decomposes applications around business capabilities, building cross-functional teams to develop, support, and continually deploy microservices. Fowler emphasizes the “products not projects” approach to business-focused decomposition: delivering a package isn’t a one-time project with a team that breaks up on completion, but an ongoing, collaborative commitment to continually delivering excellent product.
Microservices also decentralize the traditional storage models found in monolithic application development. Microservices work best with native management of their own data stores: either repeated instances of the same database technology, or a blend of separate database types chosen as most appropriate for each service. This is the full realization of an approach first outlined by developer Scott Leberknight, who called it Polyglot Persistence. The ability to mix and match data store types presents myriad possibilities for microservice developers.
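As a sketch of polyglot persistence, imagine two hypothetical services—orders and catalog—each owning its own data store. A Docker Compose file for them might look like this (service and image names are illustrative, not from any real deployment):

```yaml
version: "3.8"
services:
  orders:
    image: example/orders-service     # hypothetical order-taking microservice
    environment:
      DATABASE_URL: postgres://orders-db:5432/orders
    depends_on:
      - orders-db
  orders-db:
    image: postgres:16                # relational store suits transactional orders

  catalog:
    image: example/catalog-service    # hypothetical product-catalog microservice
    environment:
      MONGO_URL: mongodb://catalog-db:27017/catalog
    depends_on:
      - catalog-db
  catalog-db:
    image: mongo:7                    # document store suits flexible catalog data
```

Because each service talks only to its own database, either store can be swapped, scaled, or upgraded without touching the other.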
The advantages of the microservice approach are still being explored. So, as with all systems, be aware of potential pitfalls and limitations of the practice.
Microservices are a powerful approach to architecture, but they aren’t challenge-free.
Challenges of Building a Microservice Architecture
The power and possibilities that microservices unlock come with common challenges to address during design and to manage on an ongoing basis:
Distributed service tracking
Services distributed across multiple hosts can be hard to track. Rather than a single place to tweak a monolithic app, collaborating microservices scattered throughout your environment need to be inventoried and quickly accessible.
Rapid resource scaling
Each microservice consumes far fewer resources than a monolithic application, but remember that the number of microservices in production grows rapidly as your architecture scales. Without proper management, a lot of little hosts can consume as much compute power and storage as a monolithic architecture, or more.
Inefficient minimal resourcing
If you’re utilizing the Amazon Web Services environment, there is a bottom limit to the resources you can assign to any task. Microservices may be so small that they require only a portion of a minimal EC2 instance, resulting in wasted resources and costs that exceed the actual resource demand of the microservice.
Increased deployment complexity
Microservices stand alone and can be developed in a wide array of programming languages. But each language depends on its own libraries and frameworks, so every additional language in play brings a completely different set of dependencies to deploy and maintain. This grows resource overhead (and costs) and makes deployment a complex consideration.
But these obstacles aren’t insurmountable. This is where groundbreaking container technology like Docker can step in and fill existing gaps.
Docker brings the technology you need to make a microservice architecture work.
Docker to the Rescue for Microservices
Docker’s container technology, now emulated by other container platforms, helps address the biggest challenges of building a microservice architecture in the following ways.
Right-size resource allocation
Create a Docker container for each individual microservice. This solves the problem of resource bloat from over-provisioned instances idling under the almost nonexistent strain of a lone service, and multiple containers can run per instance.
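A minimal sketch of this pattern, assuming a hypothetical Python-based order service (the filenames and base image are illustrative):

```dockerfile
# Dockerfile for one microservice: only what this service needs to run
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "order_service.py"]
```

Several such containers can then share a single instance—for example, running both `example/orders` and `example/payments` on one host—with each container consuming only the resources its service actually needs.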
Support multiple coding languages
Divvy up all the dependencies required to run a language, including libraries and framework information, into linked containers to simplify managing multiple platforms.
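For example, two hypothetical services written in different languages can run side by side on the same host, each container carrying its own runtime and libraries (image names are assumptions for illustration):

```shell
# each image bundles its own language runtime and dependencies,
# so the host needs neither a JVM nor a Node.js installation
docker run -d --name payments example/payments-java   # image built FROM eclipse-temurin:21
docker run -d --name orders   example/orders-node     # image built FROM node:20-slim
```

The host only needs Docker itself; every language-specific concern stays inside its container.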
Decentralize data storage
Use containers to host one or more data volumes, then reference them from other microservices and containers. Chris Evans at ComputerWeekly explains the concept:
“The benefit of this method of access is that it abstracts the location of the original data, making the data container a logical mount point. It also allows ‘application’ containers accessing the data container volumes to be created and destroyed while keeping the data persistent in a dedicated container.”
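The data-container pattern Evans describes can be sketched with the Docker CLI (image names here are hypothetical). The `--volumes-from` flag mounts the data container’s volumes into an application container:

```shell
# create a dedicated data container exposing a volume at /data
docker create --name app-data -v /data example/data-image

# application containers mount the same volume and can be
# created and destroyed while the data persists in app-data
docker run -d --volumes-from app-data example/orders-service
```

On current Docker versions, named volumes (`docker volume create`) achieve the same persistence with less ceremony, but the logic is the same: the data outlives any one application container.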
Monitor individual services
Gain deep insight into data flow within your architecture by monitoring individual container logs with powerful tools like Sumo Logic for logging and machine learning, saving your team time and accelerating the continuous delivery pipeline.
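As a sketch, per-container logs can be inspected directly or shipped to a central collector via a logging driver (the collector address below is a placeholder, not a real endpoint):

```shell
# tail one container's log stream
docker logs -f orders

# or forward a container's logs to a central endpoint at launch
docker run -d --log-driver=syslog \
  --log-opt syslog-address=udp://collector.example.com:514 \
  example/orders-service
```

Centralizing the streams this way is what makes cross-service analysis with a tool like Sumo Logic practical once dozens of containers are in play.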
Follow these five guidelines for gaining mastery over your microservice environment.
5 Patterns to Enable Your Architecture
Designing an efficient microservice architecture is no accident. Sumo Logic’s own Mike Mackrory outlines five patterns for staying in control of a complex environment powered by microservices:
1. Cultivate a solid foundation. Everything starts with people, so make sure yours are ready to live and breathe in a microservices world.
2. Begin with the API. Simple math: one microservice starts with one API.
3. Ensure separation of concerns. Each microservice must have a single, defined purpose. If it starts feeling like a service should take on another responsibility, add a new microservice (and a new API) instead.
4. Production approval through testing. Write comprehensive testing parameters for each microservice, then combine them into a full testing suite for use in your continuous delivery pipeline.
5. Automate Deployment. And everything else. Automate code analysis, security scans, pass/fail testing, and every other possible process in your microservice environment.
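Pattern 5 above might be sketched as a per-commit pipeline like the following, where the image name, test runner, and scanner are all assumptions rather than prescriptions:

```shell
# build, test, scan, and publish one microservice image per commit
docker build -t example/orders:"$GIT_SHA" .
docker run --rm example/orders:"$GIT_SHA" python -m pytest   # pass/fail tests run inside the image
docker scout cves example/orders:"$GIT_SHA"                  # security scan (or any scanner you prefer)
docker push example/orders:"$GIT_SHA"
```

Any step that fails stops the pipeline, so only images that pass analysis, testing, and scanning ever reach production.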
Build your teams and your overall approach to a microservice architecture gradually, carefully, and in the same DevOps spirit of continual feedback and improvement.
Make the Move to Microservices
Microservices are a new approach to the old goal of making software collaborate and scale. The long-term efficacy of the approach is still to be determined, but there’s no denying the capabilities it brings to designing and managing complex infrastructures in a DevOps environment. Want to dive deeper into the worlds of microservices and Docker? Learn more about benchmarking microservices and check out the power and versatility of the Sumo Logic App for Docker.