Microservices describes a cloud-native architectural approach to software development that structures an application as a collection of loosely coupled services that communicate with each other via APIs or messaging protocols. Each service is autonomous, self-contained and runs in its own process. As developers seek to build scalable and resilient applications, microservices have become increasingly popular.
Microservices, also known as microservices architecture, is a type of software architecture used in cloud-native application development. Applications engineered with this design comprise small, loosely coupled, independently deployable components that together provide the capabilities of the application.
Each service in a microservices architecture performs a distinct business function and communicates with other microservices through well-defined interfaces, most commonly RESTful APIs.
Departing from the monolithic application developed as a single unit, microservices architecture allows developers to build with modules they can independently develop, test and deploy, which accelerates time to market. Because the decoupled nature of microservices allows developers to push new code and functionality more frequently than they otherwise could, modern applications can keep pace with evolving customer needs.
More than three-quarters of businesses have pivoted to microservices, replacing their monolithic applications hosted on individual web servers with containerized, cloud-native applications distributed across a cluster of host servers.
Service-oriented architecture (SOA) has been around since the early 2000s as a way to build large, distributed systems by decomposing them into smaller, loosely coupled services. Microservices architecture emerged as a natural evolution of SOA.
The concept of microservices was introduced by Fred George in a 2011 workshop on software architecture. George had been trying to solve scalability issues with SOA while working on an e-commerce site and came up with the idea of building small, autonomous services.
Microservices architecture took SOA principles of service orientation and refined them for modern cloud-native applications. The coarse-grained services of SOA became fine-grained, granular “micro” services, which made them highly efficient and provided the flexibility to match a technology stack to a given service. Microservices architecture even reduced the communication load by replacing cumbersome SOAP APIs with lightweight options, such as REST APIs or message queues.
Microservices soon gained popularity among software architects, and companies like Netflix, Amazon, The Guardian and Spotify began adopting this architecture.
Microservices provide a framework for building cloud-native applications that can adapt to changing business requirements. The myriad benefits arise from the architecture of the application.
Microservices architecture lends itself to independent development and deployment. Unlike monolithic applications, where changing a line of code involves updating the entire application, developers can modify or replace services without affecting the rest of the distributed system. The ability to deploy individual services makes it easy to add new features or roll back changes to an application.
Scaling an entire application to relieve pressure on a single component isn't optimal. With microservices, only the components that require scaling are scaled. Developers can address an individual service as needed, which ultimately facilitates better performance under heavy loads, the efficient use of resources and lower infrastructure costs.
In microservices architecture, the services of a cloud-native application aren't required to share a common stack and database. Developers can choose the tools they prefer and the technologies that meet the distinct requirements of individual services.
Developers can write microservices in any language — and connect them to microservices programmed in any language. What’s more, microservices can run on any platform, which makes them available to integrate with legacy systems.
Microservices architecture allows developers to build modular services they can reuse across applications. By working with reusable components, programmers reduce development time and improve the quality of code as they invest in a "write once, reuse often" culture.
Microservices architecture promotes resilience. With services designed to operate autonomously, the failure of a single service rarely shuts down the application, as tends to happen with monolithic applications.
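A minimal sketch of how that containment can look in code, assuming a hypothetical product-page service that treats its recommendations dependency as optional (the service names and data are invented for the example):

```python
# Sketch of graceful degradation: a hypothetical product-page service
# treats its recommendations dependency as optional. If that one
# service is down, the page still renders — the failure stays contained.

def fetch_recommendations():
    # Stand-in for a network call to the recommendations microservice,
    # simulated here as being unavailable.
    raise ConnectionError("recommendations service unavailable")

def render_product_page(product):
    try:
        recs = fetch_recommendations()
    except ConnectionError:
        recs = []  # degrade: show the page without recommendations
    return {"product": product, "recommendations": recs}

page = render_product_page("sku-123")
print(page)  # {'product': 'sku-123', 'recommendations': []}
```

The key design choice is that the caller decides which dependencies are essential and which can fail without failing the request, rather than letting one outage cascade through the application.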
Microservices architecture enables teams to work on different services simultaneously, which translates into faster time-to-market. While developers make decisions without needing to coordinate with other teams, microservices also promote cross-team collaboration, as each team is responsible for developing and maintaining a part of the whole. This can lead to better alignment with business goals and more efficient use of resources.
The application built with microservices is built to evolve. Developers can quickly deploy core microservices as a minimum viable product and upgrade the application as teams complete additional services. New technologies can be incorporated into the design as they emerge. The microservices-based application remains a work in progress, continuously moving toward theoretical perfection.
While container-based microservices provide many benefits, they’re not always the right application architecture to choose. When making software engineering decisions, think about your goals for the application, as well as development hurdles and needs you foresee in view of the application’s lifespan. Microservices work best with complex applications. Scenarios to consider using microservices include:
If you're building a large and complex application, microservices will allow you to divide the application into manageable pieces, making it easier to develop, deploy and maintain.
Microservices architecture can accommodate independent services with different development rates. Even if a service sees an unexpected delay, the project can continue without global implications to the application development timeline.
Microservices architecture is ideal for applications that require frequent updates, as independent services allow developers to modify a single module instead of redeploying the entire application.
If your application needs to handle a high volume of traffic or needs to scale rapidly, microservices are essential. This is particularly true if you need to scale specific parts of the application rather than the entire application.
If you have multiple development teams working on the same application, microservices will help you maintain agility and efficiency. Each team can work on their microservice, using the technology stack that works for them, without worrying about the rest of the application.
If you want to build an application with a decentralized architecture, microservices are autonomous and can be deployed in different locations, even among different cloud service providers.
If you're planning on a hybrid cloud architecture, where some services will continue running on-premises and others will run in the cloud, microservices will help you manage the complexity of the application.
Microservices architecture requires careful planning. Certain technologies and practices common to the production environment enable developers to effectively develop, maintain and operate microservices-based applications.
DevOps practices, including CI/CD, are essential to the architectural approach of microservices. Unlike monolithic applications, microservices are inherently complex distributed systems with numerous moving parts and independent tech stacks. This complexity requires frequent collaboration between development and operations teams to ensure that components are seamlessly integrated.
DevOps practices provide the necessary collaboration, communication and automation tools to effectively bring teams together throughout the entire software development lifecycle.
Continuous delivery goes hand in hand with microservices, allowing developers to release software updates frequently and reliably. Infrastructure automation tools, such as continuous integration servers, deployment pipelines and automated testing frameworks, streamline the CI/CD process.
Continuous delivery is especially important to ensure that each service can be updated and released independently of the other microservices.
Microservices communicate with other microservices — most often within web applications — which makes REST a natural fit. REST, or Representational State Transfer, is an architectural style for building APIs that allow services to communicate via HTTP in standard formats, such as XML, HTML and JSON. REST APIs are foundational to microservices-based apps for several reasons.
REST APIs are lightweight and platform-agnostic, meaning they provide a standardized interface that enables microservices to communicate regardless of their underlying technology. Because each request contains all the information needed to complete it, REST APIs don't require context to be stored on the server. They can handle large volumes of requests without compromising performance, and services in a REST-based microservices architecture can evolve independently, communicating efficiently in a stateless manner.
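As a deliberately minimal illustration of stateless, JSON-over-HTTP communication, the sketch below runs a hypothetical inventory service using only Python's standard library and queries it the way another service would. The service name, route and stock data are all invented for the example:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical "inventory" microservice exposing one REST endpoint.
# Each request is self-contained (stateless): no session data is kept
# on the server between calls.
STOCK = {"sku-123": 7, "sku-456": 0}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        sku = self.path.rstrip("/").split("/")[-1]
        body = json.dumps({"sku": sku, "in_stock": STOCK.get(sku, 0)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), InventoryHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Another service (or client) consumes the API over plain HTTP + JSON.
port = server.server_address[1]
with urlopen(f"http://127.0.0.1:{port}/inventory/sku-123") as resp:
    payload = json.loads(resp.read())
print(payload)  # {'sku': 'sku-123', 'in_stock': 7}
server.shutdown()
```

Because the response depends only on the request itself, any replica of the service could have answered it — which is exactly what makes stateless REST services straightforward to scale horizontally.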
While microservices give teams the option to choose their service's language and framework, working with various languages in the same CD pipeline poses challenges. Containers abstract away the variance between services, as each microservice becomes a self-contained unit packaged with its code, runtime and dependencies. The CD pipeline, now homogeneous, can perform consistent testing of each container.
Services are able to interact without interfering with each other when separated by containers, and once deployed, containers provide a lightweight, portable runtime environment that allows services to function consistently across platforms. Tools like Docker and Kubernetes are widely used to manage containerized microservices.
An orchestration tool like Kubernetes can abstract the underlying infrastructure and automate container management, deployment and scaling across multiple servers. Its extensibility also enables developers and operators to use their preferred open-source and commercial software tools, reducing the manual work of container management.
Serverless computing is another option for deploying microservices. Serverless architectures use functions as a service (FaaS) platforms to create even smaller units of deployment and scale on demand. Though serverless architectures can increase vendor dependencies, they offer reduced operational cost, complexity and engineering lead time.
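FaaS platforms invoke a small handler function per event rather than running a long-lived server process. The sketch below follows the AWS Lambda handler convention for Python (other providers use similar event-driven entry points); the event shape and response body are invented for illustration:

```python
import json

# Sketch of a FaaS-style unit of deployment, following the AWS Lambda
# handler convention. The platform scales instances of this function
# on demand; there is no server process for the team to manage.

def handler(event, context):
    # "event" carries the input; "context" carries runtime metadata.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Locally, the handler is just a function we can call directly.
result = handler({"name": "microservices"}, None)
```

In production the platform, not the developer, wires events (HTTP requests, queue messages, storage triggers) to this entry point, which is where the reduced operational complexity comes from.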
Designing a microservices architecture requires careful planning and consideration. To build successful microservices-based applications, developers should observe established best practices.
eBay, Etsy, Uber — countless enterprises have dismantled their monolithic applications and refactored them into microservices-based architectures to capitalize on scaling advantages, business agility and financial wins.
Organizations that plan to transition to microservices should first adopt DevOps, which will set them up to manage the complexities they'll encounter. On a basic level, anticipate the stages outlined below when mapping out your project.
The first step in migrating to microservices architecture is to identify the business capabilities or features your application needs to support. This will help you define the scope of your application and inform your decisions about which capabilities should be prioritized for development, as well as how those microservices should be designed and integrated with one another.
Most organizations use domain-driven design or feature-based decomposition to decompose their monolithic applications.
Having identified the application’s business capabilities, define the service boundaries for each microservice, ensuring that each microservice has a distinct, well-defined responsibility. You’ll want to map dependencies between the business capabilities, datastores and external systems. Based on the bounded contexts and dependencies, define the microservices that will replace the monolithic application.
Each microservice, focused on a single business capability, should have clear interfaces. Review how the data entities are accessed and, finally, consider how to partition the data to reduce dependencies between services.
Implement the service interfaces for each microservice, ensuring that the interface reflects the microservice’s sole responsibility. You can use different techniques, such as RESTful APIs or messaging protocols, to define the service interfaces.
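To illustrate the messaging-protocol alternative to REST, here is a minimal sketch of asynchronous, event-based communication between two hypothetical services. An in-process queue from Python's standard library stands in for a real message broker (such as RabbitMQ or Kafka); the service and event names are invented:

```python
import queue

# Sketch of asynchronous, message-based communication between two
# hypothetical services. In a real deployment a message broker
# (e.g. RabbitMQ, Kafka) would sit between them; an in-process
# queue stands in for it here.

order_events = queue.Queue()

def order_service_place_order(order_id):
    # The order service publishes an event and moves on; it does not
    # wait for consumers to process it.
    order_events.put({"event": "order_placed", "order_id": order_id})

def shipping_service_poll():
    # The shipping service consumes events at its own pace.
    event = order_events.get_nowait()
    return f"shipping order {event['order_id']}"

order_service_place_order("A-100")
print(shipping_service_poll())  # shipping order A-100
```

The design choice to note: the publisher never calls the consumer directly, so the two services stay decoupled and either one can be redeployed or scaled without the other knowing.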
Depending on your requirements and expertise, choose the programming languages and frameworks to implement the services. Iterate on the design as needed, including testing the new interfaces, communication protocols and datastores.
Once you have implemented and tested the services, you'll want to containerize them using a container technology such as Docker. Containerization will enable you to deploy and manage the services independently.
Automate the orchestration of the services using tools such as Kubernetes or Docker Swarm. In addition to efficiently streamlining the deployment of services, automation through Kubernetes or Docker Swarm will improve the reliability and availability of the application. Either platform can detect when a service instance fails or becomes unresponsive and take action to remediate the issue. Kubernetes, for example, can restart failed instances or reschedule them to other nodes, while Docker Swarm can reschedule a failed container's tasks onto another node.
Decomposing a monolithic application is not a one-time process. It requires maintenance and updates as the needs of the application and its users evolve. Monitor the new microservices and track key metrics, such as response time and resource utilization.
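As a minimal sketch of the response-time half of that monitoring, the snippet below times each call to a stand-in service function and aggregates the latencies in memory. A real deployment would export such metrics to a monitoring system (Prometheus is one common choice; the function names here are invented for illustration):

```python
import time

# Minimal sketch of tracking one key metric — response time — per
# service call. Real deployments would export these measurements to a
# monitoring system; here they are simply aggregated in memory.

latencies_ms = []

def timed_call(service_fn, *args):
    start = time.perf_counter()
    result = service_fn(*args)
    latencies_ms.append((time.perf_counter() - start) * 1000)
    return result

def lookup(sku):
    # Stand-in for a call to a microservice.
    return {"sku": sku}

timed_call(lookup, "sku-123")
avg = sum(latencies_ms) / len(latencies_ms)
```

Tracking per-service latency like this is what lets teams spot which individual microservice needs attention, rather than only seeing aggregate application behavior.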
Highly distributed, cloud-native microservices applications introduce security complexities. Instead of a single entry point, they come with dozens, if not hundreds, of potential points of vulnerability — and each must be secured. APIs and code dependencies represent but two sources of risk in the expanding attack surface of the modern application.
Modern applications consume input from a range of sources — standard web requests, mobile device API calls, cloud events, IoT device telemetry communication, cloud storage, etc. What’s more, a single client’s web request (i.e., north-south traffic) can spawn hundreds of API calls between internal microservices (i.e., east-west traffic).
The complexity of API-centric web applications requires scalable, flexible and multilayered strategies that work for any type of workload in any type of environment or cloud architecture. Securing the frontend web interface of the cloud-native application isn't enough. Cloud-native applications require application-layer protection for cloud-native APIs. Web application and API security (WAAS) is essential.
Open-source software components make up approximately 70% of cloud-native applications. While this expedites development, many open-source packages and their dependencies contain vulnerabilities. Moreover, each version of a dependency-driven OSS package can change critical functionality. Without full visibility, vulnerabilities go undetected.
Standalone software composition analysis (SCA) tools surface open-source risks too late in the development lifecycle, which causes a backlog of vulnerabilities that can’t always be resolved. Separate tools for SCA and IaC security result in noisy alerts without context and knowledge of interconnected risks. Because gaps are inevitable without runtime and workload coverage, it’s best to secure cloud-native applications with integrated cloud-native security.
Identifying and prioritizing critical risks across the entire cloud-native application, a cloud-native application protection platform (CNAPP) integrates multiple types of security to deliver comprehensive, code-to-cloud protection — cloud security posture management (CSPM), cloud workload protection, cloud infrastructure entitlement management (CIEM), Kubernetes security posture management (KSPM), infrastructure-as-code security, WAAS, SCA and more.
Cloud security leadership exploring how best to secure the rapid development needs of applications employing cloud-native technologies, such as containers, microservices and serverless functions, should consider adopting a CNAPP.