What is microservices security?
A microservices architecture breaks down the traditional monolithic software deployment model into independent, distributed microservices that developers can deploy and scale separately. This software development approach builds a single application as a collection of small services. Each service runs in its own process and communicates through lightweight mechanisms such as HTTP-based APIs.
Developers build microservices around business capabilities, using automation to deploy them independently. A key advantage of the microservices architecture is that it enables writing services in various programming languages and using different data storage technologies.
A microservices architecture is highly distributed and dynamic, introducing unique security risks. To address these risks, DevOps teams require a new approach to security. Ideally, teams should implement security into the design and architecture patterns, integrating security measures across the entire software development lifecycle (SDLC).
Security challenges in microservices architecture
Increased attack surface
Microservices communicate through APIs that are independent of machine architecture and programming language, which increases the attack surface. Additionally, the many interacting services multiply the possible points of critical failure. As a result, DevOps and security teams need to stay one step ahead of microservice interruptions: once a microservice stops operating, it becomes difficult to determine how it contributes to security threats or whether an attacker is already exploiting it.
Isolation
A microservices application relies on isolation to enable building, testing, deploying, and scaling an application in a way that ensures microservices do not affect each other. Teams can also implement isolation at the database level to ensure each microservice has its own copy of data and cannot access the other microservices’ data. Implementing isolation at all layers enables teams to harden the security of their microservices-based application.
Logging
A DevOps microservices ecosystem is spread out. Microservices are distributed, stateless, and independent, generating far more logs than a monolithic application, and this volume of logs can camouflage issues.
Since microservices run across multiple hosts, teams must ship logs from all hosts to a single, centralized external location. Effective microservices security relies on correlating user logs with events across several platforms. It requires a higher observation viewpoint, independent of any single service or API.
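As a minimal sketch of this kind of correlation, the snippet below merges log records shipped from two hypothetical hosts and groups them by a correlation ID that each request carries across service boundaries. The field names and records are illustrative, not any particular logging stack's format:

```python
from collections import defaultdict

# Hypothetical log records shipped from two hosts to a central store.
# Each record carries a correlation ID propagated with every request.
logs_host_a = [
    {"ts": 3, "correlation_id": "req-1", "service": "orders", "msg": "timeout calling payments"},
    {"ts": 1, "correlation_id": "req-1", "service": "gateway", "msg": "request received"},
]
logs_host_b = [
    {"ts": 2, "correlation_id": "req-1", "service": "payments", "msg": "auth check failed"},
    {"ts": 1, "correlation_id": "req-2", "service": "gateway", "msg": "request received"},
]

def correlate(*streams):
    """Group log records from all hosts by correlation ID, ordered by time."""
    grouped = defaultdict(list)
    for stream in streams:
        for record in stream:
            grouped[record["correlation_id"]].append(record)
    for records in grouped.values():
        records.sort(key=lambda r: r["ts"])
    return dict(grouped)

timeline = correlate(logs_host_a, logs_host_b)
# All events for req-1, across both hosts, in causal order:
for record in timeline["req-1"]:
    print(record["service"], "-", record["msg"])
```

With a correlation ID in place, a single request's path through the gateway, orders, and payments services can be reconstructed even though each service logged on a different host.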
Fault tolerance
Fault tolerance is more complex in a microservices environment than in a monolithic system. It requires building services that can cope with service failures and timeouts that occur for unknown reasons. If failures accumulate, they can affect other services and lead to cascading failures. Teams can address this challenge by focusing on interdependence and adopting a new model to ensure stability across all services. However, this can become difficult without a centralized microservices security platform.
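One common way to keep accumulated failures from cascading is a circuit breaker, which fails fast once a downstream service keeps erroring instead of letting requests pile up. The sketch below is a minimal, illustrative implementation; the thresholds and names are assumptions, not a specific library's API:

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: after `max_failures` consecutive errors,
    calls fail fast for `reset_after` seconds instead of piling up."""
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0  # success resets the failure count
        return result

breaker = CircuitBreaker(max_failures=2, reset_after=60.0)

def flaky_call():
    raise TimeoutError("downstream timed out")

for _ in range(2):
    try:
        breaker.call(flaky_call)
    except TimeoutError:
        pass  # counted as consecutive failures

try:
    breaker.call(flaky_call)
except RuntimeError as exc:
    print(exc)  # circuit open: failing fast
```

Failing fast gives the unhealthy service time to recover and keeps its callers from exhausting their own threads and connections on doomed requests.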
Caching
Caching can help reduce the frequency of requests. However, as caching grows to handle more services, it can increase complexity and the amount of interservice communication. Teams can address this challenge by ordering, automating, and optimizing this communication.
Microservices security best practices
Build security into the design
A microservices initiative requires development, operations, and security teams to collaborate on building security across the entire SDLC and to standardize their efforts to mitigate security risks. Ideally, DevOps or DevSecOps teams should build security into the design, then build and test using secure coding practices, with automation to test continuously. These practices help teams build a solid security foundation and catch issues early.
Protect your data
It is critical to protect data across the entire SDLC. Teams can implement various measures into their microservices architecture. Common methods include:
- Encryption—helps secure data at rest. Encrypting data early and decrypting it only when needed can help minimize the chances of exposure.
- HTTPS—secures data in transit. HTTPS encrypts the connection of data transmitted over the public Internet to protect its integrity and privacy.
- HTTP Strict Transport Security (HSTS)—a response header that tells browsers to access endpoints only when using HTTPS communication.
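As a small illustration of the last point, a service (or the gateway in front of it) can attach HSTS and related hardening headers to every HTTPS response. The `max-age` value and the companion headers below are illustrative choices, not requirements:

```python
def security_headers():
    """Response headers a service (or its gateway) can attach to every
    HTTPS response; the values here are illustrative, not mandated."""
    return {
        # Tell browsers to use HTTPS only for the next two years,
        # including subdomains:
        "Strict-Transport-Security": "max-age=63072000; includeSubDomains",
        # Hardening headers commonly sent alongside HSTS:
        "X-Content-Type-Options": "nosniff",
        "Content-Security-Policy": "default-src 'self'",
    }

headers = security_headers()
print(headers["Strict-Transport-Security"])
```

Setting these centrally at the gateway ensures every microservice behind it benefits without each team re-implementing the policy.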
Use the API gateway
A microservices architecture typically does not allow service consumers to communicate directly with microservices. Instead, the architecture uses an API gateway as a single entry point for this traffic, directing it to the relevant microservices.
An API gateway usually employs token-based authentication for managing services’ data privileges and determining how they can interact with certain data. Since clients cannot directly access services, they cannot exploit them. Placing the API gateway behind a firewall extends this protection to ensure all the application’s microservices are secure.
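To make token-based authentication concrete, the sketch below signs and verifies a token with an HMAC using only the Python standard library. The secret, claim names, and token format are illustrative stand-ins for whatever scheme (for example, JWT) a real gateway uses:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"gateway-signing-key"  # hypothetical shared signing secret

def issue_token(claims):
    """Sign the claims so the gateway can later verify them offline."""
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def verify_token(token):
    """Reject any token whose signature does not match its payload."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise PermissionError("invalid token signature")
    return json.loads(base64.urlsafe_b64decode(payload))

token = issue_token({"sub": "client-42", "scope": "orders:read"})
claims = verify_token(token)
print(claims["scope"])  # orders:read
```

Because the gateway verifies the signature rather than looking the client up on every request, it can enforce each client's data privileges at the single entry point before any microservice is reached.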
Implement a service mesh
A service mesh is a networking framework for adding observability, security, and reliability to distributed applications. It is important to secure inter-service communications to ensure that if attackers compromise one service, they cannot move laterally to the rest of the application. Service mesh technologies enable mutual transport layer security (mTLS) connections between services. This introduces a security model that enables features like authentication, authorization, and encryption for your microservices communication.
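As a rough illustration of what mTLS requires from each endpoint, the sketch below configures a server-side TLS context that rejects clients without a valid certificate. The file-path parameters are placeholders; in a service mesh, the sidecar proxy typically provisions and rotates these certificates for you:

```python
import ssl

def mtls_server_context(certfile=None, keyfile=None, cafile=None):
    """Server-side TLS context that requires client certificates (mTLS).
    The paths are placeholders; a mesh sidecar normally injects them."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.verify_mode = ssl.CERT_REQUIRED  # reject clients without a valid cert
    if certfile:
        ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    if cafile:
        ctx.load_verify_locations(cafile=cafile)
    return ctx

ctx = mtls_server_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```

The key line is `CERT_REQUIRED`: with ordinary TLS only the server proves its identity, whereas mTLS makes both sides authenticate, which is what blocks lateral movement from a compromised service.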
Practice defense in depth
Defense in depth is a cybersecurity strategy that incorporates several layers of security controls within an application. Applying this strategy to a microservices-based application ensures that sensitive services have their own layers of security. As a result, threat actors who manage to compromise one microservice cannot reach other layers of the application.
Teams should not rely on a single security measure. Instead, they should implement all applicable security measures to create layers of security between microservices and potential threat actors. For example, teams can combine a network perimeter firewall, token-based identification, private addresses for sensitive microservices, and a strong monitoring layer that identifies abnormal behavior.
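The layering idea can be sketched as a chain of independent checks, each of which must approve a request before it is admitted. The layer names, rules, and request fields below are illustrative, not a specific product's behavior:

```python
# Each layer is an independent check; a request must pass all of them.
def firewall_layer(request):
    """Perimeter rule: drop traffic from known-bad addresses."""
    return request["source_ip"] not in {"203.0.113.7"}

def token_layer(request):
    """Stand-in for real token verification at the gateway."""
    return request.get("token") == "valid-token"

def anomaly_layer(request):
    """Crude behavioral check flagging abnormal request rates."""
    return request["requests_per_minute"] < 100

LAYERS = [firewall_layer, token_layer, anomaly_layer]

def admit(request):
    """Defense in depth: deny unless every layer independently approves."""
    return all(layer(request) for layer in LAYERS)

ok = admit({"source_ip": "198.51.100.5", "token": "valid-token",
            "requests_per_minute": 12})
print(ok)  # True
```

The point of the structure is that defeating any one layer (say, a stolen token) still leaves the attacker facing the others.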
Secure at the container level
Microservices applications employ containers for deployment. A container is based on a public or private image, both of which can potentially introduce security vulnerabilities. Regularly scanning container images can help ensure images do not introduce vulnerabilities or carry other security issues.
Teams should also implement the principle of least privilege to secure containers. It involves restricting access to resources and managing the usage of these resources. Ideally, teams should provide access to each resource on an as-needed basis. Additionally, never store secrets in a container.
Learn more in our detailed guide to microservices best practices. (coming soon)
Microservices Security with Solo
The Solo Gloo Platform integrates service mesh functionality from Gloo Mesh and API gateway capabilities from Gloo Gateway into a single platform designed to simplify how developers and SREs manage microservices security.
Gloo Mesh is built around Istio and has enriched the service mesh even further, identifying the gaps in Istio and addressing them. Gloo Mesh includes n-4 Istio version support with security patches to address Common Vulnerabilities and Exposures (CVE), as well as special builds to meet regulatory standards such as Federal Information Processing Standards (FIPS). The enterprise features also include multi-tenancy, global failover and routing, observability, and east-west rate limiting and policy enforcement through authorization and authentication plug-ins.
Gloo Gateway extends Envoy Proxy with a rich set of security, scalability, resiliency, cloud integration, and ease-of-use capabilities. Gloo Gateway’s architecture enables customers to significantly reduce their API gateway footprint compared to legacy API gateways, as well as improve overall scalability and reduce application latency. Gloo Gateway is part of the broader Gloo Platform framework for centralized deployments and policy management, integrated with GitOps best practices.
When considering how to address the unique security risks of a highly distributed and dynamic microservices environment, consider how Gloo Platform could provide the foundation for closing your microservices security gaps.