[Tutorial] Canary Releases

July 8, 2020

As application architecture evolves from monolithic to microservices, the strategies at every stage of the application lifecycle must also evolve to take advantage of the new architecture and to ensure the application's performance, uptime, security, and user experience.

What is a Canary Release?

Canary release is an approach to safer application delivery: delivering new software to end users without disrupting their experience. Techniques like Canary and Blue/Green fall within the practice of Progressive Delivery, which allows organizations to better manage risk by slowing down and controlling how many end users have access to the new software as it is deployed to production.

Unlike Blue/Green, which routes all incoming traffic to the new service (Green) and rolls back to the prior version (Blue) only in the event of a failure, the canary release approach sends a small portion of traffic to the new version of the service. Once the new version is validated to be operating correctly, more traffic is routed to it, until finally all traffic is transferred to the new version. Canary release offers a more controlled release vehicle, with as many validation checks as the engineering team wants.
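
To make that concrete: with an Envoy-based gateway such as Gloo (the gateway used in this tutorial), the split is typically expressed as a weighted route across the two versions of the service. The sketch below shows roughly what a 90/10 split could look like; the upstream names, namespace, and weights are hypothetical placeholders, not configuration taken from the tutorial.

    apiVersion: gateway.solo.io/v1
    kind: VirtualService
    metadata:
      name: example-app              # hypothetical name
      namespace: gloo-system
    spec:
      virtualHost:
        domains:
          - '*'
        routes:
          - matchers:
              - prefix: /
            routeAction:
              multi:
                destinations:
                  - weight: 90       # current version keeps most of the traffic
                    destination:
                      upstream:
                        name: example-app-v1
                        namespace: gloo-system
                  - weight: 10       # canary receives a small portion
                    destination:
                      upstream:
                        name: example-app-v2
                        namespace: gloo-system

Promoting the canary then amounts to adjusting the weights until the new version receives all traffic and the old destination is removed.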


Role of API Gateways in Canary Releases

API or Edge Gateways provide a point of control to set up routes to manage, secure, and observe traffic coming from external clients and end users to backend application services. 

Common questions answered by using API Gateways for Canary Releases include:

  • How do we bring a new application online?
  • How do we upgrade an application?
  • How do we divide responsibilities across the platform, ops, and development teams?

Gloo API Gateway has a flexible deployment architecture that leverages Envoy as the edge proxy to support canary release use cases, from simple to complex, across diverse application portfolios. At a high level, the canary release workflow with Gloo looks like the diagram below. Gloo's flexibility allows organizations to automate the release process, integrate it with other tools, scale it across multiple teams, and more.

[Diagram: high-level canary release workflow with Gloo]
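
In Gloo's model, each backend service the gateway can route to is represented as an Upstream; Gloo can discover these automatically for Kubernetes services, or they can be declared explicitly. A minimal sketch of a Kubernetes-backed Upstream, with a hypothetical service name, namespace, and port:

    apiVersion: gloo.solo.io/v1
    kind: Upstream
    metadata:
      name: example-app-v1           # hypothetical upstream name
      namespace: gloo-system
    spec:
      kube:
        serviceName: example-app-v1  # hypothetical Kubernetes service
        serviceNamespace: example
        servicePort: 8080

Routes defined in a VirtualService, like the weighted split sketched earlier, reference Upstreams as their destinations, and Gloo translates that configuration into Envoy routing rules at the edge.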

Hands-on Tutorial

Get started with canary releases through two options for a step-by-step tutorial using Kubernetes, Gloo API Gateway, and ecosystem tools: instructions to set up your own demo environment, or a hosted lab to try the workflow in your browser. 


Demo Environment 

This three-part series uses a combination of blog posts and publicly accessible demo code to guide you through setting up the environment and running different canary release scenarios. 

Part 1: A two-phased canary rollout workflow starts by shifting a small subset of traffic to the new version so you can safely run tests and verify correctness. It then progressively shifts more traffic while monitoring the new version under load, and eventually decommissions the old version of the software. 
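
A common way to implement the first phase is to subset the application's Upstream by a version label and add a route that sends only explicitly tagged test requests to the new subset, so regular users stay on v1 while the team verifies v2. The sketch below illustrates the idea; the header name, labels, and resource names are hypothetical rather than the tutorial's exact configuration.

    # Subset the upstream by the pods' "version" label (hypothetical names).
    apiVersion: gloo.solo.io/v1
    kind: Upstream
    metadata:
      name: example-app
      namespace: gloo-system
    spec:
      kube:
        serviceName: example-app
        serviceNamespace: example
        servicePort: 8080
        subsetSpec:
          selectors:
            - keys:
                - version
    ---
    # Send requests carrying a hypothetical "stage: canary" header to the v2 subset;
    # everything else continues to the v1 subset.
    apiVersion: gateway.solo.io/v1
    kind: VirtualService
    metadata:
      name: example-app
      namespace: gloo-system
    spec:
      virtualHost:
        domains:
          - '*'
        routes:
          - matchers:
              - prefix: /
                headers:
                  - name: stage
                    value: canary
            routeAction:
              single:
                upstream:
                  name: example-app
                  namespace: gloo-system
                subset:
                  values:
                    version: v2
          - matchers:
              - prefix: /
            routeAction:
              single:
                upstream:
                  name: example-app
                  namespace: gloo-system
                subset:
                  values:
                    version: v1

In the second phase, the catch-all route is replaced with a weighted multi-destination route (as sketched earlier) so that load can shift gradually from the v1 subset to v2 before v1 is decommissioned.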

All resources for Part 1 of the tutorial are included in the blog post and GitHub repo.

Part 1 of the tutorial outlines how to:

  • Set up the environment and expose the appropriate systems and APIs externally
  • Deploy v2 of the application 
  • Add and test subset routes
  • Shift traffic with weighted destinations
  • Simulate load testing
  • Decommission application v1
  • Explore advanced topics


Part 2: Build on Part 1 to scale the canary release workflow across multiple services owned by different teams, with the ability to gracefully handle configuration errors. This is an important consideration, as responsibilities are often spread across different roles within a team or organization on a shared application platform. The goal of this tutorial is to design a workflow that lets teams operate in parallel and removes bottlenecks, while mitigating the risk that one team's invalid configuration blocks another team's progress. 
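
One Gloo feature that maps well to this kind of split ownership is route delegation: a platform-owned VirtualService hands a URL prefix off to a team-owned RouteTable in that team's namespace, and Gloo's configuration validation can reject an invalid RouteTable without disturbing routes owned by other teams. A rough sketch with hypothetical names and namespaces:

    # Platform team: owns the domain and delegates a prefix to an application team.
    apiVersion: gateway.solo.io/v1
    kind: VirtualService
    metadata:
      name: app-routes
      namespace: gloo-system
    spec:
      virtualHost:
        domains:
          - '*'
        routes:
          - matchers:
              - prefix: /example
            delegateAction:
              ref:
                name: example-routes
                namespace: example     # team-owned namespace (hypothetical)
    ---
    # Application team: owns its routes, including any canary weighting, in its own namespace.
    apiVersion: gateway.solo.io/v1
    kind: RouteTable
    metadata:
      name: example-routes
      namespace: example
    spec:
      routes:
        - matchers:
            - prefix: /example
          routeAction:
            single:
              upstream:
                name: example-app
                namespace: gloo-system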

All resources for Part 2 of the tutorial are included in the blog post and GitHub repo.

Part 2 of the tutorial outlines how to:

  • Explain and evaluate three options for scaling the workflow
  • Deploy multiple applications to different namespaces
  • Set up upstream destinations, route tables, and virtual services
  • Run the two-phased canary workflow as two teams with different services
  • View validation errors and remediate invalid configuration


Part 3: Operationalize the canary release workflow with Helm so developers can manage their own versioning and invoke canary releases themselves, making the workflow easier to execute while minimizing maintenance and the risk of misconfiguration.
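
For example, a chart for this workflow might expose the canary knobs as values, so a developer can bump a version or adjust a traffic weight with a single helm upgrade. The structure below is a hypothetical sketch of such a values file, not the chart built in the tutorial:

    # values.yaml (hypothetical sketch)
    appName: example-app
    namespace: example
    deployment:
      version: v2                # the version being canaried
      replicas: 1
      image: example/app:2.0     # hypothetical image reference
    canary:
      enabled: true
      weight: 10                 # share of traffic routed to the new version

Templates for the Deployment, Service, Upstream, and route configuration then render from these values, keeping each release repeatable and reviewable.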

All resources for Part 3 of the tutorial are included in the blog post and GitHub repo.

Part 3 of the tutorial outlines how to:

  • Set up the Helm chart
  • Break down requirements and design Helm chart values
  • Create templates for deployment, service, upstream, and route table
  • Run through the two-phased canary workflow using Helm


Online Lab Environment

For a guided experience with canary releases using Gloo API Gateway, without having to install or configure any software, a hosted lab environment is available on Katacoda. Start with the canary release tutorial and then try the many other courses on traffic management, security, and more.

Start the tutorial


More Resources

Learn more about Gloo API Gateway and the canary release use case to get started with progressive delivery. Read the docs for more detail and check out the integration with Flagger by Weaveworks.
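
Flagger can drive this kind of canary automatically: it watches a Deployment, shifts traffic through the gateway in small steps, and promotes or rolls back based on metrics. A rough sketch of a Flagger Canary resource targeting a Gloo-managed service, with hypothetical names, ports, and thresholds:

    apiVersion: flagger.app/v1beta1
    kind: Canary
    metadata:
      name: example-app
      namespace: example
    spec:
      provider: gloo                 # use Gloo for traffic shifting
      targetRef:
        apiVersion: apps/v1
        kind: Deployment
        name: example-app            # hypothetical deployment
      service:
        port: 8080
      analysis:
        interval: 30s                # how often to advance the canary
        threshold: 5                 # failed checks tolerated before rollback
        maxWeight: 50                # stop shifting at 50% before promotion
        stepWeight: 10               # shift 10% of traffic per step
        metrics:
          - name: request-success-rate
            thresholdRange:
              min: 99                # require at least 99% successful requests
            interval: 1m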

Watch this talk and demo


Download the presentation