Announcing Gloo Edge 1.12 with Scalability, Response Caching, AWS Lambda, and more

Today, the Solo.io team released Gloo Edge 1.12, our latest release of the industry-leading Kubernetes-native, Envoy Proxy-based API gateway and Kubernetes ingress. The new version of Gloo Edge makes a major architectural change that improves the efficiency of the translation loop by 200-400%. With the latest release, Solo.io continues to lead the market in making Envoy Proxy the best choice to handle API connectivity and ingress routing to distributed microservices in public cloud, multi-cloud environments, and on-premises data centers.

Here are just some of the new features, grouped by the benefits they provide you.

 

Scalable and Simplified Control Plane

Gloo Edge is used in some of the largest and most sophisticated environments in the world. Financial companies route over 80% of their API traffic (billions of transactions per month) through Gloo Edge, and the service is ranked among the most critical in the entire organization, on par with the systems responsible for financial transactions. Telco companies are planning to run Gloo Edge and Envoy at even larger scale, with tens of thousands, and eventually more than 100,000, services.

To handle these future scaling demands, we put everything on the table to make Gloo Edge faster, and over the past few quarters we have made significant improvements.

To reach the scale our customers are moving toward, Solo.io decided to combine the Gateway and Gloo pods.

Above: Pre-v1.12 architecture of Gloo Edge

 

Above: New, faster, combined pod architecture of Gloo Edge 1.12

When we combined the pods and tested at scale, we measured:

  • 200-400% improvement in creating new routes
  • 130-270% improvement in validation

These results mean better scalability, better response times, faster transitions in failover scenarios, and an overall better API experience for our customers' applications.

The change in architecture will also help with future horizontal scaling initiatives in Gloo Edge. 

 

Response Caching

Multiple customers have asked for response caching in Gloo Edge, and we are proud to now offer this capability in the product. Some customers have very interesting use cases, such as building a mini internal CDN or forward proxy cache for resources that are primarily stored in AWS but accessed from internal services.

Envoy has a still-evolving response caching filter in open source, but there is a ways to go before it can be used in production. Gloo Edge adds a first-of-its-kind, production-ready response caching feature to Envoy.
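To give a feel for what this looks like, here is a minimal sketch of enabling caching on a gateway. The `caching` option and the caching-service Upstream reference are illustrative assumptions about the Gateway CRD shape, not the verbatim schema:

```yaml
# Hypothetical sketch: enable response caching on a Gloo Edge gateway.
# The caching-service Upstream name and the exact field names are assumptions.
apiVersion: gateway.solo.io/v1
kind: Gateway
metadata:
  name: gateway-proxy
  namespace: gloo-system
spec:
  bindAddress: '::'
  bindPort: 8080
  proxyNames:
    - gateway-proxy
  httpGateway:
    options:
      caching:
        cachingServiceRef:
          name: caching-service   # assumed Upstream pointing at the cache backend
          namespace: gloo-system
```

With gateway-level caching enabled, upstreams can then opt responses in or out of the cache through standard cache-control response headers.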

 

Lambda Integration – Rip and Replace AWS API Gateway

Some Gloo Edge customers use the product alongside AWS API Gateway, but more and more of them are looking to move away from AWS API Gateway, as I explained in the Gloo Edge 1.11 release blog [LINK]. The Solo.io team has added many features to Gloo Edge to support different Lambda use cases, such as cross-AWS-account support. However, while that AWS Lambda integration was good for net-new services, customers with hundreds or thousands of existing Lambda functions wanted a “rip and replace” option that parses responses exactly the way AWS API Gateway does. With Gloo Edge 1.12, existing AWS API Gateway customers can rip and replace with Gloo Edge with no changes to request or response formats.
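As an illustration, the snippet below sketches routing a path to an existing Lambda function through a Gloo Edge AWS Upstream. The Upstream and VirtualService shape follows the existing Gloo Edge AWS integration; the function name, route, and the `unwrapAsApiGateway` flag requesting API-Gateway-style response parsing are assumptions for illustration:

```yaml
# Hypothetical sketch: route /checkout to an existing Lambda function.
apiVersion: gloo.solo.io/v1
kind: Upstream
metadata:
  name: aws-lambda-upstream
  namespace: gloo-system
spec:
  aws:
    region: us-east-1
    secretRef:                 # AWS credentials stored as a Kubernetes secret
      name: aws-creds
      namespace: gloo-system
    lambdaFunctions:
      - lambdaFunctionName: checkout-handler   # assumed existing Lambda function
        logicalName: checkout
---
apiVersion: gateway.solo.io/v1
kind: VirtualService
metadata:
  name: checkout-vs
  namespace: gloo-system
spec:
  virtualHost:
    domains: ['*']
    routes:
      - matchers:
          - prefix: /checkout
        routeAction:
          single:
            upstream:
              name: aws-lambda-upstream
              namespace: gloo-system
            destinationSpec:
              aws:
                logicalName: checkout
                unwrapAsApiGateway: true   # assumed flag for API-Gateway-style parsing
```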

 

Build a Unified Data Graph with GraphQL Schema Stitching

Many organizations using GraphQL quickly evolve to combine multiple individual GraphQL APIs (subgraphs) into a single, unified GraphQL API (supergraph). Schema stitching is a solution developed in the GraphQL community to solve this use case. Gloo GraphQL introduced schema stitching as a beta feature in Edge 1.11 and we’re moving that to a GA feature in Edge 1.12. In the spirit of an open GraphQL ecosystem, we have also added the ability to incorporate remote GraphQL servers into a stitched GraphQL API defined with Gloo GraphQL. This provides our customers with an incremental approach to adopting GraphQL in Gloo Edge when they have already invested in an alternative GraphQL implementation. 
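For example, a stitched supergraph might be declared along these lines. The `GraphQLApi` kind and the `stitchedSchema.subschemas` structure follow Gloo GraphQL's CRDs, but the resource names below, and the idea of one subschema being backed by a remote GraphQL server, are illustrative assumptions:

```yaml
# Hypothetical sketch: stitch two subgraphs into one supergraph.
apiVersion: graphql.gloo.solo.io/v1beta1
kind: GraphQLApi
metadata:
  name: stitched-api
  namespace: gloo-system
spec:
  stitchedSchema:
    subschemas:
      - name: users-graphql-api      # assumed GraphQLApi for the users subgraph
        namespace: gloo-system
      - name: orders-graphql-api     # assumed GraphQLApi backed by a remote GraphQL server
        namespace: gloo-system
```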

 

Try Gloo Edge today!

Many of the enhancements above came from customer requests, so if you have ideas for other things you’d like to see, reach out to us on the #gloo-edge channel on the Solo.io Slack.