What is the NGINX API gateway?
An API gateway secures and orchestrates traffic between backend services and API consumers. Deployed as an API gateway, NGINX Plus receives all API requests from clients, determines which backend services each request requires, and delivers responses with high performance. NGINX can return API responses in under 30 milliseconds and handle thousands of requests per second.
Key features of the NGINX Plus API Gateway include:
- Authenticates API calls
- Routes requests to the appropriate backend
- Applies rate limiting to prevent service overload
- Mitigates DDoS attacks
- Offloads SSL/TLS traffic
- Improves API performance and handles errors and exceptions
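Several of these features map directly onto standard NGINX directives. As a rough sketch (the hostnames, certificate paths, backend addresses, and rate limit below are illustrative assumptions, not values from this article), a gateway server block can terminate TLS, rate limit clients, and route API requests to a backend group:

# Illustrative sketch only: names, addresses, and limits are assumed values
limit_req_zone $binary_remote_addr zone=api_limit:10m rate=10r/s;

upstream api_backend {
    server 10.0.0.10:8080;
    server 10.0.0.11:8080;
}

server {
    listen 443 ssl;                                  # offload SSL/TLS at the gateway
    server_name api.example.com;
    ssl_certificate     /etc/nginx/certs/api.crt;
    ssl_certificate_key /etc/nginx/certs/api.key;

    location /api/ {
        limit_req zone=api_limit burst=20 nodelay;   # rate limiting to prevent overload
        proxy_pass http://api_backend;               # route to the appropriate backend
    }
}

Authentication, such as JWT validation with the NGINX Plus auth_jwt directives, would typically be layered onto the same location blocks.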
How is NGINX currently used as an API gateway?
NGINX can be deployed as an API gateway in three ways:
- Using native NGINX functionality—some organizations use basic NGINX features to manage API traffic directly. For example, NGINX can identify whether API traffic is HTTP or gRPC, and translate API management requirements into NGINX configuration to receive, route, rate limit, and secure API requests (see the configuration sketch after this list).
- Extending NGINX with Lua—OpenResty is an open source product based on NGINX, which adds the Lua interpreter to the NGINX core, allowing users to build feature-rich functionality on top of NGINX. Lua modules can also be compiled and loaded into NGINX open source and NGINX Plus builds. There are many NGINX-based open source API gateway implementations, most of which use Lua or OpenResty.
- NGINX Plus API Gateway—using NGINX’s dedicated API gateway product.
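As a rough illustration of the native approach, the sketch below routes REST calls with proxy_pass and gRPC calls with grpc_pass from the same server block; the upstream names, ports, and paths are assumptions made for the example:

# Illustrative only: upstream names, addresses, and paths are assumed
upstream rest_service { server 10.0.0.20:8080; }
upstream grpc_service { server 10.0.0.21:50051; }

server {
    listen 80 http2;                          # gRPC requires HTTP/2 (TLS omitted for brevity)

    location /v1/ {
        proxy_pass http://rest_service;       # REST (HTTP) API calls
    }

    location /helloworld.Greeter/ {
        grpc_pass grpc://grpc_service;        # gRPC method calls
    }
}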
When using the first two options for API management, keep the following in mind:
- Performance and request latency—Lua is a great way to extend NGINX, but it can degrade NGINX performance. Independent testing of a simple Lua script shows a performance degradation of 50 to 90%. If you choose a solution that relies heavily on Lua, make sure it meets your performance requirements without adding latency to requests.
- Converged approach—the same NGINX server can manage API traffic alongside normal web traffic, because API gateway functionality is a subset of standard NGINX features. When separate products manage web and API traffic, those solutions can add complexity to DevOps, CI/CD, monitoring, security, and other application development and delivery processes. Using NGINX for both web and API traffic can simplify the architecture.
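A minimal sketch of the converged approach, assuming an illustrative document root and backend address, might serve a website and proxy API calls from the same server block:

# Illustrative only: paths and upstream addresses are assumed
upstream app_backend { server 10.0.0.30:8080; }

server {
    listen 80;
    server_name www.example.com;

    location / {
        root /var/www/html;                   # normal web traffic
        index index.html;
    }

    location /api/ {
        proxy_pass http://app_backend;        # API traffic handled by the same server
    }
}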
What are the benefits of the NGINX API Gateway?
NGINX supports a range of API gateway use cases for channeling external API calls to internal services. NGINX can control web and API traffic by translating across protocols such as HTTP/2, HTTP, FastCGI, and uwsgi, offering a uniform configuration and monitoring platform. In addition, NGINX is flexible enough to run in containers with minimal resources.
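For example, the same gateway configuration style applies whether the upstream service speaks HTTP, FastCGI, or uwsgi; the addresses and socket paths in this sketch are illustrative assumptions:

# Illustrative only: backend addresses and socket paths are assumed
server {
    listen 80;

    location /api/ {
        proxy_pass http://10.0.0.40:8080;                                  # HTTP backend
    }

    location /php/ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;                           # FastCGI backend (e.g. PHP-FPM)
    }

    location /py/ {
        include uwsgi_params;
        uwsgi_pass unix:/run/uwsgi/app.sock;                               # uwsgi backend (e.g. a Python app)
    }
}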
Tutorial: Deploying NGINX as an API gateway
Here is a tutorial that explains how to deploy NGINX as an API gateway for a RESTful API that communicates using JSON.
The general process is:
- Install NGINX Plus: If you don’t already have NGINX Plus installed, you’ll need to download and install it on your server. You can find detailed installation instructions on the NGINX website.
- Define the API endpoints: Next, you’ll need to define the endpoints that your API will expose. An endpoint is a specific location within the API that can be accessed and that performs a specific function. For example, you might have an endpoint that returns a list of users or an endpoint that allows users to create a new account. You can define your endpoints in a configuration file, using standard NGINX configuration syntax.
- Configure the API gateway: Next, you’ll need to configure the API gateway to route requests to the appropriate backend service. You’ll need to specify the location of the backend service and the URL path that the API gateway should use to access it. You can also configure additional functionality, such as rate limiting and data transformation, at this stage.
- Test the API gateway: Once you’ve configured the API gateway, you can test it to ensure that it is working correctly. You can use a tool like cURL or Postman to send test requests to the API gateway and verify that it is correctly routing requests to the backend service and returning the correct responses.
- Deploy the API gateway: When you’re satisfied that the API gateway is working correctly, you can deploy it to your production environment. You’ll need to make sure that the API gateway is running on a server that is accessible to your clients, and that it is configured to handle the expected load.
Here is what the code looks like:
First, in the main nginx.conf or sites-enabled/<website.conf> file, you will need to add a declaration of the backend group that the API gateway will proxy to (the server address below is an illustrative assumption):

# Declare the backend group that the API gateway will proxy to (address is illustrative)
upstream backend {
    server 10.0.0.10:8080;
}

Then, you can define an API gateway for a simple RESTful API in the NGINX configuration file, as follows:
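The listing below is a minimal sketch reconstructed from the description that follows; the listen port and the exact routing directives are assumptions rather than the article’s original listing:

server {
    listen 80;

    # api.get_users and api.create_user both use the /users path:
    # GET requests return the list of users, POST requests create a new user
    location /users {
        limit_except GET POST { deny all; }    # only GET and POST reach the backend
        proxy_pass http://backend/api/users;   # backend service for both endpoints
    }

    # All other requests go to the default backend (assumed here to be the same upstream)
    location / {
        proxy_pass http://backend;
    }
}

With this in place, you can verify the behavior with cURL or Postman as described in the testing step above: GET and POST requests to /users should reach http://backend/api/users, and requests to other paths should fall through to the default backend.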
This configuration defines two endpoints: api.get_users and api.create_user.
- The api.get_users endpoint is accessed through the /users URL path and is routed to the backend service at http://backend/api/users.
- The api.create_user endpoint is also accessed through the /users URL path, but only responds to POST requests; it is also routed to the backend service at http://backend/api/users.
All other requests are redirected to the default backend.
API gateway management with Solo Gloo Gateway
While NGINX can be used in API gateway use cases, many companies are choosing instead to move to a more modern, cloud native API gateway architecture based on Envoy Proxy. Solo Gloo Gateway is the leading API gateway based on Envoy Proxy, delivering a more secure, more scalable, and more extensive API gateway than NGINX.
- Gloo Gateway is Kubernetes native, and able to run on any cloud.
- Gloo Gateway integrates with GitOps to enable highly automated environments.
- Gloo Gateway integrates with DevSecOps best practices to ensure compliance with major industry standards.
- Gloo Gateway integrates with Gloo Mesh (Istio), to help scale as the Kubernetes and microservices environments grow.