Integration with the 3scale platform is accomplished by deploying traffic control agents, which enforce traffic policies, access controls, and rate limits.
We offer a variety of gateway deployment options that make it easy to get started with 3scale without editing backend application code. Gateway deployment is the recommended option for most implementations. The options include:
NGINX
NGINX is a high-performance gateway that communicates with 3scale to authenticate, authorize, and report on incoming calls. It can be deployed on almost any infrastructure, cloud or on-premises, and offers scalability and full control.
Heroku
Heroku users can take advantage of the 3scale add-on to rapidly deploy NGINX traffic management. Install it with a single click, and the buildpack deploys an API gateway that communicates with 3scale to authenticate, authorize, and report traffic.
Amazon API Gateway
This option allows you to run a zero-infrastructure API gateway on AWS. The Amazon API Gateway directs all incoming calls to a simple Lambda function, which communicates with 3scale to authorize your traffic as permitted by access and rate limit policies configured in 3scale.
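As a rough sketch of this pattern, the Lambda handler below extracts the caller's key from an API Gateway proxy event and gates the request on a 3scale check. The check itself is stubbed out here, and the key value is a placeholder; a real function would call 3scale's Service Management API at this point.

```python
import json

def check_with_3scale(user_key):
    """Stub: a real implementation would call 3scale's authorize/report API here."""
    return user_key == "VALID_KEY"  # placeholder decision logic

def lambda_handler(event, context):
    # API Gateway proxy integration passes query parameters in the event.
    params = event.get("queryStringParameters") or {}
    user_key = params.get("user_key", "")
    if not check_with_3scale(user_key):
        return {"statusCode": 403, "body": json.dumps({"error": "forbidden"})}
    return {"statusCode": 200, "body": json.dumps({"message": "authorized"})}

# Example event shaped like an API Gateway proxy request:
event = {"queryStringParameters": {"user_key": "VALID_KEY"}}
print(lambda_handler(event, None)["statusCode"])  # 200
```

Because the gateway itself holds no infrastructure, all policy decisions live in 3scale; the Lambda function stays a thin pass-through.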
Amazon Machine Image
Amazon users can install our AMI, available in the AWS Marketplace, to rapidly deploy traffic controls. The AMI bundles NGINX, the required Lua scripts, and all other necessary libraries, making it easy to get the NGINX gateway running and communicating with your 3scale instance in just a few minutes.
APIcast
APIcast is the cloud-hosted NGINX gateway offered by 3scale. Free for up to 50,000 calls per day, it is a perfect choice for APIs in the testing phase and an effective way to test your NGINX configuration before migrating to production.
Red Hat OpenShift
OpenShift is Red Hat’s container application platform, which allows developers to quickly build, deploy, and manage containerized services and applications in the cloud. APIs for services running on OpenShift can be managed by 3scale using an NGINX gateway deployed on the platform.
Code Plugins
3scale provides code plugins in multiple languages, which can be deployed in any application to add API control. The installation pattern for each library varies depending on the programming language in use (each bundle includes instructions). Each plugin implements authentication, rate limiting, and traffic reporting.
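Conceptually, a plugin exposes a single call that your application makes before serving a request. The class and method names below are hypothetical, chosen only to illustrate the shape of that interface; they do not match any specific 3scale plugin.

```python
class ApiControlPlugin:
    """Hypothetical plugin facade: authenticate, enforce limits, report traffic."""

    def __init__(self, service_token, service_id):
        self.service_token = service_token
        self.service_id = service_id
        self._reported = []  # traffic reported back to 3scale

    def authrep(self, user_key, hits=1):
        # A real plugin would consult 3scale here; this stub approves
        # any key with a recognized prefix, purely for illustration.
        allowed = user_key.startswith("valid-")
        if allowed:
            self._reported.append((user_key, hits))
        return allowed

plugin = ApiControlPlugin("MY_SERVICE_TOKEN", "MY_SERVICE_ID")
print(plugin.authrep("valid-alice"))  # True
print(plugin.authrep("bogus-key"))    # False
```

Your application calls the plugin once per request and serves or rejects the call based on the result; authentication, rate limiting, and reporting all happen behind that one method.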
Gateway deployment and plugin integration are both built as “wrappers” on top of 3scale’s Service Management API to simplify deployment. For custom or bespoke deployments, however, it is also possible to call this API directly from anywhere, which offers the greatest flexibility.
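As an illustration of direct integration, the sketch below builds an authrep request (authorize and report in a single round trip) against the Service Management API. The endpoint and parameter names follow 3scale's published API; the credential values are placeholders, and response handling is only described in comments rather than performed.

```python
from urllib.parse import urlencode

# Placeholder credentials -- substitute the keys from your 3scale account.
SERVICE_TOKEN = "MY_SERVICE_TOKEN"
SERVICE_ID = "MY_SERVICE_ID"

def authrep_url(user_key, hits=1):
    """Build an authrep call: authorize the key and report usage in one request."""
    params = {
        "service_token": SERVICE_TOKEN,
        "service_id": SERVICE_ID,
        "user_key": user_key,
        "usage[hits]": hits,
    }
    return "https://su1.3scale.net/transactions/authrep.xml?" + urlencode(params)

# A traffic agent would GET this URL for each incoming API call
# (synchronous mode): HTTP 200 means authorized, 409 means denied.
print(authrep_url("SOME_USER_KEY"))
```

This is essentially what the gateways and plugins do under the hood; calling it yourself just removes the wrapper.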
Synchronous and Asynchronous Modes
3scale’s architecture supports both synchronous and asynchronous traffic management approaches. The majority of software plugins are set to provide synchronous integration by default, but can be modified to asynchronous operation. Gateway integration is by default asynchronous, with NGINX maintaining a local cache of keys and policies.
In synchronous mode, plugins or agents make calls to 3scale for each API call received, in order to validate API keys and report traffic. This mode is easy to set up and effective at low or medium call volumes.
In asynchronous mode, plugins or agents retain a local cache of currently live API keys and serve traffic based on the contents of this cache. The cache is updated asynchronously in the background to keep usage-rate and policy information current. This mode affords more control to the traffic agent and reduces API response latency, making it ideal for higher traffic volumes.
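The asynchronous pattern can be sketched as a local cache of key authorizations refreshed by a background thread. This is a deliberate simplification, not 3scale's actual agent code: the refresh function here is a stand-in for a real call to 3scale, and the cache holds only a boolean per key rather than full policy state.

```python
import threading
import time

class KeyCache:
    """Toy local cache of API-key authorizations, refreshed in the background."""

    def __init__(self, refresh_fn, interval_s=60):
        self._refresh_fn = refresh_fn   # fetches {key: allowed} from 3scale
        self._interval_s = interval_s
        self._lock = threading.Lock()
        self._cache = refresh_fn()      # seed the cache synchronously at startup
        t = threading.Thread(target=self._refresh_loop, daemon=True)
        t.start()

    def _refresh_loop(self):
        while True:
            time.sleep(self._interval_s)
            fresh = self._refresh_fn()  # background sync keeps policies current
            with self._lock:
                self._cache = fresh

    def is_authorized(self, user_key):
        # Serve traffic from the cache: no per-request round trip to 3scale.
        with self._lock:
            return self._cache.get(user_key, False)

# Usage with a stubbed refresh function standing in for 3scale:
cache = KeyCache(lambda: {"alice-key": True, "bob-key": False}, interval_s=300)
print(cache.is_authorized("alice-key"))  # True
print(cache.is_authorized("unknown"))    # False
```

The latency win comes from `is_authorized` never blocking on the network; the trade-off is that revoked keys or exhausted limits take up to one refresh interval to be enforced.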
Mixing deployment modes
In general, you’ll only need one of the deployment methods presented above for a given API (or family of APIs), but there are no constraints on adopting two or more deployment modes if needs vary for different APIs or different API consumers. In these cases, traffic control agents can be placed at all relevant points of entry and seamlessly use the same 3scale account.
3scale places no restrictions on the number of traffic control agents used or their physical locations. If API traffic is served from multiple physical locations, agents can be deployed at each location as needed, and access keys issued to developers are equally valid at all locations.