Rate Limiting in Apigee
In this blog, we will learn about rate limiting in Apigee.
In today’s digital landscape, protecting APIs from threats such as Denial-of-Service (DoS) and Distributed Denial-of-Service (DDoS) attacks is essential. Apigee provides built-in security policies like Quota and Spike Arrest to safeguard APIs against traffic surges and ensure controlled usage.
Quota Policy:
The Quota policy enforces usage limits over a specified time interval, per app, API, or API product. It helps manage API consumption and supports use cases like rate limiting and monetization.
Quota configuration options:
- Directly in the API proxy (a minimal policy sketch follows this list).
- At the API product level.
- Within a Shared Flow, enabling the reuse of the quota policy across multiple API proxies for consistent and centralized enforcement.
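For illustration, a minimal Quota policy attached directly in an API proxy might look like the sketch below. The policy name, the 1000-calls-per-hour limit, and the client_id identifier (assumed to be populated by a preceding key or token verification policy) are placeholder choices for this sketch, not values prescribed by Apigee.

```xml
<!-- Minimal Quota sketch: allow each app (identified by client_id) 1000 calls per hour -->
<Quota name="Quota-Per-App" type="calendar">
  <Interval>1</Interval>
  <TimeUnit>hour</TimeUnit>
  <Allow count="1000"/>
  <!-- Identifier scopes the counter per consumer; client_id is assumed to be set by a
       VerifyAPIKey or OAuthV2 policy earlier in the flow -->
  <Identifier ref="client_id"/>
  <!-- Distributed/Synchronous control how the counter is shared across message processors -->
  <Distributed>true</Distributed>
  <Synchronous>false</Synchronous>
  <!-- The calendar type requires an explicit StartTime for the counting window -->
  <StartTime>2024-01-01 00:00:00</StartTime>
</Quota>
```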
To apply quota limits defined at the product level, a Quota policy must still be present in the API proxy, and a VerifyAPIKey or OAuthV2 VerifyAccessToken policy must run first to populate the quota-related flow variables.
Quota values configured at the API product level can be updated dynamically without requiring an API redeployment, enabling flexible and efficient quota management. Shared flows further enhance reusability and simplify the maintenance of quota policies across multiple API proxies.
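As a sketch of product-level enforcement, the proxy's Quota policy can read its limits from the flow variables populated by the verification policy. The example below assumes a VerifyAPIKey policy named Verify-API-Key; with an OAuthV2 VerifyAccessToken policy, the equivalent apiproduct.developer.quota.* variables would be referenced instead. The literal count, Interval, and TimeUnit values are only fallbacks for this sketch.

```xml
<!-- Quota limits are read from the API product via flow variables set by Verify-API-Key;
     the literal count/Interval/TimeUnit act as fallbacks if the product defines no quota -->
<Quota name="Quota-From-Product" type="calendar">
  <Allow count="100" countRef="verifyapikey.Verify-API-Key.apiproduct.developer.quota.limit"/>
  <Interval ref="verifyapikey.Verify-API-Key.apiproduct.developer.quota.interval">1</Interval>
  <TimeUnit ref="verifyapikey.Verify-API-Key.apiproduct.developer.quota.timeunit">hour</TimeUnit>
  <Identifier ref="client_id"/>
  <Distributed>true</Distributed>
  <Synchronous>false</Synchronous>
  <StartTime>2024-01-01 00:00:00</StartTime>
</Quota>
```

Because the limits come from the product rather than the policy XML, changing the quota on the product takes effect without redeploying the proxy.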
Quotas can also be set at the operation level inside an API product, and you can configure separate quota limits for different resource paths within the same API, allowing more granular traffic control based on functionality (see the sketch below).
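One way to sketch path-level granularity is to attach separate Quota policies from conditional flows in the ProxyEndpoint; the flow names, path suffixes, and policy names below are illustrative assumptions, not part of any particular API.

```xml
<!-- Conditional flows route each resource path to its own Quota policy -->
<Flows>
  <Flow name="Orders">
    <Condition>(proxy.pathsuffix MatchesPath "/orders") and (request.verb = "GET")</Condition>
    <Request>
      <Step><Name>Quota-Orders</Name></Step>
    </Request>
  </Flow>
  <Flow name="Payments">
    <Condition>(proxy.pathsuffix MatchesPath "/payments") and (request.verb = "POST")</Condition>
    <Request>
      <!-- Payments gets a tighter limit via its own Quota policy -->
      <Step><Name>Quota-Payments</Name></Step>
    </Request>
  </Flow>
</Flows>
```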
Additionally, quota enforcement data is captured in Apigee Analytics, offering valuable insights into:
- Usage patterns
- Near-limit consumers
- Policy enforcement statistics
Spike Arrest Policy:
The Spike Arrest policy in Apigee throttles and smooths out traffic spikes by limiting the rate at which requests are processed. Unlike Quota, which counts requests over a configured interval, Spike Arrest distributes the allowed rate evenly across small time slices (for example, a rate of 10 requests per second is enforced as roughly one request every 100 milliseconds), which helps blunt brute-force and DDoS attempts.
Spike Arrest can also be applied at the proxy or shared flow level, supporting scalable and consistent traffic control across APIs. It protects backend services by rejecting requests that exceed the configured rate before they reach the target.
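A minimal Spike Arrest sketch is shown below; the 10ps rate and the client_id identifier are example values for illustration, not recommendations.

```xml
<!-- Smooth traffic to roughly 10 requests per second per client_id -->
<SpikeArrest name="Spike-Arrest-Per-Client">
  <Rate>10ps</Rate>
  <!-- Without an Identifier, the rate applies to all traffic through the proxy as a whole -->
  <Identifier ref="client_id"/>
</SpikeArrest>
```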
These policies, when used together, offer a strong defense against misuse while enabling fair and controlled access to your APIs.