Adam Green has a great post on ProgrammableWeb this morning about Twitter API rate limits, with practical tips on how to get the most out of the Twitter API without hitting the limits.

Sharing API calls between your server side and the client’s browser is critical to making some Twitter integrations work, and the post includes a useful tip on how to use server-side caching:

You can call the API from the browser, display the results, and then call your server with the data you got from the API. In effect you are using all of the user browsers as a large collection grid. This approach can be used to reduce the number of API calls you have to make from the server.
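A minimal sketch of this “collection grid” pattern, written in Python rather than the JavaScript a real browser would run: the `RelayCache`, `submit_from_browser`, and `fetch_from_api` names are hypothetical, but the idea is that data relayed from browsers keeps the server’s cache warm, so the server only spends its own rate-limited calls on resources no browser has reported recently.

```python
import time

class RelayCache:
    """Server-side cache fed by data that browsers fetched from the API."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # resource -> (timestamp, data)

    def submit_from_browser(self, resource, data):
        """A browser posts data it already fetched from the API itself."""
        self.store[resource] = (time.time(), data)

    def get(self, resource, fetch_from_api):
        """Serve fresh browser-supplied data when available; otherwise fall
        back to a server-side API call, spending our own rate limit."""
        entry = self.store.get(resource)
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]
        data = fetch_from_api(resource)
        self.store[resource] = (time.time(), data)
        return data

calls = []
def fetch_from_api(resource):
    calls.append(resource)           # each call counts against the rate limit
    return {"resource": resource}

cache = RelayCache(ttl_seconds=60)
cache.submit_from_browser("timeline/3scale", {"tweets": ["..."]})
cache.get("timeline/3scale", fetch_from_api)  # served from browser-relayed data
cache.get("timeline/other", fetch_from_api)   # no relay yet: server calls the API
```

Of the two reads, only the second costs a server-side API call; the more browsers relay data, the fewer of the server’s rate-limited calls are spent.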

API Rate Limiting: Mapping Use Cases

Techniques like this highlight how important it is for API Providers to map out the use cases they aim to support with their APIs, since rate limits can completely handicap some usage scenarios, particularly for APIs which have both server-side and end-user scenarios. Here are some of the things we recommend 3scale API Providers think about when planning rate limits:

  • Which audiences does your API target? Customers? Partners? End users?

  • What are the typical integration cases for each of those groups: bulk downloads? Regularly scheduled queries? User-driven calls triggered by browser activity?

  • Which groups of API calls go together and get called in sequence? For example, is it possible to do something useful with 1-2 calls, or do you need 5 or 6? Answering this often results in an API redesign, but some APIs are inherently call-hungry (in mapping visualisations, for instance, you’ll often need to grab offscreen tiles as well as onscreen tiles to create smooth scrolling).

  • How are API call dynamics affected by end-user actions? In pure server-side integrations there is often little variance based on application usage, since the server caches responses. At the other end of the spectrum, the call volume of a mobile application that goes viral is almost entirely out of the control of the developer that wrote it.

  • Do you allow caching? API Providers often instinctively disallow response caching on their API for fear of data misuse. However, caching frequently reduces the volume of API calls that need to be made, and hence server load, significantly. Giving developers the right parameters to build efficient applications is good practice where other constraints allow.
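To illustrate the caching point above, here is a small sketch of how a client can honour a provider’s standard `Cache-Control` response header and reuse responses instead of repeating identical calls against the rate limit. The helper names are ours for illustration; real HTTP client libraries and browsers implement this logic for you.

```python
import time

def parse_max_age(cache_control):
    """Extract the max-age directive (in seconds) from a Cache-Control value."""
    for part in cache_control.split(","):
        part = part.strip()
        if part.startswith("max-age="):
            return int(part.split("=", 1)[1])
    return 0  # no max-age: treat the response as immediately stale

def is_fresh(fetched_at, cache_control, now=None):
    """True if a cached response is still fresh and may be reused."""
    now = time.time() if now is None else now
    return (now - fetched_at) < parse_max_age(cache_control)

header = "public, max-age=300"          # provider permits 5 minutes of caching
print(is_fresh(1000, header, now=1200)) # within 300s: reuse, no API call
print(is_fresh(1000, header, now=1400)) # stale: a fresh API call is needed
```

A provider that sets `max-age=300` on a popular endpoint effectively caps each well-behaved client at one call per five minutes for that resource, without the client ever touching its rate limit in between.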

Adam Green’s point about browser caching, together with per-user and per-IP limiting, highlights that as an API provider you will often need rate limiting across multiple dimensions of API usage.

As a developer, when doing something like browser data collection, always ensure that you do it in a way which neither violates user privacy (by sharing authenticated data across users) nor overburdens the browser. Done right, it’s a powerful tool, and one that benefits the API Provider as well.

3scale’s infrastructure supports all of the above. Most providers use application rate limiting to manage access per application key or credentials; others (most often for mobile APIs) add user rate limiting, which restricts the number of calls per identified user (much as Twitter does); and others also use IP and/or referrer-domain filtering. Often a combination is the best way to facilitate access without runaway usage by a few rogue applications or users.
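As a rough illustration of combining dimensions, the sketch below (with hypothetical names, not 3scale’s actual API) allows a call only when it is within the per-application, per-user, and per-IP limits for the current window; resetting the counters at each window boundary is omitted for brevity.

```python
from collections import Counter

class MultiDimensionalLimiter:
    """Allow a call only if every dimension is within its per-window limit."""

    def __init__(self, limits):
        self.limits = limits      # e.g. {"app": 1000, "user": 100, "ip": 50}
        self.counts = Counter()   # (dimension, key) -> calls this window

    def allow(self, app_key, user_id, client_ip):
        keys = [("app", app_key), ("user", user_id), ("ip", client_ip)]
        if any(self.counts[k] >= self.limits[k[0]] for k in keys):
            return False          # some dimension is exhausted: reject
        for k in keys:
            self.counts[k] += 1   # charge every dimension for an allowed call
        return True

limiter = MultiDimensionalLimiter({"app": 1000, "user": 2, "ip": 50})
results = [limiter.allow("key1", "alice", "1.2.3.4") for _ in range(3)]
# alice hits her per-user limit of 2 before the app or IP limits are reached,
# while other users of the same application remain unaffected
```

This is why a combination works: one runaway user (or one runaway IP) is cut off by their own dimension long before they can exhaust the shared application-level allowance.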

Learn more about 3scale’s rate limiting services here.