
Rate limiting #86

Open
burkasaurusrex opened this issue Oct 27, 2020 · 2 comments
@burkasaurusrex

FYI, just got the following email from Trakt:

As of October 27, 2020, we’re enforcing rate limiting for all API apps. You’ve created a Trakt API app, and we wanted to let you know directly. If you have any questions, please continue the discussion in the GitHub API project.

WHY ARE RATE LIMITS BEING ENFORCED?
Over the past several months, Trakt performance has been negatively impacted by a huge increase in API traffic and poorly coded apps. In order to stabilize the API and increase performance for everyone, we’re turning on rate limiting. The Trakt API is free to use, and rate limiting will help us keep it that way.

WHAT UPDATES DOES MY APP NEED?
Your app will need to handle the 429 HTTP status code and headers that are sent when the rate limit is exceeded. This might be built into your API library, or you might need to customize your code to handle it. The API docs have more details.

WHAT ARE THE LIMITS?
/sync/history/* 2 calls every second
All API methods 1,000 calls every 5 minutes

All limits are per user.

MOVING FORWARD
We plan on adjusting limits until we find a balance of good performance with minimal app impact. The goal is to prevent API abuse, but allow users to use apps normally. We’ll keep the API docs updated with the current rate limits.
If you have any questions, please continue the discussion in the GitHub API project.

It would be awesome if this client helped throttle API calls when a 429 response is received.
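
For reference, here is a minimal sketch of what honoring a 429 could look like, assuming the requests library and a standard Retry-After header; it is not part of this client's code, and the function name and defaults are illustrative only:

```python
# Sketch: retry a GET after sleeping for the advertised Retry-After interval
# whenever the API answers 429. The exact headers Trakt sends are documented
# in the Trakt API docs.
import time
import requests

def get_with_retry(session, url, max_retries=5, **kwargs):
    """GET `url`, pausing and retrying whenever the response is 429."""
    for attempt in range(max_retries):
        response = session.get(url, **kwargs)
        if response.status_code != 429:
            return response
        # Fall back to a short default pause if Retry-After is missing.
        delay = int(response.headers.get('Retry-After', 1))
        time.sleep(delay)
    return response
```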

@aburke20

aburke20 commented Nov 4, 2020

Any updates on this?

I'm still trying to figure out how to fix this myself, but I have very little idea what to actually do.

@fuzeman
Owner

fuzeman commented Mar 19, 2021

I think the best solution would be to implement a queue mechanism (enabled with a configuration option) that ensures requests are processed at the allowed per-endpoint rate.

I'll take a look at this when I have some time available.
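
As a rough illustration of that queue idea, a lock-protected throttle could space requests so they never exceed a configured rate. This is only a sketch under assumed names; the class, option names, and integration point are hypothetical and not part of trakt.py's API:

```python
# Sketch: a thread-safe throttle that enforces a minimum gap between calls,
# sized from a "calls per interval" limit such as Trakt's published rates.
import threading
import time

class RequestThrottle:
    def __init__(self, calls, per_seconds):
        self.interval = per_seconds / float(calls)  # minimum gap between calls
        self.lock = threading.Lock()
        self.next_allowed = 0.0

    def wait(self):
        """Block until the next request is allowed, then reserve a slot."""
        with self.lock:
            now = time.monotonic()
            wait_for = max(0.0, self.next_allowed - now)
            self.next_allowed = max(now, self.next_allowed) + self.interval
        if wait_for:
            time.sleep(wait_for)

# e.g. 1,000 calls every 5 minutes for general API methods,
# and 2 calls per second for /sync/history/*.
general = RequestThrottle(calls=1000, per_seconds=300)
history = RequestThrottle(calls=2, per_seconds=1)
```

Each worker would call `wait()` on the appropriate throttle before sending a request, which keeps the overall call rate under the published limits without needing to hit a 429 first.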
