
Switch to auth api usage for github for 100X traffic support #41

Open
grssam opened this issue Aug 27, 2013 · 6 comments

@grssam (Contributor) commented Aug 27, 2013

Right now, the limit is 60 calls per hour. If we switch to authenticated API calls, we can make 5,000 per hour. I think Sankha has an idea of how to do this.
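For reference, a minimal sketch of the distinction being discussed (the helper names and the Python backend are our assumptions; the thread does not specify an implementation). Unauthenticated requests to the GitHub API are limited per IP, while sending an `Authorization` header with a token raises the limit to the authenticated quota:

```python
# Sketch: how an authenticated GitHub API request differs from an
# unauthenticated one. The helper names are hypothetical; the token
# value would come from a server-side secret store.

UNAUTH_LIMIT = 60    # requests/hour, per IP (unauthenticated)
AUTH_LIMIT = 5000    # requests/hour, per authenticated user

def request_headers(token=None):
    """Build HTTP headers for a GitHub API v3 call."""
    headers = {"Accept": "application/vnd.github.v3+json"}
    if token:
        # The presence of this header is what moves the request from
        # the 60/hr bucket to the 5000/hr bucket.
        headers["Authorization"] = "token " + token
    return headers

def hourly_limit(token=None):
    """Rate limit that applies to a request built with `token`."""
    return AUTH_LIMIT if token else UNAUTH_LIMIT
```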

@ghost assigned ngsankha Aug 27, 2013
@ngsankha (Contributor) commented:

Yes, I do. But that would require API keys, which are unsafe to keep openly on the client side. If we don't want to do that, it's another headache to have each user log in so we can make the calls with their own API credentials. Without proper server infrastructure, I suggest we close this!

@debloper (Member) commented:

  1. The frontend wouldn't make the calls anyway, going forward. There will be a service running to fetch that data, serialize it, and produce consumable JSON. We'll have a backend for that, which can safely hold the API tokens and secrets.
  2. Now that we're at it: with the cron job producing quasi-realtime JSON, why would we need more than 60 API calls per hour? That ought to suffice, IMO.

@grssam (Contributor, Author) commented Aug 27, 2013

Because even one run of the cron job can make more than 60 calls. With, say, 10 tracked GitHub repos and 100 users, it is highly likely that more than 60 calls will be needed.

@debloper (Member) commented:

With https://api.github.com/users/:user/events in hand, it beats me why we are going for repo-specific searches, where the list/count of repositories will always be incomplete.

From a user's public events, we keep only the ones performed on repos matching /github\.com\/mozilla(.*)/, and we get what we need in one single call.

Granted, this might not cover absolutely everything we need. But for the time being, it appears good enough. Plus, always remember: these calls are made by a daemon, not by the frontend with AJAX calls. So, weigh the advantages and pitfalls accordingly.
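A sketch of the filtering described above, using the regex from the comment. The event shape is simplified and the sample data is fabricated for illustration; a real response from `https://api.github.com/users/:user/events` carries many more fields:

```python
import re

# The regex proposed above: matches events on mozilla-owned repos.
MOZILLA = re.compile(r"github\.com/mozilla(.*)")

def filter_mozilla_events(events):
    """Keep only events performed on mozilla repositories.

    `events` is assumed to be the parsed JSON list from the public
    user-events endpoint; only the fields used here are shown.
    """
    return [e for e in events if MOZILLA.search(e["repo"]["url"])]

# Fabricated sample standing in for one API response:
sample = [
    {"type": "PushEvent", "repo": {"url": "https://github.com/mozilla/pdf.js"}},
    {"type": "PushEvent", "repo": {"url": "https://github.com/torvalds/linux"}},
]
```

So one call per user yields that user's recent activity, and the daemon filters it down to mozilla repos locally, with no extra API calls.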

@ngsankha (Contributor) commented:

We looked into it. But it doesn't show the entire history. So, let's say I have a six-month-old commit: it will not turn up in that API call's response.

@grssam (Contributor, Author) commented Aug 28, 2013

@debloper The solution you are proposing requires a minimum of #users calls (which can itself exceed 60). On top of that, the API is paginated, so we will not get everything in one page. The solution we are following requires a maximum of #users × #repos calls, but a minimum of only #repos calls. Even then, as @sankha93 said above, the users API does not return every commit.
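To make the comparison concrete, here is the call-count arithmetic from the comment above as a sketch (the function names are ours):

```python
def user_events_calls(n_users, pages_per_user=1):
    # The user-events approach needs at least one call per user,
    # more if results span multiple pages.
    return n_users * pages_per_user

def repo_calls_bounds(n_users, n_repos):
    # The repo-based approach needs at least one call per repo and
    # at most one call per (user, repo) pair.
    return n_repos, n_users * n_repos

# With the numbers from the earlier comment (100 users, 10 repos):
lo, hi = repo_calls_bounds(100, 10)   # lo = 10, hi = 1000
per_user = user_events_calls(100)     # 100 calls, already over 60/hr
```

So the repo-based approach can, in the best case, stay under the unauthenticated limit, while the per-user approach starts above it, which is the trade-off being argued here.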
