How can I get 12 million requests to complete my graduation thesis? #140078
Unanswered · hxby112345678 asked this question in Accessibility · Replies: 2 comments
-
If you have any ideas that can help me, I would be extremely grateful!
-
GitHub enforces strict rate limits on API requests, especially for individual accounts. However, there are several strategies you can use to gather the required data more efficiently or work within the limits, especially for research purposes:
1. Use personal access tokens, and ask GitHub Support whether a rate limit increase is possible for research
2. Distribute requests across multiple accounts (note: this may violate GitHub's Terms of Service)
3. Optimize API requests (conditional requests, larger page sizes, the GraphQL API to fetch multiple fields per call)
4. Parallelization and request scheduling
5. Consider bulk data sources such as GH Archive or the public GitHub dataset on Google BigQuery
6. Check GitHub's data and research programs
7. Rate limiting with exponential backoff
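A minimal sketch of strategy 7, assuming you schedule retries yourself: GitHub's REST API reports quota state in the `x-ratelimit-remaining` and `x-ratelimit-reset` response headers (the latter is a Unix timestamp), so a client can combine capped exponential backoff for transient errors with a sleep until the quota reset. The function names here are illustrative, not from any particular library.

```python
import random
import time

def backoff_delay(attempt, base=1.0, cap=60.0, jitter=True):
    """Capped exponential backoff: base * 2**attempt seconds,
    with optional 'full jitter' to avoid synchronized retries."""
    delay = min(cap, base * (2 ** attempt))
    if jitter:
        delay = random.uniform(0, delay)
    return delay

def seconds_until_reset(rate_limit_reset, now=None):
    """Given the x-ratelimit-reset header value (Unix epoch seconds),
    return how long to sleep before the hourly quota refills."""
    now = time.time() if now is None else now
    return max(0.0, float(rate_limit_reset) - now)
```

In a crawl loop you would call `seconds_until_reset()` whenever `x-ratelimit-remaining` hits zero, and `backoff_delay(attempt)` after 5xx or secondary-rate-limit responses.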
-
For my graduation thesis, I need all projects related to China from 2008 to 2024, as well as corresponding information in other categories. I have already crawled about 70,000 Chinese users who meet the requirements, and now I need to crawl the projects under their names. By my calculation, I need about 2,000,000 × 6 = 12,000,000 requests, i.e. roughly 12 million requests, to obtain all the data. However, my account is limited to 5,000 requests per hour, which comes to only about 120,000 requests per day. That means it would take me about 100 days to obtain the data I want with my account alone. Is there any way supported by GitHub to obtain this data for research? It would be even better if I could obtain the data within a month. Thank you!
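The arithmetic in the question can be sanity-checked with a few lines (the figures are taken directly from the post; nothing here is measured):

```python
# Figures stated in the question.
requests_needed = 2_000_000 * 6      # ~12 million requests total
per_hour = 5_000                     # authenticated REST rate limit per account
per_day = per_hour * 24              # 120,000 requests/day

days_one_account = requests_needed / per_day          # ~100 days, as stated
# Sustained throughput required to finish within 30 days instead:
needed_per_hour = requests_needed / (30 * 24)         # ~16,667 requests/hour
```

So finishing in a month would require a bit over three times the single-account quota, which is why bulk sources such as GH Archive or BigQuery are usually suggested at this scale.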