A high-performance, asynchronous proxy scraper with multi-source support and advanced verification capabilities.
- Asynchronous multi-source proxy scraping
- Built-in proxy verification system
- Detailed proxy information (country, speed, anonymity)
- Export to both TXT and Excel formats
- Support for HTTP, SOCKS4, and SOCKS5 proxies
- Error handling and timeout management
- Multiple proxy sources for increased reliability
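The concurrent multi-source design can be sketched as follows. This is a minimal illustration, not the script's actual implementation: the two fetcher coroutines are hypothetical stand-ins for the real HTTP calls to each proxy-list source.

```python
import asyncio

# Hypothetical stand-ins for real source fetchers; the actual
# scraper would issue HTTP requests to each proxy-list API.
async def fetch_source_a():
    return ["1.2.3.4:8080", "5.6.7.8:3128"]

async def fetch_source_b():
    return ["5.6.7.8:3128", "9.9.9.9:1080"]

async def scrape_all():
    # Query every source concurrently instead of one at a time.
    results = await asyncio.gather(fetch_source_a(), fetch_source_b())
    # Flatten and de-duplicate while preserving order.
    seen, proxies = set(), []
    for batch in results:
        for proxy in batch:
            if proxy not in seen:
                seen.add(proxy)
                proxies.append(proxy)
    return proxies

proxies = asyncio.run(scrape_all())
print(proxies)  # duplicates collapsed across sources
```

Running the sources concurrently means total scrape time is bounded by the slowest source rather than the sum of all of them.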
- Clone the repository:

      git clone https://github.com/9de/ProxyScraper.git
      cd ProxyScraper

- Install required packages:

      pip install -r requirements.txt

- Run the script:

      python proxy_scraper.py
Follow the interactive menu to:
- Select proxy type (HTTP, SOCKS4, SOCKS5)
- Set timeout value
- Wait for the scraping and verification process
- Find your proxies in the 'proxies' directory
The script creates two types of files in the 'proxies' directory:
- `{proxy_type}_proxies_{timestamp}.txt`: a simple list of proxies
- `{proxy_type}_proxies_{timestamp}.xlsx`: a detailed Excel file with additional information
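Filenames following this pattern can be built as in the sketch below; the helper name and the exact timestamp format are our assumptions, not necessarily what the script uses.

```python
from datetime import datetime

def output_name(proxy_type: str, extension: str) -> str:
    # Timestamp layout is an assumption; the real script may differ.
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return f"{proxy_type}_proxies_{timestamp}.{extension}"

name = output_name("http", "txt")
print(name)  # e.g. http_proxies_20240101_120000.txt
```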
- ProxyScrape API
- ProxyScan.io
- Proxy-List.download
- Additional sources can be easily added
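"Easily added" sources typically come down to a registry of endpoints. The sketch below is purely illustrative: the URLs are placeholders, and the registry structure is our assumption rather than the script's actual layout.

```python
# Hypothetical source registry; the URLs are placeholders,
# not the real endpoints the script queries.
PROXY_SOURCES = {
    "proxyscrape": "https://example.com/proxyscrape-endpoint",
    "proxyscan": "https://example.com/proxyscan-endpoint",
    "proxy-list": "https://example.com/proxy-list-endpoint",
}

def add_source(name: str, url: str) -> None:
    # Adding a source is just registering another endpoint;
    # the scraper iterates over the registry at runtime.
    PROXY_SOURCES[name] = url

add_source("my-source", "https://example.com/my-endpoint")
print(len(PROXY_SOURCES))  # 4
```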
Timeout values:
- Minimum: 5 seconds
- Maximum: 60 seconds
- Default: 10 seconds
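Enforcing these bounds amounts to a small clamp; this sketch uses the limits listed above, though the function name is ours.

```python
def normalize_timeout(value=None, minimum=5, maximum=60, default=10):
    # Fall back to the default when no value is given,
    # otherwise clamp into the allowed 5-60 second range.
    if value is None:
        return default
    return max(minimum, min(maximum, value))

print(normalize_timeout())     # 10
print(normalize_timeout(2))    # 5
print(normalize_timeout(120))  # 60
```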
- Async/await implementation for concurrent operations
- Automatic proxy verification
- Country and speed detection
- Anonymity level checking
- Duplicate removal
- Comprehensive error handling
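Anonymity checking is commonly done by inspecting the request headers a test server receives through the proxy. The classification below is a sketch under that common convention, not necessarily this script's exact method.

```python
def anonymity_level(headers: dict, real_ip: str) -> str:
    # Transparent: the proxy forwards your real IP in some header.
    values = " ".join(headers.values())
    if real_ip in values:
        return "transparent"
    # Anonymous: real IP hidden, but proxy-revealing headers present.
    if any(h.lower() in ("via", "x-forwarded-for", "proxy-connection")
           for h in headers):
        return "anonymous"
    # Elite: indistinguishable from a direct connection.
    return "elite"

print(anonymity_level({"Via": "1.1 proxy"}, "203.0.113.7"))                # anonymous
print(anonymity_level({"X-Forwarded-For": "203.0.113.7"}, "203.0.113.7"))  # transparent
print(anonymity_level({}, "203.0.113.7"))                                  # elite
```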
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
Give a ⭐ if this project helped you!