Error using random_proxies
Hello, I'm just testing your package and ran into an error. Could you look through it and fix it?
Installed packages: beautifulsoup4==4.9.1, bs4==0.0.1, certifi==2020.6.20, chardet==3.0.4, idna==2.10, requests==2.24.0, soupsieve==2.0.1, urllib3==1.25.9, and random_proxies, of course.
My code:
import requests
from bs4 import BeautifulSoup
import os
from time import sleep
import csv
from fake_useragent import UserAgent
import lxml
from random import choice, uniform
from random_proxies import random_proxy

URL = 'http://sitespy.ru/my-ip'
proxy = random_proxy()
The cache server returns an error (success: "no"), so I've been bypassing it with the use_cache=False argument. Not an ideal solution, though.
Edited your code to show it:
import requests
from random_proxies import random_proxy

theurl = 'http://sitespy.ru/my-ip'
proxy = random_proxy(use_cache=False)
print(proxy)
# The proxy belongs in the proxies argument (with a scheme), not in headers
proxies = {'http': f'http://{proxy}'}
response = requests.get(theurl, proxies=proxies)
print(response.status_code)
The last time I ran it, this returned:
161.35.4.201:80
200
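A status code of 200 on its own doesn't prove the request actually went through the proxy. As a sanity check (not part of the original report, and assuming sitespy.ru simply echoes the caller's IP in the page body), one could print the response body and compare it against the proxy host:

import requests
from random_proxies import random_proxy

proxy = random_proxy(use_cache=False)
proxies = {'http': f'http://{proxy}'}
# sitespy.ru/my-ip reports the IP it sees; through a working
# proxy this should be the proxy host, not your own address
response = requests.get('http://sitespy.ru/my-ip', proxies=proxies, timeout=10)
print(proxy.split(':')[0])
print(response.text)

If the page shows your own IP instead of the proxy host, the proxy wasn't applied.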
Thanks for reporting the issue 😃!
Most likely the cache server is down, which would explain the problem.
I will look into it.
Apologies for any inconvenience caused.
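Until it's fixed, a possible stopgap is to fall back to use_cache=False automatically. This is only a sketch: it assumes random_proxy raises an exception when the cache server fails, which may not match the package's actual failure mode.

from random_proxies import random_proxy

def get_proxy():
    # Assumption: random_proxy raises when the cache server
    # answers with success: "no"; adjust to the real failure mode.
    try:
        return random_proxy()
    except Exception:
        # Fall back to bypassing the broken cache, as in the
        # workaround shown above
        return random_proxy(use_cache=False)

proxy = get_proxy()
print(proxy)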