I'm trying to deploy a simple Python Telegram bot to Heroku. It crawls 'http://www.wordreference.com/sinonimos/' using requests and bs4 and extracts synonyms for a given word. But once uploaded to the server it is unable to establish a connection; it raises the following exception:
requests.exceptions.ConnectionError:
HTTPConnectionPool(host='www.wordreference.com', port=80): Max retries
exceeded with url: /sinonimos/Calor (Caused by
NewConnectionError('<urllib3.connection.HTTPConnection object at
0x7fb3aeed80b8>: Failed to establish a new connection: [Errno 101] Network
is unreachable',))
It works perfectly when executed locally. I tried crawling other URLs from Heroku as well and they work just fine, so I guess that if it were a port-binding issue, I'd have gotten no response for those other URLs either. I tried using a free proxy server in case free accounts are restricted from certain addresses, as shown here:
proxies = {'http': 'http://40.141.163.122:8080'}  # note: the original had 'http:/...', missing a slash
page = 'http://www.wordreference.com/sinonimos/{}'.format(word)
r = requests.get(page, proxies=proxies)
It worked for a while, and then stopped, without raising any exceptions to the logs. I contacted Heroku support, but they haven't replied yet, so if anyone can shed some light on the issue so I stop flying in the dark, it'd be appreciated.
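Since the failure is "Max retries exceeded", one thing worth trying (a sketch, not a confirmed fix for Heroku's network) is mounting an explicit retry policy with backoff on a session, so transient refusals are retried automatically:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient failures up to 3 times, backing off between attempts.
retry = Retry(total=3, backoff_factor=1, status_forcelist=[502, 503, 504])
adapter = HTTPAdapter(max_retries=retry)

session = requests.Session()
session.mount("http://", adapter)
session.mount("https://", adapter)

word = "Calor"  # example word from the traceback above
page = "http://www.wordreference.com/sinonimos/{}".format(word)
# r = session.get(page, timeout=10)  # uncomment to actually crawl
```

This won't help if Heroku's egress is genuinely blocked by the site, but it does rule out flaky single attempts.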
Related
I'm currently working on an Imgur bot using Python and the requests lib with a rotating proxy. I run multiple instances of this bot, but sometimes some of them get this error:
HTTPSConnectionPool(host='api.imgur.com', port=443): Max retries exceeded with url: /3/credits (Caused by ProxyError('Cannot connect to proxy.', RemoteDisconnected('Remote end closed connection without response')))
Here is how the session is defined:
session = requests.Session()
session.proxies = {"http": cre["Proxy"]}
I have no idea why it crashes, since only some of the instances I run get this error. The others work fine.
Thanks, mates!
Maybe your proxies are configured incorrectly?
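One concrete thing to check: api.imgur.com is an HTTPS host (port 443), and a proxies dict with only an "http" key leaves HTTPS traffic going direct. A minimal sketch, with a placeholder proxy address standing in for cre["Proxy"]:

```python
import requests

proxy_url = "http://203.0.113.10:8080"  # placeholder; use your rotating proxy here
session = requests.Session()
# Route both plain-HTTP and HTTPS traffic through the proxy.
session.proxies = {"http": proxy_url, "https": proxy_url}
```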
I think you're running into some form of "requesting too much too fast".
Try delaying some requests (by 3 seconds, say).
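That pacing can be wrapped in a small stdlib-only helper, sketched here, which retries a call with a fixed delay between attempts:

```python
import time

def with_retries(fn, attempts=3, delay=3.0):
    """Call fn(), sleeping `delay` seconds between failed attempts."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: let the last error propagate
            time.sleep(delay)
```

For example, `with_retries(lambda: session.get(url))` to space out the Imgur calls.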
So I have pytest tests (written using requests), and I run them while my server is running to check that everything works. They work fine, but currently the base URL is hardcoded to port 3000, since my server runs on port 3000. How can I make these tests pass without hardcoding the port number, i.e. if my server is running on port 4080, the tests should still succeed without me having to hardcode the URLs?
I tried doing this:
URL = f"http://127.0.0.1:{sys.argv[1]}/user"
since I take the first argument as my port number when running my server. This doesn't work, though: inside the test process, sys.argv holds pytest's own arguments, not the ones my server was started with. Any other suggestions for a beginner? Thanks
requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=3000): Max retries exceeded with url: /user (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
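One stdlib-only alternative is to have both the server and the tests read the port from an environment variable. SERVER_PORT here is a name I'm assuming, not something pytest defines:

```python
import os

# Default to 3000 so the suite still runs when the variable is unset;
# SERVER_PORT must match however the server is launched.
PORT = os.environ.get("SERVER_PORT", "3000")
URL = "http://127.0.0.1:{}/user".format(PORT)
```

Run the server and the tests with the same variable set, e.g. `SERVER_PORT=4080 pytest`. pytest also supports custom command-line options via `pytest_addoption` in conftest.py, which is the more idiomatic route once you outgrow environment variables.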
I am trying to do the following:
from urllib.request import urlopen
data = urlopen("https://www.duolingo.com/users/SaifullahS6").read()
I get the following error:
URLError: <urlopen error [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>
Similarly, when I try this:
import requests
session = requests.Session()
data = {"login": "SaifullahS6", "password": "mypassword"}
req = requests.Request('POST', "https://www.duolingo.com/login", data=data,
                       cookies=session.cookies)
prepped = req.prepare()
returned = session.send(prepped)
I get:
ConnectionError: HTTPSConnectionPool(host='www.duolingo.com', port=443): Max retries exceeded with url: /login (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x000000000E6948D0>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond',))
I am not sure how to give details of my internet connection.
I'm at work and I know we have a corporate proxy.
We have Windows Firewall turned on, but I have checked that python and pythonw are ticked in the "Domain" column of the control panel for allowing a program through the firewall.
When I ping google.co.uk from a command shell, all four requests time out, but I can access it from a browser.
In the Internet Options control panel, I click on the Connections tab and then LAN settings, and I have "Automatically detect settings" turned on, and also "Use a proxy server for your LAN"; "Address" is "localhost" and "Port" is 3128. This is cntlm. I set it up once to download Python packages, and it appears to still be active, because I have just managed to update one of my packages.
I don't even need a direct answer to my question; at this point I'll just settle for some clarity on what is actually going on behind the scenes. Any help much appreciated!
For the first case above (urllib module), I solved it by inserting the following lines before the data = urlopen(...).read() line:
import urllib.request  # needed for ProxyHandler/build_opener below

proxies = {"http": "http://localhost:3128",
           "https": "http://localhost:3128"}
proxy = urllib.request.ProxyHandler(proxies)
opener = urllib.request.build_opener(proxy)
urllib.request.install_opener(opener)
For the second case (requests module), everything was the same except the last line:
proxies = {"http": "http://localhost:3128",
           "https": "http://localhost:3128"}
returned = session.send(prepped, proxies=proxies)
Hope this note helps others who come across this page.
Let's consider 'mywebsite' as my website, 'uname' as the username, and 'pwd' as the password.
Now the scenario: I have a system that was previously working, but now when I try to connect to my Magento from Odoo it returns this error:
<ProtocolError for mywebsite/index.php/api/xmlrpc/: 301 Moved Permanently>
However, this particular URL, i.e. https://mywebsite.com, is accessible if you hit it in a browser, and it also returns a true result when hit with Postman.
I tried to hit the same URL using a Python script,
import xmlrpclib
server = xmlrpclib.ServerProxy('https://mywebsite.com')
session = server.login('uname','pwd')
multiple times, over multiple environments.
When I execute this script from the same environment my server is hosted on, I get the same error:
Error 301 Moved Permanently
When I hit the same script from my local environment I get:
SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
which I assumed arises from using https, so when I change the URL to http I get the same error back, which is:
xmlrpclib.ProtocolError: <ProtocolError for mywebsite.com/: 301 Moved Permanently>
Hitting the above script from a staging environment gets me the same result as my local environment.
Also, when I change the script to use the website's IP along with a port, I get:
socket.error: [Errno 110] Connection timed out
Then I tried changing the script and running it with this code:
import urllib
print urllib.urlopen("http://mywebsite.com/").getcode()
When I run this code from my local machine I get:
Error 403 Forbidden Request
Hitting this new code with the website's IP and a port gets me:
IOError: [Errno socket error] [Errno 110] Connection timed out
When I hit this code without mentioning the port I get:
SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
Now, hitting this code from the live environment using mywebsite.com gets me:
Error 403
Using the IP without the port:
[SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:590)
With the IP and port:
IOError: [Errno socket error] [Errno 110] Connection timed out
Any pointers or suggestions would be appreciated.
If there are any silly mistakes, please excuse them, as I am an amateur in Odoo/Python.
Also, if you have any other way to check whether a URL is hittable, please do let me know.
Well, it seems that when I entered the mywebsite URL in Odoo, it was being appended in the backend with /index.php/api/xmlrpc, which worked fine for some time.
But now, due to some changes, it doesn't accept index.php anymore, as it is automatically routed (maybe).
Anyway, I solved the error by changing the appended string to /api/xmlrpc.
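Building the endpoint explicitly makes that kind of silent appending visible. A sketch (Python 3 shown, where xmlrpclib has become xmlrpc.client; the domain is the placeholder from above, and nothing touches the network until a method is actually called):

```python
from urllib.parse import urljoin
from xmlrpc.client import ServerProxy

BASE = "https://mywebsite.com"                # placeholder domain from the question
endpoint = urljoin(BASE + "/", "api/xmlrpc")  # the path that replaced /index.php/api/xmlrpc
server = ServerProxy(endpoint)                # no network traffic happens at construction
# session = server.login('uname', 'pwd')      # uncomment to actually log in
```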
I have developed a desktop client using PyQt4; it connects to my web service with the requests lib. As you know, requests is probably one of the most useful HTTP clients, so I thought there should be no problem. My desktop client worked fine until something strange happened.
I use the following code to send request to my server.
response = requests.get(url, headers = self.getHeaders(), timeout=600, proxies = {}, verify = False)
where the header only includes an auth token:
def getHeaders(self, additional=None):
    headers = {
        'Auth-Token': HttpBasicClient.UserAuthToken,
    }
    if additional is not None:
        headers.update(additional)
    return headers
I cannot connect to my web service; every HTTP request raises the same error: "'Cannot connect to proxy.', error(10061, '')". For example:
GET Url: http:// api.fangcloud.com/api/v1/user/timestamp
HTTPSConnectionPool(host='api.fangcloud.com', port=443): Max retries exceeded with url: /api/v1/user/timestamp (Caused by ProxyError('Cannot connect to proxy.', error(10061, '')))
This API does nothing but return the timestamp of my server. When I copy the URL into Chrome on the same machine, in the same environment, it returns the correct response, but my desktop client only gets the error. Is there anything wrong with the requests lib?
I googled this problem of connection error 10061 ("No connection could be made because the target machine actively refused it"). This may be caused by a TCP connection rejection by the web server.
The client sends a SYN packet to the server targeting the port (80 for HTTP). A server that is running a service on port 80 will respond with a SYN ACK, but if it is not, it will respond with a RST ACK. Your client reaches the server, but not the intended service. This is one way a server could “actively refuse” a connection attempt.
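That refused-vs-unreachable distinction can be observed directly with a plain socket (Python 3 sketch; hosts and ports are illustrative):

```python
import socket

def probe(host, port, timeout=3.0):
    """Return 'open', 'refused', or 'timeout' for host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "open"         # SYN/ACK: a service answered on that port
    except ConnectionRefusedError:
        return "refused"          # RST: the machine is reachable but the port is closed
    except socket.timeout:
        return "timeout"          # no reply at all (filtered or unreachable)
```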
But why? My client worked fine before, and Chrome still works. I use no proxy on my machine. Is there anything I've missed?
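One possibility worth ruling out (an assumption, not a confirmed diagnosis): requests reads proxy settings from environment variables, and on Windows from the system registry, even when a call passes proxies={} — per-request proxies are merged with the environment's, not substituted for them. Disabling that lookup on a session bypasses any stale system proxy:

```python
import requests

session = requests.Session()
session.trust_env = False  # ignore environment/registry proxy settings entirely
# response = session.get(url, timeout=600, verify=False)
```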
I notice there is a white space in the URL ("http:// api.fangcloud.com"); is that correct?
I tested it in my IPython with requests, and the response was:
{
"timestamp": 1472760770,
"success": true
}
For HTTP and HTTPS.