I have a strange problem with a Python script.
import requests
argumente = {'NRBUS': 'LINIA_1', 'COORDONATE': '46.195323,21.306300'}
r=requests.get("http://www.roroid.ro/php/GPS_cloud/GPS_cloud.php",params=argumente)
print r.url
print r.text
On my PC it works without any problems, but on my Raspberry Pi, after some time, I end up with:
Traceback (most recent call last):
File "testt.py", line 4, in <module>
r=requests.get("http://www.roroid.ro/php/GPS_cloud/GPS_cloud.php",params=argumente)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 375, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='www.roroid.ro', port=80): Max retries exceeded with url: /php/GPS_cloud/GPS_cloud.php?COORDONATE=46.195323%2C21.306300&NRBUS=LINIA_1 (Caused by <class 'socket.error'>: [Errno 110] Connection timed out)
Thanks for any help.
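On a flaky network (a Raspberry Pi on Wi-Fi, for instance) a "Max retries exceeded ... Connection timed out" error is often transient. One minimal workaround, sketched below for Python 3 (the snippet above is Python 2), is to set an explicit timeout and retry with a backoff; the function name and defaults are illustrative, not part of requests:

```python
import time
import requests

def get_with_retries(url, params=None, attempts=3, timeout=10):
    """GET with an explicit timeout, retrying on connection errors."""
    for attempt in range(attempts):
        try:
            return requests.get(url, params=params, timeout=timeout)
        except requests.exceptions.ConnectionError:
            if attempt == attempts - 1:
                raise                  # out of attempts: re-raise the error
            time.sleep(2 ** attempt)   # back off: 1s, 2s, 4s, ...

# argumente = {'NRBUS': 'LINIA_1', 'COORDONATE': '46.195323,21.306300'}
# r = get_with_retries("http://www.roroid.ro/php/GPS_cloud/GPS_cloud.php",
#                      params=argumente)
```

If the error persists across many retries, the problem is more likely routing or DNS on the Pi's network than the script itself.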
ua = UserAgent()
headers={
'user-agent':str(ua.random),
'Connection':'close'
}
r = requests.get(url,headers=headers,timeout=5)
I want to scrape some information from a website, but requests.get() raises an exception occasionally (sometimes it succeeds and sometimes it doesn't). I've tried many methods: a random User-Agent, a timeout, time.sleep, a maximum number of tries, but to no avail.
Is there something wrong with my code, or is it a fault of the website, or some anti-scraping system it runs?
Here is the full exception:
Traceback (most recent call last):
File "d:\AAA临时文档\抢课app\爬虫\run2.py", line 7, in <module>
r=requests.get(url=url,headers=headers,timeout=20)
File "C:\Users\86153\AppData\Local\Programs\Python\Python38-32\lib\site-packages\requests\api.py", line 76, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\86153\AppData\Local\Programs\Python\Python38-32\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\86153\AppData\Local\Programs\Python\Python38-32\lib\site-packages\requests\sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\86153\AppData\Local\Programs\Python\Python38-32\lib\site-packages\requests\sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "C:\Users\86153\AppData\Local\Programs\Python\Python38-32\lib\site-packages\requests\adapters.py", line 504, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='www.dy2018.com', port=443): Max retries exceeded with url: /i/103887.html (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x04046A18>, 'Connection to www.dy2018.com timed out. (connect timeout=20)'))
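Rather than hand-rolling a retry loop, requests can be configured to retry at the transport level through urllib3's Retry class, which covers failed connection attempts as well as selected HTTP status codes. A sketch, assuming the site is simply dropping some connections (if this is an anti-scraping measure, retries alone may not help):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(
    total=5,           # up to 5 retries overall
    connect=5,         # retry failed connection attempts
    backoff_factor=1,  # exponential backoff between tries
    status_forcelist=(500, 502, 503, 504),  # also retry these responses
)
adapter = HTTPAdapter(max_retries=retry)
session.mount("http://", adapter)
session.mount("https://", adapter)

# r = session.get(url, headers=headers, timeout=20)
```

Using a Session also reuses the underlying TCP connection across requests, which reduces the number of fresh connects that can time out.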
When I run my Python script to get data from the REST API of an application, I get the following error. I have installed pip and the requests package for Python. Here is my query:
./simpleRunQuery.py <args> <args>
Traceback (most recent call last):
File "./simpleRunQuery.py", line 25, in <module>
res = requests.post(url, auth=(args.username, args.password), data=jsonRequest, headers=headers)
File "/Library/Python/2.7/site-packages/requests/api.py", line 112, in post
return request('post', url, data=data, json=json, **kwargs)
File "/Library/Python/2.7/site-packages/requests/api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "/Library/Python/2.7/site-packages/requests/sessions.py", line 508, in request
resp = self.send(prep, **send_kwargs)
File "/Library/Python/2.7/site-packages/requests/sessions.py", line 618, in send
r = adapter.send(request, **kwargs)
File "/Library/Python/2.7/site-packages/requests/adapters.py", line 490, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(54, 'Connection reset by peer'))
I can attach the script, but it doesn't have lines 112 or 58 or any of these; it's a simple REST API script that queries and posts the results here. Any pointers?
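The line numbers in the traceback refer to files inside the requests package itself (api.py, sessions.py, adapters.py); only the first frame, simpleRunQuery.py line 25, is your code. Error 54 is ECONNRESET on macOS: the server (or a proxy in between) closed the socket mid-request. One defensive sketch, with an illustrative wrapper function that is not part of requests, turns that into a clearer failure:

```python
import requests

def run_query(url, username, password, payload, headers=None):
    """POST the query and turn a connection reset into a clear error."""
    try:
        res = requests.post(url, auth=(username, password),
                            data=payload, headers=headers, timeout=30)
        res.raise_for_status()
        return res
    except requests.exceptions.ConnectionError as err:
        # errno 54 (ECONNRESET): the server, or a proxy/firewall between
        # you and it, closed the connection mid-request.
        raise RuntimeError("server reset the connection: %s" % err)
```

If the reset is reproducible, comparing against a plain curl of the same endpoint helps separate a server-side rejection from a problem in the script.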
I am trying to send a web request to my PHP code from Linux running on a Raspberry Pi. I am using PuTTY to access the OS. When I run the code, I get gaierror -2. Below is my code:
import requests
values = {'firstname': 'abc', 'lastname': 'xyz'}
r = requests.post('http://rts.msmq.site/security.php', data=values)
I am unable to figure out the problem. Can someone let me know what could be the issue? I am using linux.
Error message:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 94, in post
return request('post', url, data=data, json=json, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 569, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 407, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))
gaierror stands for getaddrinfo error, which means the resolver could not look up the hostname rts.msmq.site (note that name resolution works on the bare hostname, not the full URL). Check that you can resolve it with nslookup rts.msmq.site or dig rts.msmq.site, and that ping rts.msmq.site works.
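You can run the same check from Python: socket.getaddrinfo is what ultimately raises gaierror, so resolving the bare hostname directly shows whether DNS is the problem. A small sketch (the helper function is illustrative):

```python
import socket

def can_resolve(hostname):
    """Return True if DNS can resolve the hostname, False on gaierror."""
    try:
        socket.getaddrinfo(hostname, 80)
        return True
    except socket.gaierror:
        return False

# Pass the bare hostname, never the URL:
# print(can_resolve("rts.msmq.site"))  # False here would explain the error
```

If this returns False on the Pi but the site resolves elsewhere, check /etc/resolv.conf on the Pi for a working nameserver.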
I wrote a Python script to fetch definitions and URLs for a list of items (a long list of no fewer than 3,000 items).
The script was working fine and I used it several times, but suddenly I started getting the following error:
('Connection aborted.', error(54, 'Connection reset by peer'))
Here is the full traceback:
Traceback (most recent call last):
File "Wiki.py", line 41, in <module>
page = wikipedia.page(item)
File "/Library/Python/2.7/site-packages/wikipedia/wikipedia.py", line 270, in page
results, suggestion = search(title, results=1, suggestion=True)
File "/Library/Python/2.7/site-packages/wikipedia/util.py", line 28, in __call__
ret = self._cache[key] = self.fn(*args, **kwargs)
File "/Library/Python/2.7/site-packages/wikipedia/wikipedia.py", line 103, in search
raw_results = _wiki_request(search_params)
File "/Library/Python/2.7/site-packages/wikipedia/wikipedia.py", line 737, in _wiki_request
r = requests.get(API_URL, params=params, headers=headers)
File "/Library/Python/2.7/site-packages/requests/api.py", line 72, in get
return request('get', url, params=params, **kwargs)
File "/Library/Python/2.7/site-packages/requests/api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "/Library/Python/2.7/site-packages/requests/sessions.py", line 502, in request
resp = self.send(prep, **send_kwargs)
File "/Library/Python/2.7/site-packages/requests/sessions.py", line 612, in send
r = adapter.send(request, **kwargs)
File "/Library/Python/2.7/site-packages/requests/adapters.py", line 490, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(54, 'Connection reset by peer'))
It seems that installing the regular requests library does not pull in the packages needed for HTTPS connections. Install them with:
pip install requests[security]
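Separately from the HTTPS extras, with thousands of sequential lookups the Wikipedia API may simply be resetting connections under the burst. A hedged mitigation is to pace the loop and retry a reset item a few times; with_retries below is an illustrative helper, not part of the wikipedia package:

```python
import time
import requests

def with_retries(fn, attempts=3, pause=1.0):
    """Call fn(), retrying with exponential backoff on connection errors."""
    for attempt in range(attempts):
        try:
            return fn()
        except requests.exceptions.ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(pause * (2 ** attempt))

# for item in items:
#     page = with_retries(lambda: wikipedia.page(item))
#     time.sleep(0.5)  # stay polite between thousands of requests
```

The half-second pause per item roughly doubles the total runtime but makes a throttling-induced reset much less likely.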
I need to make a web request, and I'm using the Python requests module.
I have a service running on a machine, e.g. 55.84.201.228. When I open it in the browser it works fine and I'm able to view the webpage, but when I use requests.get it does not work; it fails with a socket error:
>>> import requests
>>> r = requests.get('https://55.84.201.228')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 383, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 486, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 378, in send
raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='55.84.201.228', port=443): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 110] Connection timed out)
How can I fix this issue?
>>>r = requests.get('https://www.cnn.com')
This works fine.
I am making a wild guess here, since I can't access your machine, but from experience: since you are using HTTPS on what I assume is a test server, try
r = requests.get('https://55.84.201.228', verify=False)
According to the documentation, certificate verification is on by default.