Flask+requests API ConnectionRefusedError: [Errno 111] Connection refused - python

I'm trying to run a simple Flask API, but it's not working as expected. I'm not very experienced in Python, so finding the error and solving it has been very challenging. I would really appreciate it if someone could help.
The system's settings are:
Ubuntu 18.04
Conda environment with Python 3.7
And these are the requirements:
$ pip freeze
ansimarkup==1.4.0
asn1crypto==0.24.0
better-exceptions-fork==0.2.1.post6
certifi==2019.3.9
cffi==1.12.3
chardet==3.0.4
Click==7.0
colorama==0.4.1
cryptography==2.7
Flask==1.0.3
idna==2.8
itsdangerous==1.1.0
Jinja2==2.10.1
loguru==0.2.5
MarkupSafe==1.1.1
pycparser==2.19
Pygments==2.4.2
pyOpenSSL==19.0.0
PySocks==1.7.0
requests==2.22.0
six==1.12.0
urllib3==1.24.2
Werkzeug==0.15.4
My project structure is like this:
├── statsapi
│   ├── data_store.py
├── app.py
├── client.py
├── requirements.txt
Here is the app.py code:
#!/usr/bin/env python
from flask import Flask, request, jsonify
from loguru import logger
from statsapi import data_store
app = Flask(__name__)
# Creating an endpoint
@app.route("/data", methods=["POST"])
def save_data():
    # log this action
    logger.info(f"Saving data...")
    # parse the request body as JSON
    content = request.get_json()
    # store just the "data" field in the storage module,
    # which returns the uuid assigned to the data
    uuid = data_store.save(content["data"])
    # log the last action
    logger.info(f"Data saved with UUID `{uuid}` successfully")
    # define the information to be returned
    return jsonify({"status": "success",
                    "message": "data saved successfully",
                    "uuid": uuid})

if __name__ == "__main__":
    app.run(host="0.0.0.0", debug=True)
The data_store.py code:
#!/usr/bin/env python
from uuid import uuid4
# Create a dictionary to keep things in memory
_in_memory_storage = dict()
# Save received data in memory, assigning it a uuid
def save(data):
    data_uuid = uuid4()
    _in_memory_storage[data_uuid] = data
    return data_uuid
And the client.py code:
#!/usr/bin/env python
import requests
def send(data):
    response = requests.post("http://localhost:5000/data", json={"data": data})
    print(response.json())

def main():
    send([1, 2, 3, 4])

if __name__ == "__main__":
    main()
client.py should send some data to the API, but when it is called it returns this long error message:
$ python client.py
Traceback (most recent call last):
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/connection.py", line 159, in _new_conn
(self._dns_host, self.port), self.timeout, **extra_kw)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/util/connection.py", line 80, in create_connection
raise err
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/util/connection.py", line 70, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/connectionpool.py", line 354, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/connection.py", line 181, in connect
conn = self._new_conn()
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/connection.py", line 168, in _new_conn
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fd0c7a26a58>: Failed to establish a new connection: [Errno 111] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/connectionpool.py", line 638, in urlopen
_stacktrace=sys.exc_info()[2])
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/urllib3/util/retry.py", line 399, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=5000): Max retries exceeded with url: /data (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fd0c7a26a58>: Failed to establish a new connection: [Errno 111] Connection refused'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "client.py", line 17, in <module>
main()
File "client.py", line 13, in main
send([1, 2, 3, 4])
File "client.py", line 7, in send
response = requests.post("http://localhost:5000/data", json={"data": data})
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/requests/api.py", line 116, in post
return request('post', url, data=data, json=json, **kwargs)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/home/bruno/anaconda3/envs/statsapi/lib/python3.7/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=5000): Max retries exceeded with url: /data (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fd0c7a26a58>: Failed to establish a new connection: [Errno 111] Connection refused'))
As I said, I'm far from being a Python expert, so I will be very thankful for any help.

This might not relate exactly to the question, but it might help someone facing a similar issue.
I had a similar issue, but in my case it was because of Docker containers. When container #1 connected to container #2 using the requests API, it failed because the two containers were not on the same network.
So I modified my docker-compose file to
networks:
  default:
    external: true
    name: <network_name>
Make sure to create the Docker network in advance, and change the host to the Docker service name of container #2 when connecting with requests.
So use http://<docker_service_name>:8000/api instead of http://127.0.0.1:8000/api
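For illustration, a minimal sketch of what the client call might look like once both containers share a network (the service name api and the /api path are placeholders, not values from the question):

import requests

# "api" is the docker-compose service name of container #2 (placeholder)
API_URL = "http://api:8000/api"

def send(data):
    # the service name resolves through Docker's internal DNS,
    # so both containers must be attached to the same network
    response = requests.post(API_URL, json={"data": data})
    print(response.json())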

I think the problem might be that your port is already being served by another app; try changing the Flask port.
I used your code locally and it ran successfully.
Try something like this:
main.py
import data_store
from flask import Flask, request, jsonify
app = Flask(__name__)
# Creating an endpoint
@app.route("/data", methods=["POST"])
def save_data():
    # log this action
    print(f"Saving data...")
    # parse the request body as JSON
    content = request.get_json()
    # store just the "data" field in the storage module,
    # which returns the uuid assigned to the data
    uuid = data_store.save(content["data"])
    # log the last action
    print(f"Data saved with UUID `{uuid}` successfully")
    # define the information to be returned
    return jsonify({"status": "success",
                    "message": "data saved successfully",
                    "uuid": uuid})

if __name__ == "__main__":
    app.run(port=8000)
data_store.py
from uuid import uuid4
# Create a dictionary to keep things in memory
_in_memory_storage = dict()
# Save received data in memory, assigning it a uuid
def save(data):
    data_uuid = uuid4()
    _in_memory_storage[data_uuid] = data
    print(f"Cached value => {_in_memory_storage}")
    return data_uuid
client.py
import requests
def send(data):
    response = requests.post("http://localhost:8000/data", json={"data": data})
    print(response.json())

def main():
    send([1, 2, 3, 4])

if __name__ == "__main__":
    main()
and when I test it, it succeeds like this:
→ python client.py
{'message': 'data saved successfully', 'status': 'success', 'uuid': '050b02c2-e67e-44f7-b6c9-450106d8b40e'}
Flask log:
→ python main.py
* Serving Flask app "main" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
* Running on http://127.0.0.1:8000/ (Press CTRL+C to quit)
Saving data...
Cached value => {UUID('050b02c2-e67e-44f7-b6c9-450106d8b40e'): [1, 2, 3, 4]}
Data saved with UUID `050b02c2-e67e-44f7-b6c9-450106d8b40e` successfully
127.0.0.1 - - [06/Jun/2021 12:51:35] "POST /data HTTP/1.1" 200 -
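If you suspect the original port really is taken by another app, a quick way to check from Python is a plain socket probe before starting Flask; this is just a sketch using the standard library, not part of the original answer:

import socket

def port_in_use(port, host="127.0.0.1"):
    # connect_ex returns 0 if something is already listening on host:port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        return sock.connect_ex((host, port)) == 0

if __name__ == "__main__":
    for port in (5000, 8000):
        print(f"port {port} in use: {port_in_use(port)}")

If nothing is listening on the port at all, the client gets exactly the Connection refused error from the question, so the server must be started (and kept running) before client.py is executed.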

Related

TimeoutError: Large amount of data in requests python

I am trying to make a script that scrapes a presentation from a SlideShare link and downloads it as a PDF.
The script works fine as long as the total number of slides is under 20. Is there any alternative to requests in Python that can do the job?
Here is the script:
import requests
from bs4 import BeautifulSoup
from PIL import Image
import io
URL_LESS = "https://www.slideshare.net/angelucmex/global-warming-2373190?qid=8f04572c-48df-4f53-b2b0-0eb71021931c&v=&b=&from_search=1"
URL="https://www.slideshare.net/tusharpanda88/python-basics-59573634?qid=03cb80ee-36f0-4241-a516-454ad64808a8&v=&b=&from_search=5"
r = requests.get(URL_LESS)
soup = BeautifulSoup(r.content, "html5lib")
imgs = soup.find_all('img', class_="slide-image")
imgSRC = [x.get("srcset").split(',')[0].strip().split(' ')[0].split('?')[0] for x in imgs]
imagesJPG = []
for img in imgSRC:
    im = requests.get(img)
    f = io.BytesIO(im.content)
    imgJPG = Image.open(f)
    imagesJPG.append(imgJPG)
imagesJPG[0].save(f"{soup.title.string}.pdf",save_all=True, append_images=imagesJPG[1:])
Try changing URL_LESS to URL, you will get the idea.
Here is the traceback
Traceback (most recent call last):
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\connectionpool.py", line 386, in _make_request
self._validate_conn(conn)
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\connectionpool.py", line 1040, in _validate_conn
conn.connect()
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\connection.py", line 358, in connect
conn = self._new_conn()
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\connection.py", line 186, in _new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x00000259643FF820>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\requests\adapters.py", line 440, in send
resp = conn.urlopen(
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\connectionpool.py", line 785, in urlopen
retries = retries.increment(
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='image.slidesharecdn.com', port=443): Max retries exceeded with url: /pythonbasics-160315100530/85/python-basics-8-320.jpg (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x00000259643FF820>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "d:\Work\py\scrapingScripts\slideshare\main.py", line 16, in <module>
im = requests.get(img)
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\requests\api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\requests\sessions.py", line 529, in request
resp = self.send(prep, **send_kwargs)
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\requests\sessions.py", line 645, in send
r = adapter.send(request, **kwargs)
File "D:\Work\py\scrapingScripts\tkinter\env\lib\site-packages\requests\adapters.py", line 519, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='image.slidesharecdn.com', port=443): Max retries exceeded with url: /pythonbasics-160315100530/85/python-basics-8-320.jpg (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x00000259643FF820>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did
not properly respond after a period of time, or established connection failed because connected host has failed to respond'))
The script worked perfectly for me both when using URL and URL_LESS, so your internet might be the culprit here.
My guesses are:
You have a slow or inconsistent internet connection.
SlideShare is blacklisting your IP or user agent, perhaps as DDoS protection (unlikely).
You're using IPv6, which has been the culprit in this kind of case for me; try switching your network to IPv4 only.
And when it comes to requests, I have personally used it to scrape a fairly large amount of data over a fairly long time, so I can say it's an amazing library to use.
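If the connection really is flaky, requests itself can be made more tolerant by adding retries and an explicit timeout. A rough sketch (the retry counts and timeout are arbitrary values to adjust, not anything from the original script):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# retry failed connections a few times with an increasing back-off
retries = Retry(total=5, backoff_factor=1, status_forcelist=[429, 500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retries))

# in the original download loop, the plain requests.get(img) call could become:
# im = session.get(img, timeout=30)

The timeout keeps a single dead connection from hanging the whole download, and the mounted adapter retries transient connection failures instead of failing on the first one.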

openstack python api script getting SSL error

I have built a new OpenStack deployment based on the Queens release and am now trying to learn the Python SDK API, but I am stuck here and not sure what is wrong.
My script:
from keystoneclient.auth.identity import v3
from keystoneclient import session
from keystoneclient.v3 import client
auth_url = 'http://172.28.0.9:5000/v3'
username = 'dev'
user_domain_name = 'Default'
project_name = 'dev'
project_domain_name = 'Default'
password = 'Password123'
auth = v3.Password(auth_url=auth_url,
                   username=username,
                   password=password,
                   project_id='344506541fd94f068d25990af3eff4b8',
                   user_domain_name=user_domain_name)
sess = session.Session(auth=auth)
keystone = client.Client(session=sess)
#keystone.projects.list()
from novaclient import client
nova = client.Client(2, session=keystone.session)
nova.flavors.list()
Getting this error:
[dev#openstack ~]$ python /tmp/myscript.py
/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py:200: UserWarning: Using keystoneclient sessions has been deprecated. Please update your software to use keystoneauth1.
warnings.warn('Using keystoneclient sessions has been deprecated. '
Traceback (most recent call last):
File "/tmp/o.py", line 22, in <module>
nova.flavors.list()
File "/usr/lib/python2.7/site-packages/novaclient/v2/flavors.py", line 145, in list
return self._list("/flavors%s" % detail, "flavors", filters=qparams)
File "/usr/lib/python2.7/site-packages/novaclient/base.py", line 257, in _list
resp, body = self.api.client.get(url)
File "/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py", line 328, in get
return self.request(url, 'GET', **kwargs)
File "/usr/lib/python2.7/site-packages/novaclient/client.py", line 77, in request
**kwargs)
File "/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py", line 487, in request
resp = super(LegacyJsonAdapter, self).request(*args, **kwargs)
File "/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py", line 213, in request
return self.session.request(url, method, **kwargs)
File "/usr/lib/python2.7/site-packages/keystoneclient/session.py", line 428, in request
resp = send(**kwargs)
File "/usr/lib/python2.7/site-packages/keystoneclient/session.py", line 466, in _send_request
raise exceptions.SSLError(msg)
keystoneauth1.exceptions.connection.SSLError: SSL exception connecting to https://10.30.2.9:8774/v2.1/flavors/detail: HTTPSConnectionPool(host='10.30.2.9', port=8774): Max retries exceeded with url: /v2.1/flavors/detail (Caused by SSLError(SSLError("bad handshake: SysCallError(104, 'ECONNRESET')",),))
10.30.2.9 is my F5 load-balancer, and I did configure port 8774 to route correctly to the nova API server.
My Horizon GUI works fine, and all OpenStack commands also work fine without errors.
verify=False helps:
keystone = client.Client(session=sess, verify=False)
You need to include the certificate for SSL.
sess = session.Session(auth=auth, verify=path_to_certificate)
keystone = keystoneclient.Client(session=sess)
Perhaps there is an option to disable SSL.
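Putting the two suggestions together, the session setup might look roughly like this; it reuses the auth object from the script above, and the certificate path is a placeholder:

from keystoneclient import session

# verify the load-balancer's certificate against your CA bundle (path is a placeholder)
sess = session.Session(auth=auth, verify='/path/to/ca-bundle.pem')

# or, for testing only, skip certificate verification entirely
# sess = session.Session(auth=auth, verify=False)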

connect to docker hosted on remote server

How do I connect to a remote Docker host using Python?
>>> from docker import Client
>>> cli = Client(base_url='tcp://52.90.216.176:2375')
>>>
>>> cli.containers()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/site-packages/docker/api/container.py", line 69, in containers
res = self._result(self._get(u, params=params), True)
File "/usr/local/lib/python2.7/site-packages/docker/utils/decorators.py", line 47, in inner
return f(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/docker/client.py", line 112, in _get
return self.get(url, **self._set_request_timeout(kwargs))
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 480, in get
return self.request('GET', url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 437, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='52.90.216.176', port=2375): Max retries exceeded with url: /v1.21/containers/json?all=0&limit=-1&trunc_cmd=0&size=0 (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fd87d836750>: Failed to establish a new connection: [Errno 111] Connection refused',))
If I log in to 52.90.216.176 and use the following:
>>> cli = Client(base_url='unix://var/run/docker.sock')
this works. But how do I connect to docker running on another server?
It sounds like you're using docker-py.
Also, it sounds like maybe you're not familiar with TLS, so please read the documentation for using TLS with docker-py. You may need to download your TLS files and copy them locally next to the docker-py client, as they are used to prove that you are authorized to connect to the Docker daemon.
I hope your remote Docker daemon is not exposed to the world.
If it is not running TLS (exposed to the world):
client = docker.Client(base_url='<https_url>', tls=False)
If it is secured with TLS (not exposed to the world):
client = docker.Client(base_url='<https_url>', tls=True)
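For the TLS case, a rough sketch of what that might look like with the docker-py version used in the question; the certificate paths are placeholders for the files generated when the daemon was secured, and 2376 is the conventional TLS port:

import docker
from docker.tls import TLSConfig

# placeholder paths for the client certificate, key and CA used to secure the daemon
tls_config = TLSConfig(
    client_cert=('/path/to/cert.pem', '/path/to/key.pem'),
    ca_cert='/path/to/ca.pem',
    verify=True,
)
cli = docker.Client(base_url='https://52.90.216.176:2376', tls=tls_config)
print(cli.containers())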
This is not an answer, but I need your feedback.
The error message is Connection refused, so can you run the command
telnet 52.90.216.176 2375
to confirm there is no firewall issue? Sometimes the port is 2376.
Add the tcp option to the Docker sysconfig as shown here:
vi /etc/sysconfig/docker
OPTIONS="--host=tcp://0.0.0.0:2375"
After restarting Docker, I could connect to the remote Docker server using Python.

rollbar not sending exceptions once deployed in GAE

I have a project in GAE using Python, in which I set up Rollbar to track errors and exceptions. Error reporting worked on localhost, but once I deployed it, my app stopped sending the error logs/info that I intentionally created.
I have the rollbar set up as below:
import rollbar
rollbar.init('xxxxxxxxxxxxxxxxxxxx', 'production')
..
..
# when an error occurs
try:
    ....
except:
    rollbar.report_exc_info()
UPDATE(11/20):
I'm able to get Rollbar to work now, but I had to use an older version of the requests library (2.3.0). Newer versions such as 2.7.0 or 2.8.1 give me a connection error; does anyone know why, or how to get around it?
ERROR 2015-11-20 17:44:03,543 __init__.py:1158] Exception while posting item ConnectionError(ProtocolError('Connection aborted.', error(13, 'Permission denied')),)
Traceback (most recent call last):
File "/Path/to/my/project/rollbar/__init__.py", line 1156, in _send_payload
_post_api('item/', payload, access_token=access_token)
File "/Path/to/my/project/rollbar/__init__.py", line 1197, in _post_api
verify=SETTINGS.get('verify_https', True))
File "/Path/to/my/project/requests/api.py", line 109, in post
return request('post', url, data=data, json=json, **kwargs)
File "/Path/to/my/project/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/Path/to/my/project/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/Path/to/my/project/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/Path/to/my/project/requests/adapters.py", line 412, in send
raise ConnectionError(err, request=request)
ConnectionError: ('Connection aborted.', error(13, 'Permission denied'))
I ran into the same issue with Rollbar and GAE and the underlying cause was that I was running Google App Engine locally as a normal user that didn't have permissions to open a socket. When I deployed it to Google App Engine, it ran fine.

Name or service not known error in EasyWebDav for Python

I'm trying to set up a WebDAV connection using easywebdav in Python. (Using 2.7.8 for now.)
import csv, easywebdav
webdav = easywebdav.connect('https://sakai.rutgers.edu/dav/restoftheurl', username='', password='')
print webdav.ls()
Though when I run this, I get the following error message. My guess is that it possibly has something to do with the URL using HTTPS?
Traceback (most recent call last):
File "/home/willkara/Development/SakaiStuff/WorkProjects/sakai-manager/file.py", line 4, in <module>
print webdav.ls()
File "build/bdist.linux-x86_64/egg/easywebdav/client.py", line 176, in ls
File "build/bdist.linux-x86_64/egg/easywebdav/client.py", line 97, in _send
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 375, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='https', port=80): Max retries exceeded with url: //sakai.rutgers.edu/dav/url:80/. (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)
[Finished in 0.1s with exit code 1]
I find it strange that you combine HTTPS protocol and port 80. HTTPS uses port 443.
Though the error message "Name or service not known" would rather indicate that the hostname sakai.rutgers.edu is not recognized on your system. Try to ping the host.
I noticed that you shouldn't have http:// or https:// at the beginning of your address, only the host name. You select the protocol with protocol='https'. Also, I couldn't get it to work if I added the path to the URL; I had to pass the path as an argument to the operations, like webdav.ls('/dav/restoftheurl') or webdav.cd('/dav/restoftheurl').
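Combining those observations, the connect call might look roughly like this; the host and path come from the question, while the credentials are placeholders:

import easywebdav

# host name only; the protocol is passed separately, and the credentials are placeholders
webdav = easywebdav.connect('sakai.rutgers.edu',
                            protocol='https',
                            username='your_username',
                            password='your_password')

# pass the path to the operation instead of baking it into the URL
print(webdav.ls('/dav/restoftheurl'))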
