Paramiko Upload Errors [Errno 2] [duplicate]

I'm calling the Paramiko sftp_client.put(localpath, remotepath) method.
It throws the [Errno 2] File not found error below.
01/07/2020 01:12:03 PM - ERROR - [Errno 2] File not found
Traceback (most recent call last):
File "file_transfer\TransferFiles.py", line 123, in main
File "paramiko\sftp_client.py", line 727, in put
File "paramiko\sftp_client.py", line 689, in putfo
File "paramiko\sftp_client.py", line 460, in stat
File "paramiko\sftp_client.py", line 780, in _request
File "paramiko\sftp_client.py", line 832, in _read_response
File "paramiko\sftp_client.py", line 861, in _convert_status
Having tried many of the other recommended fixes, I found that the error is caused by the server having an automatic trigger that moves the file to another location immediately after it is uploaded.
I've not seen another post about this issue and wanted to know if anyone else has solved it, as the SFTP server is owned by a third party and I don't want to ask them to change the trigger behaviour.
The file actually uploads correctly, so I could catch the exception and ignore the error, but I'd prefer to handle it properly if possible.

Paramiko by default verifies the size of the uploaded file after the upload.
If the file is moved away immediately after upload, the check fails.
To avoid the check, set the confirm parameter of SFTPClient.put to False.
sftp_client.put(localpath, remotepath, confirm=False)
I believe the check is redundant anyway, see
How to perform checksums during a SFTP file transfer for data integrity?
For a similar question about pysftp (which is a wrapper around Paramiko), see:
Python pysftp.put raises "No such file" exception although file is uploaded
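For context, a minimal sketch of the workaround in a full session; the host, credentials and paths below are placeholders, not taken from the question:
import paramiko

# Placeholder connection details and paths
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # verify host keys properly in real code
ssh.connect('sftp.example.com', username='user', password='password')

sftp_client = ssh.open_sftp()
try:
    # confirm=False skips the post-upload stat() that fails when the
    # server-side trigger has already moved the file away
    sftp_client.put('report.csv', '/upload/report.csv', confirm=False)
finally:
    sftp_client.close()
    ssh.close()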

I also had this issue of the file automatically getting moved before Paramiko could stat the uploaded file and compare the local and remote file sizes.
@Martin_Prikryl's solution works fine for removing the error: pass confirm=False when using sftp.put or sftp.putfo.
If you still want the check to run, as you mention in the post, to verify that the file has been uploaded fully, you can do something along these lines. For this to work you need to know where the file is moved to and have permission to read it there.
import os

# Upload without Paramiko's built-in size check
sftp.putfo(source_file_object, destination_file, confirm=False)

# Re-implement the check against the file's new (moved) location
upload_size = sftp.stat(moved_path).st_size
local_size = os.fstat(source_file_object.fileno()).st_size
if upload_size != local_size:
    raise IOError(
        "size mismatch in put! {} != {}".format(upload_size, local_size)
    )
Both checks rely on a stat call: SFTPClient.stat for the remote file and os.fstat for the local one.

Related

CherryPy Python error "No such file or directory"

I'm trying to run a Python server using CherryPy for a website, but when I run it this error pops up.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/cherrypy/_cprequest.py", line 638, in respond
self._do_respond(path_info)
File "/usr/local/lib/python3.10/dist-packages/cherrypy/_cprequest.py", line 694, in _do_respond
self.hooks.run('before_handler')
File "/usr/local/lib/python3.10/dist-packages/cherrypy/_cprequest.py", line 95, in run
self.run_hooks(iter(sorted(self[point])))
File "/usr/local/lib/python3.10/dist-packages/cherrypy/_cprequest.py", line 117, in run_hooks
hook()
File "/usr/local/lib/python3.10/dist-packages/cherrypy/_cprequest.py", line 65, in __call__
return self.callback(**self.kwargs)
File "/usr/local/lib/python3.10/dist-packages/cherrypy/_cptools.py", line 280, in _lock_session
cherrypy.serving.session.acquire_lock()
File "/usr/local/lib/python3.10/dist-packages/cherrypy/lib/sessions.py", line 550, in acquire_lock
self.lock = zc.lockfile.LockFile(path)
File "/usr/local/lib/python3.10/dist-packages/zc/lockfile/__init__.py", line 117, in __init__
super(LockFile, self).__init__(path)
File "/usr/local/lib/python3.10/dist-packages/zc/lockfile/__init__.py", line 87, in __init__
fp = open(path, 'a+')
FileNotFoundError: [Errno 2] No such file or directory: '/var/www/html/cncsessions\\/session-73ab2ecbe9bd50153b4f20828fcc08bff6e9cd6e.lock'
It's my first time using this module and I don't know what's wrong.
I'm using Ubuntu 22, Python 3.10.6
Hard to say without seeing the exact code that is making this call.
Judging by the error, you are trying to use sessions.
The session tool is trying to place its session files in
/var/www/html/cncsessions
but opening the lock file fails.
It looks like the configured path is wrong: it ends in a backslash (shown escaped as \\ in the error message), to which CherryPy then appends a forward slash.
\\/
If you haven't given up on this or figured it out already, I would try changing the configured path to just
/var/www/html/cncsessions
Also, be sure you do not store your session data in your web root. From that path it looks like you might be doing exactly that; anything under the web root will be served by the public web server, although there's little to no chance anyone would guess the session file names.
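For reference, a minimal sketch of a file-backed session configuration, assuming CherryPy 18+ (where storage_class replaced the older storage_type setting); the application class and the path /var/lib/cncsessions are illustrative choices, not taken from the question:
import cherrypy
from cherrypy.lib.sessions import FileSession

class App:
    @cherrypy.expose
    def index(self):
        cherrypy.session['visits'] = cherrypy.session.get('visits', 0) + 1
        return "Visits this session: %d" % cherrypy.session['visits']

if __name__ == '__main__':
    cherrypy.quickstart(App(), '/', {
        '/': {
            'tools.sessions.on': True,
            'tools.sessions.storage_class': FileSession,
            # no trailing slash or backslash; keep the directory outside the web root
            'tools.sessions.storage_path': '/var/lib/cncsessions',
        }
    })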

How do I use ftp.storbinary to upload a file? [duplicate]

This question already has an answer here:
ftplib.error_perm: 553 Could not create file. (Python 2.4.4)
(1 answer)
Closed 2 years ago.
I am just starting to learn Python. I'm trying to upload a file as follows:
import ftplib
myurl = 'ftp.example.com'
user = 'user'
password = 'password'
myfile = '/Users/mnewman/Desktop/requested.txt'
ftp = ftplib.FTP(myurl, user, password)
ftp.encoding = "utf-8"
ftp.cwd('/public_html/')
ftp.storbinary('STOR '+myfile, open(myfile, 'rb'))
But get the following error:
Traceback (most recent call last):
File "/Users/mnewman/.spyder-py3/temp.py", line 39, in <module>
ftp.storbinary('STOR '+myfile, open(myfile, 'rb'))
File "ftplib.pyc", line 487, in storbinary
File "ftplib.pyc", line 382, in transfercmd
File "ftplib.pyc", line 348, in ntransfercmd
File "ftplib.pyc", line 275, in sendcmd
File "ftplib.pyc", line 248, in getresp
error_perm: 553 Can't open that file: No such file or directory
What does "that file" refer to and what do I need to do to fix this?
Reading the traceback, the error comes from deep in the ftplib stack while it is processing a response from the server. FTP server messages aren't standardized, but from the text it's clear that the FTP server is unable to create the file on the remote side. This can happen for a variety of reasons: perhaps there is a permissions problem (the identity the FTP server process runs as does not have rights to the target), the write is outside of a sandbox set up on the server, or the file is already open in another program.
But in your case, you are using the full local source path in the "STOR" command, when it wants the target path on the server. Depending on whether you want to write subdirectories on the server, calculating the target name can get complicated. If you just want the file to land in the server's current working directory, you could do:
import os
ftp.storbinary(f'STOR {os.path.split(myfile)[1]}', open(myfile, 'rb'))
"That file" refers to the file that you're trying to upload to the FTP. According to your code, it refers to the line: myfile = '/Users/mnewman/Desktop/requested.txt'. You get this error because Python can't find the file in the path. Check whether it exists in the correct path. If you want to test whether there is an error in the script, you can add a test file to the directory in which your Python script exists and then run the script with the path of that file.
Example Script for FTP Upload:
import ftplib
session = ftplib.FTP('ftp.example.com', 'user', 'password')
file = open('hello.txt', 'rb')              # file to send
session.storbinary('STOR hello.txt', file)  # send the file
file.close()                                # close the local file
session.quit()                              # close the FTP session
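Combining both answers into one sketch (host, credentials and paths are the ones from the question, used as placeholders); the with-blocks close the connection and the local file even if the upload fails:
import ftplib
import os

myfile = '/Users/mnewman/Desktop/requested.txt'

# fail early with a clear message if the local file is missing
if not os.path.isfile(myfile):
    raise FileNotFoundError(myfile)

# ftplib.FTP is a context manager since Python 3.3; it sends QUIT on exit
with ftplib.FTP('ftp.example.com', 'user', 'password') as session, \
        open(myfile, 'rb') as f:
    session.cwd('/public_html/')
    # STOR takes the remote file name, not the full local path
    session.storbinary(f'STOR {os.path.basename(myfile)}', f)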


PermissionError: [Errno 13] Permission denied when saving HoloMap to GIF

I am trying to create an animated GIF from a series of heat maps with HoloViews.
I need to do this in a Python script, i. e. specifically not in a Jupyter notebook.
When saving the image, Python throws an error because it cannot create a temporary file in the temp-folder of the current user (this is under Windows). Happens regardless of the user, even when I run Python as admin.
When I stop in the debugger and change the temp-file path to some other place, e. g. Desktop, that works, but the resulting holo.gif in the working directory is empty (0 bytes). The temporary gif, though, is correctly animated, so I guess the code is basically OK.
[Edit: Not so sure anymore. I ran this overnight on 26,531 heat maps, each of which consisted of a 5x5 grid. The process did not finish (i.e. it never hit the breakpoint at Image.py line 1966). Is there a way to do what I want that is less painfully slow?]
Answers to similar problems on Stack Overflow point to permission problems (but what kind of permission problem could it be if it doesn't even work for an admin?) and suggest saving to another location, which is impossible here since I have no control over where matplotlib tries to create its temporary files.
The problem occurs specifically with GIFs; I can create *.png or *.html output without error. (AFAIK, the difference is that GIF creation uses ImageMagick.)
Here's the code (construction of underlying heat map data left out):
import holoviews as hv
hv.extension('matplotlib')
renderer = hv.renderer('matplotlib')
renderer.fps = 3
heatMapDict = {
    k: hv.HeatMap(measurements[k].sensors) for k in range(len(measurements))
}
holo = hv.HoloMap(heatMapDict, kdims='index')
renderer.save(holo, 'holo', fmt='gif')
And the traceback:
INFO:matplotlib.animation:Animation.save using <class 'matplotlib.animation.PillowWriter'>
Traceback (most recent call last):
File "cm3.py", line 69, in <module>
renderer.save(holo, 'holo', fmt='gif')
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\holoviews\plotting\renderer.py", line 554, in save
rendered = self_or_cls(plot, fmt)
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\holoviews\plotting\mpl\renderer.py", line 108, in __call__
data = self._figure_data(plot, fmt, **({'dpi':self.dpi} if self.dpi else {}))
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\holoviews\plotting\mpl\renderer.py", line 196, in _figure_data
data = self._anim_data(anim, fmt)
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\holoviews\plotting\mpl\renderer.py", line 246, in _anim_data
anim.save(f.name, writer=writer, **anim_kwargs)
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\matplotlib\animation.py", line 1174, in save
writer.grab_frame(**savefig_kwargs)
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\contextlib.py", line 119, in __exit__
next(self.gen)
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\matplotlib\animation.py", line 232, in saving
self.finish()
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\matplotlib\animation.py", line 583, in finish
duration=int(1000 / self.fps))
File "C:\Users\y2046\AppData\Local\Programs\Python\Python37\lib\site-packages\PIL\Image.py", line 1966, in save
fp = builtins.open(filename, "w+b")
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\y2046\\AppData\\Local\\Temp\\tmp4im5ozo8.gif'
Addendum:
I'm coming to think that this is not a permission problem after all. Perhaps it has to do with reentrancy and file locking under Windows? The Python process can in fact create files in the temp directory, as proved by inserting the following test code before calling renderer.save():
import os
import builtins
filename = 'C:\\Users\\y2046\\AppData\\Local\\Temp\\test.txt'
fp = builtins.open(filename, "w+b")
try:
    fp.write("first".encode('utf-8'))
finally:
    fp.close()
os.remove(filename)
I should test this under Linux. If it works there, there must be a bug in the Pillow writer.
It looks like there is something broken with HoloViews. I have opened issue #3151 with them.
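A minimal sketch, assuming Windows file-locking semantics, of the reentrancy problem the addendum suspects: if a NamedTemporaryFile is still held open while the writer tries to reopen it by name (which is what PIL's Image.save does at the line shown in the traceback), the reopen fails with exactly this PermissionError:
import tempfile

# On Windows, a NamedTemporaryFile (delete=True, the default) is held open with
# exclusive access, so reopening it by name fails while the handle is open.
with tempfile.NamedTemporaryFile(suffix='.gif') as f:
    try:
        open(f.name, 'w+b')   # the same call PIL's Image.save makes internally
    except PermissionError as err:
        print(err)            # PermissionError: [Errno 13] Permission denied (Windows only)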

Mercurial: Permission Denied for hgwebdir

Yesterday I set up Apache to serve my Mercurial repositories and got everything working properly. I then tested pushing changes back to this repository and was presented with an error, and now that error pops up for every single operation I attempt, even a simple GET request of the repositories! Here is the error:
mod_wsgi (pid=1771): Target WSGI script '/var/hg/hgweb.wsgi' cannot be loaded as Python module.
mod_wsgi (pid=1771): Exception occurred processing WSGI script '/var/hg/hgweb.wsgi'.
Traceback (most recent call last):
File "/var/hg/hgweb.wsgi", line 18, in ?
application = hgwebdir(config)
File "/usr/lib64/python2.4/site-packages/mercurial/hgweb/__init__.py", line 15, in hgwebdir
return hgwebdir_mod.hgwebdir(*args, **kwargs)
File "/usr/lib64/python2.4/site-packages/mercurial/hgweb/hgwebdir_mod.py", line 52, in __init__
self.refresh()
File "/usr/lib64/python2.4/site-packages/mercurial/hgweb/hgwebdir_mod.py", line 82, in refresh
self.repos = findrepos(paths)
File "/usr/lib64/python2.4/site-packages/mercurial/hgweb/hgwebdir_mod.py", line 36, in findrepos
for path in util.walkrepos(roothead, followsym=True, recurse=recurse):
File "/usr/lib64/python2.4/site-packages/mercurial/util.py", line 1164, in walkrepos
for hgname in walkrepos(fname, True, seen_dirs):
File "/usr/lib64/python2.4/site-packages/mercurial/util.py", line 1146, in walkrepos
for root, dirs, files in os.walk(path, topdown=True, onerror=errhandler):
File "/usr/lib64/python2.4/os.py", line 276, in walk
onerror(err)
File "/usr/lib64/python2.4/site-packages/mercurial/util.py", line 1127, in errhandler
raise err
OSError: [Errno 13] Permission denied: './dev/fd'
My repository directory is owned by apache, the user running Apache. I don't know why './dev/fd' is being operated on either. I've restarted the server numerous times and recreated the repository directory, but I still get this error no matter what! I don't have access to restart the machine, so that is not an option. It seems to have gotten into a very bad persistent state, and I don't know how to fix it. Any help is appreciated!
This turned out to be a configuration error on my part, and rather than delete the question I'll post the resolution here in case someone has this problem in the future.
Here was the hgweb.config I was using:
[paths]
/ = /var/hg/repos/*
#[web]
style = gitweb
allow_archive = bz2 gz zip
maxchanges = 200
allow_push = *
push_ssl = false
Two problems here, one of them obvious. I had the [web] header commented out, and I assume many of those options are not valid in the [paths] section. Also, after re-reading the Hg docs, the push_ssl directive does not belong in hgweb.config but rather in each repository's .hg/hgrc (or the ~/.hgrc of the user that runs Apache). After fixing these, things are working perfectly!
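For clarity, the corrected configuration described above would look roughly like this; push_ssl moves out of hgweb.config and into each repository's .hg/hgrc (or the ~/.hgrc of the user running Apache):
# hgweb.config
[paths]
/ = /var/hg/repos/*

[web]
style = gitweb
allow_archive = bz2 gz zip
maxchanges = 200
allow_push = *

# .hg/hgrc in each repository
[web]
push_ssl = false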
