Python scp copy images from image_urls to server

I have written a function which receives a URL and copies the image to all servers.
The servers' remote paths are stored in the database.
def copy_image_to_server(image_url):
    server_list = ServerData.objects.values_list('remote_path', flat=True).filter(active=1)
    file = cStringIO.StringIO(urllib.urlopen(image_url).read())
    image_file = Image.open(file)
    image_file.seek(0)
    for remote_path in server_list:
        os.system("scp -i ~/.ssh/haptik %s %s " % (image_file, remote_path))
I am getting this error at the last line: cannot open PIL.JpegImagePlugin.JpegImageFile: No such file
Please suggest what's wrong in the code; I have checked that the URL is not broken.

The problem is that image_file is not a path (string), it's an object. Your os.system call is building up a string that expects a path.
You need to write the file to disk (perhaps using the tempfile module) before you can pass it to scp in this manner.
In fact, there's no need (at least for what you're doing in this snippet) to convert it to a PIL Image object at all; you can just write it to disk once you've retrieved it, and then pass it to scp to move it:
import tempfile

file = cStringIO.StringIO(urllib.urlopen(image_url).read())
diskfile = tempfile.NamedTemporaryFile(delete=False)
diskfile.write(file.getvalue())
path = diskfile.name
diskfile.close()
for remote_path in server_list:
    os.system("scp -i ~/.ssh/haptik %s %s " % (path, remote_path))
You should delete the file after you're done using it.
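As a rough sketch of that cleanup (reusing the path and server_list names from the snippet above), you can wrap the copy loop in try/finally so the temporary file is removed even if one of the scp calls fails:

import os

try:
    for remote_path in server_list:
        os.system("scp -i ~/.ssh/haptik %s %s " % (path, remote_path))
finally:
    os.remove(path)  # clean up the temporary file once every copy attempt has run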

Related

os.rename saying cannot access the file because it is being used by another process

I am working on a little script that renames an mp3 file based on the song's tags, using the audio_metadata library.
I have already tried os.rename and os.system('ren "FILENAME" "NEWNAME"').
My Code:
import os
import audio_metadata

for x in range(len(songs)):
    song = songs[x]
    metadata = audio_metadata.load('%s' % song)
    titlel = metadata['tags'].title
    artistl = metadata['tags'].artist
    title = titlel[0].strip()
    artist = artistl[0].strip()
    newname = '%s - %s.mp3' % (title, artist)
    print("[*] %s => %s" % (song, newname))
    os.rename(song, newname)
I expect:
your love.mp3 => Your Love-The Outfield.mp3
But I get this:
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process:
'your love.mp3' -> "Your Love-The Outfield.mp3"
Is it possible that audio_metadata is not properly closing the file object after reading when you call metadata = audio_metadata.load('%s' % song)? I took a quick glance at the audio_metadata source and it looks like you can pass a file object instead of a filepath string.
Try getting the metadata this way instead:
with open('%s' % song, 'rb') as f:
    metadata = audio_metadata.load(f)
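A minimal sketch of the whole loop with that change (assuming the same songs list and tag layout as in the question) could look like:

import os
import audio_metadata

for song in songs:
    # Open the file ourselves so the handle is guaranteed to be closed
    # before the rename happens
    with open(song, 'rb') as f:
        metadata = audio_metadata.load(f)
    title = metadata['tags'].title[0].strip()
    artist = metadata['tags'].artist[0].strip()
    newname = '%s - %s.mp3' % (title, artist)
    print("[*] %s => %s" % (song, newname))
    os.rename(song, newname)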
If you're running through the VS debugger: I've run into a similar issue, and my solution was to make sure no other instances of my exe were secretly running as background processes (as seen in Task Manager). I ran into that when working with FMOD; not sure if that helps.

subprocess gunzip throws decompression failed

I am trying to gunzip using subprocess but it returns the error -
('Decompression failed %s', 'gzip: /tmp/tmp9OtVdr is a directory -- ignored\n')
What is wrong?
import subprocess

transform_script_process = subprocess.Popen(
    ['gunzip', f_temp.name, '-kf', temp_dir],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE)
(transform_script_stdoutdata,
 transform_script_stderrdata) = transform_script_process.communicate()
self.log.info("Transform script stdout %s",
              transform_script_stdoutdata)
if transform_script_process.returncode > 0:
    shutil.rmtree(temp_dir)
    raise AirflowException("Decompression failed %s",
                           transform_script_stderrdata)
You are calling the gunzip process and passing it the following parameters:
f_temp.name
-kf
temp_dir
I'm assuming f_temp.name is the path to the gzipped file you are trying to unzip. -kf will force decompression and instruct gzip to keep the file after decompressing it.
Now comes the interesting part. temp_dir seems like a variable that would hold the destination directory you want to extract the files to. However, gunzip does not support this. Please have a look at the manual for gzip. It states that you must pass in a list of files to decompress. There is no option to specify the destination directory.
Have a look at this post on Superuser for more information on specifying the folder you want to extract to: https://superuser.com/questions/139419/how-do-i-gunzip-to-a-different-destination-directory
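If you do need the decompressed output to land in temp_dir, one common workaround is to skip gunzip entirely and decompress with Python's gzip module, which lets you write the result wherever you like. A rough sketch, assuming f_temp.name is the .gz file and temp_dir is the target directory:

import gzip
import os
import shutil

out_name = os.path.basename(f_temp.name)
if out_name.endswith('.gz'):
    out_name = out_name[:-3]
destination = os.path.join(temp_dir, out_name)

# Stream the decompressed bytes straight into the destination file
with gzip.open(f_temp.name, 'rb') as f_in, open(destination, 'wb') as f_out:
    shutil.copyfileobj(f_in, f_out)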

Python ftplib error 426 when putting files on iSeries

I have a peculiar issue that I can't seem to fix on my own...
I'm attempting to FTP a list of files in a directory over to an iSeries IFS using Python's ftplib library.
Note, the files are in a single subdirectory down from the python script.
Below is an excerpt of the code that is giving me trouble:
from ftplib import FTP
import os

localpath = os.getcwd() + '/Files/'

def putFiles():
    hostname = 'host.name.com'
    username = 'myuser'
    password = 'mypassword'
    myftp = FTP(hostname)
    myftp.login(username, password)
    myftp.cwd('/STUFF/HERE/')
    for file in os.listdir(localpath):
        if file.endswith('.csv'):
            try:
                file = localpath + file
                print 'Attempting to move ' + file
                myftp.storbinary("STOR " + file, open(file, 'rb'))
            except Exception as e:
                print(e)
The specific error that I am getting throw is:
Attempting to move /home/doug/Files/FILE.csv
426-Unable to open or create file /home/doug/Files to receive data.
426 Data transfer ended.
What I've done so far to troubleshoot:
Initially I thought this was a permissions issue on the directory containing my files. I used chmod 777 /home/doug/Files and re-ran my script, but the same exception occurred.
Next I assumed there was an issue between my machine and the iSeries. I validated that I could indeed put files by using ftp. I was successfully able to put the file on the iSeries IFS using the shell FTP.
Thanks!
Solution
from ftplib import FTP
import os

localpath = os.getcwd() + '/Files/'

def putFiles():
    hostname = 'host.name.com'
    username = 'myuser'
    password = 'mypassword'
    myftp = FTP(hostname)
    myftp.login(username, password)
    myftp.cwd('/STUFF/HERE/')
    for csv in os.listdir(localpath):
        if csv.endswith('.csv'):
            try:
                myftp.storbinary("STOR " + csv, open(localpath + csv, 'rb'))
            except Exception as e:
                print(e)
As written, your code is trying to execute the following FTP command:
STOR /home/doug/Files/FILE.csv
Meaning it is trying to create /home/doug/Files/FILE.csv on the IFS. Is this what you want? I suspect that it isn't, given that you bothered to change the remote directory to /STUFF/HERE/.
If you are trying to issue the command
STOR FILE.csv
then you have to be careful how you deal with the Python variable that you've named file. In general, it's not recommended that you reassign a variable that is the target of a for loop, precisely because this type of confusion can occur. Choose a different variable name for localpath + file, and use that in your open(..., 'rb').
Incidentally, it looks like you're using Python 2, since there is a bare print statement with no parentheses. I'm sure you're aware that Python 3 is recommended by now, but if you do stick to Python 2, it's recommended that you avoid using file as a variable name, because it actually means something in Python 2 (it's the name of a type; specifically, the return type of the open function).
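For example, keeping the loop variable intact and introducing a separate name for the local path (local_file here is just an illustrative name, not from the original code) keeps the STOR argument and the open() argument distinct:

for file in os.listdir(localpath):
    if file.endswith('.csv'):
        local_file = localpath + file
        print 'Attempting to move ' + local_file
        # STOR gets the bare remote name; open() gets the full local path
        myftp.storbinary("STOR " + file, open(local_file, 'rb'))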

Creating an on-the-fly zip file from string content for AWS Lambda in Python

I have a Python script that creates a Lambda function in AWS along with all the policies and triggers, using the Python boto3 library. I create the zip file for the Lambda on the fly rather than uploading a static zip file from the hard drive, using the simple code below. It creates the zip file without any problems, my Python code uploads it as the Lambda code, and I can view the Lambda in AWS. But when I run the Lambda it gives me a module-not-found error, even though I can clearly see that both the module name and the file name do exist and are viewable.
Unable to import module 'xxxx': No module named xxxx
In the file system I double-click the zip file that was created by this code and see that the content is there and everything looks normal.
If I bypass zipping on the fly and create the zip statically using WinZip and let the rest of the Python & boto3 script upload this file then it works just fine.
def CreateLambdaZip(self, fileName, fileContent):
    with zipfile.ZipFile('Lambda/' + fileName + '.zip', 'w') as myzipc:
        myzipc.writestr(fileName + '.py', fileContent)
        myzipc.close()
It kind of looks like the zip file I'm producing is missing some special headers that AWS Lambda needs. Is there such a thing? Because in the file system the zip file created by the Python code and the one created by WinZip look exactly the same, so I know there's nothing wrong with the lambda script itself.
Update: I'm uploading the zip file using the below code that reads the zip file which was created using the above snippet.
with open('Lambda/' + fileName + '.zip', 'rb') as zipFile:
    func = boto3.client("lambda").create_function(
        FunctionName=lambdaFunction,
        Runtime='python2.7',
        Role=role['Role']['Arn'],
        Handler=fileName + "." + functionName,
        Description=description,
        Timeout=10,
        MemorySize=256,
        Publish=True,
        Code={'ZipFile': zipFile.read()},
    )
When I use zipFile.read() I get 2 different headers for the same content when I zip it using WinZip and when I zip it using Python's module.
Zip file that's created programmatically using Python
b'PK\x03\x04\x14\x00\x00\x00\x00\x00\xe4~\x01IO\x96J=Z\x07\x00\x00Z\x07\x00\x00\x19\x00\x00\x00schedule-ec2-snapshots.pyimport json\nimport boto3\nimport time\nfrom datetime import date, timedelta\n\nprint(\'Loading scheduled EC2 backup actions\')\n\ndef create_snapshots(event, context):\n """\n Lambda function that executes daily snapshots for the instances that
and zipfile created by WinZip
b'PK\x03\x04\x14\x00\x02\x00\x08\x004X\xfcH\x88\x1f\xce\xb5&\x03\x00\x00b\x07\x00\x00\x19\x00\x00\x00schedule-ec2-snapshots.py\x8dU]k\xdb#\x10|7\xf4?,\nA\x12qL\xda\x06B\r~I\x93Bh\x9b\x87&\xf4E\x15\xe1\xac[\xdb\xd7HwBw2\t\xc1\xff\xbd{+\xeb\xcb.\xb4\n\xc4\xba\xdb\xd1\xec\xce\xdc\xae\xa4\x8a\xd2T\x0e~[\xa3\'\xaa\xb9_\x1ag>\xb6\x0b\xa7\n\x9c\xac*S\x80\x14\x0e\xfd\n\xf6\x11\xbf\x9er\\b\xee\xc4dRVJ\xbb(\xfcf\x84Tz\r6\xdb\xa0\xacs\x94p\xfb\xf9\x03,E\xf6\\\x97
With the info above I was able to start the in-memory solution. The deployment of that zip file worked but I could not use the resulting function. Got error:
Unable to import module '<function-name>': No module named <function-name>
I got it to work by specifying the file permissions.
I then use the in-mem-zip to create an AWS lambda function.
Setup:
file_map is a dictionary of full_path->file_bytes.
files is a list of full_paths
import io
import time
import zipfile

def create_lambda_function(function_name, desc, role, handler, file_map, files):
    zip_contents = create_in_mem_zip_archive(file_map, files)
    result = lambda_code.create_function(
        FunctionName=function_name,
        Runtime="python2.7",
        Description=desc,
        Role=role,
        Handler=handler,
        Code={'ZipFile': zip_contents},
    )
    return result

def create_in_mem_zip_archive(file_map, files):
    buf = io.BytesIO()
    logger.info("Building zip file: " + str(files))
    with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zfh:
        for file_name in files:
            file_blob = file_map.get(file_name)
            if file_blob is None:
                logger.error("Missing file {} from files".format(file_name))
                continue
            try:
                info = zipfile.ZipInfo(file_name)
                info.date_time = time.localtime()
                info.compress_type = zipfile.ZIP_DEFLATED
                info.external_attr = 0777 << 16L  # give full access
                # info.external_attr = 0644 << 16L  # -r-wr--r--
                # info.external_attr = 0755 << 16L  # -rwxr-xr-x
                zfh.writestr(info, file_blob)
            except Exception as ex:
                logger.info("Error reading file: " + file_name + ", error: " + ex.message)
    buf.seek(0)
    return buf.read()
I have experienced exactly the same problem. My solution is: do NOT use an on-the-fly zip file. Create a real zip file and add real files to it, and it just works. You can do that even in the Lambda environment: by creating a file path like "/tmp/yourfile.txt" you can create a real temporary file while the Lambda executes.
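A minimal sketch of that real-file approach (the function and file names below are placeholders, and /tmp is used because it is the writable location inside Lambda):

import zipfile

def create_lambda_zip_on_disk(file_name, file_content):
    # Write the handler source to a real file under /tmp, then add that
    # real file to a real zip archive on disk
    source_path = '/tmp/' + file_name + '.py'
    zip_path = '/tmp/' + file_name + '.zip'
    with open(source_path, 'w') as src:
        src.write(file_content)
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        zf.write(source_path, arcname=file_name + '.py')
    return zip_path

The resulting file can then be read back as bytes and passed to create_function via Code={'ZipFile': ...}, exactly as in the upload snippet above.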

Accessing a file on an FTP server

I have made a program, and there is a function where it gets a text file called news_2014.txt from an FTP server. I currently have this code:
def getnews():
    server = 'my ftp server ip'
    ftp = ftplib.FTP(server)
    username = 'news2'
    password = ' '
    ftp.login(username, password)
    filename = 'ftp://my ftp server ip/news/news_2014.txt'
    path = 'news'
    ftp.cwd(path)
    ftp.retrlines('RETR' + filename, open(filename, "w").open)
I want the program to display the lines on a Tkinter label using readlines. But if I try calling the function above, it says:
IOError: [Errno 22] invalid mode ('w') or filename: 'ftp://news/news_2014.txt'
RETR wants just the remote path name, not a URL. Similarly, you cannot open a URL; you need to pass it a valid local filename.
Changing it to filename = 'news_2014.txt' should fix this problem trivially.
The retrlines method retrieves the lines and optionally performs a callback. You have specified a callback to open a local file for writing, but that's hardly something you want to do for each retrieved line. Try this instead:
textlines = []
ftp.retrlines('RETR ' + filename, textlines.append)
then display the contents of textlines. (Notice the space between the RETR command and its argument, too.)
I would argue that the example in the documentation is confusing for a newcomer. Someone should file a bug report.
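Putting the pieces together, a rough sketch of the corrected function (the server address, credentials, and paths are the placeholders from the question):

import ftplib

def getnews():
    ftp = ftplib.FTP('my ftp server ip')
    ftp.login('news2', ' ')
    ftp.cwd('news')                 # change into the remote directory first
    textlines = []
    ftp.retrlines('RETR news_2014.txt', textlines.append)  # bare remote name, space after RETR
    ftp.quit()
    return textlines

The returned list can then be joined with '\n'.join(...) and assigned to the Tkinter label's text option.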
