I have to use the following command line tool: ncftp.
After that I need to
execute the following commands:
"open ftp://..."
"get -R Folder", but I need to do this automatically. How do I achieve this using Python or the command line?
You can use the Python subprocess module for this.
from subprocess import Popen, PIPE
# if you don't want your script to print the output of the ncftp
# commands, use Popen(['ncftp'], stdin=PIPE, stdout=PIPE)
with Popen(['ncftp'], stdin=PIPE) as proc:
    proc.stdin.write(b"open ...\n")  # must terminate each command with \n
    proc.stdin.write(b"get -R Folder\n")
    # ...etc
See the documentation for subprocess for more information. It can be a little tricky to get the hang of this library, but it's very versatile.
Alternatively, you can use the non-interactive commands ncftpget (docs) and ncftpput (docs) from the NcFTP package.
I recommend reading through the documentation on these commands before proceeding.
In the comments, you said you need to download some files, delete them, and then upload some new files. Here's how you can do that:
$ ncftpget -R -DD -u username -p password ftp://server path/to/local/directory path/to/remote/directory/Folder
$ ncftpput -R -u username -p password ftp://server path/to/remote/directory path/to/local/directory/Folder
-DD will delete all files after downloading, but it will leave the directory and any subdirectories in place.
If you need to delete the empty folder, you can run the ncftpget command again without -R (but the folder must be completely empty, i.e. no subdirectories, so rinse and repeat as necessary).
You can do this in a bash script or using subprocess.run in Python.
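For the subprocess.run route, here is a minimal sketch; the server, credentials and paths are the placeholders from the commands above, not real values, and the helper function is hypothetical:

```python
import subprocess

def ncftp_cmd(tool, flags, server, *paths):
    """Build the argument list for ncftpget/ncftpput (placeholder credentials)."""
    return [tool] + flags + ['-u', 'username', '-p', 'password', server] + list(paths)

get_cmd = ncftp_cmd('ncftpget', ['-R', '-DD'], 'ftp://server',
                    'path/to/local/directory', 'path/to/remote/directory/Folder')
put_cmd = ncftp_cmd('ncftpput', ['-R'], 'ftp://server',
                    'path/to/remote/directory', 'path/to/local/directory/Folder')

# subprocess.run(get_cmd, check=True)  # check=True raises CalledProcessError on failure
# subprocess.run(put_cmd, check=True)
```

Passing the command as a list avoids any shell quoting issues with the paths.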
In Python I'm using subprocess to call gsutil copy and move commands, but am currently unable to select multiple extensions.
The same gsutil command works at the terminal, but not in python:
cmd_gsutil = "sudo gsutil -m mv gs://xyz-ms-media-upload/*.{mp4,jpg} gs://xyz-ms-media-upload/temp/"
p = subprocess.Popen(cmd_gsutil, shell=True, stderr=subprocess.PIPE)
output, err = p.communicate()
Say there are four filetypes to move but the bucket is empty. The error gsutil returns at the terminal is:
4 files/objects could not be transferred.
Whereas the error returned when run through subprocess is:
1 files/objects could not be transferred.
So clearly subprocess is mucking up the command somehow...
I could always inefficiently repeat the command for each of the filetypes, but would prefer to get to the bottom of this!
It seems /bin/sh (the default shell) doesn't support the {mp4,jpg} brace-expansion syntax.
Pass executable='/bin/bash' to run the command with bash instead.
You could also run the command without the shell e.g., using glob or fnmatch modules to get the filenames to construct the gsutil command. Note: you should pass the command as a list in this case instead.
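To see the difference, here is a small sketch using echo as a stand-in for gsutil, run in an empty directory so the globs have nothing to match; the default /bin/sh may pass the braces through literally, while bash expands them first:

```python
import subprocess
import tempfile

# run in an empty directory so the globs have nothing to match
d = tempfile.mkdtemp()

# the default /bin/sh may not brace-expand {mp4,jpg} at all...
out_sh = subprocess.check_output("echo *.{mp4,jpg}", shell=True, cwd=d)

# ...but bash expands it to *.mp4 *.jpg before running the command
out_bash = subprocess.check_output("echo *.{mp4,jpg}", shell=True,
                                   executable="/bin/bash", cwd=d)
print(out_bash.decode().strip())  # *.mp4 *.jpg

# for the question's command, the fix is just:
# p = subprocess.Popen(cmd_gsutil, shell=True, executable='/bin/bash',
#                      stderr=subprocess.PIPE)
```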
I'm making a python script for Travis CI.
.travis.yml
...
script:
- support/travis-build.py
...
The python file travis-build.py is something like this:
#!/usr/bin/env python
from subprocess import check_call
...
check_call(r"mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder", shell=True)
...
When the Travis build reaches that line, I get an error:
/bin/sh: 1: Syntax error: "(" unexpected
I've tried writing it a lot of different ways, but I get the same result. Any ideas?
Thanks in advance!
Edit
My current directory layout:
- my_project/final_folder/
- cmake-3.0.2-Darwin64-universal/
- fileA
- fileB
- fileC
With this command I'm trying to move all the files fileA, fileB and fileC (excluding the my_project and cmake-3.0.2-Darwin64-universal folders) into ./my_project/final_folder. If I execute this command in a Linux shell it works, but not through check_call().
Note: I can't move the files one by one, because there are many others
I don't know which shell Travis is using by default, because I don't specify it; I only know that if I write the command directly in my .travis.yml:
.travis.yml
...
script:
# Here is the previous Travis code
- mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder
...
It works. But if I use the script, it fails.
I found this command from the following issue:
How to use 'mv' command to move files except those in a specific directory?
You're using the bash feature extglob, to try to exclude the files that you're specifying. You'll need to enable it in order to have it exclude the two entries you're specifying.
The python subprocess module explicitly uses /bin/sh when you use shell=True, which doesn't enable the use of bash features like this by default (it's a compliance thing to make it more like original sh).
If you want to get bash to interpret the command; you have to pass it to bash explicitly, for example using:
subprocess.check_call(["bash", "-O", "extglob", "-c", "mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder"])
I would not choose to do the job in this manner, though.
Let me try again: in which shell do you expect your !(...) syntax to work? Is it bash? Is it ksh? I have never used it, and a quick search for a corresponding bash feature led nowhere. I suspect your syntax is just wrong, which is what the error message is telling you. In that case, your problem is entirely independent from Python and the subprocess module.
If a special shell you have on your system supports this syntax, you need to make sure that Python is using the same shell when invoking your command. It tells you which shell it has been using: /bin/sh. This is usually just a link to the real shell executable. Does it point to the same shell you have tested your command in?
Edit: the SO solution you referenced contains the solution in the comments:
Tip: Note however that using this pattern relies on extglob. You can
enable it using shopt -s extglob (If you want extended globs to be
turned on by default you can add shopt -s extglob to .bashrc)
Just to demonstrate that different shells might deal with your syntax in different ways, first using bash:
$ !(uname)
-bash: !: event not found
And then, using /bin/dash:
$ !(uname)
Linux
The argument to the subprocess functions should be a list of command line arguments. Use e.g. shlex.split() to split the string into the correct command line arguments:
import shlex, subprocess
subprocess.check_call( shlex.split("mv !(...)") )
EDIT:
So, the goal is to move files/directories, with the exception of some file(s)/directory(ies). By playing around with bash, I could get it to work like this:
mv `ls | grep -v -e '\(exclusion1\|exclusion2\)'` my_project
So in your situation that would be:
mv `ls | grep -v -e '\(myproject\|cmake-3.0.2-Darwin64-universal\)'` my_project
This could go into the subprocess.check_call(..., shell=True) and it should do what you expect it to do.
I have to use cURL on Windows from a Python script. My goal: using the Python script, get all files from a remote directory, preferably into a local directory. After that I will compare each file with files stored locally. I am able to get one file at a time, but I need to get all of the files in the remote directory.
Could someone please advise how to get multiple files?
I use this command:
curl.exe -o file1.txt sftp:///dir1/file1.txt -k -u user:password
thanks
I haven't tested this, but I think you could just launch each shell command as a separate process to run them simultaneously. Obviously, this might be a bad idea if you have a large set of files, so you might need to manage that more carefully. Here's some untested code; you'd need to edit the cmd variable in the get_file function, of course.
from multiprocessing import Process
import subprocess

def get_file(filename):
    # adjust the URL and credentials for your server
    cmd = 'curl.exe -o {} sftp:///dir1/{} -k -u user:password'.format(filename, filename)
    subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)  # run the shell command

files = ['file1.txt', 'file2.txt', 'file3.txt']

if __name__ == '__main__':  # this guard is required for multiprocessing on Windows
    for filename in files:
        p = Process(target=get_file, args=(filename,))  # one download process per file
        p.start()
Is there a way to run a full wget command from Python?
I know that we can do this: os.system('wget %s' % url)
But I want a full command with all of the data saved into a directory:
wget -r --accept "*.exe,*.dll,*.zip,*.msi,*.rar,*.iso" ftp://ftp.apple.asimov.com/ -P e:\e
There is the subprocess module for that (it's what os.system calls, but with a bit more flexibility). Specifically, you can use the call function in the following way to execute any command:
import subprocess
subprocess.call(r'wget -r --accept "*.exe,*.dll,*.zip,*.msi,*.rar,*.iso" ftp://ftp.apple.asimov.com/ -P e:\e', shell=True)
Alternatively, you can pass individual arguments as a list omitting the shell flag:
subprocess.call(['wget', '-r', ...])
Also check the return value for errors. For details, see the standard library documentation on subprocess.
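For this particular command, the list form could look like this; the URL and flags are the ones from the question, and the helper that assembles the list is just for illustration:

```python
import subprocess

def build_wget_cmd(url, accept_exts, dest_dir):
    # assemble the argument list; no shell quoting is needed in this form
    return ['wget', '-r', '--accept', ','.join(accept_exts), url, '-P', dest_dir]

cmd = build_wget_cmd('ftp://ftp.apple.asimov.com/',
                     ['*.exe', '*.dll', '*.zip', '*.msi', '*.rar', '*.iso'],
                     r'e:\e')
# rc = subprocess.call(cmd)  # returns wget's exit code; nonzero means an error
```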
I am using pexpect.run to execute a command. See below:
cmd = "grep -L killed /dir/dumps/*MAC-66.log"
output = pexpect.run(cmd)
When I run this, output equals:
grep: /dir/dumps/*MAC-66.log: No such file or directory
But when I run the same command in my shell, it works every time. I don't see the problem. Any help is appreciated! Does pexpect.run require the command to be split in some fancy way?
Your shell is interpreting the glob, pexpect is not. You could either use python's glob.glob() function to evaluate the glob yourself, or run it through your shell, for example:
cmd = "bash -c 'grep -L killed /dir/dumps/*MAC-66.log'"
Also, if all you're after is output of this command, you ought to check out the subprocess module.