When I execute something like:
run('less <somefile>')
Within Fabric, each line of output is prefixed with Out: and interacting with it doesn't work as expected.
If I run it with:
run('cat <something>', pty=False)
The output isn't prepended with anything and I can actually pipe that into less locally, like:
fab less | less
However, I'm not sure if that's recommended, since it may be taxing on the remote end: cat will keep streaming the whole file back through ssh. Also, when I quit less before the whole file has been cat'd (it could be over 1 GB), I get a broken pipe error.
What would be the recommended way to facilitate this? Should I just use ssh directly like:
ssh <remote host> less <something>
If you're doing interactive work on the remote host, then maybe just using SSH is fine. I think fabric is mostly useful when automating actions.
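If you do end up scripting it from Python anyway, one minimal sketch (the host and file path below are made up) is to hand your terminal straight to ssh, forcing a pseudo-tty with -t so that full-screen programs like less can drive the terminal:

```python
import subprocess

def interactive_remote(host, command):
    # -t forces pseudo-tty allocation, which full-screen programs
    # like less need in order to control the terminal properly.
    return ["ssh", "-t", host] + list(command)

argv = interactive_remote("user@remotehost", ["less", "/var/log/syslog"])
# subprocess.call(argv) would then run less remotely in your terminal.
```

This avoids streaming the whole file locally; only the screensful you actually view cross the wire.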
Related
I am using fabric right now to upload a large directory of files to ~100 different servers. These servers don't have rsync installed, so that's out of the question. Right now, I am using the upload_project method to accomplish this, which is working well on the test servers I have, but I have heard that ftp (even sftp) isn't allowed on some of these servers and that I might also need to limit the bandwidth of the transfer. To avoid these problems, I am trying to use scp; however, I am having some issues. I originally thought I could just go by the code in the rsync_project method and do something like local("scp -r %s %s" % (local_str, remote_str)). However, scp still wants a password. So, I echoed env.password to scp, which worked, but then it needed me to type yes to accept the host key. I know I could just echo the password and echo yes to any key prompt, but I was wondering if there was some other way to accomplish this without all of the prompts. Also, sadly, the version of fabric on the server (which I cannot update) is a bit behind (1.6ish) and doesn't have the context-manager functionality that can handle prompts.
I was wondering if there was some other way to accomplish this without all of the prompts
The answer is not exactly related to Fabric. In the case of your scp command, you should look into the StrictHostKeyChecking option, to skip the known hosts key check. Example usage:
scp -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null $src $dest
It sounds like you've addressed the issue with the password prompt; was there anything else you were trying to solve here?
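In a Fabric 1.x fabfile you could build that scp invocation and hand it to local(). A hedged sketch (the paths and host are placeholders); note that scp only honors -o options placed before the file operands:

```python
def scp_command(local_path, remote_str):
    # Options must precede the operands, or scp treats them as filenames.
    opts = ("-o StrictHostKeyChecking=no "
            "-o UserKnownHostsFile=/dev/null")
    return "scp -r %s %s %s" % (opts, local_path, remote_str)

cmd = scp_command("/tmp/project", "deploy@host:/srv/project")
# In a fabfile: local(cmd)
```

Skipping host-key checking trades away MITM protection, so reserve it for trusted networks.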
So I'm writing a command line utility and I've run into a problem I'm curious about.
The utility can be called with file arguments or can read from sys.stdin. Originally I was using sys.stdin.isatty() to figure out whether data is being piped in; however, I found that if I call the utility remotely via ssh server utility, sys.stdin.isatty() returns False, despite the fact that no actual data is being piped in.
As a workaround I'm using - as the file argument to force reading from stdin (e.g. echo "data_here" | utility -f -), but I'm curious to know if there's a reliable way to tell the difference between a pipe that's feeding data from a process and a pipe that's only open because the call came in via ssh.
Systems programming is not my forte, so I'm grateful for any help I can get from you guys.
You can tell if you're being invoked via SSH by checking your environment. If you're being invoked via an SSH connection, the environment variables SSH_CONNECTION and SSH_CLIENT will be set. You can test if they are set with, say:
import os

if "SSH_CONNECTION" in os.environ:
    # invoked over SSH; adjust behavior accordingly
    ...
Another option, if you wanted to stick with your original sys.stdin.isatty() approach, would be to allocate a pseudo-tty for the SSH connection. Normally SSH does this by default for an interactive session, but not if you supply a command. However, you can force it to do so when supplying a command by passing the -t flag:
ssh -t server utility
However, I would caution you against doing either of these. As you can see, trying to detect whether you should accept input from stdin based on whether it's a TTY can cause some surprising behavior. It could also cause frustration from users if they wanted a way to interactively provide input to your program when debugging something.
The approach of adding an explicit - argument makes it a lot more explicit and less surprising which behavior you get. Some utilities also just use the lack of any file arguments to mean to read from stdin, so that would also be a less-surprising alternative.
According to this answer the SSH_CLIENT or SSH_TTY environment variable should be set. From that, the following code should work:

import os

def running_ssh():
    return 'SSH_CLIENT' in os.environ or 'SSH_TTY' in os.environ
A more complete example would examine the parent processes to check if any of them are sshd which would probably require the psutil module.
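If psutil isn't available, on Linux you can walk the parent chain yourself by reading /proc. A sketch of that idea (Linux-only, and it assumes /proc is mounted; it is not portable to macOS or Windows):

```python
import os

def ancestor_names(pid=None):
    """Yield the command names of this process and its ancestors by
    reading /proc/<pid>/stat (Linux only)."""
    pid = os.getpid() if pid is None else pid
    while pid > 1:
        try:
            with open("/proc/%d/stat" % pid) as f:
                data = f.read()
        except OSError:
            return
        # The command name sits in parentheses and may itself contain
        # spaces, so slice it out instead of naively splitting.
        name = data[data.index("(") + 1:data.rindex(")")]
        rest = data[data.rindex(")") + 2:].split()
        yield name
        pid = int(rest[1])  # rest[0] is the state, rest[1] the ppid

def launched_by_sshd():
    return "sshd" in ancestor_names()
```

Unlike the environment-variable check, this still works when a wrapper has scrubbed the environment, at the cost of being Linux-specific.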
I need to make a python script that will do these steps in order, but I'm not sure how to go about setting this up.
SSH into a server
Copy a folder from point A to point B (cp /foo/bar/folder1 /foo/folder2)
mysql -u root -pfoobar (This database is accessible from localhost only)
create a database, do some other mysql stuff in the mysql console
Replaces instances of Foo with Bar in file foobar
Copy and edit a file
Restart a service
The fact that I have to ssh into a server, and THEN do all of this, is really confusing me. I looked into the Fabric library, but that seems to only run one command at a time and doesn't keep context from previous commands.
I looked into the Fabric library, but that seems to only run one command at a time and doesn't keep context from previous commands.
Look into Fabric more. It is still probably what you want.
This page has a lot of good examples.
By "context" I'm assuming you want to be able to cd into another directory and run commands from there. That's what fabric.context_managers.cd is for -- search for it on that page.
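Under the hood, Fabric's cd simply prefixes each command with cd <dir> && , and you can sketch the same trick yourself if Fabric ever isn't an option: chain the steps into a single shell invocation so the working directory carries over. The host and steps below are made-up placeholders:

```python
def remote_invocation(host, steps):
    # Chaining with && runs everything in one remote shell, so the
    # working directory carries over between steps, and a failing
    # step stops the ones after it.
    return ["ssh", host, " && ".join(steps)]

argv = remote_invocation("user@server", [
    "cd /foo/bar",
    "cp -r folder1 ../folder2",
    "sudo service myservice restart",  # hypothetical service name
])
# subprocess.call(argv) would run all three steps in one SSH session.
```

That is exactly the "context" Fabric gives you, just spelled out by hand.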
Sounds like you are doing some sort of remote deployment/configuring. There's a whole world of tools out there to professionally set this up, look into Chef and Puppet.
Alternatively if you're just looking for a quick and easy way of scripting some remote commands, maybe pexpect can do what you need.
Pexpect is a pure Python module for spawning child applications; controlling them; and responding to expected patterns in their output.
I haven't used it myself but a quick glance at its manual suggests it can work with an SSH session fine: https://pexpect.readthedocs.org/en/latest/api/pxssh.html
I have never used Fabric.
My way to solve those kinds of issues (before I started using SaltStack) was to use pexpect to drive the ssh connection and run all the commands that were needed.
Maybe using a series of SQL scripts for the database work (just to make it easier) would help.
Another way, since you need to access the remote server over ssh, would be to use paramiko to connect and execute commands remotely. It's a bit more involved when you want to see what's happening on stdout (whereas with pexpect you see exactly what's going on).
But it all depends on what you really need.
What I'm wanting to do, is run a bash script on another computer, from python. My first thought was to SSH into my remote computer with python, and run it that way. Is there any other way of running a remote bash script in python?
Btw, the remote bash script is on the same network as my python script.
As others have said, ssh is the normal way to do this. If you need to detach the other program (i.e. you don't want to keep the ssh link open), have a look at screen. There are certainly more efficient ways to do this, but you'll have to give more information to get a more detailed answer.
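One stdlib-only sketch of the ssh route: feed the script to bash -s over stdin, so the same helper runs a script locally (empty prefix) or remotely (prefix ["ssh", "user@host"], a hypothetical host), and the script file only has to exist on the machine launching it:

```python
import subprocess

def run_script(argv_prefix, script_text):
    # "bash -s" reads the script from stdin, so the script only needs
    # to exist on the machine that launches the command.
    return subprocess.run(
        argv_prefix + ["bash", "-s"],
        input=script_text, capture_output=True, text=True,
    )

# Local dry run of the same mechanism:
result = run_script([], "echo hello from bash\n")
# run_script(["ssh", "user@host"], script_text) would run it remotely.
```

Because both paths go through the same code, you can test the script locally before pointing it at the remote machine.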
I want to run a python script on my server (that python script has GUI). But I want to start it from ssh. Something like this:
ssh me@server -i my_key "nohup python script.py"
...and then let the script run forever.
BUT it complains "unable to access video driver" since it is trying to use my ssh terminal as output.
Can I somehow make my command's output stay on the server machine instead of coming to my terminal? Basically something like wake-on-LAN functionality: tell the server what you want and it does everything using its own system (without sending any output back).
What about:
ssh me@server -i my_key "nohup python script.py >/dev/null 2>&1 &"
The trailing & puts the script in the background so the ssh session can return immediately, and you can redirect to a remote logfile instead of /dev/null, of course. :)
EDIT: GUI applications on X use the $DISPLAY environment variable to know which display they should draw to. Moreover, X11 display servers use authorization to permit or deny applications connecting to the display. The commands
export DISPLAY=:0 && xhost +
may be helpful for you (note that xhost + disables access control entirely, so don't leave it that way on an untrusted network).
Isn't it possible for you to use a Python ssh library instead of calling the external application?
It would:
run as one process
guarantee that the invocation is the same across all systems
avoid the overhead of spawning an external process
send everything through ssh (you won't have to worry about input like "; some-locally-executed-command" being interpreted by your local shell)
If not, go with what Piotr Wades suggested.