I am doing research that requires me to run multiple experiments with many permutations of parameters. My main issue is that I would like to free up my main machine for my daily tasks and offload my experiments to other machines (I have two extra laptops).
Currently I am using RQ (python-rq) to queue and run tasks on those "remote servers". Once Redis is running on my main machine and my remote servers, I simply run my code, which queues the tasks to the Redis port on the remote (forwarded via SSH). This seems to work fine, except that I need to make sure to push my code to the remote server beforehand, otherwise the task fails: the remote is essentially running old code.
I have two questions:
Does this pattern make sense or is there a better way for me to offload tasks to remote servers?
Is there a way I can ensure the remote always has up-to-date code when I start queueing tasks? (I am currently thinking of adding a function to my code that copies the current directory to the remote via scp; see the sketch below.)
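Something like this is what I have in mind; a rough sketch assuming rsync for the copy and RQ for the queue (the host name, paths, task function, and parameter grid are all made up, and it connects to Redis directly rather than through an SSH tunnel, for brevity):

    import subprocess
    from redis import Redis
    from rq import Queue

    # Sync the project first so the worker never runs stale code.
    subprocess.run(
        ['rsync', '-az', '--delete', './', 'me@laptop1:~/experiments/'],
        check=True)

    # Then enqueue one job per parameter combination.
    q = Queue(connection=Redis(host='laptop1', port=6379))
    grid = [{'lr': lr, 'seed': s} for lr in (0.1, 0.01) for s in (0, 1)]
    for params in grid:
        q.enqueue('experiments.run_one', params)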
Thanks for your help with this,
NB: my code is all in Python and I would prefer to keep it that way.
Is it possible to create a hyperlink/button that calls a bash/Python script on the user's local machine? I did search the topic, but there is a lot of discussion about opening a port to a server (even a local port); I don't want to open a port, just execute everything locally. Is this even possible?
Thanks
No. That is not possible. Nor desirable, due to the security implications.
I am writing a Python script that creates and runs several VMs via virsh for the user. Some of the configuration has to be done by executing commands inside the VM which I would want to do automatically.
What would be the easiest way to get remote shell access in Python? I am considering the following approaches:
To run the virsh console command as a subprocess and do I/O against it.
To bring up an SSH session to the VM. I can configure the VM before it boots by editing its file system, so I know its target IP address.
Some better API for doing this, e.g. RPC?
I need the return values of the commands so I know whether they executed correctly, and for that matter I need to be able to detect when a program I invoke has finished. Options #1 and #2 rely on scraping the output, and that gets complex.
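For option #2, I imagine something along these lines (a rough paramiko sketch; the IP, credentials, and command are placeholders):

    import paramiko

    # Placeholder address/credentials -- the IP is known because the VM's
    # file system was edited before boot.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect('192.168.122.10', username='root', password='secret')

    # exec_command returns separate streams; recv_exit_status() blocks
    # until the command finishes and yields its return code, so there is
    # no output scraping involved.
    stdin, stdout, stderr = client.exec_command('systemctl start myservice')
    status = stdout.channel.recv_exit_status()
    if status != 0:
        print('command failed:', stderr.read().decode())
    client.close()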
Any suggestions much appreciated.
I want to automate the remote deployment that I am currently doing manually.
The process includes:
Make a tarball from certain folders
SFTP it to the remote server
Rename the old folders
Untar the new tarball
Restart Apache
The remote system is on the intranet and has no access to the outside internet.
I want to know how I can transfer the file from my Python script and then, once the transfer is complete, log in over SSH and do the rest. I am confused about how to achieve that. On localhost I can do all of this, but how do I do it on a remote host?
For simple & dirty work you can use Fabric (which is by no means to say that you cannot use Fabric to build a serious product).
For heavy configuration routines, you'd be better off picking a configuration management system (e.g. Ansible).
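A minimal sketch of the steps above using Fabric 2.x; the host, paths, and service name are assumptions:

    from fabric import Connection

    # Placeholder host, paths and service name -- adapt to your setup.
    c = Connection('deploy@intranet-server')

    c.put('release.tar.gz', '/tmp/release.tar.gz')   # SFTP upload
    c.run('mv /srv/app /srv/app.old')                # rename the old folder
    c.run('mkdir /srv/app && tar xzf /tmp/release.tar.gz -C /srv/app')
    # sudo may prompt for a password unless configured in advance
    c.sudo('systemctl restart apache2')              # restart Apache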
I'm writing a Python script to access all computers on the network, log in to them and read some log files. I don't want to use something as low-level as socket, but I can if I must. I realize that my problem is similar to this question, but not the same.
Are there any modules for accessing external Windows machines?
Has anyone done anything like this before?
I'm specifically looking to log into Windows 7 machines, not unix.
Let's also assume that each computer I want to log into has Remote Desktop installed and enabled. I'm also not worried about network security or encryption because these files are not confidential. Windows machines don't have SSH installed by default, do they?
There has to be something on the other side for you to talk to. This limits you to setting up a "server" on each machine, installing a real server (i.e. sshd), building a "server" yourself and installing it, or using a built-in and active feature of the OS.
Based upon this, what kind of system do you want to set up on these machines? What does it need to do? Just read the contents of a prespecified file list? Will that list change?
- One solution is to turn on Telnet and use telnetlib or twisted to talk across it. This isn't very secure, of course.
- Next up, set up a Samba share and access the folder remotely. This is also insecure, though less so than Telnet.
- You could find an SSH daemon ported to Windows and run that, if you are so inclined.
- PsExec from Sysinternals might work (see the sketch below).
- Use twisted to build a server app with the features you need.
- Use ncat to listen on a port and spawn a cmd prompt.
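For the PsExec route, a sketch along these lines might work (the machine name, credentials, and log path are made up; PsExec must be on the PATH and the remote machine reachable):

    import subprocess

    # PsExec runs 'cmd /c type' remotely and relays the output and exit
    # code back to us.
    result = subprocess.run(
        ['psexec', r'\\winbox7', '-u', 'admin', '-p', 'secret',
         'cmd', '/c', 'type', r'C:\logs\app.log'],
        capture_output=True, text=True)

    print(result.returncode)   # PsExec propagates the remote exit code
    print(result.stdout)       # the remote file's contents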
Be aware that most of the solutions for accessing Windows remotely are... poor. The best solution is probably to roll your own, but that is hard work and you will probably make mistakes.
Also, Windows 7 is not exactly multi-user friendly. Individual processes can run as separate users, but the OS does not support having multiple users logged in at the same time. Someone is going to be the "user" and everyone else is just a process with a different credential set.
This is more an artificial limitation on Microsoft's part than anything technical. To see it in action, try to log in with RDP while a user is logged in locally. Fun times.
Per your edit, the easiest thing to do is just set up a samba share on the box.
After this share is set up:
    # Read the log straight off the UNC path once the share is visible.
    with open(r'\\myCompNameOrIP\C\windows\logs\logfile.txt', 'r') as logfile:
        loglines = logfile.readlines()
Or you can use the gencat sample found here. Just give it r'\\myCompNameOrIP\C\windows\logs\*.txt' as the search path and watch the magic.
From Ubuntu I use samba:
In Bash:
gvfs-mount smb://them/folder
Here I supply the username, domain and password.
Then in Python:
folder = '/home/me/.gvfs/folder on them'
Using the os module I read the folders and files inside.
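For example (a sketch; the mount point follows the pattern above and depends on the share name and your gvfs version):

    import os

    # gvfs exposes the mounted share as an ordinary directory.
    folder = '/home/me/.gvfs/folder on them'

    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            with open(path) as f:
                print(name, len(f.readlines()), 'lines')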
I am working in a small business environment.
Why not have each of the computers send the log file to the central computer?
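For instance, a small script scheduled on each machine (e.g. via Task Scheduler) could push its log to a share on the central box; the paths and share name here are placeholders:

    import os
    import shutil

    # Copy this machine's log to the central share, tagged with its name.
    shutil.copy(
        r'C:\logs\app.log',
        r'\\central-server\logs\%s-app.log' % os.environ['COMPUTERNAME'])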