SSH into server using python script

I am an astronomy student working with a large data set. I have 80 TB of .fits files on a supercomputer that I am trying to process with a Python script. I could process the data on the supercomputer by submitting a job (which sits in the queue for ages), or I could process it on my local desktop without downloading all the data first, since storage constraints make downloading the full 80 TB impossible. I was wondering if there is a way to run the processing script on my local desktop while it reads the data from the supercomputer over secure shell.
Thanks.

Check out Perform commands over ssh with Python, Download files over SSH using Python, and How to list all the folders and files in the directory after connecting through sftp in python.
You could put it in a loop: get a file > parse it > get the next file, and so on.
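A minimal sketch of that loop with paramiko (the host, user, and remote path are placeholders, key-based login is assumed to be set up already, and astropy is assumed for the .fits parsing):

    import paramiko
    from astropy.io import fits  # assumed for reading .fits files

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("supercomputer.example.edu", username="astro")  # placeholders

    sftp = client.open_sftp()
    remote_dir = "/data/survey"  # placeholder path
    for name in sftp.listdir(remote_dir):
        if not name.endswith(".fits"):
            continue
        # stream one file at a time instead of downloading all 80 TB
        with sftp.open(f"{remote_dir}/{name}", "rb") as remote_file:
            with fits.open(remote_file) as hdul:
                print(name, hdul[0].header.get("OBJECT"))  # your analysis goes here

    sftp.close()
    client.close()

Reading remotely like this trades speed for storage; if a library insists on a real file, sftp.get() each file to a temporary location, process it, and delete it before fetching the next one.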

Related

Prompt "server" PC to run python file

We have developed code to compile calculations into .tex files and then convert them to PDFs. Instead of generating the PDF on local PCs, we now want to send the .tex file to a hosting PC and prompt that PC to run a Python file (using its own Python.exe instance) that generates the PDF and sends it to a server folder.
It might seem like a silly approach (since the PDF can be generated on the local PC), but we are trying to completely remove the step where people have to install software like MiKTeX and Strawberry on local PCs, without using an online converter like Overleaf.
Is there a way to do this?
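One possible shape for the hosting-PC side is a small watch-folder script. This is only a sketch; the folder paths are placeholders and it assumes pdflatex is on the host's PATH:

    import subprocess
    import time
    from pathlib import Path

    inbox = Path(r"C:\texjobs\in")    # placeholder: where local PCs drop .tex files
    outbox = Path(r"C:\texjobs\out")  # placeholder: server folder for finished PDFs

    while True:
        for tex in inbox.glob("*.tex"):
            # run the host's own TeX install; local PCs never need MiKTeX
            subprocess.run(
                ["pdflatex", "-interaction=nonstopmode",
                 "-output-directory", str(outbox), str(tex)],
                check=False,
            )
            tex.unlink()  # remove the job whether or not it compiled, so bad files are not retried forever
        time.sleep(5)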

How to always run specific python script on windows server manager 2016

I am new to the community and new to using servers, and I could use some help.
I am trying to set up an automatic JSON parser that forwards files to another server using HTTP POST calls. The idea is as follows:
I manually put JSON files into an Input folder on the server
A Python script that is always running on the server reads the files located within that folder
It reads the JSON files, posts all objects to another server, and moves the files to a "Processed" folder one file at a time.
I have been given a Windows Server with Windows Server Manager 2016, and have managed to do the following:
installed Python 3.8.2 on the Windows server
able to run a Python script using PowerShell
installed NSSM to create a Windows service
Now the Windows server manager says I cannot resume or start the service that I tried to install via NSSM.
I am very new to servers, as well as to Python itself. Can somebody help me get a Python script running 24/7 on a Windows server with Windows Server Manager 2016?
Edit:
I managed to create a Python script that can read files, upload them, and move them to a Processed folder, but I still have to run it myself, whereas I want it to always run on the server.
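For reference, the always-running loop itself can be quite small. This is only a sketch, with placeholder folders and endpoint, and it assumes the requests library is installed:

    import shutil
    import time
    from pathlib import Path

    import requests

    INPUT = Path(r"C:\data\Input")        # placeholder folders
    PROCESSED = Path(r"C:\data\Processed")
    TARGET = "http://other-server/api"    # placeholder endpoint

    while True:
        for path in sorted(INPUT.glob("*.json")):
            with open(path, "rb") as f:
                resp = requests.post(
                    TARGET, data=f,
                    headers={"Content-Type": "application/json"},
                )
            resp.raise_for_status()
            # move the file only after a successful post
            shutil.move(str(path), str(PROCESSED / path.name))
        time.sleep(10)

With NSSM the service would then be created with something like nssm install JsonForwarder "C:\Python38\python.exe" "C:\scripts\forwarder.py" (example paths). If the service refuses to start, the Windows event log and NSSM's configured stdout/stderr redirection files usually show the reason.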

transfer files between local machine and remote server

I want to give a remote Ubuntu server access to my local machine, because I have multiple files on this machine that I want to transfer to the server periodically (every minute). How can I do that using Python?
Depending on your local machine OS and network setup, I would recommend the following:
File transfers
Based on the file size, if it's a small copy, I would use scp (secure copy), because of the simplicity of the command.
In most use cases, however, I would use rsync because of its great capabilities, most importantly its ability to handle failed or partial transfers. It works by analysing the differences between source and destination, and it has pretty much every option under the sun (overwriting, deltas, etc.).
Note that when using these commands in an automation script over a longer period of time, you'll probably want to set up a static IP or DDNS for your remote machine.
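For example (placeholders in brackets): rsync -az --partial [localfolder]/ [user]@[address]:/[remotepath], where -a preserves permissions and timestamps, -z compresses during transfer, and --partial keeps interrupted transfers around so they can be resumed.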
Python
To run shell commands from a Python script, use pexpect. It's a pure-Python take on the classic expect tool, and it's fantastic. I used it the other day to transfer a folder from a dev computer to a number of different Raspberry Pis remotely at the same time. Check out the documentation here: https://pexpect.readthedocs.io/en/stable/
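A short sketch of driving scp with pexpect (host, paths, and the hard-coded password are placeholders; SSH keys are the safer choice in practice):

    import pexpect

    # spawn scp and answer its password prompt programmatically
    child = pexpect.spawn("scp -r /local/folder pi@raspberrypi.local:/home/pi/folder")
    child.expect("password:")
    child.sendline("secret")      # placeholder credential
    child.expect(pexpect.EOF)     # block until the copy finishes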
Automation
As for automation, it really depends on how you want it set up. If you want the Python script responsible for transferring data to be run on a schedule, you could look into crontab. It's very well known by admins, so it is easy to Google.
Alternatively, if this is part of a Python app, you could have the app running in the background and sleeping (time.sleep(), or a time-elapsed check) between transfers. If you need to do other things in the same Python app, you can stick the whole transfer-and-sleep part into a thread (also easily implemented in Python), as in the sketch below.
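A minimal version of that thread-plus-sleep pattern (transfer() is a stand-in for whichever copy mechanism you pick):

    import threading
    import time

    def transfer():
        # placeholder: call your scp/rsync/pexpect logic here
        print("transferring...")

    def transfer_loop(interval=60):
        while True:
            transfer()
            time.sleep(interval)  # wait a minute between transfers

    # daemon=True means this thread won't keep the app alive on exit
    threading.Thread(target=transfer_loop, daemon=True).start()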
I hope this helps, let me know if you want details elaborated.
You can easily transfer files between a local and a remote machine, or between two remote servers. If both servers are Linux-based and you need to transfer multiple files and folders with a single command, follow these steps:
The user on one remote server should have access to the corresponding directory on the other remote server that you want to transfer the files to.
You might need to create a policy or group, assign the servers you want to access to that group, and assign the user to that group, so the two remote servers can talk to each other.
Then run the following scp command:
scp [options] username1@source_host:directory1/filename1 username2@destination_host:directory2/filename2

How can I automate remote deployment in python?

I want to automate the remote deployment which currently I am doing manually.
The process includes
Make the tarball from certain folders
SFTP to the remote server
Rename the old folders
Untar the new tar file
Restart apache
The remote system is on the intranet and has no access to the outside internet
I want to know how I can transfer the file from my Python script and then, when the transfer is complete, log in over SSH and do the rest. I am confused about how to achieve that: on localhost I can do all of this, but how can I do it on a remote host?
For simple & dirty work you can use fabric (which is by no means to say that you cannot use fabric to build a serious product); a sketch is below.
For heavy configuration routines, you'd be better off picking a configuration management tool (e.g., ansible).
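A sketch of the question's five steps with Fabric 2.x (the host, paths, and the apache2 service name are placeholders):

    from fabric import Connection  # Fabric 2.x

    c = Connection("deploy@intranet-host")  # placeholder host

    # 1-2: build the tarball locally and upload it over SFTP
    c.local("tar czf release.tar.gz app/")
    c.put("release.tar.gz", "/srv/www/release.tar.gz")

    # 3-5: rename the old folder, untar the new one, restart apache
    c.run("mv /srv/www/app /srv/www/app.old")
    c.run("tar xzf /srv/www/release.tar.gz -C /srv/www")
    c.sudo("systemctl restart apache2")

Fabric runs everything over SSH/SFTP, which suits an intranet-only host with no outside internet access.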

update file on ssh and link to local computer

I may be being ignorant here, but I have been researching for the past 30 minutes and have not found how to do this. I was uploading a bunch of files to my server and then, before they all finished, I edited one of those files. How can I update the file on the server to match the file on my local computer?
Bonus points if you tell me how I can link the file on my local computer so it auto-updates on the server when I connect (if that is possible, of course).
Just use scp and copy the file back.
scp [user]@[address]:/[remotepath]/[remotefile] [localfolder] if you want to copy the file on the server back to your local machine, or
scp [localfolder]/[filename] [user]@[address]:/[remotepath] in case you want to copy the file to the server again. The elements in [] have to be exchanged with actual paths and/or file names. On the remote side the path has to be absolute; on the local machine it can be absolute or relative. More information on scp.
This of course means that the destination file will be overwritten.
Maybe rsync would be an option. It is capable of synchronizing different folders, can be used to sync folders over the network, and combines well with ssh.
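For example, rsync -az -e ssh [localfolder]/ [user]@[address]:/[remotepath] (placeholders in brackets) would push the local folder to the server and transfer only the differences on each run.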
Have you considered Dropbox or SVN?
I don't know your local computer's OS, but if it is Linux or OS X, you can consider LFTP. This is an FTP client which supports sftp://. The client has "mirror" functionality: with a single command you can mirror your local files against a server.
Note: what you need is a reverse mirror; in LFTP this is mirror -R.
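A one-shot invocation could look like lftp -u [user] -e "mirror -R [localfolder] [remotepath]; quit" sftp://[address] (placeholders in brackets), which uploads whatever differs between the local folder and the server copy.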
