Python: Copy a CSV file from one RPi to another [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 5 years ago.
I have 2 Raspberry Pi 3s running Ubuntu MATE.
On each RPi there is a CSV file to be read or copied (depending on whether it is the master or not).
A Python script runs on RPi #1 and needs to copy the file from RPi #2 (and read both as local files).
How can this be done using SSH?

Too many options. But in general I'd shell out unless you have a good reason not to:
import subprocess

# run `cat` on the remote host over SSH and capture its stdout locally
result = subprocess.run(['ssh', 'dude@otherpi', 'cat /var/lol/cats.csv'], stdout=subprocess.PIPE)
result.check_returncode()
cats_csv = result.stdout
We're telling Python to run this command: ssh dude@otherpi "cat /var/lol/cats.csv". An ssh process will connect to dude@otherpi and run cat /var/lol/cats.csv on the remote machine. You can try this by running the line in your shell. The output of the command is captured by Python; we configured this with stdout=subprocess.PIPE. The call to check_returncode is there to abort in case something goes wrong, like a connection error or a missing file.
Instead of immediately capturing the entire CSV, you could also copy it over and then open it locally. This would allow handling big files. The command would then be ['rsync', 'dude@otherpi:/var/lol/cats.csv', '/tmp/cats.csv']. Use scp if rsync is not available.
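A minimal sketch of that copy-then-read variant, reusing the hypothetical host and paths from above:
import subprocess

# copy the remote CSV to a local path first ...
subprocess.run(['rsync', 'dude@otherpi:/var/lol/cats.csv', '/tmp/cats.csv'], check=True)
# ... then open and read it like any local file
with open('/tmp/cats.csv') as f:
    cats_csv = f.read()
Here check=True raises CalledProcessError if the transfer fails, mirroring the check_returncode call above.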

Another useful way, since it is not a big file, is copying it to the local RPi:
result = subprocess.run(['scp', 'guy@192.168.2.112:/home/guy/PythonProjects/schedule.csv', '/home/guy/schedule_192.168.2.112.csv'], stdout=subprocess.PIPE)
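Once copied, the file can be parsed like any local CSV; a small sketch using the standard library's csv module (the path matches the scp call above):
import csv

with open('/home/guy/schedule_192.168.2.112.csv', newline='') as f:
    for row in csv.reader(f):
        print(row)  # each row is a list of column strings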

Related

Running multiple python files on aws 24x7 [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 1 year ago.
I'm running 6 Python files on an AWS EC2 Ubuntu instance; they are Telegram bots.
They run fine, but once in a while one of them stops running, and I have to find its screen session and start it again.
Is there any way to keep these scripts running reliably?
If I reboot Ubuntu every day using crontab, would it automatically run all the .py files afterwards?
You can make a shell script that checks whether a specific Python file is running and starts it if it is not. Once that shell script is done and working, you can make a cron job to run it every x minutes, or however often you want (see the crontab example after the scripts below).
Single Application Lookout
The following Bash script checks whether a specified Python script is running, and starts it if it is not:
#!/bin/bash
current_service_name="YOUR_SCRIPT_NAME.py"
if ! pgrep -f "python3 ${current_service_name}"
then
    python3 "${current_service_name}"
fi
Multiple Application Lookout
#!/bin/bash
all_services=("YOUR_SCRIPT_NAME_1.py" "YOUR_SCRIPT_NAME_2.py" "YOUR_SCRIPT_NAME_3.py" "YOUR_SCRIPT_NAME_4.py" "YOUR_SCRIPT_NAME_5.py" "YOUR_SCRIPT_NAME_6.py")
for index in "${!all_services[@]}"; do
    current_service_name="${all_services[$index]}"
    if ! pgrep -f "python3 ${current_service_name}"
    then
        python3 "${current_service_name}"
    fi
done
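To run the check automatically, register the watchdog script with cron. A hypothetical crontab entry (added via crontab -e) that runs it every 5 minutes, assuming the script was saved as /home/ubuntu/check_bots.sh and made executable:
*/5 * * * * /home/ubuntu/check_bots.sh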

how to test if a python script is running successfully through a batch file [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers. This question does not appear to be about programming within the scope defined in the help center. Closed 3 years ago.
I have 2 Python scripts that do some geoprocessing; one depends on the other, and I run them via a batch file. At the end of the execution, I send an email using a PowerShell script for feedback.
I just want to receive emails when there is an error, not every time the script runs.
I want a way to test whether the batch file ran the Python scripts successfully.
Any ideas?
You have to control the exit code of your Python script depending on whether it succeeded, for example via exit(1).
I would start the Python scripts via PowerShell; there you can get the exit code of applications via $LASTEXITCODE.
Python code (run when the script does not finish as intended):
exit(1)  # a nonzero exit code signals failure to the caller
PS code:
# run the python script here, e.g. (hypothetical name):
python .\my_geoprocessing_script.py
if ($LASTEXITCODE -gt 0) {
    # send mail
}
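On the Python side, a minimal sketch of signalling failure with a nonzero exit code; main() and its contents are illustrative, not from the original scripts:
import sys

def main():
    ...  # geoprocessing work that raises an exception on error

if __name__ == '__main__':
    try:
        main()
    except Exception as err:
        print(err, file=sys.stderr)
        sys.exit(1)  # PowerShell will see $LASTEXITCODE -gt 0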

Improving IO performance on a mounted directory? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 3 years ago.
I'm trying to optimize a Python test that involves writing some 100k files to a mounted directory using SSH and a simple Bash command.
I'm rather inexperienced at this, so I need some advice on how to minimize IO time.
Basically, the Python script mounts a directory on a remote server (let's call it %MOUNTED_DIRECTORY%), then SSHes into the remote host and runs the following Bash command there:
for number in `seq 1 100000`; do touch %MOUNTED_DIRECTORY%/test_file$number; done
I find that a lot of time is spent on this process, waiting for the creation of the files to finish. I need the files to be created before I continue, so I can't do anything else in the meantime; I have to speed up the process itself.
Also, when the directory is mounted it takes much longer to finish than when it is not, which is why I'm in this situation in the first place.
I thought about multithreading or multiprocessing, but they don't seem to help, either because I'm doing something wrong or because the command actually runs on the remote host and creates the files with Bash, not Python.
With xargs:
seq 1 100000 | sed 's|^|%MOUNTED_DIRECTORY%/test_file|' | xargs touch
This passes as many file names as possible to each touch invocation, so instead of starting 100,000 touch processes (one per file, as the original loop does), only a handful are spawned.
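Since the test driver is Python, a minimal sketch of running that pipeline on the remote host over SSH via subprocess; the host name and mount path are hypothetical stand-ins:
import subprocess

remote = 'user@remote-host'  # hypothetical remote host
mounted_dir = '/mnt/test'    # hypothetical value of %MOUNTED_DIRECTORY%
cmd = "seq 1 100000 | sed 's|^|{}/test_file|' | xargs touch".format(mounted_dir)
subprocess.run(['ssh', remote, cmd], check=True)  # raises if the remote command fails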

Python program to grab a file off of Raspberry Pi [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 4 years ago.
I am looking to automate file retrieval with a Python program that gets a file from my Raspberry Pi and returns it to my local PC. I have tried SSH, FTP, and SCP but can't get any of them to work; I run into connection problems in my Python program. Does anyone have a quick code snippet? Below is the code I think should work, but I'm getting an error.
From: Raspberry Pi Zero W
Receiving PC: Windows 10, running a Python program in PyCharm
IDE: PyCharm
NOTE: Both are on the same network; ssh, PuTTY, command-line SCP, and remote desktop all work to the Pi, but I can't do the same by just running a Python program to get a file.
Filename: testfile.jpg
Pi directory: /home/pi/testfile.jpg
I'm open to any method to retrieve the file, as long as it can do it automagically.
Ideas?
Thank you!
The code fails with a cryptography deprecation warning and won't even make a simple connection; I feel the problem is on my local PC:
from paramiko import SSHClient
from scp import SCPClient
ssh = SSHClient()
ssh.connect(ipadd.re.ss)
I can't get past here; the error is below:
Error: CryptographyDeprecationWarning: encode_point has been deprecated on EllipticCurvePublicNumbers and will be removed in a future version. Please use EllipticCurvePublicKey.public_bytes to obtain both compressed and uncompressed point encoding.
m.add_string(self.Q_C.public_numbers().encode_point())
Have you heard of Paramiko? It's an SSH client for Python.
You could do something like this (the host name and user are placeholders; the client setup lines are filled in to make the snippet runnable):
from paramiko import SSHClient

client = SSHClient()
client.load_system_host_keys()
client.connect('raspberrypi.local', username='pi')  # hypothetical host and user
i, o, e = client.exec_command('cat /home/pi/testfile.jpg')
with open('testfile.jpg', 'wb') as f:
    f.write(o.read())  # stdout of the remote cat carries the file's raw bytes
client.close()
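Since the question already imports the scp package, here is a sketch using SCPClient over the same paramiko connection; the host and user are again hypothetical, the file paths are from the question:
from paramiko import SSHClient
from scp import SCPClient

ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect('raspberrypi.local', username='pi')  # hypothetical host and user
with SCPClient(ssh.get_transport()) as scp:
    scp.get('/home/pi/testfile.jpg', 'testfile.jpg')  # remote path -> local name
ssh.close()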

Run python in terminal and don't terminate when terminal is closed [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers. This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. Closed 9 years ago.
I need to make a static website, so I connected via ssh to a local server where I want to host it. Then I used Python to serve it:
$ python -m http.server 55550
But if I close the terminal, the Python program is terminated. I want to shut down my own computer but leave the process running on the server, so other people can still access the website.
How can I do this? And how should I terminate that process later?
Thanks for any help.
Use nohup, which makes the command immune to the hangup signal sent when the terminal closes:
nohup python -m http.server 55550
To terminate the process, simply kill it using the kill command, just like any other process.
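For example, assuming the server was started with the command above, you can look up its PID with pgrep and kill it in one step:
kill $(pgrep -f "http.server 55550")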
You can also launch it in the background:
python -m http.server 55550 &
then enter
disown
to detach the process from the current terminal.
Or use screen:
screen
python -m http.server 55550
Press Ctrl+A, then D, to detach from the screen session, then:
exit
Shut down your computer.
...
Start your computer again, ssh into your server, and reattach with:
screen -r
