Communicate between two python scripts on two systems over wifi [closed] - python

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I am running a Python script on a WiPy board (MicroPython environment) and another Python script on an embedded Linux system. The WiPy board is connected to the Linux system over WiFi. I was wondering how to create bi-directional communication for passing data between the two independent scripts.
I have looked into both threading and multiprocessing, but I don't know whether either would be appropriate here, so I'm just looking for a conceptual answer to give me a place to start.

Threading and multiprocessing have nothing to do with your problem; they are about running multiple programs, or parts of a program, concurrently on the same system.
What you want is to use the network to send/receive messages. Please read the
WiPy documentation:
About your WIFI connections
About TCP sockets
The part about TCP sockets should be exactly what you need; the part about WiFi connections will tell you how to adjust the WiFi settings of your board.
The same goes for your embedded Linux system: look for documentation on your system and check the chapter about sockets. I would open a server on one of the devices (or both) and have the other device connect to that server to exchange the data it needs. It might be a good idea to use the device with more resources as the server.
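As a concrete starting point, here is a minimal sketch of that client/server pattern, run against localhost so it is self-contained. On the real setup, the server function would run on the Linux box (bound to its WiFi IP instead of 127.0.0.1), and the client function on the WiPy, where MicroPython's `socket` module offers much the same API (though you may need explicit `close()` calls instead of `with`). The port and message contents are arbitrary choices for illustration.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # on the real setup: the Linux box's WiFi IP

ready = threading.Event()

def serve_once():
    # Server side -- run this on the device with more resources (the Linux box).
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                       # tell the client we are accepting
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)        # request from the client
            conn.sendall(b"ack:" + data)  # reply: the link is bi-directional

def ask(message):
    # Client side -- on the WiPy, MicroPython's socket module works much the same.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(message)
        return cli.recv(1024)

server = threading.Thread(target=serve_once)
server.start()
ready.wait()                # don't connect before the server is listening
reply = ask(b"temp?")
server.join()
print(reply)  # b'ack:temp?'
```

In a real deployment you would wrap `ask` in a loop on the WiPy and keep `accept` running in a loop on the server, but the send/receive mechanics stay the same.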


Improving IO performance on a mounted directory? [closed]

Closed 3 years ago.
I'm trying to optimize a Python test that involves writing some 100k files to a mounted directory, using SSH and a simple Bash command.
I'm rather inexperienced at this, so I need some advice on how to minimize IO time.
Basically, the Python script mounts a directory from a remote server (let's call it %MOUNTED_DIRECTORY%), then SSHes into the remote host and runs the following Bash command there:
for number in `seq 1 100000`; do touch %MOUNTED_DIRECTORY%/test_file$number; done
I find that a lot of time is spent on this process, waiting for the creation of the files to finish. I need the files to exist before I continue, so I can't do anything else in the meantime; I have to speed up the process.
Also, when the directory is mounted it takes much longer to finish than when it's not, which is why I'm in this situation in the first place.
I thought about multithreading or multiprocessing, but they don't seem to help, either because I'm doing something wrong or because the command actually runs on a remote host and creates the files with Bash, not Python.
With xargs:
seq 1 100000 | sed 's|^|%MOUNTED_DIRECTORY%/test_file|' | xargs touch
This passes as many names as possible to each touch command.
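The speed-up comes from amortizing process start-up: one `touch` per batch of names rather than one per file. Here is a self-contained Python sketch of the same batching idea, using a temporary directory as a stand-in for %MOUNTED_DIRECTORY% (and only 1,000 files, to keep it quick); the batch size of 500 is an arbitrary illustration, while `xargs` picks the largest batch the OS argument limit allows.

```python
import subprocess
import tempfile
from pathlib import Path

# Throwaway local directory standing in for %MOUNTED_DIRECTORY%.
target = Path(tempfile.mkdtemp())

names = [str(target / f"test_file{n}") for n in range(1, 1001)]

# One touch invocation per batch of names instead of one per file --
# the same idea xargs applies, just driven from Python.
BATCH = 500
for i in range(0, len(names), BATCH):
    subprocess.run(["touch", *names[i:i + BATCH]], check=True)

print(sum(1 for _ in target.iterdir()))  # 1000
```

Over SSH you would wrap the whole pipeline in a single remote command (e.g. `ssh host 'seq 1 100000 | sed ... | xargs touch'`) so that only one round trip pays the connection cost.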

Python program to grab a file off of Raspberry Pi [closed]

Closed 4 years ago.
I am looking to automate file retrieval with a Python program that gets a file from my Raspberry Pi and returns it to my local PC. I have tried SSH, FTP and SCP, but I can't get any of them to work and run into connection problems in my Python program. Does anyone have a quick code snippet? Below is the code I think should work, but I'm getting an error.
Sending Pi: Raspberry Pi Zero W
Receiving PC: Windows 10, running the Python program in PyCharm
NOTE: Both are connected to the same network; SSH, PuTTY, command-line SCP, and remote desktop all work to the Pi, but I can't do the same by just running a Python program to get a file.
Filename: testfile.jpg
Directory on the Pi: /home/pi/testfile.jpg
I'm open to any method to retrieve the file, as long as it can do it automagically. Ideas? Thank you!
The code fails with a cryptography deprecation warning and won't make a simple connection; I feel it's something on my local PC?
from paramiko import SSHClient
from scp import SCPClient

ssh = SSHClient()
ssh.load_system_host_keys()  # load known hosts so connect() can verify the Pi
ssh.connect(ipadd.re.ss)     # fixed: the method is connect(), not Connect()
I can't get past this point; the error is below:
Error: CryptographyDeprecationWarning: encode_point has been deprecated on EllipticCurvePublicNumbers and will be removed in a
future version. Please use EllipticCurvePublicKey.public_bytes to
obtain both compressed and uncompressed point encoding.
m.add_string(self.Q_C.public_numbers().encode_point())
Have you heard of Paramiko? It's an SSH client for Python.
You could do something like:
client = SSHClient()
client.load_system_host_keys()
client.connect(...)
stdin, stdout, stderr = client.exec_command('cat /home/pi/testfile.jpg')
with open('testfile.jpg', 'wb') as f:
    # the jpg is binary, so copy the whole stream rather than iterating lines
    f.write(stdout.read())

Python: Copy a CSV file from one RPi to another [closed]

Closed 5 years ago.
I have two Raspberry Pi 3 boards running Ubuntu MATE.
On each RPi there is a CSV file to be read or copied (depending on whether it is the master or not).
A Python script runs on RPi #1 and needs to copy the file from RPi #2 (and then read both as local files).
How can this be done using SSH?
Too many options. But in general I'd shell out unless you have a good reason not to:
import subprocess
result = subprocess.run(['ssh', 'dude@otherpi', 'cat /var/lol/cats.csv'], stdout=subprocess.PIPE)
result.check_returncode()
cats_csv = result.stdout
We're telling Python to run this command: ssh dude@otherpi "cat /var/lol/cats.csv". An ssh process connects to dude@otherpi and runs cat /var/lol/cats.csv on the remote machine; you can try this by running the same line in your shell. The output of the command is captured by Python; we've configured this with stdout=subprocess.PIPE. The call to check_returncode is just there to abort in case something goes wrong, such as a connection error or a missing file.
Instead of immediately capturing the entire CSV, you could also copy it over and then open it locally, which would allow handling big files. The command would then be ['rsync', 'dude@otherpi:/var/lol/cats.csv', '/tmp/cats.csv']. Use scp if rsync is not available.
Another useful way, since it is not a big file, is copying it to the local RPi:
result = subprocess.run(['scp', 'guy@192.168.2.112:/home/guy/PythonProjects/schedule.csv', '/home/guy/schedule_192.168.2.112.csv'], stdout=subprocess.PIPE)
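The run/capture/check pattern above can be tried without any SSH setup by substituting a local command. In this sketch, `cat` on a throwaway file stands in for the remote side; once the network part works, you would swap the argument list for something like ['ssh', 'user@host', 'cat /remote/path.csv'].

```python
import subprocess
import tempfile

# Create a throwaway CSV standing in for the remote file, so this runs without ssh.
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as tmp:
    tmp.write("name,age\nfelix,3\n")
    path = tmp.name

# Same pattern as the ssh example: run, capture stdout, check the exit status.
result = subprocess.run(["cat", path], stdout=subprocess.PIPE)
result.check_returncode()          # raises CalledProcessError on failure
rows = result.stdout.decode().splitlines()
print(rows)  # ['name,age', 'felix,3']
```

Because `check_returncode` raises on any non-zero exit status, a bad hostname or a missing remote file surfaces as a Python exception instead of silently yielding empty output.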

Remote SSH server accessing local files [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 7 years ago.
Is it possible to access local files via a remote SSH connection (local files of the connecting client, of course, not of other clients)?
To be specific, I'm wondering whether the app I'm making (which is designed to be used over SSH, i.e. the user connects to a remote SSH server and the script, written in Python, is executed automatically) can access the client's local files. I want to implement an upload system where users connected to the SSH server and running the script can upload images from their local computers to other hosting sites (not the SSH server itself, but sites like imgur or pomf; the API is irrelevant). So the remote server would need access to local files in order to send a file to another remote hosting server and return the link.
You're asking whether you can write a program on the server that accesses files on the client when someone runs that program over SSH from the client?
If the only program running on the client is SSH, then no. If that were possible, it would be a security bug in SSH.

Linux: Warn Python Script of Restart [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers.
Closed 9 years ago.
I have a Raspberry Pi which uses a USB wireless adapter and wicd-curses. A Python script that uses a WebSocket runs in the background. When I call sudo reboot, my Python script gets the signal to terminate (SIGTERM) about 20 seconds later. (I don't know why the computer takes 20 seconds to restart anyway; I don't remember it being this way before installing wicd-curses.)
By the time those 20 seconds have passed, wicd-curses has already disconnected from the wireless network, meaning my Python script cannot properly close the WebSocket connection. So the core of my question is this: what Python facilities are available to ensure that my script is notified of the system shutdown earlier than it is now? Is there any sort of event I can listen for? Preferably, I want to be able to run the script on demand (python myscript.py) without the use of a daemon or service or whatever it might be called in the Linux world. Thank you.
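For the "event to listen for" part: the notification Python gives you is exactly the SIGTERM you already receive, and `signal.signal` lets you run cleanup the moment it arrives. This does not make the system deliver the signal any earlier, but it is the hook your cleanup should hang from. A minimal sketch, with `close_websocket` as a hypothetical stand-in for your real WebSocket shutdown; the script sends SIGTERM to itself so the example is self-contained, where a real reboot would have init deliver it.

```python
import os
import signal

shutting_down = False

def close_websocket():
    # Hypothetical stand-in for the real cleanup (sending a WebSocket close frame).
    global shutting_down
    shutting_down = True

def on_sigterm(signum, frame):
    # Runs as soon as SIGTERM arrives -- clean up while the network is still up.
    close_websocket()

signal.signal(signal.SIGTERM, on_sigterm)

# Simulate the shutdown notice; on a real reboot, init sends this signal.
os.kill(os.getpid(), signal.SIGTERM)
print(shutting_down)  # True
```

If the signal still arrives after the network is down, the handler can't help; the ordering problem then has to be solved on the system side, by making the shutdown sequence stop your script before it tears down the network.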
