Running a python script using several computers (grid/cluster) [closed]

Is there a way to run my Python script using several computers communicating through my web server? I'm willing to do a lot more research if someone can point me in the right direction, but I can't seem to find any useful information on this.

A simple script can't be automatically distributed; you need to break it into components that can run independently when given a part of the problem. These components run based on commands received from a library like PyMPI, or pull them from a queuing system like Amazon SQS (http://aws.amazon.com/sqs/).
This also means you can't rely on having shared local memory. Any data that needs to be exchanged must be passed as part of the command, stored on a shared file system, or placed in a database (AWS DynamoDB, Redis, etc.).
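As a rough sketch of the queue-based approach, here is a minimal worker loop that pulls commands from SQS with boto3; the queue URL and process_command are placeholders for illustration, not part of the question:

import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/work-queue"  # placeholder
sqs = boto3.client("sqs")

def process_command(body):
    ...  # stands in for one independently runnable component of the script

# Every worker machine runs this same loop and competes for messages
while True:
    resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                               MaxNumberOfMessages=1,
                               WaitTimeSeconds=20)  # long polling
    for msg in resp.get("Messages", []):
        process_command(msg["Body"])  # all data the task needs rides in the message
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])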
There are a large number of links to more resources available at https://wiki.python.org/moin/ParallelProcessing under the Cluster Computing heading.

Related

SFTP with python [closed]

Does anybody know of a method to get a remote file from a server using a Python script that will work suitably well for very large files?
I'm currently using Paramiko, and it works quite well; however, I'm concerned that the target use case for this will be .bag files of considerable size, potentially around 10 GB. My limited understanding is that the downloaded file will be stored in RAM rather than on the drive, until I store it onto the drive.
Or am I looking at optimising a problem that doesn't exist?
Is there a way to save the data as it is being downloaded?
I had thought about just using a bash script, which I suppose would work, but there's a lot of additional functionality required; hence my use of Python.
Another option would be to use the os library to simply run sftp.
I'd appreciate anyone's thoughts on this.
Thanks!
It would seem my understanding of Paramiko and SCP/SFTP was flawed.
The default way to do it does not store the file in RAM; I'd misread code that I'd written last year - we're all guilty of that!
A minimal working example:
def DownloadSFTP(self):
    # Paramiko's get() streams the file to disk in chunks; it is not buffered in RAM
    self.sftp.get(self.remotepath, self.localpath)
    self.sftp.close()
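For completeness, here is a hedged sketch of the full connect-and-download flow with Paramiko; the host, credentials, and paths are placeholders, and the progress callback is optional:

import paramiko

def download(host, username, password, remotepath, localpath):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # fine for a sketch; verify host keys in production
    client.connect(host, username=username, password=password)
    sftp = client.open_sftp()
    try:
        # callback receives (bytes_transferred, total_bytes) as the file streams to disk
        sftp.get(remotepath, localpath,
                 callback=lambda done, total: print(f"{done}/{total} bytes"))
    finally:
        sftp.close()
        client.close()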

How to design architecture for scalable algo trading application? [closed]

I was using the Django Python framework for my web app. When I build a strategy in the UI it gets converted into a .py script file, and every 5 minutes (variable, based on my candle interval) the file gets executed. I was using Celery beat to invoke the file execution, and the execution happens on the same machine using Celery.
The problem is scalability: as I add more strategies, my CPU and memory usage climb above 90%. How do I design the server architecture so that it can scale? Thank you.
When one Celery worker is no longer enough, you create more. This is quite easy if you are on a cloud platform where you can spin up more virtual machines.
If you can't create more, then you have to live with the current situation and try to spread the execution of your strategies across a longer period of time (throughout the day, I suppose).
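To make the scaling concrete, here is a minimal sketch of a Celery setup where beat only enqueues work and any number of worker machines consume it; the task name, broker URL, and schedule are assumptions, not the asker's actual code:

from celery import Celery

app = Celery("strategies", broker="redis://localhost:6379/0")  # placeholder broker

# beat only puts tasks on the queue; whichever worker is free picks them up
app.conf.beat_schedule = {
    "strategy-42-every-5-min": {
        "task": "tasks.run_strategy",
        "schedule": 300.0,  # seconds; vary per candle interval
        "args": (42,),
    },
}

@app.task
def run_strategy(strategy_id):
    # execute the generated .py strategy here; the CPU load lands on
    # whichever worker machine dequeued this task
    ...

With this layout you scale by starting extra workers on new machines pointed at the same broker (celery -A tasks worker), while exactly one machine runs the scheduler (celery -A tasks beat).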

How to communicate between two computers in python [closed]

I am attempting to make a Python "buzzer" application which will function like the buzzers in Jeopardy. It will (hopefully) work by linking several computers to a main computer. When a user taps the screen of their computer, if they are the first, it will change the color of their screen and alert the main computer. Now for my question: which module would be best to link these computers together? I would need to send the name of the computer and a timestamp, and the main computer would need to respond. I was reading that something like socket might work, but I am unsure. Also, could you please give me a link to documentation on whatever module you suggest? Thanks!
You mentioned socket in your question.
https://docs.python.org/3/library/socket.html
This might be appropriate for your needs; however, with multiple clients it can get quite complicated.
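As a rough sketch of the socket approach (one round, first buzz wins; the port and the name-plus-timestamp message format are assumptions, and a real game would loop and handle clients in threads):

import socket
import time

PORT = 5000  # placeholder port

def run_host():
    # Main computer: the first client to connect and send wins the round
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", PORT))
        srv.listen()
        conn, addr = srv.accept()  # first buzz in
        with conn:
            name, stamp = conn.recv(1024).decode().split("|")
            print(f"{name} buzzed first at {stamp} from {addr[0]}")
            conn.sendall(b"WINNER")  # reply so the client can flash its screen

def buzz(name, host_ip):
    # Player computer: send name and timestamp, wait for the verdict
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((host_ip, PORT))
        s.sendall(f"{name}|{time.time()}".encode())
        return s.recv(1024) == b"WINNER"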
Also, you may want to try using email for easier connections (if you don't mind a send time of a few seconds). I know it sounds stupid, but it has worked for me in the past, with significantly less difficulty than a multi-threaded socket connection.
https://docs.python.org/3/library/email.html
https://docs.python.org/3/library/smtplib.html
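If you did go the email route, the sending side would look roughly like this; the server, credentials, and addresses are placeholders:

import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "player1@example.com"
msg["To"] = "host@example.com"
msg["Subject"] = "BUZZ"
msg.set_content("player1 1699999999.0")  # computer name + timestamp

with smtplib.SMTP("smtp.example.com", 587) as s:  # placeholder server
    s.starttls()
    s.login("player1@example.com", "app-password")  # placeholder credentials
    s.send_message(msg)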

How to run python web-crawler effectively [closed]

I have a Python web crawler that gets info and puts it in SQL. I also have a PHP page that reads this info from SQL and presents it. The problem is that for the crawler to work, my computer has to stay on 24/7, and I only have a simple home computer, so that's a problem. Is there a different way to run the web crawler, or do I have to run it on my PC?
If you are monitoring pages that get updated constantly, then yes, you will need to run it on some computer that is on 24/7. You can either use a cron job or a continuous loop (with some monitoring). If you don't want to run it on your home machine, I would recommend getting a free AWS account. You can get an EC2 micro instance, which will allow you to run your Python script, and store the information in S3 or a small database instance.
Here is a link to AWS: https://aws.amazon.com/
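As a minimal sketch of the continuous-loop option (crawl_once and the 10-minute interval stand in for the asker's actual crawler):

import logging
import time

logging.basicConfig(level=logging.INFO)

def crawl_once():
    ...  # fetch pages and write the rows to SQL

while True:
    try:
        crawl_once()
    except Exception:
        logging.exception("crawl failed; retrying next cycle")  # basic monitoring
    time.sleep(600)  # wait 10 minutes between runs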

How to externally force a running application to perform basic tasks? [closed]

I need to develop a code that causes a currently open application (running in the background) to perform basic tasks when a certain condition is met.
Let me explain through a random example…
Imagine I am working on a Microsoft Word document and I want it to print exactly every 10 minutes automatically, i.e. without having to physically click the print button. What options do I have to implement something like this? Obviously gaining access to the MS Word source code is an option, but is there an easier way…perhaps using a Python script?
Thanks,
David
Microsoft Office products, as well as Internet Explorer and some other programs, expose a Component Object Model (COM) interface. You can find more detail about COM and how it applies to Python here (including examples). These interfaces often expose every action you could perform manually in the application, but are aimed at automation and programmatic input.
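As a hedged illustration of the COM route for the Word example, here is a small sketch using the pywin32 package (win32com); it assumes Word is installed, and the document path is a placeholder:

import time
import win32com.client  # from the pywin32 package

# Attach to (or start) Word via its COM interface
word = win32com.client.Dispatch("Word.Application")
doc = word.Documents.Open(r"C:\path\to\document.docx")  # placeholder path

while True:
    doc.PrintOut()   # same effect as clicking Print
    time.sleep(600)  # repeat every 10 minutes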
For a more generic application, you could work with sending window messages. Windows provides functions called PostMessage and SendMessage, which have several wrappers in Python (for example in the pywin32 package).
By the way, the MS Word source code is not freely available.
