I have a Python web crawler that gets info and puts it in SQL. I also have a PHP page that reads this info from the SQL database and presents it. The problem is: for the crawler to work, my computer has to stay on 24/7. I only have a simple home computer, so that's a problem. Is there a different way to run the web crawler, or do I have to run it on my PC?
If you are monitoring pages that get updated constantly, then yes, you will need to run it on some computer that is on 24/7. You can either use a cron job or a continuous loop (with some monitoring); a sketch of the loop approach is below. If you don't want to run it on your home machine, I would recommend getting a free AWS account. You can get an EC2 micro instance, which will allow you to run your Python script, and an S3 bucket (or an RDS database instance) to store the information.
Here is a link to AWS
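For example, a bare-bones version of the loop approach could look like this (crawl_once() is just a placeholder for your existing crawler entry point, and the 15-minute interval is arbitrary):

import time
import traceback

def crawl_once():
    # call your existing crawler code here
    pass

while True:
    try:
        crawl_once()
    except Exception:
        traceback.print_exc()   # basic monitoring: log failures and keep going
    time.sleep(15 * 60)         # wait 15 minutes between crawls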
Terraform manages all my virtual machines on GCP.
For a subset of them, I would like to add a label that marks which machines are allowed to be shut down after 18:00 by a Python script.
Therefore, I need a Python script that stops the virtual machines carrying a specific label, a bucket to store the script, a Cloud Function to run the script, and a scheduler to trigger it at a specific time.
Alternatively, GitHub Actions could be used to do the same.
I posted an answer to this on the Google Community forums a while back, so here it is:
Below is the Python code; you will have to turn it into a Cloud Function and then use Cloud Scheduler to run it on a schedule. You can use something similar to start your instances as well.
You will need google-api-python-client==2.31.0
import json
import googleapiclient.discovery

compute = googleapiclient.discovery.build('compute', 'v1')
result = compute.instances().list(project='mygcpproject-123', zone='northamerica-northeast2-a').execute()
# print(json.dumps(result, indent=2))
# print(len(result))  # returns: 4 (id, items, selfLink, kind)

vms_list = result['items']

# Loop through your instances and find the ones with 'save_money' in the output
# (which is your label's key name -- it doesn't matter what the value is).
for vm in vms_list:
    if "save_money" in json.dumps(vm):
        print("stopping {}".format(vm['name']))
        compute.instances().stop(project='mygcpproject-123', zone='northamerica-northeast2-a', instance=vm['name']).execute()
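If it helps, here is a rough sketch (an assumption on my part, not part of the original answer) of wrapping that logic in an HTTP-triggered Cloud Function entry point; the name stop_labeled_vms is hypothetical, and project/zone are the same placeholders as above:

import json
import googleapiclient.discovery

PROJECT = 'mygcpproject-123'            # placeholder project id
ZONE = 'northamerica-northeast2-a'      # placeholder zone

def stop_labeled_vms(request):
    # HTTP-triggered entry point: list instances and stop the labeled ones
    compute = googleapiclient.discovery.build('compute', 'v1')
    result = compute.instances().list(project=PROJECT, zone=ZONE).execute()
    stopped = []
    for vm in result.get('items', []):
        if 'save_money' in json.dumps(vm):
            compute.instances().stop(project=PROJECT, zone=ZONE, instance=vm['name']).execute()
            stopped.append(vm['name'])
    return 'Stopped: {}'.format(', '.join(stopped) or 'nothing')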
I wrote a Python program that takes a photo as input and finds faces in it (face detection). Now I want to deploy it to a cloud server.
My program should call the Python code on the server and get a value back.
Take your Python program and expose it as an endpoint on a Flask or Django server.
Or, if you don't want to host your own server, put your program into, for example, an AWS Lambda and expose it through API Gateway (https://docs.aws.amazon.com/apigateway/latest/developerguide/getting-started-with-lambda-integration.html).
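As a rough illustration (not a definitive implementation), the Flask version could look something like this; detect_faces() is a hypothetical placeholder for your existing face-detection code:

from flask import Flask, request, jsonify

app = Flask(__name__)

def detect_faces(image_bytes):
    # plug your existing face-detection logic in here
    return {"faces_found": 0}

@app.route('/detect', methods=['POST'])
def detect():
    # accept an uploaded photo and return the detection result as JSON
    image_bytes = request.files['photo'].read()
    return jsonify(detect_faces(image_bytes))

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)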
Is there any way, through code, to prevent your app from idling? I'm thinking of something like what is being done here, but in Python/Django, since that is what my app is written in, i.e. what is the Python equivalent of Node's setInterval? What are the disadvantages of using a pinging service over doing it through code (other than that your app won't get pinged if the pinging site goes down)? Can you recommend a good pinging service? Thanks in advance.
You cannot do it from within the application. It needs to be woken up "externally". This can be just another Heroku application, but that kind of defeats the purpose. There are some Heroku add-ons that let you monitor your website and thus also ping it at regular intervals.
For example:
https://elements.heroku.com/addons/newrelic
With this your website would always stay awake.
The article you linked is, in my opinion, more complicated than necessary. If you own a VPS, you could host your website there. You could also write a curl command, save it as a shell script, and call it from cron (https://help.ubuntu.com/community/CronHowto).
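For example, a small Python ping script called from cron might look like this (the URL is a placeholder for your Heroku app, and the crontab line is only an illustration):

# ping_site.py -- request the site so the dyno doesn't idle.
# Example crontab entry (every 10 minutes):
#   */10 * * * * /usr/bin/python3 /home/you/ping_site.py
import urllib.request

URL = 'https://your-app.herokuapp.com/'   # placeholder URL

try:
    with urllib.request.urlopen(URL, timeout=30) as response:
        print('pinged {} -> {}'.format(URL, response.status))
except Exception as exc:
    print('ping failed: {}'.format(exc))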
I have an Ubuntu Server machine running a Minecraft server to play with my friends, but I'm bored and I want to make a simple Python console app to display the server status: the current number of connected players, difficulty, world seed, RAM used, and so on. My question is whether it's possible to get this data from the Minecraft server application, and if so, how. I'm mainly interested in live player data (count, names, probably position), the chat feed, etc. I haven't tried anything yet because I have no idea where to start :D
thanks
-N
Maybe you were looking for something like this:
https://github.com/Dinnerbone/mcstatus
Edit:
The module is named "mcstatus" and describes itself as "A Python class for checking the status of an enabled Minecraft server".
You can install it via pip: pip install mcstatus
It provides three modes of access (query, status, and ping), the differences between which are described on the linked page.
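As a quick sketch of the status and query modes (based on the older MinecraftServer class; newer mcstatus releases renamed it JavaServer, and the address below is a placeholder):

from mcstatus import MinecraftServer

server = MinecraftServer.lookup("your.server.address:25565")

# status: basic info most servers expose (player count, version, latency)
status = server.status()
print("{} players online, latency {} ms".format(status.players.online, status.latency))

# query: more detail such as player names, but requires enable-query=true
# in the server's server.properties
query = server.query()
print("players: " + ", ".join(query.players.names))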
Is there a way to run my Python script across several computers communicating through my web server? I'm willing to do a lot more research if someone can point me in the right direction, but I can't seem to find any useful info on this.
A simple script can't be automatically distributed; you need to break it into components that can run independently when given a part of the problem. These components receive commands from a library like PyMPI, or pull them from a queuing system such as Amazon SQS (http://aws.amazon.com/sqs/).
This also means you can't rely on shared local memory. Any data that needs to be exchanged must be passed as part of the command, stored on a shared file system, or placed in a database such as AWS DynamoDB, Redis, etc.
There are a large number of links to more resources available at https://wiki.python.org/moin/ParallelProcessing under the Cluster Computing heading.
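As a rough sketch of the queue-based pattern using Amazon SQS via boto3 (the queue URL and process_task() are placeholders for your own setup, not something from your script):

import boto3

QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/123456789012/my-task-queue'  # placeholder
sqs = boto3.client('sqs')

def process_task(body):
    # run one independent piece of the original script here
    print('processing: ' + body)

# each participating computer runs a copy of this worker loop
while True:
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=20)
    for msg in resp.get('Messages', []):
        process_task(msg['Body'])
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg['ReceiptHandle'])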