I have written a Python program that uses a CSV file as its backend. The algorithm involved performs long, deep numerical computations, so I am not able to run it on my local computer. The last time I ran it, it took more than 2 hours but still didn't fully execute. Can anyone tell me whether there is any way of executing the program, with the CSV file as backend, on some server?
Or is there any other way to do it?
Also, I don't want to use Spark for this purpose.
Over the past few weeks I've been coding a program which runs a Reddit bot locally on my machine. I've perfected it so that it does not reply to a comment it has already replied to, it runs quite efficiently, and in my opinion it is complete.
Now I'm looking for a way to get the program to run on a schedule. I currently have the code in Google Colab, and I don't know how to use Google Colab for this.
The program does not require any local storage, it is a single code file, and it does not need much memory, so I wanted to ask if anyone has a resource with a detailed, beginner-friendly tutorial that I could use to host this code.
Note: the code requires an installation of PRAW; in Google Colab I simply run !pip install PRAW. If that changes what I need to do, what should I do differently?
Thank you in advance.
Google Colab is not designed for this kind of thing, and most likely it cannot be used to run your app on a schedule.
Probably the easiest solution is some kind of Continuous Integration tool that lets you run code remotely.
Step 1 would be to host your code in a remote code repository like GitHub. Since it most likely won't have to be interactive, switching from a Colab notebook to a plain Python script will make your configuration much easier later on.
Step 2 would be connecting that repo to a CI tool. One I am familiar with that lets you run pipelines on a schedule is CircleCI, and this tutorial here shows a very simple configuration for running Python scripts from a pipeline.
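A rough sketch of what the converted script could look like is below. The subreddit name, trigger phrase, and already_replied() check are placeholders for your existing bot logic; credentials are assumed to come from environment variables set in the CI tool, and a requirements.txt containing praw replaces the notebook's !pip install PRAW:

    # bot.py -- sketch of the notebook converted to a plain script.
    # Placeholders: SUBREDDIT, TRIGGER_PHRASE and already_replied() stand in
    # for the existing bot logic; credentials come from environment variables.
    import os

    import praw

    reddit = praw.Reddit(
        client_id=os.environ["REDDIT_CLIENT_ID"],
        client_secret=os.environ["REDDIT_CLIENT_SECRET"],
        username=os.environ["REDDIT_USERNAME"],
        password=os.environ["REDDIT_PASSWORD"],
        user_agent="schedule-bot/0.1 (placeholder user agent)",
    )

    SUBREDDIT = "test"          # placeholder
    TRIGGER_PHRASE = "!mybot"   # placeholder


    def already_replied(comment) -> bool:
        """Placeholder for the existing 'never reply twice' check."""
        comment.refresh()
        comment.replies.replace_more(limit=0)
        me = reddit.user.me()
        return any(
            reply.author and str(reply.author).lower() == str(me).lower()
            for reply in comment.replies
        )


    def run_once() -> None:
        # One pass over the newest comments; the CI schedule re-runs this script.
        for comment in reddit.subreddit(SUBREDDIT).comments(limit=100):
            if TRIGGER_PHRASE in comment.body and not already_replied(comment):
                comment.reply("Hello from the bot!")  # placeholder reply


    if __name__ == "__main__":
        run_once()

The pipeline then only needs to run pip install -r requirements.txt followed by python bot.py on whatever schedule you configure.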
So far, when dealing with web-scraping projects, I've used GAppsScript, meaning that I can easily trigger the script to run once a day.
Is there an equivalent service for Python scripts? I have a Raspberry Pi, so I guess I could keep it on 24/7 and use cron jobs to trigger the script daily. But that seems rather wasteful, since I'm talking about a few small scripts that take only a few seconds to run.
Is there any service that allows me to trigger a Python script once a day, without the need to keep a local machine on 24/7? The simpler the solution the better; I wouldn't want to over-engineer such a basic use case if a ready-made system already exists.
The only service I've found so far that does this is WayScript, and here's a Python example running in the cloud. The free tier should be enough for most simple/hobby-tier use cases.
So, I've written a Python program that collects data from an API every minute, stores the data in a CSV file, and then emails it every 24 hours using the SMTP library. I wrote it using PyCharm and it is running on my laptop. But I want to know the easiest way to run it continuously on a cloud server (I have an Azure subscription, but any other way works) so that it keeps collecting data and emailing it to me. I don't have much knowledge about the cloud, so any help is appreciated.
If you are using Azure, I would first deploy your code on Azure serverless (Azure Functions).
Then set up a scheduler to run the code every n hours!
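As a rough sketch, using the Azure Functions Python (v2) programming model with timer triggers; the decorator and schedule strings below follow Microsoft's timer-trigger examples, so double-check them against the current docs, and collect_from_api() / send_report() are placeholders for your existing collection and e-mail code:

    # function_app.py -- sketch of two timer-triggered functions on Azure Functions
    # (Python v2 programming model). collect_from_api() and send_report() are
    # placeholders for the existing code; the NCRONTAB schedules are examples.
    import azure.functions as func

    app = func.FunctionApp()


    def collect_from_api() -> None:
        # Placeholder: fetch from the API and append to storage.
        # (On Azure, blob or table storage is a better fit than a local CSV file,
        # since the function's local files are not durable between runs.)
        pass


    def send_report() -> None:
        # Placeholder: e-mail the accumulated data.
        pass


    @app.schedule(schedule="0 * * * * *", arg_name="timer", run_on_startup=False)
    def collect(timer: func.TimerRequest) -> None:
        # Runs once a minute.
        collect_from_api()


    @app.schedule(schedule="0 0 0 * * *", arg_name="timer2", run_on_startup=False)
    def email_report(timer2: func.TimerRequest) -> None:
        # Runs once a day, at midnight UTC.
        send_report()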
Here's what I want to automate:
get data from an API,
do some cleaning and wrangling in Python,
save as Excel,
load to PowerBI and
produce very simple dashboard.
I've been searching for ages on Microsoft forums and here to find out if there's a way to run PowerBI via a script, like a PowerShell script, for instance. I'm comfortable with writing a script for the first 3 steps, but I have not been able to find a solution for steps 4-5. It has to be loaded into PowerBI because that's what clients are most familiar with. The ideal result would be a script that I can package as an executable file and send to a non-technical person so they can run it at their leisure.
Although I'd prefer a solution in Python, if it's possible in SQL or R I'd also be very happy. I've tried all the in-PowerBI options for using Python scripts, but I have found them limited and difficult to use. I've packaged all my ETL functions into a .py script, imported it into PowerBI, and ran the functions from there, but it's still not really fully automated and wouldn't land well with non-technical folks. Thanks in advance! (I am working on a PC, with PowerBI Desktop.)
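For reference, the first three steps (the part I already have working) look roughly like the sketch below; the API URL, cleaning steps, and output file name are placeholders, and pandas' to_excel() needs the openpyxl package installed:

    # etl.py -- minimal sketch of steps 1-3 (fetch, clean, save as Excel).
    # The API URL and cleaning steps are placeholders.
    import pandas as pd
    import requests

    API_URL = "https://example.com/api/data"  # placeholder endpoint


    def run_etl(output_path: str = "report_data.xlsx") -> None:
        # 1. Get data from the API.
        response = requests.get(API_URL, timeout=30)
        response.raise_for_status()
        df = pd.DataFrame(response.json())

        # 2. Basic cleaning/wrangling (illustrative only).
        df = df.dropna(how="all").drop_duplicates()

        # 3. Save as Excel, which PowerBI then picks up as a data source.
        df.to_excel(output_path, index=False)


    if __name__ == "__main__":
        run_etl()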
I've got the following setup:
A VB.NET Web-Service is running and it needs to regularly call a Python script with a machine learning model to predict some stuff. To do this, my Web-Service generates a file with the input for Python and runs the Python script as a subprocess. The script makes predictions and returns them, via standard output, back to the Web-Service.
The problem is that the script requires a few seconds to import all the machine learning libraries and load the saved model from disk. That is much longer than the actual prediction takes. During this time the Web-Service is blocked by the running subprocess. I have to reduce this time drastically.
What I need is a solution to either:
1. Improve the library and model loading time.
2. Let the Python script communicate with the VB.NET Web-Service and keep Python running all the time, with the imports and ML model already loaded.
Not sure I understood the question, but here are some things I can think of.
If this is a network thing, you should have Python compress the code before sending it over the web.
Or if you're able to, use multithreading while reading the file from the web.
Maybe you should upload some more code so we can help you better.
I've found what I needed.
I've used web.py to convert the Python script into a Web-Service, and now the VB.NET and Python Web-Services can communicate. Python is running all the time, so there is no delay for loading the libraries and data each time a calculation has to be done.
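A rough sketch of that setup, assuming the model is loaded once at startup and predictions are served over a /predict route; load_model() and the JSON request/response format here are placeholders, not the actual service:

    # predict_service.py -- sketch of the web.py approach described above.
    # load_model() and model.predict() are placeholders for the real ML code;
    # the /predict route and JSON format are assumptions.
    import json

    import web

    urls = ("/predict", "Predict")


    def load_model():
        # Placeholder: load the saved ML model from disk (pickle/joblib/etc.).
        # This runs once, when the service starts, so per-request calls stay fast.
        class Dummy:
            def predict(self, rows):
                return [0 for _ in rows]
        return Dummy()


    MODEL = load_model()  # heavy imports and model loading happen only once


    class Predict:
        def POST(self):
            # Expect a JSON list of input rows in the request body.
            rows = json.loads(web.data())
            web.header("Content-Type", "application/json")
            return json.dumps({"predictions": MODEL.predict(rows)})


    if __name__ == "__main__":
        app = web.application(urls, globals())
        app.run()

The VB.NET service can then POST its input as JSON to the running service (web.py listens on port 8080 by default) instead of spawning a new subprocess for every prediction.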