I have a Python script that uses a Python API to fetch data from a data provider, then, after manipulating the data, writes some of it to Google Sheets (via the Google Sheets Python API). So my workflow to update the data is to open the Python file in VS Code, run it from there, and then switch to the spreadsheet to see the updated data.
Is there a way to call this Python script from Google Sheets using Google Apps Script? If so, that would be more efficient; I could link the GAS script to a macro button on the spreadsheet.
Apps Script runs in the cloud on Google's servers rather than on your computer, and has no access to local resources such as Python scripts on your system.
To call a resource in the cloud, use the URL Fetch Service.
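In practice that usually means wrapping the Python script in a small HTTP endpoint that UrlFetchApp can reach, either hosted on a small cloud VM or exposed from the local machine through a tunnel such as ngrok. A minimal sketch, assuming the existing logic lives in a run_update() function and that port 8080 is reachable from the internet (both are illustrative, not part of the original setup):

```python
# Minimal sketch (assumptions: the existing fetch -> transform -> write-to-Sheets
# logic lives in run_update(), and port 8080 is reachable from Apps Script,
# e.g. via a cloud VM or an ngrok tunnel).
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_update():
    # placeholder for the existing Python script's main logic
    pass

class UpdateHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        run_update()
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"update finished\n")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), UpdateHandler).serve_forever()
```

The macro button's Apps Script function would then be little more than a call to UrlFetchApp.fetch on that URL with {method: 'post'}.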
Currently, I am using the following hack.
The Apps Script behind the Google Sheet's command button writes parameters to a sheet called Parameters. A Python function on my local machine checks that Parameters sheet in the Google workbook every 5 seconds. If there are no parameters, it exits; if there are parameters, it executes the main code.
When the code is deployed on the service account, the polling portion remains inactive, and the Apps Script calls the Python code on the service account directly.
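For context, the local polling side is roughly the following (sketched here with the gspread library; the question does not say which client is actually used, and the workbook name, cell layout, and run_main_code() helper are illustrative):

```python
# Rough sketch of the polling loop (assumptions: gspread is authorised with a
# service-account JSON file, the workbook is named "MyWorkbook", and the first
# row of the "Parameters" sheet holds the parameters written by Apps Script).
import time
import gspread

gc = gspread.service_account(filename="credentials.json")
params_sheet = gc.open("MyWorkbook").worksheet("Parameters")

def run_main_code(params):
    # placeholder for the real work
    print("running with", params)

while True:
    params = params_sheet.row_values(1)
    if params:                       # parameters present -> execute and clear them
        run_main_code(params)
        params_sheet.delete_rows(1)
    time.sleep(5)                    # poll every 5 seconds
```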
There are many reasons why I need to call a Python function on the LOCAL machine from the Google Sheet. One reason is that debugging is better on the local machine and cumbersome on the service account. Another is that certain files have to stay on local machines and we do not want to move them to the workspace, yet the Google Sheet needs data from these files.
This is a hack, and I am looking for a better approach than this "Python code keeps polling" method.
Related
I'm using Selenium in order to extract some data (as a JSON file). This JSON is the final output of the script.
I've managed to do it locally so far in two different ways:
With a local webdriver (for Chrome).
With a Docker container.
However, I need it to be accessible from anywhere, on systems that have neither a webdriver nor Docker installed.
I have thought about deploying the script to Heroku and working from that idea, but I have no idea how to handle the data in this situation.
I think that cloud services are meant for these situations.
A storage account (S3 on Amazon or Blob Storage on Azure) lets you access the data from anywhere, with almost no limit on space, using its API or the providers' SDKs.
You can also specify access policies if your data should not be publicly accessible.
Since you have already packaged your script as a Docker container, you are ready to run it on almost any cloud provider (for example, by pushing the image to Amazon ECR and running it on ECS or Fargate).
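As a rough illustration of the storage idea (boto3 with a hypothetical bucket named my-scraper-output; not the actual project code), the script could push its JSON result to S3 like this:

```python
# Sketch only: assumes boto3 credentials are configured (env vars / IAM role)
# and that "my-scraper-output" is a bucket you own.
import json
import boto3

s3 = boto3.client("s3")

def save_result(result: dict, key: str = "latest.json") -> None:
    """Upload the scraped result so it can be fetched from anywhere."""
    s3.put_object(
        Bucket="my-scraper-output",
        Key=key,
        Body=json.dumps(result).encode("utf-8"),
        ContentType="application/json",
    )

save_result({"example": "data"})
```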
I currently have a Python project that reads data from an Excel file, transforms and formats it, performs intensive calculations on the formatted data, and generates an output. This output is written back to the same Excel file.
The script is run as a PyInstaller EXE, which bundles all the required libraries together with the code itself, so users are not required to prepare an environment to run the script.
Both the script EXE and the Excel file sit on the user's machine.
I need some suggestions on how this entire workflow could be achieved using AWS, i.e. which AWS services would be required.
Any inputs would be appreciated.
One option would be to use S3 to store the input and output files. You could create a Lambda function (or functions) that does the computing work and writes the result back to S3.
You would need to include the Python dependencies in your deployment zip that you push to AWS Lambda or create a Lambda layer that has the dependencies.
You could build triggers to run on things like S3 events (a file being added to S3 triggers the Lambda), on a schedule (EventBridge rule invokes the Lambda according to a specific schedule), or on demand using an API (such as an API Gateway that users can invoke via a web browser or HTTP request). It just depends on your need.
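As a rough sketch of that pattern (the bucket layout, /tmp path, and process_workbook() helper below are placeholders rather than a finished design), an S3-triggered handler could look like this:

```python
# Sketch of an S3-triggered Lambda handler (assumptions: the Excel libraries and
# the process_workbook() logic are packaged in the deployment zip or a layer,
# and results are written back to the same bucket under output/).
import boto3

s3 = boto3.client("s3")

def process_workbook(local_path: str) -> str:
    """Placeholder for the existing transform-and-calculate logic."""
    return local_path  # pretend the file is updated in place

def handler(event, context):
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    local_path = "/tmp/input.xlsx"            # Lambda's writable scratch space
    s3.download_file(bucket, key, local_path)

    output_path = process_workbook(local_path)
    s3.upload_file(output_path, bucket, f"output/{key}")
    return {"status": "done", "output_key": f"output/{key}"}
```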
As the title suggests, I'm looking for a way to run local Python scripts from the Google Apps Script environment. Right now my method is to write a JSON file from Apps Script and save it to a Google Drive directory. The local machine polls the directory via Google Drive File Stream, and runs the necessary code when the file appears in the directory. Not pretty, but it gets the job done. The main concern with this is that File Stream can be fairly latent: it could be up to 15 minutes after a file is put in the directory before File Stream catches it and syncs the local machine.
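For reference, the local side of this scheme is roughly the following (the synced folder path, poll interval, and handle_job() function are illustrative, not the exact code):

```python
# Rough sketch of the local watcher (assumptions: Drive File Stream syncs the
# trigger files into WATCH_DIR, and handle_job() stands in for the real work).
import json
import time
from pathlib import Path

WATCH_DIR = Path("G:/My Drive/gas-jobs")    # illustrative File Stream path

def handle_job(payload: dict) -> None:
    print("running job with", payload)      # placeholder for the real code

while True:
    for job_file in WATCH_DIR.glob("*.json"):
        payload = json.loads(job_file.read_text())
        handle_job(payload)
        job_file.unlink()                    # remove so the job is not re-run
    time.sleep(30)                           # local polling is cheap; the Drive sync is the slow part
```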
Is anybody aware of other options? I could try rewriting the Python code in Apps Script, though the Python code relies on a few third-party libraries that aren't available in JS. Another idea is setting up a Flask server, but that seems like a good bit of work for the return.
I am writing a Python application which reads/parses a file of this kind.
myalerts.ini:
value1=1
value2=3
value3=10
value4=15
Currently I store this file in the local filesystem. If I need to change this file, I need physical access to that computer.
I want to move this file to the cloud so that I can change it from anywhere (another computer, or from my phone).
If this application is running on some machine, I should be able to change the file in the cloud, and the application running on another machine that I don't have physical access to will be able to read the updated file.
Notes:
I am new to both Python and AWS.
I am currently running it on my local Mac/Linux machine and planning to deploy it on AWS.
There are many options!
Amazon S3: This is the simplest option. Each computer could download the file at regular intervals or just before it runs a process. If the file is big, the app could first check whether the file has changed before downloading it (see the sketch at the end of this answer).
Amazon Elastic File System (EFS): If your applications are running on multiple Amazon EC2 instances, EFS provides a shared file system that can be mounted on each instance.
Amazon DynamoDB: A NoSQL database instead of a file. Much faster than parsing a file, but less convenient for updating values: you'd need to write a program to update them, e.g. from the command line.
AWS Systems Manager Parameter Store: A managed service for storing parameters. Applications (anywhere on the Internet) can request and update parameters. A great way to configure cloud-based applications!
If you are looking for minimal change and you want it accessible from anywhere on the Internet, Amazon S3 is the easiest choice.
Whichever way you go, you'll use the boto3 AWS SDK for Python.
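For example, a minimal boto3 sketch of the S3 option recommended above (the bucket name my-config-bucket is an assumption, and the ETag check is one way to skip unchanged downloads):

```python
# Sketch only: assumes boto3 credentials are configured and "my-config-bucket"
# holds myalerts.ini. The ETag check avoids re-downloading an unchanged file.
import configparser
import boto3

s3 = boto3.client("s3")
_last_etag = None

def load_alerts(bucket="my-config-bucket", key="myalerts.ini"):
    """Fetch and parse the ini file, skipping the download if it hasn't changed."""
    global _last_etag
    head = s3.head_object(Bucket=bucket, Key=key)
    if head["ETag"] == _last_etag:
        return None                           # unchanged since last load
    _last_etag = head["ETag"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    parser = configparser.ConfigParser()
    parser.read_string("[alerts]\n" + body)   # the file itself has no section header
    return dict(parser["alerts"])

print(load_alerts())    # e.g. {'value1': '1', 'value2': '3', ...}
```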
I have a Python script (on my local machine) that queries a Postgres database and updates a Google Sheet via the Sheets API. I want the Python script to run when the sheet is opened. I am aware of Google Apps Script, but not quite sure how I can use it to achieve what I want.
Thanks
Google Apps Script runs on the server side, so it can't be used to run a local script.
You will need several changes. First, you need to move the script to the cloud (see Google Compute Engine) and be able to access your databases from there.
Then, from Apps Script, look at the onOpen trigger. From there you can use UrlFetchApp to call your Python server to start the work.
You could also add a custom "Refresh" menu to the sheet that calls your server, which is nicer than having to reload the sheet.
Note that onOpen runs server-side at Google, so it is impossible for it to access files on your local machine.
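To make that concrete, here is a rough sketch of what the cloud-hosted Python side could look like (Flask, psycopg2, and gspread are assumptions, and the query, credentials, and sheet name are placeholders); the onOpen trigger or the custom menu item would then simply call UrlFetchApp.fetch on the /refresh URL:

```python
# Sketch only: Flask wraps the existing Postgres -> Sheets logic so that the
# Apps Script onOpen trigger can hit it with UrlFetchApp.fetch(".../refresh").
# The connection string, credentials file, and sheet name are placeholders.
import psycopg2
import gspread
from flask import Flask

app = Flask(__name__)

def update_sheet():
    conn = psycopg2.connect("dbname=mydb user=me password=secret host=localhost")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT id, value FROM my_table")
        rows = cur.fetchall()

    gc = gspread.service_account(filename="credentials.json")
    ws = gc.open("My Sheet").sheet1
    ws.update(range_name="A1", values=[["id", "value"]] + [list(r) for r in rows])

@app.route("/refresh", methods=["POST"])
def refresh():
    update_sheet()
    return "sheet updated", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```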