I found a Python package that I want to use within my Angular-based Firebase project (it does some complex analysis of a text file).
What is the best way to use it? I see the following options:
1. Own Docker container with Flask, running on Cloud Run (e.g. like that): pass the file in an AJAX request, return the JSON result.
Downsides: its own endpoint that has to be noted down somewhere in the main project, and its own repository, separate from the other Node.js cloud functions.
2. Call the Python script from a Node.js cloud function (like this).
A little bit hacky, piping file and log output around as strings, and probably not easy to get all the Python dependencies working (would this work at all?).
3. A totally independent microservice that just takes the file, analyses it, and sends back JSON. Maybe as an AWS Lambda?
Again "cut out" of the main project.
I'd love to have a "clean and easy" integration with my existing Node.js cloud functions that I use in Firebase. The Firebase CLI could then take over all the URL endpoint handling etc. But I don't see a way to do this.
A somewhat better encapsulated approach would be to go with 1. or 3. and have a Node.js cloud function call the endpoint. That way the client code would not call the endpoint directly, and I would have better possibilities for configuring it without having to update the client code.
Am I missing an approach? What would be the best way to do this?
Use case: the user uploads a file, and the file plus some other values are saved to their account. The content of the file should be analysed (this can be done asynchronously) and the results should be available for the user to view.
I went with option 1 and it is working quite well so far. It's very easy to deploy the dockerized Flask server on Cloud Run. It's also possible to control access and authentication from the Firebase function to the Cloud Run container with Google's IAM.
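For reference, a minimal sketch of what the Cloud Run service can look like. The /analyze route and the stand-in "analysis" are my own placeholders; the actual Python package goes where the placeholder is.

```python
# Minimal Flask app for Cloud Run: accepts a file upload, returns JSON.
import os

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/analyze", methods=["POST"])
def analyze():
    uploaded = request.files["file"]        # file field from the AJAX request
    text = uploaded.read().decode("utf-8")
    result = {"length": len(text)}          # replace with your package's analysis
    return jsonify(result)

if __name__ == "__main__":
    # Cloud Run provides the port via the PORT env var (defaults to 8080).
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```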
Problem: Habitica is a habit-tracking app, but its personal data logs are not as detailed as I want. I want to create a local log of when I mark off habits/to-dos in the app. Habitica offers webhooks that trigger when habits/to-dos are checked off, which seems perfect for what I want, but how do I turn these triggers into a local log? I would like to use Python for this.
Ideas: It seems to me that I would need to set up some kind of personal cloud server to receive this data, turn it into a log, and then store it for download. I have previously deployed a Flask app using Heroku, so if this could be done similarly, that would be ideal. However, I don't know much about this, so I would welcome any ideas or advice.
Creating the Habitica webhook receiver as a Flask application is a good approach.
Heroku supports Python/Flask very nicely; however, the filesystem is ephemeral, so it gets wiped out at every application restart.
In order to persist data you can look at various options:
save the file to AWS S3
save the data into a DB (Heroku has a free plan for PostgreSQL); see the sketch below
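A rough sketch of what the PostgreSQL option could look like. The habit_log table and the route name are assumptions; Heroku sets DATABASE_URL for you when you add the Postgres add-on.

```python
# Hedged sketch: Flask webhook receiver that persists each Habitica event
# to Heroku Postgres. Table and route names are assumptions.
import json
import os

import psycopg2
from flask import Flask, request

app = Flask(__name__)

@app.route("/habitica-webhook", methods=["POST"])
def habitica_webhook():
    event = request.get_json(force=True)
    conn = psycopg2.connect(os.environ["DATABASE_URL"])  # set by the add-on
    with conn, conn.cursor() as cur:  # commits the transaction on success
        cur.execute("INSERT INTO habit_log (payload) VALUES (%s)",
                    (json.dumps(event),))
    conn.close()
    return "", 204
```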
I am writing an application that uses Google's Python client for GCS.
https://cloud.google.com/storage/docs/reference/libraries#client-libraries-install-python
I've had no issues using this, until I needed to write my functional tests.
The way our organization tests integrations like this is to write a simple stub of the API endpoints I hit, and point the Google client library (in this case) to my stub, instead of needing to hit Google's live endpoints.
I'm using a service account for authentication, and I am able to point the client at my stub when fetching a token, because it gets that value from the service account's JSON key that you get when you create the service account.
What I don't seem able to do is point the client library at my stubbed API instead of making calls directly to Google.
Some workarounds that I've thought of, but don't like, are:
- Allow the tests to hit the live endpoints.
- Put in some configuration that toggles between using the real Google client library and a mocked version of it. I'd rather mock the API than have mock code deployed to production.
Any help with this is greatly appreciated.
I've done some research, and it seems there's nothing supported specifically for Cloud Storage in Python. I found this GitHub issue entry with a related discussion, but for Go.
I think you can open an issue in the public issue tracker asking for this functionality. For now, I'm afraid it's easier to keep using your second workaround.
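In the meantime, a sketch of how the second workaround can stay test-only, so no mock code ships to production: patch the client with unittest.mock at test time. The storage_uploader module and its upload function below are hypothetical stand-ins for your own wrapper code.

```python
# Sketch: patch the GCS client in tests only. storage_uploader is a
# hypothetical module of yours that imports google.cloud.storage.
from unittest import mock

import storage_uploader  # hypothetical module under test

def test_upload_writes_blob():
    with mock.patch("storage_uploader.storage.Client") as mock_client:
        bucket = mock_client.return_value.bucket.return_value
        blob = bucket.blob.return_value

        storage_uploader.upload("my-bucket", "report.txt", b"hello")

        bucket.blob.assert_called_once_with("report.txt")
        blob.upload_from_string.assert_called_once_with(b"hello")
```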
I have a running Python application that needs to receive some data and process it. I also have a PHP server that can get this data. I want to send JSON data from PHP to my Python app.
Is there any way other than running a Python web server and sending data to it, or inserting into a DB and reading it from the DB with Python?
Thanks.
I tried using the Python CherryPy web server.
@Niklas D It would be easier to answer your question if you could give some more context about the application or use case you want to solve.
Some further possibilities are:
Glue code (I have only done this with C++ and Python, never with PHP, but you should be able to find examples on the internet, e.g. https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages#PHP)
Messaging Systems like RabbitMQ, ActiveMQ, ZeroMQ, etc.
Redis (I know you said no database, but Redis provides publish/subscribe features, https://redis.io/commands/pubsub, which let you write to Redis on one side and get the data on the other without polling the DB all the time, which I guess is the issue you have with using a database). It's a bit easier to set up and use than a messaging system; see the sketch after this list.
A TCP connection between the Python and PHP applications. https://medium.com/swlh/lets-write-a-chat-app-in-python-f6783a9ac170
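For the Redis option, the Python side could be as small as this. The channel name and connection details are placeholders; the PHP side would PUBLISH JSON strings to the same channel.

```python
# Hedged sketch of the Redis pub/sub option: Python subscribes to a channel
# that the PHP app publishes JSON messages to.
import json

import redis

r = redis.Redis(host="localhost", port=6379)  # placeholder connection
pubsub = r.pubsub()
pubsub.subscribe("php-events")  # placeholder channel name

for message in pubsub.listen():
    if message["type"] != "message":
        continue  # skip subscribe confirmations
    data = json.loads(message["data"])
    print("received from PHP:", data)
```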
If you want to send data to a Python application using web protocols, i.e. send POST and GET requests etc., then you need to create a Python web app to receive and handle those requests. This in turn needs to run on a web server, or you could build serverless functions to handle it; see https://serverless.com/
If you want to fetch data with a Python application, i.e. have the Python app send POST and GET requests etc. to your PHP app to ask for the JSON payload, you can build it with Python's standard library https://docs.python.org/3/library/urllib.request.html or, better still, use the Requests package http://docs.python-requests.org/en/master/
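For example, polling the PHP app with Requests is just a few lines (the URL is a placeholder):

```python
# Minimal sketch: the Python app pulls the JSON payload from the PHP app.
import requests

response = requests.get("https://example.com/data.php")  # placeholder URL
response.raise_for_status()
payload = response.json()  # parsed JSON from the PHP side
```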
Or you could save the JSON to a file on disk and then open it with your Python app. You'd need to set up scheduling, or make your PHP app execute Python code on the server... This last suggestion is a bad idea; please don't, unless your app is isolated and not publicly accessible, or you know how to lock down your security.
I just started using Firebase + React to build a website. One of the planned features of my website is to crawl and show users data parsed from another website (e.g., stock price changes). I already have a Python crawler responsible for parsing the data, but I have no idea how to execute this Python crawler (in the background) on my server in Firebase (or whether it is even possible).
Here is an example usage of my system:
users log in and subscribe to the websites/data they are interested in
my crawler parses those websites every hour and updates the data in the database
users can see a summary of changes to the websites from the database
One option I have in mind is running the crawler on my local machine and using the REST API to push the parsed data to the Firebase database. However, this seems like a very inefficient/naive approach, because it somewhat defeats the purpose of deploying my server with a cloud service like Firebase.
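For what it's worth, that local-crawler option only needs a few lines against the Realtime Database REST API. The project URL, database path, and auth handling below are placeholders.

```python
# Hedged sketch of the "local crawler + REST" option: push parsed results
# into the Firebase Realtime Database.
import requests

FIREBASE_URL = "https://your-project.firebaseio.com"  # placeholder

def push_result(site_id: str, parsed: dict, token: str) -> None:
    # PATCH merges the new fields under /crawls/<site_id>.
    resp = requests.patch(
        f"{FIREBASE_URL}/crawls/{site_id}.json",
        params={"auth": token},
        json=parsed,
    )
    resp.raise_for_status()
```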
Firebase does not have any service/feature that allows you to periodically run Python or any other code. The closest thing to that is Cloud Functions, which can be triggered through an external service like cron-job.org.
For more in this, see:
Firebase Hosting Flask Application (on running Python on Firebase Hosting)
Using google cloud function to spawn a python script (for an elaborate way in which you might apparently run Python on Cloud Functions, though I have never tried this myself, nor am I likely to)
Cloud Functions for Firebase trigger on time? (for running Cloud Functions periodically either through AppEngine, or cron-job.org).
I have a server which runs flask with python.
Now I want to make an application which can do various tasks like uploading files, updating a Redis database, and various other things.
Of course this could be done with HTML pages, but since the operations could involve lots of files, real-time input of data, and other things, it might be better to make an application and manage the server from there rather than through web pages.
Do you suggest using web pages anyway, or would you make an application for it?
And if I make an application, should I use HTTP or not?
Sorry if this is an uninformed question, but I would like to learn the best methods.
You might want to look into Flask-Script. It allows you to run various commands related to your flask application easily. It also allows you to easily add your own commands to it. This way you will be able to keep your administrative code still within the Flask app, but not necessarily have it accessible via a web page.
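A minimal sketch of a custom Flask-Script command; the task body is a placeholder.

```python
# manage.py - hedged sketch of a Flask-Script admin command.
from flask import Flask
from flask_script import Manager

app = Flask(__name__)
manager = Manager(app)

@manager.command
def upload_files():
    """Placeholder admin task, run from the terminal instead of a web page."""
    print("uploading files...")  # replace with the real task

if __name__ == "__main__":
    manager.run()  # run with: python manage.py upload_files
```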