I need to load some data from a CSV file and populate my local test GAE environment DataStore. I figure I need to use the NDB Client Library.
My question is: how do I direct the operations to my local test environment rather than to my production cloud environment?
Thanks.
The local datastore emulation is done by the development server itself, see Using the local Datastore.
So simply running the script that translates the CSV file into NDB write calls inside the development server will get you what you need.
To run the script inside the server you can make it part of the app itself, or you can execute it (or call its relevant function) from the Interactive Console on the local admin page, which is available at the URL displayed when the devserver starts:
INFO 2017-09-19 13:34:40,772 admin_server.py:116] Starting admin
server at: http://localhost:8000
You can also load and use code from your app itself inside the Interactive Console script, just as if it were part of your app.
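As a minimal sketch of such a script, the CSV half below uses only the standard library; the `Person` model and its columns are hypothetical stand-ins for your own kind, and the ndb import is deferred so the parsing can be exercised outside the dev server. Run inside the dev server (or paste into the Interactive Console) so the writes land in the local datastore:

```python
import csv
import io

def rows_from_csv(csv_text):
    """Parse CSV text into a list of dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def load_into_datastore(csv_text):
    """Write each CSV row as an entity of a hypothetical Person kind.

    The ndb import happens inside the function so the parsing half of
    this module can be used without the App Engine SDK on the path.
    """
    from google.appengine.ext import ndb

    class Person(ndb.Model):  # hypothetical kind matching the CSV columns
        name = ndb.StringProperty()
        age = ndb.IntegerProperty()

    entities = [Person(name=r["name"], age=int(r["age"]))
                for r in rows_from_csv(csv_text)]
    ndb.put_multi(entities)  # one batched call instead of a put() per row
    return len(entities)
```

Because the devserver stubs out the datastore, the same `put_multi` call that would hit Cloud Datastore in production writes to the local emulated store here, with no code changes.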
Related
I have used dev_appserver.py app.yaml --enable_console and was able to use the Interactive Console by going to localhost:8000, or via Web Preview on port 8000 when using the Cloud Shell option.
But when I deployed the application into GAE Standard using the same app.yaml as in development:
runtime: python39
handlers:
- url: /app
script: main.py
- url: /admin/.*
script: google.appengine.ext.admin.application
login: admin
I cannot access either the console or the admin page using the link provided in Cloud Shell by gcloud app browse, with /admin/ and other variations appended to the URL.
Example:
<project-name>.appspot.com/admin
<project-name>.appspot.com/admin/
<project-name>.appspot.com/admin/console
I am greeted with a Not Found page. I have also tried Postman and obtained the same result.
How do I properly access the Interactive Console of a deployed application in Google App Engine?
Am I missing something?
What is it that you are trying to use the Interactive Console for?
If you're just trying to interact with Google Datastore via ndb, you just need to open a Python shell in your terminal at your project root and initialize ndb with your project ID:
$ python
>>> from google.cloud import ndb
>>> ndb_client = ndb.Client(project='my-project-id')
>>> ndb_context = ndb_client.context()
>>> ndb_context.__enter__()
>>> # start interacting with your datastore
If you get a permissions error, it likely means that the Google service account specified in your GOOGLE_APPLICATION_CREDENTIALS environment variable doesn't have access to that project.
The same goes for other actions, such as enqueuing Cloud Tasks via the google-cloud-tasks Python library.
I don't know if what you're doing is possible, but the script you have supplied doesn't seem to exist. This is a link to the google.appengine.ext module, and it doesn't have the module you're referring to.
Also, I'm not sure why you need the Interactive Console in production, given that console.cloud.google.com already gives you access to what the Interactive Console gives you. My understanding is that the Interactive Console is for the dev server only and is basically a simulation of what you get from console.cloud.google.com.
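For what it's worth, the handler form shown in the question is from the python27-era runtime: in second-generation runtimes like python39, a script handler can only be "auto", and all routing happens inside your own WSGI app rather than in app.yaml. A sketch of what a python39 app.yaml can actually contain:

```
runtime: python39

handlers:
# In python39 every "script" value must be "auto"; requests are routed
# by your own WSGI app (by default the "app" object in main.py).
- url: /.*
  script: auto
```

So there is nothing for a /admin/ URL to map to unless your own app defines such a route.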
I am using Python Google API Client by following this tutorial. I was able to run the API locally by downloading the credential JSON and running export GOOGLE_APPLICATION_CREDENTIALS="[PATH]". Now, I would like to deploy my application that uses the API to AppEngine. How can I do so? Do I have to somehow download JSON on my AppEngine machine and run export via app.yaml?
Place the JSON file inside your application directory and it will be deployed to GAE as part of your application. Then you can reference it in your app with just a file path (relative to your app.yaml file). Easiest is to put it right beside your app.yaml file and reference it with just the filename, no file path.
The exact way to reference it may depend on the GAE environment and runtime your app uses (which you didn't mention); exporting GOOGLE_APPLICATION_CREDENTIALS via app.yaml may be the right approach.
You may want to check your app's handlers and ensure that you don't accidentally respond with the JSON file to a suitably crafted external request; you probably don't want that file served to the outside world :)
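One way to wire this up in code, rather than via app.yaml, is to set the environment variable at startup, before any Google client object is created. A minimal sketch; the "service-account.json" filename is a hypothetical stand-in for your own key file:

```python
import os

# On GAE the app starts with its working directory at the app root (where
# app.yaml lives), so a path relative to the current directory resolves to
# the key file deployed beside app.yaml.
# "service-account.json" is a hypothetical filename; substitute your own.
key_path = os.path.join(os.getcwd(), "service-account.json")
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = key_path
```

The Google client libraries read this variable the first time they need credentials, which is why it must be set before the first client is constructed.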
I have a Python script (on my local machine) that queries a Postgres database and updates a Google Sheet via the Sheets API. I want the Python script to run when the sheet is opened. I am aware of Google Apps Script, but not quite sure how I can use it to achieve what I want.
Thanks
Google Apps Script runs on the server side, so it can't be used to run a local script.
You will need several changes. First, you need to move the script to the cloud (see Google Compute Engine) and make sure you can access your database from there.
Then, from Apps Script, look at the onOpen trigger. From there you can use UrlFetchApp to call your Python server to start the work.
You could also add a custom "refresh" menu to the sheet that calls your server, which is nicer than having to reload the sheet.
Note that onOpen runs server-side at Google, so it is impossible for it to access files on your local machine.
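The server half of this setup can be as small as one HTTP endpoint for UrlFetchApp to hit. A minimal stdlib sketch; the /refresh path is a hypothetical choice, and the Postgres query and Sheets update it stands in for are only indicated in comments:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RefreshHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/refresh":
            self.send_response(404)
            self.end_headers()
            return
        # Hypothetical: query Postgres here and push the rows to the
        # sheet via the Sheets API, then report success to Apps Script.
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging quiet

def start_server(port=0):
    """Start the refresh endpoint on a background thread; returns the server."""
    server = HTTPServer(("127.0.0.1", port), RefreshHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

On the Apps Script side, the onOpen trigger (or the custom menu item) would then call something like `UrlFetchApp.fetch("https://your-host/refresh", {method: "post"})`, with the host name being whatever you deploy the server under.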
In Django (1.9) I am trying to load .py files (modules) dynamically (via importlib). The dynamic reload works like a charm, but every time I reload a module, the dev server restarts and has to reload everything else.
I'm pulling in a lot of outside data (xml) for testing purposes, and every time the environment restarts, it has to reload all of this external xml data. I want to be able to reload a module only, and keep that already loaded xml data intact, so that it doesn't have to go through that process every time I change some py-code.
Is there a flag I can set/toggle (or any other method) to keep the server from restarting the whole process for this single module reload?
Any help very appreciated.
If you run the development server with the --noreload parameter, it will not auto-reload on changes:
python manage.py runserver --noreload
Disables the auto-reloader. This means any Python code changes you make while the server is running will not take effect if the particular Python modules have already been loaded into memory.
I'd like to know if anyone has pointers about how to configure AppEngine remote_api, to so that I can debug my code locally but use the remote_api to fetch some data from my server. That way, I can test against real information.
Thanks!
If you want to debug your own script using data from the High Replication Datastore, read Using the Remote API in a Local Client. First you need to enable remote_api in app.yaml and upload the application. Then add this to your script:
from google.appengine.ext.remote_api import remote_api_stub

def auth_func():
    return ('your_username', 'your_password')

remote_api_stub.ConfigureRemoteApi(None, '/_ah/remote_api', auth_func, 'your-app-id.appspot.com')
Now you access data from the High Replication Datastore instead of from the local mockup.
Also, if you want to quickly add test data to the HRD through a console, I recommend PyCharm, which can run scripts with custom parameters. From the PyCharm menu select Run -> Edit Configurations. Create a new configuration and set the following parameters:
Name: Name of the script
Script: Point to your $GAE_SDK_ROOT\remote_api_shell.py
Script parameters: -s your_app_id.appspot.com
Working directory: I recommend setting this. You will probably want to test entities, and to successfully import their class definitions it is best to be in the root directory of your application, so set it to the ROOT of your application.
Now when you run or debug the specified configuration, PyCharm will open a Python console prompting you to connect to GAE with your username and password. You can then use it to manipulate data on Google's servers.
For more information on remote_api read:
Accessing the datastore remotely with remote_api
Remote API for Python
For more information on Pycharm custom configurations, read:
Creating and Editing Run/Debug Configurations
You could download data as described here and use it to populate your local dev app. There's no reason PyCharm needs to be involved.