I run my application locally using dev_appserver.py with the following command:
dev_appserver.py mydir --port=xxxx
Is there a way to delete all the local data generated by the App Engine dev server?
Currently, I go to localhost:8000/datastore and delete entity groups manually.
Is there a way to automate it?
Where does dev_appserver.py write local data to? (A file or directory? Maybe I can delete that.)
Yes there is!
dev_appserver.py accepts some arguments that clear things before it starts up. Two arguments that I use consistently are:
dev_appserver.py --clear_datastore=true --clear_search_indexes=true ...
dev_appserver.py --help can probably give you more information about other things you may want to clear -- but this always gets everything I need it to.
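If you want the cleanup fully automated, you can fold those flags into a tiny launcher script so every local run starts from a clean slate. A minimal sketch in bash; the app directory and port here are placeholders for your own setup:

#!/bin/bash
# start-dev.sh: start the dev server with the local datastore and
# search indexes wiped. "mydir" and the port are assumptions.
dev_appserver.py mydir \
  --port=8080 \
  --clear_datastore=true \
  --clear_search_indexes=true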
Just started using the Google Cloud SDK Shell after using the older, GUI-based version. I have multiple projects under development, if that matters.
Here's what I do
run gcloud SDK shell (click on the icon!)
cd \myproject
dev_appserver.py app.yaml
In the browser (Chrome),
browse to http://localhost:8000/datastore
Under Datastore Viewer, I see 'tables' from a completely different project (say, myotherproject)
Under Datastore Indexes, I see 'indexes' from the correct project (myproject)
Under Task Queues, I see the correct queues listed (I have specified different queues setup for parts of myproject)
Everything works fine for myotherproject. So, is there something I am missing to get the Datastore Viewer to show the correct 'tables'?
Many thanks, David
Edit: no matter what project I run, Datastore Viewer shows the same data (from myotherproject) but Datastore Indexes show the correct indexes.
Edit: Windows 8.1, Python v2.7.13:a06454b1afa1
Edit: further questions: 1) Does the gcloud SDK use a different datastore from the original App Engine SDK? 2) If so, where is it by default, or do I have to define it upfront?
Thanks to everyone for their help with this. It appears gcloud uses one datastore for all projects, so --datastore_path is not really optional when you have multiple projects. However, I kept getting errors with --datastore_path, so I went with the following...
dev_appserver.py --storage_path=c:\gcdata\projectname app.yaml
Yes, it could have been c:\temp, but this gives me separate 'databases', one for each project.
Note also that the GCloud SDK does not use the same data as the original App Engine SDK, grrrrrr!
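If you juggle several projects, a small wrapper that derives the storage path from the current directory name keeps the databases separated without remembering the flag each time. A rough sketch for a *nix shell (the ~/gcdata base directory and the directory-name convention are assumptions; on Windows the equivalent would be a path like c:\gcdata):

#!/bin/bash
# dev-server.sh: run the dev server with a per-project storage path,
# assuming the current directory is named after the project.
PROJECT_NAME=$(basename "$PWD")
dev_appserver.py --storage_path="$HOME/gcdata/$PROJECT_NAME" app.yaml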
I work on multiple App Engine projects in any given week, i.e. assume multiple clients. Earlier I could set application in app.yaml, so whenever I did appcfg.py update ... it would ensure deployment to the right project.
When deploying with gcloud, the application variable throws an error; I had to use
gcloud app deploy --project [YOUR_PROJECT_ID]. So what used to be a directory-level setting for a project now goes into our build tooling, and missing that simple detail can push project code to the wrong customer.
i.e. if I did gcloud config set project proj1 and then somehow did a gcloud app deploy from proj2's directory, it would deploy to proj1. Production deployments are done after detailed verification on the build tools, so it is less of an issue there, because we still use the --project flag.
But it's hard to do similar stuff in the development environment; dev_appserver.py doesn't have a --project flag.
When starting dev_appserver.py, I have to do gcloud config set project <project-id> before I start the server. This is important when I'm using things like Pub/Sub or GCS (with dev topics or dev buckets).
Unfortunately, missing a simple configuration step like setting the project ID in a dev environment can result in uploading blobs/messages/etc. into the wrong dev GCS bucket or the wrong dev Pub/Sub topic (when not using emulators). And this has happened quite a few times, especially when starting new projects.
I find the above solutions to be hackish workarounds. Is there a good way to ensure that we do not deploy to, or develop against, the wrong project when working from a certain directory?
TL;DR - Not supported based on the current working directory, but there are workarounds.
Available workarounds
gcloud does not directly let you set up a configuration per working directory. Instead, you could use one of these 3 options to achieve something similar:
Specify --project, --region, --zone or the config of interest per command. This is painful but gets the job done.
Specify a different gcloud configuration directory per command (gcloud uses ~/.config/gcloud on *nix by default):
CLOUDSDK_CONFIG=/path/to/config/dir1 gcloud COMMAND
CLOUDSDK_CONFIG=/path/to/config/dir2 gcloud COMMAND
Create multiple configurations and switch between them as needed.
gcloud config configurations activate config-1 && gcloud COMMAND
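For option 3, creating and switching between named configurations looks roughly like this (configuration and project names are placeholders; note that gcloud config configurations create also activates the new configuration):

# one-time setup: a named configuration per project
gcloud config configurations create proj1-config
gcloud config set project proj1
gcloud config configurations create proj2-config
gcloud config set project proj2
# later: switch before running commands for a given project
gcloud config configurations activate proj1-config
gcloud app deploy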
Shell helpers
As all of the above options are ways to customize on the command line, aliases and/or functions in your favorite shell will also help make things easier.
For example in bash, option 2 can be implemented as follows:
function gcloud_proj1() {
  CLOUDSDK_CONFIG=/path/to/config/dir1 gcloud "$@"
}

function gcloud_proj2() {
  CLOUDSDK_CONFIG=/path/to/config/dir2 gcloud "$@"
}
gcloud_proj1 COMMAND
gcloud_proj2 COMMAND
There's a very nice way I've been using with PyCharm, I suspect you can do so with other IDEs.
You can declare the default env variables for the IDE Terminal, so when you open a new terminal gcloud recognises these env variables and sets the project and account.
No need to switch configurations between projects manually (gcloud config configurations activate ...). Terminals opened in other projects will inherit their own GCP project and config from the env variables.
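For example, gcloud reads any property from a CLOUDSDK_SECTION_PROPERTY environment variable, so per-project terminal defaults can be as simple as the following (project and account values are placeholders; the exact PyCharm settings path may vary by version):

# Set these under Settings > Tools > Terminal > Environment variables in PyCharm
CLOUDSDK_CORE_PROJECT=my-dev-project
CLOUDSDK_CORE_ACCOUNT=me@example.com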
I've had this problem for years and I believe I found a decent compromise.
Create a simple script called contextual-gcloud. Note the \gcloud; the backslash is fundamental for the aliasing that comes later.
🐧$ cat > contextual-gcloud
#!/bin/bash
# Use a local .gcloudconfig/ directory for gcloud config when one exists.
if [ -d .gcloudconfig/ ]; then
  echo "[$0] .gcloudconfig/ directory detected: using that dir for configs instead of default."
  CLOUDSDK_CONFIG=./.gcloudconfig/ \gcloud "$@"
else
  \gcloud "$@"
fi
Add the alias to your .bashrc and reload (or start a new bash):
alias gcloud=contextual-gcloud
That's it! If a directory contains a .gcloudconfig/ folder, the script will use it instead of the default, which means you can keep your configuration under source control, etc. Just remember to git-ignore things like logs and private stuff (keys, certificates, ...).
Note: auto-completion is fixed by the alias ;)
Code: https://github.com/palladius/sakura/blob/master/bin/contextual-gcloud
These are exactly the reasons why I highly dislike gcloud: making command-line arguments mandatory and dropping configuration file support is much too error-prone for my taste.
So far I'm still able to use the GAE SDK instead of Google Cloud SDK (see What is the relationship between Google's App Engine SDK and Cloud SDK?), which could be one option - basically keep doing stuff "the old way". Please note that it's no longer the recommended method.
You can find the still compatible GAE SDKs here.
For whenever the above is no longer an option and I'm forced to switch to the Cloud SDK, my plan is to keep version-controlled cheat-sheet text files in each app directory, containing the exact commands for running the dev server, deploying, etc. for that particular project, which I can just copy-paste into the terminal without fear of making mistakes. You carefully set these up once and then just copy-paste them. As a bonus, you can have different branch versions for different environments (staging/production, for example).
Actually, I'm using this approach even with the GAE SDK, to prevent accidental deployment of app-level config files to the wrong GAE app (such deployments must use command-line arguments to specify the app in multi-service apps).
Or do the same but with environment config files and wrapper scripts instead of cheat-sheet files, if that's your preference.
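A sketch of the wrapper-script variant, assuming a per-project env file named .gcloud.env that defines the target project (all file and variable names here are hypothetical):

#!/bin/bash
# deploy.sh: deploy this directory's app to the project pinned in .gcloud.env,
# which is expected to contain a line like: GCP_PROJECT=proj1
set -e
source "$(dirname "$0")/.gcloud.env"
gcloud app deploy --project "$GCP_PROJECT" "$@"

Because the project ID comes from a file that lives next to the code and is version-controlled with it, running the script from the wrong directory just picks up that directory's own env file, or fails if none exists.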
I have a worker running a Python script every 2 hours on Heroku.
The problem is that each time I pull the changes from git, there are no changes at all to the sqlite3 database.
But I am sure the program is running and the database has changed, judging by the log file:
heroku logs
How do I retrieve the .db file, then?
It sounds like you have a little misconception. Heroku's git support is effectively one-way; you can use it to push new code to be run on the server, but you can't use it to copy files from Heroku back to your local tree.
Unfortunately, it looks like there's no good, easy way to copy a file from your app to your local machine; you can use heroku run console to get a bash shell and then scp a file out, but you're "pushing" it out of Heroku, and thus can only copy to machines with valid (publicly reachable) IP addresses.
If you're really using sqlite for your app's storage, though, you're going to run into a bigger problem. The filesystem for your app on Heroku is ephemeral, in that changes you make can be wiped out at any time. Heroku will delete your app's local storage and start over fresh whenever it wants to.
The right way to do it is to use Heroku's built-in Postgres support and store your application's data there. Not only will it persist, but you'll also be able to access it directly using the Postgres command-line tools.
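For example, with the Heroku Postgres add-on attached, you can open a psql session against the app's database or pull a backup down locally (the app name is a placeholder; the pg:backups commands are from the current Heroku CLI and may be named differently on very old toolbelt versions):

heroku pg:psql --app your-app-name
heroku pg:backups:capture --app your-app-name
heroku pg:backups:download --app your-app-name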
Accessing the heroku console can now be done with:
heroku run bash
Then I downloaded the Linux gdrive application and ran it locally in the folder to upload my file to Google Drive: https://olivermarshall.net/how-to-upload-a-file-to-google-drive-from-the-command-line/ (skip step 4 and run it with a ./ prefix, like this: ./gdrive upload my_file.txt).
The other suggestion of heroku run console did not work for me (running a Python Flask app).
I need a value holding the timestamp of when my app was deployed to the GAE server, available at runtime.
Surely I could generate some Python constant in the deployment script, but is there an easier and more correct way to reach the goal?
(I'd prefer not to use the datastore for that.)
Wrap appcfg.py in a shell script. Before actually running appcfg.py update, save the current time, possibly adjusting for your time zone, if necessary, in a file that's marked as a resource. You can open and read that file from the deployed app.
Alternatively, have that script substitute the current time directly into code, obviating the need for a file open.
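A minimal sketch of such a wrapper, assuming the timestamp lives in a file called deploy_timestamp.txt in the app directory (the file name and layout are placeholders; make sure the file is not excluded by skip_files in app.yaml):

#!/bin/bash
# deploy.sh: record the deployment time, then upload the app.
# The deployed app can open and read deploy_timestamp.txt at runtime.
date -u +"%Y-%m-%dT%H:%M:%SZ" > deploy_timestamp.txt
appcfg.py update .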
I do not believe there is a GAE API that gives you access to the deploy timestamp.
The closest functionality is the CURRENT_VERSION_ID environment variable, but that only gives you access to the version specified in app.yaml, not timestamps.
How can I find where my local development datastore is located? I am using the Python SDK and Linux.
I think it depends on whether you have the Java or Python SDK.
For Python, here's what the instructions from Google say:
"The web server prints the location of the datastore file it is using to the terminal when it starts up. You can make a copy of the file, then restore it later to reset the datastore to a known state. Be sure to restart the web server after replacing the datastore file.
To change the location used for the datastore file, use the --datastore_path option:"
dev_appserver.py --datastore_path=/tmp/myapp_datastore myapp
More info here: http://code.google.com/appengine/docs/python/tools/devserver.html
I'm using Windows 7 with the Python SDK. My local datastore is located at
C:\Users\[username]\AppData\Local\Temp\dev_appserver.datastore
To find the file location for the local AppEngine datastore on MacOSX/Python, you can run the following command:
dev_appserver.py --help
Mine was at something like:
/var/folders/uP/uP1GHkGKGqO7QPq+eGMmb++++TI/-Tmp-/dev_appserver.datastore
I think a lot of the answers on this page are out of date. Under the current Python dev kit (1.8.6) on Windows 7 I eventually found the datastore at:
c:\Users\[username]\AppData\Local\Temp\appengine.[appname]\datastore.db
I couldn't find this info in anything dev_appserver.py printed out, either with normal startup options or with --help. On other OSes you might try searching for a file called datastore.db.
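On *nix systems, a filesystem search along these lines usually turns it up (sudo may be needed to traverse protected directories):

find / -name datastore.db 2>/dev/null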
For Python, you can access the built-in datastore admin interface at the path /_ah/admin
or
add this handler to app.yaml:
- url: /admin/.*
  script: $PYTHON_LIB/google/appengine/ext/admin
  login: admin
and access it at /admin/
I use OS X Mavericks (10.9), Python 2.7.5, and Google App Engine SDK 1.9.3 (Python).
None of the above worked for me; however, following alsmola's answer, I executed sudo find / | grep datastore.db and found the file in /private/var/folders/vw/7w1zhkls4gb1wd8r160c36300000gn/T/appengine.YYYY.XXXXX/datastore.db (YYYY is the project name, XXXXX is my username).
Since this is a top question on Google search and I spent quite an amount of time searching for an answer, I'll note that on the Windows/Java combination the DB file is called local_db.bin.
With Maven the files are sitting here:
target/{buildName}/WEB-INF/appengine-generated/
I'll restate a solution for getting a permanent datastore, as it worked for me (circa Feb 2017), running GoogleAppEngineLauncher on OS X v10.10.
Create the folder path for the permanent datastore.
In GAEL, click on the project in question, e.g. PROJECTNAME.
Click Edit > Application Settings.
In the Extra Flags field, add:
--datastore_path=/Users/foo/GAE_datastore/PROJECTNAME/datastore.db
The filename has to be included; in my config, datastore.db works.
Having searched all over for the GAE datastore path, and having head-bonked on the dev_appserver.py --datastore_path command line, it was very helpful to find this.
Application Settings under the Edit menu is an odd choice, Google :-)
The default location of the datastore for the platform you're running App Engine on is provided in the README that comes with the SDK (at least, in the one for Linux). The README is in google_appengine_x.x.xx/google_appengine/README. This is what it says in the Linux one:
--datastore_path=DS_FILE Path to file to use for storing Datastore file
stub data.
(Default /tmp/dev_appserver.datastore)