I'm trying to upload to my server static files (photos) from a previous version that have since been overwritten (same names, different images). I think the upload skips these files; how can I make it go over them?
Increase the deployment verbosity using the --verbosity option for gcloud app deploy and you'll see exactly which files are skipped and which aren't (you may need to modify them again since you already attempted to deploy the most recent versions).
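For example, at the most detailed level (--verbosity is a standard gcloud global flag):

    gcloud app deploy --verbosity=debug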
But it could very well be a caching issue; see App Engine: Static Files Not Updating on Deploy.
EDIT: I found the issue: it was caused by .pyc files. For some reason a compiler had created .pyc files which contained old code. When I uploaded the files to the server, the server wouldn't compile the .py files; instead it ran the stale .pyc files uploaded from my computer. I deleted all .pyc files, deployed again, and now the server runs the fresh code. /EDIT
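For reference, on a Unix-like shell the stale bytecode can be cleared before deploying with a one-liner such as:

    find . -name "*.pyc" -delete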
When I run gcloud app deploy I quite often deploy a wrong version of my app. My app runs on the GAE standard environment and is written in Python 2.7.
I can easily reproduce the problem by having one of my URLs return a hard-coded string, for example "test1". Now when I change this between deployments, the endpoint quite often returns the string from an earlier deployment.
When running the app on the local server the changed return string is correct, but after deploying, the returned string can be from an earlier version.
I have to deploy my app to both test and production environments and I am worried about deploying wrong code. When deploying, the gcloud console correctly shows that only 2 files are being uploaded (if I have only edited the static return string).
I have tried killing all other versions from the App Engine console.
I have also tried using the --stop-previous-version flag.
I have also tried adding new endpoints, and even after gcloud says the deployment was successful these endpoints are still inaccessible.
How can I make sure my current code gets deployed correctly?
1) Make sure you change the version number in app.yaml
2) Go to
https://console.cloud.google.com/appengine/versions?project={your project id}
to tell GCP which version to serve, and which to stop.
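If you prefer the command line, the same can be checked and changed with gcloud; the version name "2" below is just an example:

    # list deployed versions and see which ones are serving traffic
    gcloud app versions list
    # route all traffic to version "2" of the default service
    gcloud app services set-traffic default --splits 2=1
    # stop a version that should no longer serve
    gcloud app versions stop 1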
Your note regarding the .pyc files suggests that you may have customized your app.yaml's skip_files section and accidentally wiped its default values in the process, which would normally prevent deploying .pyc (and other potentially interfering) files to GAE. From that doc:
The skip_files element has the following default:
    skip_files:
    - ^(.*/)?#.*#$
    - ^(.*/)?.*~$
    - ^(.*/)?.*\.py[co]$
    - ^(.*/)?.*/RCS/.*$
    - ^(.*/)?\..*$
The default pattern excludes Emacs backup files with names of the form #...# and ...~, .pyc and .pyo files, files in an RCS revision control directory, and Unix hidden files with names beginning with a dot (.).
To extend the above regular expression list, copy and paste the above list into your app.yaml and add your own regular expressions.
So, to no longer have to worry about manually cleaning the .pyc files, make sure you still have the above patterns in your skip_files section, in particular - ^(.*/)?.*\.py[co]$, which is the one responsible for the .pyc (and .pyo) files.
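For example, an extended section that keeps all the defaults and appends one custom pattern (the .bak pattern below is only an illustration) could look like:

    skip_files:
    - ^(.*/)?#.*#$
    - ^(.*/)?.*~$
    - ^(.*/)?.*\.py[co]$
    - ^(.*/)?.*/RCS/.*$
    - ^(.*/)?\..*$
    - ^(.*/)?.*\.bak$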
I am starting to use modules in my Python Google App Engine app.
I managed to test my configuration locally (on the dev server) and everything is working fine.
I want to test my changes on a side version online, and couldn't find a place that states whether the dispatch configuration will affect only my side version or also my main serving version (which would be dangerous).
I know that cron.yaml is not a version-specific file; what about dispatch.yaml?
Is it safe to deploy a side version with a dispatch file?
Thanks
From Configuration Files:
Optional application-level configuration files (dispatch.yaml, cron.yaml, index.yaml, and queue.yaml) are included in the top level app directory.
So no, you can't test a dispatch.yaml file change without affecting all versions of all your app's services/modules since it's an app-level configuration.
To be able to test app-level config file changes I'm using an entirely separate application as a staging environment.
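For reference, a minimal dispatch.yaml from the modules era looks something like this (the module name below is hypothetical):

    dispatch:
    - url: "*/api/*"
      module: api-module

It is deployed app-wide on its own, e.g. with appcfg.py update_dispatch ., which is another sign that it can't be scoped to a single version.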
I have an application whose entry point is defined in Emily.py.
Emily.py imports EmilyBlogModel.py and EmilyBlogModel.py imports EmilyTreeNode.py.
When using appcfg.py to upload this app to Google App Engine, how do I make sure that all my files are uploaded?
The update process doesn't really do any analysis of the code, it just uploads everything it finds under the root folder (where your app.yaml lives), as noted in the docs:
The update action creates or updates the app version named in the app.yaml file at the top level of the directory. It follows symlinks and recursively uploads all files to the server.
The only exception is skipped files, which you can customize to achieve the opposite.
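So as long as Emily.py, EmilyBlogModel.py and EmilyTreeNode.py all sit under the directory containing app.yaml, a plain update picks them all up:

    # run from the directory that contains app.yaml
    appcfg.py update .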
Say I've got a Django project in a git repo with static files hosted on Amazon S3. Pretty common setup. Now there's me and another person working on this project; one is working on the JavaScript part and I'm working on the Python part. Is there a sensible way to take changes the other person makes to the static files hosted on Amazon S3 and bring them back into the Django git project?
Django has a sensible one-way flow for static files: collectstatic grabs all of them, and if you use django-storages it pushes them out to Amazon S3 for you. But what about the reverse? How do you pull the changed versions of the static files back into the /app/static/ folders in Django to keep everything synchronized? I'm not aware of any Django functionality for this. Multiple git repos? Something else?
BTW I am using Heroku, if that helps any.
Any suggestions for how to handle this nicely, or am I doing it all wrong?
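A rough sketch of the reverse direction, assuming the AWS CLI is installed and using a placeholder bucket name, would be to sync the bucket back into the working tree and commit the result:

    # pull changed/new files from S3 back into the repo's static folder
    aws s3 sync s3://my-bucket/static/ app/static/
    git add app/static
    git commit -m "Pull static file changes back from S3"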
I am thinking of designing my own web app to serve static files. I just don't want to use Amazon services.
So, can anyone tell me how to start the project? I am thinking of developing it in Python - Django on OpenShift (Red Hat's).
This is how the ideas are going through my mind:
A dashboard that helps me add/delete/manage static files
Set up an API kind of thing (endpoint: JSON objects) so that I can use this project to serve my web apps!
As OpenShift uses Apache, I am thinking of dynamically editing .htaccess to serve the files, but I'm not sure whether that would be possible
Or I could use Django's urls.py to serve the files, but I don't think Django is actually made for that.
Any ideas and suggestions?
How about this:
Use nginx to serve static files
Keep the files in some kind of predefined directory structure, and build a Django app as the dashboard with the filesystem as the backend. That is, moving, adding or deleting files from the dashboard changes them on the filesystem, and nginx doesn't have to be aware of the dashboard at all.
Do not use dynamic routing. Just lay out and maintain the proper directory structure using the dashboard.
Optionally, keep the directory structure and file metadata in some database server for faster searches and manipulation.
This should result in a very low overhead static file server.
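A minimal sketch of the nginx side, assuming the dashboard maintains the files under /srv/static (domain and paths are placeholders):

    server {
        listen 80;
        server_name static.example.com;

        location / {
            # serve files straight from disk; the dashboard app is not involved
            root /srv/static;
            try_files $uri =404;
            expires 7d;
        }
    }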