Root folder of a Python module - python

I am following a step-by-step tutorial blog on the flask micro framework for python.
I bumped into an issue when they require me to set up a configuration file in the root of the application folder, so it can be easily accessed if needed.
They called it config.py.
My question is: if the local path to my application is /home/test/application1/, should I create the file inside the ./application1/ directory? What confuses me about this somewhat obvious question is that a quick search for other config.py files inside /home/test/application1/ turned up four other files. They were in the following directories:
/home/test/application1/flask/lib/python2.7/site-packages/flask/testsuite/config.py
/home/test/application1/flask/lib/python2.7/site-packages/flask/config.py
/home/test/application1/flask/local/lib/python2.7/site-packages/flask/testsuite/config.py
/home/test/application1/flask/local/lib/python2.7/site-packages/flask/config.py
So should I create a new config.py file in the directory I first mentioned, or should I add some lines to one of the previously created config.py files?
Here is the source of the step-by-step tutorial:
It is at the beginning, right after the "Configuration" heading.

Unlike other frameworks, Flask does not have a lot of rules; in general, you can implement things in the way that makes sense to you.
But note that the other config.py files you found are all in the virtual environment, and are all modules that come with Flask. They have nothing to do with your application's configuration.
I wrote the tutorial you are following. In the tutorial I put config.py outside of the application package. This is what I prefer: I consider the configuration separate from the application. My thinking is that you should be able to run the same application with different configuration files, so that, for example, you can have a production configuration, a testing configuration and a development configuration, all different.
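For illustration, here is a minimal sketch of that layout (the variable names are illustrative, not necessarily the tutorial's exact code):

/home/test/application1/config.py, next to the app package rather than inside it:

CSRF_ENABLED = True
SECRET_KEY = 'you-will-never-guess'

app/__init__.py:

from flask import Flask

app = Flask(__name__)
app.config.from_object('config')  # imports config.py from the project root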
I hope this helps.

Related

Right way to organize Python project structure and distribution

I have a Python client-service project. The logical dependencies look like this:
In words: I have some services (srv1, srv2, ..., srvX) with common code in service.py, some clients for these services (srv1_cl.py, srv2_cl.py, ..., srvX_cl.py) with common code in client.py, and common code for both services and clients in common.py.
Now I have problem with structuring my git repository for development and deploying my project to users.
First I tried storing my clients and services in separate folders, with a folder for each service, because each service can contain a lot of files.
/myrepo.git
    common.py
    /clients
        client.py
        srv1_cl.py
        srv2_cl.py
    /services
        service.py
        /srv1
            conf.py
            srv1.py
            start.bat
        /srv2
            conf.py
            srv2.py
When I store my files like that, I run into a problem: how can a Python script import modules from upper directories? I need to test my client scripts, make some changes, and execute them.
I found some solutions based on adding upper folders to the path (sys.path.insert(0, parentdir), but I'm sure that's not a good way) and on relative imports. But for relative imports I must wrap my scripts into one package.
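For reference, a minimal sketch of that workaround (file names follow the tree above):

# clients/srv1_cl.py -- the sys.path workaround, shown only because it is mentioned above
import os
import sys

# make the repository root importable so common.py can be found
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import common  # from the repository root
import client  # from the same /clients folder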
OK, let's say I do this, but here's another problem: I want to deploy "clients" separately from "services", because I don't want to share the services' code with my users. How can I handle this?
What is the best structure for my project? How can I develop and debug my project when I wrap my code into a package? For testing, I want to start clients from my subfolders.
Good question. I ran into a similar problem with importing modules at different levels of the directory tree, and with deployment. I ended up reorganizing the project tree to be very flat (all files in one directory), naming the files informatively, and refactoring the code a lot to get the desired functionality. And for deployment, I copied the main .pyc file into the subproject folder.
I would love to know a better way though.
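For what it's worth, one way to get independent deployment (my own sketch, not something the answer above settled on) is to make the shared code an installable package that both clients and services depend on; the clients package can then be shipped to users without the services' code:

# clients/setup.py -- a minimal sketch; all names here are assumptions
from setuptools import setup

setup(
    name='myproject-clients',
    version='0.1',
    py_modules=['client', 'srv1_cl', 'srv2_cl'],
    install_requires=['myproject-common'],  # common.py packaged separately
)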

Python app configuration best practices

I know this issue has been discussed before, but I am struggling to find a straightforward explanation of how to approach configuration between a local development machine and a production server.
What I have done so far: I had one my_app_config.py file with machine/scenario sections (test vs. production) that I could just comment out. I would develop with my local machine's path hardcoded, a test database connection string, my test spreadsheet location, etc. When it came time to deploy the code to the server, I commented out the "test" section and uncommented the "production" section. As you may guess, this is fraught with errors.
I recently adopted the Python ConfigParser library to use .ini files. Now I have the following lines in my code (import os added here, which the snippet needs):

import os
import ConfigParser

config = ConfigParser.RawConfigParser()
config.read(os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'settings',
                                         'my_app_config.ini')))
database_connect_string_admin = config.get('Database', 'admin_str')
The problems with this are many...
I need to have the import at the top of every file
The filename my_app_config.ini can't change. So, I rely on comments within the content of the .ini file to know which one I'm dealing with. They are stored in a folder tree so I know which is which.
Notice the path to the config file is defined here, so where the Python file lives in the tree structure dictates whether I get a copy/paste error.
I tried to set environment variables at the beginning of the program, but all the imports for all modules are performed right away at code launch. I was getting "not found" errors left and right.
What I want: to keep all the configuration stored in one place, so it is not easy to lose track of what I am doing. I want an easy way to keep these configuration files (ideally one file or script) under version control (security is a whole other issue, I digress). I want to be able to seamlessly switch contexts (local-test, local-production, serverA-test, serverA-production, serverB-test, serverB-production). My app uses:
my_app_config.ini read by my parser
uwsgi.ini read by the uwsgi application server emperor
web_config.py used by the flask application
nginx.conf symlinked to the web server's configuration
celery configuration
not to mention different paths for everything (ideally handled by the magic config-handling genie). I imagine once I figure this out I will be embarrassed it took so long to grasp.
Are environment variables the way to do what I am trying to do here?
You should try simple-settings. It will resolve all your issues. One way is to set an environment variable.
In development:
$ export SIMPLE_SETTINGS=settings.general,settings.development
$ python app.py
In production:
$ export SIMPLE_SETTINGS=settings.general,settings.production
$ python app.py
You can keep development.py and production.py out of the repository for security reasons.
Example
settings/general.py
SIMPLE_CONF = 'simple'
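A settings/development.py (or settings/production.py) can then override or add values for that context; as I understand simple-settings, modules listed later in SIMPLE_SETTINGS take precedence over earlier ones:

settings/development.py

SIMPLE_CONF = 'development'  # an illustrative override, not from the original answer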
app.py
from simple_settings import settings
print(settings.SIMPLE_CONF)
The documentation describes many more features and benefits.

Recommended place for a Django project to live on Linux

I'm uploading my first Django project to a Linux server. Where in the filesystem should I put my project?
With a PHP, or ASP project, everything goes into /var/www, would it be ok to do the same and add my Django project to the /var/www folder?
In the Django tutorial it states:
Where should this code live?
If your background is in PHP, you're probably used to putting code under the Web server's document root (in a place such as /var/www). With Django, you don't do that. It's not a good idea to put any of this Python code within your Web server's document root, because it risks the possibility that people may be able to view your code over the Web. That's not good for security.
Put your code in some directory outside of the document root, such as /home/mycode.
Filesystem Hierarchy Standard
@Andy Hayden really states where not to place one's code. The Filesystem Hierarchy Standard (FHS) implies the following structure, where PATH maps to PACKAGE or PROVIDER (it is recommended that parties providing multiple packages use PROVIDER/PACKAGE):
/etc/opt/PATH # FHS location for /opt configuration files
/opt/PATH # FHS location for PROVIDER or PACKAGE name
/var/opt/PATH # FHS location for /opt variable storage
The FHS expects /opt/PATH to contain all the material necessary for the successful execution of one's package, so it seems prudent to set up the following symbolic links:
/etc/opt/PATH to /opt/PATH/etc
/var/opt/PATH to /opt/PATH/var
This provides a good basis but Django projects have extraneous requirements that the above structure does not fully meet.
Static Files
Static files are deployed to STATIC_ROOT when one runs python manage.py collectstatic; STATIC_ROOT should point to the web server's root for static delivery, usually /var/www/PATH.
One could link /var/www/PATH symbolically to /opt/PATH/static, but this is typically a bad idea; consider the case where you have a misconfigured server and a user goes to www.domain.tld/../ and copies your work.
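In settings.py terms, the mapping above would look something like this (a sketch; PATH is the placeholder used throughout this answer):

STATIC_URL = '/static/'
STATIC_ROOT = '/var/www/PATH'  # collectstatic target, served directly by the web server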
Settings
If you created your project with django-admin startproject WEBSITE then you will typically have a settings.py file under the WEBSITE folder:
PROJECT/
    WEBSITE/
        settings.py
        ...
If you converted this settings module into a package, or you used some wrapper around django-admin (e.g. django-cms-create etc.), you might instead have:
PROJECT/
    WEBSITE/
        settings/
            __init__.py  # from .settings import *
            settings.py
        ...
You might symlink /etc/opt/PATH to /opt/PATH/WEBSITE/settings instead of /opt/PATH/etc as described above. I can't think of a practical reason for doing so though... YMMV.
Media
Media, typically provided by one's website's users, are placed into MEDIA_ROOT. It seems prudent to map /var/opt/PATH to /opt/PATH/media in this case.
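Again in settings.py terms (a sketch, matching the mapping above):

MEDIA_URL = '/media/'
MEDIA_ROOT = '/var/opt/PATH'  # i.e. the symlink pointing at /opt/PATH/media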
Virtual Environments
/opt/PATH/env seems the most logical location. /var/env/PATH also seems sensible but is probably better suited as a symbolic link to /opt/PATH/env.
Since a virtual environment is neither an application nor a library, the locations /opt/bin and /opt/libs would not do for this. Neither /env/ nor /pyvenv/ conforms to the FHS.
Whiskey
If you're using mod_wsgi with Apache then an invocation similar to python manage.py runmodwsgi --server-root /etc/opt/PATH --setup-only is probably preferable, since it places the Apache control commands into FHS-compliant locations, granted they are more cumbersome to invoke in this case.
Home
To my understanding, /home was traditionally used by PHP developers when they were hosting multiple sites on the same server. If you're using Django you're probably serving your site from a dedicated machine, so this structure loses a bit of favour in that case... YMMV.

Deploying Django Project

I have a Django project that I want to deploy on the Apache 2 HTTP server.
However, I want to automatically copy all the Python files from some directory to the Apache srv directory, which is /srv/www/myproject. Is there an automated Python tool for this purpose?
I have looked into distutils and setup.py, but I am unsure how I would copy all the .py files (along with the directory structure) to the Apache directory.
Any help will be appreciated!!!
Take a look at Fabric; it is a great tool for automated deployments.
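A minimal fabfile might look like this (a sketch against the Fabric 1.x API; the host name and paths are assumptions, and the project is deliberately unpacked outside the document root, per the advice below):

# fabfile.py -- run with: fab deploy
from fabric.api import env, put, run

env.hosts = ['user@myserver']

def deploy():
    # upload an archive of the project and unpack it outside the document root
    put('myproject.tar.gz', '/home/mycode/')
    run('tar -xzf /home/mycode/myproject.tar.gz -C /home/mycode/')
    run('touch /home/mycode/myproject/wsgi.py')  # commonly used to nudge mod_wsgi to reload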
From the django tutorial:
If your background is in PHP, you're probably used to putting code under the Web server's document root (in a place such as /var/www). With Django, you don't do that. It's not a good idea to put any of this Python code within your Web server's document root, because it risks the possibility that people may be able to view your code over the Web. That's not good for security.
Put your code in some directory outside of the document root, such as /home/mycode.
For copying files, you can use something as simple as FTP/SCP, which you can automate; or you can use more full-blown deployment options like Fabric (see this blog entry for a step-by-step guide on Fabric + virtualenv + Apache mod_wsgi).
You can use any tool to automate the task; but please put the code in the appropriate non-web browsable directory.

How to make a module accessible to all applications, including the GAE deployment process

I have a Python module containing some utils that all my GAE applications may use. I created it myself. It is in a separate folder, and I sometimes want to update its code, make refactorings, etc. Every application I create can use functions from this module.
Currently I need to copy the module folder somewhere inside an application and import its functions; an ordinary procedure, nothing fancy. When I make updates to the module's code, I then need to overwrite the copy in every application that already imports it. Then I just deploy the application with the GAE utility and all works fine.
The question is: is it possible not to have many copies of the module, one in every application, all of which have to be overwritten every time I update its code, but instead to have one copy in one place and automatically import it from there? I know I can put the module code somewhere Python searches for modules. However, I still need to copy the module folder into the application when I deploy it to the GAE environment. So, I need one copy of the module accessible to all my applications on my local PC, and I need that folder copied into the app when I deploy it. Is there a good, clean solution? Thanks.
You can store your module in a directory outside all your GAE apps and then create a symbolic link to that directory inside all the GAE apps directories. appcfg.py will follow the symbolic link. Quoting from the Python SDK docs:
If you make a symbolic link to a module's directory in your application directory, appcfg.py will follow the link and include the module in your app.
