django import local settings from the server - python

In my home directory, I have a folder called local. In it, there are files called __init__.py and local_settings.py. My Django app is in a completely different directory. When the app is NOT running in DEBUG mode, I want it to load the local_settings.py file. How can this be achieved? I have read the following:
Import a module from a relative path
Importing files from different folder in Python
http://docs.python.org/tutorial/modules.html
Basically, those tutorials show how to import from another directory, but what about a completely different working tree? I don't want to keep chaining .., .., .. and so on. Is there a way to go straight to the home directory?
I tried the following code:
import os, sys
os.chdir(os.path.join(os.getenv("HOME"), 'local'))
from local_settings import *
But I keep seeing errors for it in my Apache error.log...

os.chdir just affects the current working directory, which has nothing whatsoever to do with where Python imports modules from.
What you need to do is add the local directory to the Python path. You can either do this from the shell by modifying PYTHONPATH, or from inside Python by modifying sys.path:
import sys
import os
sys.path.append(os.path.expanduser("~/local"))
import local_settings
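A minimal sketch of how this could tie into the DEBUG condition from the question; the try/except fallback is my own addition, so machines without a ~/local directory still start:

# at the bottom of settings.py (sketch; assumes DEBUG is already defined above)
import os
import sys

if not DEBUG:
    sys.path.append(os.path.expanduser("~/local"))
    try:
        from local_settings import *  # wildcard import, as in the question
    except ImportError:
        # assumption: silently skip hosts without ~/local/local_settings.py
        pass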

In response to your concerns about source control: you can set the source control to ignore that file, and/or have a symlink installed as part of your deploy script to link the file on the OS into another location. I do both, though not in Django, and it's a trivial task.
Your deploy layout could look like this:
/current ( symlink to /releases/v3 )
/settings/local_settings.py
/releases/v1
/releases/v2
/releases/v3
And a task runs as part of your deploy:
cd /current
ln -s /settings/local_settings.py local_settings.py
If you're deploying with Fabric or Capistrano, it's a few lines of configuration. I'm sure you could do this simply with Puppet/Chef too.
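For example, a minimal Fabric 1.x sketch of that deploy task (the task name and paths are illustrative, matching the layout above):

# fabfile.py
from fabric.api import run

def link_local_settings():
    # -sf replaces any stale link left over from the previous release
    run("ln -sf /settings/local_settings.py /current/local_settings.py")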

Related

python import packages that know nothing about my system configuration

I have a project I am working on, let's call it Project, which lives in the directory Project somewhere wholly unknown to me (really it lives both on my local system and on a couple of Docker build systems). In that project, I have some source files, source/module1.py and source/module2.py. I also have some example files, some test files, and an __init__.py. So my directory looks something like this:
Project
    __init__.py
    /source
        module1.py
        module2.py
    /test
        testRunner.py
    /examples
        awesomeExample.py
However, module1 needs some stuff from module2. My naive self thought this could be done by putting an import statement in module1:
import module2
# Do some other interesting stuff
And this works, but only when I am running or importing module1 from the source directory. If I am, for example, running some unit tests from another directory (test/testRunner.py), whether from the test directory or the main Project directory, the import fails. The same goes for running an example in the examples directory.
So here is my problem: in general, I don't know where the calling script lives. It might be in the examples directory, it might be in the test directory, or it might be in the main Project directory (for example, when trying to import stuff via an __init__.py). How do I ensure that module1 can always import module2 in each of these scenarios?
I am not looking for a solution like "add all those directories to your Python path". Initially I just added Project to my Python path on my local machine and did all my imports relative to that (import Project.source.module2), but this (predictably) caused my builds to fail on the Docker instances. I don't just want this to work on my local machine, but also on the Docker instances I'm using to build and test this software, and on any user's machine that subsequently installs it (i.e. by doing a pip install Project). What is the most robust way to make sure this dependency is satisfied? How can I make sure module1 can import module2 regardless of where module1 itself is imported from? Any Python 3.x solution is welcome.
I figured out a way to do it (credit here) - it's a little inelegant, but extremely robust. It works on my local machine regardless of whether I use an import statement or run a script directly, as well as on my build servers, which use GitHub Actions and Travis CI.
Basically, I added a file in the source directory, called context.py with the following contents:
import os
import sys

# Locate the directory containing this file, resolve the sibling source/
# directory, and put it at the front of the import search path.
fileLocation = os.path.dirname(os.path.abspath(__file__))
sourceLocation = os.path.abspath(os.path.join(fileLocation, '..', 'source/'))
sys.path.insert(0, sourceLocation)
This finds the location of the file currently being executed and uses it to extend the Python path. Then, at the top of my module1.py file, I have:
import context
import module2
Now, whenever module1 is imported, it successfully imports module2. More elegant answers, or comments on why this works and in which cases it might fail, are appreciated.
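One more conventional alternative, assuming source/ contains an __init__.py and Project is imported or installed as a package, is a relative import inside module1.py; it avoids the sys.path manipulation entirely, though module1.py can then no longer be run directly as a script:

# module1.py
from . import module2  # resolved relative to the containing package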

Python: Import scrypt from subfolder to another subfolder

I am working on a Python project (2.7) and I have an issue with imports. When I run main.py, it starts scripts from the tests folder, saves output to logs, and everything works fine.
root
- logs
- staticCfg
  - config.py
- tests
  - systemTest
    - scrypt1.py
    - scrypt2.py
  - userTest
    - uScrypt1.py
- main.py
My static variables (email, name, etc.) are located in config.py. I need to import config.py in scrypt1.py or scrypt2.py. I tried adding __init__.py to the tests, systemTest and staticCfg folders, but I always get an error.
In my scrypt1.py:
import staticCfg as cfg
...
or
from staticCfg import *
...
I get the error:
ImportError: No module named staticCfg
The import mechanism of Python can be a bit tricky.
You can refer to the documentation for more information: Python Import Mechanism
When you use absolute imports (your import does not start with a .), as you do, the import path starts from your main script (the one you launch). In your case, that's scrypt1.py. Starting from that location, Python can't find the package staticCfg.
For me, the simplest solution is to create a main script in your root directory and call scrypt1.py from there (imported using from tests.systemTest import scrypt1). In this case, the base package will be your root folder and you will have access to the package staticCfg from all your script files, as you wanted.
You may also add the root folder to PYTHONPATH.
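A minimal sketch of that root-level entry point (run() is a hypothetical function inside scrypt1.py):

# main.py, at the root folder -- launching from here puts root on sys.path,
# so scrypt1's own `import staticCfg as cfg` resolves
from tests.systemTest import scrypt1

scrypt1.run()  # hypothetical entry point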

How can I import a function or class from one folder to another folder

I am using Python 3.6. I have the following folder structure:
main_directory
-- sub_directory_one
   -- config.py
-- sub_directory_two
   -- test.py
I want to import a function or class from config.py into test.py.
I have tried:
from sub_directory_one.config import class_name
from main_directory.sub_directory_one.config import class_name
But nothing is working.
A few have suggested adding the project to the system path, but I am currently working on a Mac; what will happen if I deploy this to an Ubuntu server?
Thanks
If your sub directories are (supposed to be) Python packages, add an empty __init__.py file in these directories. If you then run your main application from the main directory, you should be able to use:
from sub_directory_one.config import class_name
Alternatively, if config.py and test.py are Python modules that are just separated in different directories for whatever practical reasons, you should add the sub directories to your Python search path. This can be done by setting the environment variable PYTHONPATH before starting your main application, or by extending the Python variable sys.path in your main script before importing these modules. In that case you should use:
from config import class_name
For more information about Python modules and packages, see the official documentation.
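A minimal sketch of the second variant, extending sys.path from test.py itself rather than via the environment (the path arithmetic is my assumption, based on the layout above):

# test.py
import os
import sys

# .../main_directory/sub_directory_one, located relative to this file
cfg_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "sub_directory_one")
sys.path.insert(0, os.path.abspath(cfg_dir))

from config import class_name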

Import file from parent directory?

I have the following directory structure:
application
    tests
        main.py
    main.py
application/main.py contains some functions.
tests/main.py will contain my tests for these functions, but I can't import the top-level main.py. I get the following error:
ImportError: Import by filename is not supported.
I am attempting to import using the following syntax:
import main
What am I doing wrong?
If you'd like your script to be more portable, consider finding the parent directory automatically:
import os, sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
# import ../db.py
import db
You must add the application dir to your path:
import sys
sys.path.append("/path/to/dir")
from app import object
Or from the shell:
setenv PYTHONPATH $PYTHONPATH:"path/to/dir"
In case you use Windows:
Adding a variable to the path in Windows.
Or from the command line:
set PYTHONPATH=%PYTHONPATH%;C:\path\to\dir
Please mind the difference between PYTHONPATH, PATH, and sys.path.
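A quick way to see the difference: PYTHONPATH entries are merged into sys.path when the interpreter starts, while PATH is only used by the OS to locate executables:

import os
import sys

print(os.environ.get("PYTHONPATH"))  # what the shell exported, if anything
print(sys.path)                      # the list Python actually searches for imports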
Late to the party - most of the other answers here are unfortunately not correct, apart from LennartRegebro's (and BrenBarn's), which is incomplete. For the benefit of future readers, the OP should, first of all, add the __init__.py files as in:
root
    application
        __init__.py
        main.py
        tests
            __init__.py
            main.py
then:
$ cd root
$ python -m application.tests.main
or
$ cd application
$ python -m tests.main
Running a script directly from inside its package is an antipattern - the correct way is running with the -m switch from the parent directory of the root package - this way all packages are detected and relative/absolute imports work as expected.
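For example, with the layout above and the first invocation (python -m application.tests.main from root), tests/main.py can import the code under test either absolutely or relatively:

# application/tests/main.py
from application import main as app_main   # absolute import
from .. import main as app_main_relative   # relative import; equivalent under this invocation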
First of all you need to make your directories into packages, by adding __init__.py files:
application
    tests
        __init__.py
        main.py
    __init__.py
    main.py
Then you should make sure that the directory above application is on sys.path. There are many ways to do that, like making the application into a package and installing it, or just executing things in the right folder, etc.
Then your imports will work.
You cannot import things from parent/sibling directories as such. You can only import things from directories on the system path, or the current directory, or subdirectories within a package. Since you have no __init__.py files, your files do not form a package, and you can only import them by placing them on the system path.
To import a file in a different subdirectory of the parent directory, try something like this:
import os
import sys

sys.path.append(os.path.abspath('../other_sub_dir'))
import filename_without_py_extension
In Python, . refers to the current directory and .. to the parent directory.
To import a file from the parent directory you can use:
from .. import filename (without the .py extension)
Note that this relative form only works when the importing file is itself part of a package.

How to import a Django project settings.py Python file from a sub-directory?

I created a sub-directory of my Django project called bin where I want to put all command-line Python scripts. Some of these scripts need to import my Django project's settings.py file, which is in a parent directory of bin.
How can I import the settings.py file from a sub-directory of the project?
The code that I use in my command-line script to set up the "Django context" of the project is:
from django.core.management import setup_environ
import settings
setup_environ(settings)
This works fine if the script is in the root directory of my project.
I tried the following two hacks to import the settings.py file and then set up the project:
import os
os.chdir("..")
and:
import sys
sys.path = [str(sys.path[0]) + "/../"] + sys.path
The crude hack can import settings.py, but then I get the error:
project_module = __import__(project_name, {}, {}, [''])
ValueError: Empty module name
I think your approach may be over-complicating something that Django 1.x provides for you. As long as your project is in your python path, you can set the environment variable DJANGO_SETTINGS_MODULE at the top of your script like so:
import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'
In your command line script where you need to read your settings, simply import the settings module from 'django.conf' as you would do in your application code:
from django.conf import settings
And presto, you have your settings and a Django-enabled environment for your script.
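Putting the pieces together, a minimal sketch of such a script (the settings module name follows the example above; the printed setting is arbitrary):

#!/usr/bin/env python
# bin/myscript.py -- set the settings module before anything imports django.conf
import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'

from django.conf import settings

print(settings.DEBUG)  # any project setting is now available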
I personally prefer to set my DJANGO_SETTINGS_MODULE using /usr/bin/env in a bash script called proj_env so I don't have to repeat it:
#!/bin/bash
proj_env="DJANGO_SETTINGS_MODULE=myproject.settings"
/usr/bin/env $proj_env ${*}
With this, I can now run any Python script with my Django application in context:
proj_env python -m 'myproject.bin.myscript'
If you use virtualenv, this also gives you a good place to source the activate script.
This is going one level up from your question, but probably the best solution here is to implement your scripts as custom manage.py (django-admin.py) commands. This gives you all of Django's functionality (including settings) for free with no ugly path-hacking, as well as command-line niceties like options parsing. I've never seen a good reason to write Django-related command-line scripts any other way.
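A minimal sketch of that approach (the app and command names are placeholders; the file must live under an app's management/commands/ directory):

# myapp/management/commands/myscript.py -- run with: python manage.py myscript
from django.conf import settings
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Example script that gets settings and option parsing for free"

    def handle(self, *args, **options):
        self.stdout.write("DEBUG is %s" % settings.DEBUG)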
Add the parent directory to your path:
import sys
sys.path.append('../')
import settings
Update from the comments:
Don't forget the __init__.py file in the directory that has your settings.py. – S.Lott
Let's say your project directory is /opt/someproject/.
This has files like:
manage.py
Now, your subdirectory may be anywhere; it does not matter.
E.g. the subdirectory could be:
/opt/someproject/dir1/dir2
For you to import your project settings inside /opt/someproject/dir1/dir2, you need to set your PYTHONPATH variable:
export PYTHONPATH=/opt/someproject/
Now, to import modules from bin:
from someproject import bin
