Running Python from a virtualenv with Apache/mod_wsgi, on Windows

I'm trying to set up WAMP server. I've got Apache working correctly, and I've installed mod_wsgi without a hitch.
Problem is, I'm using virtual environments (using virtualenv) for my projects. So obviously, mod_wsgi is having problems locating my installation of Django.
I'm trying to understand how I can get mod_wsgi to work well with the virtualenvs. The documentation seems to think this isn't possible:
Note that the WSGIPythonHome directive can only be used on UNIX systems and is not available on Windows systems. This is because on Windows systems the location of the Python DLL seems to be what dictates where Python will look for the Python library files. It is not known at this point how one could create a distinct baseline environment independent of the main Python installation on Windows.
From here: mod_wsgi + virtualenv docs.
Does anyone have some idea on how to make this work?

You can activate the environment programmatically from Python by adding this to your .wsgi file before importing anything else.
From virtualenv's docs:
Sometimes you can't or don't want to use the Python interpreter created by the virtualenv. For instance, in a mod_python or mod_wsgi environment, there is only one interpreter. Luckily, it's easy. You must use the custom Python interpreter to install libraries. But to use libraries, you just have to be sure the path is correct. A script is available to correct the path. You can set up the environment like:
activate_this = '/path/to/env/bin/activate_this.py'
execfile(activate_this, dict(__file__=activate_this))
This will change sys.path and even change sys.prefix, but also allow you to use an existing interpreter. Items in your environment will show up first on sys.path, before global items. However, this cannot undo the activation of other environments, or modules that have been imported. You shouldn't try to, for instance, activate an environment before a web request; you should activate one environment as early as possible, and not do it again in that process.
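Note that execfile only exists on Python 2; on Python 3 a rough equivalent at the top of the .wsgi file would be (the path is a placeholder):

activate_this = '/path/to/env/bin/activate_this.py'
with open(activate_this) as f:
    exec(f.read(), {'__file__': activate_this})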

Related

What does Python Virtual Env "activate_this.py" actually do?

I'm working on a python CLI project that, unfortunately, is a little complex on the virtual environments side, as it has to deal with multiple ones.
In my quest to get my tool to work properly and robustly, I came across Virtual Env's "activate_this.py" file, which is generated inside the .venv/bin/ directory.
I thought this could be useful for my needs so I started experimenting with it, but I haven't truly understood what it is actually doing under the hood.
For example, the documentation says the script should be used when a file must be run by an existing Python interpreter (which is my case), but with the virtual environment activated.
What I'm struggling to understand is: What exactly does activating a virtual environment on a given file mean? (Since we're not changing the interpreter)
Also, on the activate_this.py file there's this bit of code:
# add the virtual environments libraries to the host python import mechanism
prev_length = len(sys.path)
for lib in "../lib/python3.9/site-packages".split(os.pathsep):
    path = os.path.realpath(os.path.join(bin_dir, lib))
    site.addsitedir(path.decode("utf-8") if "" else path)
sys.path[:] = sys.path[prev_length:] + sys.path[0:prev_length]
Does this mean that you can now import libraries that are installed on the virtual env you've just activated, even if they are not installed in the environment of the base interpreter?
You've got it. It rewrites sys.path so that the virtual environment's site-packages directories come first: libraries installed in the virtual environment become importable, and they shadow any globally installed versions of the same packages.
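A minimal sketch of the effect, assuming a virtual environment at /path/to/.venv with requests installed only there:

import sys

activate_this = "/path/to/.venv/bin/activate_this.py"
with open(activate_this) as f:
    exec(f.read(), {"__file__": activate_this})

import requests                # not importable before activation
print(sys.prefix)              # now points inside .venv
print(requests.__file__)       # resolves to .venv/.../site-packages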

Python script in systemd: virtual environment or real environment

I have been trying to run a python script on start-up (on a Pi). I initially did this via an .sh script triggered by cron.
After posting about the problem on the Raspberry Pi Stack Exchange (https://raspberrypi.stackexchange.com/questions/110868/parts-of-code-not-running-when-autostarting-script-in-crontab), the suggestion was to use systemd.
The person helping me there has suggested not using a virtual environment when executing the Python script (they note their limited familiarity with Python), and using the real environment instead. But other resources strongly suggest the use of a virtual environment (e.g. https://docs.python.org/3/tutorial/venv.html).
In the hope of setting this up correctly could anyone weigh in on the correct approach?
Use the virtual environment. I don't see any reason not to. At some point you might want multiple Python applications running at the same time on that system, and these applications might require different versions of the same dependency, which would put you back at square one. So: use the virtual environment.
When configuring systemd, crontab, or whatever, make sure to use the python binary that is placed inside the virtual environment's bin directory, so that there is no need to activate the virtual environment:
/path/to/venv/bin/python -m my_executable_module
/path/to/venv/bin/python /path/to/my_script.py
/path/to/venv/bin/my_executable_script
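For instance, a minimal systemd unit along these lines (all paths and names are placeholders) runs the script with the virtual environment's interpreter, with no activation step:

[Unit]
Description=My Python script
After=network.target

[Service]
ExecStart=/path/to/venv/bin/python /path/to/my_script.py
Restart=on-failure

[Install]
WantedBy=multi-user.target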
systemd is going to try to run your script on startup so your virtual environment will not have been activated yet. You can (maybe) avoid that issue by telling systemd to use the python in the virtualenv's bin, with the appropriate environment variables. Or you can activate as a pre-run step for that script's launch in systemd. Maybe.
But on balance I'd make it easy on systemd and your OS and ignore the virtualenv absolutists. Get the script to work on your dev machine using virtualenv all you want, but then prep systemd to use the global python, with suitable packages installed. You can always use virtualenvs on that pi, for scripts that don't have to work with systemd. Systemd doesn't always have the clearest error messages.
(If you need to import custom modules you could inject directories into sys.path in your script. This could even avoid installing packages, for the global Python, entirely.)
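A minimal sketch of that sys.path injection, assuming the custom modules live in a hypothetical /opt/myproject:

import sys

sys.path.insert(0, "/opt/myproject")  # must run before the import below

import mymodule  # hypothetical module at /opt/myproject/mymodule.py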
This answer is certainly opinion-based.

How to set LD_LIBRARY_PATH for apache+wsgi website

I'm trying to use a Python library in my WSGI (Apache + Flask) based website.
When using the library in a standalone command-line script, I have to add the library path to LD_LIBRARY_PATH.
So this works for standalone script:
# export LD_LIBRARY_PATH=/usr/local/lib:/usr/local/lib64
# python script.py
Now when I try to use this python library through Apache+wsgi, I need to pass the same path to apache workers. How can I accomplish that?
Is the library required by a Python extension module you have compiled and installed? If so, set the LD_RUN_PATH environment variable to the directory the library is in when compiling and installing that extension module. That way the location is embedded in the extension itself and you don't need LD_LIBRARY_PATH at run time.
The only other way to do it using environment variables is to set LD_LIBRARY_PATH in the startup scripts for Apache, so that it is already set when Apache starts. This means fiddling with system startup scripts, so it is not ideal or always practical.
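On Debian/Ubuntu, for example, Apache's environment file is typically /etc/apache2/envvars (the location varies by distribution); adding a line like the following and restarting Apache should propagate the path to the workers:

export LD_LIBRARY_PATH=/usr/local/lib:/usr/local/lib64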
One final way, which isn't really a great idea and may not even work, is to use:
LoadFile "/usr/local/lib/libmylib.so"
in the Apache configuration. This will force linking of the specific library into Apache at startup, but depending on how the library is being used, this may not work.
For me, the simpler solution was to walk through the directories on LD_LIBRARY_PATH and create symlinks in /usr/lib. Very efficient if your LD_LIBRARY_PATH is not too long:
~/opt/lib$ for f in * ; do sudo ln -s "$PWD/$f" "/usr/lib/$f" ; done

Should I add my Python project to the site-packages directory, or append my project to PYTHONPATH?

I have a Python project which contains three components: main executable scripts, modules which those scripts rely on, and data (sqlite3 databases, flat files, etc.) which those scripts manipulate. The top level has an __init__.py file so that other programs can also borrow from the modules if needed.
The question is, is it more "Pythonic" or "correct" to move my project into the default site-packages directory, or to modify PYTHONPATH to include one directory above my project (so that the project can be imported from)? On the one hand, what I've described is not strictly a "package", but a "project" with data that can be treated as a package. So I'm leaning in the direction of modifying PYTHONPATH (after all, PYTHONPATH must exist for a reason, right?)
Definitely do not add your project to site-packages: that pollutes your system Python installation and will come back to bite you the moment another app arrives there or you need to install something.
There are at least two popular options for installing Python apps in an isolated manner:
Using virtualenv
See the virtualenv project. It allows:
creation of a new isolated Python environment - its python is distinct from the system one and has its own PYTHONPATH setup, which keeps all installed packages private to the environment.
activation and deactivation of a given virtualenv from the command line. After activating, you can run pip install etc. and it will affect only that virtualenv.
calling any Python script via the virtualenv's copy of Python - this uses the related virtualenv (note that there is no need to call activate first). See the sketch below.
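A minimal sketch of that workflow (names and paths are placeholders):

$ virtualenv env                  # create the isolated environment
$ source env/bin/activate         # activate it for this shell
(env) $ pip install somepackage   # installs only into env
(env) $ deactivate
$ env/bin/python myscript.py      # uses env with no activation needed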
Using zc.buildout
This package provides the buildout command. With it, you use a special configuration file that allows creation of a local Python environment with all packages and scripts.
Conclusions
virtualenv seems more popular today, and I find it much easier to learn and use
zc.buildout might also work for you, but be prepared for a somewhat longer learning time
installing into the system Python directories should always be reserved for very special cases (pip, easy_install); better to avoid it
installing into private directories and manipulating PYTHONPATH is also an option, but you would be repeating what virtualenv already provides

virtualenv on Windows: not over-riding installed package

My current setup is Python 2.5/ Django 1.1.1 on Windows. I want to start using Django 1.2 on some projects, but can't use it for everything. Which is just the sort of thing I've got virtualenv for. However, I'm running into a problem I've never encountered and it's hard to Google for: installing Django 1.2 into a virtualenv has no effect for me. If I then activate the environment and do
python
import django
django.VERSION
I get "1.1.1 Final". Django is installed in the site-packages directory of my environment and the init file in the root shows that it is 1.2. But the environment falls back to 1.1.1, even if I create the environment with the --no-site-packages flag. What am I screwing up?
Based on the bug you filed at bitbucket, it looks like you're using the PYTHONPATH environment variable to point to a directory with some packages, including Django 1.1.1. By design, PYTHONPATH always comes first in your sys.path, even when you have a virtualenv activated (because PYTHONPATH is under your direct and immediate control, and people use it for local overrides).
In this case, if you don't want that PYTHONPATH when this virtualenv is activated, you'll need to take care of that yourself; perhaps by creating a custom batch file that both calls the virtualenv's activate.bat and also modifies PYTHONPATH.
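A hedged sketch of such a wrapper batch file (the env path is a placeholder):

@echo off
rem Clear the inherited PYTHONPATH override, then activate the virtualenv
set PYTHONPATH=
call C:\path\to\env\Scripts\activate.bat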
Some tools you can use to diagnose these problems:
To see where django is coming from, print django.__file__. It will show the file indicating django's location on the file system.
To see all the places Python will look for packages, print sys.path. It's a list of directories.
To see imports as they happen, start python as python -v, and you'll see lots of debugging information about where packages are being imported.
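For instance, the first two checks look like this from within the interpreter:

import sys
import django

print(django.__file__)  # the file django was actually imported from
print(sys.path)         # every directory Python searches, in order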