I have a Python application that comes with a command line script. I expose the script via setuptools "entry point" feature. Whenever a user runs the script, I would like the environment to be consistent with the package's requirements.txt. This means that the environment must contain versions of each dependency package that match the version specifiers in requirements.txt.
I know that this can be achieved with venv/virtualenv by making my users create a virtual environment, install requirements.txt in it, and activate that virtual environment whenever they run the script. I do not want to impose this burden of manually invoking virtualenv on users. Ruby's bundler solves this problem by providing bundler/setup: when loaded, it modifies Ruby's $LOAD_PATH to reflect the contents of the Gemfile (the analogue of requirements.txt), so it can be placed at the top of a script to transparently control the runtime environment. Does Python have an equivalent? That is, is there a way to set the environment at runtime according to requirements.txt without imposing additional complexity on the user?
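For reference, the script is exposed roughly like this in setup.py (a minimal sketch; the package, module, and function names are placeholders):

from setuptools import setup, find_packages

setup(
    name="mypackage",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # "mytool" and "mypackage.cli:main" are placeholder names
            "mytool = mypackage.cli:main",
        ],
    },
)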
Yes, more than one.
One is pex:
pex is a library for generating .pex (Python EXecutable) files which
are executable Python environments in the spirit of virtualenvs.
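As a sketch of how that might look (hedged; mytool is a placeholder for your console-script name, and "." tells pex to build your own project in as well):

pip install pex
pex . -r requirements.txt -c mytool -o mytool.pex   # bundle project + pinned deps
./mytool.pex                                        # runs in its own isolated environment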
The other is Platter:
Platter is a tool for Python that simplifies deployments on Unix
servers. It’s a thin wrapper around pip, virtualenv and wheel and aids
in creating packages that can install without compiling or downloading
on servers.
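If I remember Platter's workflow correctly, a build is roughly this, run from the project directory (hedged sketch):

pip install platter
platter build   # should leave a deployable tarball under dist/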
I don't see why it wouldn't be possible for a Python program to install its own dependencies before importing them, but it is unheard of in the Python community.
I'd rather look at options to make your application a standalone executable, as explained here, for instance.
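One common tool for that (among several) is PyInstaller; a minimal invocation looks like this (mytool.py is a placeholder for your script):

pip install pyinstaller
pyinstaller --onefile mytool.py   # bundled executable ends up under dist/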
Related
My question is: do I have to install Django every single time in my virtual environment in order to run my Python files? And is this taking up a bunch of space on my machine? My project also uses matplotlib, and every virtual environment I create asks me to install the matplotlib module too. It's getting annoying. Do I have to do this every time?
I'm new to Django. I wanted to run some Python files in Django, but they weren't working, so after some research I found out I needed to run my PyCharm project in a virtual environment in order to run these Python files.
My folders look like this: pycharmProjects -> my project
I enter pycharmProjects and set up the virtual environment using "pipenv shell". Then I run "python3 manage.py runserver". It turns out I must install Django in the virtual environment before the files run.
Short answer is no, you don't have to use a virtual environment at all and can install your dependencies globally instead. However, you will soon find that it causes a lot of issues. The main reason to create a virtual environment is to keep control of your dependencies and to prevent bugs caused by projects getting their wires crossed.
Short answer: yes.
If you create a virtualenv, you have to install all the packages your program needs.
Long answer:
You could install Django system-wide and then create a virtualenv with the option
--system-site-packages; then Django would be used from your globally installed Python, as sketched below.
(Or you could install everything just into your global Python, but I personally don't think that is good practice.)
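A minimal sketch of that setup (assuming virtualenv is installed; env is a placeholder name):

pip install django                       # into the global Python
virtualenv --system-site-packages env    # env can see the global Django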
If you work with many different projects, I think you will avoid a lot of trouble by using one virtualenv per project.
Trouble meaning that one project breaks because a pip install for another project changed the version of a package to one that the first project can't handle.
I would recommend creating a requirements.txt file for each project that lists the dependencies; then you can populate a fresh virtualenv with the following command:
pip install -r requirements.txt
If you have requirements.txt files, you can recreate virtualenvs rather quickly when going back to an old project, and you can delete virtualenvs whenever you run out of disk space. If you want to be on the safe side, run pip freeze > pipfreeze.txt prior to deleting the virtualenv, and use pip install -r pipfreeze.txt if you want to create one with the same modules at the same versions.
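Put together, the delete-and-restore cycle looks roughly like this (env is a placeholder name for the virtualenv folder):

pip freeze > pipfreeze.txt      # snapshot the exact installed versions
deactivate
rm -rf env                      # reclaim the disk space
virtualenv env                  # later: recreate
. env/bin/activate
pip install -r pipfreeze.txt    # restore the same versions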
You might also want to look at direnv or autoenv if you work on a Linux-like system.
They automatically switch to the required virtualenv when you change into a project's working directory.
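With direnv, for instance, this is driven by a per-project .envrc file; direnv's standard library provides a "layout python" helper, so something like this should suffice (hedged sketch):

# .envrc in the project root; direnv loads it when you cd in
layout python3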
I have a Python library that I want to help out with and fix some issues in. I just don't know how to test my changes, given the complexity of how Python/pip installs libraries.
I have the library installed with pip, and I can run Python code that uses the library by doing a "from ... import *". But now that I want to make changes to it, I pulled the code with git and plan to branch to work on my changes. That's fine. I will then open a pull request to merge any changes, given the tests pass.
But after I make a change, how do I integrate my changes into Python to test out the changes I made to the library? Can pip install my custom/modified version of the library?
I have looked around and haven't successfully found an answer to this but perhaps I'm not looking in the right spot.
Can pip install my custom/modified version of the library?
Yes.
There are various ways of approaching this question. A common solution is the use of Python virtual environments. This allows you to create an isolated Python environment that does not share the same packages as your system Python install. You can then install things into it (such as your modified Python library) to test it out.
To get started, you need the virtualenv tool. This is probably available as a package for your distribution, but you can also install it using pip. Once you have it, you can run in the same directory as your code:
virtualenv .venv
This creates a virtualenv named .venv. You can call it anything you want, but naming it .venv (or anything starting with a .) means it won't clutter up the output of ls in your workspace.
Next, you need to activate the virtualenv:
. .venv/bin/activate
This modifies your $PATH to place the virtualenv at the front of the list of directories. Now when you type python or pip, you'll be using the virtualenv version.
If your code has a setup.py file, you can install it like this:
pip install -e .
The -e means you want to perform an "editable" install, which means python will use the code "in place", and any changes you make will be immediately visible to the code you use for testing.
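To check that the editable install took effect (mylibrary is a placeholder for the actual package name):

pip show mylibrary    # "Location:" should point at your git checkout, not site-packages
python -c "import mylibrary; print(mylibrary.__file__)"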
When you're done, you can run:
deactivate
This will remove the changes that activate made to your environment.
For more information:
Pipenv & Virtual Environments discusses a higher level tool for managing virtual environments.
Virtualenvwrapper is another take on a higher level management tool.
I am writing code in Python that uses numpy, matplotlib, etc.
How can I make sure that even a remote web server with Python installed but no extra modules can run the code without errors?
I usually work in a Linux environment, so from source code I can install the libraries into a prefix directory and keep that alongside my code, then locally extend PYTHONPATH in my Python code to use that directory.
But I started to realize this isn't the correct way: first, it doesn't work cross-platform, since the libraries differ, and the code inside my script that extends PYTHONPATH may not work due to the use of "/" in paths. I am also not sure the compiled code will work across different environments on the same Linux platform.
So I think I need to create directories like unix, windows, osx, etc. and put my code there? I believe this is what I find when I download code online. Is that what developers generally do to avoid these issues?
A popular convention is to list requirements in a text file (requirements.txt) and install them when deploying the project. Depending on your deployment configuration, the libraries can be installed in a virtual environment (google keyword: virtualenv), in a local user folder (pip install --user -r requirements.txt, if this is the only project under that account), or globally (pip install -r requirements.txt, e.g. in a Docker container).
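For the libraries mentioned above, such a requirements.txt might look like this (the version pins are illustrative, not prescriptive):

numpy>=1.20
matplotlib>=3.5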
Let me first outline my desired solution and then ask a specific question about how to achieve this state.
I'm soon starting two coding projects in Python. I've used Python before, but never on such big projects. My ideal scenario would be a setup where I can run virtual environments and different Python versions for various projects. Some research pointed me to virtualenv/virtualenvwrapper and pyenv. It seems that using pyenv-virtualenv or pyenv-virtualenvwrapper there is a nice way to specify the virtual environment and Python version for a specific project.
Question: Once I've set up a virtual environment and Python version for a specific project, how easily can I switch at a later stage to a newer Python version? Let's say I've started project A with Python 3.4, and one year in the future I would like to move everything to Python 3.6. Is this possible in a neat way?
Sure:
$ rm -r my-python-3.4-env
$ virtualenv -p python3.6 my-python-3.6-env
$ source my-python-3.6-env/bin/activate
In other words, each virtual environment is just a folder with the necessary files in it. You "activate" an environment with the source .../activate command (in the case of virtualenv), and you leave it just as easily. To switch to a different environment, you simply create a new one with a specific Python executable and activate it.
What you want to be careful about is keeping your installation repeatable: if you depend on external modules (which modern projects typically do), you don't want to install each dependency by hand, but automate it instead. For instance, you create a setuptools setup.py file which lists your dependencies, and then have it install them into your new environment automatically:
$ source my-python-3.6-env/bin/activate
(my-python-3.6-env) $ python setup.py develop
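Such a setup.py might look roughly like this (a minimal sketch; the project name and the dependency pin are placeholders):

from setuptools import setup, find_packages

setup(
    name="project-a",
    packages=find_packages(),
    install_requires=[
        "requests>=2.20",  # list your project's real dependencies here
    ],
)

With that in place, moving to a new Python version is just the three commands above plus one python setup.py develop run.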
I have a Python project which contains three components: main executable scripts, modules which those scripts rely on, and data (sqlite3 databases, flat files, etc.) which those scripts manipulate. The top level has an __init__.py file so that other programs can also borrow from the modules if needed.
The question is, is it more "Pythonic" or "correct" to move my project into the default site-packages directory, or to modify PYTHONPATH to include one directory above my project (so that the project can be imported from)? On the one hand, what I've described is not strictly a "package", but a "project" with data that can be treated as a package. So I'm leaning in the direction of modifying PYTHONPATH (after all, PYTHONPATH must exist for a reason, right?)
Definitely do not add your project to site-packages; this is going to spoil your system Python installation and will fire back the moment some other app arrives there or you need to install something else.
There are at least two popular options for installing Python apps in an isolated manner:
Using virtualenv
See the virtualenv project. It allows:
creation of a new isolated Python environment: the Python for this environment is distinct from the system one and has its own PYTHONPATH setup, which keeps all installed packages private to it.
activation and deactivation of a given virtualenv from the command line for command-line usage. After activating, you can run pip install etc., and it will affect only the given virtualenv.
calling any Python script by starting it with the virtualenv's Python copy: this uses the related virtualenv (note that there is no need to call activate), as sketched below.
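A sketch of that last point (the paths are placeholders):

/path/to/my-env/bin/python myscript.py   # runs inside my-env, no activate needed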
Using zc.buildout
This package provides the buildout command. With it, you use a special configuration file, and this allows the creation of a local Python environment with all packages and scripts.
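From memory of zc.buildout's conventions, a minimal buildout.cfg looks roughly like this (hedged; the part name and egg are placeholders):

[buildout]
parts = app

[app]
recipe = zc.recipe.egg
eggs = myproject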
Conclusions
virtualenv seems more popular today, and I find it much easier to learn and use
zc.buildout might also work for you, but be prepared for a somewhat longer learning time
installing into system Python directories should always be reserved for very special cases (pip, easy_install); better to avoid it
installing into private directories and manipulating PYTHONPATH is also an option, but you would be repeating what virtualenv already provides