I'm building a REST API using the Django Python framework, and I'm using many external Python packages. I created a virtual environment (python -m venv venv), activated it (venv\Scripts\activate), and installed the requests package (python -m pip install requests). Then I pushed my project to my Git repo and cloned it onto another machine. When I tried to run my Django project there, it asked me to install the requests package again. Why? And how can I permanently install packages into my Python virtual environment, or somewhere else, so that I don't have to reinstall them on every machine? I'm looking for something like Node.js/npm, where all packages are installed locally into the project's node_modules folder and you don't have to reinstall them on different machines. Thanks
The environment itself is not shareable in the way you describe. I'd recommend using Docker for this use case: if you create a Docker image with the correct dependencies, you can easily work in the same environment on different computers. The Python venv cannot be used this way.
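A minimal sketch of such an image (assuming your dependencies are pinned in a requirements.txt at the project root and a standard Django layout with manage.py; adjust names to your project):

FROM python:3.11-slim
WORKDIR /app
# Install pinned dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the project
COPY . .
# Development server only; use gunicorn or similar in production
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]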
Nevertheless, if your requirements.txt file pins package versions, then the venvs you create on the two machines should be reasonably similar (depending, of course, on other parameters like the OS, Python version, etc.).
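The usual workflow for that (a sketch, assuming the file is named requirements.txt and committed to the repo) is, on the first machine, inside the activated venv:

$ pip freeze > requirements.txt

and on the second machine, after cloning and creating/activating a fresh venv:

$ pip install -r requirements.txt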
I am struggling to install Django in my new project using pipenv; it keeps telling me to run another virtual environment rather than actually installing Django.
I tried doing what it says using the other virtual env, but it still won't work.
You've got that wrong.
"The venv module supports creating lightweight “virtual environments”, each with their own independent set of Python packages installed in their site directories. A virtual environment is created on top of an existing Python installation, known as the virtual environment’s “base” Python, and may optionally be isolated from the packages in the base environment, so only those explicitly installed in the virtual environment are available.
When used from within a virtual environment, common installation tools such as pip will install Python packages into a virtual environment without needing to be told to do so explicitly."
pip is the Python package manager and therefore the tool for installing packages such as Django. If you are running Linux you can use the following commands:
-> cd storefront
-> python -m venv venv (create a new virtual environment named "venv")
-> . venv/bin/activate (activate the virtual environment)
-> pip install django (install Django)
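To confirm Django actually went into the venv (the reported version will vary):

-> python -m django --version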
I am using Ansible from a pipeline agent to configure Ubuntu VMs, and I would like to use the azure_rm_storageblob and mssql_script modules directly on the Ubuntu VM that I am configuring. I had some issues running this because the packaging module was not installed on the hosts.
Is there a way to install the pip modules only for use when I run Ansible (maybe a virtual environment)? I don't want to interfere with the pip modules installed on the server for other purposes.
If this is done using something like a Python virtual env, how do I make sure it is used when I connect to the VM with Ansible?
You have to use virtual environments to solve that problem (see the Python venv documentation).
First you create the environment with:
python -m venv venv
Then, depending on your OS, you have to activate your virtual environment; on Unix you can do:
source venv/bin/activate
The name of your virtual env should then appear in your terminal prompt; at that point you know you are using it and not the default Python environment.
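To make Ansible use that environment on the managed VM, you can point ansible_python_interpreter (a standard Ansible connection variable) at the venv's interpreter in your inventory. A sketch, where the group name, host address, and /opt/ansible-venv path are placeholders:

[ubuntu_vms]
myvm ansible_host=203.0.113.10 ansible_python_interpreter=/opt/ansible-venv/bin/python

Any pip packages you install into /opt/ansible-venv are then visible only to Ansible's tasks, not to the system Python used by other software on the server.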
If multiple developers are working on a Python package, how should we deal with these problems:
(1) What is the best practice to make all developers use the same dev environment?
(2) On the dev end, do all of us need to use .../site-packages/mypackage as our dev path (map code from version control to there and develop code there), as if things were installed to that path using pip?
I suggest using a virtual environment (venv) to control the Python version and package versions. Inside a venv, pip freeze > requirements.txt outputs only what your project needs. Other developers just need to create a fresh venv in their own project folder and run pip install -r requirements.txt. This avoids version conflicts between different projects.
https://docs.python.org/3/library/venv.html
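On question (2): you don't need to develop inside site-packages. A common pattern is an editable install, which makes the package in your working tree importable as if it were installed (a sketch, assuming a setup.py or pyproject.toml at the repo root):

$ pip install -e .

Code changes in your checkout are then picked up immediately, without reinstalling.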
When I pip install a package, it gets installed in my Mac's library. I am using PyCharm, which lets me click on a package like a hyperlink. But instead of going to the site-packages in my virtualenv, it goes to my Mac's library, which is
/Library/Frameworks/Python.Framework/Versions/3.5/lib/python3.5/site-packages/gdata/youtube/
when it should be
myproject/lib/python3.5/site-packages/gdata/youtube/
Why is that?
You should activate your virtual environment and install packages into it. In PyCharm you can do it like this:
Go to File > Settings > Project > Project Interpreter
Now you have to select the interpreter for this project. Browse for it or pick it from the drop-down if available. In your case this should be the interpreter inside your virtualenv, i.e. something like:
myproject/bin/python3.5
I am using PyCharm Community Edition on Ubuntu, but the process should be similar on a Mac.
I think you want to create a virtual environment for your project.
Install the virtualenv tool:
$ pip install virtualenv
Then create your project's folder
$ cd my_project_folder
$ virtualenv venv
This creates a copy of Python in whichever directory you ran the command in, placing it in a folder named venv.
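You then activate it before installing anything, so pip targets the venv rather than the system Python (macOS/Linux shown; gdata is the package from your question):

$ source venv/bin/activate
$ pip install gdata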
Source
https://github.com/pypa/virtualenv
For further reading:
https://realpython.com/blog/python/python-virtual-environments-a-primer/
You should create a virtual environment and then run pip within that environment. For example, I use Anaconda (which I thoroughly recommend if you are installing a lot of scientific libraries).
To activate the environment "hle" I type in:
source /Users/admin/.conda/envs/hle/bin/activate hle
Once I've done this, the pip command references the virtual environment's location rather than the standard Mac location. So, when I install "mypackage" as follows:
pip install mypackage
It subsequently installs the files into the virtual environment's folder and not into the usual Mac system folders.
You can find out about the Anaconda virtual environment (and download it) over here: http://conda.pydata.org/docs/install/quick.html but other virtual environments (like Virtualenv) work in the same way.
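For completeness, creating such an environment in the first place looks like this (the name hle and the Python version are just examples; newer conda releases use conda activate instead of source activate):

$ conda create --name hle python=3.5
$ source activate hle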
I am using numpy / scipy / pynest to do some research computing on Mac OS X. For performance, we rent a 400-node cluster (running Linux) from our university so that tasks can be run in parallel. The problem is that we are NOT allowed to install any extra packages on the cluster (no sudo or any installation tool); they only provide the raw Python itself.
How can I run my scripts on the cluster, then? Is there any way to bundle the modules (numpy and scipy also include some compiled binaries, I think) so that they can be interpreted and executed without installing packages?
You don't need root privileges to install packages in your home directory. You can do that with a command such as
pip install --user numpy
or from source
python setup.py install --user
See https://stackoverflow.com/a/7143496/284795
The first alternative is much more convenient, so if the server doesn't have pip or easy_install, you should politely ask the admins to add it, explaining the benefit to them (they won't be bothered anymore by requests for individual packages).
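For reference, --user installs go into your per-user site-packages (typically under ~/.local on Linux), which Python picks up automatically; you can print the exact location with:

$ python -m site --user-site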
You could create a virtual environment through the virtualenv package.
This creates a folder (say venv) with a new copy of the Python executable and a new site-packages directory, into which you can "install" any number of packages without needing any kind of administrative access at all. Thus, activating the environment through source venv/bin/activate will give Python an environment that's equivalent to having those packages installed.
I know this works for SGE clusters, although how the virtual environment is activated might depend on your cluster's configuration.
You can try installing virtualenv on your cluster within your own site-packages directory using the following steps:
Download virtualenv (e.g., from PyPI) and put it on your cluster
Install it using setup.py to a specific, local directory to serve as your own site-packages:
python setup.py build
python setup.py install --install-base /path/to/local-site-packages
Add that directory to your PYTHONPATH:
export PYTHONPATH="/path/to/local-site-packages:${PYTHONPATH}"
Create a virtualenv:
virtualenv venv
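Then activate it before installing or running anything, as with any virtualenv:

$ . venv/bin/activate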
You can also make a module importable from an arbitrary path by appending that path to sys.path before the import:
import sys
sys.path.append("/path/to/local-site-packages")
The Anaconda Python distribution solves many of the issues discussed in this question. Anaconda does not require admin or root access and can be installed into your home directory. It comes with many of the packages in question (scipy, numpy, sklearn, etc.) as well as the conda installer for adding further packages should they be necessary.
It can be downloaded from https://www.continuum.io/downloads
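Once it's installed and an environment is active, adding further packages is a single command (package names here are just examples):

$ conda install numpy scipy scikit-learn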