Install python pip modules only for use by Ansible

I am using Ansible from a pipeline agent to configure Ubuntu VMs, and I would like to use the azure_rm_storageblob and mssql_script modules directly on the Ubuntu VM that I am configuring. I had some issues running this, because the packaging Python module was not installed on the hosts.
Is there a way to install the pip modules only for use when I run Ansible (maybe a virtual environment)? I don't want to mess with the pip modules that are installed on the server for other purposes.
If this is done using something like a Python virtual env, how do I make sure that it is used when I connect to the VM using Ansible?

You can use a virtual environment to solve that problem (see the Python venv documentation).
First you create the environment with:
python -m venv venv
Then, depending on your OS, you have to activate your virtual environment. On Unix you can do:
source venv/bin/activate
The name of your virtual env should now appear in your terminal prompt; at that point you know you are using it and not the default Python environment.
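The part of the question about making Ansible actually use that environment when connecting to the VM is usually handled by pointing the ansible_python_interpreter variable at the venv's Python. A minimal sketch, not part of the answer above, assuming the venv lives at /opt/ansible-venv on the target host; the pip package names (packaging, azure-storage-blob, pymssql) are only illustrations of what the two modules might need, so check each module's documentation for the real list:
# On the target Ubuntu VM: create the venv and install the modules' Python dependencies into it
python3 -m venv /opt/ansible-venv
/opt/ansible-venv/bin/pip install packaging azure-storage-blob pymssql
# In the inventory (INI style), tell Ansible to use that interpreter for this host:
#   myvm ansible_python_interpreter=/opt/ansible-venv/bin/python
# Or pass it ad hoc when running the playbook (site.yml is a placeholder name):
ansible-playbook site.yml -e ansible_python_interpreter=/opt/ansible-venv/bin/python
Because the interpreter is set per host (or per run), the system Python and its pip packages on the VM stay untouched.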

Related

Python Virtual Environment Help - Django

I am struggling to install Django into my new project using pipenv; it keeps telling me to run another virtual environment rather than actually installing Django.
I tried to do what it says using the other virtual env, but it still won't work.
You've got that all wrong.
"The venv module supports creating lightweight “virtual environments”, each with their own independent set of Python packages installed in their site directories. A virtual environment is created on top of an existing Python installation, known as the virtual environment’s “base” Python, and may optionally be isolated from the packages in the base environment, so only those explicitly installed in the virtual environment are available.
When used from within a virtual environment, common installation tools such as pip will install Python packages into a virtual environment without needing to be told to do so explicitly."
pip is Python's package manager and therefore the tool for installing packages such as Django. If you are running Linux you can use the following commands:
cd storefront
python -m venv venv        # create a new virtual environment named "venv"
. venv/bin/activate        # activate the virtual environment
pip install django         # install Django
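Since the question uses pipenv rather than plain venv, the roughly equivalent pipenv workflow would be this sketch, run from the project directory:
cd storefront
pipenv install django      # creates the project's virtualenv if needed and installs Django into it
pipenv shell               # starts a shell with that virtualenv activated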

package not being permanently installed inside the python virtual environment

I'm building a REST API using the Django Python framework, with many external Python packages. I created a Python virtual environment (python -m venv venv) and, after activating it (venv\Scripts\activate), installed the requests package (python -m pip install requests). Then I pushed my project to my git repo and cloned it onto another machine. When I tried to run my Django project there, it asked me to install the requests package again. How can I permanently install packages into my Python virtual environment, or somewhere else, so that I don't have to reinstall them on every machine? I'm looking for something similar to NodeJS/npm, where packages are installed locally into the project's node_modules folder and you don't have to reinstall them on different machines. Thanks
The environment itself is not shareable in the way you describe. I'd recommend using Docker for this use case: if you build a Docker image with the correct dependencies, you can easily work in the same environment on different computers. A Python venv cannot be used this way.
Nevertheless, if your requirements.txt file pins package versions, the venvs you create on the two machines should be fairly similar (depending, of course, on other factors such as the OS, Python version, etc.).
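Without Docker, the usual way to get the same packages on another machine is to commit a pinned requirements.txt and recreate the venv there. A minimal sketch with standard pip commands; the activation line depends on your OS:
# On the original machine, record the installed packages and commit the file with the project
python -m pip freeze > requirements.txt
# On the other machine, after cloning the repository
python -m venv venv
venv\Scripts\activate                       # Windows; on Linux/macOS use: source venv/bin/activate
python -m pip install -r requirements.txt   # reinstall the same pinned versions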

Can vagrant and ansible playbook be run "inside" of pipenv on macOS

I am using macOS, where I have created a pipenv environment. After activating it, I install Ansible inside the created virtualenv. Then, with the virtualenv active in my terminal, I run vagrant up in that same terminal.
Is this good practice?
My motivation to do this:
I want to avoid the whole mess with pip packages and versions on macOS. I use Vagrant for testing my Ansible playbooks.
He wants Ansible to run in the virtual environment created using Pipenv, not on one of the target (testing) virtual machines. That way he can port his entire virtual environment to any machine, and in theory it stays clean of the OS Python packages while being exactly reproducible as code. The Vagrant VMs are only testing machines to run the playbooks against before moving on to another development or production platform.
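A sketch of that workflow, assuming Pipenv and Vagrant are already installed and the project directory (a hypothetical name here) holds both the Pipfile and the Vagrantfile:
cd my-ansible-project
pipenv install ansible     # Ansible goes into the project's virtualenv, not into the system Python
pipenv shell               # activate the virtualenv in this terminal
ansible --version          # confirms the venv's Ansible is the one on PATH
vagrant up                 # bring up the Vagrant test VMs (and any configured provisioning)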

How to create a Python virtual environment independent of OS and Python version

I am trying to create a virtual environment to run a script which requires Python 3.6. I started off with Pipenv, but I am unable to create the same environment on other platforms via the Pipfile.lock or requirements.txt unless the other platform has Python 3.6 installed. I have read this post, but I am unsure which direction I should take to create a virtual environment that can be shared and that runs its own version of Python, independent of the operating system and of the Python version installed on the other platform.
Virtual environments are not portable; they depend on the Python installation you have.
You can't share or distribute a virtual environment with others, because you can't control which version of Python they are using.
If you want to distribute your code along with all dependencies, including a specific version of the Python interpreter, you can use PyInstaller. It is far from perfect and a little bit hacky, and it generates a package that is specific to the operating system.
https://pyinstaller.readthedocs.io/en/stable/operating-mode.html
There is also a detailed step-by-step guide on how to use PyInstaller.
https://realpython.com/pyinstaller-python/
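For a first impression, basic PyInstaller usage looks roughly like this, with script.py standing in for your entry-point script:
python -m pip install pyinstaller
pyinstaller --onefile script.py    # bundles the script, its dependencies and the Python interpreter
# the resulting executable is written to the dist/ directory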
This is, step by step, how I use a Python virtual environment and share it with co-workers.
To check for the presence of python and virtualenv, run the following commands:
which python3
python3 -m pip list | grep env
which virtualenv
Install a python virtual environment builder:
python3 -m pip install virtualenv
Create a virtual environment named venv inside the project's directory: virtualenv venv
To activate this environment use this command inside project's directory: source venv/bin/activate
Install the Python module dependencies listed in requirements.txt:
python3 -m pip install -r requirements.txt
You should activate the virtual environment whenever you work with Python in this directory, both to install packages and to run commands in the project.
To deactivate the environment simply run: deactivate

How to use a virtual environment

I require both Python 2.7 and Python 3.5 for different packages. I am trying to install the package NepidemiX, and I get an error when I do this because I have a newer version of Python installed.
To combat this I am trying to create a virtual environment, using the virtualenv package.
I have created and activated this and am now faced with
(my_project)Your-Computer:your_project UserName$
in my terminal.
How do I now proceed to install my package from here? Do I need to install python 2.7 in this environment first, or do I simply copy the desired package into the environment ... ?
Please could you instruct me how to correctly set this up?
Many thanks!
A virtual environment is only for libraries; it uses the Python versions installed on your computer. You can specify the version of Python with the -p option when creating the environment, for example virtualenv -p python3 env creates a Python 3 environment (provided you have it installed on your computer and on the PATH). Check this answer.
After you activate the environment (source env/bin/activate), just pip install your libraries, and they will be installed against that environment's Python version.
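Applied to the question, a sketch could look like the following, assuming python2.7 is still installed on the machine and that NepidemiX installs from PyPI under that name:
virtualenv -p python2.7 my_project     # create a Python 2.7 environment
source my_project/bin/activate         # activate it; the prompt gains the (my_project) prefix
pip install NepidemiX                  # pip now installs into the Python 2.7 environment
deactivate                             # leave the environment when finished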
