Ansible winrm with virtualenv - python

I'm trying to work with ansible, winrm, virtualenv and Jenkins...
Currently, I have installed Ansible with yum via epel-release.
Jenkins has only basic configuration for now.
I have then created a virtualenv inside the Jenkins home named $HOME/ansible-winrm, and inside it I have installed winrm (the pywinrm package) via pip.
What I'm trying to do is:
- create a simple job on Jenkins with only a shell script calling ansible-playbook, and it should have access to the winrm library installed inside my local virtualenv.
- It should be as transparent as possible.
P.S. It seems that the python binary is hard-coded inside the ansible-playbook script.
What are your best practices to solve this issue?

The best way to do it is to install pywinrm with pip into your user site-packages (option --user).
Ex: pip install --user pywinrm
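If you would rather keep the virtualenv approach, here is a minimal sketch of a Jenkins "Execute shell" build step, assuming you also pip install Ansible inside the virtualenv so that the hard-coded interpreter used by ansible-playbook is the venv's python (inventory.ini and site.yml are placeholders):

# source the virtualenv so ansible-playbook and pywinrm both come from it
. "$HOME/ansible-winrm/bin/activate"
pip install ansible pywinrm
ansible-playbook -i inventory.ini site.yml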

Related

How do I install python dependency modules through bamboo

I am trying to run a python program through Bamboo and need to install some python modules like flask, xlrd, etc. How do I install python dependency modules through Bamboo?
You have two options:
Remote into or log into the Bamboo agent and manually install the modules. This is a one-time install, and then they will be there for tasks to use in the future.
Run the job using a Docker host instead of the local agent. Then you can specify all the dependencies in the Docker image that is used to build (e.g., Python version, imports).
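For the first option, a sketch of what the one-time install on the agent might look like (module names taken from the question; the exact pip invocation depends on which python the build uses):

# run once on the Bamboo agent, with the same python the build will use
python -m pip install --user flask xlrd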

Deploy a Python virtualenv to a machine with no internet and a different system

I need to deploy a python application to a server with no internet access.
I have created a virtual environment on my host machine, which runs Ubuntu. It contains a python script along with a variety of non-standard libraries. I have used the --relocatable option to make the links relative.
I have copied the environment over to my client machine, which runs RedHat and has no internet access.
After activating it with source my_project/bin/activate, the environment does not seem to be working - the python used is the standard system one and the libraries don't work.
How can the virtual environment be deployed on a different server?
Edit: this is normally done by creating a requirements.txt file and then using pip to install the libraries on the target machine; however, in this case that's not possible as the machine is offline.
For anyone dealing with the same problem:
The quickest way for me was to:
- Create a VirtualBox VM running the target system on the internet-connected machine
- Download wheel files using pip download
- Move them over to the target machine
- Install with pip install --no-index --find-links pip_libs/ requests
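Put together, the workflow might look like this (a sketch, assuming a requirements.txt listing the needed libraries and a VM that matches the target's OS, architecture and python version):

# on the internet-connected VM that mirrors the target system
pip download -r requirements.txt -d pip_libs/
# copy pip_libs/ (and the project) to the offline machine, then on the target:
pip install --no-index --find-links pip_libs/ -r requirements.txt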
What error is shown when you try to activate it? Make sure the python version and environment PATH are consistent with the ones on the previous system.

Run remote python code via ssh with uninstalled modules

I wish to connect to a Linux machine over ssh (from my code) and run some code that uses python libraries that are not installed on the remote machine. What would be the best way to do so?
using a call like this:
cat main.py | ssh user@server python -
will run main.py on the server, but it won't help me with the dependencies. Is there a way to somehow 'compile' the relevant libraries and have them sent over just for running my code?
I wish to avoid installing the libraries on the remote machine if possible
Try virtualenv:
pip install virtualenv
then use
virtualenv venv
to create a separate python environment in the current path (in the venv folder).
Instead of installing multiple packages in the default python path, only the virtualenv package itself needs to be installed there.
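Building on that, a rough sketch of how it could look over ssh, assuming virtualenv is available on the server and requirements.txt lists what main.py needs (paths and names are illustrative):

# one-time setup: an isolated environment on the server, leaving the system python untouched
scp requirements.txt user@server:/tmp/
ssh user@server 'virtualenv ~/venv && ~/venv/bin/pip install -r /tmp/requirements.txt'
# then run the script with that environment's interpreter, as in the question
cat main.py | ssh user@server '~/venv/bin/python -'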

"shade is required for this module" even though shade is installed

I'm trying to deploy an ansible playbook to spin up some new openstack instances and keep getting the error
"shade is required for this module"
Shade is definitely installed, as are all its dependencies.
I've tried adding
localhost ansible_python_interpreter="/usr/bin/env python"
to the ansible hosts file as suggested here, but this did not work.
https://groups.google.com/forum/#!topic/ansible-project/rvqccvDLLcQ
Any advice on solving this would be most appreciated.
In my hosts file I have the following:
[local]
127.0.0.1 ansible_connection=local ansible_python_interpreter="/usr/bin/python"
So far I haven't been using venv and my playbooks work fine.
By adding ansible_connection=local, it tells your playbook to be executed on the Ansible machine (I guess that's what you are trying to do).
Then when I launch a playbook, I start with the following:
- hosts: local
  connection: local
Not sure if that's the problem. If this does not work, you should give us more information (extract of your playbook at least).
Good luck!
Try installing ansible using pip, because for some reason the python environment of the ansible package provided by my distro isn't the same as the one where the shade module was installed (using pip).
On ArchLinux
sudo pacman -R ansible
sudo pip install ansible
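Whichever route you take, it helps to confirm which interpreter Ansible actually runs under and whether that interpreter can import shade. A few illustrative checks (the /usr/bin/python path is only an example; use whatever ansible_python_interpreter points at):

# list the pythons on PATH and see where pip put shade
which -a python
pip show shade
# can the interpreter named in ansible_python_interpreter actually import it?
/usr/bin/python -c 'import shade; print(shade.__file__)'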

How to install Django with pip inside a virtualenv in /var/www/html/project_name folder?

I'm using Ubuntu 15.04 with Python 2.7, pip 1.5.6 and virtualenv 1.11.6.
I will create a Django project inside /var/www/html/project_name (and work in that directory) to use it with Apache2.
I created a virtual environment named venv inside the project_name folder for syncing purposes.
With the virtual environment activated, I just can't run pip install django because I get a 'Permission denied' message. So I try the command sudo pip install django, but that installs Django globally.
So, running which pip, I get /var/www/html/project_name/venv/bin/pip.
But running sudo which pip, I get /usr/bin/pip.
Does anyone have an idea how to install Django (or any other package) inside the virtual environment?
PS: I know it's possible to run sudo venv/bin/pip install django, but it doesn't seem very useful.
Your trouble is presumably that you don't have write access to this /var/www/html/project_name directory. Your choices are:
Change the permissions, recursively, so that you do have permission to write to that directory (see the sketch after the commands below).
Run the following commands:
$ sudo su
# . venv/bin/activate
# pip install django
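For the first choice, a sketch of the permission change (the user and group are illustrative; adjust them to whatever your Apache setup expects):

$ sudo chown -R $USER:www-data /var/www/html/project_name
$ . venv/bin/activate
$ pip install django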
Just to add to what everyone's been saying (and I sort of skimmed over): NEVER EVER put sensitive things (that includes your django project) under the document root. You can store it somewhere like /var/www/scripts, but don't put it under the document root.
Another way of deploying is to use something like gunicorn as the "main" webserver and then have whatever world-visible webserver (like apache) reverse proxy to gunicorn. I've done this with nginx and it's fairly easy to set up; the only downside is that you then have to set up something extra in your system's init scripts to start gunicorn.
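If you go that route, a minimal sketch of the gunicorn side (the module path and port are illustrative; the apache or nginx virtual host then proxies to 127.0.0.1:8000):

$ venv/bin/pip install gunicorn
$ venv/bin/gunicorn project_name.wsgi:application --bind 127.0.0.1:8000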
