Deploy a Python virtualenv on an offline machine running a different system

I need to deploy a Python application to a server with no internet access.
I created a virtual environment on my host machine, which runs Ubuntu. It contains a Python script that uses a variety of non-standard libraries. I used the --relocatable option to make the links relative.
I copied the environment over to my client machine, which runs RedHat and has no internet access.
After activating it with source my_project/bin/activate, the environment does not seem to be working - the Python used is the standard system one and the libraries don't work.
How can the virtual environment be deployed on a different server?
Edit: this is normally done by creating a requirements.txt file and then using pip to install the libraries on the target machine; however, in this case that's not possible because the machine is offline.

For anyone dealing with the same problem:
The quickest way for me was to:
Create a VirtualBox VM running the target system on the machine with internet access
Download wheel files using pip download
Copy the downloaded files over to the target machine
Install with pip install --no-index --find-links pip_libs/ requests
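For example, a minimal sketch of those last two steps, assuming the needed packages are listed in a requirements.txt and collected into a pip_libs/ directory (both names are placeholders):
pip download -r requirements.txt -d pip_libs/    # run inside the VirtualBox guest with internet access
pip install --no-index --find-links pip_libs/ -r requirements.txt    # run on the offline target after copying pip_libs/ over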

What error is shown when you try to activate it? Make sure the Python version and the environment's PATH are consistent with the ones on the previous system.
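For instance, after activating, a quick sanity check (just a sketch) would be:
source my_project/bin/activate
which python    # should point into my_project/bin, not /usr/bin
python --version    # should match the version the environment was built with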

Related

python: How to install virtualenv without internet connection

I am filling in for someone while they are on vacation and I am new to Python. I've been asked to install several packages in a virtual environment. The big catch is that the server cannot be on a public net, so I will be downloading the software on a different server and copying it to the server where the researcher will be working.
I found another thread, "Install Virtualenv without internet connectivity", but that doesn't fit the situation I am in - it looks like they can start from a server on the net to complete their installs and want to share that virtualenv with other systems in a lab environment that may not have internet connectivity.
Another thread, "python: How to create virtualenv without internet connection", is similar, but it looks like they already have virtualenv installed. I don't find virtualenv installed here.
This Windows Server 2016 system is locked down such that I cannot copy and paste the commands I ran to gather the information below, so forgive any typos in this hastily written message. I found the installed Python version with:
python --version
Python 3.6.2rc1
I have not been able to find an installer to download for virtualenv. Do I need to download the Python installer again, re-run it and select additional options?
Thank you for any help you are able to provide.
edited to add:
Based on feedback, I changed the command (in an administrative command window) to
python -m venv [path] and I have been able to make some progress.
I have the ability to download gz, whl or other files and move them to this server to run them there, but this server cannot be put online to download the installers directly nor can it connect to a repo to download dependencies. I cannot set up the environment on a different machine that has connectivity and share it without violating the security requirements. Thank you for the link to the Python Package Offline Installation thread - I think I was so narrowly focused on the virtual environment that I missed that post.
You're using Python 3.6, which means venv is included and pip can be bootstrapped if necessary. Creating a venv does not need internet access:
python3 -m venv .venv --prompt=myvenv
Installing pip does not require internet access:
python3 -m ensurepip
Installing a package from a local file does not require internet access:
python3 -m pip install --no-index --disable-pip-version-check ./mydist.tar.gz
Should you have more than one package to install (for example, if mydist has dependencies), you can point pip at a local directory of downloaded distributions with --no-index --find-links <dir> instead of letting it reach out to PyPI.
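For instance, if the downloaded files sit alongside a requirements.txt in a local wheelhouse/ directory (names assumed here for illustration), the offline install could look like:
python3 -m pip install --no-index --find-links ./wheelhouse/ -r requirements.txt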
A solution for people on older versions of Python is covered here.

Run remote python code via ssh with uninstalled modules

I wish to connect to a Linux machine over ssh (from my code) and run some code that uses Python libraries which are not installed on the remote machine. What would be the best way to do so?
using a call like this:
cat main.py | ssh user@server python -
will run main.py on the server, but won't help me with the dependencies. Is there a way to somehow 'compile' the relevant libraries and have them sent over just for running my code?
I wish to avoid installing the libraries on the remote machine if possible
Try virtualenv:
pip install virtualenv
then use
virtualenv venv
to create a separate Python environment in the current path (in the folder venv).
Instead of installing multiple packages into the default Python path, virtualenv needs only one package installed.
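As a rough sketch of how that could be combined with the ssh approach from the question (assuming the remote machine can reach PyPI, and using requests as a placeholder dependency):
ssh user@server 'python -m pip install --user virtualenv && python -m virtualenv ~/venv && ~/venv/bin/pip install requests'    # one-time setup of an isolated environment
cat main.py | ssh user@server '~/venv/bin/python -'    # run the script with the venv's interpreter instead of the system python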

Relocating virtual environment project from local to server (flask project), have to install dependencies again?

I have created a flask application in a virtual environment on my local machine and I could run it locally (at http://localhost:5000).
I then put this project in a repo, went to my server, and ran git clone to fetch the project.
All files are identical on my local machine and in my server.
I then wanted to test this virtual environment on the server by trying source .venv/bin/activate and running the app.
However, I ran into an error saying I do not have flask:
Traceback (most recent call last):
  File "__init__.py", line 1, in <module>
    from flask import Flask
ImportError: No module named flask
I am assuming that I have to initialize something in the virtual environment first, like installing all of the dependencies. Or do I have to pip install flask again? (It would be kind of funny to do that...)
As a general rule python environments are not portable across machines.
This means that you cannot reliably expect to port the virtual environment across machines. This is especially true if you are moving stuff between different operating systems. For example, a virtual environment created in Windows will not work in Linux.
Similarly, a virtual environment created in OSX will not work in Linux. Sometimes you can get Linux-to-Linux compatibility, but this is by chance and not to be relied upon.
The reasons are numerous - some libraries need to be built against native extensions, others require compatible system libraries in place to work, etc.
So, the most reliable workflow is the following:
You can (but I would recommend against this) put your virtual environment in the same directory as your project. If you do so, make sure you don't add the virtual environment's root directory to your source control system. It is best to keep your virtual environments separate from your source code (see the virtualenvwrapper project for a great way to manage your virtual environments separately).
You should create a requirements file, by running pip freeze > requirements.txt. Keep this file updated and add it to your source control system. In your target system, simply create an empty virtual environment and then pip install -r requirements.txt to make sure all requirements are installed correctly. Doing so will make sure that any native extensions are also built and installed.
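A rough sketch of that workflow (assuming the environment lives in .venv on the server, as in the question):
pip freeze > requirements.txt    # on your local machine, inside the activated virtualenv
virtualenv .venv    # on the server, after cloning the repo (or: python -m venv .venv on Python 3)
source .venv/bin/activate
pip install -r requirements.txt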
A few possible issues:
When you created your original virtual environment, did you specify --no-site-packages? If not, your package may be using elements from the system's site packages.
Some packages rely on system-installed libraries that may be missing on your target system.
Is your server running on similar hardware to your development system, with the same OS? If not, your virtualenv is unlikely to work without reinstalling packages, as any C/C++ extensions will have been built for the wrong hardware/OS.
The thing is that virtualenv is not a package builder (look at pyinstaller for that), but rather a development and test environment. When you go to distribute your code to a new platform, provided you started off with --no-site-packages, you can easily find out which packages you need to install on the new target.
So basically - yes, you (or more likely the system admin) do need to run pip install flask and probably several other things!
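For example, a quick check (sketch only) to confirm which interpreter the app is actually using, followed by installing the missing package into the server's own environment:
.venv/bin/python -c "import sys; print(sys.prefix)"    # should print the venv's path, not the system prefix
.venv/bin/pip install flask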

Importing and debugging a Python Django project made in a different environment

I am working on a Django project that was created by another developer on a different machine. I see that in the root of the application there is a .virtualenv directory. Is it possible to simply set up this project locally on my Windows machine using the project settings and Python version (the app uses 2.7), so that I can run it like a local Django application and debugging is feasible?
I have access to the development web server and have copied the full source of the app down to my Win7 machine but cannot seem to get things setup correctly to run the app locally so I can debug.
I currently have Python 2.7, 2.7.5 and 3.3.2 installed on my local dev machine. I would call myself pretty new to Django and Virtualenv.
If anyone has any guidance on how I can get my environment straightened out so I can run the app with debugging, I would be very thankful.
Thank you in advance.
Using a virtualenv environment created on a different machine is not recommended. There are things hard-wired for the particular system it was created on, and some apps may have components compiled for that particular system.
You should create a new virtualenv environment on your machine, install dependencies and move the Django project there.
Note on installing dependencies - there might be a file named requirements.txt somewhere. If it's there and it's kept up to date you can install all the dependencies by running a single command while in your virtualenv:
pip install -r requirements.txt
If you can't find it, ask the other developer to create it. They just need to run this inside their own environment:
pip freeze > requirements.txt
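A rough sketch of the whole setup on the Windows machine, assuming Python 2.7 is installed at C:\Python27 and the project has a standard manage.py (adjust paths as needed):
pip install virtualenv
virtualenv -p C:\Python27\python.exe env
env\Scripts\activate
pip install -r requirements.txt
python manage.py runserver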
I once faced the same problem and it took me so much time to configure another environment that I eventually had to create a VM with the same version of OS and libraries. I then made a raw copy of the project and it worked fine.

How to relocate a Python installation installed as a non-root user on one server to another server in a different directory?

We have two Linux servers: one is on a private network and does not have internet access, the other is on a public network and has internet access. Both servers run the same RHEL-5 OS.
On the server which has internet access, I have installed Python under my home directory as a non-root user. Then I used pip to install other packages; pip also resolves and installs the required dependencies.
How can I relocate this Python installation, which was installed without root access, to the other server? Also, I want to relocate it as root under a different directory.
Why do I want to do this? Since the private server does not have internet access, pip won't work for installing hundreds of modules and their dependencies. Since the servers are running the same OS release, is there an easy way to relocate the Python installed on one server to another server, though in a different directory?
If possible, I'd try to go through the front door and actually install the packages on the other server. Cloning of all packages should in principle be equivalent to the following:
On the first machine (with Internet access and installed packages):
mkdir /tmp/pypackages
pip download -r <(pip freeze) -d /tmp/pypackages
On the second machine:
Copy the packages to /tmp/pypackages
Install them:
cd /tmp/pypackages
pip install --no-index *
(either as root or as a regular user).
Note that when I try to run the first set of commands on my machine, I get some errors which I blame on the fact that not all of the packages shown by pip freeze were actually installed with pip. You may need to filter that list as well. It'll probably be easier to save the output of pip freeze to a file and edit it.
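Putting it together, a sketch of the full round trip (file names assumed; as noted above, the frozen requirements file will likely need hand-editing):
pip freeze > requirements.txt    # on the machine with internet access; edit out anything not installed via pip
pip download -r requirements.txt -d /tmp/pypackages
tar czf pypackages.tgz -C /tmp pypackages
tar xzf pypackages.tgz -C /tmp    # on the offline machine, after copying requirements.txt and pypackages.tgz over
pip install --no-index --find-links /tmp/pypackages -r requirements.txt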
P.S. python itself can also be downloaded, transferred, and installed locally by means of the system package manager.
