I have a simple Python GUI application (with a standard structure, similar to https://github.com/kennethreitz/samplemod); it consists of several .py files in a subdirectory. I would like to send the folder with all the source code files to person B so he can easily run it.
I need to zip the folder and transfer it on a physical drive, so hosting on PyPI is not possible.
Person B is likely to have Python installed, but providing a simple way for him to install the dependencies would be nice. Person B might have Linux, Windows, or macOS.
It is preferable for person B to be able to run the application in an easy way (preferably by running a script or a few commands) that is considered good practice. I have read that, for example, altering PYTHONPATH is not good practice.
Please explain in which format YOU would want to receive such an application folder to be able to easily run it (consider that you want to run it once, then delete it and forget about it).
How should I prepare the application? What instructions should I give to person B?
Add a file to your application that contains your dependencies (it can be generated with pip freeze, for instance). All of your dependencies can then be installed with pip install -r yourDependencyFile.txt
You can also easily make a bash script or something similar that installs the dependencies and then starts the application. This will depend on your setup, but it could be as easy as
pip install -r dependencies.txt && python main.py
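If you want to keep person B's system Python clean, a slightly fuller sketch of such a run script could create a virtualenv first (the file names dependencies.txt and main.py are the ones assumed above; adjust them to your project):

#!/bin/sh
# Create an isolated environment, install dependencies into it, then run the app.
python3 -m venv .venv
.venv/bin/pip install -r dependencies.txt
.venv/bin/python main.py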
Say you have a project, and you want to export it so that you can run it on another machine that:
You don't have root access on
You cannot assume any python packages to be installed other than python itself (not even pip)
Is there a way to export the project, even if it is just a simple script, with everything that it imports, and then everything that the imports need, etc.?
For example, my project uses a library called python-telegram-bot. It has a list of requirements, and I have tried running pip install -r requirements.txt --target myapp to install the requirements into the app's folder, but this is not recursive. For example, requests is not in that list, yet it is needed by the app. And if I manually add requests, there are things that requests itself needs that aren't part of the list either.
Is there a way to collect every last bit of requirements into a single folder so that my script functions on an entirely vanilla installation of python?
PyInstaller creates an executable, and you can either roll all dependencies into one file (which makes it a bit slow to load) or create a folder with all the packages, modules, imports, etc. that your script needs to run on any machine. You can Google the docs for PyInstaller; it's all pretty well covered.
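For instance, a minimal invocation might look like this (myapp.py stands in for your entry script):

pyinstaller --onefile myapp.py   # single-file build; a bit slower to start
pyinstaller myapp.py             # default folder build under dist/; starts faster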
Hope that helps, Kuda
I'm a Java/Scala dev transitioning to Python for a work project. To dust off the cobwebs on the Python side of my brain, I wrote a webapp that acts as a front-end for Docker when doing local Docker work. I'm now working on packaging it up and, as such, am learning about setup.py and virtualenv. Coming from the JVM world, where dependencies aren't "installed" so much as downloaded to a repository and referenced when needed, the way pip handles things is a bit foreign. It seems like best practice for production Python work is to first create a virtual environment for your project, do your coding work, then package it up with setup.py.
My question is, what happens on the other end when someone needs to install what I've written? They too will have to create a virtual environment for the package, but they won't know how to set it up without inspecting the setup.py file to figure out what version of Python to use, etc. Is there a way for me to create a setup.py file that also creates the appropriate virtual environment as part of the install process? If not, or if that's considered a "no" (as one respondent stated in reply to this SO post), what is considered "best practice" in this situation?
You can think of virtualenv as an isolation layer for every package you install using pip. It is a simple way to handle different versions of Python and packages. For instance, say you have two projects which use the same packages but different versions of them. By using virtualenv you can isolate those two projects and install the different package versions separately, rather than into your system-wide installation.
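As a quick sketch of that isolation (the package name and versions here are purely illustrative):

virtualenv projectA-env
projectA-env/bin/pip install "somepackage==1.0"
virtualenv projectB-env
projectB-env/bin/pip install "somepackage==2.0"
# Each project now sees only its own version of somepackage.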
Now, let's say you want to work on a project with your friend. In order to have the same packages installed, you have to share somehow which packages, and which versions of them, your project depends on. If you are delivering a reusable package (a library), then you need to distribute it, and this is where setup.py helps. You can learn more in the Quick Start.
However, if you are working on a web site, all you need is to put the library versions into a separate file. Best practice is to create separate requirements files for tests, development, and production. To see the format of the file, run pip freeze. You will be presented with a list of packages installed on the system (or in the virtualenv) right now. Put it into a file, and you can install it later on another PC, into a completely clean virtualenv, using pip install -r development.txt
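A sketch of that split (file names are illustrative; a requirements file can include another one with -r):

pip freeze > production.txt
echo "-r production.txt" > development.txt
echo "pytest" >> development.txt
pip install -r development.txt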
And one more thing: please do not pin strict versions of packages the way pip freeze shows them; most of the time you want >= some minimum version. The good news here is that pip handles dependencies on its own, which means you do not have to list transitive dependencies; pip will sort them out.
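For example, a hand-edited requirements file might look like this (package names and version bounds are illustrative):

requests>=2.20
beautifulsoup4>=4.9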
Speaking of deployment, you may want to check out tox, a tool for managing virtualenvs. It helps a lot with deployment.
Python's default package path always points to the system environment, which needs administrator access to install into. virtualenv is able to localize the installation to an isolated environment.
For deployment/distribution of a package, you can choose to:
Distribute the source code. The user needs to run python setup.py install, or
Pack your Python package and upload it to PyPI or a custom devpi server, so the user can simply run pip install <yourpackage>
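In commands, the two options look roughly like this (the devpi index URL is a placeholder):

# Option 1: from an unpacked source tree
python setup.py install
# Option 2: from PyPI, or from a custom devpi index
pip install yourpackage
pip install --index-url http://devpi.example.com/root/dev/+simple/ yourpackage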
However, as noted at the top: without virtualenv, the user needs administrator access to install any Python package.
In addition, the PyPI package world contains a certain number of badly tested packages that don't work out of the box.
Note: virtualenv itself is actually a hack to achieve isolation.
I want to modify a couple of functions from scikit-learn, adding a few lines at most.
Working on my Windows machine at home, I can (and did) edit the source code directly (though I realize this is risky in view of future projects...).
Now, however, I'm working on a remote Linux server where I don't have privileges to edit /usr/lib/python2.7/dist-packages/sklearn/.../
What's the best way to go about this?
Depending on your modifications, you can:
Override the method/function in a new Python module (see the sketch after this list)
Install the library in a different folder for your project:
pip install --install-option="--prefix=$PREFIX_PATH" package_name
Copy the library into your project (via pip, git, or cp)
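A minimal sketch of the first option, patching a function in your own module instead of editing the installed sources (accuracy_score is just an example target; substitute the sklearn function you actually want to modify):

import sklearn.metrics

_original = sklearn.metrics.accuracy_score

def patched_accuracy_score(y_true, y_pred, **kwargs):
    # ... your few extra lines go here, then delegate to the original
    return _original(y_true, y_pred, **kwargs)

# Rebind so the rest of your code picks up the patched version.
sklearn.metrics.accuracy_score = patched_accuracy_score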
I have a simple Python shell script (no GUI) that uses a couple of dependencies (requests and BeautifulSoup4).
I would like to share this simple script across multiple computers. Each computer already has Python installed, and they are all Linux powered.
At the moment, in my development environment, the application runs inside a virtualenv with all its dependencies.
Is there any way to share this application with all its dependencies, without needing to install them with pip?
I would like to just run python myapp.py to run it.
You will need to either create a single-file executable using something like bbfreeze or PyInstaller, or bundle your dependencies (assuming they're pure Python) into a .zip file and then put it on your PYTHONPATH (e.g., PYTHONPATH=deps.zip python myapp.py).
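A sketch of the zip approach (directory and file names are illustrative; this relies on Python's ability to import pure-Python packages from a zip archive):

pip install --target deps requests beautifulsoup4
cd deps && zip -r ../deps.zip . && cd ..
PYTHONPATH=deps.zip python myapp.py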
The much better solution would be to create a setup.py file and use pip. Your setup.py file can create dependency links to files or repos if you don't want those machines to have access to the outside world. See this related issue.
As long as you make the virtualenv relocatable (use the --relocatable option on it in its original place), you can literally just copy the whole virtualenv over. If you create it with --always-copy (you'll need to patch the bug in virtualenv), then you shouldn't even need to have Python installed elsewhere on the target machines.
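With an older virtualenv release that still supports these flags, the sequence might look like this (paths and file names are illustrative):

virtualenv --always-copy venv
venv/bin/pip install -r requirements.txt
virtualenv --relocatable venv
# Now copy the whole venv directory to the target machine and run:
venv/bin/python myapp.py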
Alternatively, look at http://guide.python-distribute.org/ and learn how to create an egg or wheel. An egg can then be run directly by python.
I haven't tested your particular case, but you can find the source code (either mirrored or original) on a site like GitHub.
For example, for BeautifulSoup, you can find the code here.
You can put the code into the same folder (a rename is probably a good idea, so as not to shadow an existing package). Just note that you won't get any updates.
First let me explain the current situation:
We have several Python applications which depend on custom packages (not publicly released ones) as well as generally known ones. These dependencies are all installed into the system Python installation. Distribution of the applications is done via git, by source. All these computers are hidden inside a corporate network and don't have internet access.
This approach is a bit of a pain in the ass, since it has the following downsides:
Libs have to be installed manually on each computer :(
How can we better deploy an application? I recently saw virtualenv, which seems to be the solution, but I don't quite see how yet.
virtualenv creates a clean Python instance for my application. How exactly should I deploy this so that users of the software can easily start it?
Should there be a startup script inside the application which creates the virtualenv during start?
The next problem is that the computers don't have internet access. I know that I can specify a custom location for packages (network share?) but is that the right approach? Or should I deploy the zipped packages too?
Would another approach be to ship the whole Python instance, so the user doesn't have to start up the virtualenv at all? In this Python instance all the necessary packages would be pre-installed.
Since our apps are growing fast, we have a fast release cycle (2 weeks). Deploying via git was very easy. Users could pull from a stable branch via an update script to get the latest release; would that still be possible, or are there better approaches?
I know these are a lot of questions. Hopefully someone can answer them or give me some advice.
You can use pip to install directly from git:
pip install -e git+http://192.168.1.1/git/packagename#egg=packagename
This applies whether you use virtualenv (which you should) or not.
You can also create a requirements.txt file containing all the stuff you want installed:
-e git+http://192.168.1.1/git/packagename#egg=packagename
-e git+http://192.168.1.1/git/packagename2#egg=packagename2
And then you just do this:
pip install -r requirements.txt
So the deployment procedure would consist of getting the requirements.txt file and then executing the above command. Adding virtualenv would make it cleaner, not easier; without virtualenv you would pollute the system-wide Python installation. virtualenv is meant to provide a solution for running many apps, each in its own distinct virtual Python environment; it doesn't have much to do with how you actually install stuff in that environment.
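Tying this back to your two-week release cycle, the per-user update script could stay almost as simple as before (the branch name, venv path, and file names here are assumptions):

#!/bin/sh
# Update script: pull the latest stable release, then sync dependencies
# into the app's virtualenv.
git pull origin stable
venv/bin/pip install -r requirements.txt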