I need to redistribute Python 2.6 for my users.
Currently, I run a silent installation of the MSI installer from http://www.python.org/download/, but I have some problems; for example, if another version of Python is already installed, mine does not become the default.
On the other hand, although it may be rare, if a user already has Python installed and I then install my own, it could cause a conflict.
So, what is the best course of action? How can I automate this install? I already have http://installbuilder.bitrock.com, but the available Django stack is too large for my app, and I need to pre-install several Python modules.
I wonder whether virtualenv would be the best option in this case, but that still leaves the issue of installing it properly...
I haven't used it, beyond some initial research, but ActiveState's ActivePython generally does what you want, if you're looking for a commercial pre-built solution.
I have worked on software that included its own bundled Python, and there were never any problems with conflicts. In our case, it was installed alongside the rest of the software, rather than in C:\Python26, and it was intended for the software's use only, not as a replacement for any existing Python on the system. A full Python with the stdlib pre-compiled to .pyo takes only ~10-20MB of space, depending on which modules you include. So I wouldn't worry about the possible redundancy.
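For reference, that kind of private, silent MSI install can be scripted. Here is a minimal sketch in Python (the MSI filename and target path are placeholders; TARGETDIR and ALLUSERS are standard Windows Installer properties, and /qn suppresses the UI):

import subprocess

# Placeholder filename; substitute the MSI you actually redistribute.
MSI = "python-2.6.6.msi"

subprocess.check_call([
    "msiexec", "/i", MSI,
    "/qn",                            # fully silent, no UI
    "TARGETDIR=C:\\MyApp\\Python26",  # private directory, not C:\Python26
    "ALLUSERS=1",                     # per-machine install
])

Installing into your own directory this way keeps the bundled interpreter from ever becoming the system default, which sidesteps the conflict the question worries about.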
It also makes your life easier to have your own copy, since you know exactly which versions of all the third-party libraries you have; you don't have to worry about the user upgrading something that breaks your app. The counter-argument is that some users may want the latest versions of libraries, for security reasons. But in my experience, those users are in a very slim minority, and for them it just boils down to a little more work.
Finally, I presume you have reasons for wanting a full Python installation. Just in case you're only interested in a way to distribute a single app, take a look at py2exe.
I wish to place a Python program on GitHub and have other people download and run it on their computers with assorted operating systems. I am relatively new to Python, but have used it enough to notice that getting the assorted versions of all the included modules to work together can be problematic. I just discovered requirements.txt (generated with pipreqs and applied with pip install -r /path/to/requirements.txt), but was very surprised to notice that requirements.txt does not actually state which version of Python is being used, so it is obviously not the complete solution on its own. So my question is: what set of specifications/files/something else is needed to ensure that someone downloading my project will actually be able to run it with the fewest possible problems?
EDIT: My plan was to be guided by whichever answer got the most upvotes. But so far, after 4 answers and 127 views, not a single answer has even one upvote. If some of the answers are no good, it would be useful to see some comments as to why they are no good.
Have you considered setting up a setup.py file? It's a handy way of bundling all of your... well, setup into a single location. So all your user has to do is A) clone your repo and B) run pip install . to run the setup.py.
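A minimal setup.py sketch (the project name, dependency, and entry point below are placeholders, not prescriptions):

from setuptools import setup, find_packages

setup(
    name="myproject",                  # hypothetical project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "requests>=2.0",               # example runtime dependency
    ],
    python_requires=">=3.6",           # declare the interpreters you support
    entry_points={
        # hypothetical console script: exposes myproject.cli:main as "myproject"
        "console_scripts": ["myproject=myproject.cli:main"],
    },
)

Note that python_requires also addresses the original complaint that requirements.txt says nothing about the Python version.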
There's a great Stack Overflow discussion about this.
As well as a handy example written by the author of requests.
This should cover most use cases. Now if you want to make it truly distributable, then you'll want to look into registering it on PyPI, the official distribution hub.
Beyond that, if you're asking how to make a program "OS independent", there is no one-size-fits-all answer; it depends on what your code does and requires researching how your particular code interacts with each OS.
There are many, many, many, many, many, many, many ways to do this. I'll skate over the principles behind each, and its use case.
1. A Python environment
There are many ways to do this: pipenv, conda, requirements.txt, etc.
With some of these, you can specify Python versions. With others, you can just specify a range of Python versions you know your code works with - for example, if you're using Python 3.7, it's unlikely your code won't also run on 3.6, since there are only one or two minor changes; 3.8 should work as well.
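If a given tool can't pin the interpreter for you, a crude but effective fallback is an explicit check at the top of your entry script; a minimal sketch (the version bound is a placeholder):

import sys

# Fail fast with a clear message instead of a confusing SyntaxError later.
if sys.version_info < (3, 6):
    sys.exit("This program requires Python 3.6+; you are running "
             + ".".join(map(str, sys.version_info[:3])))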
Another similar method is setup.py. These are generally used to distribute libraries - like PyInstaller (another solution I'll mention below), or numpy, or wxPython, or PyQt5 etc - for import/command line use. The python packaging guide is quite useful, and there are loads of tutorials out there. (google python setup.py tutorial) You can also specify requirements in these files.
2. A container
Docker is the big one. If you haven't heard of it, I'll be surprised. A quick google of a summary comes up with this, which I'll quote part of:
So why does everyone love containers and Docker? James Bottomley, formerly Parallels' CTO of server virtualization and a leading Linux kernel developer, explained VM hypervisors, such as Hyper-V, KVM, and Xen, all are "based on emulating virtual hardware. That means they're fat in terms of system requirements."
Containers, however, use shared operating systems. This means they are much more efficient than hypervisors in system resource terms. Instead of virtualizing hardware, containers rest on top of a single Linux instance. This means you can "leave behind the useless 99.9 percent VM junk, leaving you with a small, neat capsule containing your application,"
That should summarise it for you. (Note you don't need a specific OS for containers.)
3. An executable file
There are two main tools that do this at the time of writing: PyInstaller and cx_Freeze. Both are actively developed, and both are open source.
You take your script, and the tool compiles it to bytecode, finds the imports, copies those, and creates a portable python environment that runs your script on the target system without the end user needing python.
Personally, I prefer PyInstaller - I'm one of the developers. PyInstaller provides all of its functionality through a command line script, and supports most libraries that you can think of - and is extendable to support more. cx_Freeze requires a setup script.
Both tools support Windows, Linux, macOS, and more. PyInstaller can create a single-file exe or a one-folder bundle, whereas cx_Freeze only supports one-folder bundles. PyInstaller 3.6 supports Python 2.7 and 3.5-3.7, but 4.0 won't support Python 2. cx_Freeze has dropped Python 2 support as of its last major release (6.0, I think).
Anyway, enough about the tools features; you can look into those yourself. (See https://pyinstaller.org and https://cx-freeze.readthedocs.io for more info)
When using this distribution method, you usually provide source code on the GitHub repo, a couple of exes (one for each platform) ready for download, and instructions on how to build the code into an executable file.
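For a taste of the workflow, PyInstaller can also be driven from Python code instead of the command line; a minimal sketch (the script and output names are placeholders):

import PyInstaller.__main__

# Equivalent to running: pyinstaller --onefile --name myapp your_script.py
PyInstaller.__main__.run([
    "your_script.py",
    "--onefile",          # bundle everything into a single executable
    "--name", "myapp",    # hypothetical output name
])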
The best tool I have used so far for this is Pipenv. Not only does it unify and simplify the whole pip+virtualenv workflow for you, the developer, it also guarantees that the exact versions of all dependencies (including Python itself) are met when other people run your project with it.
The project website does a pretty good job of explaining how to use the tool but, for completeness' sake, I'll give a short explanation here.
Once you have Pipenv installed (for instance, by running pip install --user pipenv), you can go to the directory of your project and run pipenv --python 3.7, and Pipenv will create a new virtualenv for your project along with a Pipfile and a Pipfile.lock (more on them later). If you go ahead and run pipenv install -r requirements.txt, it will install all your packages. Now you can do a pipenv shell to activate your new virtualenv, or pipenv run python your_main_file.py to simply run your project.
Now let's take a look at the contents of your Pipfile. It should be something resembling this:
[packages]
Django = "*"
djangorestframework = "*"
iso8601 = "*"
graypy = "*"
whitenoise = "*"
[requires]
python_version = "3.7"
This file has the human-readable specifications for the dependencies of your project (note that it specifies the Python version too). If your requirements.txt had pinned versions, your Pipfile could have them too, but you can safely wildcard them, because the exact versions are stored in the Pipfile.lock. Now you can run things like pipenv update to update your dependencies. Don't forget to commit Pipfile and Pipfile.lock to your VCS.
Once people clone your project, all they have to do is run pipenv install and Pipenv will take care of the rest (it may even install the correct version of Python for them).
I hope this was useful. I'm not affiliated in any way with Pipenv, just wanted to share this awesome tool.
If your program is not GUI-heavy, or has a web GUI, then you can share the code using Google Colaboratory.
https://colab.research.google.com/
Everyone can run it with the same environment. No need for installation.
If converting all your Python scripts into one executable can help you, then my answer below should be useful...
I have been developing a large desktop application purely in Python for 3 years. It is a GUI-based tool built on top of the PyQt library (Python bindings for the Qt C++ framework).
I am currently using the "py2exe" packaging library: it is a distutils extension which allows you to build standalone Windows executable programs (32-bit and 64-bit) from Python scripts; all you have to do is:
Install py2exe: pip install py2exe
Create a setup.py script: it is used to specify the contents of the final EXE (name, icon, author, data files, shared libraries, etc.)
Execute: python setup.py py2exe
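A minimal py2exe setup.py sketch (script name, icon and options are placeholders; a GUI app uses the windows= list, while a console app would use console= instead):

from distutils.core import setup
import py2exe  # importing py2exe registers the "py2exe" command with distutils

setup(
    name="MyTool",
    version="1.0",
    windows=[{
        "script": "main.py",                 # hypothetical entry script
        "icon_resources": [(1, "app.ico")],  # optional application icon
    }],
    options={"py2exe": {"bundle_files": 1}},  # bundle into fewer files
    zipfile=None,                             # fold the library archive into the exe
)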
I am also using the "Inno Setup" software to create the installer: creating shortcuts, setting environment variables, icons, etc.
I'll give you a very brief summary of some of the available solutions for Python packaging that you may choose from (knowledge is power):
Follow the guidelines provided at Structuring Your Project; these conventions are widely accepted by the Python community and are usually a good starting point when newcomers start coding in Python. By following these guidelines, Pythonistas looking at your project/source on GitHub or similar places will know straight away how to install it. Also, uploading your project to PyPI, as well as adding CI, will be painless if you follow those rules.
Once your project is structured properly according to standard conventions, the next step might be using some of the available freezers, in case you'd like to ship your end-users a package they can install without forcing them to have Python on their machines. Be aware, though, that these tools won't provide you with any code protection... in other words, extracting the original Python code from the final artifacts would be trivial in all cases.
If you still want to ship your project to your users without forcing them to install any dev dependency, and you also care about code protection so you don't want to consider any of the existing freezers, you might use tools such as Nuitka, Shedskin, Cython or similar ones. Usually, reversing code from the artifacts produced by these tools isn't trivial at all... Cracking protection, on the other hand, is a different matter, and unless you avoid providing a physical binary to your end-user, you can't do much about it other than slowing them down :)
Also, in case you need to use external languages in your Python project, another classic link that comes to mind is https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages; adding the build systems of such tools to CI by following the rules of point 1 would be pretty easy.
That said, I'd suggest sticking to bullet point 1, as I know that will be more than good enough to get you started; that particular point should also cover many of the existing use cases for "standard" Python projects.
While this is not intended to be a full guide, by following these points you'll be able to publish your Python project to the masses in no time.
I think you can use Docker with your Python project: https://github.com/celery/celery/tree/master/docker
Look through those files and I think you can figure out how to write a Dockerfile for your own Python scripts!
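As a simpler starting point than the Celery setup, a minimal Dockerfile for a Python script might look like this sketch (base image tag, file names and entry point are all placeholders):

# Hypothetical minimal image for a script with a requirements.txt.
FROM python:3.8-slim
WORKDIR /app
# Copy and install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the source.
COPY . .
# Replace main.py with your actual entry point.
CMD ["python", "main.py"]

You would then build and run it with docker build and docker run.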
Because it is missing from the other answers, I would like to add one completely different aspect:
Unit testing. Or testing in general.
Usually, it is good to have one known good configuration. Depending on what the dependencies of the program are, you might have to test different combinations of packages. You can do that in an automated fashion with e.g. tox or as part of a CI/CD pipeline.
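For example, a minimal tox.ini sketch (the interpreter list and test runner are placeholders for whatever your project actually supports and uses):

# Hypothetical matrix; list the interpreters your project claims to support.
[tox]
envlist = py27, py36, py37

[testenv]
# Install the test runner into each environment and run the suite.
deps = pytest
commands = pytest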
There is no general rule for which combinations of packages should be tested, but usually Python 2/3 compatibility is a major issue. If you have strong dependencies on packages with major version differences, you might want to consider testing against these different versions.
I have read a bunch of posts here and on Google, but my question is far more basic than the answers: if Python (2.7) came pre-installed on my MacBook Pro (High Sierra), can I just do sudo easy_install pip (as suggested) from the command line without causing issues? I have a vague understanding of global/local installations, and my understanding is that certain Python installations aren't compatible with local/global kernel installations. I hope I am getting the terminology right, but I saw several warnings about installing pip for "a homebrew based python installation", and I am not sure whether Python on my laptop was installed via Homebrew (nor how to find out).
My question came about because I wanted to install the Hydrogen package to use in Atom, the text editor (to help me learn Python). I finally succeeded in installing Hydrogen, but got stumped by the missing kernels (not sure which ones I need, so I am willing to install them all). But I can't seem to install the kernels without pip. So here I am.
My apologies for asking such a basic question--and thanks!
The rule of thumb is: If your operating system has a package manager, use it.
Unfortunately, macOS is the only UNIX-like operating system that does not come with a decent package management system.
(There is the App Store, but that is useless for a lot of open source software, for several different reasons. It's also a walled garden.)
You have several choices (in decreasing order of suitability):
Use one of the package managers available for MacOS. Which one is the best choice for you depends on all the packages you need being available.
Use a Python distribution. I've used Anaconda on ms-windows, and that has saved me a lot of hassle. A good choice if you are only looking for Python and related libraries.
Build everything yourself. This can be very time-consuming and is a duplication of effort. You will learn a lot though.
I would second Piinthesky's comment that you install Python 3.6. Python 2.7 is now a legacy version.
Well, although I am no Mac expert, I've given it a shot anyway:
Yes, you could, but do you really want to risk it (or even do it)?
macOS must rely on Python to fulfill something in the OS, otherwise it would not come built in. This means two things:
The Python installation will be minimal. By that I mean it will have things missing (any large library, for a start). They do this mainly to cut down on the OS size. Therefore you will not have the full Python library, and in the long term you may end up missing out.
Second, if anything goes wrong (i.e. you break your installation, or even just modify it - yep, I've done this on Linux and ended up factory resetting), then you may cause something to stop working and may need to factory reset or perform some other drastic action on your OS. A separate installation would keep you from risking this. This matters because there comes a time when you may decide to update certain modules with pip and find that it can't, or that it updates something you shouldn't be messing with.
Yes, it's possible you may run into compatibility problems, but it's widely accepted that you should not use the built-in one, as it needs to remain unchanged if the OS is to use it correctly. Messing with it increases the chances of it breaking.
Conclusion: so even though installing modules with pip (and getting pip) can be done with the built-in Python, it comes down to whether you want to risk harming your OS. I strongly suggest you get a separate installation and leave the built-in one as it is. Also, as you mentioned, you will find that the built-in versions are never up to date, or are built in ways that are not really compatible with standard libraries (expect things like missing runtime libraries all the time) - just another reason to stay clear of them.
This is how I solved this problem - for those newbies who just want Hydrogen to work:
Installed Python 3 (instead of messing around with Python 2.7 and pip).
Followed instructions here (https://packaging.python.org/tutorials/installing-packages/#ensure-you-can-run-pip-from-the-command-line) for 'get-pip.py'.
In Atom, pressed cmd+shift+p to bring up the packages menu and clicked on 'Hydrogen Run', which gave me the errors again.
Copied the code from the warnings and installed the kernels needed (via the command line).
Hydrogen is now working.
Thanks for all the tips!
I've built a few pieces of C++ software I want to release for Ubuntu. What ways are there and what can you recommend? Is building .deb files and setting up an apt repo for them the best way? What about make install, is it considered an acceptable way to install software?
By far the simplest for me, and perhaps the most transparent for the user, would be to just have a GitHub repository in which one could run make install to get all the programs installed in one go.
Do I always install the binaries into /usr/bin?
One of the programs contains Python 3 library code, should that be installed in /usr/lib/python3/dist-packages? (I don't want to create a pip package, that would make the installation harder -- and waste more of my time.) The program also contains Python 3 examples/tutorials intended for the user to tweak and learn from, where do I install those? Do I create a ~/my-prog-tutorial-dir/ to put them in? If so: how should I name that directory?
Edit: if I simply release the statically linked binaries in a tarball, what will break eventually? Libc? Are there any major application APIs that usually change between Ubuntu LTSs? I only use pthreads, X11 and OpenGL so I suspect statically linked binaries could be a fairly stable option?
In general, building a binary package will make your software much easier for your users to install and keep up to date. The same goes for python packages. There are generally tools to generate apt packages from pip packages, so you can just list your python code as a dependency of your binary package(s).
You may see packaging and installers as a waste of your time, but only providing a source distribution wastes your users' time. Users don't want to constantly have to check github for new versions, and they often don't want to have to install all of your build dependencies if they just want to use your software. If your software is targeted towards developers this may be less of an issue, but it's still extra work that your users have to go through.
As for examples, the general convention is to put those in /usr/share/doc/myprogram/samples or a samples directory in your python package.
I was asked to expand my comment into an answer, and so here it is.
The project I was talking about is called Woodpecker hash Bruteforce, and I distribute it as plain archived executables for Mac OS, Windows and Linux.
Woodpecker hash Bruteforce has only two dependencies I have to care about (the users don't need to install anything): OpenSSL and Botan - libraries to do hashing. I've got two virtual machines on my Mac where I build the project and several scripts to automate the process. I'm using Docker (in collaboration with VirtualBox) and VMware Fusion.
Above I said the users don't need to worry about any third-party libraries because everything's linked statically with the executable: you just download the appropriate file from the official website, unarchive it (if needed), sudo chmod +x the executable and that's it!
This works on any version of Linux, including Ubuntu (this is where I perform the build) and Kali Linux.
The best way to release software for Ubuntu depends on the software itself and its target audience, as Miles Budnek already pointed out.
Your goal is to lower the barriers to software usage. If you are targeting developers of your software (i.e., you develop source files that are supposed to be edited by others) or you are developing a piece of code meant to be included in other projects (e.g., gnulib), it is probably best to just provide sources and documentation.
In any other case that I can currently imagine (including when you are targeting developers), providing precompiled binaries is a better option. In this case the optimal solution would be to have the software in Ubuntu, and How to get my software into Ubuntu? provides a lot of useful information, as suggested by Mark K.
Getting software into Debian or Ubuntu can be difficult and may require a large amount of time (you have to respect a lot of policies that you may not be aware of and you have to find a sponsor) and you will soon learn that a key point is to use a decent and popular build system for your software (e.g., autotools, cmake, distutils, ...) as mentioned in the Debian Upstream Guide. Being compliant with the guide will also be beneficial for users of other distributions.
In general I suggest to proceed in this order:
provide sources;
use a common build system (from the point of view of system administrators, i.e., people installing the software, autotools works best in my experience for POSIX systems);
create a binary package (please keep in mind that you have to maintain it, or your users will likely encounter binary incompatibilities);
add the package to a private repository (I suggest aptly for this task);
try to get the package into the distribution of choice (please keep in mind maintenance costs).
Another option that I do not suggest is to provide statically linked builds. This reduces the possibility of binary incompatibilities, but augments the costs of bug fixing (e.g., if the bug is in a dependency) and security, as explained in this and the following comments. Another reason to avoid static linking is if several implementations of the same ABI exist, in order to exploit hardware acceleration (e.g., OpenGL), but you can also mix static and dynamic linking.
Finally you may also provide a container, like docker, to ship your software and all its dependencies: your users will just need docker to run your application very portably. However it is probably overkill in most situations and if it is a practical solution or not depends on your application and target audience.
The development environment we use is FreeBSD. We are evaluating Python for developing some tools/utilities. I am trying to figure out whether all/most Python packages are available for FreeBSD.
I tried CentOS/Ubuntu, and it was fairly easy to install Python as well as packages (using pip). On FreeBSD it was not as easy, but maybe I'm not using the correct steps or am missing something.
We have some tools/utilities on FreeBSD that run locally, and I want Python to interact with them - hence, FreeBSD.
Any inputs/pointers would be really appreciated.
Regards
Sharad
The assumption that powerful and high-profile existing Python tools use a lot of different Python packages almost always holds true. We have used FreeBSD in our company for quite some time, together with a lot of Python-based tools (web frameworks, py-supervisor, etc.), and we never ran into the issue that a certain tool would not run on FreeBSD or not be available for FreeBSD.
So to answer your question:
Yes, all/most python packages are available on FreeBSD
One caveat:
The FreeBSD ports system is really great and will manage all compatibility and dependency issues for you. If you are using it (you probably should), then you might want to avoid pip. We had a problem in the past where the package manager for Ruby did not really play well with the ports database and installed a lot of incompatible gems. This was a temporary issue with RubyGems, but it gave us a real headache. We have tended to install everything from ports since then, and we try to avoid third-party package managers like composer, pip, gems, etc. Often the ports invoke the package managers anyway, but with some additional arguments that ensure they do not break dependencies.
Is Python support for FreeBSD as good as for say CentOS/Ubuntu/other linux flavors?
It is probably better than on other OSes, but I'm a FreeBSD-bigot.
However! As Freitags says, you do not want to use pip (nor gem, I might add). All of these language-specific packaging systems were born out of developers' frustration with the various inadequacies of OS-specific packagers.
Had the world been using BSD, pip (and gem) would've been unnecessary.
Why am I singing this paean here? To warn you that you might not find some obscure Python package already ported, despite it being available via pip. Packages of any prominence are ported (here is the current list), but something less known might not be.
Do not despair -- create a port yourself using any of the existing examples and the FreeBSD Handbook. It is very easy to do and, if you submit it to FreeBSD, it will already be there the next time you need it.
Best of luck.
There are many posts on different aspects of this question, but I haven't seen a post that brings it all together.
First, a subjective statement: it seems like the simplicity we experience when working with the Python language is shot to pieces when we move outside the interpreter and start grappling with deployment issues. How best to have multiple versions of Python on the same machine? Where should packages be installed? Distutils vs. setuptools vs. pip, etc. It seems like the Zen of Python is being abused pretty badly when it comes to deployment. I'm feeling eerie echoes of the "DLL hell" experience on Windows.
Do the experts agree on some degree of best practice on these questions?
Do you run multiple versions of Python on the same machine? How do you remain confident that they can co-exist -- and the newer version doesn't break assumptions of other processes that rely on the earlier version (scripts provided by OS vendor, for example)? Is this safe? Does virtualenv suffice?
What are the best choices for locations for different components of the Python environment (including 3rd party packages) on the local file system? Is there a strict or rough correspondence between locations for many different versions of Unixy and Windows OS's that can be relied upon?
And the murkiest corner of the swamp -- what install tools do you use (setuptools, distutils, pip etc.) and do they play well with your choices re: file locations, Python virtual environments, Python path etc.
These sound like hard questions. I'm hopeful the experienced Pythonistas may have defined a canonical approach (or two) to these challenges. Any approach that "hangs together" as a system that can be used with confidence (feeling less like separate, unrelated tools) would be very helpful.
I've found that virtualenv is the only reliable way to configure and maintain multiple environments on the same machine. It even has a way of packaging up an environment and installing it on another machine.
For package management I always use pip, since it works so nicely with virtualenv. It also makes it easy to install and upgrade packages from a variety of sources, such as git repositories.
I agree this is quite a broad question, but I’ll try to address its many parts anyway.
About your subjective statement: I don't see why the simplicity and elegance of Python would imply that packaging and deployment matters should suddenly become simple things. Some things related to packaging are simple, others are not, others could be. It would be best for users if we had one complete, robust and easy packaging system, but it hasn't turned out that way. distutils was created and then its development paused, setuptools was created and added new solutions and new problems, distribute was forked from setuptools because of social problems, and finally distutils2 was created to make one official complete library. (More on Differences between distribute, distutils, setuptools and distutils2?) The situation is far from ideal for developers and users, but we are working on making it better.
How best to have multiple versions of Python on the same machine? Use your package manager if you’re on a modern OS, or use “make altinstall” if you compile from source on UNIX, or use the similar non-conflicting installation scheme if you compile from source on Windows. As a Debian user, I know that I can call the individual versions by using “pythonX.Y”, and that what the default versions (“python” and “python3”) are is decided by the Debian developers. A few OSes have started to break the assumption that python == python2, so there is a PEP in progress to bless or condemn that: http://www.python.org/dev/peps/pep-0394/ Windows seems to lack a way to use one Python version as default, so there’s another PEP: http://www.python.org/dev/peps/pep-0397/
Where should packages be installed? Using distutils, I can install projects into my user site-packages directory (see PEP 370 or docs.python.org). What exactly is the question?
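If you're unsure where that directory is on a given machine, Python can tell you; a quick diagnostic sketch:

import site
import sys

print(sys.executable)  # which interpreter is running
print(site.USER_BASE)  # root of the per-user scheme (PEP 370)
print(site.USER_SITE)  # the per-user site-packages directory

Equivalently, python -m site --user-site prints just the last path.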
Parallel installation of different versions of the same project is not supported. It would need a PEP to discuss the changes to the import system and the packaging tools. Before someone starts that discussion, using virtualenv or buildout works well enough.
I don’t understand the question about the location of “components of the Python environment”.
I mostly use system packages (i.e. using the Aptitude package manager on Debian). To try out projects, I clone their repositories. If I need something that's not available via Aptitude, I install it (or put a .pth file pointing to the repo) in my user site-packages directory. I don't need a custom PYTHONPATH, but I have changed the location of my user site-packages with PYTHONUSERBASE. I don't like the magic and the eggs concept in setuptools/distribute, so I don't use them. I've started to use virtualenv and pip for one project though (they use setuptools under the cover, but I made a private install so my global Python does not have setuptools).
One resource for this area is the book Expert Python Programming by Tarek Ziade. I'm ambivalent about the quality of the book, but the topics covered are just what you're focusing on.