How to install modules from a package?

For installing python packages, is there a way to only install the specific modules I need (plus dependencies) into the virtual environment?
For example, I import the following in my code
from statsmodels.tsa.arima_model import ARIMA
I only need ARIMA (and dependencies), and won't need the complete statsmodels library.
How do I accomplish that?
Rationale:
I am trying to zip up the packages and deploy to AWS lambda. However, I am exceeding the 50MB file size limit.
If there is a way to install only the modules I need from the packages, I can reduce the file size.

Here's a horrible brute-force approach. I hope someone gives you a better answer!
Assuming the source you want to install is public, its licensing allows this kind of use, and you are willing to deal with the hassle, I imagine you could:
copy the source
remove anything unused by your required module
place that module's source inside your package, and wire it up from there. (This answer proposes a reasonable file structure for vendored software)
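For concreteness, a vendored layout might look something like this (all names here are illustrative, not taken from statsmodels):

myproject/
    __init__.py
    main.py
    _vendor/
        __init__.py
        arima/          # trimmed copy of the ARIMA source and its helpers

Your code would then import the vendored copy instead of the installed package:

from myproject._vendor.arima import ARIMA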
Alternatively, if the assumptions above hold but you prefer not to vendor directly within your software, you could probably:
copy the source (e.g. fork the repo)
remove anything not used by your required module, making sure to leave a working python package
use pip or your environment's native dependency management tools to download and install your package in the virtual env.
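For example, if you pushed your trimmed fork to a Git host, the install step could be as simple as this (the URL is a placeholder):

pip install git+https://github.com/yourname/statsmodels-arima-slim.git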
Please note that both of these approaches will expose you to a number of risks, including but not limited to:
breaking the module: what happens to your project if you accidentally introduce a logical or security flaw?
loss of currency: how do you plan to keep your new mini-package up to date?
liability: Does the vendored software's license allow this kind of use? Are you adequately protected if there is something wrong with the software you have modified?


Include a PIP module in my project so that user doesn't have to download it

Edit: I am self taught, I don't know the right terms.
This question is probably a duplicate, but I am not able to find the original. I need to include a Python pip package in my application, say numpy. I don't want the user to run pip install -r requirements.txt; I want to include the module when the user downloads the application.
You don't have to include pip when you build the app, because all dependencies will be included in the app's resources when it is built. I am not sure which kind of app you are building, but in general, dependencies are compiled to .pyc files and bundled into the app at build time.
It's a whole world to explore, honestly. You need to look into the subjects of deployment and distribution; these will depend on the target operating systems. I would suggest investigating Docker, which allows you to package the OS (some Unix variant), the runtime, and the dependencies into one "thing".
Vendoring
The only way I can think of besides pip to do this with pure source code would be to vendor the code you need. This means downloading the source code of the package (like numpy) and including it in your project.
This comes with many potential issues including:
licensing issues
installation problems if the package contains C extension modules that must be compiled
having to manually update the files when new releases come out
increased download size for your source code
etc.
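As a rough sketch, if you dropped the package's source into a _vendor directory next to your entry point, the wiring could look like this (the directory name is just a convention, not a requirement):

import os
import sys

# Put the bundled copies ahead of site-packages so they are found first.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "_vendor"))

import numpy  # now resolved from _vendor/numpy instead of a pip-installed copy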
I would not recommend doing this unless there is a hard technical requirement for it, because it's a real hassle to deal with. If there is a particular reason you want to avoid pip, beyond having to run one extra command, stating it would help in addressing this question.
Binary distribution
Also, depending on the app, you could look into something like PyInstaller to make a single .exe or binary file out of your app, so users don't even need Python or your dependencies. But be warned: this has its own set of complexities to look out for, like having to build for every target platform (Windows, Mac, and Linux).

When should you package python projects? Should you package personal projects?

The language around packaging Python projects in the documentation suggests that it is exclusively for distributing or exporting the project. So what should I do with projects I intend to use personally?
Should I package them anyway? If not, what steps should I take so that I can run my code on any machine with the right version of Python? Would packaging a project even accomplish that? Is there even any way to "package" all of a project's files and relevant libraries together, with the end product being a folder/file rather than an installable package?
I'm sorry if this is basic, I'm very confused, thank you for your time.
For personal use I do not package, except when I am very sure there is no need to modify the project anymore, and that is very rarely the case for me. Once a project is packaged and published to a public or private package repository, you get the package, but making changes is more complicated (though possible). I prefer to keep the project repository and be able to edit it and push the changes to remote locations.
Many packaging tools, like poetry, make it easy to build the project, and also to install requirements and keep track of them. So there is no hassle in managing requirements.
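For example, a typical poetry session might look like this (the dependency is just an example):

pip install poetry
poetry init            # interactively create pyproject.toml
poetry add requests    # install a dependency and record it in pyproject.toml
poetry install         # later, recreate the same environment from the lock file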

Robust way to ensure other people can run my python program

I wish to place a Python program on GitHub and have other people download and run it on their computers with assorted operating systems. I am relatively new to Python, but I have used it enough to notice that getting the assorted versions of all the included modules to work together can be problematic. I just discovered the use of requirements.txt (generated with pipreqs and installed with the command pip install -r /path/to/requirements.txt), but I was very surprised to notice that requirements.txt does not actually state which version of Python is being used, so it is obviously not the complete solution on its own. So my question is: what set of specifications/files/something else is needed to ensure that someone downloading my project will actually be able to run it with the fewest possible problems?
EDIT: My plan was to be guided by whichever answer got the most upvotes. But so far, after 4 answers and 127 views, not a single answer has even one upvote. If some of the answers are no good, it would be useful to see some comments as to why they are no good.
Have you considered setting up a setup.py file? It's a handy way of bundling all of your... well, setup into a single location. Then all your user has to do is (a) clone your repo and (b) run pip install ., which executes the setup.py.
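For illustration, a minimal setup.py could look something like this (the name and dependencies are placeholders); note that python_requires also covers the Python version that requirements.txt leaves out:

from setuptools import setup, find_packages

setup(
    name="yourproject",            # placeholder project name
    version="0.1.0",
    packages=find_packages(),
    python_requires=">=3.6",       # pins the supported Python versions
    install_requires=["requests"], # your runtime dependencies
)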
There's a great Stack Overflow discussion about this, as well as a handy example written by the author of requests.
This should cover most use cases. If you want to make it truly distributable, then you'll want to look into uploading it to PyPI, the official distribution hub.
Beyond that, if you're asking how to make a program "OS independent", there isn't a one-size-fits-all answer. It depends on what you are doing with your code, and requires researching how your particular code interacts with each OS.
There are many, many, many, many, many, many, many ways to do this. I'll skate over the principles behind each, and its use case.
1. A python environment
There are many ways to do this: pipenv, conda, requirements.txt, etc.
With some of these, you can specify Python versions. With others, just specify a range of Python versions you know it works with; for example, if you're using Python 3.7, it's unlikely not to support 3.6, since there are only one or two minor changes, and 3.8 should work as well.
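For instance, a requirements.txt can pin exact versions, allow ranges, or mix both (the package names here are just examples):

numpy==1.18.1          # exact pin
requests>=2.22,<3.0    # any 2.x release from 2.22 onwards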
Another similar method is setup.py. These are generally used to distribute libraries - like PyInstaller (another solution I'll mention below), or numpy, or wxPython, or PyQt5 - for import/command-line use. The Python packaging guide is quite useful, and there are loads of tutorials out there (google "python setup.py tutorial"). You can also specify requirements in these files.
2. A container
Docker is the big one. If you haven't heard of it, I'll be surprised. A quick google for a summary turns up this article, which I'll quote part of:
So why does everyone love containers and Docker? James Bottomley, formerly Parallels' CTO of server virtualization and a leading Linux kernel developer, explained VM hypervisors, such as Hyper-V, KVM, and Xen, all are "based on emulating virtual hardware. That means they're fat in terms of system requirements."
Containers, however, use shared operating systems. This means they are much more efficient than hypervisors in system resource terms. Instead of virtualizing hardware, containers rest on top of a single Linux instance. This means you can "leave behind the useless 99.9 percent VM junk, leaving you with a small, neat capsule containing your application,"
That should summarise it for you. (Note you don't need a specific OS for containers.)
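As a rough sketch, a Dockerfile for a simple script might look like this (the image tag and file names are illustrative):

FROM python:3.7-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]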
3. An executable file
There are two main tools that do this at the time of writing: PyInstaller and cx_Freeze. Both are actively developed; both are open source.
You take your script, and the tool compiles it to bytecode, finds the imports, copies those, and creates a portable python environment that runs your script on the target system without the end user needing python.
Personally, I prefer PyInstaller - I'm one of the developers. PyInstaller provides all of its functionality through a command line script, and supports most libraries that you can think of - and is extendable to support more. cx_Freeze requires a setup script.
Both tools support Windows, Linux, macOS, and more. PyInstaller can create single-file executables or a one-folder bundle, whereas cx_Freeze only supports one-folder bundles. PyInstaller 3.6 supports Python 2.7 and 3.5-3.7, but 4.0 won't support Python 2. cx_Freeze has dropped Python 2 support as of its last major release (6.0, I think).
Anyway, enough about the tools' features; you can look into those yourself. (See https://pyinstaller.org and https://cx-freeze.readthedocs.io for more info.)
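For reference, a typical PyInstaller invocation looks something like this (the script name is a placeholder):

pip install pyinstaller
pyinstaller --onefile your_script.py   # the bundled executable lands in dist/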
When using this distribution method, you usually provide source code on the GitHub repo, a couple of exes (one for each platform) ready for download, and instructions on how to build the code into an executable file.
The best tool I have used so far for this is Pipenv. Not only does it unify and simplify the whole pip+virtualenv workflow for you, the developer, but it also guarantees that the exact versions of all dependencies (including Python itself) are used when other people run your project with it.
The project website does a pretty good job of explaining how to use the tool but, for completeness' sake, I'll give a short explanation here.
Once you have Pipenv installed (for instance, by running pip install --user pipenv), you can go to the directory of your project and run pipenv --python 3.7, and Pipenv will create a new virtualenv for your project along with a Pipfile and a Pipfile.lock (more on them later). If you then run pipenv install -r requirements.txt, it will install all your packages. Now you can do pipenv shell to activate your new virtualenv, or pipenv run python your_main_file.py to simply run your project.
Now let's take a look at the contents of your Pipfile. It should be something resembling this:
[packages]
Django = "*"
djangorestframework = "*"
iso8601 = "*"
graypy = "*"
whitenoise = "*"
[requires]
python_version = "3.7"
This file has the human-readable specifications for the dependencies of your project (note that it specifies the Python version too). If your requirements.txt had pinned versions, your Pipfile could have them too, but you can safely wildcard them, because the exact versions are stored in Pipfile.lock. Now you can run things like pipenv update to update your dependencies. Don't forget to commit Pipfile and Pipfile.lock to your VCS.
Once people clone your project, all they have to do is run pipenv install and Pipenv will take care of the rest (it may even install the correct version of Python for them).
I hope this was useful. I'm not affiliated in any way with Pipenv, just wanted to share this awesome tool.
If your program is not GUI-heavy, or has a web GUI, then you can share the code using Google Colaboratory.
https://colab.research.google.com/
Everyone can run it in the same environment; no installation is needed.
In case converting all your Python scripts into one executable would help you, my answer below should be useful.
I have been developing a large desktop application purely in Python for 3 years. It is a GUI-based tool built on top of the PyQt library (Python bindings for the Qt C++ framework).
I am currently using the py2exe packaging library: it is a distutils extension which allows building standalone Windows executable programs (32-bit and 64-bit) from Python scripts. All you have to do is:
Install py2exe: pip install py2exe
Create a setup.py script: it specifies the contents of the final EXE (name, icon, author, data files, shared libraries, etc.); see the sketch after these steps
Execute: python setup.py py2exe
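A minimal py2exe setup.py might look like this (the script name is a placeholder, and the exact options depend on your app):

from distutils.core import setup
import py2exe  # importing it registers the py2exe command

setup(
    windows=["your_app.py"],  # use console=[...] for a console app instead
    options={"py2exe": {"bundle_files": 1}},  # bundle as much as possible into the EXE
)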
I am also using the Inno Setup software to create the installer: creating shortcuts, setting environment variables, icons, etc.
I'll give you a very brief summary of some of the available solutions when it comes to Python packaging that you may choose from (knowledge is power):
Follow the guidelines provided at Structuring Your Project. These conventions are widely accepted by the Python community, and they are usually a good starting point when newcomers start coding in Python. By following these guidelines, Pythonistas looking at your project/source on GitHub or other similar places will know straight away how to install it. Also, uploading your project to PyPI, as well as adding CI, will be painless if you follow those rules.
Once your project is structured properly according to standard conventions, the next step might be using one of the available freezers, in case you'd like to ship your end-users a package they can install without forcing them to have Python installed on their machines. Be aware, though, that these tools won't provide you any code protection; put otherwise, extracting the original Python code from the final artifacts would be trivial in all cases.
If you still want to ship your project to your users without forcing them to install any dev dependency, and you do also care about code protection, so you don't want to consider any of the existing freezers, you might use tools such as nuitka, shedskin, cython, or similar ones. Usually, reversing code from the artifacts produced by these tools isn't trivial at all. Cracking the protection, on the other hand, is a different matter, and unless you avoid shipping a physical binary to your end-users, you can't do much about it other than slowing them down :)
Also, in case you need to use external languages in your Python project, another classic link that comes to mind is https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages; adding the build systems of such tools to CI by following the rules of point 1 would be pretty easy.
That said, I'd suggest sticking to bullet point 1, as I know it will be more than good enough to get you started; that particular point should also cover many of the existing use cases for "standard" Python projects.
While this is not intended to be a full guide, by following these points you'll be able to publish your Python project to the masses in no time.
I think you can use Docker with your Python project; see https://github.com/celery/celery/tree/master/docker for an example.
Follow those files and I think you can figure out how to write a Dockerfile for your own Python scripts!
Because it is missing from the other answers, I would like to add one completely different aspect:
Unit testing. Or testing in general.
Usually, it is good to have one known good configuration. Depending on what the dependencies of the program are, you might have to test different combinations of packages. You can do that in an automated fashion with e.g. tox or as part of a CI/CD pipeline.
There is no general rule for which combinations of packages should be tested, but usually Python 2/3 compatibility is a major issue. If you have strong dependencies on packages with major version differences, you might want to consider testing against those different versions.
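As a concrete starting point, a minimal tox.ini might look like this (the Python versions and test runner are just examples):

[tox]
envlist = py27,py36,py37

[testenv]
deps = pytest
commands = pytest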

Is there a way to bundle multiple packages together with Python setuptools?

I have packages A and B, both have their own git repository, PyPI page, etc... Package A depends on package B, and by using the install_requires keyword I can get A to automatically download and install B.
But suppose I want to go a step further for my especially non-savvy users: I want to actually include package B within the tar/zip for package A, so no download is necessary (this also lets them potentially make any by-hand edits to package B's setup.cfg).
Is there a suggested (ideally automated) way to:
include B in A when I call sdist for A, and
tell setuptools that B is bundled with A when resolving the dependency (something like a local dependency_links)?
Thanks!
This is called "vendoring" (or "vendorizing"), and no, there is no built-in support for such an approach.
It is also a bad idea: you want to leave installation to the specialized tools, which manage not only the dependencies but also which versions are installed. With tools like buildout or pip's requirements.txt format, you can control very precisely which versions are used.
By bundling a version of a dependency, you either force your users to use that version or make it harder for such tools to ensure that the versions used in a given installation are consistent. In addition, you potentially waste bandwidth and space: if other packages bundle the same requirements, you now have multiple copies. And if your dependency is updated to fix a critical security issue, you have to re-release every package that bundles it.
In the past, some packages did vendor dependencies inside their distributions. requests was a prime example; they stepped away from this approach because it complicated their release process: every time there was a bug in one of the vendored packages, they had to produce a new release to include the fix.
If you do want to persist in including packages, you'll have to write your own support. I believe requests simply added the vendored packages to their repository manually, keeping a static copy that they updated from time to time. Alternatively, you could extend setup.py to copy or download the code when you are creating a distribution.
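As a sketch of that last idea, you could subclass the sdist command so that it copies B's source into A's tree before building (all paths and names here are hypothetical):

import shutil
from setuptools import setup
from setuptools.command.sdist import sdist

class BundlingSdist(sdist):
    def run(self):
        # Copy a local checkout of package B into A's _vendor directory.
        shutil.rmtree("packagea/_vendor/packageb", ignore_errors=True)
        shutil.copytree("../packageb/packageb", "packagea/_vendor/packageb")
        sdist.run(self)

setup(
    name="packagea",
    version="0.1.0",
    packages=["packagea", "packagea._vendor", "packagea._vendor.packageb"],
    cmdclass={"sdist": BundlingSdist},
)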
