Should I install my Python application with setuptools? - python

I am currently about to deploy my Python application. I used to think that the only good way to install a pure Python app is to just copy the source code files along with a requirements file and install the packages it lists (the Python onbuild Docker image also assumes this approach).
But I can see that folks often install their apps as packages using setuptools, i.e. with ./setup.py install (the Warehouse project, for example, seems to do this).
Which of the two is considered a better practice?
What are the benefits from installing your app as a package?
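To make the comparison concrete, the setuptools route would mean shipping something like this minimal setup.py (the names here are just placeholders), so the app could be installed with ./setup.py install or pip install . instead of copying files around:

    # minimal setup.py sketch; "myapp" and the module paths are placeholders
    from setuptools import setup, find_packages

    setup(
        name="myapp",
        version="0.1.0",
        packages=find_packages(),
        install_requires=[
            "flask>=1.0",  # the same dependencies that would otherwise sit in requirements.txt
        ],
        entry_points={
            "console_scripts": [
                "myapp = myapp.cli:main",  # puts a `myapp` command on the PATH after install
            ],
        },
    )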

Related

Python dependency management best practices

I have a little Python side project which is experiencing some growing pains, and I'm wondering how people on larger Python projects manage this issue.
The project is Python/Flask/Docker deployed to AWS. Listed dependencies (that we import directly in the project) are installed from a requirements.txt file with explicit version numbers. We added the version numbers after noticing our new deployments (which rebuild Docker/dependencies etc) would sometimes install newer versions of the packages, causing the project to break.
The issue we're facing now is that an onboarding developer is setting up her environment and hitting the same problem - this time with sub-dependencies of the original dependencies. (For example, Flask might install Werkzeug, Jinja2, etc., and if some of these are the wrong version, the app breaks.) The obvious solution is to go through each sub-dependency and list out every package, with explicit versions, in requirements.txt. But this is a bit of a pain, so I'm asking around to see what people do on Real Projects.
You guys can't be doing this all manually, right? In JS we have NPM and package-lock.json files and so on - they're built automatically. Is there some equivalent in Python? Have I missed something basic that we should be using here?
Thanks in advance
I wrote a tool that might be helpful for this called realreq. You can install it from pip with pip install realreq. It generates your requirements by reading through your source files and recursively resolving their dependencies.
realreq --deep -s /path/to/source will fully specify your dependencies and their sub-dependencies. Note that if you are using a virtual environment you need to have it activated for realreq to be able to find the dependencies, and they must be installed (i.e., realreq needs to be run in an environment where the dependencies are installed). One of your engineers who has a working environment can run it and then pass the output along as a requirements.txt file to your new engineers.
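Assuming you capture the tool's output into a file (the source path below is a placeholder), the workflow would look roughly like this:

    # run inside an environment where the project's dependencies are installed
    pip install realreq
    realreq --deep -s /path/to/source > requirements.txt
    # new engineers then just run: pip install -r requirements.txt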

Are there naming conventions for pip requirements files?

Are there conventions for storing multiple requirements.txt files in a Python code repository? For example, one file for simply running the program, another for day-to-day development, another for making a Windows build.
Some repositories contain two files, requirements.txt and requirements_dev.txt, or requirements.txt and requirements_win.txt - this seems pretty ad hoc.
I have seen others with a requires subfolder, but I'm not sure what the meaning of requires/requirements.txt is in that context - is it for running the application, or for development?
There is no mention of storing multiple requirements files in Structuring your project (Hitchhiker's guide to Python) or pip install (pip documentation).
As far as I am aware, there are no hard-and-fast rules here. At least not via PEP (but someone feel free to correct me).
The Hitchhiker's Guide to Python recommends putting the pip requirements file at the root of your project.
There does not appear to be any requirement that a pip requirements file has to be called requirements.txt. The pip install documentation even uses the example of pip install -r example-requirements.txt.
I would think that the conventions vary from project to project and depend largely on your deployment process and project-specific documentation.
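For what it's worth, one layout some projects adopt (nothing mandates it) is a single requirements/ directory where the environment-specific files pull in a shared base file via pip's -r include syntax, for example:

    # requirements/base.txt  - packages needed just to run the program
    Flask>=1.0

    # requirements/dev.txt   - everything from base, plus development-only tools
    -r base.txt
    pytest
    flake8

Installing for development is then pip install -r requirements/dev.txt, and the base file remains the single source of truth for runtime dependencies.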

Python: run tests from wheel or sdist

For a package I've authored, I've done python setup.py sdist bdist_wheel, which generates some package artifacts in the dist/ directory. Now I'd like to run the package's unit tests in those artifacts. What's a good way to do it?
To be clear: an alternative would be to run the tests directly from the local source files, but I want to avoid that to make sure I'm testing the exact pre-built artifact users would be installing (as suggested here).
I'm using Python 3, and I'm on a Linux or Mac OS environment. My context is a build server that builds, tests, and then publishes artifacts (to a private PyPI-like repo) as commits are made to a Git repository.
If there's some other approach I should be using instead, I'm all ears.
What you can do is:
Create a virtual environment
Install your package
Run the tests against your installed library using tools like pytest; you can read more about pytest good practices here: http://pytest.org/dev/goodpractises.html
As pointed out in the pytest docs, take a look at tox as well for your CI server: http://pytest.org/dev/goodpractises.html#use-tox-and-continuous-integration-servers
This is a related question regarding how to test using the installed package: Force py.test to use installed version of module
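On a Linux/Mac build server those steps can be scripted; here is a minimal sketch in Python (the package name mypackage and the environment path are placeholders, and it assumes the tests are shipped inside the package so that pytest's --pyargs option can find them):

    # build_and_test.py - test the built wheel, not the local source tree
    import glob
    import subprocess
    import venv

    venv.create(".test-env", with_pip=True)            # fresh, isolated environment
    pip = ".test-env/bin/pip"
    python = ".test-env/bin/python"

    wheel = glob.glob("dist/*.whl")[0]                  # artifact produced by bdist_wheel
    subprocess.check_call([pip, "install", wheel])      # install the built wheel itself
    subprocess.check_call([pip, "install", "pytest"])

    # --pyargs makes pytest import the installed package by name instead of
    # picking up the source checkout in the current working directory
    subprocess.check_call([python, "-m", "pytest", "--pyargs", "mypackage"])

If the tests live outside the package, point pytest at the tests directory instead; tox automates essentially this same loop.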

setup.py + virtualenv = chicken and egg issue?

I'm a Java/Scala dev transitioning to Python for a work project. To dust off the cobwebs on the Python side of my brain, I wrote a webapp that acts as a front-end for Docker when doing local Docker work. I'm now working on packaging it up and, as such, am learning about setup.py and virtualenv. Coming from the JVM world, where dependencies aren't "installed" so much as downloaded to a repository and referenced when needed, the way pip handles things is a bit foreign. It seems like best practice for production Python work is to first create a virtual environment for your project, do your coding work, then package it up with setup.py.
My question is, what happens on the other end when someone needs to install what I've written? They too will have to create a virtual environment for the package, but won't know how to set it up without inspecting the setup.py file to figure out what version of Python to use, etc. Is there a way for me to create a setup.py file that also creates the appropriate virtual environment as part of the install process? If not, or if that's generally considered a "no" (as one respondent to a related SO post stated), what is considered "best practice" in this situation?
You can think of virtualenv as isolation for every package you install using pip. It is a simple way to handle different versions of Python and of packages. For instance, say you have two projects which use the same packages but different versions of them. By using virtualenv you can isolate those two projects and install the different package versions separately, rather than into your system Python.
Now, let's say you want to work on a project with your friend. In order to have the same packages installed, you have to somehow share which packages, and which versions, your project depends on. If you are delivering a reusable package (a library), then you need to distribute it, and this is where setup.py helps. You can learn more in the Quick Start guide.
However, if you are working on a web site, all you need is to put the library versions into a separate file. Best practice is to create separate requirements files for tests, development and production. To see the format of such a file, run pip freeze: you will be presented with the list of packages currently installed on the system (or in the virtualenv). Put that into a file and you can install the same set later on another PC, into a completely clean virtualenv, using pip install -r development.txt
One more thing: please do not pin the strict versions that pip freeze shows; most of the time you want >= at least some X.X version. The good news is that pip handles dependencies on its own, which means you do not have to list your dependencies' dependencies there; pip will sort that out.
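To illustrate the difference (package names and versions below are only examples), pip freeze prints exact pins, whereas a hand-maintained requirements file would use minimum versions and list only your direct dependencies:

    # what pip freeze prints - exact pins of everything installed
    Flask==1.1.2
    Jinja2==2.11.3
    requests==2.25.1

    # what you would maintain by hand - direct dependencies with minimum versions
    Flask>=1.1
    requests>=2.20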
Speaking of deployment, you may want to check out tox, a tool for managing virtualenvs. It helps a lot with deployment.
Python's default package path always points to the system environment, which needs administrator access to install into. Virtualenv is able to localise the installation to an isolated environment.
For deployment/distribution of a package, you can choose to:
Distribute the source code, in which case the user needs to run python setup.py install, or
Package your Python project and upload it to PyPI or a custom devpi server, so the user can simply run pip install <yourpackage>
However, as you noticed above: without virtualenv, the user needs administrator access to install any Python package.
In addition, the PyPI package world contains a certain number of badly tested packages that don't work out of the box.
Note: virtualenv itself is actually a hack to achieve isolation.

packaging scientific project in python

I am trying to build a package for an app in Python. It uses sklearn, pandas, numpy, boto and some other scientific modules from Anaconda. Being very inexperienced with Python packaging, I have various questions:
1- I have some confidential .py files in my project which I don't want anyone to be able to see. In Java I would have defined private files and classes, but I am completely lost in Python. What is the "good practice" for dealing with these private modules? Can anyone link me to a tutorial?
2- What is the best way to package my apps? I don't want to publish anything on PyPI; I only need it to run on Google App Engine, for instance. I tried building a standalone package with PyInstaller but could not finish because numpy and the other scipy packages make it hard. Is there a simple way to package Python projects made with Anaconda privately?
3- Since I want to build more apps in the near future, should I try to make sub-packages in order to reuse them in other apps?
The convention is to lead with a single underscore _ if something is internal. Note that this is only a convention: if someone really wants to use it, they still can. Your code is not strictly confidential.
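A tiny illustration (the module and function names are made up):

    # myproject/_internal.py - the leading underscore marks the module as internal
    def _score(data):
        return sum(data)      # stand-in for the confidential logic

    # a consumer can still write `from myproject._internal import _score`;
    # the underscore only signals "please don't", it does not enforce anything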
Take a look at http://python-packaging-user-guide.readthedocs.org/en/latest/. You don't need to publish to PyPI to create a Python package that works with tools such as pip. You can create a project with a setup.py file and a requirements.txt file and then use pip to install your package from wherever you have it (e.g., a local directory or a repository on GitHub). If you take this approach, pip will install all the dependencies you list.
If you want to reuse your package, just include it in requirements.txt and in the install_requires parameter in setup.py (see http://python-packaging-user-guide.readthedocs.org/en/latest/requirements/). For example, if you install your package with pip install git+https://github.com/myname/mypackage.git, then you could include git+https://github.com/myname/mypackage.git in the requirements.txt file of future projects.
