I would like to create a very modular package, where users can install pieces if need be. I was wondering if there is a way to create multiple small packages and a master package. A user would first install the master package then install the components as needed.
I've tried researching this, but I think I have the terminology wrong. I'm hoping someone can point me to some documentation or an example project that does this.
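One common way to get this kind of modularity with setuptools is "extras": optional dependency groups that users opt into at install time. A minimal sketch, where the package and extra names are all hypothetical:

```python
# setup.py for the master package; all names here are illustrative
from setuptools import setup, find_packages

setup(
    name="mytoolkit",                   # hypothetical master package
    version="0.1.0",
    packages=find_packages(),
    install_requires=["requests"],      # core dependencies, always installed
    extras_require={
        # optional components, pulled in only on request
        "plots": ["matplotlib"],
        "db": ["sqlalchemy"],
    },
)
```

Users would then run pip install mytoolkit for the core, or pip install "mytoolkit[plots]" to add a component's dependencies. An alternative pattern is to publish each component as its own distribution (e.g. mytoolkit-plots) that the master package imports if present.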
Related
The language around packaging Python projects in the documentation suggests that it is exclusively for distributing or exporting the project. So what should I do with projects I intend to use personally?
Should I package them anyway? If not, what steps should I take so that I can run my code on any machine with the right version of Python? Would packaging a project even accomplish that? Is there even any way to "package" all of a project's files and relevant libraries together, with the end product being a folder/file rather than an installable package?
I'm sorry if this is basic, I'm very confused, thank you for your time.
For personal use I do not package, unless I am very sure there is no need to modify the project anymore, and that is very rarely the case for me. Once a project is packaged and published to a public or private package repository, you get the package, and making changes is more complicated (though still possible). I prefer to have the project as a repository and be able to edit it and push the changes to remote locations.
Many packaging tools, like Poetry, make it easy not only to build a package but also to install requirements and keep track of them, so there is no hassle in managing requirements.
I want people that know of the old name to be directed to the new name.
For the pypi website, it's easy to upload a package with a README linking to the new package.
I'm not sure what the best way is to handle people using pip to install it. I assume it might be possible to show an error on pip install old_name; looking around, it seems to be possible using cmdclass in setup.py and throwing an exception in the right place, but the documentation around it is scarce, to put it mildly.
So I was wondering if anyone is aware of proper built-in systems for this, or common practices to handle this sort of thing.
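For reference, the cmdclass idea mentioned above can be sketched like this. The package names are placeholders, and note a real caveat: this hook only runs when pip installs from an sdist, since wheels never execute setup.py:

```python
# Sketch: abort installation of a renamed package via a cmdclass override.
# "old_name"/"new_name" are placeholders; this only takes effect when pip
# builds from an sdist, because wheel installs bypass setup.py entirely.
from setuptools.command.install import install

class AbortInstall(install):
    def run(self):
        # Raising SystemExit makes the install fail with this message.
        raise SystemExit(
            "old_name has been renamed; run 'pip install new_name' instead."
        )

# In setup.py you would pass: setup(..., cmdclass={"install": AbortInstall})
```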
Declare the new package a dependency of the old. See for example how scikit-learn does it: the old package sklearn declares in its setup.py:
install_requires=['scikit-learn'],
Thus everyone who does pip install sklearn automatically gets scikit-learn.
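Sketched as a minimal setup.py for the old name (the names, version, and description here are illustrative, not sklearn's actual file):

```python
# setup.py for the deprecated old name: an empty shell whose only job
# is to depend on the renamed package. All names are placeholders.
from setuptools import setup

setup(
    name="old_name",                 # the deprecated project name
    version="1.0.0",
    description="Deprecated: use new_name instead",
    long_description="This package has been renamed to new_name.",
    install_requires=["new_name"],   # the renamed package on PyPI
)
```

The old distribution ships no modules of its own; installing it simply pulls in the new one.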
pypi-rename worked perfectly for me. It automates the process of making a README and redirecting users to the new package.
https://github.com/simonw/pypi-rename
I'm interested in contributing to a GitHub Python module repo, but I'm not entirely sure where to clone it. This is a simple module, just an __init__.py and some .py files. No other files need to be installed or changed outside of the module's folder.
I would like to be able to clone the repository directly into my site-packages folder. When I want to use the library as is, I would switch to the master branch. If I want to develop a new feature, I can branch off of devel. If I want to try out a new feature someone else implemented, I can switch to that particular branch. I can even stay on the development branch to get the latest, albeit possibly unstable, features. All this without having to change the import statement to point to a different location in any of my scripts. This option, even though it seems to do all the things I want, feels a bit wrong for some reason. Also, I'm not sure what this would do to pip when calling python -m pip list --outdated; I have a feeling it won't know what the current version is.
Another option would be to clone it to some other folder and keep only the pip-installed variant in the site-packages folder. That way I would have a properly installed library in site-packages and I could try out new features by creating a script inside the repo folder. This doesn't seem nearly as flexible as the option above, but it doesn't mess with the site-packages folder.
Which is the best way to go about this? How do you clone repositories when you both want to work on them and use them with the latest features?
I think this is more a question about packaging and open source than Python itself, but I'll try to help you out.
If you want to host your package on PyPI, the Python packaging guide explains how to upload and tag your package appropriately for usage.
If you want to add some functionality to an open source library, what you can do is submit a Pull Request to that library, so everybody can use it. Rules for PRs are specific to each project, so you should ask the maintainer.
If your modification doesn't get merged into master, but you still want to use it without changing import statements, you could fork that repo and publish your own modifications on, for instance, GitHub.
In that case, you could install your modifications like this:
pip install git+https://github.com/username/amazing-project.git
So in that way, your library will come from your own repo.
If you're going for the third option, I strongly recommend using virtualenv, which lets you create different virtual environments with different packages, dependencies and so on, without messing up your Python installation. There are plenty of good guides for it online.
If one creates a useful Python package, how/where does one publish/advertise it for other people to use?
I've put it on GitHub, but even Google does not find it after a few weeks.
The package is neat and complete; I made it for my personal use, and it would be a shame not to share it with others :)
Here is the PyPI guide. https://python-packaging-user-guide.readthedocs.org/en/latest/distributing.html
PyPI is the place for putting your Python packages up for others to find. The built-in tool pip references it to install packages for you, and at least one IDE (PyCharm) uses pip in the background to give you a GUI for doing this.
So, to make the package available to a pip install, you have to register it in the Python Package Index (PyPI): https://pypi.python.org/pypi
There's also the test environment, where you can upload your packages to test if your setup is ok before going to the real deal: https://testpypi.python.org/pypi
You create an account on one of those servers and will then be able to upload your package. But before that, you will have to build your package using setuptools. Here's the documentation for packaging and distributing: https://packaging.python.org/distributing/
The process can be a little tedious, so I wrote a little tool to make it simpler. Maybe it's of some use to you: https://github.com/hugollm/foster
I have a virtualenv that serves a few separate projects. I'd like to write a utility library that each of these projects can use. It seems to make sense to stick that in the virtualenv. By all means shoot me down now but please give an alternative if you do.
Assuming I'm not completely crazy though, where's the best place to stick my library?
My virtualenv sticks everything I install with pip in lib/python2.7/site-packages. I wonder if it would make more sense to follow suit, or to hack in a separate home (further up the directory tree) so that if things ever do clash, my work isn't overwritten by pip (et al.).
If your project follows standard packaging practices with setuptools, then all you have to do is run python setup.py develop inside each virtualenv where you want the library to be available. A .egg-link file will be created pointing at your package, and your other projects can then import it much like any other installed package, with the added benefit that your latest changes are immediately visible to all of them (if that's your intention). If not, then you could either run python setup.py install, or keep multiple versions at different locations on the file system.
To get yourself started, take a look at Getting Started With setuptools and setup.py (you can skip the part on registering your package on pypi if this is your private work).
Another relevant Stack Overflow thread: setup.py examples? The Hitchhiker's Guide to Packaging can also be quite useful.
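The develop workflow described above needs little more than a minimal setup.py (the names here are placeholders):

```python
# Minimal setup.py for a shared utility library; names are placeholders.
from setuptools import setup, find_packages

setup(
    name="myutils",            # hypothetical library name
    version="0.1.0",
    packages=find_packages(),  # picks up any package directories present
)
```

After activating a virtualenv, run python setup.py develop (or the equivalent pip install -e .) in the project directory; the .egg-link left in site-packages points back at your working copy, so edits are picked up without reinstalling.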