I recently came onto a new project with Python as the code-base. I have never done Python before, coming from the C/C++ world of compiled code, and I am running into some issues understanding my current code-base.
When we write code, we have our libraries (components that are more general than other code), and our application code (code that applies the library), right? In the projects that I've worked on before, I would keep both my library code and application in contained folders in one project folder. In C/C++ land, there would be a makefile (or some make system) that hooks everything together so that the includes all work appropriately.
Project/
    Library/
        Utilities.cpp
    Application/
        Main.cpp
    makefile
The project that I am coming onto right now has its own library in the site-packages folder, which itself is located in the IronPython/Python system folder. That library code is ours, and is still "hot" and being worked on. The application code is elsewhere on the system.
This seems like bad design to me, but my peers insist that this is "just how Python works". Python supports including/importing. Shouldn't everything just be self-contained? It seems odd to scatter code like that.
Thanks!
Python libraries are usually installed via distutils or setuptools. Those utilities install the libraries in Python's site-packages folder, which is where Python knows to look for libs when an import x statement is encountered.
Developing code directly in the site-packages folder seems a little odd, although there's technically nothing wrong with it. Normally, you'd have something like this:
./app1.py
./lib1/__init__.py
./lib1/lib1.py
./lib2/__init__.py
./lib2/lib2.py
...etc
And then, when you're ready to package the libraries up, you can use one of the above-mentioned utilities to do so (which would then install the libs into site-packages).
So, to answer your question: there's no hard and fast rule. I think most python developers would frown on developing directly in site-packages.
Setuptools also has a command called develop that installs a link to your development library in site-packages. I've used that a few times with good results.
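For reference, the command is run from the directory containing setup.py:

python setup.py develop

It writes a link into site-packages pointing back at your working copy, so imports resolve against your live source tree while you keep editing it.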
Usually, third-party libraries get added to site-packages by an installer such as pip, easy_install, or distutils using setup.py. Locally developed, "hot" libraries under development usually have their own directory tree, and python.exe finds them via the PYTHONPATH environment variable (which is used to extend sys.path).
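As a minimal sketch (the path is illustrative), on Windows that could look like:

set PYTHONPATH=C:\dev\hotlibs
python app.py

Entries from PYTHONPATH end up on sys.path ahead of site-packages, so imports pick up the development copies first.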
It's easy to import those libraries using Eclipse.
Install easy_install and install all those libraries from your command prompt.
Now in Eclipse, just link those libraries in your preferences and voilà! You are ready to go!
Related
We develop a system that includes a lot of different Python scripts and modules. Some are deployed to a self-hosted PyPI and some are simply packaged into a zip file (stored in a self-hosted Artifactory). Finally, there is another application (not developed by us) that uses some of our Python scripts as plugins. So, the dependency graph is rather complex for a Python environment, I guess. The following snippet should explain the graph:
Script (own, zip package)
    Module (own, pypi)
        Module (external, pypi)
    Module (own, pypi)
        Module (external, pypi)
This is just an example; in reality, there are many more dependencies. But in the end, it is a mix of zip-packaged and pypi-packaged Python scripts and modules. The dependencies of the pypi modules are managed via the setuptools install_requires parameter in setup.py. But the dependencies of the zip-packaged scripts are managed via a self-implemented configuration and install script.
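For the pypi-packaged modules, a minimal setup.py along the lines described above might look like this (package names and versions are illustrative):

from setuptools import setup, find_packages

setup(
    name='ourmodule',
    version='1.0.0',
    packages=find_packages(),
    install_requires=[
        'external-module>=2.0',   # external, pypi
        'our-other-module==1.3',  # own, self-hosted pypi
    ],
)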
In the end, we have our install script that creates one virtual environment and installs all dependencies into it, either via pip or by simply downloading the zip files and putting them in the right directory. But honestly, that feels a little bit weird, and we are not sure if this is the right (Pythonic) way to go.
We have already searched the internet for a couple of days but found no answer. It also seems that it is very uncommon to have such a complex system implemented in Python. So, the final question is: is our approach the right one, is there not really "the right way", or is our approach completely wrong?
I am new to Python, having come from a proprietary compiled language (Xojo) that produces self-contained executables.
I understand that Python is an interpreted language. I understand that it requires an interpreter (let's stick with CPython) and presumably it requires a number of accessory frameworks/C libraries in order to run. What I don't understand is why it is so hard to create a folder containing the interpreter and all required files and libraries and simply bundle these up with my script to distribute.
I have discovered that there are a bunch of tools that attempt to do this (py2app, cx_freeze, etc), but many of them seem either broken, unmaintained, or really buggy.
I guess my question is: is there any documentation that describes the exact things I need to bundle with a “Hello World” script to get it running? This seems to be a really straightforward problem to solve but it hasn’t been (which suggests that it is far more complex than I appreciate).
My understanding is that PyInstaller works fine for making a single exe for distribution. But barring packaging tools like that, in general, there isn't an obvious "bare minimum"; the modules don't have documented dependencies, so it's usually best to ship the whole standard library.
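For example, assuming PyInstaller is installed and hello.py is a placeholder for your script:

pyinstaller --onefile hello.py

This analyzes the script's imports and produces a single self-contained executable under dist/.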
Typically, if you need a redistributable version, you use the embedded Python zip redistributable, shipping Python alongside your main application.
The exact list of files/libraries depends on how the Python interpreter is built. On Windows, for example, you can obtain CPython binaries built with Visual Studio, Cygwin, and MinGW-w64. They have different dependencies, of course. On Linux distributions, Python is normally installed by default.
Below is the list of .dll and .exe files that you can find in the official CPython release for Windows.
libcrypto-1_1-x64.dll python.exe python37.dll sqlite3.dll
libssl-1_1-x64.dll pythonw.exe python3.dll vcruntime140.dll
The total size of this ZIP file release is only 6.7 MB, so it would be easy to bundle it with your main executable. You can use whatever bundler is at hand, not necessarily one designed for Python. Quoting from the documentation here:
extracting the embedded distribution to a subdirectory of the application installation is sufficient to provide a loadable Python interpreter.
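An illustrative layout (all names here are hypothetical) would be:

MyApp/
    MyApp.exe
    app.py
    python/
        python.exe
        python37.dll
        ...

where MyApp.exe is your launcher or main application and the python/ subdirectory is the extracted embedded distribution; the launcher then invokes python\python.exe app.py.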
I feel the absolute best way for beginners to experience Python is Thonny and an ESP32.
A very good way to get started with Python is to use Anaconda (https://www.anaconda.com/distribution/#download-section) - this distribution contains the CPython interpreter and the most commonly used packages. For quite a while you will get along without installing any more packages.
To make a simple distributable piece of code, just include a requirements.txt along with your code, listing the packages (and versions) you are using in your code.
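A requirements.txt is just a plain text file with one package per line; the names and versions below are only an example:

requests==2.22.0
numpy==1.17.4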
More on that here : https://www.idiotinside.com/2015/05/10/python-auto-generate-requirements-txt/
pip freeze generates a superset of all the packages in your running environment, so you would ideally go with the second, smarter option in the link: pipreqs.
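pipreqs scans your source for imports rather than dumping the whole environment; assuming it is installed (pip install pipreqs), you point it at your project directory and it writes the requirements.txt there:

pipreqs /path/to/your/project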
So, in short: along with your code, just an additional requirements.txt should be fine, and people can install all the required packages with
pip install -r requirements.txt
and they are good to go to run your code.
For advanced scenarios you might want to look up creating virtual environments using conda.
What is a conda environment?
https://docs.conda.io/projects/conda/en/latest/user-guide/concepts.html#conda-environments
How to create/manage a conda environment
https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
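As a quick sketch (the environment name and Python version are illustrative):

conda create --name myenv python=3.7
conda activate myenv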
All the best in your Python journey!
I have a Python app that looks for plugins via pkg_resources.iter_entry_points.
When run directly from source checkout, this will find anything in sys.path that fits the bill, including source checkouts that happen to have an applicable .egg-info for setuptools to find.
Yet when I install the package anywhere via python setup.py install, it suddenly ceases to detect everything enumerated in sys.path, instead only finding things that are installed alongside it in site-packages.
Why is pkg_resources.iter_entry_points behaving differently for the vanilla source checkout v. the installed application?
How can I make it traverse everything in sys.path, as it does in development?
How to get it to iterate over sys.path?
pkg_resources.WorkingSet(None).iter_entry_points
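In context, a minimal sketch (the entry-point group myapp.plugins is a hypothetical name):

import pkg_resources

# Passing None builds a fresh WorkingSet from the current sys.path,
# instead of using the global working_set that was frozen at import time.
ws = pkg_resources.WorkingSet(None)
for entry_point in ws.iter_entry_points('myapp.plugins'):
    plugin = entry_point.load()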
Why does it behave differently? Probably because the installed package forces at least the metadata about itself into memory. Looking at the code, my guess would be that your main module has a requires attribute, but that's only an educated guess. Anyway, to force the "installed" behaviour while developing, it should be enough to run python setup.py develop.
I'm new to Python and I'm writing my first program. I would like, once I finish, to be able to run the program from the source code on a Windows or Mac machine. My program has dependencies on 3rd-party modules.
I read about virtualenv but I don't think it helps me because it says it's not relocatable and it's not cross-platform (see Making Environments Relocatable http://pypi.python.org/pypi/virtualenv).
The best scenario is to install the 3rd-party modules locally in my project, aka an xcopy installation.
I will be really surprised if Python doesn't support this easily, especially since it promotes simplicity and frictionless programming.
You can do what you want; you just have to make sure that the directory containing your third-party modules is on the Python path.
There's no requirement to install modules system-wide.
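A minimal sketch of that idea, assuming a hypothetical vendor/ folder of third-party modules shipped next to your script:

import os
import sys

# Prepend the bundled module directory so imports resolve from it first.
vendor = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'vendor')
sys.path.insert(0, vendor)

import somelib  # a placeholder name, now found in ./vendor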
Note: while packaging your whole app with py2exe may not be an option, you can use it to make a simple launcher environment. You make a script that imports your module/package/whatever and launches the main() entry point. Package this with py2exe but keep your application code outside it, as Python code or an egg. I do something similar, where I read a .pth text file to learn what paths to add to sys.path in order to import my application code.
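A sketch of that launcher idea (the file and module names are illustrative, and this is a hand-rolled path file, not one processed by the site module):

import os
import sys

here = os.path.dirname(os.path.abspath(sys.argv[0]))
# Read extra import paths from a text file shipped beside the launcher.
with open(os.path.join(here, 'app.pth')) as f:
    for line in f:
        line = line.strip()
        if line and not line.startswith('#'):
            sys.path.insert(0, os.path.join(here, line))

import myapp  # the application code kept outside the py2exe bundle
myapp.main()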
Simply put, that's generally not how Python works. Modules are installed site-wide and used that way. Are you familiar with pip and/or easy_install? Those, plus PyPI, let you automatically install dependencies no matter what you need.
If you want to create a standalone executable typically you'd use py2exe, py2app or something like that. Then you would have no dependencies on python at all.
I also found out about zc.buildout, which can be used to include dependencies in an automatic way.
I'm in a bit of a discussion with some other developers on an open source project. I'm new to python but it seems to me that site-packages is meant for libraries and not end user applications. Is that true or is site-packages an appropriate place to install an application meant to be run by an end user?
Once you get to the point where your application is ready for distribution, package it up for your favorite distributions/OSes in a way that puts your library code in site-packages and executable scripts on the system path.
Until then (i.e. for all development work), don't do any of the above: save yourself major headaches and use zc.buildout or virtualenv to keep your development code (and, if you like, its dependencies as well) isolated from the rest of the system.
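For the packaging step mentioned above, one common way to get library code into site-packages and a script onto the path is a setuptools console_scripts entry point; a sketch with illustrative names:

from setuptools import setup

setup(
    name='myapp',
    version='1.0',
    packages=['myapp'],
    entry_points={
        'console_scripts': ['myapp = myapp.main:main'],
    },
)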
We do it like this.
Most stuff we download is in site-packages. It comes from PyPI or SourceForge or some other external source; it's easy to rebuild; it's highly reused; it doesn't change much.
Most stuff we write is in other locations (usually under /opt, or c:\opt) AND is included in the PYTHONPATH.
There's no great reason for keeping our stuff out of site-packages. However, our feeble excuse is that our stuff changes a lot. Pretty much constantly. To reinstall in site-packages every time we think we have something better is a bit of a pain.
Since we're testing out of our working directories or SVN checkout directories, our test environments make heavy use of PYTHONPATH.
The development use of PYTHONPATH bled over into production. We use a setup.py for production installs, but install to an alternate home under /opt and set the PYTHONPATH to include /opt/ourapp-1.1.
The program run by the end user is usually somewhere in their path, with most of the code in the module directory, which is often in site-packages.
Many Python programs will have a small script located in the path, which imports the module and calls a "main" method to run the program. This allows the programmer to do some upfront checks and possibly modify sys.path if needed to find the needed module. This can also speed up load time on larger programs, because only files that are imported will be run from bytecode.
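A minimal version of such a wrapper script (the module name is illustrative):

#!/usr/bin/env python
import sys

# Any upfront checks or sys.path adjustments would go here.
import mymodule  # the real program, typically living in site-packages

sys.exit(mymodule.main())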
Site-packages is for libraries, definitely.
A hybrid approach might work: you can install the libraries required by your application in site-packages and then install the main module elsewhere.
If you can turn part of the application to a library and provide an API, then site-packages is a good place for it. This is actually how many python applications do it.
But from a user's or administrator's point of view, that isn't actually the problem. The problem is how we can manage the installed stuff: after I have installed it, how can I upgrade and uninstall it?
I use Fedora. If I use the Python that came with it, I don't like installing things to site-packages outside the RPM system. In some cases I have built an RPM myself to install things.
If I build my own Python outside RPM, then I naturally want to use Python's mechanisms to manage it.
A third way is to use something like easy_install to install such things, for example, per-user into the home directory.
So:
Allow packaging for distributions.
Allow selecting the Python to use.
Allow using a Python installed by the distribution where you don't have permissions to site-packages.
Allow using a Python installed outside the distribution where you can use site-packages.