I have a Python package that I would like to distribute. As part of that package I would like to provide a script written in C. Is there a way to include that as a binary executable along with the Python package?
I tried following the example here: Python setup.py call makefile don't include binaries
But when I do:
python setup.py develop
binary_file
binary_file doesn't execute. I can confirm that it's being placed in bin/binary_file within the site-packages package directory.
Is there a way to get setuptools to add that directory to the current PATH, or to link the binary into some folder that is already on PATH, so that binary_file can be called as a script (similar to how scripts are specified using scripts=... in setup.py)?
Related
I have a Python package that comes with a variety of scripts in other languages. Setuptools already supports copying these scripts into the scripts directory of the environment to make them accessible from the command line. For this purpose I can simply use the following keyword in the setup command:
setup(
    ...
    scripts=["bin/script1.bat", "bin/script2.bat"],
    ...
)
After installing the package, the script files will end up correctly in the scripts folder of the environment.
My question:
Is there a way to have these files end up in a subfolder of the scripts directory? Something like scripts/odd_scripts/script1.bat and scripts/even_scripts/script2.bat.
Since they are not Python scripts, you don't need the main feature of scripts (which is rewriting the shebang to point to the same executable as the Python runtime that was used to install the package).
In this case, you can just package them as data_files; the original executable bits and shebangs will be preserved:
from setuptools import setup
setup(
    ...
    data_files=[('bin/odd_scripts', ['bin/script1.bat']),
                ('bin/even_scripts', ['bin/script2.bat'])],
    ...
)
I need to create a requirements.txt for all the scripts in a specific folder (or folders). For your information, the scripts are used by another program and run inside it... I can't use venv because the program uses its own installed Python; I run pip from a *.bat file that puts that Python's path on PATH.
Do you know if a Python package exists that can output a list of the packages used by the scripts grouped in a specific folder (ignoring any other folder) by reverse-engineering the source files?
for example:
pip_local freeze -py-env="c:\program\bin\py27\python.exe" -script-path="c:\program\scripts"
"scripts" contains module script and "local lib class package"
The requirements.txt convention isn't used for bare modules in a directory. There is no way to reference a module by path name in a requirements file. Requirements files are meant to state a package's globally unique name and version number, where "globally unique" is relative to all the repositories you've enabled for pip: PyPI by default, and others via, for example, --find-links.
You could create your modules as installable packages (via setup.py or some other mechanism) and make them available to install from PyPI, a private PyPI-like server (such as Gemfury), or directly from GitHub, Bitbucket, or another source-control host. Those all work with the normal requirements-file mechanism.
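For reference, once the modules are packaged, a requirements file can mix PyPI names with direct URLs; a minimal sketch (the package names, version, and repository URL below are all hypothetical):

```
# ordinary PyPI dependency, pinned to a version
requests==2.31.0
# dependency installed directly from a Git host
mypackage @ git+https://github.com/example/mypackage.git@v1.0
```

pip resolves the first line against the enabled package indexes and the second against the Git repository directly.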
I have several packages in a folder Top; each package contains some Python files that use other packages' modules, so the path is currently set at the command prompt.
I have the following files in the Top directory: setup.py, MANIFEST, MANIFEST.in, README. I wish to modify the setup files so that the path is set during installation. Does PYTHONPATH set it, and does it need to go into a new file?
The appropriate actions here are:
Packages do not mess with PYTHONPATH. Never.
Instead, you write a setup.py entry point for your command-line scripts.
When the user installs the package using pip install, the command-line script is automatically added to the user's PATH, which is hopefully inside a virtualenv.
This command-line script is generated during the install, so that it points to the PYTHONPATH of the virtualenv or system-wide Python installation. The path is hardcoded at the head of the script, pointing to the current Python interpreter.
More info
https://packaging.python.org/en/latest/distributing/
Python has the ability to "pseudo-install" a package by running its setup.py script with develop instead of install. This modifies the Python environment so the package can be imported from its current location (it's not copied into the site-packages directory). This makes it possible to develop packages that are used by other packages: source code is modified in place, and changes are available to the rest of the Python code via a simple import.
All works fine except that the setup.py develop command creates an .egg-info folder with metadata at the same level as setup.py. Mixing source code and temporary files is not a very good idea: this folder needs to be added to the "ignore" lists of multiple tools, from version control systems to backup systems.
Is it possible to use setup.py develop but create the .egg-info directory in some other place, so the original source code is not polluted by a temporary directory and files?
setup.py develop creates a Python egg, in place; it does not modify the Python environment so the package "can be imported from its current location". You still have to either add its location to the Python search path or use the directory it is placed in as the current directory.
It is the job of the develop command to create an in-place egg, which may include compiling C extensions, running the 2to3 Python conversion process to create Python 3 compatible code, and providing metadata that other Python code may rely on. When you install the package as an egg in your site-packages directory, the same metadata is included there as well. The data is certainly not temporary (it is extracted from your setup.py file for easy parsing by other tools).
The intent is that you can then rely on that metadata when using your package in a wider system that relies on the metadata being present, while still developing the package. For example, in a buildout development deployment, we often use mr.developer to automate the process of fetching the source code for a given package when we need to work on it, which builds it as a develop egg and ties it into the deployment while we work on the code.
Note that the .egg-info directory serves a specific purpose: to signal to other tools in the setuptools ecosystem that your package is installed and available. If your package is a dependency of another egg in your setup, that dependency is satisfied; pip, easy_install, and buildout will not try to fetch the egg from somewhere else instead.
Apart from creating the .egg-info directory, the only other thing the command does, is to build extensions, in-place. So the command you are looking for instead is:
setup.py build_ext --inplace
This will do the exact same thing as setup.py develop but leave out the .egg-info directory. It also won't generate the .pth file.
There is no way of generating only the .pth file and leave out the .egg-info directory generation.
Technically speaking, setup.py develop will also check if you have the setuptools site.py file installed to support namespaced packages, but that's not relevant here.
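If relocating (rather than suppressing) the metadata is acceptable, the egg_info command that develop runs internally has an egg_base option, which to my understanding can be set in setup.cfg; note the target directory must already exist, and behavior may vary across setuptools versions, so treat this as a sketch:

```
[egg_info]
egg_base = /tmp
```

With this in place, subsequent setup.py develop runs should write the .egg-info directory under /tmp instead of next to setup.py.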
Good practice is to keep all source files inside a dedicated directory named after your project (programmers using other languages often keep their code inside a src directory). So if your setup.py file is inside the myproject directory, you should keep the source files in myproject/myproject. This method keeps your sources separated from other files regardless of what happens in the main directory.
My suggestion would be to use a whitelist instead of a blacklist: tell the tools to ignore all files except those inside the myproject directory. I think this is the simplest way to avoid touching your ignore lists too often.
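Under that convention, a minimal layout would look like this (the module names are just placeholders):

```
myproject/            <- repository root, what goes in version control
    setup.py
    README
    myproject/        <- actual sources live here
        __init__.py
        module.py
```

Everything generated in the root (such as .egg-info, build/, dist/) then stays cleanly outside the source tree.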
Try the --install-dir option. You may also want to use --build-dir to change the build directory.
After much hassle I built lxml from source. I performed the following steps:
Downloaded lxml.tar.gz and extracted its contents.
Built it using:
python2.7 setup.py build_ext -i -I /usr/include/libxml2 --with-xslt-config=/opt/xslt/bin/xslt-config
I went into the Python shell and tried import lxml; it didn't work.
Then I went into the directory
/home/user/tmp/(extracted lxml directory)/
and at the Linux command prompt I typed
PYTHONPATH=src python27
Then I tried import lxml and it worked.
The src folder contains a folder named lxml.
So I want to know: now that I have built lxml, do I always need that directory to use it, or can I delete it? If not, in which location do I need to put that folder so that I can access it when running Python the normal way?
Are the modules we build ourselves not installed in the Python folder?
Can I make a Python egg from it?
You told it to build_ext, so it just compiled it and didn't install it. If you told it to install, it would install it into a system-wide directory (but you need write permissions for that) or whatever directory you specify (with --home for installing as a user, or --prefix for installing as root to a non-standard directory like under /opt).
When you set PYTHONPATH, you gave it a relative path, so it will only work from that folder. If you specify an absolute path, like:
export PYTHONPATH=/home/user/tmp/extracted_whatever
it will work regardless of the folder you're in now.
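The effect of an absolute PYTHONPATH can be checked with a small sketch; the module name mymod below is made up purely for the demonstration:

```python
import os
import subprocess
import sys
import tempfile

# Create a throwaway directory with a dummy module in it, then launch a
# fresh interpreter with PYTHONPATH set to that absolute path. The child
# process can import the module no matter what its working directory is.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "mymod.py"), "w") as f:
        f.write("VALUE = 42\n")
    env = dict(os.environ, PYTHONPATH=d)
    out = subprocess.check_output(
        [sys.executable, "-c", "import mymod; print(mymod.VALUE)"],
        env=env,
    )
    result = out.decode().strip()
    print(result)  # the import succeeds from any working directory
```

With a relative path like src, the same import would only succeed when the interpreter happens to be started from the directory containing src.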