I do this
C:\WINDOWS\system32>python c:\\python34\kissdownloader\setup.py bdist_wheel
and I get this after some other stuff
creating build\bdist.win-amd64\wheel\kissdownloader-1.dist-info\WHEEL
I go to my Python root and there's nothing there, no build folder or anything. Is the output somewhere else? Oh, and I don't have a MANIFEST.in file, if that makes any difference.
My setup.py file looks like this :
from setuptools import setup
setup(
    name='kissdownloader',
    version='1',
    description='A kisscartoon/kissanime downloader',
    package_dir={'kissdownloader': 'C:\\Python34'},
    author='Vriska',
    author_email='xyz#gmail.com',
    install_requires=['bs4', 'cfscrape', 'requests'],
    package_data={'data': ['C:\Program Files\PhantomJS\phantomjs.exe']},
)
The output is generated in your current working directory. In your case, that's C:\WINDOWS\system32, because that's where you started Python. You'll find a build and dist directory; the latter contains the completed wheel.
If you want the directories to be created next to setup.py, change your directory to c:\python34\kissdownloader\ first. Many a Python project expects you to run setup.py there anyway.
As a side note: I wouldn't bundle the PhantomJS binary in with your project. Instead, require users to install it separately; they may already have it installed anyway, and you may run into legal issues by re-distributing it in your own project without at least a compatible license.
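If it helps, a runtime lookup could replace the bundled binary. This is only a sketch (the executable name and the error message are illustrative, not taken from your project):
import shutil

# Look for a separately installed PhantomJS on the user's PATH
# instead of shipping the .exe inside the package.
phantomjs_path = shutil.which("phantomjs")
if phantomjs_path is None:
    raise RuntimeError("PhantomJS not found on PATH; please install it separately.")
print("Using PhantomJS at:", phantomjs_path)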
I am trying to create a python package (deb & rpm) from cmake, ideally using cpack. I did read
https://cmake.org/cmake/help/latest/cpack_gen/rpm.html and,
https://cmake.org/cmake/help/latest/cpack_gen/deb.html
The installation works just fine (using component install) for my shared library. However, I cannot make sense of the documentation when it comes to installing the Python binding (glue) code. Using the standard cmake install mechanism, I tried:
install(
  FILES __init__.py library.py
  DESTINATION ${ACME_PYTHON_PACKAGE_DIR}/project_name
  COMPONENT python)
And then, using a brute-force approach, I ended up with:
# debian based package (relative path)
set(ACME_PYTHON_PACKAGE_DIR lib/python3/dist-packages)
and
# rpm based package (full path required)
set(ACME_PYTHON_PACKAGE_DIR /var/lang/lib/python3.8/site-packages)
The above is derived from:
debian % python -c 'import site; print(site.getsitepackages())'
['/usr/local/lib/python3.9/dist-packages', '/usr/lib/python3/dist-packages', '/usr/lib/python3.9/dist-packages']
while:
rpm % python -c 'import site; print(site.getsitepackages())'
['/var/lang/lib/python3.8/site-packages']
It is pretty clear that the brute-force approach is not portable and is doomed to fail on the next release of Python. The only solution I can think of is generating a temporary setup.py script (using setuptools) that will do the install. Typically cmake would call the following process:
% python setup.py install --root ${ACME_PYTHON_INSTALL_ROOT}
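(As an aside, a more portable way to ask the interpreter itself where pure-Python packages go, which cmake could capture at configure time instead of hard-coding dist-packages vs site-packages, might be sysconfig. This is just a sketch, and the exact path returned depends on how the distribution patches Python:)
# Query the running interpreter for its pure-Python install directory.
import sysconfig

print(sysconfig.get_path("purelib"))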
My questions are:
Did I understand the cmake/cpack documentation correctly for Python packages? If so, this means I need to generate an intermediate setup.py script.
I have been searching through the cmake/cpack codebase (git grep setuptools) but did not find helper functions that handle generating setup.py and passing the resulting files back to cpack. Is there an existing cmake module I could re-use?
I did read some alternative solutions, such as:
How to build debian package with CPack to execute setup.py?
This seems overly complex and geared toward Debian-only systems; I need to handle RPM in my case.
As mentioned in my other solution, the ugly part is dealing with absolute paths in cmake install() commands. I was able to refactor the code to avoid using absolute paths in install(). I simply changed the installation to:
install(
  # trailing slash is important:
  DIRECTORY ${SETUP_OUTPUT}/
  # "." syntax is a reliable mechanism, see:
  # https://gitlab.kitware.com/cmake/cmake/-/issues/22616
  DESTINATION "."
  COMPONENT python)
And then one simply needs to:
set(CMAKE_INSTALL_PREFIX "/")
set(CPACK_PACKAGING_INSTALL_PREFIX "/")
include(CPack)
At this point all install paths need to explicitly include /usr, since we've cleared the value of CMAKE_INSTALL_PREFIX.
The above has been tested for deb and rpm packages. CPACK_BINARY_TGZ does properly run with the above solution:
https://gitlab.kitware.com/cmake/cmake/-/issues/22925
I am going to post the temporary solution I am using at the moment, until someone provides something more robust.
So I eventually managed to stumble upon:
https://alioth-lists.debian.net/pipermail/libkdtree-devel/2012-October/000366.html and,
Using CMake with setup.py
Re-using the above to do an install step instead of a build step can be done as follows:
find_package(Python COMPONENTS Interpreter)
set(SETUP_PY_IN "${CMAKE_CURRENT_SOURCE_DIR}/setup.py.in")
set(SETUP_PY "${CMAKE_CURRENT_BINARY_DIR}/setup.py")
set(SETUP_DEPS "${CMAKE_CURRENT_SOURCE_DIR}/project_name/__init__.py")
set(SETUP_OUTPUT "${CMAKE_CURRENT_BINARY_DIR}/build-python")
configure_file(${SETUP_PY_IN} ${SETUP_PY})
add_custom_command(
  OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/setup_timestamp
  COMMAND ${Python_EXECUTABLE} ARGS ${SETUP_PY} install --root ${SETUP_OUTPUT}
  COMMAND ${CMAKE_COMMAND} -E touch ${CMAKE_CURRENT_BINARY_DIR}/setup_timestamp
  DEPENDS ${SETUP_DEPS})
add_custom_target(target ALL DEPENDS ${CMAKE_CURRENT_BINARY_DIR}/setup_timestamp)
And then the ugly part is:
install(
  # trailing slash is important:
  DIRECTORY ${SETUP_OUTPUT}/
  DESTINATION "/" # FIXME may cause issues with other cpack generators
  COMPONENT python)
Turns out that the documentation for install() is pretty clear about absolute paths:
https://cmake.org/cmake/help/latest/command/install.html#introduction
DESTINATION
[...]
As absolute paths are not supported by cpack installer generators,
it is preferable to use relative paths throughout.
For reference, here is my setup.py.in:
from setuptools import setup
if __name__ == '__main__':
    setup(name='project_name_python',
          version='${PROJECT_VERSION}',
          package_dir={'': '${CMAKE_CURRENT_SOURCE_DIR}'},
          packages=['project_name'])
You can be fancy and avoid generating __pycache__ folders by using the -B flag:
COMMAND ${Python_EXECUTABLE} ARGS -B ${SETUP_PY} install --root ${SETUP_OUTPUT}
You can be extra fancy and add a Debian-specific option such as:
if(CPACK_BINARY_DEB)
  set(EXTRA_ARG "--install-layout" "deb")
endif()
and use it as:
COMMAND ${Python_EXECUTABLE} ARGS -B ${SETUP_PY} install --root ${SETUP_OUTPUT} ${EXTRA_ARG}
I'm trying to wrap my head around f2py because my organization has a lot of legacy fortran code that I would like to incorporate into some newer python-based tools I'm writing. Ideally, I would package these tools either in source packages or wheels to make it easier to distribute to the rest of the organization.
I've written a small test package based on some other examples I've seen that just sums an array of floats. The package contents are included below. If I build a source distribution tarball using py setup.py sdist, everything looks like it works. It even looks like pip successfully installs it. However, if I open a python shell and try to import the newly installed module, I get an error on the from fastadd import fadd line in the initialization script saying
AttributeError: module 'fastadd' has no attribute 'fastadd'
So it seems like it didn't actually successfully build the f2py module. Doing some troubleshooting, if I open a powershell window in the package folder and just run
py -m numpy.f2py -c fadd.pyf fadd.f90
and then open a python shell in the same folder and try to import fastadd, I get an error, ImportError: DLL load failed: The specified module could not be found. (This is after I installed the Visual Studio build tools, a fix suggested on several threads). Following the advice on this thread, changing the command to
py -m numpy.f2py -c --fcompiler=gnu95 --compiler=mingw32 fadd.pyf fadd.f90
will build a module file that I can successfully import and use. Okay, great.
However, when I change config.add_extension in the setup file to include the keyword argument f2py_options=["--fcompiler=gnu95","--compiler=mingw32"] and try to build a package distribution file with the setup.py sdist command and then install it using py -m pip install fastadd-1.0a1.tar.gz, I get yet another error that says
ERROR: No .egg-info directory found in C:\Users\username\AppData\Local\Temp\pip-pip-egg-info-c7406k03
And now I'm completely flummoxed. Other configurations of the f2py_options either result in setup.py throwing an error or fail to create the extension altogether, similar to above. Using a simple string for the options gives an error, so apparently f2py_options does in fact expect a list input. I can't seem to find any good documentation on whether I'm using f2py_options correctly, and I have no idea why just adding that option would cause pip to not know where its info directory is. That makes no sense to me. I'd really appreciate some help on this one.
I'm running Python 3.7.0 32-bit, numpy 1.20.1, and pip 21.0.1 on a Windows 10 machine.
--EDIT--
Looking in the installation directory of the test module, I found a new wrinkle to this problem: the installation directory does not actually include any files listed in MANIFEST, not even the __init__.py file. If I copy __init__.py into the directory, trying to import the module gives the same ImportError: DLL load failed error I've been getting.
Also, inspecting the output of py -m pip install, it looks like numpy.distutils doesn't recognize --fcompiler or --compiler as valid options and just ignores them, even though numpy.f2py does recognize them.
--END EDIT--
PACKAGE CONTENTS:
fastadd/
    __init__.py
    fadd.f90
    fadd.pyf
MANIFEST.in
README
setup.py
fadd.f90 has the following contents:
subroutine fadd(vals,n,mysum)
    integer, intent(in) :: n
    real*8, intent(out) :: mysum
    real*8, dimension(n), intent(in) :: vals
    mysum = sum(vals)
end subroutine fadd
fadd.pyf has the following contents:
python module fastadd ! in
    interface ! in :fastadd
        subroutine fadd(vals,n,mysum) ! in :fastadd:fadd.f90
            real*8 dimension(n),intent(in) :: vals
            integer, optional,intent(in),check(len(vals)>=n),depend(vals) :: n=len(vals)
            real*8 intent(out) :: mysum
        end subroutine fadd
    end interface
end python module fastadd
__init__.py:
"""This is the documentation!"""
from .fastadd import fadd
MANIFEST.in:
include README
recursive-include fastadd *.f90
recursive-include fastadd *.pyf
recursive-include fastadd *.py
and, finally, setup.py:
def configuration(pth=None):
    from numpy.distutils.misc_util import Configuration
    config = Configuration(
        'fastadd',
        top_path=pth,
        version='1.0a1',
        author='John Doe',
        author_email='john.doe#fake-org.biz',
        url='fake-org.biz/fastadd',
        description="Testing f2py build process. Sums an arbitrary-length list of numbers.")
    config.add_extension(
        'fastadd',
        sources=['fastadd\\fadd.pyf', 'fastadd\\fadd.f90']
    )
    return config


if __name__ == '__main__':
    from numpy.distutils.core import setup
    setup(**configuration('fastadd').todict())
If it helps at all, the final MANIFEST file looks like this after the setup script is run:
# file GENERATED by distutils, do NOT edit
README
setup.py
C:\Users\username\Documents\Development\python_modules\fastadd\fastadd\fadd.f90
C:\Users\username\Documents\Development\python_modules\fastadd\fastadd\fadd.pyf
fastadd\__init__.py
fastadd\fadd.f90
fastadd\fadd.pyf
I have a build process that creates a Python wheel using the following command:
python setup.py bdist_wheel
The build process can be run on many platforms (Windows, Linux, py2, py3 etc.) and I'd like to keep the default output names (e.g. mapscript-7.2-cp27-cp27m-win_amd64.whl) to upload to PyPI.
Is there any way to get the generated wheel's filename (e.g. mapscript-7.2-cp27-cp27m-win_amd64.whl) and save it to a variable so I can install the wheel later on in the script for testing?
Ideally the solution would be cross-platform. My current approach is to clear the folder, list all files, and select the first (and only) file in the list; however, this seems like a very hacky solution.
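For reference, the hacky fallback I describe amounts to something like this (a sketch that assumes dist/ was emptied before the build and now contains exactly one wheel):
from pathlib import Path

# Grab the single wheel that bdist_wheel just wrote into dist/.
wheels = list(Path("dist").glob("*.whl"))
assert len(wheels) == 1, wheels
wheel_path = wheels[0]
print(wheel_path.name)  # e.g. mapscript-7.2-cp27-cp27m-win_amd64.whl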
setuptools
If you are using a setup.py script to build the wheel distribution, you can use the bdist_wheel command to query the wheel file name. The drawback of this method is that it uses bdist_wheel's private API, so the code may break on wheel package update if the authors decide to change it.
from setuptools.dist import Distribution


def wheel_name(**kwargs):
    # create a fake distribution from arguments
    dist = Distribution(attrs=kwargs)
    # finalize bdist_wheel command
    bdist_wheel_cmd = dist.get_command_obj('bdist_wheel')
    bdist_wheel_cmd.ensure_finalized()
    # assemble wheel file name
    distname = bdist_wheel_cmd.wheel_dist_name
    tag = '-'.join(bdist_wheel_cmd.get_tag())
    return f'{distname}-{tag}.whl'
The wheel_name function accepts the same arguments you pass to the setup() function. Example usage:
>>> wheel_name(name="mydist", version="1.2.3")
mydist-1.2.3-py3-none-any.whl
>>> wheel_name(name="mydist", version="1.2.3", ext_modules=[Extension("mylib", ["mysrc.pyx", "native.c"])])
mydist-1.2.3-cp36-cp36m-linux_x86_64.whl
Notice that the source files for native libs (mysrc.pyx or native.c in the above example) don't have to exist to assemble the wheel name. This is helpful in case the sources for the native lib don't exist yet (e.g. you are generating them later via SWIG, Cython or whatever).
This makes the wheel_name easily reusable in the setup.py script where you define the distribution metadata:
# setup.py
from setuptools import setup, find_packages, Extension
from setup_helpers import wheel_name
setup_kwargs = dict(
    name='mydist',
    version='1.2.3',
    packages=find_packages(),
    ext_modules=[Extension(...), ...],
    ...
)
file = wheel_name(**setup_kwargs)
...
setup(**setup_kwargs)
If you want to use it outside of the setup script, you have to organize access to the setup() args yourself (e.g. by reading them from a setup.cfg file or whatever).
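For instance, a rough sketch of that (assuming the metadata sits in a [metadata] section of setup.cfg with name and version keys) could be:
import configparser

# Feed metadata from setup.cfg to the wheel_name() helper defined above.
config = configparser.ConfigParser()
config.read("setup.cfg")
metadata = config["metadata"]
print(wheel_name(name=metadata["name"], version=metadata["version"]))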
This part is loosely based on my other answer to setuptools, know in advance the wheel filename of a native library
poetry
Things can be simplified a lot (it's practically a one-liner) if you use poetry because all the relevant metadata is stored in the pyproject.toml. Again, this uses an undocumented API:
from clikit.io import NullIO
from poetry.factory import Factory
from poetry.masonry.builders.wheel import WheelBuilder
from poetry.utils.env import NullEnv
def wheel_name(rootdir='.'):
    builder = WheelBuilder(Factory().create_poetry(rootdir), NullEnv(), NullIO())
    return builder.wheel_filename
The rootdir argument is the directory containing your pyproject.toml file.
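Example usage (assuming you run this from the project root, next to pyproject.toml):
# wheel_name() as defined above; prints something like mydist-1.2.3-py3-none-any.whl
print(wheel_name('.'))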
flit
AFAIK flit can't build wheels with native extensions, so it can give you only the purelib name. Nevertheless, it may be useful if your project uses flit for distribution building. Notice this also uses an undocumented API:
from flit_core.wheel import WheelBuilder
from io import BytesIO
from pathlib import Path
def wheel_name(rootdir='.'):
    config = str(Path(rootdir, 'pyproject.toml'))
    builder = WheelBuilder.from_ini_path(config, BytesIO())
    return builder.wheel_filename
Implementing your own solution
I'm not sure whether it's worth it. Still, if you want to choose this path, consider using packaging.tags before you find some old deprecated stuff or even decide to query the platform yourself. You will still have to fall back to private stuff to assemble the correct wheel name, though.
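For example, a minimal sketch with packaging.tags (the distribution name and version here are made up, and normalizing them is still up to you):
from packaging.tags import sys_tags

# The first tag yielded by sys_tags() is the most specific one for the
# running interpreter, e.g. cp36-cp36m-linux_x86_64.
tag = next(iter(sys_tags()))
print(f"mydist-1.2.3-{tag.interpreter}-{tag.abi}-{tag.platform}.whl")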
My current approach to install the wheel is to point pip to the folder containing the wheel and let it search itself:
python -m pip install --no-index --find-links=build/dist mapscript
twine also can be pointed directly at a folder without needing to know the exact wheel name.
I used a modified version of hoefling's solution. My goal was to copy the build to a "latest" wheel file. The setup() function returns an object with all the info you need, so you can find out what it actually built, which seems simpler than the solution above. Assuming you have a version variable in use, the following gets the file name of the wheel just built and then copies it.
import shutil
import setuptools

setup = setuptools.setup(
    # whatever options you currently have
)

wheel_built = 'dist/{}-{}.whl'.format(
    setup.command_obj['bdist_wheel'].wheel_dist_name,
    '-'.join(setup.command_obj['bdist_wheel'].get_tag()))
wheel_latest = wheel_built.replace(version, 'latest')
shutil.copy(wheel_built, wheel_latest)
print('Copied {} >> {}'.format(wheel_built, wheel_latest))
I guess one possible drawback is you have to actually do the build to get the name, but since that was part of my workflow, I was ok with that. hoefling's solution has the benefit of letting you plan the name without doing the build, but it seems more complex.
For various not-very-good-but-unfortunately-necessary reasons I'm using a setup.py file to manage some binary assets.
During py setup.py build or install I would like to create a .py file in the "normal" Python package being installed by setup.py which contains some details about these binary assets (their absolute path, version information, etc).
What's the best way to create that file?
For example, I'd like it to work something like this:
$ cd my-python-package
$ py setup.py install
...
Installing version 1.23 of my_binary_assets to /some/path...
...
$ python -c "from my_python_package import binary_asset_version_info as info; print info"
{"path": "/some/path", "version": "1.23"}
(note: I'm using the cmdclass argument to setup(…) to manage the building + installation of the binary assets… I'd just like to know how to create the binary_asset_version_info.py file used in the example)
At first sight, there is a catch-22 in your requirements: The most obvious place to create this .py file would be in the build or build_py command (to get usual distutils operations like byte-compilation), but you want that file to contain the paths to the installed assets, so you’d have to create it during the install step. I see two ways to solve that:
a) Create your info.py file during build_py, and use distutils machinery to get the installation paths of the assets files
b) Create info.py during install and call distutils.util.byte_compile to byte-compile it
I find both ideas distasteful, but well :) Now, do you know how to fill in the file (i.e. get the install paths from distutils)?
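For what it's worth, a rough sketch of option (a) with setuptools could look like the following; the package name, asset path and version are placeholders, since wiring in the real install paths is exactly the open question:
import os
from setuptools import setup
from setuptools.command.build_py import build_py


class build_py_with_info(build_py):
    def run(self):
        build_py.run(self)
        # Write the generated module next to the package's other .py files
        # in the build tree so it gets installed along with them.
        target = os.path.join(self.build_lib, "my_python_package",
                              "binary_asset_version_info.py")
        with open(target, "w") as f:
            f.write('info = {"path": "/some/path", "version": "1.23"}\n')


setup(
    name="my-python-package",
    packages=["my_python_package"],
    cmdclass={"build_py": build_py_with_info},
)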
I have a very simple setup:
from distutils.core import setup
setup(name='myscripts',
      description='my scripts',
      author='Ago',
      author_email='blah',
      version='0.1',
      packages=['myscripts']
      )
The myscripts folder consists of about 10 Python files. Everything works fine if I just execute my main.py file (an executable which uses those myscripts files). Now I try to do:
python setup.py sdist
But I get:
running sdist
warning: sdist: missing required meta-data: url
reading manifest file 'MANIFEST'
creating myscripts-0.1
making hard links in myscripts-0.1...
'file1.py' not a regular file -- skipping
hard linking setup.py -> myscripts-0.1
'file2.py' not a regular file -- skipping
tar -cf dist/myscripts-0.1.tar myscripts-0.1
gzip -f9 dist/myscripts-0.1.tar
removing 'myscripts-0.1' (and everything under it)
The files file1.py and file2.py are just as regular as the other files. Any suggestions?
(Already worked, reposting as a proper answer):
Try removing the "MANIFEST" file and re-running it. If you've moved files around, MANIFEST can be wrong (it gets regenerated automatically if it's not there).
NOTE: I am new to setup.py, sdist, etc. and am working through exercise 46 in "Learn Python the Hard Way", so I don't yet know what I'm doing :) http://learnpythonthehardway.org/
I found this question because I was receiving the same error when trying to include a script. For whatever reason I don't have a "MANIFEST" file (that I can find); perhaps I'm using a different distutils version? I used pip to install "distribute".
The solution for me was to include the ".py" extension with the script name, as in:
...
'scripts': ['bin/testscript.py'],
...
While following http://docs.python.org/distutils/setupscript.html#installing-scripts it seemed like I shouldn't include the extension. So, I'm not sure what's up here, but it works for me as of now and the "not a regular file -- skipped" error went away.
This solved my problem. You can find my newbie code at: https://github.com/stevekochscience/Test-python-package-with-script-LPTHW-EX46
The README file explains what I did to test the package along with test script. Hope this helps other newbies who stumble across this question!
In my case this error was caused by inadvertently running distutils with Python 2.7 instead of Python 3. The quick fix:
python3 setup.py register sdist upload
Better still, mark the script correctly:
sed -i '1i #!/usr/bin/python3' setup.py