I'm uploading my package to PyPI with this command:
python setup.py sdist upload
This command generates some files and folders. Is there any option to delete these files after the upload?
The sdist command calls the build command, which by default puts files in a build subdirectory. You probably want to keep that around (i.e. just ignore it) to speed up future builds.
sdist then puts the distribution files in a dist subdirectory by default. python setup.py sdist -d $TMP (or %TEMP% on Windows) can be used to put the files in a temporary directory instead, so that they’re wiped out at the next boot.
If you really care about the build dir, try this: python setup.py build -b $TMP sdist -d $TMP. sdist should be clever enough to find the files created by build.
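For example, on Linux both steps can be pointed at temporary directories in a single invocation, since distutils accepts per-command options on one command line (the paths below are illustrative):
python setup.py build -b /tmp/mypkg-build sdist -d /tmp/mypkg-dist upload
The build and dist artifacts then live under /tmp and disappear whenever it is cleaned.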
distutils docs: http://docs.python.org/distutils
command help: python setup.py build --help
Log into PyPI, and click on your package in the gray box in the upper right corner of the screen. Click "files" in the list to the right of the appropriate version. That will load a new page. Check off the files you want to delete and click the gray "Remove" button.
I am attempting to reupload everything in my local site-packages folder into a Docker container running pypi-server. This is ultimately so I can download them again into some container using pip instead of some internal tooling that is not ideal for Python.
The content of my site-packages is mostly items like:
package_a
package_a.egg-info
package_b
package_b.egg-info
Is there a way to convert or simply copy these into the pypi-server? I think I'm missing setup.py and setup.cfg, and was curious whether there's a way to recover those from what's in site-packages?
Docker image: https://hub.docker.com/r/pypiserver/pypiserver
I grabbed my project (which contains a setup.py) and used python setup.py sdist and twine upload --repository-url <PATH> dist/* per the Docker instructions. That puts a tar.gz into the pypi server. A great first step that shows it is working, but it doesn't help me pull things out of my site-packages.
I tried just copying site-packages content over. This fails because it's not a tar.gz.
I've tried to make a tar.gz by putting package_a under package_a.egg-info/src and making that a tar.gz archive, which I cp over to the volume I've mounted for pypi-server to check for packages. This approximates the structure of the tar.gz I made with sdist. However, I get an error about the package missing setup.py or pyproject.toml.
I can confirm that neither setup.py nor pyproject.toml is present in site-packages.
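For reference, an sdist that the server accepts is laid out roughly like this (names illustrative):

package_a-1.0.0/
    setup.py
    PKG-INFO
    package_a/
        __init__.py

A minimal setup.py to rebuild such an archive from the site-packages content might look like the sketch below; the name and version are hypothetical here and would have to be copied from the metadata in package_a.egg-info/PKG-INFO:

from setuptools import setup, find_packages

# Minimal sketch: name and version are hypothetical and should be
# taken from package_a.egg-info/PKG-INFO
setup(
    name="package_a",
    version="1.0.0",
    packages=find_packages(),
)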
I'm new to packaging in Python. I've tried specifying my non-Python files in setup.py's 'scripts' argument, and also listing the files in MANIFEST.in; however, after I package everything using python setup.py build sdist and install it using pip, only the files with a .py extension make it to the site-packages/my_package directory.
Am I missing something?
15 minutes later I found the answer: sdist only includes the *.py files. I just changed the command to use bdist_wheel and all the files I needed were included.
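For anyone hitting the same wall, the usual way to get data files installed is to declare them in setup.py rather than in the 'scripts' argument; a minimal sketch, with the package name and file patterns as placeholders:

from setuptools import setup, find_packages

setup(
    name="my_package",
    version="0.1.0",
    packages=find_packages(),
    # Install non-.py files that live inside the package directory
    package_data={"my_package": ["*.dat", "*.xml"]},
    # Also pick up files listed in MANIFEST.in
    include_package_data=True,
)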
I use CMake to create build scripts (Makefiles + VS solutions) for my projects. As a best practice I create the build scripts in a separate folder (out of source). I build the projects in the same folder.
This works fine for compiled programs, but I can't find an adequate solution for my Python scripts, as these have no build step that would copy them to the build folder.
Looking for creative solutions....
Requirements:
All executables should be available in the build folder post build (I consider *.py files to be executable)
Python scripts should be easily managed using an IDE (spyder, eclipse, etc)
Source folder with python scripts is in Git repository. Build folder is not.
C++ compiled python modules should reside next to relevant python scripts
So far I considered two options:
Copy scripts to build folder when running CMake - Need to run CMake for every change in python files (IDE unfriendly). Can cause confusion: which copy of the sources to edit?
Create links to source folder in build folder - Multi platform mess. Problem deploying compiled c++ python modules next to the scripts without polluting source folder.
I hope this is clear enough.
Eventually I found a solution that involves creating symbolic links to the Python sources and other related files that are not compiled but are necessary in the build environment. To allow mixing built modules with the symlinked sources, the directories themselves are created as real folders and only the files inside them are linked.
This way:
There is one copy of the python scripts
It can be run/edited seamlessly from the binary folder
Utility function:
function(create_symlinks)
    # Do nothing if building in-source
    if("${CMAKE_CURRENT_BINARY_DIR}" STREQUAL "${CMAKE_CURRENT_SOURCE_DIR}")
        return()
    endif()

    foreach(path_file ${ARGN})
        get_filename_component(folder "${path_file}" PATH)
        # Create a REAL folder (not a link), so built modules can live alongside the links
        file(MAKE_DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/${folder}")
        # Delete the symlink if it already exists
        file(REMOVE "${CMAKE_CURRENT_BINARY_DIR}/${path_file}")
        # Get OS-dependent paths to use in `execute_process`
        file(TO_NATIVE_PATH "${CMAKE_CURRENT_BINARY_DIR}/${path_file}" link)
        file(TO_NATIVE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/${path_file}" target)
        if(UNIX)
            set(command ln -s "${target}" "${link}")
        else()
            set(command cmd.exe /c mklink "${link}" "${target}")
        endif()
        execute_process(COMMAND ${command}
                        RESULT_VARIABLE result
                        ERROR_VARIABLE output)
        if(NOT result EQUAL 0)
            message(FATAL_ERROR "Could not create symbolic link for: ${target} --> ${output}")
        endif()
    endforeach(path_file)
endfunction(create_symlinks)
Usage for a python module (inside CMakeLists.txt):
# Do not omit !!!RELATIVE!!!
file(GLOB_RECURSE files RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} *.py *.dat *.xml)
create_symlinks(${files})
Usage:
cd src_dir
mkdir build_dir
cd build_dir
cmake ..
IMPORTANT:
When adding new files, don't forget to re-run cmake
On Windows, mklink is supported only on some Windows versions
On Windows, mklink can be run only as Administrator. A workaround can be found here.
Use relative paths when calling create_symlinks, as this is how the directory structure is reconstructed in the binary folder.
I would compile the Python scripts into the build folder by creating a custom command, as shown in this post.
Python has the ability to "pseudo-install" a package by running its setup.py script with develop instead of install. This modifies the Python environment so the package can be imported from its current location (it is not copied into the site-packages directory). This allows you to develop packages that are used by other packages: the source code is modified in place and the changes are available to the rest of the Python code via a simple import.
Everything works fine, except that the setup.py develop command creates an .egg-info folder with metadata at the same level as setup.py. Mixing source code and temporary files is not a very good idea: this folder needs to be added to the "ignore" lists of multiple tools, from the VCS all the way to backup systems.
Is it possible to use setup.py develop but create the .egg-info directory somewhere else, so the original source code is not polluted by a temporary directory and files?
setup.py develop creates a Python egg, in-place; it does not modify the Python environment so that the package can be imported from its current location. You still have to either add its location to the Python search path or use the directory it is placed in as the current directory.
It is the job of the develop command to create an in-place egg, which may include compiling C extensions, running the 2to3 conversion process to create Python 3 compatible code, and providing metadata other Python code may be relying on. When you install the package as an egg in your site-packages directory, the same metadata is included there as well. The data is certainly not temporary (it is extracted from your setup.py file for easy parsing by other tools).
The intent is that you can then rely on that metadata when using your package in a wider system that relies on the metadata being present, while still developing the package. For example, in a buildout development deployment, we often use mr.developer to automate the process of fetching the source code for a given package when we need to work on it, which builds it as a develop egg and ties it into the deployment while we work on the code.
Note that the .egg-info directory serves a specific purpose: to signal to other tools in the setuptools ecosystem that your package is installed and available. If your package is a dependency of another egg in your setup, that dependency is satisfied; pip, easy_install, and buildout will not try to fetch the egg from somewhere else instead.
Apart from creating the .egg-info directory, the only other thing the command does, is to build extensions, in-place. So the command you are looking for instead is:
setup.py build_ext --inplace
This will do the exact same thing as setup.py develop but leave out the .egg-info directory. It also won't generate the .pth file.
There is no way of generating only the .pth file and leave out the .egg-info directory generation.
Technically speaking, setup.py develop will also check if you have the setuptools site.py file installed to support namespaced packages, but that's not relevant here.
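For context, here is a minimal sketch of a setup.py with a C extension that build_ext --inplace would act on; the package, module, and file names are hypothetical:

from setuptools import setup, Extension

setup(
    name="mypackage",
    version="0.1.0",
    # build_ext --inplace compiles this next to the Python sources,
    # without writing any .egg-info metadata
    ext_modules=[Extension("mypackage._speedups",
                           sources=["mypackage/_speedups.c"])],
)

Running python setup.py build_ext --inplace here drops mypackage/_speedups.so (or .pyd on Windows) next to the .py files.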
Good practice is to keep all source files inside a special directory whose name is your project name (programmers using other languages keep their code inside a src directory). So if your setup.py file is inside the myproject directory, you should keep the source files at myproject/myproject. This method keeps your sources separated from other files regardless of what happens in the main directory.
My suggestion would be to use a whitelist instead of a blacklist: tell the tools to ignore all files except those inside the myproject directory (see the layout sketch below). I think this is the simplest way to avoid touching your ignore lists too often.
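For example, a layout along these lines (names illustrative):

myproject/
    setup.py
    MANIFEST.in
    myproject/
        __init__.py
        core.py

Only the inner myproject directory then has to stay clean; anything generated at the top level, .egg-info included, can be blanket-ignored.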
Try the --install-dir option. You may also want to use --build-dir to change the build directory.
I'm trying to create a deb package from a distribution tarball. It has a setup.py file.
My actions are:
python setup.py --command-packages=stdeb.command sdist_dsc
cd deb_dist/<pkgname>
debuild -uc -us -i -b
Everything works fine. But when I do
dpkg -i <pkgname>.deb
all of the package module's files are installed into the /usr/share/pyshared/<pkgname> directory, and I want to change that.
Is it possible? How?
Thanks.
That's the right directory for the installation of Python system libraries, according to the Debian Python Policy. The generated deb source ought to be arranging for those files to be symlinked into the appropriate /usr/lib/python2.*/dist-packages directories, based on which Python versions are installed. That would normally be taken care of by the dh_python2 tool during the package build; it should put calls to update-python-modules in the generated postinst.
That behavior can be changed, but the right way to change it depends on the reason you want to change it. What part of this process isn't working for you?
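As a first diagnostic step, dpkg can show exactly which files the built package ships:
dpkg -L <pkgname>
Looking inside /usr/lib/python2.*/dist-packages after installation then shows whether the install-time symlinking actually happened.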