I am trying to create a Repl.it for my Python project, and when I run it, it fails because it cannot find the [tool.poetry] section. And yes, my project has a pyproject.toml file.
Repl.it: Updating package configuration
--> /usr/local/bin/python3 -m poetry add halo vistir distlib click packaging tomlkit pip-shims pythonfinder python-cfonts appdirs
[RuntimeError]
[tool.poetry] section not found in pyproject.toml
add [-D|--dev] [--git GIT] [--path PATH] [-E|--extras EXTRAS] [--optional] [--python PYTHON] [--platform PLATFORM] [--allow-prereleases] [--dry-run] [--] <name> (<name>)...
exit status 1
Repl.it: Package operation failed.
The question is: how can I see what is happening in the initialization stage, how does it decide which dependencies to install, and how can I change that behavior? You can use this repo for reproduction: github/frostming/pdm.
After importing the project, you can specify the run button's behaviour with a bash command:
This will be saved to .replit. You can write things like pip3 install -r requirements.txt && python3 main.py. Read more about available settings in the .replit docs.
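For example, a .replit file that wires the run button to that command could look like this (main.py is a hypothetical entry file):

```toml
run = "pip3 install -r requirements.txt && python3 main.py"
```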
There is also another doc about dependencies with the following quote:
In a pyproject.toml file, you list your packages along with other
details about your project. For example, consider the following
snippet from pyproject.toml:
...
[tool.poetry.dependencies]
python = "^3.8"
flask = "^1.1"
...
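Since the error above complains specifically about a missing [tool.poetry] table, a minimal one that satisfies Poetry could look like this (name, version, and authors are placeholders):

```toml
[tool.poetry]
name = "my-project"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.8"
```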
Related
I'm trying to add a pyproject.toml to a project that's been using setup.py in order to enable support by pipx. I'd like to specify the command line scripts the project includes in pyproject.toml, but all the guides I can find give instructions for use with poetry, which I am not using.
I also don't want to specify entry points to modules - I already have working command line scripts and just want to specify those.
Is there a proper place in pyproject.toml to specify command line scripts?
Not sure it matters, but the package in question is cutlet.
Update July 2022: if your TOML file uses setuptools as its build system, setuptools will happily create and install a console script. For example, my pyproject.toml file starts like this:
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"
Extend your pyproject.toml file with an entry like this, naming the package, module and entry-point function names:
[project.scripts]
my-client = "my_package.my_module:main_cli"
Then run the install sequence:
pip3 install .
And setuptools will install a shell script named my-client somewhere appropriate; for me that was my Python 3.9 virtual environment's bin directory (~/.virtualenvs/py39/bin).
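For context, the entry point above just names an ordinary function; a hypothetical my_package/my_module.py could be as small as:

```python
# my_package/my_module.py -- hypothetical module backing the `my-client` script.
import sys

def main_cli():
    """Called by the generated `my-client` wrapper; the return value
    becomes the process exit code."""
    args = sys.argv[1:]
    print(f"my-client received {len(args)} argument(s)")
    return 0
```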
I was doing a similar thing, upgrading a package that had a setup.py, although I had no existing scripts. With the rewrite to pyproject.toml I dropped the old setup.py file entirely.
FWIW I realize the OP asked for a way to install existing scripts, which I didn't provide. This answer tells setuptools to create and install new scripts.
Update Feb 2023: thanks for all the votes. If you're cutting corners to meet arbitrary management deadlines, just copy-paste this short pyproject.toml file and adjust:
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"
[project]
name = "my_client"
version = "1.2.3"
authors = [{name="Ubr Programmer", email="ubr@gmailington.com"}]
description = "Client for my awesome system"
readme = "README.md"
dependencies = ["cachetools","requests"]
requires-python = ">=3.9"
[project.scripts]
my-client = "my_package.my_module:main_cli"
[project.urls]
"Homepage" = "https://github.com/your_name_here/something"
"Bug Tracker" = "https://github.com/your_name_here/something/issues"
Is there a proper place in pyproject.toml to specify command line scripts?
PEP 566 (Metadata 2.1) only defines the core metadata specifications. Thus, the answer depends on your build system (note: PEP 518 defines the build-system concept).
If you use an existing build tool such as setuptools, poetry, or flit, you can only add such options to pyproject.toml if that tool supports command line scripts (console_scripts) in pyproject.toml. If you have your own build tool, you will of course need to implement a parser for the command line scripts in pyproject.toml.
Lastly, you can check the list below to see which major build systems support command line scripts (console_scripts) in pyproject.toml (as of October 2020):
setuptools: not implemented yet according to PEP621
poetry: yes, here's part of the implementation.
flit: yes.
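For reference, Poetry declares console scripts under its own table rather than [project.scripts]; the equivalent of the setuptools example above would be (names hypothetical):

```toml
[tool.poetry.scripts]
my-client = "my_package.my_module:main_cli"
```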
I've got a repo on GitHub for which I wanted to set up Travis CI with pytest for basic testing. Currently the Travis CI build fails when loading data tables within a module, raising a FileNotFoundError.
To keep it short, here is imho the most important information on the build:
directory of the data tables is included in MANIFEST.in with include mypkg/data_tables/* (see below for a detailed structure)
setuptools.setup method has the include_package_data=True parameter
additionally packages=setuptools.find_packages() is provided
Travis CI installs the package with install: pip install -e .
Travis CI pytest is invoked with script: pytest --import-mode=importlib
during testing the first tests succeed. But when it comes to loading the data tables, pytest raises the error FileNotFoundError: [Errno 2] No such file or directory: '/home/travis/build/myname/mypkg/mypkg/data_tables\\my_data.csv'
Interestingly, the slashes before the file name are backslashes, while the others are not, even though the final path is constructed with os.path.abspath().
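The mixed separators are a clue: os.path.abspath() normalizes the string it receives, but it cannot repair a separator concatenated onto the result afterwards. A small sketch of the difference (file names borrowed from the error message):

```python
import os

# A hard-coded Windows separator survives on Linux, where it is just a
# character in the file name -- hence the FileNotFoundError:
broken = os.path.abspath("mypkg/data_tables") + "\\my_data.csv"

# Letting os.path.join choose the separator keeps the path portable:
portable = os.path.join(os.path.abspath("mypkg/data_tables"), "my_data.csv")

print(broken)
print(portable)
```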
Detailed description
Unfortunately the repo is private and I'm not allowed to share it, so I'll describe the package layout as precisely as possible. Let's say my repo has a structure like this (general layout taken from this example):
setup.py
MANIFEST.in
mypkg/
    some_data_tables/
        my_data.csv
        my_other_data.pkl
    __init__.py
    view.py
tests/
    test_view.py
My minimum MANIFEST.in looks like this:
include mypkg/data_tables/*
With the setup.py fully reduced to a minimum working example like this:
from setuptools import find_packages, setup

setup(
    name='Mypkg',
    version='123.456',
    description='some_text',
    python_requires='>=3.7.7',
    packages=find_packages(  # <---- this should be sufficient, right?
        exclude=["tests", "*.tests", "*.tests.*", "tests.*"]),
    include_package_data=True,  # <---- also this should work
)
And the .travis.yml file (omitting - pip install -r requirements.txt etc.):
language: python
python:
  - "3.7.7"
dist: xenial
install:
  - pip install -e .
script:
  - pytest --import-mode=importlib
Checking the content of the .egg or tar.gz files, the data tables are included. So I have no idea where the files are "getting lost".
Any idea how to solve this error?
If more information would help, e.g. on the class initialized in test_view, please tell me.
I wrote a command-line app in Python.
The problem is that I want users to be able to run the command globally after installing it.
I wrote the command-line tool and published the package, but I don't know how to make it globally available to users as a system command.
Example :
pip install forosi
and after that the user can run this command globally from anywhere, like:
forosi help
I'm going to assume you have the main file you are supposed to run in src/forosi.py in your package directory, but you should be able to adapt this if it's different.
First, you want to rename the script to forosi, without the .py extension.
Second, at the top of the file (now called forosi) add the following:
#!/usr/bin/env python3
... rest of file...
In your setup.py for the package, you need to use the scripts option.
setuptools.setup(
    ...
    scripts=['src/forosi'],
    ...
)
This is the method that requires minimal refactoring of your code. If you happen to have a main() function in one of your Python files as the entry point of the script, you can add the following to your setup.py instead of the above:
setup(
    ...
    entry_points={
        'console_scripts': ['forosi = src.forosi:main'],
    },
    ...
)
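For that route to work, src/forosi.py needs an importable main() function; a minimal hypothetical sketch:

```python
# src/forosi.py -- hypothetical entry-point module for the `forosi` script.
import sys

def main():
    """Entry point wired up via console_scripts; the return value
    becomes the process exit code."""
    args = sys.argv[1:]
    if args and args[0] == "help":
        print("usage: forosi <command>")
        return 0
    print("forosi: no command given" if not args else f"forosi: unknown command {args[0]!r}")
    return 1
```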
In either case, to build the package locally, run
python3 setup.py bdist_wheel
This will create a wheel file in the dist/ directory named package_name-version-<info>.whl. This is the standard distribution format for PyPI packages.
To install this package, run:
pip3 install dist/package_name-version-<info>.whl
or if you only have one version in the dist folder, just
pip3 install dist/*
I created a Python package with the layout below. The package is primarily used to run stages of a Jenkins pipeline inside a Docker container. I created a repository on GitHub, wrote a Dockerfile with a step that clones the repository and runs pip install on the package, and then built the Docker image.
jenkins_pipeline_pkg/
|- jenkins_pipeline_pkg/
|  |- __init__.py
|  |- config/
|  |  |- config.yaml
|  |- scripts/
|  |  |- pre_build.py
|  |  |- build.py
|- setup.py
I ran pip install on the package inside the Docker container built from that Dockerfile. The setup.py looks like the following.
#!/usr/bin/env python

from setuptools import setup

setup(name='jenkins_pipeline_pkg',
      version='0.1',
      description='Scripts for jenkins pipeline',
      url='<private repo url>',
      author='<name>',
      author_email='<email>',
      packages=['jenkins_pipeline_pkg'],
      zip_safe=False,
      entry_points={
          'console_scripts': [
              'pre-build = jenkins_pipeline_pkg.pre_build:main',
              'build = jenkins_pipeline_pkg.build:main',
          ],
      })
I ran pip install on the package. It installed the executables mentioned in entry_points into ~/.local/bin. Then I tried to run the executable from somewhere else, without changing into ~/.local/bin (say, from /home/user), and it isn't found. Bash autocomplete also doesn't show the pre-build command. I don't know what I'm missing here.
Try either creating a link to the executable in /usr/bin or including ~/.local/bin in $PATH.
Edit:
export PATH=~/.local/bin:$PATH
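To make this survive beyond the current shell session (assuming bash), append the export to your shell profile as well:

```shell
# Prepend the user-level install directory to PATH for this session.
export PATH="$HOME/.local/bin:$PATH"

# Persist it for future login shells (bash assumed).
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
```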
Travis CI started behaving oddly after I tried to integrate coverage. Before adding coverage the build was fine and all tests passed. Now it suddenly cannot locate the test file.
Here is the .travis.yml file:
# language to use for app
language: python
python:
  - "3.6"
script:
  - virtualEnv/run_travis.sh
# whitelist
branches:
  only:
    - master
    - flask_dev_branch
# dependencies and libraries to install
install: pip install -r virtualEnv/requirements.txt
after_success:
  - coveralls
And the file run_travis.sh
#!/usr/bin/env bash
python tests/test_shopping_cart.py > /dev/null &
nosetests --with-coverage
Here is also an image of the directory with the files included.
All this started happening after trying to configure coverage.
Your tests subdirectory is inside virtualEnv and the current directory is the parent of virtualEnv. Run
python virtualEnv/tests/test_shopping_cart.py
PS. And please don't show screenshots — copy text.
I guess the requirements.txt is not in the virtualEnv directory.