Poetry No file/folder for package - python

I have a simple project layout
myproject on  main [$!?] is 📦 v1.0.0 via  v18.14.0 via 🐍 v3.10.9
❯ tree -L 1
.
├── build
├── deploy
├── Dockerfile
├── poetry.lock
├── pyproject.toml
├── README.md
├── scripts.py
└── src
The pyproject.toml is:
[tool.poetry]
name = "myproject"
version = "0.1.0"
description = ""
authors = [""]
[tool.poetry.scripts]
test = "scripts:test"
The scripts.py is:
import subprocess


def test():
    """
    Run all unittests.
    """
    subprocess.run(
        ['python', '-u', '-m', 'pytest']
    )


if __name__ == '__main__':
    test()
When I run poetry run test:
myproject on main [$!?] is 📦 v1.0.0 via  v18.14.0 via 🐍 v3.10.9
No file/folder found for package myproject
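This error usually means Poetry cannot find a directory (or single module) matching the package name, i.e. neither myproject/ nor src/myproject/ exists, so it refuses to install the project and its scripts. A minimal sketch of one way to declare the layout explicitly in pyproject.toml, assuming the importable code lives in src/myproject/ and that the top-level scripts.py should also be packaged so the scripts:test entry point resolves:
[tool.poetry]
# ...existing name/version/description/authors keys...
packages = [
    { include = "myproject", from = "src" },  # assumes src/myproject/__init__.py exists
    { include = "scripts.py" },               # ship the top-level scripts module as well
]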

Related

Python TestPypi no module named XXX when importing functions from other folders in main.py

I am trying to use TestPyPI to package my project. In the main script main.py, I use some functions from other folders, such as commons/common.py. The project folder is laid out as follows:
proj_name/
├── LICENSE
├── pyproject.toml
├── README.md
├── proj_name/
│ └── main.py
├── commons/
│ └── common.py
└── tests/
I tried from commons import common as common and import commons.common as common in main.py, but neither works: after I package my project and pip install it, I get ModuleNotFoundError: No module named 'commons'.
My pyproject.toml is like this:
[tool.poetry]
name = "XXXX"
version = "XXX"
description = ""
authors = ["XXXX"]
readme = "README.md"
[tool.poetry.scripts]
proj_name = "proj_name.main:app"
[tool.poetry.dependencies]
python = "^3.9"
typer = {extras = ["all"], version = "^0.6.1"}
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
May I ask how I can organise the code so that the common module can be imported?
Thanks.
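By default Poetry only packages the directory that matches the project name, so a sibling folder like commons/ never ends up in the wheel unless it is listed explicitly. A sketch of the relevant addition, assuming commons/ also contains an __init__.py so it is a proper package:
[tool.poetry]
# ...existing keys (name, version, description, authors, readme)...
packages = [
    { include = "proj_name" },
    { include = "commons" },
]
After rebuilding and reinstalling, import commons.common should then resolve, because both top-level packages get installed into site-packages.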

Import module installed in venv

I am trying to call a custom module installed in my virtualenv. The module built by myself has the following structure:
test
├── README.md
├── setup.py
├── src
│   ├── __init__.py
│   └── module_x
│       └── abc.py
└── tests
setup.py looks like the following:
from setuptools import find_packages, setup

with open('README.md', 'r') as f:
    long_description = f.read()

setup(
    name='test',
    version='0.1.0',
    author='me',
    description='description',
    long_description=long_description,
    long_description_content_type='text/markdown',
    packages=find_packages('src'),
    package_dir={'': 'src'},
    install_requires=[''],
    entry_points={
        'console_scripts': [
            'test=module_x.abc:main'
        ],
    }
)
Inside abc.py I am using the following code:
class RandomClass:
    def __init__(self, msg):
        self.msg = msg
        self.info()

    def info(self):
        print(self.msg)


def main(msg):
    RandomClass(msg)
__init__.py looks like:
from .module_x.abc import main
The egg was installed in editable mode using pip install -e .
UPDATED
How can I call the module in a new Python script like from test import main? Currently it only works as from test.src import main, and I don't want to have to know the whole internal structure of the package. In the future there will also be module_y, module_z, etc. My assumption is that I would need to modify something in setup.py.
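A sketch of one layout that would give from test import main with the existing find_packages('src') / package_dir={'': 'src'} configuration, assuming the importable package should literally be named test:
test
├── README.md
├── setup.py
├── src
│   └── test
│       ├── __init__.py        <- re-exports: from .module_x.abc import main
│       └── module_x
│           ├── __init__.py    <- needed so find_packages() discovers module_x
│           └── abc.py
└── tests
With that structure find_packages('src') finds test and test.module_x, the console script entry becomes 'test=test.module_x.abc:main', and future module_y, module_z packages slot in next to module_x without callers needing to know the internal layout.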

Building CMake extension with pip

I'm working on a C++/Python project with the following structure:
foo
├── CMakeLists.txt
├── include
├── source
└── python
    ├── foo
    │   ├── _foo_py.py
    │   └── __init__.py
    ├── setup.py
    └── source
        ├── CMakeLists.txt
        └── _foo_cpp.cpp
foo/source and foo/include contain C++ source files and foo/python/source/_foo_cpp.cpp contains pybind11 wrapper code for this C++ code. Running setup.py is supposed to build the C++ code (by running CMake), create a _foo_cpp Python module in the form of a shared object and integrate it with the Python code in _foo_py.py. I.e. I want to be able to simply call python setup.py install from foo/python to install the foo module to my system. I'm currently using a CMake extension class in setup.py to make this work:
import os
import subprocess

from setuptools import Extension
from setuptools.command.build_ext import build_ext


class CMakeExtension(Extension):
    def __init__(self, name, sourcedir):
        # No sources listed here: the actual build is delegated to CMake.
        Extension.__init__(self, name, sources=[])
        self.sourcedir = os.path.abspath(sourcedir)


class CMakeBuild(build_ext):
    def run(self):
        try:
            subprocess.check_output(['cmake', '--version'])
        except OSError:
            raise RuntimeError("cmake command must be available")
        for ext in self.extensions:
            self.build_extension(ext)

    def build_extension(self, ext):
        if not os.path.exists(self.build_temp):
            os.makedirs(self.build_temp)
        self._setup(ext)
        self._build(ext)

    def _setup(self, ext):
        # Configure step: run cmake against the extension's source directory.
        cmake_cmd = [
            'cmake',
            ext.sourcedir,
        ]
        subprocess.check_call(cmake_cmd, cwd=self.build_temp)

    def _build(self, ext):
        cmake_build_cmd = [
            'cmake',
            '--build', '.',
        ]
        subprocess.check_call(cmake_build_cmd, cwd=self.build_temp)
The problem arises when I try to directly call pip in foo/python, e.g. like this:
pip wheel -w wheelhouse --no-deps .
It seems that before running the code in setup.py, pip copies the content of the working directory into a temporary directory. This obviously doesn't include the C++ code and the top-level CMakeLists.txt. That in turn causes CMakeBuild._setup to fail because there is seemingly no way to obtain a path to the foo root directory from inside setup.py after it has been copied to another location by pip.
Is there anything I can do to make this setup work with both python and pip? I have seen some approaches that first run cmake to generate a setup.py from a setup.py.in to inject package version, root directory path etc. but I would like to avoid this and have setup.py call cmake instead of the other way around.
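One commonly suggested workaround, sketched here with the CMakeExtension and CMakeBuild classes from above and a made-up version number, is to move setup.py to the repository root and point package_dir at python/; then whatever directory pip copies (or packs into an sdist) always contains CMakeLists.txt, include/ and source/:
# foo/setup.py (hypothetical top-level location; CMakeExtension and CMakeBuild as defined above)
from setuptools import setup, find_packages

setup(
    name='foo',
    version='0.1.0',                       # placeholder version
    packages=find_packages('python'),      # finds the pure-Python package python/foo
    package_dir={'': 'python'},
    # '.' is now the repository root, so the top-level CMakeLists.txt and the
    # C++ sources are present in every copy pip makes of the project
    ext_modules=[CMakeExtension('foo._foo_cpp', sourcedir='.')],
    cmdclass={'build_ext': CMakeBuild},
)
For source distributions a MANIFEST.in (or equivalent) listing the C++ files and CMakeLists.txt would still be needed, since setuptools does not include non-Python files by default.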

Error: the command ( (from my_script) could not be found within PATH

While working on a Python FastAPI project using Pipenv and Pytest, I was asked to write a Pipenv script to run the tests.
Project has the following structure:
.
├── app
│   ├── main.py
│   ├── my_package
│   │   ├── __init__.py
│   │   ├── (project folders)
│   │   └── tests
│   │       └── tests.py
│   └── __pycache__
└── (deployment scripts and folders, Pipfile, Dockerfile, etc.)
I'd like to run pytest on tests.py from the top-level directory (./app/.../tests.py), at least for now.
As recommended here, I currently run them from the top with:
( cd app ; python -m pytest my_package/tests/tests.py )
... which works as expected.
However, when I add that to my Pipfile's [scripts] section:
[scripts]
my_script = "( cd app ; python -m pytest my_package/tests/tests.py )"
... run it with:
pipenv run my_script
I get the error:
Error: the command ( (from my_script) could not be found within PATH.
I've also tried:
[scripts]
my_script = "cd app && python -m pytest my_package/tests/tests.py"
... which returns another similar error:
Error: the command cd (from my_script) could not be found within PATH.
So it's clear I'm wrong to treat these like bash aliases.
I've tried searching for more documentation on how the [scripts] section works, but I've had no luck (yet).
I'm not familiar with the tools you're using, but the error message suggests that it's looking for an executable. ( is part of shell syntax, and cd is a shell builtin, not an executable. Try this:
my_script = "bash -c 'cd app && python -m pytest my_package/tests/tests.py'"
Here bash is the executable, and -c makes it run your snippet.
BTW, keep in mind that cd can fail, so the script should bail out if it does.
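Putting that together, the Pipfile section would look something like this (an untested sketch; the && already makes the whole command stop if cd fails):
[scripts]
my_script = "bash -c 'cd app && python -m pytest my_package/tests/tests.py'"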

how to run a script using pyproject.toml settings and poetry?

I am using poetry to create .whl files.
I have an FTP server running on a remote host.
I wrote a Python script (log_revision.py) which saves the git commit and a few more parameters in a database, and at the end sends the .whl (that poetry created) to the remote server (each .whl goes to a different path on the server; the path is saved in the db).
At the moment I run the script manually each time after I run the poetry build command.
I know pyproject.toml has the [tool.poetry.scripts] section, but I don't get how I can use it to run a Python script.
I tried
[tool.poetry.scripts]
my-script = "my_package_name:log_revision.py"
and then poetry run my-script, but I always get an error:
AttributeError: module 'my_package_namen' has no attribute 'log_revision'
1. Can someone please help me understand how to run the desired command?
As a short-term option (without git and params) I tried to use poetry publish -r http://192.168.1.xxx/home/whl -u hello -p world, but I get the following error:
[RuntimeError]
Repository http://192.168.1.xxx/home/whl is not defined
2. What am I doing wrong and how can I fix it?
Would appreciate any help, thanks!
At the moment the [tool.poetry.scripts] section is equivalent to setuptools' console_scripts.
So the argument must be a valid module and method name. Let's imagine within your package my_package, you have log_revision.py, which has a method start(). Then you have to write:
[tool.poetry.scripts]
my-script = "my_package.log_revision:start"
Here's a complete example:
You should have this folder structure:
my_package
├── my_package
│   ├── __init__.py
│   └── log_revision.py
└── pyproject.toml
The content of pyproject.toml is:
[tool.poetry]
name = "my_package"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]
[tool.poetry.dependencies]
python = "^3.8"
[tool.poetry.scripts]
my-script = "my_package.log_revision:start"
[build-system]
requires = ["poetry_core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
and of log_revision.py:
def start():
    print("Hello")
After you have run poetry install once you should be able to do this:
$ poetry run my-script
Hello
You cannot pass something to the start() method directly. Instead you can use command line arguments and parse them, e.g. with Python's argparse.
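For example, a sketch of log_revision.py where start() reads a flag from the command line (the --message option is made up purely for illustration):
import argparse


def start():
    # Arguments given after the script name are passed straight through by poetry run.
    parser = argparse.ArgumentParser()
    parser.add_argument("--message", default="Hello")
    args = parser.parse_args()
    print(args.message)
Running poetry run my-script --message "shipping 0.1.0" would then print the given text.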
Although the previous answers are correct, they are a bit complicated. The simplest way to run a python script with poetry is as follows:
poetry run python myscript.py
If you are using a dev framework like streamlit you can use
poetry run streamlit run myapp.py
Basically anything you put after poetry run will execute from the poetry virtual environment.
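For instance, asking the interpreter where it lives shows that it is the project's virtualenv Python rather than the system one (the printed path will vary per machine):
poetry run python -c "import sys; print(sys.executable)"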
For future visitors, I think what OP is asking for (a post build hook?) isn't directly supported. But you might find satisfaction from using a tool I wrote called poethepoet which integrates with poetry to run arbitrary tasks defined in the pyproject.toml in terms of shell commands or by referencing python functions.
For example you could define something like the following in your pyproject.toml
[tool.poe.tasks.log_revision]
script = "my_package.log_revision:main" # where main is the name of the python function in the log_revision module
help = "Register this revision in the catalog db"
[tool.poe.tasks.build]
cmd = "poetry build"
help = "Build the project"
[tool.poe.tasks.shipit]
sequence = ["build", "log_revision"]
help = "Build and deploy"
And then execute any of the tasks with the poe CLI, like the following, which will run the other two tasks in sequence, thus building your project and running the deployment script in one go!
poe shipit
By default tasks are executed inside the poetry managed virtualenv (like using poetry run) so the python script can use dev dependencies.
If you need to define CLI arguments or load values into a task from a dotenv file (such as credentials) then this is also supported.
Update: poetry plugin support
poethepoet can now support post build hooks when used as a poetry plugin. For example when using poetry >=1.2.0b1 you can configure the following to run your log_revision task automatically after poetry build is run:
[tool.poe.poetry_hooks]
post_build = "log-revision"
[tool.poe.tasks.log-revision]
script = "scripts:log_revision"
I tinkered with this problem for a couple of hours and found a solution.
I had a task to start the Django server via a poetry script.
Here are the directories; manage.py is in the test folder:
├── pyproject.toml
├── README.rst
├── runserver.py
├── test
│   ├── db.sqlite3
│   ├── manage.py
│   └── test
│       ├── asgi.py
│       ├── __init__.py
│       ├── __pycache__
│       │   ├── __init__.cpython-39.pyc
│       │   ├── settings.cpython-39.pyc
│       │   ├── urls.cpython-39.pyc
│       │   └── wsgi.cpython-39.pyc
│       ├── settings.py
│       ├── urls.py
│       └── wsgi.py
├── tests
│   ├── __init__.py
│   └── test_tmp.py
└── tmp
    └── __init__.py
I had to create a file runserver.py:
import subprocess


def djtest():
    cmd = ['python', 'test/manage.py', 'runserver']
    subprocess.run(cmd)
then write the script itself in pyproject.toml:
[tool.poetry.scripts]
dj = "runserver:djtest"
Only then did the command poetry run dj start working.
