Import only works when the module is installed using the pip --editable flag - python

I have this project on GitHub that lets me do some Jekyll actions more easily.
When I clone it and install it into my virtualenv (pip install --editable .) it works fine, but if I install it without the --editable flag I get these errors when I try to use the commands exposed by click:
$ jk-config-set-editor
Traceback (most recent call last):
File "/home/felipe/jekyll-utils/jekyll-venv/bin/jk-config-set-editor", line 7, in <module>
from jekyllutils.configs import set_editor
ImportError: No module named 'jekyllutils'
This is my setup.py file:
from setuptools import setup

setup(
    name="jekyllutils",
    version='0.1',
    py_modules=['generators'],
    install_requires=[
        'click',
        'python-slugify',
        'appdirs',
        'toml'
    ],
    entry_points='''
        [console_scripts]
        jk-new = jekyllutils.generators:new_post
        jk-edit = jekyllutils.managers:edit_post
        jk-config-set-editor = jekyllutils.configs:set_editor
        jk-config-set-posts-path = jekyllutils.configs:set_path_to_posts_dir
        jk-config-dump-configs = jekyllutils.configs:dump_configs
        jk-config-clear-configs = jekyllutils.configs:clear_configs
    '''
)
Does anybody have any idea why this works with --editable on but not otherwise?

In case anyone runs into this same issue: what worked for me was to use the find_packages function to define my packages in setup.py.
I also had to declare static data files using the package_data field.
from setuptools import setup, find_packages

setup(
    name="jekyllutils",
    version='0.1',
    py_modules=['generators'],
    install_requires=[
        'click',
        'python-slugify',
        'appdirs',
        'toml'
    ],
    entry_points='''
        [console_scripts]
        jk-new = jekyllutils.generators:new_post
        jk-edit = jekyllutils.managers:edit_post
        jk-config-set-editor = jekyllutils.configs:set_editor
        jk-config-set-posts-path = jekyllutils.configs:set_path_to_posts_dir
        jk-config-dump-configs = jekyllutils.configs:dump_configs
        jk-config-clear-configs = jekyllutils.configs:clear_configs
    ''',
    packages=find_packages(),
    package_data={
        "": ["*.txt", "*.json", "*.csv", "*.html"],
    },
)
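To see why this fixes it: with only py_modules=['generators'], pip installs a lone generators.py and never copies the jekyllutils/ package, so jekyllutils.configs does not exist in site-packages. An editable install masks the problem because it merely points Python at the source tree, where the package is present. find_packages() discovers every directory containing an __init__.py. A minimal sketch of that discovery, using a throwaway tree (the subpkg name is illustrative, not part of the original project):

```python
import os
import tempfile
from setuptools import find_packages

# Build a throwaway tree mimicking a project with a package and a subpackage.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "jekyllutils", "subpkg"))
for rel in ("jekyllutils/__init__.py",
            "jekyllutils/configs.py",
            "jekyllutils/subpkg/__init__.py"):
    open(os.path.join(root, rel), "w").close()

# find_packages() returns every directory that contains an __init__.py;
# plain modules like configs.py are carried along with their package.
found = find_packages(where=root)
print(sorted(found))  # ['jekyllutils', 'jekyllutils.subpkg']
```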

Related

Install .desktop file with setuptools and pyproject.toml

I have a GUI Python app that I'm trying to distribute a desktop entry with. Normally, one would write a setup.py with setuptools that has this in it:
from setuptools import setup

setup(
    name='myapp',
    version='0.0.1',
    packages=['myapp'],
    data_files=[
        ('share/applications', ['myapp.desktop']),
    ],
)
This is deprecated, however, and my goal is to use only pyproject.toml in my repo with no setup.py or setup.cfg needed. I have been unable to find any information on how I would go about doing this.
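A sketch of a pyproject.toml-only equivalent, assuming setuptools >= 61 as the build backend. Note that setuptools documents data-files support in pyproject.toml as discouraged and subject to change, so verify against the current setuptools documentation before relying on it:

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "myapp"
version = "0.0.1"

[tool.setuptools]
packages = ["myapp"]

# Discouraged but supported: maps install directories to files,
# like the old data_files argument did.
[tool.setuptools.data-files]
"share/applications" = ["myapp.desktop"]
```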

python setuptools installs package in different directories

from distutils.core import setup
import setuptools

setup(
    name='my_project_name',
    version='0.0.1',
    description='My project',
    py_modules=['main'],
    packages=['generated'],
    python_requires='>=3.5',
    install_requires=[
        'requests>=2.20.0',
        'grpcio>=1.48.2',
        'grpcio-tools>=1.48.2',
    ],
)
Now I run python setup.py sdist and it successfully builds the package and places it in dist/my_project_name-0.0.1.tar.gz in the current directory.
However, when I install it with pip install dist/my_project_name-0.0.1.tar.gz, it does install the package into $HOME/.local/lib/python3.6/site-packages (which is fine, I don't run it as root), but in two separate pieces:
$HOME/.local/lib/python3.6/site-packages/main.py
$HOME/.local/lib/python3.6/site-packages/generated/*
I was expecting both main.py and generated/ to go under lib/python3.6/site-packages/my_project_name/. Is there a way to do what I want, or is this the Python way?
Any helpful advice is appreciated!
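This is expected behavior: py_modules and packages install each listed item directly into site-packages; nothing groups them under a my_project_name/ directory unless they live inside a package of that name. One way to get the expected layout, sketched here (the file moves are assumptions about the source tree):

```python
# Assumed reorganized source tree:
#   my_project_name/__init__.py
#   my_project_name/main.py              (moved from the top level)
#   my_project_name/generated/__init__.py (moved under the package)
#
# setup() then lists the package and its subpackage instead of py_modules:
packages = ['my_project_name', 'my_project_name.generated']
# Imports inside main.py change accordingly, e.g.:
#   from my_project_name.generated import some_module
```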

Packaging a Python Project to an Executable

I have a project with the below structure:
projectname/projectname/__main__.py
I execute the program using python -m projectname.
If I want to install it locally on my system so that I can just call projectname, how can I achieve this?
You'll want to make a setup.py file in the top-level projectname directory that installs the package and adds some command (e.g. yeet) to your path. That command will call some function inside projectname/__main__.py:
from setuptools import setup

setup(
    name='projectname',
    version='0.0.1',
    packages=['projectname'],
    install_requires=[
        'tensorflow>=2.0.0',  # put the modules from your requirements.txt here
    ],
    entry_points={
        'console_scripts': [
            'yeet=projectname.__main__:function_to_run',
        ],
    },
)
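For completeness, a minimal projectname/__main__.py that works both with python -m projectname and with the yeet entry point (function_to_run is a placeholder name, not an API):

```python
# projectname/__main__.py
def function_to_run():
    # Whatever the command should do; printing is a stand-in.
    print("projectname is running")

# Keep `python -m projectname` working alongside the console script,
# which calls function_to_run() directly.
if __name__ == "__main__":
    function_to_run()
```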

python installing package with submodules

I have a custom project package with structure like:
package-dir/
    mypackage/
        __init__.py
        submodule1/
            __init__.py
            testmodule.py
    main.py
    requirements.txt
    setup.py
Using cd package-dir followed by pip install -e . or pip install . (as suggested by python-packaging) works, as long as I access the package from package-dir.
For example:
$ cd package-dir
$ pip install .
At this point, this works:
$ python -c 'import mypackage; import submodule1'
But this does not work:
$ cd some-other-dir
$ python -c 'import mypackage; import submodule1'
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named submodule1
How do I install all the submodules?
Also, if I check the package-dir/build/lib.linux-x86_64-2.7/mypackage dir, I only see the immediate files mypackage/*.py and NO mypackage/submodule1.
setup.py looks like:
from setuptools import setup
from pip.req import parse_requirements

reqs = parse_requirements('./requirements.txt', session=False)
install_requires = [str(ir.req) for ir in reqs]

def readme():
    with open('README.rst') as f:
        return f.read()

setup(
    name='mypackage',
    version='1.6.1',
    description='mypackage',
    long_description=readme(),
    classifiers=[
    ],
    keywords='',
    url='',
    author='',
    author_email='',
    license='Proprietary',
    packages=['mypackage'],
    package_dir={'mypackage': 'mypackage'},
    install_requires=install_requires,
    include_package_data=True,
    zip_safe=False,
    test_suite='nose.collector',
    tests_require=['nose'],
    entry_points={
        'console_scripts': ['mypackage=mypackage.run:run'],
    }
)
setup.py is missing information about your package structure. You can enable auto-discovery by importing find_packages and replacing the packages line:
from setuptools import setup, find_packages

setup(
    # ...
    packages=find_packages(),
)
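For this layout, find_packages() is equivalent to listing the subpackage explicitly; the original packages=['mypackage'] covered only the top-level package, which is why build/lib never contained submodule1. A sketch of the explicit form:

```python
# Explicit equivalent of find_packages() for the layout above:
packages = ['mypackage', 'mypackage.submodule1']
# Listing only 'mypackage' copies mypackage/*.py but skips submodule1/ entirely,
# because setuptools does not recurse into subpackages on its own.
```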

How do I get pip to install all dependencies from PyPI?

I wrote a package that is available on PyPI, the Python package repository, and it depends on other packages, as shown in the following setup.py file:
try:
    from setuptools import setup
except ImportError:
    from distutils.core import setup

setup(
    name='aTXT',
    packages=['aTXT'],
    # package_data={'': ['*.py'],
    #               'bin': ['bin/*'], 'docx': ['docx/*'], 'pdfminer': ['pdfminer']},
    version=VERSION,
    include_package_data=True,
    # arbitrary keywords
    install_requires=[
        'lxml>=3.2.3',
        'docx>=0.2.0',
        'pdfminer',
        'docopt>=0.6.2',
        'PySide',
        'kitchen>=1.1.1',
        'scandir>=0.8'
    ],
    requires=['docopt', 'scandir', 'lxml', 'PySide', 'kitchen'],
)
When I try to install it with pip:
pip install aTXT
If some of the required packages are not installed, it raises an ImportError. But why doesn't pip try to install all the dependencies?
The following is an example of what happens when I don't have the lxml package installed:
ImportError: No module named lxml
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
