I would like to install a config file in a user's .config directory. I've tried
# setup.py
from setuptools import setup
import os
setup(
# ...
data_files=[(
'{}/.config/foobar/'.format(os.environ['HOME']),
['config/foobar.config']
)]
)
but this strips the leading / and installs the file at
$HOME/.local/lib/python3.8/site-packages/home/johndoe/.config/foobar/foobar.config
How can I install into the ~/.config dir? Bonus points if it works with a setup.cfg file.
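data_files paths are interpreted relative to the installation prefix, so absolute or $HOME-based paths get re-rooted under site-packages, as you saw. One workaround (a sketch, not the only option) is to ship the default config as package data and copy it into ~/.config on first run; the foobar names below just follow the example above:

```python
# Sketch: copy a bundled default config into ~/.config/foobar/ on first use,
# instead of trying to place it there at install time via data_files.
import os
import shutil

def ensure_user_config(source, home=None):
    """Copy `source` to <home>/.config/foobar/foobar.config unless it exists."""
    home = home or os.path.expanduser("~")
    target_dir = os.path.join(home, ".config", "foobar")
    target = os.path.join(target_dir, "foobar.config")
    if not os.path.exists(target):
        os.makedirs(target_dir, exist_ok=True)
        shutil.copy(source, target)
    return target
```

The application would call ensure_user_config() at startup, pointing it at the default config file installed inside the package.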
Related
I have a pretty weird case and I don't even know where to look.
I have a Python package e.g. my_utils which is uploaded to artifactory (same as PyPi).
Project structure:
my_utils
__init__.py
first_package
__init__.py
some_file.py
first_package/some_file.py
def do_job():
print("job_done")
my_utils/__init__.py
from first_package.some_file import do_job
The package is deployed in a pretty standard way:
setup.py
from os import path
import setuptools
from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
with open(path.join(here, 'README.md'), encoding='utf-8') as f:
long_description = f.read()
setuptools.setup(
name='my_utils',
version='1.0.0',
description='my_utils',
setup_requires=['wheel'],
long_description=long_description,
include_package_data=True,
url='',
author='',
author_email='',
license='MIT',
packages=setuptools.find_packages(),
package_data={
'': ['*.*']
},
install_requires=[],
zip_safe=False
)
To deploy, I use the command:
python setup.py bdist_wheel upload -r local
So when I do pip install my_utils - I can do the following:
# env with my_utils installed
from my_utils import do_job
do_job() # job done
Now I use this package in an application that is deployed with Docker and lists it in its requirements:
requirements.txt
my_utils==1.0.0
Dockerfile
FROM ...
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
When I enter the container, the file my_utils/__init__.py is empty, and the import fails:
File "/app/app.py", line 13, in <module>
from my_utils import do_job
ImportError: cannot import name do_job
But the direct import works fine:
from my_utils.first_package.some_file import do_job
For now I have switched to the "direct" import without the shortcut, but I'm really curious why this can happen.
When I go inside the container, I see that the file __init__.py is empty; but when the package is re-installed, it gets its content:
$: docker exec -it my_app bash
# du /usr/lib/python2.7/site-packages/my_utils/__init__.py
0 /usr/lib/python2.7/site-packages/my_utils/__init__.py
# pip uninstall my_utils
# pip install my_utils
# du /usr/lib/python2.7/site-packages/my_utils/__init__.py
4 /usr/lib/python2.7/site-packages/my_utils/__init__.py
# cat /usr/lib/python2.7/site-packages/my_utils/__init__.py
from first_package.some_file import do_job
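The symptom suggests the empty __init__.py was already inside the wheel that was uploaded. Before uploading, you can confirm what actually got packaged; a sketch (the wheel filename below is an assumption, adjust it to your dist/ output):

```python
# Sketch: inspect a built wheel (a zip file) to confirm that
# my_utils/__init__.py was packaged with non-empty contents.
import zipfile

def init_py_size(wheel_path, member="my_utils/__init__.py"):
    """Return the uncompressed size of `member` inside the wheel."""
    with zipfile.ZipFile(wheel_path) as wheel:
        return wheel.getinfo(member).file_size
```

For example, init_py_size("dist/my_utils-1.0.0-py2.py3-none-any.whl") returning 0 would mean the empty file was shipped, not corrupted on install.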
I have the following directory structure:
/pythonlibraries
/libraryA
setup.py
libraryA/
__init__.py
alib.py
/libraryB
setup.py
libraryB/
__init__.py
blib.py
blib.py:
import libraryA
setup.py for libraryB:
from setuptools import setup
setup(name='libraryB',
version='0.0',
description='',
packages=['libraryB'],
install_requires=["ujson", "/pythonlibraries/libraryA"])
This doesn't work :/
How can I install local dependencies with pip?
Ideally I'd like to do pip install -e /pythonlibraries/libraryB and have it automatically install libraryA from my local disk.
Right now I have to install each local library manually, one by one...
Did you try writing the full path, like this:
install_requires=["ujson", "/home/user/pythonlibraries/libraryA"])
since a leading "/" makes it an absolute path?
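A bare path in install_requires is not a valid requirement specifier, but pip 18.1+ supports PEP 508 direct references, which let a requirement point at a local path explicitly. A sketch of libraryB's setup.py under that assumption (this is a config fragment; the file:// path follows the directory layout above):

```python
# Sketch: declare the local libraryA dependency as a PEP 508 direct
# reference (requires pip >= 18.1 to resolve).
from setuptools import setup

setup(
    name='libraryB',
    version='0.0',
    description='',
    packages=['libraryB'],
    install_requires=[
        "ujson",
        "libraryA @ file:///pythonlibraries/libraryA",
    ],
)
```

With this, pip install -e /pythonlibraries/libraryB pulls in libraryA from disk automatically. Note that PyPI rejects distributions whose requirements use direct references, so this only suits private/local use.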
Following is the folder structure:
Utility/
    utils/
        __init__.py
        wrapper
        auditory.py
    setup.py
I have been trying to install utils as a site-package by running "python setup.py install".
When I go and check site-packages, there is an egg file which contains my utils folder and the egg-info.
But it should create my utils folder directly inside site-packages, right?
Am I missing something here?
from setuptools import setup
setup(
name='utils',
version='0.1',
packages=['utils'],
license='Internal use only',
zip_safe = False
)
Ideally it should place the utils folder inside site-packages and the egg-info next to it,
so that the utils package would be available just like pandas.
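To see whether a package landed as a zipped egg or a plain directory, you can check where Python would load it from; a small sketch (the check against the stdlib json package in the usage note is just for illustration):

```python
# Sketch: report the file a module would be imported from. A package
# installed as a zipped egg shows a path containing '.egg'; a normal
# pip install shows a plain site-packages directory.
import importlib.util

def install_location(name):
    """Return the file path a module would be loaded from."""
    spec = importlib.util.find_spec(name)
    return spec.origin
```

For example, install_location("json") points into the stdlib, while install_location("utils") after a setup.py install would show a path like .../site-packages/utils-0.1-py3.8.egg/utils/__init__.py. Installing with pip install . instead of python setup.py install avoids the zipped egg entirely.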
I use include_package_data=True with setuptools.
Even though I have include_package_data=True, when I run python setup.py install my *.xml and *.ttl (and other) files are not installed.
What is my error? Or is it a bug in setuptools? What should I do?
From https://github.com/vporton/xml-boiler setup.py:
import os
from setuptools import setup, find_packages
from setuptools.command.build_py import build_py as DistutilsBuild
class MyBuild(DistutilsBuild):
def run(self):
DistutilsBuild.run(self)
os.system('make')
setup(
name='xml-boiler',
version='0.0.2',
url='https://github.com/vporton/xml-boiler',
license='AGPLv3',
author='Victor Porton',
author_email='porton@narod.ru',
description='Automatically transform between XML namespaces',
use_scm_version=True,
setup_requires=['setuptools_scm'],
packages=find_packages(),
# package_data={'': ['**/*.xml', '**/*.ttl', '**/*.net', 'data/assets/*', 'data/scripts/*.xslt',
# 'xmlboiler/doc/*.html', 'xmlboiler/doc/*.css']},
include_package_data=True,
scripts=['bin/boiler'],
# Does not work for non-root install:
# data_files = [
# ('/etc/xmlboiler', ['etc/config-cli.ttl'])
# ],
test_suite="xmlboiler.tests",
cmdclass={'build_py': MyBuild},
)
Here is my MANIFEST.in:
recursive-include xmlboiler *.xml *.ttl *.xslt
recursive-include xmlboiler/core/data/assets *
I encountered the same issue using this MANIFEST.in:
include setup.json
recursive-include . *.coffee
The .coffee files were present in the .tar.gz file but were not installed. The problem was not resolved by adding zip_safe=False; it was resolved by switching from recursive-include to individual includes.
This is using
wheel 0.32.3
twine 1.12.1
setuptools 39.2.0
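For the record, "individual includes" means listing each file or per-directory pattern explicitly instead of one blanket recursive-include from the project root. A hypothetical MANIFEST.in sketch (the src/ paths are assumptions, not from the original project):

```
include setup.json
include src/app.coffee
include src/widgets/button.coffee
```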
You need the zip_safe=False flag to prevent the package from being installed inside a zip file (an egg).
I'm trying to get the file VERSION in the root of my Python package installed along with it, so that I can read this file once it's installed. However, using MANIFEST.in, the file is placed in the top-level /usr/lib/python3.6/site-packages rather than inside /usr/lib/python3.6/site-packages/mypackage.
I'm trying to do this so that I can display the package's version at runtime and also easily make use of it inside the repo.
directory structure:
setup.py
MANIFEST.in
VERSION
mypackage/
- __init__.py
- __main__.py
- foo.py
MANIFEST.in:
include VERSION
setup.py:
#!/usr/bin/env python3
from setuptools import setup, find_packages
with open("VERSION", "r") as versionFile:
version = versionFile.read().strip()
setup(
name="mypackage",
version=version,
packages=find_packages(),
include_package_data=True)
mypackage/__main__.py:
...
# resource_string returns the file's contents as bytes, not a path
version = pkg_resources.resource_string(__name__, "VERSION").decode("utf-8").strip()
print(version)
...
How can I get the VERSION file to install inside of the package directory?
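One approach (a sketch under the layout above, not the only way) is to move VERSION into mypackage/ itself, declare it as package data in setup.py with package_data={"mypackage": ["VERSION"]} (and include mypackage/VERSION in MANIFEST.in), and then read it at runtime:

```python
# Sketch: with mypackage/VERSION shipped as package data, read the
# version from wherever the package is installed.
import importlib.resources

def read_version(package="mypackage"):
    """Return the stripped contents of <package>/VERSION."""
    return importlib.resources.read_text(package, "VERSION").strip()
```

setup.py can reuse the same file before installation by opening "mypackage/VERSION" directly, so the version is defined in exactly one place.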