Empty __init__.py when deployed with Dockerfile - python

I have a pretty weird case and don't even know where to look.
I have a Python package e.g. my_utils which is uploaded to artifactory (same as PyPi).
Project structure:
my_utils/
    __init__.py
    first_package/
        __init__.py
        some_file.py
first_package/some_file.py:
def do_job():
    print("job_done")
my_utils/__init__.py:
from first_package.some_file import do_job
The package is deployed in a pretty standard way:
setup.py
from os import path
import setuptools
from setuptools import setup, find_packages

here = path.abspath(path.dirname(__file__))
with open(path.join(here, 'README.md'), encoding='utf-8') as f:
    long_description = f.read()

setuptools.setup(
    name='my_utils',
    version='1.0.0',
    description='my_utils',
    setup_requires=['wheel'],
    long_description=long_description,
    include_package_data=True,
    url='',
    author='',
    author_email='',
    license='MIT',
    packages=setuptools.find_packages(),
    package_data={
        '': ['*.*']
    },
    install_requires=[],
    zip_safe=False
)
To deploy, I use the command:
python setup.py bdist_wheel upload -r local
So when I do `pip install my_utils`, I can do the following:
# env with my_utils installed
from my_utils import do_job
do_job() # job done
Now I use this package in an application that is deployed with Docker and lists it in its requirements:
requirements.txt
my_utils==1.0.0
Dockerfile
FROM ...
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
When I enter the container, the file my_utils/__init__.py is empty, and the import fails:
File "/app/app.py", line 13, in <module>
from my_utils import do_job
ImportError: cannot import name do_job
And yet the direct import works fine:
from my_utils.first_package.some_file import do_job
For now I have switched to the "direct" import without the shortcut, but it is really interesting why this can happen.
When I go inside the container I see that __init__.py is empty, but after reinstalling the package it gets its content back:
$: docker exec -it my_app bash
# du /usr/lib/python2.7/site-packages/my_utils/__init__.py
0 /usr/lib/python2.7/site-packages/my_utils/__init__.py
# pip uninstall my_utils
# pip install my_utils
# du /usr/lib/python2.7/site-packages/my_utils/__init__.py
4 /usr/lib/python2.7/site-packages/my_utils/__init__.py
# cat /usr/lib/python2.7/site-packages/my_utils/__init__.py
from first_package.some_file import do_job
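Not from the original thread, but one way to catch this before uploading: a wheel is just a zip archive, so you can check whether any __init__.py in the built artifact was packaged empty. A minimal sketch (the helper name and the wheel path are assumptions):

```python
import zipfile

def empty_init_files(wheel_path):
    """Return the names of any __init__.py members in the wheel
    that were packaged with zero bytes of content."""
    with zipfile.ZipFile(wheel_path) as whl:
        return [info.filename for info in whl.infolist()
                if info.filename.endswith("__init__.py") and info.file_size == 0]
```

Running this on dist/my_utils-1.0.0-*.whl right after `bdist_wheel` would reveal whether the empty file was produced at build time (e.g. by a stale build/ directory) rather than by pip inside the container.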


How to build python package for other PC?

Environment
PC1: Dev-machine, online, Python install dir: C:\Python310
PC2: Target machine, offline, Python install dir: C:\Program Files\Python310
Doing
Write the source and run the command `pip install -t ./out ./` in the working directory on PC1.
Copy the files under the out dir from PC1 to PC2.
Open a terminal and invoke the exe file on PC2.
Then I get the message: Fatal error in launcher: Unable to create process using '"C:\Python310\python.exe" "C:\Program Files\Python310\Scripts\my_app.exe" ': ??????????????????.
How can I build for PC2?
Folder structure
┗━ my_app
┣━ setup.py
┗━ my_app
┣━ __init__.py
┣━ __main__.py
┗━ main.py
File contents:
setup.py
from setuptools import setup, find_packages

setup(
    name='my_app',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            'my_app = my_app.main:main',
        ],
    },
)
my_app/__main__.py
from .main import main

main()
my_app/main.py
def main():
    print('hello world')
Constraints
without cx_freeze, pyinstaller, py2exe or similar third-party packages
the actual my_app requires external packages (e.g. tqdm)
Running the command `python setup.py bdist` and copying the dist/my_app...zip content to the target machine resolved my question.
https://docs.python.org/3/distutils/builtdist.html
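The launcher error happens because the generated my_app.exe hard-codes PC1's interpreter path. Since the package already ships a __main__.py, running it with `python -m my_app` sidesteps that launcher entirely; a sketch of what that does under the hood (run_as_module is a hypothetical helper name):

```python
import runpy

def run_as_module(package_name):
    """Execute <package>/__main__.py the way `python -m <package>` does,
    without going through the generated Scripts launcher exe."""
    runpy.run_module(package_name, run_name="__main__")
```

On PC2 this only requires that the package directory is on sys.path (e.g. via the copied out dir), not that any absolute interpreter path baked in on PC1 is still valid.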

Copy a non-Python file to specific directory during Pip Install

Problem statement: when I install my pip package, a specific file inside the package should get copied to the Temp directory.
Approach:
My package directory structure is the following:
my-app/
├─ app/
│ ├─ __init__.py
│ ├─ __main__.py
├─ folder-with-extra-stuff/
│ ├─ __init__.py
│ ├─ file_I_want_to_cppy.tar.gz
├─ setup.py
├─ MANIFEST.in
I'm tweaking my setup.py file to do the job. The following is my setup.py:
#!/usr/bin/env python
from setuptools import setup, find_packages
from setuptools.command.install import install
import os
import sys
import shutil

rootDir = os.path.abspath(os.path.dirname(__file__))

def run_custom_install():
    print("--------Start running custom command -------")
    temp_dir = r'c:\temp' if sys.platform == "win32" else r'/tmp'
    temp_col_dir = temp_dir + os.sep + 'dump'
    os.makedirs(temp_dir, exist_ok=True)
    os.makedirs(temp_col_dir, exist_ok=True)
    print("----------locate the zip file ---------------")
    ColDirTests = os.path.abspath(os.path.join(rootDir, 'my-app', 'folder-with-extra-stuff'))
    _src_file = os.path.join(ColDirTests, 'file_I_want_to_cppy.tar.gz')
    print(f"******{_src_file}**********")
    if os.path.exists(_src_file):
        print(f"-----zip file has been located at {_src_file}")
        shutil.copy(_src_file, temp_col_dir)
    else:
        print("!!!!Couldn't locate the zip file for transfer!!!!")

class CustomInstall(install):
    def run(self):
        print("***********Custom run from install********")
        install.run(self)
        run_custom_install()
ver = "0.0.0"
setup(
    name='my_pkg',
    version=ver,
    packages=find_packages(),
    python_requires='>=3.6.0',
    install_requires=getRequirements(),
    include_package_data=True,
    cmdclass={
        'install': CustomInstall,
    }
)
MANIFEST.in
include README.md
include file_I_want_to_cppy.tar.gz
recursive-include my-app *
global-exclude *.pyc
include requirements.txt
prune test
Testing build:
> python setup.py bdist_wheel
It works during build: I can see the directory C:\temp\dump is created with file_I_want_to_cppy.tar.gz inside it. But when I release the package and install it from pip, the folder remains empty!
Any idea what I might be doing wrong here?
After a lot of research I have figured out how to resolve this issue. Let me summarize my findings; they might be helpful for others who want to do post-pip-install processing.
setup.py
Different ways the package gets installed: 1) pip install pkg_name, 2) from a source distribution built with python setup.py sdist
If you want the hook to work either way, you need to have all three commands (install, egg_info and develop) overridden, as shown in the setup.py below
If you create a *.whl file with python setup.py bdist_wheel, the post-pip-install processing won't be executed! Upload the .tar.gz generated using sdist to PyPI/Artifactory to make post-pip-install processing work. Again, please note: it will not work when installing from a binary wheel
Upload the pip package with: twine upload dist/*.tar.gz
import os
from setuptools import setup, find_packages
from setuptools.command.install import install
from setuptools.command.egg_info import egg_info
from setuptools.command.develop import develop

rootDir = os.path.abspath(os.path.dirname(__file__))

def run_post_processing():
    print("--------Start running custom command -------")
    # One can run any post-processing here that will be executed post pip install

class PostInstallCommand(install):
    def run(self):
        print("***********Custom run from install********")
        install.run(self)
        run_post_processing()

class PostEggCommand(egg_info):
    def run(self):
        print("***********Custom run from Egg********")
        egg_info.run(self)
        run_post_processing()

class PostDevelopCommand(develop):
    def run(self):
        print("***********Custom run from Develop********")
        develop.run(self)
        run_post_processing()

ver = "0.0.0"
setup(
    name='my_pkg',
    version=ver,
    packages=find_packages(),
    python_requires='>=3.6.0',
    install_requires=getRequirements(),
    include_package_data=True,
    cmdclass={
        'install': PostInstallCommand,
        'egg_info': PostEggCommand,
        'develop': PostDevelopCommand
    }
)
A few more things from my research:
If you want to do pre-processing instead of post-processing, move install.run(self) to the end of run()
While pip installing, if you want to see the custom pre/post-installation messages, use -vvv. Example: pip install -vvv my_pkg
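Since only the sdist triggers these cmdclass hooks, a small pre-upload guard can keep wheels out of the upload. A hedged sketch (the helper name and the dist/ layout are assumptions):

```python
import glob

def sdists_only(dist_dir="dist"):
    """Return just the .tar.gz artifacts in dist_dir; uploading a .whl
    would silently skip the custom install hooks."""
    return sorted(p for p in glob.glob(dist_dir + "/*") if p.endswith(".tar.gz"))
```

Passing the result to twine instead of dist/* would make it impossible to accidentally publish a hook-less wheel alongside the sdist.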

check if current python code is running from package

Is it possible to detect if the current python code is running from a package?
If yes, is it possible to get the package metadata (name, version, description)?
The package is created with this kind of setup.py:
import os
from setuptools import setup, find_packages

setup(
    name='my-pack-name',
    description='my description ' + os.getenv('GIT_COMMIT', '*')[:7],
    version=os.getenv('BUILD_VERSION', '0.0.0dev'),
    packages=find_packages(),
)
build: python3 setup.py bdist_wheel -d ./artifact
install on target: pip3 install "my-pack-name-x.x.x.whl" --upgrade
Now, from the my_pack_name/app.py that was inside my-pack-name-x.x.x.whl, I want to detect that I'm running from the installed package,
and if so, get the package metadata defined during setup.py execution.
For Python >=3.8
https://docs.python.org/es/3.10/library/importlib.metadata.html
You can get the metadata for a package by:
from importlib.metadata import metadata
md = metadata("your package name")
author = md["Author"]
# etc ...
For Python <3.8
This is just an idea (not tested).
What about having the package metadata in a different module, and trying a relative import of it in the app.py module?
# metadata.py
import os

name = 'my-pack-name'
description = 'my description '
version = os.getenv('BUILD_VERSION', '0.0.0dev')
In setup.py you could reuse that:
# setup.py
import os
from setuptools import setup, find_packages
from my_pack_name import metadata as md

setup(
    name=md.name,
    description=md.description + os.getenv('GIT_COMMIT', '*')[:7],
    version=md.version,
    packages=find_packages(),
)
And then in app.py:
def get_metadata():
    try:
        from . import metadata as md
    except ImportError:
        return None
    else:
        return md  # return metadata here
To test whether python code is running from an installed package (vs. from a development location), I use
if 'site-packages' in __file__:
...
I don't know if that's a good approach; seems to work thus far.
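Not from the original answers: another check that doesn't inspect __file__ is asking importlib.metadata whether the distribution is known at all (is_installed is a hypothetical helper; Python >= 3.8):

```python
def is_installed(dist_name):
    """True if dist_name is visible as an installed distribution."""
    from importlib.metadata import version, PackageNotFoundError
    try:
        version(dist_name)
        return True
    except PackageNotFoundError:
        return False
```

Note the caveat: this tells you the distribution is installed somewhere in the environment, not that the currently running file belongs to it, so it complements rather than replaces the site-packages check above.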

Install into ~/.config dir

I would like to install a config file into a user's .config directory. I've tried
# setup.py
from setuptools import setup
import os

setup(
    # ...
    data_files=[(
        '{}/.config/foobar/'.format(os.environ['HOME']),
        ['config/foobar.config']
    )]
)
but this strips the leading / and installs the file at
$HOME/.local/lib/python3.8/site-packages/home/johndoe/.config/foobar/foobar.config
How to install into the ~/.config dir? Bonus points if it works with a setup.cfg file.
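The thread has no answer, but a common workaround, since wheels strip absolute data_files paths like this: ship the config inside the package and copy it into ~/.config on first run instead of at install time. A hedged sketch; ensure_user_config and the "foobar" app name are assumptions:

```python
import os
import shutil
from pathlib import Path

def ensure_user_config(bundled_config, app_name="foobar"):
    """Copy the bundled config file into ~/.config/<app_name>/
    (respecting XDG_CONFIG_HOME) the first time the app runs."""
    base = Path(os.environ.get("XDG_CONFIG_HOME") or Path.home() / ".config")
    cfg_dir = base / app_name
    cfg_dir.mkdir(parents=True, exist_ok=True)
    dest = cfg_dir / Path(bundled_config).name
    if not dest.exists():  # never clobber a user's edited config
        shutil.copy(bundled_config, dest)
    return dest
```

The bundled file itself can be included via package_data/include_package_data, which works fine from a setup.cfg-only project as well.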

python installing package with submodules

I have a custom project package with structure like:
package-dir/
    mypackage/
        __init__.py
        submodule1/
            __init__.py
            testmodule.py
        main.py
    requirements.txt
    setup.py
Using `cd package-dir` followed by `pip install -e .` or `pip install .`, as suggested by python-packaging, works as long as I access the package from package-dir.
For example :
$ cd package-dir
$pip install .
at this point this works:
$python -c 'import mypackage; import submodule1'
But this does not work:
$ cd some-other-dir
$ python -c 'import mypackage; import submodule1'
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named submodule1
How do I install all the submodules?
Also, if I check the package-dir/build/lib.linux-x86_64-2.7/mypackage dir, I only see the immediate mypackage/*.py files and NO mypackage/submodule1.
setup.py looks like:
from setuptools import setup
from pip.req import parse_requirements

reqs = parse_requirements('./requirements.txt', session=False)
install_requires = [str(ir.req) for ir in reqs]

def readme():
    with open('README.rst') as f:
        return f.read()

setup(
    name='mypackage',
    version='1.6.1',
    description='mypackage',
    long_description=readme(),
    classifiers=[
    ],
    keywords='',
    url='',
    author='',
    author_email='',
    license='Proprietary',
    packages=['mypackage'],
    package_dir={'mypackage': 'mypackage'},
    install_requires=install_requires,
    include_package_data=True,
    zip_safe=False,
    test_suite='nose.collector',
    tests_require=['nose'],
    entry_points={
        'console_scripts': ['mypackage=mypackage.run:run'],
    }
)
setup.py is missing information about your package structure: packages=['mypackage'] lists only the top-level package, so mypackage.submodule1 is never installed. You can enable auto-discovery by changing it to
from setuptools import setup, find_packages

setup(
    # ...
    packages=find_packages(),
)
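To preview what auto-discovery will pick up before reinstalling, you can call find_packages directly (discovered_packages is just an illustrative wrapper):

```python
from setuptools import find_packages

def discovered_packages(where="."):
    """List every package (i.e. every directory with an __init__.py,
    including subpackages) that find_packages() would include from `where`."""
    return sorted(find_packages(where=where))
```

For the layout above this should report both mypackage and mypackage.submodule1; if a subpackage is missing from the output, it usually lacks an __init__.py.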
