I am trying to get an entry point to run my Flask application, but it isn't working.
I think it's due to the directory structure:
my_app
    - __init__.py
    - app.py
    - setup.py
    - etc.
My setup.py file:
from setuptools import setup, find_packages
import os.path

def read_requirements(pathname):
    with open(pathname) as f:
        return [line for line in (x.strip() for x in f) if not line.startswith('#')]

def project_path(*names):
    return os.path.join(os.path.dirname(__file__), *names)

setup(
    name='my_app',
    version='0.1.0',
    install_requires=read_requirements(os.path.join(os.path.dirname(__file__), 'requirements.txt')),
    test_suite='nose.collector',
    entry_points={
        'console_scripts': [
            'START_ME=app:run',
        ],
    },
    classifiers=["Programming Language :: Python :: 2.7"],
    description=__doc__,
    long_description='\n\n'.join(open(project_path(name)).read() for name in (
        'README.md',
    )),
    zip_safe=False,
    include_package_data=True,
    packages=find_packages(),
)
I think find_packages() is not picking up the fact that it's inside the package; maybe it's looking in lower-level directories for packages? I've tried find_packages('.') to get it to search the project root directory, but this did not work.
Can I get this to work without changing my directory structure?
Here is the actual project.
Also, I noticed that when I run setup.py install I get a top_level.txt file in my egg-info folder. It says that the top level is actually a package that exists inside the root/main package, like:
/main_package
    - __init__.py
    - app.py
    /sub_package
        - __init__.py
        - sub.py
In the top_level.txt file, sub_package is written.
I just ended up putting all the Flask app files into a subdirectory inside the project root directory. That fixed it nicely.
/project
    - setup.py
    /flask_app
        - __init__.py
        - app.py
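With flask_app as its own package next to setup.py, find_packages() picks it up and the console script can point at the module inside it. A minimal sketch, assuming app.py still exposes the run() callable used in the original entry point:
from setuptools import setup, find_packages

setup(
    name='my_app',
    version='0.1.0',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # assumes app.py defines run(); adjust to your real callable
            'START_ME=flask_app.app:run',
        ],
    },
)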
Using Python 3.8, I have the following structure in a test library:
testrepo
    setup.py
    Manifest.in
    util/
        mycode.py
        data/
            mydata.txt
The file setup.py looks like
setup(
    name='testrepo',
    version="0.1.0",
    packages=['util'],
    author='Tester',
    description='Testing repo',
    include_package_data=True,
    package_data={
        "util.data": ["*"]
    },
)
and using the following Manifest.in:
include util/data/mycode.txt
When I install this package into a Python virtual environment, I do not see any hint of the data folder in venv/lib/python3.8/site-packages/util.
How do I do this correctly, so that I can read the content of the file util/data/mydata.txt using
from util import data
import importlib.resources as import_resources
text = import_resources.read_text(data, "mydata.txt")
or whatever...
Where can I find this completely documented, with examples etc.?
I guess what you have to do is create the following basic structure for the repo:
myrepo
    setup.py
    Manifest.in
    mypackage/
        __init__.py
        mycode.py
        data/
            __init__.py
            mydata.txt
Just make sure to keep in mind these 6 additional steps:
1. You need to put the data folder inside your package folder.
2. You need to add an __init__.py inside your data folder.
3. In setup.py, you have to use packages=find_packages() to find your packages.
4. In setup.py, you have to set include_package_data=True.
5. In setup.py, you have to specify the path to your data files:
   package_data={
       "mypackage.data": ["*"]
   },
6. You also have to define a second file named Manifest.in, listing your data files again as follows (using a placeholder here - you can also add each file on a separate line):
   include mypackage/data/*
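Putting the setup.py-related steps together, a minimal sketch for the structure above might look like this (name and version are placeholders):
from setuptools import setup, find_packages

setup(
    name='myrepo',        # placeholder
    version='0.1.0',      # placeholder
    packages=find_packages(),
    include_package_data=True,
    package_data={
        "mypackage.data": ["*"],
    },
)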
If you are lucky, you can then include/use your data file like
from mypackage import data
import importlib.resources as import_resources
text = import_resources.read_text(data, "mydata.txt")
or
with import_resources.path(data, "mydata.txt") as filename:
    myfilename = filename
to get the path to the data file.
Not sure this is documented anywhere.
I have created two packages, named A and B. Both packages have the same structure, as they do very similar things; their structure looks like this:
A/
    __init__.py
    subpackage1/
        __init__.py
        submodule1.py
    subpackage2/
        __init__.py
        submodule2.py
    setup.py
    README.md
    requirements.txt
They share the same subpackage, submodule, and function names. Each module has a main function, which does the argument parsing for me and calls a function with those parameters. In my setup.py, I specified additional entry points so that I can call the modules from the command line:
import setuptools

with open("README.md", "r") as fh:
    long_description = fh.read()

with open('requirements.txt') as f:
    requirements = f.readlines()

setuptools.setup(
    name="A",
    version="0.0.1",
    author="Me",
    author_email="me@myself.com",
    description="Test package",
    long_description=long_description,
    long_description_content_type="text/markdown",
    packages=setuptools.find_packages(),
    entry_points={
        'console_scripts': [
            'command1 = subpackage1.submodule1:main',
            'command2 = subpackage2.submodule2:main'
        ]
    },
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    python_requires='>=3.6',
    install_requires=requirements
)
When I install the package in a blank docker container, it works fine and I can call my functions with 'command1' and 'command2' from the command line.
As stated before, package B has exactly the same setup.py file, except for the name. If I install that as well, package A now uses the entry points of package B instead of its own. That means I call the function with the right name, but from the wrong package.
I want to have them side-by-side in my docker container. How do I have to adjust my packages, so that the system can differentiate between them?
I installed the packages via pip from wheels that I generated.
First impression: the directory structure seems wrong. The first red flag is that there shouldn't be an __init__.py in the same directory as setup.py; the second red flag is that the directories next to setup.py cannot be sub-packages, they are top-level packages.
In your example the top-level packages are subpackage1 and subpackage2 in the project A, and in the project B as well. So in both cases, after installation the importable items are import subpackage1 and import subpackage2. It also means that when you install A, then B, the top-level packages of B overwrite those that were previously installed as part of A (since they have the exact same name).
What you probably want to do is add a directory a in project A, right next to its setup.py, and move both subpackageN directories as well as the __init__.py into that a directory (and do the same in project B). The directory structure then looks like:
A/
    a/
        __init__.py
        subpackage1/
            __init__.py
            submodule1.py
        subpackage2/
            __init__.py
            submodule2.py
    setup.py
    README.md
    requirements.txt
Then the imports will look like:
import a.subpackage1
import b.subpackage1
from a import subpackage2 as subpackagea2
from b import subpackage2 as subpackageb2
Following that the setup.py files should be adjusted accordingly:
# ...
setuptools.setup(
# ...
entry_points ={
'console_scripts': [
'commanda1 = a.subpackage1.submodule1:main',
'commanda2 = a.subpackage2.submodule2:main'
]
},
)
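The setup.py of project B would mirror this; a sketch, assuming its new top-level package directory is named b as in the imports above:
# ...
setuptools.setup(
    # ...
    entry_points={
        'console_scripts': [
            'commandb1 = b.subpackage1.submodule1:main',
            'commandb2 = b.subpackage2.submodule2:main'
        ]
    },
)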
I have a problem when trying to install a Python package that I have created.
The package includes a bitmap image which is used within the package (for OCR).
My folder structure is the following:
mypackage
    - mypackage
        - media
            - template.bmp
        - module1.py
        - module2.py
        - etc...
    - tests
    - MANIFEST.in
    - setup.py
template.bmp is used by module1.py.
The MANIFEST.in file:
include mypackage/media/template.bmp
The setup.py:
setup(
....
packages = find_packages(exclude=["*.tests", "*.tests.*", "tests.*", "tests"]),
include_package_data=True,
package_data={'mypackage': ['media/template.bmp']},
...
)
When I run
python setup.py sdist
I can verify that the media folder is included along with template.bmp in the .egg file. However, when referencing the bitmap in a module using
directory = os.path.dirname(os.path.abspath(__file__))
template_path = directory + '/media/template.bmp'
cv2.imread(template_path, 0)
I get a file not found error. The directory variable is the following:
'C:\\anaconda3\\lib\\site-packages\\mypackage-0.0.1-py3.6.egg\\mypackage'
Am I missing something?
Using pkg_resources solved my problem.
import pkg_resources

template_path = pkg_resources.resource_filename(__name__, '/media/template.bmp')
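If you can rely on a newer Python (3.9+), importlib.resources offers a similar approach; this is a sketch of an alternative, not what the answer above used:
# Sketch: resolve the bundled bitmap to a real file path, then read it with cv2.
# 'mypackage' is the package name from the question; requires Python 3.9+.
from importlib.resources import files, as_file

import cv2

resource = files("mypackage") / "media" / "template.bmp"
with as_file(resource) as template_path:
    template = cv2.imread(str(template_path), 0)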
I am working on a Python project and started using a setup.py file to install it. I followed some guides on how to structure the project and everything is working great after installing it.
My problem is that I "lost" the ability to run my application from inside the project's directory, locally so to speak, without installing it, for testing purposes. Here's what the directory layout looks like:
project_dir/
├── bin
│   └── app.py
├── data
│   └── data.conf
├── app_module
│   └── __init__.py
├── LICENSE
├── README.md
└── setup.py
The problem has to do with the imports. In app.py I have import app_module and it works as intended after it gets installed to the proper directory by running python setup.py install. If I want to run it by doing python bin/app.py it obviously doesn't work.
What am I doing wrong and how could I get this to work? Ideally I want to be able to run it from inside the directory and it should still work when installed with the setup.py script.
If your bin/app.py is only doing:
import app_module
app_module.run() # or whatever you called the main() function
you could (should!) use an entry_points declaration (search for console_scripts).
And while running your uninstalled code, you would do either:
python -m app_module or python app_module/__init__.py (assuming you have a __main__ section inside this file; otherwise look at how json.tool does it).
Edit
Here is a basic example, assuming your library is named "app_module".
setup.py
import imp
import os

import setuptools

module_name = 'app_module'
module = imp.load_module(
    module_name,
    *imp.find_module(module_name, [os.path.dirname(__file__)])
)

base = 'data'
data_files = [
    (
        '/usr/local/share/appdata/' + module_name + root[len(base):],
        [os.path.join(root, f) for f in files]
    ) for root, dirs, files in os.walk(base)
]

setuptools.setup(
    name=module_name,
    version=module.__version__,
    classifiers=(
        'Environment :: Console',
        'Operating System :: POSIX :: Linux',
    ),
    packages=(
        module_name,
    ),
    entry_points={
        # the executable will be called 'app' (not 'app.py')
        'console_scripts': ['app = ' + module_name + ':main'],
        # it does not seem unnecessarily complicated to me, it's just three lines
    },
    data_files=data_files,
)
app_module/__init__.py
import optparse
import sys

__version__ = '1.2.3'

def main():
    parser = optparse.OptionParser(version=__version__)
    parser.add_option('-m', '--man', action='store_true')
    opt, args = parser.parse_args()
    if opt.man:
        help(__name__)
        sys.exit()
    print opt, args

if __name__ == '__main__':
    main()
And now...
To run it:
python /full-or-relative/path/to/app_module/__init__.py
# or (PYTHONPATH must point at the directory *containing* app_module)
export PYTHONPATH=/full-or-relative/path/to
python -m app_module
To install and run it:
# test in a virtualenv
workon test
python setup.py install
app -h
What you need to do is either set the Python path to include your module (i.e. run it as PYTHONPATH=project_dir python project_dir/bin/app.py), or place app_module inside project_dir/bin so that it is found on the built-in path.
Note that not doing this is dangerous: once the package is installed, your app.py would otherwise pick up the installed app_module instead of the one in your project dir (which can get confusing).
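For example, from inside project_dir the PYTHONPATH approach might look like this (a sketch of the command described above):
cd project_dir
PYTHONPATH=. python bin/app.py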
I have a project with this structure:
SomeProject/
    bin/
    CHANGES.txt
    docs/
    LICENSE.txt
    MANIFEST.in
    README.txt
    setup.py
    someproject/
        __init__.py
        location.py
        utils.py
        static/
            javascript/
                somescript.js
And a "setup.py" as follows:
#!/usr/bin/env python
import someproject

from os.path import exists

try:
    from setuptools import setup, find_packages
except ImportError:
    from distutils.core import setup, find_packages

setup(
    name='django-some-project',
    version=someproject.__version__,
    maintainer='Some maintainer',
    maintainer_email='some@manteiner.com',
    packages=find_packages(),
    include_package_data=True,
    scripts=[],
    url='https://github.com/xxx/some-project',
    license='LICENSE',
    description='Some project description.',
    long_description=open('README.markdown').read() if exists("README.markdown") else "",
    install_requires=[
        "Django >= 1.4.0"
    ],
)
Then, when I upload it using the command:
python setup.py sdist upload
It seems OK, but there is no "static" folder with its "javascript" subfolder in the package. My "setup.py" was inspired by github.com/maraujop/django-crispy-forms, which has a similar structure. Any hint on what is wrong with uploading these subfolders?
You should be able to add those files to source distributions by editing the MANIFEST.in file to add a line like:
recursive-include someproject/static *.js
or just:
include someproject/static/javascript/*.js
This will be enough to get the files included in source distributions. If the setuptools include_package_data option you're using isn't enough to get the files installed, you can ask for them to be installed explicitly with something like this in your setup.py:
package_data={'someproject': ['static/javascript/*.js']},
Use the following:
packages=['.', 'templates', 'static', 'docs'],
package_data={'templates': ['*'], 'static': ['*'], 'docs': ['*']},