I have the structure below.
In migrations/env.py I am trying to `from database import *`,
but it fails with "No module named database".
I also tried `from ..database import *` and adding the directory to PYTHONPATH, but no luck :(
Your directory structure looks a bit suspicious to me. The alembic.ini shouldn't normally be part of the package (and setuptools won't pick it up by default when packaging); it would be better placed in the project root.
Something like this would be more standard:
├── alembic.ini
├── migrations
│   ├── env.py
│   ├── script.py.mako
│   └── versions
│       └── ...
├── package_name
│   └── database
│       ├── __init__.py
│       ├── ...
│       └── models
│           ├── __init__.py
│           └── ...
├── README.md
├── setup.py
└── ...
Now, this alone will not make database available from env.py. For that to work you have to make your package discoverable somehow. Usually this is done by installing package_name into a virtualenv. In that environment you can then use from package_name.database import * in your env.py.
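If installing the package is not convenient, a common workaround is to prepend the project root to sys.path at the top of env.py, before the import. The snippet below is only a sketch that builds such a layout in a temporary directory to demonstrate the mechanism (package_name and the TABLES variable are hypothetical stand-ins):

```python
import os
import sys
import tempfile

# Simulate the layout above: <project-root>/package_name/database/__init__.py
root = tempfile.mkdtemp()
db_dir = os.path.join(root, "package_name", "database")
os.makedirs(db_dir)
open(os.path.join(root, "package_name", "__init__.py"), "w").close()
with open(os.path.join(db_dir, "__init__.py"), "w") as f:
    f.write("TABLES = ['users']\n")

# The step env.py would perform before its imports:
sys.path.insert(0, root)

from package_name.database import TABLES
print(TABLES)  # → ['users']
```

In a real env.py the equivalent line would be sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__)))), since env.py sits one directory below the project root. Installing the package properly remains the cleaner option.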
Migrations need to know where to import from. Either both directories belong to the same package:

A/
    migrations/
    database/
    __init__.py

and then in migrations you import via the package:

from A.database.whatever import whatever_else

Or you install them as separate packages inside your virtualenv:

database/setup.py
migrations/setup.py

Each can then depend on the other, and because both are installed, migrations/env.py can import the installed database package.
I have a project
testci/
├── __init__.py
├── README.md
├── requirements.txt
├── src
│   ├── __init__.py
│   └── mylib.py
└── test
    ├── __init__.py
    └── pow_test.py
When I run python3.6 test/pow_test.py I see an error:
File "test/pow_test.py", line 3, in <module>
    import testci.src.mylib as mylib
ModuleNotFoundError: No module named 'testci'
pow_test.py

from testci.src.mylib import get_abs

def test_abs():
    assert get_abs(-10) == 10
How can I fix this error?
System details: Ubuntu 16.04 LTS, Python 3.6.10
Try this:

from .src import mylib
from mylib import get_abs

If that doesn't work, import them one by one. But don't import via the root folder: since the file you are importing into is inside the same tree you are trying to import from, that will always raise an error.
Run Python with the -m argument from the directory containing the testci package to execute the test as a submodule.
I made a similar mock folder structure:
├───abc_blah
│       abc_blah.py
│       __init__.py
│
└───def
        def.py
        __init__.py
abc_blah.py
print('abc')
def.py
import abc_blah.abc_blah
Execute like such:
python -m def.def
Correctly prints out 'abc' as expected here.
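The same experiment can be scripted end to end. Note that I renamed the def folder to defmod here, because def is a Python keyword and makes an awkward package name; the -m mechanism is identical:

```python
import os
import subprocess
import sys
import tempfile

# Recreate the mock layout in a temporary directory
root = tempfile.mkdtemp()
for pkg in ("abc_blah", "defmod"):
    os.makedirs(os.path.join(root, pkg))
    open(os.path.join(root, pkg, "__init__.py"), "w").close()
with open(os.path.join(root, "abc_blah", "abc_blah.py"), "w") as f:
    f.write("print('abc')\n")
with open(os.path.join(root, "defmod", "defmod.py"), "w") as f:
    f.write("import abc_blah.abc_blah\n")

# python -m adds the current working directory to sys.path,
# so the sibling package abc_blah is importable
result = subprocess.run([sys.executable, "-m", "defmod.defmod"],
                        cwd=root, capture_output=True, text=True)
print(result.stdout.strip())  # → abc
```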
Simply add __package__ = "testci". It is also good practice to wrap the import in a try/except block. Your final code should look something like:

try:
    from testci.src.mylib import get_abs
except ModuleNotFoundError:
    from .src.mylib import get_abs

To run it, type python -m test.pow_test
I think your issue is how the package is installed; the import itself looks fine to me. Since you mention CI, I'm guessing you have the package installed remotely with only the test folder present somehow.
Try adding a setup.py file where you define that both the test as well as the src packages are part of your testci package.
There are many ways to organize a project. Keep in mind that the structure should be simple and scalable, so that you can easily tell things apart in the codebase.
One good possible way to structure a project is below:
project/
├── app.py
├── dockerfile
├── pipfile
├── Readme.md
├── requirements.txt
├── src_code
│   ├── code
│   │   ├── __init__.py
│   │   └── mylib.py
│   └── test
│       ├── __init__.py
│       └── test_func.py
└── travisfile
Here app.py is the main file, which is responsible for running the entire project.
I'd like to figure out the cleanest, and preferably self-contained, way to use my packages in scripts that live in a different directory from the package itself.
The example problem is as follows:
The modules in lib need to both be imported, and run as a script.
My project directory is as below and I'm having two issues:
in lib/api.py, I want to read in data_files/key.txt correctly when api.py is called or imported
in testing_script.py I want to import and use lib/get_data.py
I can't seem to find a clean way to do this, does this mean my project is structured in a non-pythonic way?
Thanks for the help.
my-project-git
├── LICENSE
├── README.md
├── my_project
│   ├── data_files
│   │   ├── key.txt
│   │   └── mappings.csv
│   ├── lib
│   │   ├── __init__.py
│   │   ├── api.py
│   │   └── get_data.py
│   └── test
│       ├── __init__.py
│       └── testing_script.py
├── requirements.txt
└── setup.py
As far as I know, there isn't one canonical pythonic way to structure your project.
This is what Kenneth Reitz recommended in 2013 and it's how I use it: https://www.kennethreitz.org/essays/repository-structure-and-python.
README.rst
LICENSE
setup.py
requirements.txt
sample/__init__.py
sample/core.py
sample/helpers.py
docs/conf.py
docs/index.rst
tests/test_basic.py
tests/test_advanced.py
Inside sample (my_project in your case) you can separate into categories as you like. E.g. Utils (common functions), Database (read, write), View (user commands), etc. It depends on your project.
As for calling modules at the same level, you should import them in the __init__.py file of the top-level package, which is sample in this case.
For example:
__init__.py in sample (my_project in your case):
from sample.core import a_Class
from sample.core import a_function
from sample.core import anything
then from /test/test_basic.py you do:
from sample import a_Class
# or import sample
a = a_Class() # use the class from core.py
# or a = sample.a_Class()
Take a look at the sample module repository: https://github.com/navdeep-G/samplemod
I'm trying to build my first publicly available python package, but I'm having some trouble installing it on another machine and I'm not sure what is wrong. My project is here.
After all the CI steps on the master branch, Travis publishes the latest version to the pypi. After that, we can install the package in any place:
pip install spin-clustering
But when I try to import it on my regular python it says that the module does not exist.
$ python -c "import spin"
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'spin'
My package was originally called "spin" but that name was already taken on pypi, so I changed it to "spin-clustering". But as scikit-learn is imported as "sklearn", I thought it would be possible to import my package as "spin". Not sure what I'm missing here.
This is my package structure:
├── LICENSE
├── Makefile
├── Pipfile
├── README.md
├── examples
│   ├── circle-example.ipynb
│   └── random-cluster-example.ipynb
├── setup.cfg
├── setup.py
└── spin
    ├── __init__.py
    ├── distances
    │   ├── __init__.py
    │   ├── distances.py
    │   └── tests
    │       └── __init__.py
    ├── neighborhood_spin.py
    ├── side_to_side_spin.py
    ├── tests
    │   ├── __init__.py
    │   ├── test_spin.py
    │   └── test_utils.py
    └── utils.py
And my setup.py
from setuptools import setup, find_packages

setup(name="spin-clustering",
      maintainer="otaviocv",
      maintainer_email="otaviocv.deluqui#gmail.com",
      description="SPIN clustering method package.",
      license="MIT",
      version="0.0.3",
      python_requires=">=3.6",
      install_requires=[
          'numpy>=1.16.4',
          'matplotlib>=3.1.0'
      ]
      )
In your setup.py, you also need to specify what packages will be installed. The simplest way is using the provided find_packages function, which will scan your folders and try to figure out what the packages are (in some slightly unusual cases, your project organization will make this not work right). Your code imports find_packages, but is not using it.
Since you have none listed, nothing is actually installed (except the requirements, if missing).
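To illustrate what find_packages discovers (a sketch; the temporary tree below mimics the spin layout from the question), note that every directory with an __init__.py becomes an installable package:

```python
import os
import tempfile
from setuptools import find_packages

# Miniature version of the package tree from the question
root = tempfile.mkdtemp()
for d in ("spin", "spin/distances", "spin/tests"):
    os.makedirs(os.path.join(root, d), exist_ok=True)
    open(os.path.join(root, d, "__init__.py"), "w").close()

packages = sorted(find_packages(where=root))
print(packages)  # → ['spin', 'spin.distances', 'spin.tests']
```

Passing packages=find_packages() to setup() in the setup.py above would make pip actually install the spin package alongside the requirements.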
I'm trying to build a package that uses both python and cython modules. The problem I'm having deals with imports after building and installing where I'm not sure how to make files import from the .so file generated by the build process.
Before building my folder structure looks like this
root/
├── c_integrate.c
├── c_integrate.pyx
├── cython_builder.py
├── __init__.py
├── integrator_class.py
├── integrator_modules
│   ├── cython_integrator.py
│   ├── __init__.py
│   ├── integrator.py
│   ├── numba_integrator.py
│   ├── numpy_integrator.py
│   ├── quadratic_error.png
│   ├── report3.txt
│   ├── report4.txt
│   └── report5.txt
├── report6.txt
├── setup.py
└── test
    ├── __init__.py
    └── test_integrator.py
Building with python3.5 setup.py build gives this new folder in root
root/build/
└── lib.linux-x86_64-3.5
    ├── c_integrate.cpython-35m-x86_64-linux-gnu.so
    ├── integrator_modules
    │   ├── cython_integrator.py
    │   ├── __init__.py
    │   ├── integrator.py
    │   ├── numba_integrator.py
    │   └── numpy_integrator.py
    └── test
        ├── __init__.py
        └── test_integrator.py
The setup.py file looks like this
from setuptools import setup, Extension, find_packages
import numpy

setup(
    name="integrator_package",
    author="foo",
    packages=find_packages(),
    ext_modules=[Extension("c_integrate", ["c_integrate.c"])],
    include_dirs=[numpy.get_include()],
)
My question is then: how do I write import statements for the functions from the .so file in integrator_class.py in root, and in cython_integrator and test_integrator located in the build directory? Appending to sys.path seems like a quick and dirty solution that I don't much like.
EDIT:
As pointed out in the comments I haven't installed the package. This is because I don't know what to write to import from the .so file
In no specific order:
The file setup.py is typically located at the top level of a project, next to the package directory. Example:

library_name/
    __init__.py
    file1.py
setup.py
README
Then, the build directory appears alongside the project's source and not in the project source.
To import the file c_integrate.cpython-35m-x86_64-linux-gnu.so in Python, just import "c_integrate". The rest of the naming is taken care of automatically as it is just the platform information. See PEP 3149
A valid module is one of:

a directory with a modulename/__init__.py file
a file named modulename.py
a file named modulename.PLATFORMINFO.so

of course located in the Python path. So there is no need for an __init__.py file for a compiled Cython module.
For your situation, move the Cython code into the project directory and either do a relative import (from . import c_integrate) or a full from integrator_modules import c_integrate, where the latter only works when your package is installed.
Some of this information can be found in my blog post on Cython modules: http://pdebuyl.be/blog/2017/cython-module.html
I believe that this should let you build a proper package, comment below if not.
EDIT: to complete the configuration (see comments below), the poster also
Fixed the module path in the setup.py file so that it is the full module name starting from the PYTHONPATH: Extension("integrator_package.integrator_modules.c_integrator", ["integrator_package/integrator_modules/c_integrator.c"]) instead of Extension("c_integrate", ["c_integrate.c"])
Cythonized the module, built it and used it with the same Python interpreter.
Further comment: the setup.py file can cythonize the file as well. Include the .pyx file instead of the .c file as the source.
cythonize(Extension('integrator_package.integrator_modules.c_integrator',
                    ["integrator_package/integrator_modules/c_integrator.pyx"],
                    include_dirs=[numpy.get_include()]))
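Putting those pieces together, a complete setup.py along these lines might look like the sketch below (it assumes Cython and numpy are installed, and that the sources have been moved into integrator_package/integrator_modules/ as described above):

```python
# setup.py (sketch; package/file names follow the layout discussed above)
import numpy
from setuptools import setup, Extension, find_packages
from Cython.Build import cythonize

setup(
    name="integrator_package",
    author="foo",
    packages=find_packages(),
    ext_modules=cythonize(
        Extension(
            "integrator_package.integrator_modules.c_integrator",
            ["integrator_package/integrator_modules/c_integrator.pyx"],
            include_dirs=[numpy.get_include()],
        )
    ),
)
```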
This is my first time trying to set up a vagrant environment or a python virtualenv, so forgive me if I am missing something basic.
Right now, I ssh into my vagrant box and in the home directory I have placed my venv folder. I have run
source venv/bin/activate
From my home directory I move to /vagrant, and within here I have my project files laid out something like this:
project
├── LICENSE
├── project
│   ├── exceptions.py
│   ├── __init__.py
│   ├── resources
│   │   ├── base.py
│   │   └── __init__.py
│   └── target
│       ├── __init__.py
│       └── test.py
└── README.md
My problem is I am unable to import my modules in different directories. For example, if I am in /vagrant/project/project/target/test.py and I attempt:
import project.exceptions
I will get the error
ImportError: No module named project.exceptions
If I am in the /vagrant/project/project directory and I run
import exceptions
that works fine.
I have read up on similar problems people have experienced on StackOverflow.
Based on this question: Can't import package from virtualenv I have checked that my sys.executable path is the same in both my python interpreter as well as when I run a script (home/vagrant/venv/bin/python)
Based on this question: Import error with virtualenv. I have run ~/venv/bin/python directly and attempted to import, but the import still fails.
Let me know if there is more information I can provide. Thank you.
You have two options:
You can install your project into the virtual environment, by writing a setup.py file and by calling python setup.py install. See the Python Packaging User Guide.
You can set the PYTHONPATH environment variable to point to your project, like this:
$ export PYTHONPATH=$PYTHONPATH:/vagrant/project
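For the first option, a minimal setup.py placed in /vagrant/project, next to the inner project package, could look like this sketch (the name and version are placeholders):

```python
# setup.py (sketch; metadata values are placeholders)
from setuptools import setup, find_packages

setup(
    name="project",
    version="0.1",
    packages=find_packages(),  # finds project, project.resources, project.target
)
```

After python setup.py install (or pip install -e . during development) inside the activated virtualenv, import project.exceptions works from any directory.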