Mypy, python paths, "cannot find implementation or library stub" - python

I'm trying to use mypy with a package that I've written, but it can't find my stub file.
I have a workspace which looks like this:
/common
/other_dir
/another_dir
I have used a script to add all of these directories to my sys.path.
Inside each directory is a src/ directory, which contains Python packages and is itself a top-level package (it has an __init__.py).
In /common/src/test1 I have a module called components.py, and I've written another file next to it, components.pyi.
This should work as the stub file for components.py.
In /another/src/example.py, I import like this:
from common.src.test1.components import x
x is detected and I can use it, but when I run mypy ./another/src/example.py, it says "Cannot find implementation or library stub for module named 'common.src.test1.components'".
It would be great if anyone who has experience with mypy could help with this.
Many thanks.

try
from common.src.test1.components import x # type: ignore
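
Not part of the original answer, but worth noting: mypy does not see sys.path entries added at runtime by a script; it resolves imports via the MYPYPATH environment variable (or the mypy_path config option). A hedged sketch, where /path/to/workspace is a placeholder for the directory containing /common and /another_dir:
MYPYPATH=/path/to/workspace mypy ./another/src/example.py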

Related

How to import modules from a script that could be called from anywhere?

I have a repo with Python code whose structure could be boiled down to this:
repo_root\
    tool1\
        tool1.py
        tool1_aux_stuff.py
    tool2\
        tool2.py
        tool2_aux_stuff.py
    lib\
        lib1\
            lib1.py
            lib1_aux_stuff.py
        lib2\
            lib2.py
            lib2_aux_stuff.py
The following rules apply to the module usage:
Any tool could use the modules from any library and from its own package, but not from another tool's package.
Any library could use the modules from any other library, and from its own package. Libraries never access the tool modules.
There must be a way to invoke any tool from any working directory, including those outside repo_root.
The question is: how do I import the lib modules from the tool ones?
I know that if I add __init__.py to each tool and lib directory and to the repo root, then I would be able to use absolute paths from the root, i.e. in tool1.py I could write
import lib.lib1, lib.lib2.lib2_aux_stuff
However, if I execute tool1.py from a random place, e.g.
machine_name: ~/random/place$ python /path/to/repo/tool1/tool1.py
I get ModuleNotFoundError: No module named 'lib'.
I am aware of a workaround which could be implemented using the PYTHONPATH env variable by augmenting it with an absolute path to repo_root and supplying it to the invocation of the tool script, i.e.:
machine_name: ~/random/place$ PYTHONPATH=$PYTHONPATH:/path/to/repo python /path/to/repo/tool1/tool1.py
but I would really prefer something less clunky.
Any ideas how I could do it in a more straightforward way?
Add the path to lib to sys.path using sys.path.append('/custom/path/to/modules'). It should then be importable as a module.
You do also need to add __init__.py files in any directory that you want to import from as a package; otherwise Python doesn't treat it as a package and you'll get another ImportError.
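For instance, a minimal sketch of what that could look like at the top of tool1.py (this uses pathlib and is just one way to locate the repo root; it is not part of the original answer):

import sys
from pathlib import Path

# repo_root is one level above the tool1/ directory that contains this file
repo_root = Path(__file__).resolve().parent.parent
if str(repo_root) not in sys.path:
    sys.path.append(str(repo_root))

import lib.lib1
import lib.lib2.lib2_aux_stuff

This keeps python /path/to/repo/tool1/tool1.py working from any directory, at the cost of a few boilerplate lines at the top of each tool script.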

Sphinx autodoc fails to import module

I'm trying to document a project using Sphinx, and am running into an issue where only some modules are being imported from a folder. My project structure looks like this:
Project
|
|--Main
|  |--Scripts
|     __init__.py
|     libsmop.py
|     conv_table.py
|     f_discrim.py
|     recipes.py
|     ...
When I try to run make html, libsmop and recipes are imported without any issue, however conv_table and f_discrim get the following error:
WARNING: autodoc: failed to import module u'conv_table' from module u'Scripts'; the following exception was raised:No module named conv_table
I don't think it's my config file because it's finding all of the files when I run sphinx-apidoc -o _rst Main/Scripts and I've confirmed that they appear in the resulting Scripts.rst file.
Why is autodoc finding some modules but not others?
Edit:
conv_table.py is of this form:
import re
import numpy as np

"""
conv_table dictionary at the bottom of this file maps from matlab functions
to their python equivalents.
"""

def get_args(line, separator=",", open_char='(', close_char=')'):
    """Returns the arguments of line
    >>> get_args('ones(3,1,length(arr))')
    ...

< a bunch of function definitions >
...
conv_table = {... < a very big dictionary > ...}
Since autodoc is picking up some of the modules, the likely cause is that the dependencies of the failing modules are either 1) not imported correctly or 2) not installed in your Python environment. You will want to check that all the import statements work within the failing modules.
You will want to check the module loading path, according to the Sphinx docs:
For Sphinx (actually, the Python interpreter that executes Sphinx) to find your module, it must be importable. That means that the module or the package must be in one of the directories on sys.path – adapt your sys.path in the configuration file accordingly.
Also, it would be useful to know what your __init__.py in the Scripts directory looks like, and what the conv_table module looks like as well.
I had a similar issue to yours; the fix was to add the path that holds that module to sys.path inside the ../source/conf.py file, using either
sys.path.insert(0, os.path.abspath('whatever relative path works for your folder structure'))
or
sys.path.append('/path/to/the/conv_table/')
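For concreteness, a hedged sketch of what that could look like near the top of conf.py; the relative path here is an assumption about where conf.py sits relative to the Project/Main/Scripts layout above, so adjust it to your tree:

import os
import sys
# Put Main on sys.path so that Scripts (and Scripts.conv_table) can be imported by autodoc
sys.path.insert(0, os.path.abspath('../../Main'))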
Installing this library in your environment should resolve the problem as of now:
pip install sphinxcontrib-bibtex
After running the make html command it may still warn you about configuration problems.

Import Error: No Module named common

My folder structure in pycharm is as follows.
--python
    --concepts
        --common
            --myds.py
            --__init__.py
        --data_structures
            --test_ds.py
I have the following line in test_ds.py
from common import my_ds
I get the following error.
ImportError: No module named 'common'
I have added common to Settings --> Project Interpreter -> Interpreter Paths
and the folder shows up as library root.
Still, why am I getting this error?
Try from ..common import my_ds. Also make sure that there is an __init__.py file in that directory (not required, but it's good practice).
As for the .., they indicate that you're importing from the package that is the parent of the one you're currently in.
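A minimal sketch of how that could look, assuming concepts, common, and data_structures each contain an __init__.py (the module name myds below follows the file myds.py shown above; the question also refers to it as my_ds):

# concepts/data_structures/test_ds.py
from ..common import myds

Note that a relative import like this only works when test_ds.py is executed as part of the package, e.g. with python -m concepts.data_structures.test_ds from the directory containing concepts/, not when it is run directly as a script.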
You need to make your common folder into a Python package in order to import it. I think you've tried to do that and created an init file in your common folder, but it must be named __init__.py. Rename it and your package will be visible to Python.
Hope it helps!

__init__ file doesn't work as expected in python

I have some folders and .py files in the following structure:
parent/
    __init__.py
    test.ipynb
    code/
        __init__.py
        common.py
        subcode/
            __init__.py
            get_data.py
In the __init__.py under the parent folder I have import code, and in the one under code I have import subcode. But when I try import code.subcode, I get this error:
ImportError: No module named 'code.subcode'; 'code' is not a package
But when I just import code, no error is thrown. However, when I then access code.subcode, this error happens:
AttributeError: module 'code' has no attribute 'subcode'
I tried all of the above in test.ipynb, which sits at the top level, in the parent folder.
Do you know what is the reason and how can I fix it? Thanks!
The problem is that you are importing another module named code that is installed on your system rather than your own module. You can verify this by checking the module file path in code.__file__ after you import code.
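A quick way to run that check from test.ipynb (a sketch):

import code
print(code.__file__)  # a path inside the standard library means your own package is being shadowed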
The first thing to do is change the name of your module to avoid the namespace collision with the other code package on your system. If your new package name doesn't collide with anything else, you should now either be importing it successfully and seeing it behave as expected, or it should fail to import entirely.
If it fails to import, it is most likely because your parent directory is not in your PYTHONPATH environment variable.
There can also be other, more technical reasons that a module is not recognized by the interpreter, such as old definitions being cached (in which case restarting the interpreter is often enough, possibly after deleting any precompiled versions of the module). Another problem I have seen turned out to be a module containing a bug that made the interpreter unable to parse it. I am sure there are other odd possibilities out there.
You're on Python 3. You need to perform relative imports explicitly:
from . import code
The code module you're currently getting is the standard library code module.

Python No module named

I have a custom module that I am trying to read from a folder under a hierarchy:
project-source
    tests/
        provider/
            my_provider.py
        settings_mock.py
        __init__.py
In my_provider.py, I am trying to call
import tests.settings_mock as settings
Example from command line:
project-source> python tests/provider/my_provider.py
Error:
... ImportError: No module named settings_mock
I keep getting No module named settings_mock as the error. I have already exported the project_source path to PYTHONPATH. I have made tests into a package by creating an __init__.py file in its root, but the error didn't change.
I can print the settings_mock.py attributes from an interactive session when cd'ed into project-source:
>>> import tests.settings_mock as settings
>>> print settings.storage_provider
correct storage provider value
Is anyone able to point out my mistake here? Thanks!
You only have one small mistake. To use subfolders as packages, you need __init__.py, not init.py as you stated in the question. __init__.py is the special filename the Python interpreter looks for, whereas init.py is not. Having this file in each subfolder tells the Python interpreter that the folder is a "package" that needs to be initialized.
UPDATED: It should also be noted that when a script is run directly, Python puts the script's own directory on the import path, not your current working directory or the project root. If your executable main script is my_provider.py, it won't find tests.settings_mock, because the main script lives in a lower directory than the package it is trying to import. Think of it as a hierarchy: a script run directly can only import things at or below its own directory. Try separating the executable entry point from everything else in that file, if there are things that settings_mock needs to import.
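
Not part of the original answer, but a common way around this (assuming provider/ also gets its own __init__.py) is to run the file as a module from project-source, so the project root stays on the import path:
project-source> python -m tests.provider.my_provider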
