Unable to load modules with my current Python project structure

I have created a Flask application in portal/webapp.py. When I try to start the application using python portal/webapp.py I get the following error:
Traceback (most recent call last):
File "portal/webapp.py", line 3, in <module>
from db import TenantManager, QueryHandler
File "****/Project/portal/db.py", line 4, in <module>
from sql_parser.SQLParserTools import Parser, Builder
ImportError: No module named sql_parser.SQLParserTools
Here is my project structure:
Project/
    portal/
        __init__.py
        db.py
        manage.py
        test/
            __init__.py
            test_db.py
        webapp.py
    sql_parser/
        __init__.py
        error.py
        SQLParserTools.py
        StringParsers.py
        test/
            __init__.py
            test_parser.py

I think you should add ..../Project/ to your PYTHONPATH variable.
Note that since there is no __init__.py in Project, portal and sql_parser are effectively two separate top-level packages, one of which uses the other.
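A minimal sketch of that fix on Linux/macOS (the path is a placeholder; substitute the real location of Project):
export PYTHONPATH="$PYTHONPATH:/path/to/Project"
python portal/webapp.py
With Project on the search path, db.py can resolve sql_parser.SQLParserTools, and from db import ... keeps working because the directory of the script being run (portal) is always placed on the search path as well.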


Unable to import module from Python

I am trying to import
from app.idol.IP import CONTENT_IP_ADDRESS, CONTENT_PORT
I get the following error
Traceback (most recent call last):
File "testcontent.py", line 1, in <module>
from contentactions import get_dbs
File "C:\Code\Python\xxxxx\app\idol\content\contentactions.py", line 1, in <module>
from app.idol.IP import CONTENT_IP_ADDRESS, CONTENT_PORT
ModuleNotFoundError: No module named 'app'
See directory structure.
Absolute imports are resolved against Python's module search path, which, when you run a script by filename, starts with the script's own directory. From your screenshot it is clear that you are running testcontent.py from app\idol\content rather than main.py at the source root, so Python looks for a subdirectory named app inside app\idol\content and fails to find it.
You can instead run it from C:\Code\Python\K2Associates, either as a module (python -m app.idol.content.testcontent) or after putting that root directory on PYTHONPATH, so that the project root is on the search path.
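If you would rather not depend on how the script is launched, an in-file alternative is to put the project root on sys.path before the package imports run; a minimal sketch (the three ".." levels assume testcontent.py really lives in app\idol\content under the project root):
import os
import sys

# climb from app\idol\content\testcontent.py up to the project root and make it
# importable before anything from the app package is imported
ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", ".."))
if ROOT not in sys.path:
    sys.path.insert(0, ROOT)

from contentactions import get_dbs  # contentactions can now resolve app.idol.IP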
A package is imported via the directory that holds its __init__.py. If you want to import it as app, the __init__.py file has to live in a directory named app.
Another option is simply to rename __init__.py to app.py.

ModuleNotFoundError in Python 3 but not in Python 2

I have a project I want to run on different machines without needing to modify the PYTHONPATH environment variable. My project's structure is as follows:
awesome_project/
    data/
    scripts/
        __init__.py
        predict/
            importer/
                __init__.py
            __init__.py
            predict.py
        train/
            importer/
                __init__.py
            __init__.py
            train.py
        utils/
            __init__.py
            configuration.py
In my predict and train code I need to import variables defined in the configuration file inside utils. In Python 2 I defined the importer module, in which the __init__.py had the following code:
import sys
from os import getcwd
from os.path import sep
# drop the last two components of the current working directory (assumed to be
# scripts/predict or scripts/train) to get the project root, then make it importable
root_path = sep.join(getcwd().split(sep)[:-2])
sys.path.append(root_path)
And it worked like a charm. I imported the variables as from scripts.utils.configuration import models_path, but now that I am migrating my code to Python 3 this does not work at all; I get the following error:
Traceback (most recent call last):
File "predict.py", line 11, in <module>
from scripts.utils.configuration import models_path
ModuleNotFoundError: No module named 'scripts.utils'
What am I doing wrong?
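One adjustment worth noting (a sketch only, assuming the importer package is still imported before scripts.utils, as in the Python 2 setup) is to derive the root from the file's own location instead of the working directory, so the result does not depend on where Python is launched from:
# scripts/train/importer/__init__.py (and likewise for predict)
import os
import sys

# three levels up from .../scripts/train/importer is awesome_project
root_path = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", ".."))
if root_path not in sys.path:
    sys.path.append(root_path)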

ModuleNotFoundError when setting up Google Cloud Function with Cloud Source repository

I am deploying a Google Cloud Function based on code in a BitBucket repository. I have already linked up the BitBucket account to Google Cloud "Source Repositories" and the Google Function can find the repo, etc. The problem is that my main.py function needs to call several functions in other packages/modules within my repository. I have some simple import statements at the top of my main.py file like this:
import base64
import json
from datetime import datetime
from competitor_scrape.headless import headless_browser
...
The first few imports load fine, but the fourth line (the one that calls the module within my BitBucket repository) causes this error in Google Cloud Functions when I try to define my function with the main.py in my repository:
message: "Function load error: Code in file main.py can't be loaded.
Did you list all required modules in requirements.txt?
Detailed stack trace: Traceback (most recent call last):
File "/env/local/lib/python3.7/site-packages/google/cloud/functions_v1beta2/worker.py", line 211, in check_or_load_user_function
_function_handler.load_user_function()
File "/env/local/lib/python3.7/site-packages/google/cloud/functions_v1beta2/worker.py", line 140, in load_user_function
spec.loader.exec_module(main)
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/user_code/main.py", line 4, in <module>
from competitor_scrape.headless import headless_browser
ModuleNotFoundError: No module named 'competitor_scrape.headless'
It asks if I have "list[ed] all of the required modules in requirements.txt". Am I allowed to list the modules from my own repository? If so, how should I be doing that? I haven't been able to find any documentation on how to do this.
My current requirements.txt file looks like this:
google-cloud-pubsub
numpy==1.14.5
pandas==0.22.0
psycopg2==2.7.4
selenium==3.4.3
geopy==1.11.0
googlemaps==2.5.1
ratelimiter==1.2.0
sqlalchemy==1.2.0
zeep==2.5.0
EDIT/UPDATE
My repository file structure looks something like this right now:
.
├── competitor_scrape
│   ├── __init__.py
│   └── headless.py
├── main.py
└── requirements.txt
headless_browser is a function defined within headless.py. However, at this point my __init__.py inside competitor_scrape is empty (it was auto-generated by PyCharm). It seems that the __init__.py is probably the root of the problem. How should I be populating that script so that scripts/functions within competitor_scrape are available to the Google Cloud Function?
You shouldn't include your own modules in requirements.txt; that file is only for dependencies to be installed from PyPI.
An import statement like:
from competitor_scrape.headless import headless_browser
means that you should have a directory structure something like this:
.
├── competitor_scrape
│   ├── __init__.py
│   └── headless
│       └── __init__.py
├── main.py
└── requirements.txt
And in the competitor_scrape/headless/__init__.py file, you should have a variable called headless_browser.
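For what it's worth, the flatter layout the asker already has works just as well, as long as headless.py defines (or imports) a name called headless_browser; a minimal sketch with a placeholder body, not the real implementation:
# competitor_scrape/__init__.py can stay empty
# competitor_scrape/headless.py
def headless_browser():
    # placeholder for the asker's real Selenium logic
    raise NotImplementedError

# main.py
from competitor_scrape.headless import headless_browser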
I had almost the same issue with a config.py imported in main.py. It looks like everything listed in the .gitignore file was also being ignored by gcloud functions deploy. I had not seen this issue before!
Error (with config.py in .gitignore):
ModuleNotFoundError: No module named 'config'
Error (after deleting the .gitignore file):
ERROR: (gcloud.functions.deploy) Could not read ignore file [./.gitignore]: Unable to read file [./.gitignore]: [Errno 2] No such file or directory: './.gitignore'
Success (after adding an empty .gitignore file):
Deploying function (may take a while - up to 2 minutes)...done.
availableMemoryMb: 2048
entryPoint: handler
httpsTrigger:
url: https://europe-west1-my-project.cloudfunctions.net/my-function
labels:
deployment-tool: cli-gcloud
maxInstances: 10
...
When I added a .gitignore file that did not list config.py, everything worked as expected and I was able to deploy the cloud function. So it seems that .gitignore entries are excluded from the deployment; with previous deployments I had assumed this behaviour was handled by .gcloudignore.
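For reference, a sketch of a .gcloudignore that keeps git-ignored files such as config.py in the upload (the file names are taken from the description above; adjust as needed):
# .gcloudignore
.git
.gitignore
# deliberately NOT adding "#!include:.gitignore" here (the directive gcloud's
# default .gcloudignore uses), so files that git ignores, such as config.py,
# are still uploaded by gcloud functions deploy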
Unfortunately the issue was pretty simple. The file headless.py was in the local working directory, but was not added to Git revision control, so it was not getting updated in the cloud, and therefore could not be found by GCP.

Python 3.5 cannot import a module

I have read a ton of stackoverflow answers and a bunch of tutorials. In addition, I tried to read the Python documentation, but I cannot make this import work.
This is what the directory looks like:
myDirectory
├── __init__.py
├── LICENSE
├── project.py
├── README.md
├── stageManager.py
└── tests
    ├── __init__.py
    └── test_project.py
There is a class in project.py called Project, and I want to import it in a file under tests directory. I have tried the following:
Relative import:
from ..project import Project
def print_sth():
    print("something")
This gives me the following error (running from the tests directory as python test_project.py, and from myDirectory as python tests/test_project.py):
Traceback (most recent call last):
File "test_project.py", line 1, in <module>
from ..project import Project
SystemError: Parent module '' not loaded, cannot perform relative import
Absolute import with package name:
If I have something like the following, I get ImportError (with the same run command as above).
from project import Project
def print_sth():
    print("something")
------------------------------------------------------
Traceback (most recent call last):
File "test_project.py", line 1, in <module>
from project import Project
ImportError: No module named 'project'
and this too:
from myDirectory.project import Project
def print_sth():
    print("something")
------------------------------------------------------
Traceback (most recent call last):
File "test_project.py", line 1, in <module>
from myDirectory.project import Project
ImportError: No module named 'myDirectory'
Finally, I tried adding an if __name__ == '__main__' block to the test_project.py file, but it still failed. I would really appreciate it if anyone could help. If there is a solution where I do not have to write a verbose command, I would prefer that.
When you run a Python script by filename, the Python interpreter assumes that it is a top-level module (and it adds the directory the script is in to the module search path). If the script is in a package, that's not correct. Instead, you should run the module using the -m flag, which takes a module name in the same format as an import statement (dotted separators) and puts the current directory in the module search path.
So, you could run the test from myDirectory with: python -m tests.test_project. When you run the script this way, either of the kinds of imports you tried will work.
But if myDirectory is supposed to be a top-level package itself (as the __init__.py file suggests), you should probably go up one level further up, to myDirectory's parent, and run the script with two levels of package names: python -m myDirectory.tests.test_project. If you do this and want the test to use an absolute import you'd need to name the top level package that the project module is in: from myDirectory.project import Project.
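Putting the two pieces together, a minimal sketch of the second option (reusing the question's own print_sth as the module body):
# myDirectory/tests/test_project.py
from myDirectory.project import Project  # absolute import through the top-level package

def print_sth():
    print("something")

if __name__ == "__main__":
    print_sth()

# run from myDirectory's parent directory:
#   python -m myDirectory.tests.test_project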

'no module' error with __init__.py in directory using Nosetests

I have the following directory structure:
Chippa/
    bin/
        __init__.py
        app.py
    tests/
        __init__.py
        app_tests.py
        tools.py
    templates/
        hello_form.html
        index.html
        layout.html
    docs/
In my app_tests.py file, I have:
from nose.tools import *
from bin.app import app
from tests.tools import assert_response
When I try to run app_tests.py from one level above the tests directory, i.e. from inside the Chippa directory, like so:
python tests/app_tests.py
I get the following error:
Traceback (most recent call last):
File "tests/app_tests.py", line 3, in <module>
from bin.app import app
ImportError: No module named bin.app
But I do have an empty __init__.py in the bin directory, which I thought would have prevented this issue. What am I missing here?
For that import to work, you have to actually be running a proper module in the first place. So, rather than
python tests/app_tests.py
try
python -m tests.app_tests
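For example, from inside Chippa (these are standard Python/nose invocations, nothing project-specific):
python -m tests.app_tests
nosetests tests/app_tests.py
Both end up with Chippa itself on the module search path (nose walks up past the __init__.py files to find the package root), so from bin.app import app and from tests.tools import assert_response can resolve.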
