Import error in VSCode despite setting the PYTHONPATH

I have an Airflow environment deployed locally on WSL, and I am using VSCode to write and debug code.
My app folder structure is as follows:
~/workspaces
|--- .env
|--- organization/gcp/datalake
|    |--- dags
|    |    |--- # My DAGs
|    |--- plugins
|    |    |--- __init__.py
|    |    |--- operators
|    |    |    |--- __init__.py
|    |    |    |--- facebook_operators.py
|    |    |--- hooks
|    |    |    |--- __init__.py
|    |    |    |--- facebook_hooks.py
I am having trouble understanding how VSCode resolves imports.
I added the dags and plugins folders to PYTHONPATH via the .env file. VSCode is opened directly on the workspaces directory.
The problem:
I get import errors, although I can successfully go to definition of the class I want to import.
Example: in my facebook_operators.py,
from hooks.facebook_hooks import FacebookAdsHook
raises the following error:
No name 'facebook_hooks' in module 'hooks'
The contents of my .env file:
PROJECTDIR=~/workspaces/organization/gcp/datalake
PYTHONPATH=${PROJECTDIR}/plugins
PYTHONPATH=${PROJECTDIR}/dags:${PYTHONPATH}
Where did I go wrong? I'd like to understand and solve this error please.

Add the following statements at the beginning of the file facebook_operators.py to help VSCode find the module that needs to be imported:
import os, sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
(By default, VSCode looks for imports relative to the parent folder of the currently opened file. The statements above add the plugins directory, two levels up from facebook_operators.py, to the system path so that the hooks package can be found.)
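For context, the top of facebook_operators.py would then look roughly like the sketch below (the class and module names are taken from the question; the nested dirname calls resolve to the plugins directory, two levels up from this file):
import os
import sys

# Put .../datalake/plugins on sys.path so that 'hooks' is importable
# as a top-level package, no matter where the debugger was started from.
PLUGINS_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if PLUGINS_DIR not in sys.path:
    sys.path.append(PLUGINS_DIR)

from hooks.facebook_hooks import FacebookAdsHook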

Related

Virtual environment for a package

I am working on a program that accesses a set of custom modules. I organized the modules in a subfolder as their own package with an __init__.py. Moreover, at the top level of the directory I have created a virtual environment that holds my dependencies. The folder structure is as follows:
project
+-- main_program.py
+-- venv
| +-- cool_package.py
+-- mypackage
| +-- module1.py
| +-- module2.py
| +-- __init__.py
The issue is that module2.py depends on a package I installed in venv. Running module2.py from main_program.py gives an error that "cool_package.py" is not found.
How do I organize things so that I can access cool_package.py from main_program.py together with all the other needed packages, and make cool_package.py accessible to the custom package containing module2.py as well?
I may have misunderstood what you mean by your virtual env, but based on your folder and file layout I think you need to add an __init__.py file to your venv folder to make it a package; then you should be able to import venv.cool_package.
Thanks for all the answers - it eventually worked after properly activating the environment before running the script. Something must have gone wrong the first time; all works now and the folder structure is correct.
Best
Moritz
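For reference, "activating the environment" here means sourcing the venv's activation script before launching the program, e.g. on Linux/macOS (with the venv folder from the layout above):
source venv/bin/activate
python main_program.py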

Portable way to import modules from parent directory in Python

I know this is a worn-out topic, but the import mechanisms in Python still confuse the masses. What I want is the ability to import a custom module that is in a parent directory, in a way that allows me to take a project to another environment and have all of the imports work.
For example a structure like this:
repo
|--- folder1
| |--- script1.py
|--- folder2
| |--- script2.py
|--- utils
| |--- some-util.py
How can I import from some-util.py in both script1 and script2? The idea is that I could clone the repo onto a remote host and run scripts from folders 1 and 2 that share the dependency on some-util.py, but I don't want to have to run anything beforehand. I want to be able to:
connect to box
git clone repo
python repo/folder1/script1.py
contents of script1 and script2:
import some-util
<code>
EDIT:
I forgot to mention that occasionally the scripts need to be run from another directory like:
/nas/some_folder/repo/folder1/script1.py args..
Also, the box is limited to python 2.7.5
The trick is to implement your scripts as modules (read here and here for an overview of what the python -m switch means).
Here is a structure; notice that every directory contains an (empty) file named __init__.py:
repo/
|____ utils/
|    |____ someutil.py
|    |____ __init__.py
|____ __init__.py
|____ folder1/
|    |____ script1.py
|    |____ __init__.py
utils.someutil may contain something like this:
def say_hello():
    return "Hello World."
And your script1.py may contain something like:
from ..utils.someutil import say_hello
if __name__ == "__main__":
    print(say_hello())
Then running the following:
python -m repo.folder1.script1
... produces:
Hello World.
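Note that python -m resolves repo as a package starting from the current working directory, so the command has to be issued from the directory that contains repo. For the layout mentioned in the EDIT, that would be something like:
cd /nas/some_folder
python -m repo.folder1.script1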

How to resolve VSCode import warnings with python scripts?

VSCode gives a warning whenever I import a python file from the same directory, but in practice everything works fine when the scripts run.
In a sample directory:
root_folder
+-- __init__.py (empty)
+-- __main__.py (empty)
+-- __import.py (contains Parent class)
+-- toImport.py (contains Child(Parent) class)
I try the following in toImport.py:
from __import import Parent
class Child(Parent): ...
However, I keep getting an 'unresolved import' warning even though it works. How can I resolve this issue, or is it a VSCode issue?
Add this to your project's .vscode/settings.json file.
{
"python.autoComplete.extraPaths": ["./root_folder"]
}
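As far as I know, relative entries in extraPaths are resolved against the workspace root, so ./root_folder assumes VSCode is opened on the parent of root_folder; if root_folder itself is the workspace root, the entry would be "." (or an absolute path) instead.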

Configure pylint for modules within eggs (VS Code)

Project structure
I have the following folder structure
|
|- src
| |- mypackage
| | |- __init__.py
| | |- mymodule.py
| |- utils.egg
|- main.py
In the mymodule.py file I can import the egg by adding it to sys.path:
import sys
sys.path.append('src/utils.egg')
import utils
When calling main.py everything works fine (python -m main).
Problem
The problem comes from pylint. First, it shows the following message in the mymodule.py file:
Unable to import 'utils' pylint(import-error)
If I ask for suggestions (CTRL + Space) when importing, I get
utils.build
.dist
.utils
.setup
# |- suggestions
And from utils.utils I can access the actual classes / functions in the utils module. Of course, if I import utils.utils, an import error pops up when executing the main script.
How can I configure my VSCode settings in order to fix pylint?
Should I install the egg instead of copying it to the working folder?
Is my project's folder structure OK, or does it go against recommended practices?
Extra info
In case you wonder, the EGG-INFO/SOURCES.txt file looks like
setup.py
utils/__init__.py
utils/functions.py
utils.egg-info/PKG-INFO
utils.egg-info/SOURCES.txt
utils.egg-info/dependency_links.txt
utils.egg-info/top_level.txt
utils/internals/__init__.py
utils/internals/somemodule.py
utils/internals/someothermodule.py
Also, there are no build or dist folders in the egg.
This is an issue with Pylint itself and not the Python extension, so it will come down to however you need to configure Pylint.
As for whether you should copy an egg around or install it, you should be installing it into your virtual environment, or at least copying over the appropriate .pth file to make the egg directory work appropriately.
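Until the egg is installed into the environment, a somewhat more robust variant of the sys.path hack from the question is to resolve the egg's location relative to the importing file instead of the current working directory (the paths below are taken from the question's layout; this only fixes the runtime import, not pylint, which resolves imports on its own):
import os
import sys

# mymodule.py lives in src/mypackage/, so the egg sits one level up at src/utils.egg.
EGG_PATH = os.path.abspath(
    os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "utils.egg")
)
if EGG_PATH not in sys.path:
    sys.path.append(EGG_PATH)

import utils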

Python absolute import not working on Ubuntu server

My project has the following structure:
server/ (root of project)
|
|--- __init__.py
|--- requirements.txt
|--- env/ (virtual environment)
|--- app/ (main app folder)
|    |--- __init__.py (defines an 'app = Flask(__name__)' object)
|    |--- app.py (runs app on local server)
|    |--- models.py
|    |--- views.py
The way I import the different modules in app.py on my local machine is:
# /server/app/app.py
from server.app import app
from server.app.models import *
from server.app.views import *
It works fine on my local machine (using the PyCharm IDE and the Python binary inside the virtual environment folder /server/env/bin/).
However, when I push this to the production server running Ubuntu, where I install all dependencies globally, it keeps throwing the error 'no module named server.app' when I run:
python server/app/app.py
Does anyone know why?
Any IDE usually sets the Python path. E.g., in Eclipse, right-click your project and look at its properties: you will see that your main project is listed in the Python path. This path is used to locate modules.
In your production environment you are not in your IDE, so the normal Python interpreter cannot find your path. Hence you need to specify it yourself.
One way is to add sys.path.append('/path/to/the/project') before you do your imports. (This should be done in the first script that gets executed, in this case app.py; that way you only need to do it once.)
You can also add your path permanently to your production environment. See this post.
As #Rash mentioned, your IDE very probably adds the directory containing /server to your Python path. You can check this in your app.py by adding
import sys
print("\n".join(sys.path))
before your imports.
When you're running your app manually, i.e. python server/app/app.py, the parent of your server directory is obviously not on your Python path, so you have to add it yourself.
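Putting that suggestion into the question's layout, a minimal sketch for the top of server/app/app.py might look like this (directory names taken from the question; the three nested dirname calls climb from app.py to the directory that contains server/):
import os
import sys

# .../server/app/app.py -> .../server/app -> .../server -> parent of server/
PROJECT_PARENT = os.path.dirname(
    os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
)
if PROJECT_PARENT not in sys.path:
    sys.path.append(PROJECT_PARENT)

from server.app import app
from server.app.models import *
from server.app.views import *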
