Python with virtual environment not working with my own package - python

My project looks like this:
src
|--annotation_correction
|--|--suggestion.py
|--|--error.py
|--|--...
venv
|--...
I installed my package using pip install -e . while in the main folder.
When I type pip freeze, my package "annotation" is in the list and VSCode also seems to recognize it as an installed package.
The problem is that when I run suggestion.py, which imports from error.py with from annotation_correction.error import Error, ErrorType, I still get the error:
ModuleNotFoundError: No module named 'annotation_correction.error'; 'annotation_correction' is not a package
All this while using the interpreter that is running in the venv.
My setup.py just calls setup() and my setup.cfg looks like this:
...
packages =
    annotation_correction
...
package_dir =
    =src

To make my comment an answer:
If you have a module in a package that imports other modules from the same package with from package import module or from . import module or similar, then you will need to use the -m switch:
Search sys.path for the named module and execute its contents as the __main__ module.
Otherwise, Python sets up sys.path so that the directory of the script being run is on it, which naturally wreaks havoc with imports, since from there the package itself is "invisible".
This holds whether or not you've installed the package (and indeed you don't necessarily need to pip install -e the same package you're working on).
The other option is to have a standalone script outside the package that acts as the entry point and does e.g. from annotation_correction.suggestion import main. From that script's point of view the package is visible and all is fine; and if the package is installed (-e or no -e), the script doesn't even need to sit next to the package directory, since the installed package is on the search path. But that's unnecessary.
Visual Studio Code apparently supports the same thing with a "module" run configuration.
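To see the difference concretely, the sketch below builds a throwaway package in a temporary directory and runs its module both ways; the package name demopkg and its layout are made up for the illustration:

```python
# Demonstrates why "python -m pkg.mod" works where "python pkg/mod.py" fails.
# The package name (demopkg) and layout are hypothetical.
import os
import subprocess
import sys
import tempfile

tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "demopkg")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "mod.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkg, "main.py"), "w") as f:
    f.write("from demopkg.mod import VALUE\nprint(VALUE)\n")

# Run the file directly: sys.path[0] is the demopkg directory itself,
# so the package "demopkg" is invisible and the import fails.
direct = subprocess.run([sys.executable, os.path.join(pkg, "main.py")],
                        capture_output=True, text=True)

# Run it as a module from the parent directory: sys.path[0] is the cwd,
# so the package is found and the import succeeds.
as_module = subprocess.run([sys.executable, "-m", "demopkg.main"],
                           capture_output=True, text=True, cwd=tmp)

print(direct.returncode)         # non-zero: ModuleNotFoundError
print(as_module.stdout.strip())  # 42
```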

Related

Can I debug a python package after installing it?

I'm interested in a package and wanted to play around with the code: https://github.com/aiqm/torchani
The package itself is not complex and the key modules are included in the torchani folder. I wanted to use the VSCode debugger to do some experiments with the components and track the code. Do I need to run python setup.py install, or should I simply go to the folder and run the modules without installing?
The problem is: there will be a lot of relative import issues if I directly run the code in the parent folder. If I install the package, then the code will probably be compiled and my changes will not be executed.
You can install the package with python setup.py install (or pip install [-e] .); with -e, edits to the source take effect without reinstalling.
For the debugging part you can use the VSCode debugger; just set "justMyCode": false in launch.json.
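A minimal launch.json sketch for that; the keys are the standard VS Code Python debug configuration options, and the configuration name is arbitrary:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug torchani experiment",
      "type": "python",
      "request": "launch",
      "program": "${file}",
      "justMyCode": false
    }
  ]
}
```

With "justMyCode": false, the debugger will step into the installed torchani code rather than skipping over library frames.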

Python Package on GitHub

I made a Python (3) package and have been uploading it to GitHub. I also know how to push, and to install from a git repository using pip. To test whether it works as anticipated, I made a virtual environment on my local computer (Linux) and pip installed my already-pushed private package there without a problem.
The issue is that I don't know how to access it!!! (I know how to activate and use virtualenvs; I don't know how to call my package.) My package has a main interface that one would call in the terminal as follows:
python3 myui.py some_args *.data
and it's supposed to create some files where it's called. In other words, it's not exactly a module like numpy to be imported. I have watched/read many tutorials and much documentation on the web and tbh I'm lost here.
You are looking for the -m flag. If you installed everything correctly, then the following command should allow you to run your script (based on your example). Note that you shouldn't add the file extension '.py'.
python3 -m myui some args *.data
If you have an actual package (directory with __init__.py file and more) instead of a module (a single .py file), then you can add a __main__.py file to that package. Python will execute this script when you use the -m flag with the package's name, in the same way as shown above.
python3 -m mypackage some args *.data
If you want to run a different script that is nested somewhere inside of that package, you can still run it by specifying its module name:
python3 -m mypackage.subpackage.myscript some args *.data
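A minimal __main__.py sketch; the package name and the argument handling are hypothetical:

```python
# mypackage/__main__.py -- executed by "python3 -m mypackage"
import sys

def main(argv):
    # Plain argv handling; a real interface would use argparse.
    print(f"got {len(argv)} argument(s): {argv}")
    return 0

# "python -m mypackage" runs this file with __name__ == "__main__"
main(sys.argv[1:])
```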
Another common way to make your script available uses the setup script (setup.py) or setup configuration file (setup.cfg) that is used to install the module or package. In that case, you can add an entry point to map a command to a specific module/function/etc. (as described in this Python packaging tutorial) so that you can run that command instead of having to use the -m flag with Python.
$ mycommand some args *.data
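For example, with a setup.cfg-based project the mapping might look like this (module and function names are hypothetical):

```ini
[options.entry_points]
console_scripts =
    mycommand = mypackage.myui:main
```

After reinstalling the package, pip generates a mycommand executable on the PATH that calls mypackage.myui.main().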

Failed to call module in python using cloud9 [duplicate]

Using sqlobject. When I import the module I get an "unable to load module" error when running lambda local or remote. The module is installed, and if I open a command-line python3 interpreter and import the module, it imports just fine.
How do I get 3rd party modules installed so they work with both lambda local and lambda remote?
Code could not be simpler:
import sqlobject
Answering my own question...
These are instructions for Python 3.
First, start with an empty environment; mine was called cycles.
Create a new lambda function. Your folder structure will now contain two folders with the same name as your lambda function (a bit confusing; ask AWS, not me).
Right-click on the topmost folder with your lambda function name and select "Open terminal here". This gets you a command line.
No need to use sudo; just install the packages you need into that folder:
python3 -m pip install --target=./ sqlobject
IMPORTANT
You need to install the packages in the top folder that you opened the terminal from. The part of the pip install line that says:
--target=./
makes sure the packages get installed in the right folder, where lambda can use them. If you use the standard pip install:
python3 -m pip install sqlobject
your packages will be installed in the wrong place.
Your folder structure should now show the newly added packages installed next to your function code. With the sqlobject package installed, the code ran fine.
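The reason --target=./ works: Lambda puts the function's root directory on sys.path, so anything vendored next to the handler is importable. A stdlib-only sketch of the same mechanism, with a made-up package name (fakelib) standing in for a vendored dependency:

```python
# Sketch of why "--target=./" works on Lambda: the function's root directory
# is on sys.path, so a package installed next to the handler is importable.
import os
import sys
import tempfile

root = tempfile.mkdtemp()             # stands in for the Lambda function root
pkg = os.path.join(root, "fakelib")   # hypothetical vendored package
os.makedirs(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("NAME = 'fakelib'\n")

sys.path.insert(0, root)              # Lambda does the equivalent for the task root
import fakelib

print(fakelib.NAME)  # fakelib
```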

Python: Export all used modules

I'm using a lot of modules installed from the Internet.
Is it possible to write a script to automatically copy all of these modules into a folder?
I don't know where these modules are, I only write:
import module1
import module2
I simply want module1 and module2 copied into a folder so that my file.py can be used on another PC without installing any software except Python.
pip and virtualenvs. You develop locally in a virtualenv, installing and uninstalling whatever you want. When your code is ready to export, make a list of requirements with pip freeze. Then carry your code to another computer with nothing else except the output of pip freeze in a file called requirements.txt. Do a pip install -r requirements.txt and... magic! Everything is installed in its proper path.
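For reference, the pinned list that pip freeze prints can be approximated from the standard library (Python 3.8+) with importlib.metadata; this is a rough sketch, not a pip replacement:

```python
# Rough, stdlib-only approximation of "pip freeze" output.
from importlib import metadata

pinned = sorted(
    f"{dist.metadata['Name']}=={dist.version}"
    for dist in metadata.distributions()
    if dist.metadata["Name"]  # skip distributions with broken metadata
)
print("\n".join(pinned))
```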
If you are interested in where those modules are, find your Python path and look for a "site-packages" folder (on Windows it is usually "C:\PythonX.X\lib\site-packages" or something like that). But I'm 100% sure you would regret copying modules manually from here to there.

PYTHONPATH vs symbolic link

Yesterday, I edited the bin/activate script of my virtualenv so that it sets the PYTHONPATH environment variable to include a development version of some external package. I had to do this because the setup.py of the package uses distutils and does not support the develop command à la setuptools. Setting PYTHONPATH works fine as far as using the Python interpreter in the terminal is concerned.
However, just now I opened the project settings in PyCharm and discovered that PyCharm is unaware of the external package in question - PyCharm lists neither the external package nor its path. Naturally, that's because PyCharm does not (and cannot reliably) parse or source the bin/activate script. I could manually add the path in the PyCharm project settings, but that means I have to repeat myself (once in bin/activate, and again in the PyCharm project settings). That's not DRY and that's bad.
Creating, in site-packages, a symlink that points to the external package is almost perfect. This way, at least the source editor of PyCharm can find the package and so does the Python interpreter in the terminal. However, somehow PyCharm still does not list the package in the project settings and I'm not sure if it's ok to leave it like that.
So how can I add the external package to my virtualenv/project in such a way that…
I don't have to repeat myself; and…
both the Python interpreter and PyCharm would be aware of it?
Even when a package is not using setuptools, pip monkeypatches setup.py to force it to use setuptools.
Maybe you can remove that PYTHONPATH hack and pip install -e /path/to/package.
One option is to add the path dynamically:
import sys

try:
    import foo
except ImportError:
    sys.path.insert(0, "/path/to/your/package/directory")
    import foo
But it is not the best solution, because that code will very likely not make it into the final version of the application. A more appropriate option (imho) is to write a simple setup.py for the package and deploy it into the virtualenv with the develop command, or with pip's -e flag:
python setup.py develop
or:
pip install -e /path/to/your/package/directory
http://packages.python.org/distribute/setuptools.html#development-mode
This is an improvement on ndpu's answer that will work regardless of where the real file is.
You can dereference the symlink and then set sys.path before importing local imports.
import os.path
import sys

# Ensure this file is dereferenced if it is a symlink
if __name__ == '__main__' and os.path.islink(__file__):
    try:
        sys.path.remove(os.path.dirname(__file__))
    except ValueError:
        pass
    sys.path.insert(0, os.path.dirname(os.path.realpath(__file__)))

# local imports go here
