I'm climbing the learning curve in Python and trying to understand where to put everything.
I originally have a Python module in a folder with a subfolder src. In this src folder I have my main source file, say main.py, and a models folder storing my model code.
/myproject/src/main.py
/myproject/src/models/a-model.py
/myproject/src/models/b-model.py
So my main will import the model like this:
from models.a-model import a
Then when I package the zip file, I just zip the myproject folder with that folder structure, deploy it, and everything is fine.
Now I have another new module doing something different, but it needs to use the same models.
I could easily duplicate them all, code separately, and deploy. But I would like to share the model code, so that when one model changes I only need to update it once instead of in two places.
My new module is like
/mynew/src/main-b.py
/mynew/src/models/a-model.py
/mynew/src/models/b-model.py
What is the best practice for this?
Do I structure it like this?
/myproject/src/main.py
/mynew/src/main-b.py
/models/a-model.py
/models/b-model.py
And then update the import?
But then I'm not sure how to deploy. Do I also have to set up the same folder structure?
One option would be adding /myproject/src/models to the PYTHONPATH environment variable. Python adds the directories listed in PYTHONPATH to sys.path, the list of directories Python searches when you try to import something. This can be problematic because modifying PYTHONPATH has its own side effects; fortunately, virtual environments provide a way to contain those side effects.
Alternatively, and much better, you could add your modules to the site-packages directory. site-packages is added to sys.path by default, which obviates the need to modify PYTHONPATH. To locate the site-packages directory, refer to this page from the Python documentation: Installing Python Modules (Legacy version).
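To see what those search locations actually are on your machine, you can ask Python itself; this is just a quick check, not tied to any particular project:
import site
import sys
print(sys.path)                # directories searched on import
print(site.getsitepackages())  # system site-packages locations (may be unavailable in some virtualenvs)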
You could also use the LiClipse IDE, which comes with PyDev already installed. Create a source folder from the IDE and link your previous project with your newer project. When you link your projects, the IDE adds the source folders of your older project to the PYTHONPATH of your newer project, and thus Python will be able to locate your modules.
Related
I'm coding a bot.
In this bot, deep in the program's directory structure, I have to make an import that needs the absolute path of a package far away in the directory structure, in such a way that I can't make the imports work directly.
I've managed to import it successfully by exporting the PYTHONPATH variable in my local ~/.bashrc file containing the absolute path to my package.
Then I can import things in my program like:
import absolute_path.module
The thing is, when someone else downloads this program files for use, or when I upload it to a server, how is this other party going to manage this absolute importing I made? (Provided the package to be imported is going along with the program files, in the same path where I make the importing).
They haven't set the PYTHONPATH variable, so are they going to have trouble?
It depends. Is the other module something standard (i.e. installable via pip, etc.)? Then you just add it to your project's requirements.txt and the users should be able to figure it out from there.
If it's something you've written, then you can use something like PyInstaller to package all the dependencies of your module (including imports and even the python interpreter) so users don't need to download anything extra.
Another option is to put the other module with your bot module and distribute them together, and use relative paths.
You could also make your bot into an installable package.
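For that last option, a minimal setup.py sketch might look like the following; the project name and the empty dependency list are placeholders, not something from the original post:
from setuptools import setup, find_packages

setup(
    name="mybot",              # placeholder name for the bot package
    version="0.1.0",
    packages=find_packages(),  # picks up the bot package plus any bundled modules
    install_requires=[],       # third-party dependencies go here
)
Once it is installed (for example with pip install .), the package can be imported from anywhere without anyone touching PYTHONPATH.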
I'm trying to set up an environment for a Python project to help out a friend (developing on a different OS).
I want to set environment variables per project in the .env file, but ultimately it fails to recognize PYTHONPATH.
The goal is to set the /api subfolder of my currently opened folder, but it doesn't recognize the relative path. Writing the absolute path there (as in the screenshot below) sort of works: it recognizes the custom library and all the others, but on debugging it fails on other relative paths in the code (or so it appears).
Any advice? Thanks.
One approach is to avoid the whole environment setup to begin with and define your dependencies in source. This works when your repo has all you need.
Consider a project structure like this:
proj/
    api/
        api_source.py
    utils/
        utils_source.py
    tests/
In api_source.py:
import os
import sys

# Append the repository root (one level up from api/) to the import search path,
# so the sibling package can be imported by its path from the repo root.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '../')))
import utils.utils_source as utils_source
It's a bit messy, but I like this approach in that I do not have to set up any environment wherever I clone my repo, at least not for the self-contained dependencies within the repo itself.
Usually I'll do this path appending in the main script and structure all the imports in the dependencies to rely on a fixed path prefix; that way I do this operation once and all the dependent modules run with it.
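As an illustration of that pattern, the one-time append in a hypothetical entry-point script (say proj/api/main.py, a name not in the original answer) could look like this:
import os
import sys

# Add the repo root once, so every module can rely on the utils./api. prefixes.
REPO_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
if REPO_ROOT not in sys.path:
    sys.path.insert(0, REPO_ROOT)

import utils.utils_source as utils_source  # resolvable regardless of the working directory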
I have to deploy a Python function to GCP. The library I want to use (USD by Pixar, specifically) needs to be built by me. To make the library accessible I need to make changes to $PATH and $PYTHONPATH.
So the problem is that I have to include everything in one project to deploy the function, and I don't know where to start.
I have tried appending to the path at run time, but it still gives a "no module" error. Also, I have no idea how I can change the PATH variable to be able to use the executables inside the usd/bin folder.
import sys
sys.path.append('lib/python') # relative path.
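For the PATH part, a runtime tweak through os.environ would look something like this (a sketch only; it assumes the built usd folder is deployed next to this file):
import os

# Prepend the bundled usd/bin folder so its executables can be found by subprocesses.
usd_bin = os.path.abspath(os.path.join(os.path.dirname(__file__), 'usd', 'bin'))
os.environ['PATH'] = usd_bin + os.pathsep + os.environ.get('PATH', '')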
First off, I'm pretty new, so I hope I haven't missed anything too trivial.
Here's a small preface with lots of info:
I'm using Windows & Python 2.7.
I've been using an open-source module named pybrain, which I need to modify quite a bit for my own purposes. So far I've been changing it directly in the Python site-packages folder, but I guess that's a pretty messy way to work, so I decided to try and redo things so as to launch it from a different folder.
I've also decided to start using Aptana (which as far as I can gather is Eclipse-based enough for the same solutions to apply) instead of the messier but simpler "Spyder" I've been using so far.
Pybrain is a pretty layered module with lots of different subfolders, e.g.:
pybrain
--> subfolder1
--> subfolder2
...
So far I've figured these out:
- I've removed the path to the pybrain folder in site-packages from the PYTHONPATH in the Aptana project.
- I've added the path to the new project folder.
This works for some imports, namely, the ones that only reference relative paths inside the subfolders, e.g. I can import from things in subfolder1 if I write a module in the main folder.
However, whenever I try to import things from the other subfolder - I can't use "pybrain" in the hierarchy:
from pybrain.subfolder2 import *
doesn't work in subfolder1.
And here is my question:
How do I configure "pybrain" to be a usable name in the code, just as it was when I had pybrain in the site-packages folder?
I think you have added the wrong path to your source folder...
I.e.:
if you have a structure
/project
/project/pybrain
/project/pybrain/__init__.py
The source folder you set should be '/project' (whereas I think you've set /project/pybrain as the source folder)... If that's not the case, please add more information on your folders and what you set as the source folder...
Aptana probably has some way to configure the list of folders that are considered source folders, as PyCharm and Eclipse/PyDev do.
In any case, you could make your module importable using a .pth file in your site-packages. This file can be named whatever you want (e.g. pybrain.pth) and should contain only one line: the path of the directory that contains your pybrain folder.
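For example, a pybrain.pth dropped into site-packages could contain just that single line (the path below is hypothetical; it should be the directory that contains the pybrain package):
C:\workspace\project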
What would be the best directory structure strategy to share a utilities module across my Python projects? As the common modules will be updated with new functions, I would not want to put them in the Python install directory.
project1/
project2/
sharedUtils/
From project1 I cannot use "import ..\sharedUtils"; is there any other way? I would rather not hardcode the "sharedUtils" location.
Thanks in advance
Directory structure:
project1/foo.py
sharedUtils/bar.py
With the directories as you've shown them, from foo.py inside the project1 directory you can add the relative path to sharedUtils as follows:
import sys
sys.path.append("../sharedUtils")
import bar
This avoids hardcoding a C:/../sharedUtils path, and will work as long as you don't change the directory structure.
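If you also want it to work when the script is launched from a different working directory, a variant is to anchor the path on the file's own location; a small sketch of the same idea:
import os
import sys

# Resolve sharedUtils relative to this file instead of the current working directory.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'sharedUtils')))
import bar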
Make a separate standalone package? And put it in the /site-packages of your python install?
There is also my personal favorite when it comes to development mode: use of symlinks and/or *.pth files.
Suppose you have sharedUtils/utils_foo and sharedUtils/utils_bar.
You could edit your PYTHONPATH to include sharedUtils, then import them in project1 and project2 using
import utils_foo
import utils_bar
etc.
In Linux you could do that by editing ~/.profile with something like this:
PYTHONPATH=/path/to/sharedUtils:/other/paths
export PYTHONPATH
Using the PYTHONPATH environment variable affects the directories that Python searches when looking for modules. Since every user can set their own PYTHONPATH, this solution is good for personal projects.
If you want all users on the machine to be able to import the modules in sharedUtils, you can achieve this with a .pth file. Exactly where you put the .pth file may depend on your Python distribution. See Using .pth files for Python development.