My team uses git to manage our workflow, but I clearly don't have it set up right.
I have my repo set up in my user directory: Repos/project/stuff
However, the actual project and its site-packages are in a different location under the root directory: /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/project/stuff
The specific issue I'm having is: I create a local branch from master and make a change to a sample.py file, which contains dictionaries of hard-coded data (a config file). Then I import sample.py into test.py and print its contents. However, the printed contents reflect neither the local branch where I've made changes nor the master branch. The import is pulling from the site-packages folder I listed above, which I guess is outdated.
How would I get the import to happen from my local branch? Do I need to place my Repos folder in a different directory? Or is this a path issue?
Thanks in advance.
Python is importing from the site-packages folder because that's how Python's import system works: site-packages is added to sys.path.
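A quick way to confirm which copy is being picked up is to print the module's __file__ attribute (here project stands in for your actual package name):
$ python -c "import project; print(project.__file__)"
If that prints a path under site-packages rather than under your Repos directory, you are hitting exactly this problem.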
To make Python import your updated code, you have two options:
Install the updated code after every update: run pip install . or python setup.py install in the git repository.
Install the code once in "develop mode" using pip install -e . or python setup.py develop. After that, Python will always import from the git repo. See https://setuptools.readthedocs.io/en/latest/userguide/development_mode.html
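For example, a minimal sketch of the develop-mode route, assuming your checkout lives at ~/Repos/project and has a setup.py at its top level:
$ cd ~/Repos/project
$ pip install -e .
$ python -c "import project; print(project.__file__)"  # should now point into ~/Repos/project
After this, whichever branch you have checked out is what gets imported, because the "install" is just a link back to your working tree.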
I would like to deploy a cloud function that doesn't rely on a requirements.txt to install packages. I want the packages to be available within storage or zipped up and uploaded as part of the function. Is this possible?
EDIT 6/14/2019
Basically, I would like to ship packages like numpy and pandas with my code when deploying a cloud function. I want to do this in case pypi.org is not available. I have tried following this piece of documentation. An example of what I am trying to do is below:
Folder Structure:
my_folder/
    main.py
    libs/
        numpy (the entire package)
        pandas (the entire package)
        __init__.py
main.py
import libs.numpy as np
import libs.pandas as pd

def function():
    # do stuff with numpy and pandas
    ...
I then tried to deploy the function from the gcloud command line and then from the GCP UI; both failed. If this is possible, please help.
At the moment there are only two options:
Using the requirements.txt
Packaging the dependencies along with your function, link here
They cannot be zipped, nor fetched from storage; they will be treated as part of the function's source.
If you choose to go with the second option, the parameter -t libs might help you.
You can use it to install everything into a libs folder and then just move the contents to the local directory. As a single command it would look like this:
pip install -t libs [your library name(s)] && rm -rf libs/*.dist-info && mv libs/* . && rm -rf libs
I added the rm -rf libs/*.dist-info portion in order not to pollute the source folder with tons of library version and distribution information that is useless to the function. That metadata is used by pip when freezing and planning updates.
EDIT 6/14/2019
You kept the libraries in the libs folder. That is the state just before the mv libs/* . step in the one-liner I added above.
Using a libs folder keeps everything more organized, so if you want to keep the packages there you need to vendor that folder by adding this to the top of your main.py, before all other imports:
# Vendoring packages from the libs folder
import sys
import os

sys.path.insert(1, os.path.join(
    os.path.dirname(os.path.realpath(__file__)),
    "libs"
))
# All other imports go below this line
Explaining:
__file__ is a global variable present in every module that holds the path to the file in which the module is defined, that is, the file where it is being used. In our case, the path to main.py.
Since we cannot be certain of the working directory at the moment main.py is imported, we pass that to os.path.realpath to be certain of the path structure. It could be os.path.abspath too; I have seen and used both and haven't noticed any difference.
From the path of the file, we get the path of the directory of your source code with os.path.dirname and then to the libs folder inside it with os.path.join.
Now the most important part. When you try to import a package, Python looks for it in the directories on the system/Python path (sys.path). So we add the full libs path that we built as the first lookup location on that path after your working directory. New import statements will look in that folder first and, if the package is not there, proceed normally with the rest of the lookup directories.
If you prefer to look for packages in libs only when they are not available in the system and the Python environment, append the libs path instead of inserting it at index 1, as sketched below.
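In code, that fallback variant replaces the sys.path.insert(1, ...) call above with an append (same imports as in the snippet above):
# Only consulted if the package is not found anywhere else on sys.path
sys.path.append(os.path.join(
    os.path.dirname(os.path.realpath(__file__)),
    "libs"
))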
After that you don't need to prepend libs. to your imports; just use the normal import numpy.
The libs. prefix might work for fully independent packages, but not for packages with dependencies, since those expect their dependencies to be directly importable (from anywhere on sys.path).
I want to add a specific library path only to python2. After adding export PYTHONPATH="/path/to/lib/" to my .bashrc, however, running python3 produces the error: Your PYTHONPATH points to a site-packages dir for Python 2.x but you are running Python 3.x!
I think it is because python2 and python3 share the common PYTHONPATH variable.
So, can I set different PYTHONPATH variables for python2 and python3? If not, how can I add a library path exclusively to a particular version of Python?
PYTHONPATH is somewhat of a hack as far as package management is concerned. A "pretty" solution would be to package your library and install it.
This may sound trickier than it is, so let me show you how it works.
Let us assume your "package" has a single file named wow.py and you keep it in /home/user/mylib/wow.py.
Create the file /home/user/mylib/setup.py with the following content:
from setuptools import setup

setup(
    name="WowPackage",
    packages=["."],
)
That's it, now you can "properly install" your package into the Python distribution of your choice without the need to bother about PYTHONPATH. As far as "proper installation" is concerned, you have at least three options:
"Really proper". Will copy your code to your python site-packages directory:
$ python setup.py install
"Development". Will only add a link from the python site-packages to /home/user/mylib. This means that changes to code in your directory will have effect.
$ python setup.py develop
"User". If you do not want to write to the system directories, you can install the package (either "properly" or "in development mode") to /home/user/.local directory, where Python will also find them on its own. For that, just add --user to the command.
$ python setup.py install --user
$ python setup.py develop --user
To remove a package installed in development mode, do
$ python setup.py develop -u
or
$ python setup.py develop -u --user
To remove a package installed "properly", do
$ pip uninstall WowPackage
If your package is more interesting than a single file (e.g. you have subdirectories and such), just list those in the packages parameter of the setup function (you will need to list everything recursively, hence for larger libraries you'd use a helper such as setuptools' find_packages). Once you get the hang of it, make sure to read a more detailed manual as well.
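For instance, a sketch of such a setup.py using that helper (same hypothetical package name as above):
from setuptools import setup, find_packages

setup(
    name="WowPackage",
    packages=find_packages(),
)
find_packages() walks the directory tree and returns every package that contains an __init__.py, so you don't have to enumerate subpackages by hand.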
In the end, go and contribute your package to PyPI -- it is as simple as calling python setup.py sdist register upload (you'll need a PyPI username, though).
You can create a configuration file mymodule.pth under lib/site-packages (on Windows) or lib/pythonX.Y/site-packages (on Unix and Macintosh), then add one line containing the directory to add to python path.
From docs.python2 and docs.python3:
A path configuration file is a file whose name has the form name.pth and exists in one of the four directories mentioned above; its contents are additional items (one per line) to be added to sys.path. Non-existing items are never added to sys.path, and no check is made that the item refers to a directory rather than a file. No item is added to sys.path more than once. Blank lines and lines beginning with # are skipped. Lines starting with import (followed by space or tab) are executed.
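For example, a hypothetical file mymodule.pth in site-packages (the name is arbitrary, only the .pth extension matters) could contain the single line
/home/user/mylib
and that directory would then be added to sys.path at interpreter startup.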
I found that there is no way to modify PYTHONPATH that is only for python2 or only for python3. I had to use a .pth file.
What I had to do was:
make sure directory is created in my home: $HOME/.local/lib/python${MAJOR_VERSION}.${MINOR_VERSION}/site-packages
create a .pth file in that directory
test that your .pth file works (a sketch of these steps follows the list)
done
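A minimal sketch of those steps for Python 2.7, using the /path/to/lib directory from the question (the file name mylib.pth is a hypothetical choice):
$ mkdir -p $HOME/.local/lib/python2.7/site-packages
$ echo "/path/to/lib" > $HOME/.local/lib/python2.7/site-packages/mylib.pth
$ python2.7 -c "import sys; print('/path/to/lib' in sys.path)"  # should print True
Because the .pth file lives in a version-specific site-packages directory, python3 never sees it.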
For more info on .pth file syntax and how these files work, please see: python2 docs and python3 docs.
(.pth files in a nutshell: when your Python interpreter starts, it looks in certain directories, finds any .pth files there, parses them, and adds the listed directories to your sys.path (the same behavior as PYTHONPATH), making any Python modules located in those directories available for normal importing.)
If you don't want to bother with moving/adding files in lib/site-packages, try adding two lines of code to the python2.7 script you would like to run (below).
import sys
# Keep only the Python 2.7 entries on sys.path, dropping any Python 3.x paths
sys.path = [p for p in sys.path if p.startswith(r'C:\Python27')]
This way, sys.path is filtered (ignoring all python3.x packages) every time you run your code.
I've installed pip and virtualenv (with sudo) and I've created my first python app using cookiecutter. I've also run virtualenv my_app followed by source ~/virt/bin/activate.
But when I cd to my app, cd /vagrant and run python setup.py test I see the eggs and packages I need are all downloaded into my app, i.e. the current directory.
I was sort of expecting them to go into ~/virt/
What am I doing wrong?
The path should be ./my_app/bin/activate. With the above you are looking for a virtualenv named virt in your home directory, which is likely not what you want.
Assuming that you created an application called my_app in your home directory you would need to call
source ~/my_app/bin/activate
in order to activate it.
You can find more usage information here.
Used grokproject Sample as in the grok homepage tutorial to simultaneously create a new project and install the grok framework.
cd Sample, then ran bin/paster serve parts/etc/deploy.ini as in the tutorial, which came back with a DistributionNotFound: grokcore.startup error.
traceback # http://pastebin.com/T01J0ndM
An educated guess tells me the grok package was not installed with the grokproject command?
Using Gentoo Linux.
Normally, when you move a project, running
$ python bootstrap.py
$ ./bin/buildout
in the new location should regenerate all local paths in scripts and config files. It will also download and install the eggs needed (like grokcore.startup) if they are not in a commonly shared place like the standard Python site-packages dir or a common eggs-directory (see below).
You can tell buildout to install your eggs in the same location every time by creating a .buildout/ directory in your home dir, and in that directory a file called default.cfg with contents like this:
[buildout]
eggs-directory = /home/myname/.buildout/eggs
which would install all 'local' eggs in the given path.
This error arose because I moved my project from the original install directory. Obviously there must be location-dependent config settings that I can't find.
Creating a new project from scratch in the new directory solved the problem.
I'd like to start developing an existing Python module. It has a source folder and the setup.py script to build and install it. The build script just copies the source files since they're all python scripts.
Currently, I have put the source folder under version control, and whenever I make a change I re-build and re-install. This seems a little slow, and it doesn't sit well with me to "commit" my changes to my Python install each time I make a modification. How can I make my import statement point to my development directory?
Use a virtualenv and use python setup.py develop to link your module to the virtual Python environment. This will make your project's Python packages/modules show up on the sys.path without having to run install.
Example:
% virtualenv ~/virtenv
% . ~/virtenv/bin/activate
(virtenv)% cd ~/myproject
(virtenv)% python setup.py develop
Virtualenv was already mentioned.
And as your files are already under version control you could go one step further and use Pip to install your repo (or a specific branch or tag) into your working environment.
See the docs for Pip's editable option:
-e VCS+REPOS_URL[#REV]#egg=PACKAGE, --editable=VCS+REPOS_URL[#REV]#egg=PACKAGE
Install a package directly from a checkout. Source
will be checked out into src/PACKAGE (lower-case) and
installed in-place (using setup.py develop).
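For example, installing a hypothetical git repository in editable mode could look like this:
$ pip install -e git+https://github.com/you/myproject.git#egg=myproject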
Now you can work on the files that pip automatically checked out for you and when you feel like it, you commit your stuff and push it back to the originating repository.
To get a good, general overview concerning Pip and Virtualenv see this post: http://www.saltycrane.com/blog/2009/05/notes-using-pip-and-virtualenv-django
Install the distribute package, then use developer mode. Just run python setup.py develop --user and that will place path pointers in your user directory pointing to your workspace.
Change the PYTHONPATH to your source directory. A good idea is to work with an IDE like Eclipse that overrides the default PYTHONPATH.