I would like to add the pygame and PIL modules as files in my project. My goal is to avoid having to install them by typing pip install at the console prompt, so I'm looking for a way to physically add them to the project directory. Thank you.
I'm quite sure you don't really want to do this, since both the modules you listed have binary components specific to the Python version and the OS version/architecture. You'd have to ship all of them, for every combination you expect your project to run on.
Just add instructions on how to install them, instead.
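For example (just a sketch, with illustrative version pins; the modern distribution of PIL is Pillow), you could ship a requirements.txt next to your code:

pygame==2.5.2
Pillow==10.3.0

and tell users to run:

pip install -r requirements.txt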
Related
I'm trying to install OpenCV into my Python environment (Windows), and I'm almost all of the way there, but I'm still having some issues with autocomplete and with PyCharm itself importing the library. I've been through countless other related threads, but it seems like most of them are either outdated, for prebuilt versions, or unanswered.
I'm using Anaconda and have several environments, and unfortunately installing it through pip install opencv-contrib-python doesn't include everything I need. So I've built it from source, and the library itself seems to be working fine. The build process installed some things into ./Anaconda3/envs/cv/Lib/site-packages/cv2/: __init__.py, some config .py files, and .../cv2/python-3.8/cv2.cp38-win_amd64.pyd. I'm not sure if it did anything else.
But here's where I'm at:
In a separate environment, a pip install opencv-contrib-python both runs and has autocomplete working
In this environment, OpenCV actually runs just fine, but the autocomplete doesn't work and PyCharm complains about everything, e.g.: Cannot find reference 'imread' in '__init__.py'
Invalidate Caches / Restart doesn't help
Removing and re-adding the environment doesn't help
Deleting the user preferences folder for Pycharm doesn't help
Rebuilding/Installing OpenCV doesn't help
File->Settings->Project->Project Interpreter is set correctly
Run->Edit Configuration->Python Interpreter is set correctly
So my question is: how does Pycharm get or generate that autocomplete information? It looks like the pyd file is just a dll in disguise, and looking through the other environment's site-packages/cv2 folder, I don't see anything interesting. I've read that __init__.py has something to do with it, but again the pip version doesn't contain anything (except there's a from .cv2 import *, but I'm not sure how that factors in). The .whl file you can download is a zip that only contains the same as what 'pip install' gets.
Where does the autocomplete information get stored? Maybe there's some way to copy it from one environment to another? It would get me almost all the way there, which at this point would be good enough I think. Maybe I need to rebuild it with another flag I missed?
Got it finally! Figures that would happen just after posting the question...
Turns out .../envs/cv/site-packages/cv2/python-3.8/cv2.cp38-win_amd64.pyd needed to be copied to .../envs/cv/DLLs/. Then PyCharm did its magic and is now all good.
Alternatively add the directory containing the .pyd file to the interpreter paths.
I had exactly this problem with OpenCV 4.2.0 compiled from sources, installed in my Conda environment and PyCharm 2020.1.
I solved it this way:
Select the project interpreter
Click on the settings button next to it, then click on Show paths for the selected interpreter
Add the directory containing the cv2 library (in my case in the Conda Python library path, e.g. miniconda3/lib/python3.7/site-packages/cv2/python-3.7; in general, check the site-packages/cv2/python-X.X directory)
If I use a module such as tkinter, would somebody need to have that module installed as well in order for my code to run on their machine?
Definitely. You can use virtual environments or containers to deliver the required packages, or have a requirements.txt or similar to install the dependencies.
Python comes with a number of standard modules pre-installed. If the other person is running Python (the same version as you), then they won't need to install anything and it will just work; that's the case for tkinter. But if you use external packages that you installed to run your code, for example celery, then they will need to do the same thing.
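For instance, a quick way to see the difference on the other person's machine (a minimal sketch; celery is just the external-package example from above):

import tkinter           # standard library: ships with CPython, nothing to install

try:
    import celery        # third-party: must be installed first
except ImportError:
    print("celery is missing; install it with: pip install celery")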
If you gave your code to someone to run, they would need to download the same modules, unless you also sent the environment. The only way I know around this is to freeze your code, where you create an executable. I've used cx_Freeze and PyInstaller and haven't had any issues, but it also depends on your needs. You can find some more information through here:
https://docs.python-guide.org/shipping/freezing/
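As a rough sketch (assuming your entry point is a file called main.py; check the PyInstaller docs for the options that suit your project), building a single-file executable looks like:

pyinstaller --onefile main.py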
Hope this helps!
In your running environment do the following, and add the resulting file to your repo:
pip freeze > requirements.txt
When people clone your repo, they only have to do:
pip install -r requirements.txt
and they will install exactly the same PyPI modules you have.
With virtualenv you can isolate a Python environment for each project; with pyenv you can also use different Python versions within the various environments.
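For example, a minimal per-project setup using the standard venv module (Linux/macOS shown; paths are illustrative):

python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt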
In short, my question is, how do I install the latest version of scikit-image into my usr/lib/python3/dist-packages so I can actually use it? I think there is a problem with my understanding of how third-party modules are installed. As a newb, I don’t know how to rectify that, hence this post.
I need help understanding how to install packages in Python 3. Up until now I have used pip/pip3/apt-get/synaptic etc. and it has worked fine for many packages. However, I have hit several barriers (skimage, opencv, plantcv in Python 3). I must emphasise that the problem I am having is using these packages in Python 3, not 2.7.
For example, I want to use the latest version of scikit-image (0.14) with Python 3 (http://scikit-image.org/). I have tried following the installation instructions and have not yet successfully managed to install it. I have navigated to my usr/lib/python3/dist-packages and copied scikit-image into this directory (I have all the dependencies installed in here already).
[Image of my dist-packages folder as proof]
As you can see, the folder containing skimage is in the directory I want it to be installed in, so how do I actually install it? Do I have to extract skimage out of the folder into the directory and then run the install command? If I navigate to usr/lib/python3/dist-packages/scikit-image and then run pip install -e . I get an error stating that I need numpy. If I write a Python script using python3 I can clearly see I have numpy installed (and I have been using it for a long time). So there must be a problem with how I have this package in my file system. I think a janky workaround would be to copy all the modules into my working directory and import them that way, as if they were modules I had made myself, but this obviously negates the whole point of installing packages.
This has also happened with another package called plantcv, where I went into the directory usr/lib/python3/dist-packages, cloned the source from GitHub, and installed it as per the instructions. When I import plantcv in my python3 script, it imports fine. But there is nothing in it, as Python cannot see the modules which are inside this folder at usr/lib/python3/dist-packages/plantcv/plantcv.
There is clearly some comprehension here that I am missing, as I have a similar problem for two packages now. Please, Internet. Help me understand what I am missing!
You simply need to copy the folder to /usr/lib/python3/dist-packages/package-name.
However, there are certain things that are specific to Python packages. The folder named after the package should be a valid package; a good indicator of that is that it contains an "__init__.py" file. It is very likely that every sub-directory inside this package directory will also contain an "__init__.py" file, depending on whether there are modules inside those sub-directories.
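A valid package layout typically looks something like this (the names here are purely illustrative):

package_name/
    __init__.py
    module_a.py
    subpackage/
        __init__.py
        module_b.py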
In your code, simply import the package like the following,
import package_name
where package_name can be, for example, skimage (note that the import name is not always the same as the project name: scikit-image is imported as skimage).
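Once the import works, a quick way to check which copy Python actually picked up (just a sketch):

import skimage
print(skimage.__version__)   # should report 0.14 if the new install is the one being used
print(skimage.__file__)      # shows the directory it was loaded from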
A Python module is just a .py source file. A Python package is simply a collection of modules.
So why do we need programs such as pip to 'install' Python modules? Why not just download the files, put them in our project's folder and import them?
What exactly does it mean to 'install' a module or a package? And what exactly does pip do?
Are things different on Windows and on Linux?
So why do we need programs such as pip to 'install' Python modules? Why not just download the files, put them in our project's folder and import them?
It's just meant to facilitate the installation of software without having to bundle all the dependencies or ask the user to download the files.
You can type pip install mysoftware and that will also install the required dependencies. You can also upgrade a package easily.
What exactly does it mean to 'install' a module or a package? And what exactly does pip do?
It will copy the files into a directory that is on your Python path. This way you will be able to import the package without having to copy the directory into your project.
With your proposal, for each and every project you would have to download the required modules as dependencies. You would have to download them again and again and ship them with your project, which is not very practical, though some platforms like Node.js do it.
What pip does is keep the modules you installed in /usr/lib/python*/site-packages/, so they are included in your Python path. When you try to import a module or package, Python checks whether it exists in site-packages. If it exists, then that code will be used in your project. If not, you will get an error.
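You can see that search path, and where a given package was loaded from, with a couple of lines (numpy is just an example of an already-installed package):

import sys
print(sys.path)         # the directories Python searches for imports, including site-packages

import numpy
print(numpy.__file__)   # e.g. somewhere under .../site-packages/numpy/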
In my application I would like to use:
packageA, which requires packageX==1.3
packageB, which requires packageX==1.4
packageX==1.5
How can I install multiple versions of packageX with pip to handle this situation?
pip won't help you with this.
You can tell it to install a specific version, but it will override the other one. On the other hand, using two virtualenvs will let you install both versions on the same machine, but not use them at the same time.
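For example (a sketch reusing the placeholder names from the question), each environment pins its own version:

python3 -m venv envA && envA/bin/pip install "packageX==1.3"
python3 -m venv envB && envB/bin/pip install "packageX==1.4"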
Your best bet is to install both versions manually, by putting them in your Python path with different names.
But if your two libs expect them to have the same name (and they should), you will have to modify them so they pick up the version they need with some import alias such as:
import dependencyname_version as dependencyname
There is currently no clean way to do this. The best you can hope is for this hack to work.
I'd rather ditch one of the two libs and replace it with an equivalent, or patch it to accept the new version of the dependency and give the patch back to the community.
Download the source for each package and install each one in its own separate folder. For example, I had the version 1.10 package but wanted to switch to the dev version for some work. I downloaded the source for the dev module:
git clone https://github.com/networkx/networkx.git
cd networkx
I created a folder for this version:
mkdir /home/username/opt/python
Then I set the PYTHONPATH env var:
export PYTHONPATH=/home/username/opt/python/lib/python2.7/site-packages/
Next, I installed it using:
python setup.py install --prefix=/home/username/opt/python
Now, since my PYTHONPATH is pointing to this other site-packages folder, when I run python on the command line and import the new module, it works. To switch back, remove the new folder from PYTHONPATH.
>>> import networkx as nx
>>> nx.__version__
'2.0.dev_20151209221101'
An ugly workaround I use with Python in Blender: I'll install (and keep off the path) a like version of Python and use subprocess to have the other version do the needed work. Blender's Python tends to get a little temperamental if you do much more than install pandas and scipy. I've tried this using virtualenvs with Blender, but that tends to break things.
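A minimal sketch of that subprocess approach (the interpreter path and helper script name are made up; adapt them to your setup):

import subprocess
import json

OTHER_PYTHON = r"C:\other_python\python.exe"   # the off-path interpreter

# Run a helper script under the other interpreter and read its JSON output.
result = subprocess.run(
    [OTHER_PYTHON, "do_work.py", "--input", "data.csv"],
    capture_output=True, text=True, check=True,
)
print(json.loads(result.stdout))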
Also, on the off chance you are using Blender for data visualization, you are going to want to add a config folder to your version-number folder. This keeps all of your addons in that folder, which makes the setup far more portable and far less likely to mess up other installs of Blender. Many people who make addons for Blender are not 'programmers', so often those savvy people will do some very hackish things, and this has been the best workaround I've been able to use.
Another workaround (and this has so many flags on the play that it should disqualify me from touching a keyboard) is to manually locate the module's init file and manually add it to globals with importlib... this comes with risks. Some modules will play alright when you do this; other modules will throw fits that lead to extra-special troubleshooting sessions. Keep it to like versions and it does cut down on the issues, and I've had 'alright' luck using this to import modules from behind virtual envs, but there is a reason why I use subprocess calls when working with Blender's Python.
import importlib.util

def importfromfilelocation(x, y, z):
    # x = name to bind in globals (e.g. 'tk'), y = module name (e.g. "tkinter"),
    # z = path to the module file (e.g. r'C:\pyth..., truncated in the original)
    spec = importlib.util.spec_from_file_location(y, z)
    print(spec)
    mod_alis = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod_alis)   # actually run the module's code
    globals()[str(x)] = mod_alis        # expose it under the alias name
Another "workaround" is to use IPC/RPC and run isolated packages in services. If the dependencies are on the different libraries, maybe can separate by the usage of the packages.