Given a .py file on our local system, is there a way to find its dependencies?
By dependencies I mean the import statements that we specify.
I tried pip show [package name], but it does not give the dependencies for a .py file on our local system.
Use pip freeze to get all the dependencies installed in your environment. (This will list every installed package, even ones you have not actually used in the project.)
pip freeze > requirements.txt
If you want to list only the packages the project actually uses, use pipreqs:
pip install pipreqs
then,
pipreqs path/to/project
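For a quick sense of what pipreqs is doing under the hood, here is a minimal standard-library sketch of my own (not pipreqs code) that lists the top-level modules a single .py file imports:
import ast
import sys

def imported_modules(path):
    # Parse the file and collect the top-level names from import / from-import statements.
    with open(path, encoding="utf-8") as f:
        tree = ast.parse(f.read(), filename=path)
    names = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module.split(".")[0])
    return sorted(names)

if __name__ == "__main__":
    print("\n".join(imported_modules(sys.argv[1])))
pipreqs goes a step further and maps those module names to PyPI package names with pinned versions.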
Using Windows
Learning about virtualenv. Here is my understanding of it and a few questions that I have. Please correct me if my understanding is incorrect.
A virtualenv is an environment where your pip dependencies, at their selected versions, are stored for a particular project. A folder is made for the environment, and the dependencies live inside it.
I was told you would not want to save your .py scripts inside of the virtualenv. If that's the case, how do I access the virtualenv when I want to run that project? Do I open a command line, activate it with source ENV/bin/activate, and then cd my way to where my script is stored?
Does running pip freeze create a requirements.txt file in that project folder that is just a text copy of the dependencies of that virtualenv?
If I'm in a second virtualenv, how do I import another virtualenv's requirements? I've been to the documentation but I still don't get it.
$ env1/bin/pip freeze > requirements.txt
$ env2/bin/pip install -r requirements.txt
I guess I'm confused by the "requirements" description. Isn't it best practice to always name the requirements file requirements.txt? If that's the case, how does env2 know I want env1's requirements?
Thank you for any info or suggestions. Really appreciate the assistance.
I created a virtualenv:
C:\Users\admin\Documents\Enviorments>virtualenv django_1
Using base prefix 'c:\\users\\admin\\appdata\\local\\programs\\python\\python37-32'
New python executable in C:\Users\admin\Documents\Enviorments\django_1\Scripts\python.exe
Installing setuptools, pip, wheel...done.
How do I activate it? source django_1/bin/activate doesn't work.
I've tried source C:\Users\admin\Documents\Enviorments\django_1/bin/activate and every time I get: 'source' is not recognized as an internal or external command, operable program or batch file.
*Disclaimer*: I mainly use conda environments instead of virtualenv, but I believe most of this is the same across both and applies to your case.
You should be able to access your scripts from any environment you are in. If you have virtenvA and virtenvB, you can run your script from inside either of them: activate one and then run python /path/to/my/script.py, making sure any dependent libraries are installed in that environment.
Correct, but for clarity: the requirements file contains only a list of the dependencies by name and version. It doesn't contain any actual code or packages. If you open a requirements file you will just see one package per line with its pinned version, e.g. pandas==1.0.1, numpy==1.0.1, scipy==1.0.1, and so on.
In the lines of code you have here, you export the dependency list of env1 and then install those dependencies into env2. If env2 was empty, it will now simply be a copy of env1's packages; otherwise it keeps its own packages, gains all of env1's, and any package present in both is changed to env1's pinned version.
virtualenv simply creates a new Python environment for your project. Think of it as another copy of Python on your system. A virtual environment is helpful for development, especially if you need different versions of the same libraries for different projects.
The answer to your first question is yes: for each project that uses a virtualenv, you need to activate it first. After activating, any Python script you run, not just your project's scripts, will use the dependencies and configuration of the active Python environment.
On the second question: pip freeze > requirements.txt creates the requirements file in your current working directory, not necessarily in your project folder. So if in your cmd/terminal you are in C:\Desktop, the requirements file will be created there; if you are in C:\Desktop\myproject, it will be created there. The requirements file will contain the packages installed in the active virtualenv.
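For example, following the paths above:
C:\Desktop> cd myproject
C:\Desktop\myproject> pip freeze > requirements.txt
The file is written to C:\Desktop\myproject\requirements.txt, wherever your project's source files happen to live.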
The answer to the third question is related to the second: you simply need to give the full path to the requirements file you want. So if you want env2 to install the packages listed by the first environment, you run something like env2/bin/pip install -r /path/to/my/first/requirements.txt. If your terminal's current folder does not contain a requirements.txt file, running pip install -r requirements.txt will give you an error. In other words, the command doesn't guess which requirements file you want; you specify it.
Yes, keeping the virtualenv separate from your project files is one common convention; virtualenvwrapper and pipenv work that way. But personally, when I use virtualenv in its simplest form, I just create a directory with the project's name inside the virtualenv's directory (next to bin/) and keep the project files there.
pip freeze prints to the console the packages (and their versions) you've installed inside your virtualenv using pip. If you want to save those requirements to a file, do something like pip freeze > requirements.txt
There are a few possibilities:
you can activate one virtualenv, then go (cd /path/to/venv2) to another virtualenv.
you can copy your requirements.txt file from one virtualenv and install those requirements in your second virtualenv (see the sketch below)
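A minimal sketch of that second possibility on Windows, since that is what the question uses; env1, env2 and the requirements path below are placeholders. Note that on Windows the activation script lives under Scripts\ rather than bin/, and you run it directly rather than through source, which is why cmd reports that 'source' is not recognized:
:: cmd.exe -- export the package list from the first environment
C:\Users\admin\Documents\Enviorments\env1\Scripts\activate
pip freeze > C:\Users\admin\Documents\requirements.txt
deactivate
:: install that list into the second environment
C:\Users\admin\Documents\Enviorments\env2\Scripts\activate
pip install -r C:\Users\admin\Documents\requirements.txt
deactivate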
I have a Python script that uses additional libraries. I want to write an additional script for downloading these needed packages, which the user will run before running the actual script. I am using a Linux environment, and all packages are installed with pip (or sudo) from the command line. What is the proper way to provide that kind of script? Is setup.py intended for that purpose?
There is a standard pip way to do that: use a requirements.txt file.
# requirements.txt
numpy==1.5.1
scipy==0.9.0
Then run:
pip install -r requirements.txt
You can use pip freeze to get the currently installed packages.
pip freeze | grep numpy
# numpy==1.5.1
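Since the question asks for an additional script the user runs before the actual one, the requirements file can be wrapped in a tiny helper; the file name install.sh here is just an illustration:
#!/bin/sh
# install.sh -- install the main script's dependencies; run once before the main script
pip install -r requirements.txt
For what it's worth, setup.py (via its install_requires list) is the packaging-oriented way to declare dependencies, but for a single script that users run directly, a requirements.txt like the one above is usually enough.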
I plan on sharing a Python program on GitHub.
However, it uses additional libraries like http, Selenium, BeautifulSoup, and the Google Calendar API.
How do I include these libraries in the directory I push to GitHub, so that all users have to do is run python script.py, instead of having to install the libraries?
I thought of generating an executable with pyinstaller but that didn't work :/
You use a pip requirements.txt file for this.
If you have done your work inside of a virtual environment then run on your command line/terminal:
pip freeze > requirements.txt
Then commit and push the file to your GitHub repository.
If you have not done your script within a virtual environment then run:
pip freeze > requirements.txt
And edit the file so you only have the modules you require.
I'd recommend always using a virtual environment for this, since it makes your application easy to share. In the Django world, using virtualenvs is very common.
Your collaborators can install your dependencies using:
pip install -r requirements.txt
after cloning your GitHub repo.
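Putting the whole flow together for your collaborators (the repository URL and folder names are placeholders):
git clone https://github.com/<your-user>/<your-repo>.git
cd <your-repo>
python -m venv venv                  # or: virtualenv venv
source venv/bin/activate             # on Windows: venv\Scripts\activate
pip install -r requirements.txt      # install the pinned dependencies
python script.py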
Usually you don't need to embed your dependencies in your project (that is not practical, especially when there are many). Instead, include a requirements.txt inside your project that lists the modules (and version numbers) required by your application. Then when a user needs to use your script, they can run something like this:
pip install -r requirements.txt
Read more about requirements files here:
https://pip.readthedocs.org/en/1.1/requirements.html#requirements-files
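Tied to the libraries from the question, such a file might look like the following; the package names are the usual PyPI names, the pinned versions are only placeholders, and http needs no entry because it is part of the standard library:
# requirements.txt
selenium==4.21.0
beautifulsoup4==4.12.3
google-api-python-client==2.130.0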
I have a requirements.txt file with several dependencies listed.
Whenever I try pip install -r requirements.txt on a brand new system, this will usually fail when some dependency is not met (see for example: here and here). This happens especially with the matplotlib package.
After downloading the entire package (in the case of matplotlib it's about 50 MB) and failing to install it, I go and fix the issue and then attempt to install the package again.
pip does not seem to be smart enough to realize it just downloaded that package and automatically re-use the same file (perhaps because it keeps no copy by default?), so the package is downloaded in its entirety again.
To get around this issue I can follow the instructions given here and use:
pip install --download=/path/to/packages -r requirements.txt
to first download all packages and then:
pip install --no-index --find-links=/path/to/packages -r requirements.txt
to install all the packages using the locally stored files.
My question: is there a smart command that includes both these directives? I'm after a single line I can run repeatedly so that it will use stored copies of the packages if they exist or download them if they don't, and in this last case, store those copies to some location so they can be re-used later on if needed.
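As far as I know there is no single built-in pip option that does both, but the two steps can be chained into one reusable line. Note that in current pip versions the --download flag has been replaced by the pip download subcommand, so a sketch of the combined command would be:
pip download -d /path/to/packages -r requirements.txt && \
pip install --no-index --find-links=/path/to/packages -r requirements.txt
The first half fetches into /path/to/packages (files already present there are not re-downloaded), and the second installs strictly from that directory without touching the network index.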