Given a .py file on our local system, is there a way to find its dependencies?
By dependencies I mean the import statements that we specify.
I tried pip show [package name], but it does not give the dependencies for a .py file on our local system.
Use pip freeze to get all the dependencies installed in your environment. (This will list every installed package, even ones you have not used in the project.)
pip freeze > requirements.txt
If you want to list only the ones actually used, use pipreqs:
pip install pipreqs
then,
pipreqs path/to/project
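If you just want the raw import list for a single file without installing anything, the standard-library ast module can extract it. A minimal sketch (note it reports imported module names, which do not always match PyPI distribution names, e.g. sklearn vs scikit-learn):

```python
import ast

def list_imports(path):
    """Return the top-level module names imported by a .py file."""
    with open(path) as f:
        tree = ast.parse(f.read(), filename=path)
    modules = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            modules.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            # level > 0 means a relative import from within the project itself
            modules.add(node.module.split(".")[0])
    return sorted(modules)
```

For example, a file containing import numpy as np and from pandas.io import json would yield ['numpy', 'pandas'].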
I have a requirements.txt file that contains both normal package names (to install from pypi) and paths to local tar.gz packages within the repo, e.g.
flask
pandas
local_dir/local_pkg.tar.gz
The problem is, there are two different deployment pipelines used for this repo, which both need to work.
The first will only run pip install -r requirements.txt (I cannot modify this command or add any additional options), but it always runs it from the base repo path. So currently, this runs successfully without issue.
The second is the problem. It runs from a different location entirely, and installs the packages via pip install -r /path/to/repo/requirements.txt. The trouble is, pip install doesn't automatically look in /path/to/repo/ for the listed local package path (local_dir/local_pkg.tar.gz); it instead looks for that local package path in the location where the command is being run. It obviously can't find the local package there, and so throws an error.
With this second deployment pipeline, I can add additional options to pip install. However, I've tried out some of the listed options and cannot find anything that resolves my issue.
tl;dr:
How can I modify the pip install -r /path/to/repo/requirements.txt command, so that it looks for local packages as if it's running from /path/to/repo/ (regardless of where the command is actually being run from)?
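Not part of the original thread, but one possible workaround, assuming the second pipeline can invoke pip through a small wrapper: run pip with its working directory set to the repo root, so the relative local-package path resolves there. A sketch:

```python
import subprocess
import sys

def install_requirements(repo_path):
    """Run pip with its working directory at repo_path, so relative entries
    like local_dir/local_pkg.tar.gz in requirements.txt resolve against the
    repo regardless of where this script itself is launched from."""
    subprocess.run(
        [sys.executable, "-m", "pip", "install", "-r", "requirements.txt"],
        cwd=repo_path,
        check=True,
    )

# Usage (path from the question):
# install_requirements("/path/to/repo")
```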
I have created a program that uses some external libraries such as matplotlib, NumPy, and pandas. I want to share my program with other users, but those users would need to install the libraries, and I want to avoid that. I have read that an executable file can be created; however, I also want the other users to be able to edit the program. How could I do this?
If you share the program as an executable file, then it is true that the users won't have to install the libraries. But once the code is converted it cannot be edited; each time you edit the code you will need to convert it again to update the version. If you want to share editable code, then the best option is to share a requirements.txt file along with the program.
You can generate a requirements.txt file that lists all your current dependencies. Use the pipreqs module to list only the libraries actually used in your program.
First, install pipreqs (without the $):
$ pip install pipreqs
Then go to your program directory and in the terminal/PowerShell/command prompt type (without the $):
$ pipreqs .
It will generate requirements.txt in the program directory. Zip the entire folder and share it. Users then need to install from this requirements.txt file, which they can do with the following command (without the $):
$ pip install -r requirements.txt
What you need is to add a requirements.txt file. This is a file where one specifies the dependencies of a project. For example, your program can depend on a certain NumPy version; by adding it to the requirements.txt file, another user who needs to install the dependencies for the project can easily automate that.
If you are using a virtual environment, you can simply generate the file with pip freeze > requirements.txt. Otherwise you might need to add the used libraries to the file yourself.
To install the dependencies in their own environment, other people then only need to execute the following pip command instead of installing every single module:
$ pip install -r requirements.txt
As mentioned in the comments, using the requirements file is the way to go and a common standard. You can create the requirements file using the following pip command:
$ cd <root directory>
$ pip freeze > requirements.txt
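The generated file is plain text, one pinned requirement per line. For the libraries mentioned in the question, it might look like this (version numbers are illustrative):

```
matplotlib==3.8.0
numpy==1.26.4
pandas==2.1.4
```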
I'm using Windows and learning about virtualenv. Here is my understanding of it and a few questions that I have. Please correct me if my understanding is incorrect.
virtualenvs are environments where your pip dependencies, at their selected versions, are stored for a particular project. A folder is made for your project, and inside it are the dependencies.
I was told you would not want to save your .py scripts inside of the virtual env. If that's the case, how do I access the virtual env when I want to run that project? Open the command line, activate it with source ENV/bin/activate, then cd my way to where my script is stored?
Running pip freeze creates a requirements.txt file in that project folder that is just a .txt copy of the dependencies of that virtual env?
If I'm in a second virtualenv, how do I import another virtualenv's requirements? I've been to the documentation but I still don't get it.
$ env1/bin/pip freeze > requirements.txt
$ env2/bin/pip install -r requirements.txt
Guess I'm confused by the "requirements" description. Isn't it best practice to always call our requirements file requirements.txt? If that's the case, how does env2 know I want env1's requirements?
Thank you for any info or suggestions. Really appreciate the assistance.
I created a virtualenv:
C:\Users\admin\Documents\Enviorments>virtualenv django_1
Using base prefix 'c:\\users\\admin\\appdata\\local\\programs\\python\\python37-32'
New python executable in C:\Users\admin\Documents\Enviorments\django_1\Scripts\python.exe
Installing setuptools, pip, wheel...done.
How do I activate it? source django_1/bin/activate doesn't work?
I've tried source C:\Users\admin\Documents\Enviorments\django_1/bin/activate. Every time I get: 'source' is not recognized as an internal or external command, operable program or batch file.
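A side note that may help here: source is a bash builtin, so cmd.exe will never recognize it, and on Windows virtualenv creates a Scripts\ directory rather than bin/. This sketch (paths taken from the question) prints where the activation scripts actually live:

```python
from pathlib import Path

env = Path(r"C:\Users\admin\Documents\Enviorments\django_1")

# On Windows the activation scripts live under Scripts\, not bin/:
print(env / "Scripts" / "activate.bat")   # run this path directly in cmd.exe
print(env / "Scripts" / "Activate.ps1")   # dot-source this one in PowerShell
# The bin/activate layout (used with "source") only exists on POSIX systems.
```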
*disclaimer* I mainly use conda environments instead of virtualenv, but I believe that most of this is the same across both of them and applies to your case.
You should be able to access your scripts from any environment you are in. If you have virtenvA and virtenvB then you can access your script from inside either of your environments. All you would do is activate one of them and then run python /path/to/my/script.py, but you need to make sure any dependent libraries are installed.
Correct, but for clarity: the requirements file contains only a list of the dependencies by name and version. It doesn't contain any actual code or packages. If you print out a requirements file, it is just a list of package names with their version numbers, like:
pandas==1.0.1
numpy==1.0.1
scipy==1.0.1
In the lines of code you have here, you would export the dependency list of env1 and then install those dependencies into env2. If env2 was empty, it will now just be a copy of env1; otherwise it will be the same but with all the packages of env1 added, and if it had a different version of any of the same packages, that version will be overwritten.
virtualenv simply creates a new Python environment for your project. Think of it as another copy of Python that you have on your system. A virtual environment is helpful for development, especially if you will need different versions of the same libraries.
Answer to your first question: yes, for each project where you use virtualenv, you need to activate it first. After activating, any Python script you run (not just your project's scripts) will use the dependencies and configuration of the active Python environment.
Answer to the second question: pip freeze > requirements.txt will create the requirements file in the current working folder, not necessarily in your project folder. So, let's say in your cmd/terminal you are in C:\Desktop; then the requirements file will be created there. If you're in the C:\Desktop\myproject folder, the file will be created there instead. The requirements file will contain the packages installed in the active virtualenv.
The answer to the 3rd question is related to the second. Simply put, you need to write the full path of the requirements file you want. So if you are in the first project and want to install packages from the second virtualenv's requirements, you run it like env2/bin/pip install -r /path/to/my/first/requirements.txt. If your terminal is in a folder that does not have a requirements.txt file, then running pip install -r requirements.txt will give you an error. The command does not guess which requirements file you want to use; you specify it.
Yes, keeping the virtualenv separate from your project files is one common convention; virtualenvwrapper and pipenv work like that. But personally, if I use virtualenv in its simplest form, I just create a directory with the same name inside the virtualenv's directory (next to bin/) and keep the project files there.
pip freeze prints to the console the packages (and their versions) that you've installed inside your virtualenv using pip. If you want to save those requirements to a file, you should do something like pip freeze > requirements.txt.
There are a few possibilities:
you can activate one virtualenv, then go (cd /path/to/venv2) to another virtualenv.
you can copy your requirements.txt file from one virtualenv and install those requirements in your second virtualenv
I'm struggling to understand the pipreqs behaviour.
In my project, I use a virtualenv (pipreqs is installed globally). I have a requirements.txt file that I first wrote manually; then I did a pip freeze, and now I have way more packages than I want (because there are dependencies of dependencies...).
So I found pipreqs to get a requirements file with only the packages that matter.
I activate my virtualenv, then run pipreqs --savepath requirements2.txt --use-local.
I use --use-local because I want the currently installed versions of my packages.
And I end up with a file with only one package (not even the version used in my project)...
Without --use-local it seems to retrieve the correct number of packages but the versions are the "up-to-date" ones, not the ones in my project.
Am I doing something wrong?
Thanks for the help
I am using pip freeze > requirements.txt and noticed some unfamiliar libraries that were added to the requirements file. Does pip freeze only capture the libraries and dependencies that are specific to that directory or from the entire system?
As you have noticed, pip freeze doesn't capture the libraries specific to a directory, but rather all the packages installed in the current environment (most likely the packages installed on your system or, if you are in a virtual environment without global access, those from that virtual environment).
You can try pip freeze from another directory and see that you get the same results.
If you want to obtain the list of dependency packages for a specific project, you might be interested in the pipreqs package, which does precisely that.
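If you are unsure which environment your pip freeze is actually reporting on, a quick standard-library check (a sketch) is to compare sys.prefix with sys.base_prefix:

```python
import sys

# Inside a virtual environment, sys.prefix points at the environment's
# directory, while sys.base_prefix points at the interpreter it was
# created from; outside a virtual environment, the two are equal.
in_venv = sys.prefix != sys.base_prefix
print(("virtual environment:" if in_venv else "system interpreter:"), sys.prefix)
```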