About the Pipfile that pipenv generates: when I run pipenv shell in a specific folder, the virtual environment works just fine, and everything I install in there works fine as well, but the Pipfile doesn't seem to be updating with the packages I'm installing. When I check the dependency tree with pipenv graph, it shows all the dependencies I've been using. Is there something I'm missing with how pipenv works?
Note:
Whenever I want to create a new env I follow these steps:
mkdir app
cd app
pipenv shell
pip install <somepackage>
touch main.py # (add my code)
python main.py
You have to install packages using the command pipenv install [package] in order for pipenv to create/update the Pipfile and Pipfile.lock files.
As you already installed the dependencies with pip, you can run pipenv run pip freeze > requirements.txt && pipenv install -r requirements.txt and it will create or update the aforementioned files for you. It is best, though, to declare each package you actually want, because this method also writes each package's transitive dependencies into those files.
Read more here
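For reference, after something like pipenv install requests (requests is just an example package here), the generated Pipfile is a small TOML file that looks roughly like this; the exact contents depend on your pipenv and Python versions:
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"  # example entry; one line is added per pipenv install <package>

[dev-packages]

[requires]
python_version = "3.9"  # whichever interpreter pipenv picked up
Every pipenv install <package> adds an entry under [packages]; plain pip install never touches this file, which is why your Pipfile stayed empty.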
I know about requirements.txt, but that only includes a list of dependencies.
But what about the other meta information like the package name, author, main function, etc.?
Also, I know about setup.py, but since I want to programmatically access the values inside, I need a configuration file standard like YAML/JSON and not Python code.
Did the Python community come up with something truly comparable to package.json?
1. Without 3rd Party Packages
pip freeze > requirements.txt
on the local machine. Then, on the server,
pip install -r requirements.txt
This installs all the dependencies.
2. With a 3rd Party Package
pipenv
I would use pipenv instead of pip. pipenv automatically generates Pipfile and Pipfile.lock files, which are far superior to requirements.txt.
Install pipenv and set it up for your project:
pip install --user pipenv
cd yourproject
pipenv install package1 package2 ...
Installing packages from an existing Pipfile is as simple as
pipenv install
Read more: https://pipenv.pypa.io/en/latest/
poetry
I have recently moved from pipenv to poetry because poetry has everything pipenv offers and much more. It is end-to-end, as it includes building and publishing your project to PyPI.
Installing poetry:
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
and add .poetry/bin to your PATH.
poetry new yourproject
cd yourproject
poetry add packagename
Like pipenv, this generates a pyproject.toml file that contains all your requirements. And like pipenv, installing your dependencies is as simple as
poetry install
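For illustration, the pyproject.toml that poetry new plus poetry add produce looks roughly like this (the name, author, and version constraints below are placeholders, and the exact layout varies between poetry versions):
[tool.poetry]
name = "yourproject"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]  # placeholder author

[tool.poetry.dependencies]
python = "^3.9"        # placeholder interpreter constraint
packagename = "^1.0"   # written by poetry add packagename; version is illustrative

[tool.poetry.dev-dependencies]

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
Note that it holds the project metadata (name, version, authors) next to the dependencies, much like package.json does.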
See more: https://poetry.eustace.io/docs/
See Python packaging war: Pipenv vs. Poetry for a short review of these awesome packages.
As the comments on my question pointed out, it's pyproject.toml together with the poetry package management tool.
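If you want to read those values programmatically, here is a minimal sketch, assuming Python 3.11+ (where tomllib is in the standard library; on older interpreters the third-party tomli package provides the same API) and a poetry-style [tool.poetry] table:
import tomllib  # standard library since Python 3.11

# pyproject.toml is plain TOML, so no Python code has to be executed to read it.
# tomllib requires the file to be opened in binary mode.
with open("pyproject.toml", "rb") as f:
    data = tomllib.load(f)

# Assumes the poetry layout shown above; adjust the keys for other build tools.
meta = data["tool"]["poetry"]
print(meta["name"], meta["version"])
print(meta.get("authors", []))
print(meta.get("dependencies", {}))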
I have installed my needed packages (dependencies) in the global system environment instead of my virtual environment (virtualenv), because I used the command pip install <package-name> outside of the virtual environment.
So I want to know: how can I make a list of them and install them in any of my virtualenvs?
This is useful in case you have, by mistake, installed some packages (dependencies) in your global system environment instead of your virtualenv.
For example, by using the command "pip install" instead of "pipenv install" (outside of the virtual environment).
So the solution is:
In the global system environment (outside of any virtualenv), create a "requirements.txt" file from all of your installed packages:
$ pip freeze > requirements.txt
Import the installed dependencies from the created "requirements.txt" file into the Pipfile by running the command below in the directory where that "requirements.txt" file exists; but first check:
a) If the Pipfile does not exist:
$ pipenv install
b) If the Pipfile does exist (i.e. the virtualenv was already created):
$ pipenv install -r requirements.txt
Then your package listing files, Pipfile and Pipfile.lock, will be updated and locked.
But to avoid this problem in the first place, I personally recommend always using the command
$ pipenv install
instead of $ pip install.
You can also create a requirements-style file in your global system environment that simply lists the packages you need, one per line, like the following:
my_packages.txt
Flask-SQLAlchemy
flask
Then you can install everything from it in one go (or use pipenv install -r my_packages.txt if you also want the Pipfile updated):
$ pip install -r my_packages.txt
I have Anaconda3 installed on my Windows 10 computer. I want to install the pysystemtrade package from GitHub. This is the instructions from the author
"This package isn't hosted on pip. So to get the code the easiest way is to use git:
git clone https://github.com/robcarver17/pysystemtrade.git
python3 setup.py develop"
The question is, where do I clone the project to and where do I run setup.py to get it installed in the correct place in Anaconda3 so I can include the files in my python project?
Thanks,
Dana
Create a conda environment with the Python version you need and switch to that environment, so dependencies are tracked per environment. Then clone the Git repo. Afterwards, install the requirements.txt for that GitHub project and run the command the author gave you to add the package to your conda environment.
conda create -n trade-test python=3.8
conda activate trade-test
git clone https://github.com/robcarver17/pysystemtrade.git
cd pysystemtrade/
pip3 install -r requirements.txt
python3 setup.py develop
I just created a pipenv environment using pipenv --python 3.9. I then did pipenv shell and started installing packages with pip install. It turns out this doesn't seem to be the usual way of doing things with pipenv. Is there any command I can run to update the Pipfile with all the packages I installed with pip install? I searched but couldn't find anything.
When you have multiple packages you'd like to install, you usually have what's called a requirements.txt file that lists all the packages you'd like to use for your project.
You can run
$ pipenv run pip freeze > requirements.txt
to generate the requirements file in the current directory, while the virtual environment is active.
Initially you're going to have to install all your packages manually, but afterwards you can run
$ pipenv install -r path/to/requirements.txt
to import all the packages from requirements.txt into the virtual environment and the Pipfile.
Instead of running pipenv shell and then pip install <package>, you should simply run pipenv install <package> (outside the virtual environment, from the same location you ran pipenv --python 3.9).
This will install the package in your virtual environment and will automatically update your Pipfile and Pipfile.lock files.
You may skip the Pipfile.lock update by using the --skip-lock flag: pipenv install --skip-lock <package>
You can use pipreqs, which generates a requirements.txt file based on your project's imports.
pip install pipreqs
pipreqs
pipenv install -r requirements.txt
I installed pip by downloading virtualenv, and creating a bootstrap virtualenv, as described in this answer.
Now I want to try out pipenv, so I used my bootstrap virtualenv to create a new virtualenv and then ran pip install pipenv. Now I can use pipenv, but it sees that it's already running in a virtualenv and doesn't create a new one.
How can I get pipenv to create a new virtualenv so I can have separate virtualenvs for each project? I tried pipenv install -h, but none of the options look promising.
The current documentation makes it sound like you can set the environment variable PIPENV_IGNORE_VIRTUALENVS to avoid reusing an already activated virtualenv:
source ~/some/virtualenv/location/bin/activate
PIPENV_IGNORE_VIRTUALENVS=1 pipenv install
I have to admit that I haven't tried this, though.
If you're in a new project directory, these commands create a new virtualenv using pipenv:
Create a new virtualenv with python 2:
pipenv --two
Create a new virtualenv with python 3:
pipenv --three
Create a new virtualenv with an arbitrary python version:
pipenv --python 3.6.4
It looks like pipenv has gotten smarter about this situation. Here's what worked for me. First, I installed a bootstrap environment following virtualenv's installation documentation to use it locally from source. That way, I don't need to touch the system Python, and I can install pipenv in the bootstrap environment:
$ curl --location --output virtualenv.tar.gz https://github.com/pypa/virtualenv/tarball/16.1.0
$ tar -xzf virtualenv.tar.gz
$ python pypa-virtualenv-4ad2742/src/virtualenv.py vbootstrap
$ rm -r virtualenv.tar.gz pypa-virtualenv-4ad2742/
$ vbootstrap/bin/pip install pipenv
Then I created a new project folder, and used pipenv to install numpy:
$ mkdir my_project
$ cd my_project
$ ../vbootstrap/bin/pipenv install numpy
Creating a virtualenv for this project...
Pipfile: /home/vagrant/my_project/Pipfile
Using /home/vagrant/vbootstrap/bin/python (2.7.15rc1) to create virtualenv...
✔ Complete
Already using interpreter /home/vagrant/vbootstrap/bin/python
Using real prefix '/usr'
New python executable in /home/vagrant/.local/share/virtualenvs/my_project-KmT425B_/bin/python
Installing setuptools, pip, wheel...
done.
Virtualenv location: /home/vagrant/.local/share/virtualenvs/my_project-KmT425B_
Creating a Pipfile for this project...
Installing numpy...
Adding numpy to Pipfile's [packages]...
✔ Installation Succeeded
Pipfile.lock not found, creating...
Locking [dev-packages] dependencies...
Locking [packages] dependencies...
✔ Success!
Updated Pipfile.lock (57a39c)!
Installing dependencies from Pipfile.lock (57a39c)...
🐍 ▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉ 1/1 — 00:00:00
To activate this project's virtualenv, run pipenv shell.
Alternatively, run a command inside the virtualenv with pipenv run.
To make it easier to use, I created a symbolic link:
$ ln -s ~/vbootstrap/bin/pipenv ~/pipenv
$ ~/pipenv shell
Launching subshell in virtual environment...
vagrant@vagrant:~/my_project$ . /home/vagrant/.local/share/virtualenvs/my_project-KmT425B_/bin/activate
(my_project) $