I have just started learning Django, and I'm running into some problems with 'copying' my projects.
I have two computers and I would like to use both of them for development. When I was learning PHP (at that time I didn't even know how to use GitHub), all I had to do was set up a webserver on both computers, upload all the files to Google Drive from one computer, and then download them on the other.
However, it seems to me that Django is somewhat different, since it is a framework and needs a fair amount of setup before starting a project (including a virtual environment; I am following a YouTube tutorial and it says I would be better off using virtualenv). I figured it wouldn't work just to download the whole project folder onto the other computer.
Currently, I have uploaded the whole virtual environment folder to GitHub.
So, to list my questions:
When downloading the project on the other computer, should I set up the virtual environment on that computer first and then download the folder?
Is there any way to automatically sync or commit only the files that have been changed in the project?
(That is, I have to change many files in a Django project: views, urls, settings, and so on. It would be hard to remember every file I have changed and then commit each one separately.)
Any help would be appreciated.
Edit: Consider using pipenv
I suggest that you also install virtualenvwrapper (here). virtualenvwrapper keeps the environment's files at a separate location, outside your project, so your project directory contains only your own files and you can safely use git add --all.
After it's installed, do:
$ mkdir my-project; cd my-project
$ mkvirtualenv my-env-name
$ pip install django <more-good-stuff>
$ pip freeze > requirements.txt
$ git init; git add --all; git commit -m "Initial Commit"
... push to github ...
Now go to the other machine, install virtualenv and virtualenvwrapper, and then:
$ git clone <url> my-project; cd my-project
$ mkvirtualenv my-env-name
$ pip install -r requirements.txt
... continue your work, commit and push, and win at life :D
You usually don't want to commit everything blindly. It's better to use git status to see all the files that you changed, and then git add the ones that you want to commit. Once you've verified that you really do want to commit all the files, you can simply git add --all (more here). And then you git commit.
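For example, a typical cycle might look like this (the file names are just placeholders):
$ git status                               # see which files changed
$ git add my_app/views.py my_app/urls.py   # stage only the files you want
$ git add -u                               # or: stage every already-tracked file that changed
$ git commit -m "Describe what changed"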
And yes, virtualenv is the way to go. You want to create a virtualenv for your project and list all of its dependencies in a requirements.txt file. This lets you keep only your source code, and no libraries, in your git repo, making it much cleaner. It also lets you run a set of verified libraries in production; if you want to try out a new library, you can just install it in your local virtualenv. Or even have two virtualenvs and switch between them, or whatever, without messing up your code repo or other machines' installations.
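One detail worth adding: keep the virtual environment itself out of the repository with a .gitignore entry, so git add --all never picks it up. A minimal sketch, assuming the environment folder is named venv (adjust to whatever name you actually use):
# .gitignore
venv/
*.pyc
__pycache__/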
Related
I am looking for a way to set up a GitHub repo so that someone can clone it and immediately run the code, without needing to set up a virtual environment and then pip install all of the needed packages.
Perhaps I am off base, and it really is the case that they must install these packages themselves, which can then somehow be auto-imported; I'm not sure how that is done.
Or perhaps you can add the virtual environment folder as another folder in the repo, but I feel like you would still have to pip install in that case. (I'm not too familiar with virtual environments.)
I don't want to mess up the project, so I have not messed around with this too much.
Any help would be much appreciated!
Actually, you can create a requirements.txt file for the environment; then anyone (or any build server) that receives a copy of the project only needs to run the
pip install -r requirements.txt
command to reinstall the packages on which the app depends within the active environment.
Hope this gives you some inspiration. For detailed steps, please see this tutorial: Create a requirements.txt file for the environment
I cloned a repo, and when I try to run my tests I get an interpreter error:
Interpreter path does not exist: C:\Users\username\Source\Repos\citcodownloader\env\Scripts\python.exe
The project came with a .sln (solution) file that I opened, and I thought that would set up my environment, but it doesn't seem to have done so. I'm not sure what to do from here.
The best thing you can do is create a new virtual environment (or use an existing one). It looks like your program is looking for one in a folder called "env". Try this:
Open a terminal (Windows key + R, then type cmd and press enter)
Navigate to your repo folder using chdir C:\path\to\your\repo
Run the command env\Scripts\activate.bat (if there is no folder called "env" in your repo, use my instructions below)
Try running your program again.
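Put together in one cmd session, it might look like this (using the path from your error message, and assuming the environment folder is named env; the last line is only a guess at how you run your tests, so substitute your usual command):
cd C:\Users\username\Source\Repos\citcodownloader
env\Scripts\activate.bat
python -m pytest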
I hope this helps, post a comment if it doesn't and I'll add as much detail or explanation as you need. Good luck!
For Googlers, or anyone the above doesn't help, look for these files in your repo:
requirements.txt (a list of the packages needed to set up the virtual environment)
venv/ (folder containing a virtual environment)
Solution
1. If a folder named 'venv' or 'virtualenv' does NOT exist, run this command to create it: python -m venv venv (or python3 -m venv venv if plain python points at Python 2 on your system). If it does exist, move forward.
2. You have a virtual environment! Now enter it using source venv/bin/activate (on Unix or OS X; see the link above for the Windows command).
3. If requirements.txt is there, run this command next: pip install -r requirements.txt. If not, move forward.
4. Run the program again (via whatever method the repo says you should use). If you get an 'error: module is not installed', use the command pip install moduleNameHere and run the program again.
Keep doing step 4 for each missing module. Once the program is working, use pip freeze > requirements.txt to create a requirements file and save yourself the headache next time. :)
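Put together on Unix/macOS, the whole recovery might look like this (requests below is only an example of a missing module; substitute whatever the error actually names):
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt     # only if the repo ships one
pip install requests                # repeat for each missing module
pip freeze > requirements.txt       # save the working set for next time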
I have a custom Python package (call it MyProject) on my filesystem with a setup.py and a requirements.txt. This package needs to be used by a Flask server (which will be deployed on AWS/EC2/EB).
In my Flask project directory, I create a virtualenv and run pip install -e ../path/to/myProject.
But for some reason, MyProject's upstream git repo shows up in pip freeze:
...
llvmlite==0.19.0
-e git+https://github.com/USERNAME/MYPROJECT.git#{some-git-hash}
python-socketio==1.8.0
...
The reference to git is a problem, because the repository is private and the deployment server does not (and should not, and will never) have credentials to access it. The deployment server also doesn't even have git installed (and it seems extremely problematic that pip assumes without my permission that it does). There is nothing in MyProject's requirements.txt or setup.py that alludes to git, so I am not sure where the hell this is coming from.
I can dupe the project to a subdirectory of the Flask project, and then put the following in MyFlaskProject's requirements.txt:
...
llvmlite==0.19.0
./MyProject
python-socketio==1.8.0
...
But this doesn't work, because the path is taken as relative to the working directory of the pip process when it is run, not to requirements.txt. Indeed, it seems pip is broken in this respect. In my case, EC2 runs its install scripts from some other directory (with a full path to requirements.txt specified), and as expected, this fails.
What is the proper way to deploy a custom python package as a dependency of another project?
To install your own Python package from a git repo, you might want to check this post.
To sort out the credential issue, why not have git installed on the EC2 instance? You could simply create an SSH key and share it with the MyProject repository.
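A rough sketch of that approach (the key file name and comment are just examples): generate a key on the EC2 instance, add its public half to the MyProject repository (GitHub calls these deploy keys, and they can be read-only), and then pip can install straight from the private repo over SSH:
ssh-keygen -t ed25519 -C "myproject-deploy" -f ~/.ssh/myproject_deploy
cat ~/.ssh/myproject_deploy.pub     # add this as a read-only deploy key on the repo
pip install "git+ssh://git@github.com/USERNAME/MYPROJECT.git"
You may also need an ~/.ssh/config entry (or ssh-agent) so git picks up that key.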
I am using this solution on ECS instances deployed by Jenkins (with Habitus to hide Jenkins's SSH keys while building the image) and it works fine for me!
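If installing git on the server really is off the table, an alternative sketch (not what the answer above uses) is to build MyProject into a wheel on a machine that does have access, and install from the file so the server never touches the repository:
# on a machine with access to MyProject's source
pip wheel ../path/to/myProject -w wheels/
# ship the wheels/ directory alongside the Flask project, then on the server:
pip install --no-index --find-links=wheels/ MyProject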
I primarily work these days with Python 2.7 and Django 1.3.3 (hosted on Heroku), and I have multiple projects that I maintain. I've been working on a desktop with Ubuntu running inside VirtualBox, but I recently had to take a trip and wanted to get everything loaded up on my notebook. What I quickly discovered was that virtualenv + GitHub makes it really easy to create projects, but I struggled to get them moved over to my notebook. The approach I sort of came up with was to create a new virtualenv and then clone the code from GitHub. But I couldn't do it in the folder I really wanted, because git would say the folder is not empty. So I would clone into a tmp folder and then cut/paste everything into where I really wanted it. Not TERRIBLE, but I just feel like I'm missing something here and that it should be easier. Maybe clone first, then mkvirtualenv?
It's not a crushing problem, but I'm thinking about making some more changes (like getting rid of the VirtualBox and just going with a dual-boot system) and it would be great if I could make it a bit smoother. :)
Finally, I found and read a few posts about moving git repos between computers, but I didn't see any dealing with Virtualenv (maybe I just missed it).
EDIT: Just to be clear and avoid confusion, I'm not trying to "move" the virtualenv. I'm just asking about the best way to create a new one, install the packages, and then clone the repo from GitHub.
The only workflow you should need is:
git clone repo_url somedir
cd somedir
virtualenv <name of environment directory>
source <name of environment directory>/bin/activate
pip install -r requirements.txt
This assumes that you have run pip freeze > requirements.txt (while the venv is activated) to list all the virtualenv-pip-installed libraries and checked it into the repo.
That's because you're not even supposed to move virtualenvs to different locations on one system (there's relocation support, but it's experimental), let alone from one system to another. Create a new virtualenv:
Install virtualenv on the other system
Get a requirements.txt, either by writing one or by storing the output of pip freeze (and editing the output)
Move the requirements.txt to the other system, create a new virtualenv, and install the libraries via pip install -r requirements.txt.
Clone the git repository on the other system
For more advanced needs, you can create a bootstrapping script which includes virtualenv + custom code to set up anything else.
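A minimal sketch of such a bootstrap script, kept at the top of the repository (the file name bootstrap.sh and the env directory name are just conventions chosen for this example):
#!/bin/sh
# bootstrap.sh: recreate the environment on any machine
virtualenv env
. env/bin/activate
pip install -r requirements.txt
# any other project setup (syncdb, collectstatic, ...) can go here
After cloning on a new machine you would just run sh bootstrap.sh.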
EDIT: Having the root of the virtualenv and the root of your repository in the same directory seems like a pretty bad idea to me. Put the repository in a directory inside the virtualenv root, or put them in completely separate trees. Not only do you avoid git complaining about existing files (rightfully -- usually, everything not tracked by git is fair game to delete), you can also use the virtualenv for multiple repositories and avoid name collisions.
In addition to scripting the creation of a new virtualenv, you should make a requirements.txt file that lists all of your dependencies (e.g. Django 1.3). You can then run pip install -r requirements.txt and it will install all of your dependencies for you.
You can even have pip create this file for you by doing pip freeze > stable-req.txt, which will print out your dependencies as they are in your current virtualenv. You can then keep the requirements.txt under version control.
The nice thing about a virtualenv is that you can describe how to make one, and you can make it repeatedly on multiple platforms.
So, instead of cloning the whole thing, clone a method to create the virtualenv consistently, and have that in your git repository. This way you avoid platform-specific nasties.
I'm using Git for version control on a Django project.
As much as possible, all the source code that is not part of the project per se, but that the project depends on, is brought in as Git submodules. These live in a lib directory and have to be included on the Python path. The directory/file layout looks like:
.git
docs
lib
my_project
    apps
    static
    templates
    __init__.py
    urls.py
    manage.py
    settings.py
.gitmodules
README
What, would you say, is the best practice for including the libs on the Python path?
I am using virtualenv, so I could easily symlink the libraries into the virtualenv's site-packages directory. However, this would tie the virtualenv to this specific project. My understanding is that the virtualenv should not depend on my files; instead, my files should depend on the virtualenv.
I was thinking of using the same virtualenv for different local copies of this project, but if I do things this way I will lose that capability. Any better ideas on how to approach this?
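For what it's worth, one way to put the lib directory on the path without symlinking individual libraries is a .pth file: site-packages processes any *.pth file it contains and adds each listed path to sys.path (it still couples the virtualenv to one checkout, though). A sketch, assuming Python 2.7 and a virtualenv at ~/.virtualenvs/my_env; adjust both paths to your layout:
echo "$HOME/code/my_project_checkout/lib" > ~/.virtualenvs/my_env/lib/python2.7/site-packages/my_project_lib.pth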
Update:
The best solution turned out being to let pip manage all the dependencies.
However, this means not being able to use git submodules, as pip can't yet handle relative paths properly. So the external dependencies will have to live in the virtualenv (typically: my_env/src/a_python_module).
I'd still prefer to use submodules, so that some of the dependencies live in my project tree. That makes more sense to me, as I've already needed to fork those repos to change some bits of them, and will likely have to change them some more in the future.
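For reference, the pip-managed setup looks roughly like this in requirements.txt (the URL and name are placeholders); with the virtualenv active, pip checks editable VCS requirements out under the environment's src directory, which is the my_env/src/a_python_module layout mentioned above:
-e git+https://github.com/someone/a_python_module.git#egg=a_python_module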
Dump all your installed packages into a requirements file (requirements.txt seems to be the standard name) using:
pip freeze > requirements.txt
Every time you need a fresh virtualenv, you just have to do:
virtualenv <name> --no-site-packages
pip install -r requirements.txt
pip install -r requirements.txt also works great if you want to update to newer packages.
Just keep requirements.txt in sync with your packages (by running pip freeze every time something changes) and you're done, no matter how many virtualenvs you have.
NOTE: if you need to do some development on a package, you can install it with the -e (editable) flag; this way you can edit the package and you don't have to uninstall/reinstall it every time you want to test new stuff :)
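A tiny sketch of that, assuming the package you are hacking on lives in ./my_package and has a setup.py:
pip install -e ./my_package
# changes under ./my_package are picked up immediately, no reinstall needed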