I'm new to Django and to Python in general. However, I was playing around with setting up a Django project when I came across this: https://github.com/jart/django-bone
Its setup instructions confuse me. Question:
How do I set up and install django-bone on a Mac?
I have Python and Django all installed and working already.
I would suggest starting out by trying to get all the django-bone stuff running on your own.
It seems to require some packages that don't come out of the box, like npm, which is the package manager for Node.js.
But if you insist: clone the repo from GitHub, place it in a suitable directory, and execute the bash script with ./django-bone yourproject
This will create a lot of files (which I guess is to pull in the dependencies).
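In concrete terms, the steps might look like this (a rough sketch; yourproject is a placeholder name, and I'm assuming the script ships executable):

git clone https://github.com/jart/django-bone.git
cd django-bone
./django-bone yourproject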
I'm looking to set up Python on a new machine.
I've found many instructions for this; however, I'm concerned with keeping the main installation clean so that each future environment can be modified specifically while I become familiar with the ins and outs of the language and its packages.
I installed Python and git on my old machine and, not really knowing anything, did all the installs via the admin account and made all settings global.
I later discovered this was likely not the best way to do it.
I wonder if anyone here might be able to point this crayon eater in the right direction?
Would I be best off making a user account on the computer specifically for my development projects and installing Python, git, etc. locally on that profile? Or are there parts of the install that one would want installed from the admin account?
It is OK to have git installed globally. Just create a new repository for each project, using git init.
For maintaining Python dependencies per project, consider using virtualenv or pyenv. They create virtual environments which can be activated and deactivated, and they keep you from cluttering up your globally installed Python packages.
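A minimal sketch of that per-project workflow with virtualenv (the env directory name is just a convention I've assumed):

virtualenv env                   # create an isolated Python environment in ./env
source env/bin/activate          # activate it; pip now installs into ./env
pip install django               # lands in the virtualenv, not the global site-packages
deactivate                       # drop back to the system Python

(On Windows the activate script lives under env\Scripts instead of env/bin.)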
An alternative is to create a Docker image for each project and run your projects inside Docker containers.
If you are a beginner, the latter might be overkill.
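If you do go the Docker route, each project boils down to a small Dockerfile; a hypothetical sketch (base image, file names, and the run command are all assumptions):

# Dockerfile for one project (illustrative only)
FROM python:3
WORKDIR /app
# install dependencies first so they are cached independently of the source
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# assumed entry point; a real deployment would use a proper WSGI server
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]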
I have 10 Django projects that use over 50 Django apps. Each app is separated into its own project, published to PyPI, and used by a few projects. Everything is fine, except that every time I work on a project and want to change some code in one of my modules (which happens a lot), I have to open the module's project, make my changes, test, publish to PyPI, then come back to my project, update the requirements.txt file, and get the updated module from pip.
I'm looking for a way to edit a module right away from any of my projects. For example, instead of getting it from PyPI, I want to get it from git and be able to commit to the git repository in my venv folder!
I know it seems a little bit crazy, but I could save a lot of time! The publisher and the user of all of the modules is me, so I don't mind the user being able to change them as well.
Any thought or suggestion will be appreciated. Any non-pip solution would be fine as well, like writing a custom shell script.
I don't know about editing in your venv folder, which I think is not a good practice, but you can install from GitHub with pip:
pip install git+https://github.com/urltoproject/repository.git
Fill in the necessary details yourself, of course. This also works with other systems like GitLab. You could have a separate development requirements file and a production requirements file to separate the two environments, or you can install on the command line directly with pip.
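As a sketch, that split could look like this (file names, package name, and version are placeholders):

# requirements.txt: production, pinned releases from PyPI
yourpackage==1.2.3

# requirements-dev.txt: development, straight from the git repository
git+https://github.com/urltoproject/repository.git#egg=yourpackage

For development you would then run pip install -r requirements-dev.txt instead of the production file.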
In my free time I am developing a small Python project based on Django. I use several other Python packages to improve the user experience and my own development experience.
I understand that I run pip install pkgname to install a module named pkgname, and then I use from pkgname import something, go on, and everything is fine and fluffy. I understand why to use virtualenv, and I already use serious tools like coverage and travis-ci.
Now I want to take the next step. At the moment, my project is not ready for worldwide deployment, but I'd like to prepare for that as soon as possible. It's called Palco and is hosted on GitHub.
Usually, I think, pip is the current default for installing Python modules. But the application I develop is not something developers will use in their projects; users will use it. So nobody will do pip install palco and then write from palco import hellyeah. They will configure their favourite web server to run Django's WSGI handler.
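For context, the WSGI handler I mean is just a small module like this (a sketch; the palco.settings path follows from the project name but is my guess):

# wsgi.py
import os
from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "palco.settings")  # assumed settings module
application = get_wsgi_application()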
Currently, I would tell people,
please download this archive.zip, extract it and configure your webserver as mentioned
But this is a very PHP way of deployment (just throw your files in a directory).
How do I make my end-user Python project deployable? Do I need a setup.py? Am I missing something?
I know Trac is installed with pip, and then I "deploy" my very own Trac instance with trac-admin.py. Should I have a palco-admin.py with the same magic effect as trac-admin?
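From what I can tell, that trac-admin pattern is wired up with a console-script entry point in setup.py; if I understand correctly, a hypothetical palco equivalent would look something like this (the palco.admin:main path is invented for illustration):

from setuptools import setup, find_packages

setup(
    name="palco",
    version="0.1",  # placeholder version
    packages=find_packages(),
    entry_points={
        # installs a palco-admin command that calls palco.admin:main()
        "console_scripts": ["palco-admin = palco.admin:main"],
    },
)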
First let me explain the current situation:
We have several Python applications which depend on custom (not publicly released) packages as well as generally known ones. These dependencies are all installed into the system Python installation. The applications are distributed as source via git. All these computers are hidden inside a corporate network and don't have internet access.
This approach is a bit of a pain since it has the following downside:
Libs have to be installed manually on each computer :(
What is a better way to deploy an application? I recently saw virtualenv, which seems to be the solution, but I don't see it yet.
virtualenv creates a clean Python instance for my application. How exactly should I deploy this so that users of the software can easily start it?
Should there be a startup script inside the application which creates the virtualenv at startup?
The next problem is that the computers don't have internet access. I know that I can specify a custom location for packages (a network share?), as sketched below, but is that the right approach? Or should I deploy the zipped packages too?
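For concreteness, I imagine pointing pip at a directory of downloaded packages, something like this (the share path is made up):

pip install --no-index --find-links=/mnt/share/packages -r requirements.txt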
Would another approach be to ship the whole Python instance, so the user doesn't have to start up the virtualenv? In this Python instance all necessary packages would be pre-installed.
Since our apps grow fast, we have a short release cycle (2 weeks). Deploying via git was very easy: users could pull from a stable branch via an update script (sketched below) to get the last release. Would that still be possible, or are there better approaches?
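The update script itself is trivial today, something like this (the branch name is assumed):

#!/bin/sh
# fetch and check out the latest stable release
git pull origin stable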
I know these are a lot of questions. Hopefully someone can answer them or give me some advice.
You can use pip to install directly from git:
pip install -e git+http://192.168.1.1/git/packagename#egg=packagename
This applies whether you use virtualenv (which you should) or not.
You can also create a requirements.txt file containing all the stuff you want installed:
-e git+http://192.168.1.1/git/packagename#egg=packagename
-e git+http://192.168.1.1/git/packagename2#egg=packagename2
And then you just do this:
pip install -r requirements.txt
So the deployment procedure would consist of getting the requirements.txt file and then executing the above command. Adding virtualenv would make it cleaner, not easier; without virtualenv you would pollute the system-wide Python installation. virtualenv is meant to provide a solution for running many apps, each in its own distinct virtual Python environment; it doesn't have much to do with how to actually install stuff in that environment.
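Putting it together, the whole deployment with virtualenv might look like this (env is just an assumed directory name):

virtualenv env                    # fresh, isolated Python environment
source env/bin/activate
pip install -r requirements.txt   # installs the -e git+ entries listed above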
I have developed a Python CGI application which works just fine on my development box. My hosting provider, however, gives me little control over its server: I use a lot of custom stuff in my Python environment (like SQLAlchemy and Mako templating) and the server's Python version is far too old to be used. My question is: how do I set up an isolated, complete, standalone Python environment in my home directory and install my required modules to run my app? ...the easiest way ;)
how do I set up a isolated, complete, standalone python environment in my home directory
mkdir /home/me/.local (if it doesn't already exist. You don't have to use .local but it is becoming the normal place to put this)
mkdir /home/me/.local/src (ditto)
cd /home/me/.local/src
wget http://python.org/ftp/python/2.6.4/Python-2.6.4.tgz
gzip -d Python-2.6.4.tgz
tar xf Python-2.6.4.tar
cd Python-2.6.4
./configure --prefix=/home/me/.local
make
make install
Hopefully you can now run Python:
/home/me/.local/bin/python
Install packages you need using the usual setup.py script, but with your version of Python:
/home/me/.local/bin/python setup.py install
Set hashbang on CGI files to use your version of Python:
#!/home/me/.local/bin/python
Consider migrating your application to WSGI if you can. You can of course still deploy WSGI apps through CGI using a wsgiref.handlers.CGIHandler for now, but in the future when you have a less woeful hosting environment you'll be able to deploy using a much less wasteful server interface such as mod_wsgi.
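A minimal sketch of that CGIHandler bridge (the hello-world app is a placeholder for your real WSGI application):

#!/home/me/.local/bin/python
import wsgiref.handlers

def application(environ, start_response):
    # placeholder WSGI app; swap in your real application object
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['Hello from WSGI over CGI\n']

wsgiref.handlers.CGIHandler().run(application)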
In your shoes, I'd use pyinstaller to bundle Python, my code, and all my dependencies into one installer executable, upload it, and run it. Just be sure to use the SVN trunk of pyinstaller -- the "released" version is WAY obsolete.
Be aware that with SQLAlchemy and everything else, with CGI you may find you're really slow, since you're paying the full startup price every time the page gets visited. But if CGI is all you can afford, I guess that's the way I would try to cope!-)
This looks like a job for virtualenv. From the site:
Also, what if you can't install packages into the global site-packages directory? For instance, on a shared host.
This looks to be right up your alley.
I am on Dreamhost's shared plan. Besides CGI, they also offer FastCGI, which makes things much faster than CGI. You should check whether your hosting provider offers that. Or maybe they provide Passenger for Ruby, which you could piggyback your Python app on.
If you compile Python yourself, keep in mind the UCS setting if you try to install precompiled packages and experience failures; see the Stack Overflow article. Dreamhost's wiki has some advice on how you could build and deploy Python yourself on their servers; you might want to adapt it to your needs.
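On the UCS point: the Unicode width is fixed when you configure the build, so a UCS4 build (which, as far as I know, matches most Linux distribution Pythons of that era) would be configured like this:

./configure --prefix=/home/me/.local --enable-unicode=ucs4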