In my free time I am developing a small Python project based on Django. I use several other Python packages to improve the user experience and my own development experience.
I understand that I run pip install pkgname to install a module named pkgname, then write from pkgname import somewhat, and everything is fine and fluffy. I understand why to use virtualenv, and I already use serious tools like coverage and travis-ci.
Now I want to take the next step. At the moment, my project is not ready for worldwide deployment, but I would like to prepare for that as soon as possible. It's called Palco and hosted on GitHub.
Usually, I think, pip is the current default way to install Python modules. But the application I am developing is not something developers will use in their projects; it is something end users will run. So nobody will do pip install palco and then write from palco import hellyeah. Instead, they will configure their favourite web server to run Django's WSGI handler.
Currently, I would tell people,
please download this archive.zip, extract it and configure your webserver as mentioned
But this is a very PHP-like way of deployment (just throw your files in a directory).
How do I make my end-user Python project deployable? Do I need a setup.py? Am I missing something?
I know that trac is installed with pip, and then I "deploy" my very own trac instance with trac-admin.py. Should I have a palco-admin.py with the same magic effect as trac-admin?
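From what I can tell, commands like trac-admin come from a console_scripts entry point in the package's setup.py, so a palco equivalent might look roughly like this (palco.admin:main is a hypothetical module path):

from setuptools import setup, find_packages

setup(
    name='palco',
    version='0.1',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # hypothetical: palco/admin.py would define a main() function
            'palco-admin = palco.admin:main',
        ],
    },
)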
Related
I have 10 Django projects that use over 50 Django apps. Each app is separated into its own project, added to PyPI, and used by a few projects. Everything is fine, except that every time I work on a project and want to change some code that lives in one of my modules (which happens a lot), I have to open the module's project, make my changes, test, and publish to PyPI, then come back to my project, update the requirements.txt file, and get the updated module from pip.
I'm looking for a way to edit a module right away from any of my projects. For example, instead of getting it from PyPI, I want to get it from git and be able to commit to the git repository inside my venv folder!
I know it seems a little bit crazy, but I could save a lot of time! The publisher and the user of all of these modules is me, so I don't mind the user being able to change them as well.
Any thought or suggestion will be appreciated. Any non-pip solution is fine as well, like writing a custom shell script.
I don't know about editing in your venv folder, which I think is not good practice, but you can install from GitHub with pip: pip install git+https://github.com/urltoproject/repository.git. Fill in the necessary details yourself, of course. This also works with other systems like GitLab. You could keep a separate development requirements file and a production requirements file to separate the two environments, or you can install directly on the command line with pip.
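If you do want a checkout you can edit and commit from, pip's -e (editable) flag clones the repository and installs it in development mode; a minimal sketch, with a placeholder URL and egg name:

pip install -e git+https://github.com/youruser/yourmodule.git#egg=yourmodule
# pip clones into ./src/yourmodule (or <venv>/src/yourmodule inside a virtualenv);
# you can edit, commit and push from that checkout like any other clone.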
I'm a Java/Scala dev transitioning to Python for a work project. To dust off the cobwebs on the Python side of my brain, I wrote a webapp that acts as a front-end for Docker when doing local Docker work. I'm now working on packaging it up and, as such, am learning about setup.py and virtualenv. Coming from the JVM world, where dependencies aren't "installed" so much as downloaded to a repository and referenced when needed, the way pip handles things is a bit foreign. It seems like best practice for production Python work is to first create a virtual environment for your project, do your coding work, then package it up with setup.py.
My question is: what happens on the other end when someone needs to install what I've written? They too will have to create a virtual environment for the package, but they won't know how to set it up without inspecting the setup.py file to figure out what version of Python to use, etc. Is there a way for me to create a setup.py file that also creates the appropriate virtual environment as part of the install process? If not, or if that's considered a "no" as one respondent stated in this SO post, what is considered "best practice" in this situation?
You can think of virtualenv as isolation for every package you install using pip. It is a simple way to handle different versions of Python and of packages. For instance, say you have two projects which use the same packages but in different versions. By using virtualenv you can isolate those two projects and install the different package versions separately, rather than into your working system.
Now, let's say you want to work on a project with your friend. In order to have the same packages installed, you have to share somehow which packages and versions your project depends on. If you are delivering a reusable package (a library), then you need to distribute it, and this is where setup.py helps. You can learn more in the Quick Start.
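A minimal setup.py for such a library might look like this (the name and the dependency are placeholders):

from setuptools import setup, find_packages

setup(
    name='yourlibrary',        # placeholder name
    version='0.1.0',
    packages=find_packages(),
    install_requires=[
        'requests>=2.0',       # an example dependency, pinned loosely
    ],
)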
However, if you work on a web site, all you need is to put the library versions into a separate file. Best practice is to create separate requirements files for tests, development, and production. To see the format of such a file, run pip freeze. You will be presented with a list of the packages currently installed on the system (or in the virtualenv). Put it into a file and you can install it later, on another PC with a completely clean virtualenv, using pip install -r development.txt
And one more thing: please do not pin strict versions the way pip freeze shows them; most of the time you want >= at least version X.X. The good news here is that pip handles dependencies on its own, which means you do not have to list dependent packages; pip will sort them out.
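For example, a development.txt along these lines (package names and versions are only illustrative):

Django>=1.5
South>=0.8
coverage>=3.6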
Talking about deployment, you may want to check out tox, a tool for managing virtualenvs. It helps a lot with deployment.
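A minimal tox.ini sketch (the interpreter list and the test command here are just examples):

[tox]
envlist = py27, py36

[testenv]
deps = -rrequirements.txt
commands = python -m unittest discover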
Python's default package path always points to the system environment, which needs administrator access to install into. virtualenv is able to localise the installation to an isolated environment.
For deployment/distribution of a package, you can choose to:
Distribute the source code; the user then needs to run python setup.py install, or
Pack your Python package and upload it to PyPI or a custom devpi server, so the user can simply run pip install <yourpackage>
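For the second option, the round trip might look like this (the archive name is illustrative):

python setup.py sdist                      # builds dist/yourpackage-0.1.tar.gz
pip install dist/yourpackage-0.1.tar.gz    # install the tarball directly...
twine upload dist/*                        # ...or publish it to PyPI / devpi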
However, as you noticed in the issue above: without virtualenv, the user needs administrator access to install any Python package.
In addition, the PyPI package world contains a certain number of badly tested packages that don't work out of the box.
Note: virtualenv itself is actually a hack to achieve isolation.
First let me explain the current situation:
We have several Python applications which depend on custom (not publicly released) packages as well as generally known ones. These dependencies are all installed into the system Python installation. Distribution of the application is done from source via git. All these computers are hidden inside a corporate network and don't have internet access.
This approach is a bit of a pain since it has the following downsides:
Libs have to be installed manually on each computer :(
How can I better deploy an application? I recently saw virtualenv, which seems to be the solution, but I don't quite see it yet.
virtualenv creates a clean Python instance for my application. How exactly should I deploy this so that users of the software can easily start it?
Should there be a startup script inside the application which creates the virtualenv during start?
The next problem is that the computers don't have internet access. I know that I can specify a custom location for packages (a network share?), but is that the right approach? Or should I deploy the zipped packages too?
Would another approach be to ship the whole Python instance, so that the user doesn't have to start up the virtualenv? In this Python instance all necessary packages would be pre-installed.
Since our apps are growing fast, we have a short release cycle (2 weeks). Deploying via git was very easy: users could pull from a stable branch via an update script to get the latest release. Would that still be possible, or are there better approaches?
I know that these are a lot of questions. Hopefully someone can answer them or give me some advice.
You can use pip to install directly from git:
pip install -e git+http://192.168.1.1/git/packagename#egg=packagename
This applies whether you use virtualenv (which you should) or not.
You can also create a requirements.txt file containing all the stuff you want installed:
-e git+http://192.168.1.1/git/packagename#egg=packagename
-e git+http://192.168.1.1/git/packagename2#egg=packagename2
And then you just do this:
pip install -r requirements.txt
So the deployment procedure would consist of getting the requirements.txt file and then executing the above command. Adding virtualenv would make it cleaner, not easier; without virtualenv you would pollute the system-wide Python installation. virtualenv is meant to let you run many apps, each in its own distinct virtual Python environment; it doesn't have much to do with how you actually install stuff into that environment.
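Since your machines have no internet access, you can also mirror the packages onto a network share and point pip at it; a sketch with placeholder paths (older pips spell the first step pip install --download instead of pip download):

# on a machine with internet access: fetch all packages into a folder
pip download -r requirements.txt -d /mnt/share/packages
# on the target machine: install from that folder only, never from an index
pip install --no-index --find-links=/mnt/share/packages -r requirements.txt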
I'm new to Django and Python in general. However, I was playing around with setting up a Django project when I came across this: https://github.com/jart/django-bone
Its instructions for setup confuse me. Question:
How do I setup and install django-bone on a Mac?
I have Python and Django all installed and working already.
I would suggest starting out by trying to get all the django-bone stuff running on your own.
It seems to require some packages which are not available out of the box, like npm, which is a package manager for Node.js.
But if you insist: clone the repo from GitHub, place it in a suitable directory, and execute the bash script with ./django-bone yourproject
This will create a lot of files (which, I guess, is how it pulls in its dependencies).
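Concretely, something along these lines (yourproject is a placeholder):

git clone https://github.com/jart/django-bone.git
cd django-bone
./django-bone yourproject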
I have developed a Python CGI application which works just fine on my development box. My hosting provider, however, gives me little control over their server: I use a lot of custom stuff in my Python environment (like SQLAlchemy and Mako templating), and the server's Python version is far too old to be used. My question is: how do I set up an isolated, complete, standalone Python environment in my home directory and install my required modules to run my app? ...the easiest way ;)
how do I set up a isolated, complete, standalone python environment in my home directory
mkdir /home/me/.local (if it doesn't already exist. You don't have to use .local but it is becoming the normal place to put this)
mkdir /home/me/.local/src (ditto)
cd /home/me/.local/src
wget http://python.org/ftp/python/2.6.4/Python-2.6.4.tgz
gzip -d Python-2.6.4.tgz
tar xf Python-2.6.4.tar
cd Python-2.6.4
./configure --prefix=/home/me/.local
make
make install
Hopefully you can now run Python:
/home/me/.local/bin/python
Install packages you need using the usual setup.py script, but with your version of Python:
/home/me/.local/bin/python setup.py install
Set the hashbang on your CGI files to use your version of Python:
#!/home/me/.local/bin/python
Consider migrating your application to WSGI if you can. You can of course still deploy WSGI apps through CGI using a wsgiref.handlers.CGIHandler for now, but in the future when you have a less woeful hosting environment you'll be able to deploy using a much less wasteful server interface such as mod_wsgi.
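A minimal CGI bridge script of that kind, assuming your WSGI application object lives at myapp.wsgi_app (a hypothetical name):

#!/home/me/.local/bin/python
import wsgiref.handlers
from myapp import wsgi_app  # hypothetical: import your own WSGI app object

# CGIHandler reads the CGI environment and runs the WSGI app once per request
wsgiref.handlers.CGIHandler().run(wsgi_app)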
In your shoes, I'd use pyinstaller to bundle Python, my code, and all my dependencies into one installer executable, upload it, and run it. Just be sure to use the SVN trunk of pyinstaller -- the "released" version is WAY obsolete.
Be aware that, with SQLAlchemy and everything else, CGI may turn out really slow, since you pay the full startup price every time the page gets visited. But if CGI is all you can afford, I guess that's how I would try to cope!-)
This looks like a job for virtualenv. From the site:
Also, what if you can't install packages into the global site-packages directory? For instance, on a shared host.
This looks to be right up your alley.
I am on Dreamhost's shared plan. Besides CGI, they also offer FastCGI, which makes things much faster than CGI. You should check whether your hosting provider offers that. Or maybe they provide Passenger for Ruby, which you could piggyback your Python on.
If you compile Python yourself, keep in mind the UCS setting if you try to install precompiled packages and experience failures; see the StackOverflow article. Dreamhost's wiki has some advice on how to build and deploy Python yourself on their servers; you may want to adapt it to your needs.
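For reference, the configure switch that controls this on Python 2 is --enable-unicode; to match the UCS-4 builds most Linux distributions ship:

./configure --prefix=/home/me/.local --enable-unicode=ucs4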