I have developed a classic Python Django web application in my local environment.
Then I used Cython to compile some of my app's modules into dynamic libraries (*.so files) to "protect" the source code.
Now I need to distribute my Django app to a customer (for testing purposes), and I want to ship the whole app (with its files) but without certain files and folders (e.g. the .git folder, *.pyc files, *.py sources, unobfuscated JavaScript sources, etc.).
I would like a single command (e.g. python setup.py local_deploy) that automatically copies the entire app folder to another folder (e.g. build) and removes a specific list of folders or file patterns.
Do you have any suggestions on how to do that?
I recommend packaging your application for deployment, as you said, by running python setup.py sdist or another setuptools command.
You can specify which packages/modules/data files to include, and/or use a MANIFEST.in file to include/exclude files and directories matching specified patterns.
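For instance, a MANIFEST.in along these lines could drop the folders and patterns you listed (myapp and the JavaScript path are placeholders; adapt the patterns to your layout):
# MANIFEST.in -- illustrative patterns only
# start from everything under the app's package directory
graft myapp
# drop version control data
prune myapp/.git
# drop bytecode and plain sources (the customer gets the *.so files)
global-exclude *.pyc
global-exclude *.py
# drop unobfuscated JavaScript sources (placeholder path)
recursive-exclude myapp/static/js *.src.js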
NB: if you need to perform additional steps (besides selecting the correct files and packaging them), you can also add a custom command to setup.py, as sketched below. Though personally, I'd use a bash script / Fabric to perform those steps, outside of packaging.
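If you do want the python setup.py local_deploy interface from the question, a custom command is just a class registered via cmdclass. A minimal sketch, assuming setup.py sits in the project root (the name, paths and ignore patterns are all illustrative):
import shutil
from setuptools import setup, Command

class LocalDeploy(Command):
    """Copy the project tree to build/deploy, skipping unwanted patterns."""
    description = "copy the app to build/, excluding sources and VCS data"
    user_options = []  # no command-line options in this sketch

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # start from a clean target directory
        shutil.rmtree('build/deploy', ignore_errors=True)
        # copy everything except the listed patterns (illustrative list)
        shutil.copytree(
            '.', 'build/deploy',
            ignore=shutil.ignore_patterns(
                '.git', 'build', '*.pyc', '*.py', '*.src.js'))

setup(
    name='myapp',  # placeholder
    cmdclass={'local_deploy': LocalDeploy},
)
Running python setup.py local_deploy then leaves a pruned copy of the tree in build/deploy.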
Related
We are using Perforce for our version control. So when I go to build a package
python setup.py sdist
it creates a versioned folder with everything in it and then tries to delete it, which fails because the folder contains a bunch of read-only files. It also fails when trying to write to setup.cfg when that file isn't checked out.
What is the proper way to do this from a Perforce repo, or for any package that may contain read-only files? Do I have to check out everything first and then revert unchanged files?
Thanks
I have a project that is constantly undergoing development. I have installed a release of the project in my Python distribution's site-packages directory using the project's setup.py script.
However, when I make changes to the project I would like my test scripts to find the files under the project's directory, not those in site-packages. What is the proper way to do this? The only approach I know of is to modify the search path in the test script itself using sys.path, but that means I cannot use the same scripts to test the "installed" version of my code without editing sys.path again.
I'm not quite sure what you are asking, but you could run python setup.py develop to create a development install of your project:
https://pythonhosted.org/setuptools/setuptools.html#development-mode
Under normal circumstances, the distutils assume that you are going to
build a distribution of your project, not use it in its “raw” or
“unbuilt” form. If you were to use the distutils that way, you would
have to rebuild and reinstall your project every time you made a
change to it during development.
Another problem that sometimes comes up with the distutils is that you
may need to do development on two related projects at the same time.
You may need to put both projects’ packages in the same directory to
run them, but need to keep them separate for revision control
purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common
directory or staging area, but without copying any files. Thus, you
can edit each project’s code in its checkout directory, and only need
to run build commands when you change a project’s C extensions or
similarly compiled files. You can even deploy a project into another
project’s checkout directory, if that’s your preferred way of working
(as opposed to using a common independent staging area or the
site-packages directory).
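In practice, run it from the checkout (the directory containing setup.py); the install can be undone later with the --uninstall flag:
cd path/to/project
python setup.py develop
python setup.py develop --uninstall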
Use an "editable" package installation, like:
pip install -e path/to/SomeProject
Assuming we are in the same directory as setup.py, the command will be:
pip install -e .
I've amassed a small collection of small handy ad hoc scripts that I would like to have available to me in all my Python projects and IPython interactive sessions. I would like to add to and clean up this collection without having to worry about writing setup.py files and installing the scripts formally. Of the directories on sys.path by default, what's the proper home for these scripts?
The user site directory should be the right directory for such things.
python -m site --user-site
shows you the correct path for your platform. Usually it's something like $HOME/.local/lib/python<version>/site-packages.
You can even put a module sitecustomize.py there, which will be imported automatically on each interpreter startup.
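As a sketch, such a sitecustomize.py could add a personal scripts folder to sys.path on every startup (~/my-scripts is a placeholder):
# sitecustomize.py, placed in the user site directory shown above
import os
import sys

# illustrative: make a personal scripts folder importable everywhere
scripts = os.path.expanduser('~/my-scripts')  # placeholder path
if os.path.isdir(scripts) and scripts not in sys.path:
    sys.path.insert(0, scripts)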
I'm using Python in a Cygwin environment to develop data processing scripts and Python packages. I'd like to actively use the scripts while also updating the packages on which those scripts depend. My question is: what is the best practice for managing the module loading path so as to isolate and test my development changes without affecting a production script?
Python imports modules in the following order (see M. Lutz, Learning Python):
1. The home directory.
2. PYTHONPATH directories.
3. Standard library directories.
4. The contents of any *.pth files.
My current solution is to install my packages in a local site-packages directory (not in /usr/lib/python2.x/) and add a *.pth file to the global site-packages directory so these are loaded by default. In the development directory I then simply modify PYTHONPATH to load the packages I'm actively working on, with the local changes.
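For reference, such a *.pth file is just a text file in site-packages listing one extra directory per line, e.g. (the path is a placeholder):
# local.pth -- the name is arbitrary, the .pth extension is what matters
/home/me/local-site-packages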
Is there a more standard way of handling this situation? Setting up a virtualenv or some other way of manipulating the module load path?
This is just my opinion, but I would probably use a combination of virtualenvs and Makefiles/scripts in this case. I haven't done it for your specific use case, but I often set up multiple virtualenvs for a project, each with a different Python version. Then I can use Makefiles to run my code or tests in one or all of my virtualenvs. It wouldn't be too hard to set up a Makefile that lets you type make devel to run in the development environment, and make production for the production environment.
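For illustration, the two environments could be wired up like this (the directory names and run_tests.py are placeholders):
virtualenv venv-devel                    # environment for development
./venv-devel/bin/pip install -e .        # editable install of the checkout
virtualenv venv-prod                     # environment mimicking production
./venv-prod/bin/pip install .            # regular install
./venv-devel/bin/python run_tests.py     # test against the working copy
./venv-prod/bin/python run_tests.py      # test against the installed copy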
Alternatively, you could use git branches to do this. Keep your production scripts on master, and use feature branches to isolate and test changes while still having your production scripts just a git checkout master away.
Python has the ability to "pseudo-install" a package by running its setup.py script with develop instead of install. This modifies the Python environment so the package can be imported from its current location (it is not copied into the site-packages directory). This makes it possible to develop packages that are used by other packages: source code is modified in place, and the changes are available to the rest of the Python code via a simple import.
All works fine, except that the setup.py develop command creates an .egg-info folder with metadata at the same level as setup.py. Mixing source code and temporary files is not a good idea: this folder needs to be added to the "ignore" lists of multiple tools, from version control systems to backup systems.
Is it possible to use setup.py develop but create the .egg-info directory somewhere else, so the original source code is not polluted by a temporary directory and files?
setup.py develop creates a Python egg, in-place; it does not modify the Python environment so the package can be imported from its current location. You still have to either add its location to the Python search path or use the directory it is placed in as the current directory.
It is the job of the develop command to create an in-place egg, which may include compiling C extensions, running the 2to3 Python conversion process to create Python 3 compatible code, and providing metadata that other Python code may be relying on. When you install the package as an egg in your site-packages directory, the same metadata is included there as well. The data is certainly not temporary (it is extracted from your setup.py file for easy parsing by other tools).
The intent is that you can then rely on that metadata when using your package in a wider system that relies on the metadata being present, while still developing the package. For example, in a buildout development deployment, we often use mr.developer to automate the process of fetching the source code for a given package when we need to work on it, which builds it as a develop egg and ties it into the deployment while we work on the code.
Note that the .egg-info directory serves a specific purpose: it signals to other tools in the setuptools ecosystem that your package is installed and available. If your package is a dependency of another egg in your setup, that dependency is satisfied; pip, easy_install and buildout will not try to fetch the egg from somewhere else instead.
Apart from creating the .egg-info directory, the only other thing the command does is build extensions, in-place. So the command you are looking for instead is:
setup.py build_ext --inplace
This will do the exact same thing as setup.py develop, but leave out the .egg-info directory. It also won't generate the .pth file.
There is no way to generate only the .pth file and leave out the .egg-info directory.
Technically speaking, setup.py develop will also check if you have the setuptools site.py file installed to support namespaced packages, but that's not relevant here.
Good practice is to keep all source files inside a dedicated directory whose name is your project's name (programmers using other languages keep their code inside a src directory). So if your setup.py file is in the myproject directory, you should keep the source files in myproject/myproject. This keeps your sources separated from other files regardless of what happens in the main directory.
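For example (myproject is a placeholder):
myproject/            <- repository root, holds setup.py, README, etc.
    setup.py
    myproject/        <- the actual package with your sources
        __init__.py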
My suggestion would be to use a whitelist instead of a blacklist: tell the tools to ignore all files except those inside the myproject directory. I think this is the simplest way to avoid touching your ignore lists too often.
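With Git, for instance, a whitelist-style .gitignore can be as small as this (myproject again being a placeholder):
# .gitignore -- ignore everything at the top level...
/*
# ...except the source tree and the files you actually track
!/myproject/
!/setup.py
!/.gitignore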
Try the --install-dir option. You may also want to use --build-dir to change the build directory.
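For example (the path is a placeholder; note that develop expects the target directory to already be on your Python path):
python setup.py develop --install-dir=/tmp/dev-site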