The solution I am working on consists of modules that are decoupled across multiple virtual server instances. All of the modules require the exact same DTO (Data Transfer Object) classes. Currently I am packaging the DTOs into a library and deploying it to all server modules; whenever a change is made to the DTOs, I have to redeploy the library to every module to ensure they are all using the latest version.
Are there any technologies or concepts available to share class definitions across multiple server instances without having to redeploy the library manually each time a change occurs?
The only way to share modules across servers without redeploying would be some sort of shared filesystem (NFS, Samba/CIFS, etc.). One thing to explore is whether SaltStack might work for you. It would still involve deployments, but it would make them a snap -- http://www.saltstack.com/
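If you do go the shared-filesystem route, a minimal sketch of the consuming side could look like the following (the mount point /mnt/shared_libs and the dto module and class names are assumptions for illustration, not part of your setup):

import sys

# Hypothetical: each server mounts the shared library directory and
# adds it to sys.path, so every instance imports the same DTO classes.
SHARED_LIB_DIR = '/mnt/shared_libs'  # assumed NFS/CIFS mount point
if SHARED_LIB_DIR not in sys.path:
    sys.path.insert(0, SHARED_LIB_DIR)

from dto import UserDTO  # assumed module/class name

Keep in mind that long-running processes would still need a restart (or an explicit reload) to pick up new class definitions, so this removes the copy step rather than the redeploy semantics entirely.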
I want to be able to write shared functions that can be accessed both by one-off batch scripts and by the running Django service (to use the ORM).
Currently, I have this in the __init__.py under the my_proj module:
import os

# Point Django at this project's settings before calling setup().
if 'DJANGO_SETTINGS_MODULE' not in os.environ:
    os.environ['DJANGO_SETTINGS_MODULE'] = 'my_proj.blah.blah.settings'

import django
django.setup()
This works fine for one Django project. However, I now want to reference the ORM functions of another Django project, "other_proj", in the same repo, from an independent script that lives outside both projects.
Is there a way to "django.setup()" multiple projects at once?
Or, at least, a way to easily toggle the setup between the two projects?
Or is there a better way altogether? (I realize I could create a client library to hit the services while they are running, but would prefer to remove that overhead)
If you want a Django project to access functionality that resides in a different Django project, a client library is an option (as you noted). You could also consider packaging those sets of functionality as reusable Django apps that you import into each project, or you could abstract them further into reusable Python modules which get imported into each project. If you're hoping to use the Django ORM from one project to access data from a different project, then you might be looking for this SO question: How to make two django projects share the same database
With more specifics in your question (such as, for example, "function X in project A that I wish I could call from project B"), we might be able to give more concrete guidance.
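For the "toggle" idea you mention, a rough sketch of a standalone script that picks which project's settings to load before calling django.setup() (the settings paths below are assumptions):

import os
import sys

import django

# Hypothetical mapping from project name to its settings module.
SETTINGS = {
    'my_proj': 'my_proj.blah.blah.settings',
    'other_proj': 'other_proj.settings',  # assumed path
}

project = sys.argv[1]  # e.g. "my_proj" or "other_proj"
os.environ['DJANGO_SETTINGS_MODULE'] = SETTINGS[project]
django.setup()

# From here on, this process is bound to the chosen project's ORM.

Note that django.setup() effectively binds the process to one settings module, so to touch both ORMs in a single batch job you would typically run this script once per project (e.g. via subprocess) rather than toggling within one process.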
I'm not sure I quite understand the case you're trying to implement here; two things that sound maybe-sort-of like what you're asking for are:
1) Running under uWSGI in Emperor mode allows you to serve multiple Django projects from one server simultaneously.
2) Django can be configured to run the same project under multiple domains simultaneously using the Sites framework.
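As a minimal sketch of option 2 (assuming django.contrib.sites is installed and SITE_ID is set per deployment; the view name is made up):

from django.contrib.sites.shortcuts import get_current_site
from django.http import HttpResponse

def whoami(request):
    # Resolves the Site record matching the requesting domain.
    site = get_current_site(request)
    return HttpResponse("Serving %s" % site.domain)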
I agree, though, that more detail about what you have and what you're trying to accomplish with it is probably necessary to give a satisfying answer.
I have a question about updating my Django applications.
I'm working on my Django development server, and I would like to know how I could create a patch that will update my different Django projects installed on different servers.
For example:
On my dev server, I add a new query to my script. I would then like to push this new query to all my virtual instances by updating the script, not by copy/pasting the modifications. That might work for one query, but if I have 200 rows, copy/paste will take far too long.
Instead, I would launch a patch that applies all the updates without manually copy/pasting rows.
How is it possible to do that?
Thank you so much
You should probably consider migrating your project to a version control system.
Then, every time you change something in your local copy of the code and push the changes to the repository, you can fetch/rebase/pull those changes wherever you want (be it your server or another computer of yours), and your patches will be applied without any copy/pasting!
Some hosting services for your version-controlled repositories to consider:
GitLab, which allows the creation of free private repositories
GitHub, the "old reliable", but with no free private repositories
Bitbucket (I don't use it myself, but it is widely used!)
Good luck :)
I'm planning to write some sort of digital management system to help users deal easily with files in a VCS (SVN, Perforce, ...). The main premise is that all the files' custom metadata and dependencies are stored alongside the real files in the VCS, not on a separate database server.
But loading everything from the VCS on demand would make metadata queries extremely slow, so I would like to cache all metadata and dependencies locally and just update them incrementally when needed.
I need to write the whole system in Python, since it has to run in several environments that embed Python.
Theoretically, my needs would be fulfilled by an embedded NoSQL graph database with multi-process access, but sadly I can't find anything that matches these criteria:
every file can have a different metadata structure, so I can't use schemas, which rules out SQL databases
I need to store dependencies
the ability to search metadata and dependencies
several processes need to be able to read the database at once
a serverless solution (only the local machine will use it)
Python support
optionally, a way to inform connected processes about database updates
I would really appreciate it if someone more experienced could point me in the right direction. I'm not looking exclusively for a single silver-bullet piece of software; it can also be a combination of several solutions. I just don't like reinventing the wheel, so I would rather use a third-party solution than write something on my own.
Thank you
ZODB satisfies most of your criteria:
support for arbitrary Python constructs
usable mostly as if everything were in memory as normal objects
if opened in read-only mode, multiple processes can read at the same time (I think)
if used with ZEO (a server), many clients can access it at the same time, with automatic notification of changes. See the sample application in the ZEO guide
You can load the same database with either ZEO or plain ZODB, so you can switch between the two.
There is some tutorial stuff and general info at http://www.zodb.org
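As a minimal sketch of the schema-free metadata use case (assuming the ZODB package is installed; the file names, keys, and dependency values below are made up):

import transaction
import ZODB
import ZODB.FileStorage
from persistent.mapping import PersistentMapping

# Open (or create) a local, serverless database file.
storage = ZODB.FileStorage.FileStorage('metadata.fs')
db = ZODB.DB(storage)
conn = db.open()
root = conn.root()

# Schema-free metadata and dependencies, keyed by file path.
if 'files' not in root:
    root['files'] = PersistentMapping()
root['files']['src/main.py'] = PersistentMapping(
    author='alice',               # arbitrary per-file metadata
    depends_on=['src/util.py'],   # dependency list
)
transaction.commit()

conn.close()
db.close()

Swapping the FileStorage for a ZEO client connection would then give you the multi-client access and change notifications mentioned above.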
Docker containers can be linked. Most examples involve linking a Redis container with an SQL container. The beauty of linking containers is that you can keep the SQL environment separate from your Redis environment; instead of building one monolithic image, one can maintain two nicely separate ones.
I can see how this works for server applications (where the communication is transmitted through ports), but I have trouble replicating a similar approach for different libraries. As a concrete example, I'd like to use a container with IPython Notebook together with the C/C++ library Caffe (which exposes a Python interface through a package in one of its subfolders) and an optimisation library such as Ipopt. Containers for IPython and Caffe already exist, and I am currently working on a separate image for Ipopt. Yet how do I link the three together without building one giant monolithic Dockerfile? Caffe, IPython and Ipopt each have a range of dependencies, making combined maintenance a real nightmare.
My view on Docker containers is that each container typically represents one process, e.g. Redis or nginx. Containers typically communicate with each other using networking or via shared files in volumes.
Each container runs its own operating system image (typically specified in the FROM line of your Dockerfile). In your case, you are not running any specific processes; instead you simply wish to share libraries. This is not what Docker was designed for, and I am not even sure it is doable; it certainly seems like a strange way of doing things.
My suggestion is therefore that you create a base image containing the least common denominator (the shared libraries that are common to all your other images) and that your other images use that image as their FROM image.
Furthermore, if you need a more complex setup of your environment with lots of dependencies and heavy provisioning, I suggest that you take a look at provisioning tools such as Chef or Puppet.
Docker linking is about linking microservices, that is, separate processes, and has no relation to your question as far as I can see.
There is no out-of-the-box facility to compose separate Docker images into one container, which is what you call 'linking' in your question.
If you don't want that giant monolithic image, you might consider using provisioning tools à la Puppet, Chef or Ansible together with Docker. One example here. There you might, in theory, be able to reuse existing recipes/playbooks for the libraries you need. I would be surprised, though, if this approach turned out to be much easier for you than maintaining your "big monolithic" Dockerfile.
It seems that all roads lead to having to use PyISAPIe to get Django running on IIS6. This becomes a problem for us because it appears you need a separate application pool per PyISAPIe/Django instance, which is something we'd prefer not to do.
Does anyone have any advice/guidance, or can share their experiences (particularly in a shared Windows hosting environment)?
You need separate application pools no matter what extension you use. This is because application pools split the handler DLLs into different w3wp.exe process instances. You might wonder why this is necessary:
Look at Django's settings mechanism: os.environ["DJANGO_SETTINGS_MODULE"]. That's the environment of the process, so if you change it for one ISAPI handler and then later for another within the same application pool, both handlers end up pointing at the new DJANGO_SETTINGS_MODULE.
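A toy illustration of the clash (nothing IIS-specific; it just shows that os.environ is process-global, and the settings names are made up):

import os

# Handler A initialises first...
os.environ['DJANGO_SETTINGS_MODULE'] = 'site_a.settings'
# ...then handler B, loaded into the same w3wp.exe process, overwrites it.
os.environ['DJANGO_SETTINGS_MODULE'] = 'site_b.settings'

# Both handlers now resolve their settings from site_b:
print(os.environ['DJANGO_SETTINGS_MODULE'])  # -> site_b.settings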
There isn't any meaningful reason for this, so feel free to convince the Django developers they don't need to do it :)
There are a few ways to hack around it, but nothing works as cleanly as separate app pools.
Unfortunately, isapi-wsgi won't fix the Django problem, and I'd recommend that you keep using PyISAPIe (disclaimer: I'm the developer! ;)
Django runs well on any WSGI infrastructure (much like any other modern Python web app framework), and there are several ways to run WSGI on IIS, e.g. see http://code.google.com/p/isapi-wsgi/ .
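For reference, the WSGI entry point such a setup loads for a Django project is tiny; a sketch (the settings path is an assumption, and older Django versions exposed django.core.handlers.wsgi.WSGIHandler() instead of get_wsgi_application()):

import os

# Assumed settings module for illustration.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_proj.settings')

from django.core.wsgi import get_wsgi_application

# The WSGI callable that isapi-wsgi (or any WSGI server) will invoke.
application = get_wsgi_application()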