Pygit2 is a set of Python bindings to the libgit2 shared library, which implements the Git core methods. Unfortunately, it only seems to provide an API for the plumbing commands.
Is there any Python library built on top of pygit2 which provides an implementation of the most common git porcelain commands?
The porcelain commands are those which are expected to interact with a user, and as such their output is subject to change at any time and has a granularity which, as a rule, is not particularly useful for a computer programme.
If you're not interested in the data structures from libgit2, but rather want to run porcelain commands from your scripts, you might want to take a look at GitPython, which wraps git's own commands behind a Python API.
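For illustration, a minimal sketch with GitPython; the repository path, file name and commit message are placeholders:

    import git

    # Open an existing repository (the path is a placeholder).
    repo = git.Repo("/path/to/repo")

    # Porcelain-style operations through GitPython's object API.
    repo.index.add(["README.md"])
    repo.index.commit("Update README")

    # Anything else can go through the raw git command wrapper.
    print(repo.git.status())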
I want to write a daemon in python which gets started via systemd.
I want to use Type=notify; this way I don't have to do the double fork magic.
According to the docs:
The reference implementation for this notification is provided by libsystemd-daemon.so
... how to do this with Python?
You could probably use the sdnotify Python module, which is a pure-Python implementation of the sd_notify protocol. The protocol is actually rather simple, so the module's implementation is quite short.
To use watchdog machinery you should add WatchdocSec=<smth> to the unit file, and then send WATCHDOG=1 messages on a regular basis from your service. Check Restart= option as well.
Use the systemd-python package:
https://pypi.org/project/systemd-python/
It is written and maintained by the official systemd developers.
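With systemd-python, the equivalent calls look roughly like this (a sketch; it needs to run on a machine with systemd):

    from systemd.daemon import notify

    # Same sd_notify protocol as above, via the official bindings.
    notify("READY=1")
    notify("WATCHDOG=1")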
Currently I am developing an automated test framework. The framework has different packages, which will be referenced in different projects and may be modified locally by the developers. I want to manage the Python package eggs, and I am thinking of using Artifactory. I tried to look for Artifactory help for Python, but I couldn't find anything useful.
Should I use Artifactory or pip?
Edit:
Is there any way or command in Python which can help me put the eggs in Artifactory?
There are numerous reasons to prefer a binary repository manager over a simple shared directory/SCM binary storage:
Fine grained security.
Ability to proxy and cache remote repositories.
More efficient handling of binaries (because it's a tool that's tailored to do so).
Sharing the binaries with other teams and the world is a lot safer and easier.
Integration with many tools in the ecosystem.
Search and manipulation facilities.
Administration tools.
Artifactory exposes a very rich REST API and the deployment of any artifact can be achieved by a simple HTTP PUT request.
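For instance, a minimal sketch using the requests library; the host, repository name, package file and credentials below are all placeholders:

    import requests

    # Hypothetical target: a local PyPI-style repository in Artifactory.
    url = ("https://artifactory.example.com/artifactory/"
           "pypi-local/my_pkg/my_pkg-1.0-py2.7.egg")

    with open("dist/my_pkg-1.0-py2.7.egg", "rb") as egg:
        response = requests.put(url, data=egg, auth=("user", "password"))

    response.raise_for_status()  # expect 201 Created on success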
Take a look at the Defend Against Fruit project. It provides the previously missing glue between Python and Artifactory.
http://teamfruit.github.io/defend_against_fruit/
You can use an "in house" PyPI (either with easy_install -f ... or pip -f ...).
For a server, you can just have Apache serving a directory with all the eggs, or use something like http://pypi.python.org/pypi/pypiserver
I was hoping to implement an SVN communicator in my Python program, so that any file being worked on is automatically stored in the user's SVN repository without any user interaction (the username and password are already provided, so Python takes care of the storage). Are there any libraries that can handle this kind of communication?
Thanks!
There are Python bindings for SVN. They follow the C API, so they present a fairly low-level interface which is not very "Pythonic". I'm not sure how easy they are to install these days; when I tried to use them in the past, I found that it took some digging into the C API documentation to figure out how to make them work.
pysvn provides a more "Pythonic" API. I've used this, and found it very simple in comparison.
There is a pysvn project, which provides a Python interface to various svn tasks. You could use it to invoke the svn commit operation for the user action you want to act upon.
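As a rough sketch, here is how you might commit a file with pysvn while supplying the credentials programmatically; the paths and credentials are placeholders:

    import pysvn

    client = pysvn.Client()

    # Answer authentication challenges without prompting the user.
    # The callback returns (retcode, username, password, save_credentials).
    client.callback_get_login = (
        lambda realm, username, may_save: (True, "user", "secret", True))

    # Commit a working-copy file with a log message.
    client.checkin(["/path/to/working_copy/file.txt"],
                   "Automatic commit from the application")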
I've stumbled upon pexpect, and my impression is that it looks roughly similar to Fabric. I've tried to find a comparison, without success, so I'm asking here in case someone has experience with both tools.
Is my impression (that they are roughly equivalent) correct, or is that just how it looks on the surface?
I've used both. Fabric is higher-level than pexpect and, IMHO, a lot better. It depends what you're using it for, but if your use case is deployment and configuration of software then Fabric is the right way to go.
You can also combine them to get the best of both worlds: Fabric's remoting capabilities and pexpect's handling of prompts. Have a look at these answers: https://stackoverflow.com/a/10007635/708221 and https://stackoverflow.com/a/9614913/708221
There are different use cases for both. Something that pexpect does that Fabric doesn't is preserve state. Each Fabric API command (e.g. run/sudo) is its own individual command. So if you do:
run("cd project_dir && workon project")
run("make")
the second run() won't be in that directory, nor will it be in the virtualenv. While Fabric now has a cd() context manager, it more or less prepends each run with the appropriate cd.
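For comparison, a sketch of the stateful version using Fabric 1.x context managers (assuming the fabric.api import path of that era; prefix() covers the workon activation):

    from fabric.api import cd, prefix, run

    # Each run() still opens a fresh shell, but the context managers
    # re-apply the cd and the virtualenv activation for you.
    with cd("project_dir"):
        with prefix("workon project"):
            run("make")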
In the scheme of things this has little bearing on how the majority of projects work, and goes essentially unnoticed. For some needs, however, you might use pexpect to manage this state: for multiple sudos, or some sort of interactive task that can't be automated with flags.
None of this is a demerit for Fabric, though: since it's pure Python, you're more than able to include pexpect code inside Fabric tasks.
In all other ways, though, Fabric essentially manages all the hard work of remote connections and running commands better than you would writing code from the ground up with pexpect.
Update: I've been informed of a project that combines Fabric and pexpect; you can see more in this question's answer.
I am about to begin a project where I will likely use PyQt or PySide.
I will need to interface with a buggy 3rd-party piece of server software that provides C++ and Java APIs. The Java APIs are a lot easier to use because you get exceptions, whereas with the C++ libraries you get segfaults. Also, Python bindings for the Java APIs come automatically with Jython, whereas Python bindings for the C++ APIs don't exist.
So, how would a CPython PyQt client application be able to communicate with these Java APIs? How would you go about it?
Would you have another separate Java process on the client that serializes / pickles objects and communicates with the PyQt process over a socket?
I don't want to re-invent the wheel... Is there some sort of standard interface for these types of things? Some technology I should look into? RPC, CORBA, etc.?
Thanks,
~Eric
If you want to maintain complete isolation and increase your robustness (so that the 3rd-party library going down doesn't take your client with it; given that it's buggy, I would recommend that), then perhaps something like CORBA is the way forward. Don't forget that Java comes with a CORBA implementation as standard, so you just need to generate your client proxy from the IDL.
Swig may be of interest if you want to run things in-process. It simplifies binding components written in different languages; note in particular that it generates bindings for both Python and Java.
If the criterion is not reinventing the wheel, the SimpleXMLRPCServer and xmlrpclib modules are available in the standard library. They should work in Jython too.
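A rough sketch of that split, using the Python 2 era module names mentioned above; the port and the wrapper function are made up:

    # server.py -- run under Jython, next to the 3rd-party Java API.
    from SimpleXMLRPCServer import SimpleXMLRPCServer

    def query_server(arg):
        # Hypothetical wrapper that calls into the Java API via Jython.
        return "result for %r" % arg

    server = SimpleXMLRPCServer(("localhost", 8000))
    server.register_function(query_server)
    server.serve_forever()

and on the client side:

    # client.py -- run under CPython, inside the PyQt application.
    import xmlrpclib

    proxy = xmlrpclib.ServerProxy("http://localhost:8000/")
    print proxy.query_server("foo")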