I am looking for a flexible solution for uploading generic builds to an artifact repository (in my case it would be Artifactory, but I would not mind if it also supported others, like Nexus).
Because I am not building Java code, adding Maven to the process would only add unneeded complexity.
Still, the entire infrastructure already supports bash and Python everywhere (including Windows), which makes me interested in finding something that involves those two.
I do know that I could code it myself, but now I am looking for a way to make it as easy and flexible as possible.
Gathering the metadata seems simple; only publishing it in the format required by the artifact repository seems to be the issue.
After discovering that the two existing Python packages related to Artifactory are of little use (neither is actively maintained, one is only usable as a query interface, and the other has serious bugs that prevent its use), I discovered something that seems close to what I was looking for: http://teamfruit.github.io/defend_against_fruit/
Still, it seems it was designed to deal only with Python packages, not with generic builds.
Some points to consider:
Tools like Maven and Gradle are capable of building more than Java projects. Artifactory already integrates with them and this includes gathering the metadata and publishing it together with the build artifacts.
The Artifactory Jenkins plugin supports generic (freestyle) builds. You can use this integration to deploy whatever type of files you like.
You can create your own integration based on Artifactory's open integration layer for CI build servers - build-info. This is an open source project and all the implementations are also open sourced. The relevant Artifactory REST APIs are documented here.
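For example, a minimal sketch of that last option from Python, using the requests library to deploy a build artifact with a single HTTP PUT; the host, repository, path, and property names here are all assumptions:

    # Minimal sketch: deploy a generic build artifact to Artifactory via its
    # REST API. Host, repository, path, and properties are all assumptions.
    import requests

    BASE = "http://artifactory.example.com/artifactory"
    # Matrix parameters after ';' attach searchable properties (your build
    # metadata) to the deployed artifact.
    TARGET = BASE + "/generic-local/myapp/1.0/myapp-1.0.tar.gz;build.number=42"

    with open("myapp-1.0.tar.gz", "rb") as f:
        resp = requests.put(TARGET, data=f, auth=("deployer", "password"))
    resp.raise_for_status()  # Artifactory answers 201 Created on success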
Disclaimer: I'm affiliated with Artifactory
Related
I work on a large Python code base with several teammates. We are often installing or updating dependencies on other Python packages, and inevitably this causes problems when someone else updates their master branch from git, or when we deploy on a new system.
I've seen many tools available for deploying environments on new computers, which are great. The problem is that these tools only work if everyone is consistently updating the relevant files (e.g. requirements.txt, setup.py, tarballs on a PyPI server...) every time they update or add a package.
We use Github's pull request system for code reviews. What would be great would be some means of indicating to the reviewer that the dependency structure has changed, prompting the reviewer to check for the necessary updates (also good would be to build in a checklist that the reviewer has to complete, reminding them to do the check).
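To make this concrete, something like the sketch below could run in CI and flag a pull request whenever the dependency manifests change (the manifest names reflect our layout and are assumptions):

    # Sketch of a CI check that flags dependency changes in a pull request.
    # Assumes the dependency manifests live at the repository root.
    import subprocess
    import sys

    MANIFESTS = {"requirements.txt", "setup.py"}

    def changed_files(base, head):
        """Return the set of paths changed between two git revisions."""
        out = subprocess.check_output(
            ["git", "diff", "--name-only", base, head], text=True
        )
        return set(out.split())  # assumes no spaces in tracked paths

    if __name__ == "__main__":
        base, head = sys.argv[1], sys.argv[2]  # e.g. origin/master HEAD
        touched = changed_files(base, head) & MANIFESTS
        if touched:
            print("Dependency files changed: " + ", ".join(sorted(touched)))
            print("Reviewer: please run through the environment checklist.")
            sys.exit(1)  # a non-zero exit makes the change visible in CI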
How have other folks dealt with this problem?
I would enforce the use of tools with a network proxy or network ACL to block public sites, and stand up internal services like GitLab, Bitbucket, GitHub Enterprise, or an internal PyPI server, to force the use of certain standards.
At my work I write numerous small Python scripts for DB management. Most scripts use one or two common libraries which I sometimes update.
Before distribution I freeze the scripts with cx_Freeze, copy over some resources, and upload everything to a server.
I would like to set up a system which would allow me to automatically rebuild/freeze all of the scripts, copy over some files, archive them, and upload them to the server.
I'm not sure where even to look for such a system, because most of what I found consists of complicated server-based systems like AHP for compiled languages, while I need something small for a single computer.
Obviously I could write something quick and dirty in Python, but it seems illogical that there isn't something ready-made for these simple requirements.
please forgive my ignorance, I'm still learning.
Have you considered using make? If you're familiar with it from another language, it might be the easiest. There's also a list of build tools on the Python wiki here.
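And if make turns out to be awkward on a Windows box, the whole rebuild/freeze/upload loop is only a dozen lines of plain Python anyway; a rough sketch, in which the script names, resource folder, cxfreeze options, and upload target are all assumptions:

    # Rough sketch of a single-machine build driver: freeze each script with
    # cx_Freeze, bundle resources, archive, and upload. Names are assumptions,
    # and it expects a clean dist/ directory on each run.
    import shutil
    import subprocess
    from pathlib import Path

    SCRIPTS = ["db_backup.py", "db_cleanup.py"]       # hypothetical scripts
    DIST = Path("dist")
    UPLOAD_URL = "http://server.example.com/builds/"  # hypothetical target

    for script in SCRIPTS:
        target = DIST / Path(script).stem
        subprocess.check_call(["cxfreeze", script, "--target-dir", str(target)])
        shutil.copytree("resources", target / "resources")         # bundle resources
        archive = shutil.make_archive(str(target), "zip", target)  # dist/<name>.zip
        subprocess.check_call(["curl", "-T", archive, UPLOAD_URL]) # HTTP PUT upload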
Have a look at buildout. From their site:
Buildout is a Python-based build system for creating, assembling and deploying applications from multiple parts, some of which may be non-Python-based. It lets you create a buildout configuration and reproduce the same software later.
We are using it to do exactly what you describe. It is flexible and easy to extend through recipes.
Currently I am developing an automated test framework. This test framework has different packages, which will be referenced in different projects and may be modified locally by the developer. I want to manage the Python package eggs. I am thinking of using Artifactory. I tried to look for Artifactory help for Python, but I couldn't find anything useful.
Should I use Artifactory or pip?
Edit:
Is there any way or command in Python which can help me to put the eggs in Artifactory?
There are numerous reasons to prefer a binary repository manager over a simple shared directory/SCM binary storage:
Fine grained security.
Ability to proxy and cache remote repositories.
More efficient handling of binaries (because it's a tool that's tailored to do so).
Sharing the binaries with other teams and the world is a lot safer and easier.
Integration with many tools in the ecosystem.
Search and manipulation facilities.
Administration tools.
Artifactory exposes a very rich REST API and the deployment of any artifact can be achieved by a simple HTTP PUT request.
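For instance, a minimal sketch of such a PUT from Python with the requests library; the repository name, layout, and credentials are assumptions:

    # Minimal sketch: publish a built egg to Artifactory with one HTTP PUT.
    # Repository name, layout, and credentials are assumptions.
    import requests

    ARTIFACTORY = "http://artifactory.example.com/artifactory"
    EGG = "dist/my_package-1.0-py2.7.egg"

    with open(EGG, "rb") as f:
        resp = requests.put(
            ARTIFACTORY + "/pypi-local/my_package/1.0/" + EGG.rsplit("/", 1)[-1],
            data=f,
            auth=("user", "password"),
        )
    resp.raise_for_status()
    print(resp.json()["downloadUri"])  # the URI Artifactory deployed to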
Take a look at the Defend Against Fruit project. It provides the previously missing glue between Python and Artifactory.
http://teamfruit.github.io/defend_against_fruit/
You can use an "in-house" PyPI (either with easy_install -f ... or pip -f ...).
For the server you can have just Apache serving a directory with all the eggs, or something like http://pypi.python.org/pypi/pypiserver
I'm looking for a tool to keep track of "what's running where". We have a bunch of servers, and on each of those a bunch of projects. These projects may be running on a specific version (hg tag/commit nr) and have their requirements at specific versions as well.
Fabric looks like a great start to do the actual deployments by automating the ssh part. However, once a deployment is done there is no overview of what was done.
Before reinventing the wheel I'd like to check here on SO as well (I did my best w/ Google but could be looking for the wrong keywords). Is there any such tool already?
(In practice I'm deploying Django projects, but I'm not sure that's relevant for the question; anything that keeps track of pip/virtualenv installs or server state in general should be fine)
many thanks,
Klaas
==========
EDIT FOR TEMP. SOLUTION
==========
For now, we've chosen to simply store this information in a simple key-value store (in our case: the filesystem) that we take great care to back up (in our case: using a DVCS). We keep track of this store with the same deployment tool that we use to do the actual deploys (in our case: Fabric).
Passwords are stored inside a TrueCrypt volume that's stored inside our key-value store.
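The record-keeping step itself is tiny; a rough sketch of it, where the paths are assumptions and Mercurial stands in for whichever DVCS you use:

    # Rough sketch of the filesystem key-value store: one JSON file per
    # host/project, committed to a DVCS (here Mercurial) after every deploy.
    # The deploy-state directory is assumed to be an existing hg repository.
    import json
    import subprocess
    from pathlib import Path

    STORE = Path("deploy-state")

    def record_deploy(host, project, hg_rev, requirements):
        """Write down what was deployed where, then commit the change."""
        path = STORE / host / (project + ".json")
        path.parent.mkdir(parents=True, exist_ok=True)
        entry = {"hg_rev": hg_rev, "requirements": requirements}
        path.write_text(json.dumps(entry, indent=2, sort_keys=True))
        subprocess.check_call(
            ["hg", "commit", "-A", "-m", "deploy %s on %s" % (project, host)],
            cwd=str(STORE),
        )

    record_deploy("web1", "shopsite", "a3f9c21", ["Django==1.3", "psycopg2==2.4"])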
==========
I will still gladly accept any answer when some kind of Open Source solution to this problem pops up somewhere. I might share (part of) our solution somewhere myself in the near future.
pip freeze gives you a listing of all installed packages. Bonus: if you redirect the output to a file, you can use it as part of your deployment process, since pip install -r thatfile will reinstall everything listed in it.
I see you're already using virtualenv. Good. You can run pip freeze -E myvirtualenv > myproject.reqs to generate a dependency file that doubles as a status report of the Python environment.
Perhaps you want something like Opscode Chef.
In their own words:
Chef works by allowing you to write recipes that describe how you want a part of your server (such as Apache, MySQL, or Hadoop) to be configured. These recipes describe a series of resources that should be in a particular state - for example, packages that should be installed, services that should be running, or files that should be written. We then make sure that each resource is properly configured, only taking corrective action when it's necessary. The result is a safe, flexible mechanism for making sure your servers are always running exactly how you want them to be.
EDIT: Note that Chef is not a Python tool; it is a general-purpose tool, written in Ruby (it seems). But it is capable of supporting various "cookbooks", including ones for installing/maintaining Python apps.
Which framework has the most mature, flexible, integrated, centralized, and easy-to-use plugin/extension system?
My main requirements are:
a centralized system/repository where I could find an extension I need
no need to make changes in the source code; the plugin should be easily enabled and disabled
large plugin/extension database
something like http://wordpress.org/extend/plugins/
http://www.symfony-project.org/plugins/
I can't speak for Django, but I can tell you about Rails' open source community. GitHub is the central location for all Rails open source code.
Most ruby libraries/plugins these days are packaged as "gems", which are easy to install, update, and remove. RubyGems is the place to go for these pre-packaged gems, when you care less about the code and more about dropping the functionality into your application.
There is now a new tool called RVM that keeps the gems (and even rails version) isolated from one application to the next, on your system. That way if one app uses version 1.0 of a gem, and another uses version 2.0, they don't conflict with each other.
All in all, a pretty sweet setup.
There are lots of reusable django apps around. You can find many on the CheeseShop, but even more on GitHub and BitBucket.
There is also django-packages, which is a bit like the CheeseShop, but just for django packages.
VirtualEnv is like RVM (or rather, RVM is like VirtualEnv), which is a great way to isolate your python packages (I even use it in production). It has been around for ages, and works well with pip (the best python package installer).
Both of them are mature frameworks. I don't use Ruby, so I can't speak to the Rails plugin landscape first-hand, but given how popular it is (and from my time lurking on local Ruby lists), it's pretty good.
With Django, you have (like Matthew mentioned) django-packages and a few other places. I've been working on a largish Django project, and it's pretty easy to just search for something like "django facebook" on Google and get what you need. The Pinax project is an integrated collection of Django apps that gives you most things out of the box; that's another thing you might want to consider. The plugins are packaged using the standard Python distutils libraries, so installation is a single command (or, if you're using pip/virtualenv, directly off the net).
VirtualEnv and related tools are not really Django specific. They're good practice if you're doing any python development though.
You should take a step back and evaluate both languages as well in my opinion. Python and Ruby are quite different in their approach to good code and it's likely that one will fit your brain better than the other.