Fetchez le Python

Technical blog on the Python programming language, written in a pure Frenglish style. Main topics are Python and Mozilla – the opinions here are my own, not my employer’s

Category: zope

Monitoring a Zope 2 application

We have a simple need for a customer project that runs a Zeo server and a few Zeo clients: being able to check the status of every Zeo client and monitor what they are doing.

DeadlockDebugger almost provides this feature since it is able to produce a dump of the execution stack for every thread a Zope instance is running.

Based on this tool, I have developed ZopeHealthWatcher, which provides a console script to query a Zope instance and get back the status of every running thread. It tells you whether a thread is idle or running some code. The script also returns an exit code depending on the number of busy threads, so it can be used with tools like Nagios.

When there are 4 or more busy threads, the script also returns the execution stack of every busy thread, plus some extra info like the system load and memory usage. The returned info will be extensible through plug-ins in the next version, but right now the provided info is enough for our needs.
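
For Nagios-style monitoring, only the exit code matters. Here is a minimal check wrapper sketch, assuming the console script is installed under the name zopehealthwatcher and takes the instance URL as an argument (both the command name and its argument are assumptions, check the PyPI page for the real interface):

import subprocess
import sys

# hypothetical command name and argument; adapt to the real console script
CHECK_CMD = ['zopehealthwatcher', 'http://localhost:8080/']

def main():
    # the script exits with a code that depends on the number of busy threads
    status = subprocess.call(CHECK_CMD)
    if status == 0:
        print 'OK - Zope threads look healthy'
    else:
        print 'WARNING - busy threads detected (exit code %d)' % status
    sys.exit(status)

if __name__ == '__main__':
    main()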

I have also created an HTML version, so when the dump is requested from a tool other than the console script (e.g. a browser), it displays a nice human-readable interface (check the PyPI page for more info and a screenshot).

Notice that DeadlockDebugger is hackish, since it patches the Zope publisher at startup. But we won’t change this part: we need this tool to run on everything from the oldest to the newest Zope 2 version, and the patch just works fine, so…

The provided version should run out of the box in a buildout-based Plone 3 application, but requires manual installation steps on older Plone or CPS versions.

I didn’t mention these manual steps in the documentation. I think I am the only person in the world interested in running this tool on the dead-but-still-in-production-in-many-places Nuxeo CPS.

By the way, kudos to Marc-Aurèle Darche, who has been maintaining CPS for years now, making it one of the most bug-free and stable CMS solutions out there. OK, it’s probably easier to reach this level of quality when the platform is very stable and only evolves slowly, thanks to Georges Racinet.


Pycon 2009 talks

I have two accepted talks at Pycon, which is great. I would like to say that the Pycon review system is awesome, because you can see what the reviewers said and understand why your talk was accepted or declined.

I was a bit frustrated that my Atomisator talk was declined, but I think it makes sense: this is a new tool, and besides my user group and a few people, it is not really used yet.

One reviewer said that it had to be picked, and another one answered:

I agree that PyCon should not restrict itself to well-known projects, but it should definitely restrict itself to projects that are (a) in production use, (b) under active development, and (c) likely to still be so in a year. There are so many projects meeting these criteria that for me, the bar is very high indeed to spend a talk slot on one that does not.

OK, fair enough: I will present this talk at Pycon 2010, and they won’t have any argument to decline it 😉

The talks that made it:

  • How AlterWay releases web applications using zc.buildout
  • On the importance of PyPI in delivering and building Python software – mirroring, fail-over and third-party package indexes

I will go into greater detail later on.

Python Isolated Environment (PIE)

Here’s a proposal I will send to python-dev. What do you think?

(Disclaimer: this proposal is heavily inspired by the work done in various existing tools; it does not reinvent anything.)

The problem

Python developers distribute and deploy their packages with a myriad of dependencies. Some of these are not yet available as official OS Python packages, and sometimes a package even conflicts with the official version installed by a given OS.

In any case, the development cycle of most Python applications is shorter than the release cycle of Linux distributions, so it is impossible for application Foo to wait for Bar 5.6 to be officially available in Debian 4.x.

Therefore, developers need a way to provide or describe the specific list of dependencies their application needs in order to work.

And this list of dependencies might conflict with the list of packages already installed in Python. In other words, even if this is not desirable from an OS packager’s point of view, an application might need to provide its own execution context.

Right now, when Python starts, it uses the site module to browse the site-packages directory and populate the path with the packages it finds there. .pth files are also parsed to provide extra paths.

Python 2.6 introduced a per-user site-packages directory: an extra directory you can define, which is added to the path like the central one.

But both simply append new paths to the environment, without any exclusion rules or version checking.
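
To make the current behaviour concrete, here is a tiny interactive sketch using the standard site.addsitedir() call (the directory name is made up): the directory is simply appended to sys.path and its .pth files processed, with no version checking or exclusion.

>>> import sys, site
>>> site.addsitedir('/opt/myapp/extra-packages')   # made-up directory
>>> '/opt/myapp/extra-packages' in sys.path        # appended, no checks
True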

The workarounds

A few workarounds exist to express which packages (and versions) an application needs to run, or to set up an isolated environment for it:

  • setuptools provides the install_requires mechanism, where you can define dependencies directly inside the package as new metadata (a minimal sketch follows this list). It also provides a way to install two different versions of the same package and lets you pick, in code or when the program starts, which one to activate.
  • virtualenv lets you create an isolated Python environment where you can define your own site-packages. This ensures you are not conflicting with an incompatible version of a given package.
  • zc.buildout relies on setuptools and provides an isolated environment, similar in some aspects to virtualenv.
  • pip provides a way to describe requirements in a file, which can be used to define bundles, very similar to what zc.buildout provides.
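
As an illustration of the setuptools option above, here is a minimal setup.py using install_requires (the project name, layout and version pins are just made-up examples):

from setuptools import setup

setup(
    name='myapp',               # made-up project name
    version='0.1',
    packages=['myapp'],
    install_requires=[
        'lxml>=0.9',            # example version pins
        'SQLAlchemy==0.7',
    ],
)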

But they all aim at the same goal: defining a specific execution context for a specific application, and declaring dependencies without regard to other applications or to the OS environment.

This proposal describes a solution that can be added to Python to provide that feature.

The solution

An isolated environment file that describes dependencies is added. This file can be tweaked by the application packager, or later by the OS packager if something goes wrong.

The isolated environment file

A new file, called a Python Isolated Environment file (PIE file), can be provided by any application to define the list of dependencies and their versions.

It is a simple text file:

  • line 1 provides a list of paths, separated by ‘:’
  • each following line, starting at line 2, lists one package; each package can be prefixed by a `!`

For example:

/var/myapp/myenv
lxml
sqlite
sqlalchemy
!sqlobject

These packages may or may not already be installed in Python.

Versions can be provided in this file as well:

/var/myapp/myenv:/var/myapp/myenv2
lxml >= 0.9
sqlite > 1.8
sqlalchemy == 0.7
!sqlobject == 0.6

The file is saved with the .pie extension.

Loading an isolated environment file

A new function called load_isolated_environment is added to site.py; it lets you load a PIE file.

Loading a PIE file means the following (a rough sketch of the function follows this list):

  • for each package defined, starting at line 2, load_isolated_environment looks in the environment for a package with the required version. The version is given by the package.__version__ value, or by PKG-INFO when available. If the package exists but no version can be found, version 0.0 is used.
  • for packages without the ! prefix:
    • if the package is not found, each path provided on line 1 of the file is scanned, using the site-packages mechanism, looking for that package.
    • if the package is found there, it is added to the path.
    • if the package is still not found, a PackageMissing error is raised.
  • for packages starting with the ! prefix:
    • if the package is found, it is removed from the path.
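
Here is a rough, hypothetical sketch of what such a function could look like, only to illustrate the algorithm above (version comparison and the removal of excluded packages are simplified, and none of these names exist in site.py today):

import site
import sys

class PackageMissing(Exception):
    pass

def _package_available(name):
    # simplified helper: a real implementation would also compare the
    # package's __version__ / PKG-INFO version against the requirement
    try:
        __import__(name)
        return True
    except ImportError:
        return False

def load_isolated_environment(pie_file):
    lines = [line.strip() for line in open(pie_file) if line.strip()]
    extra_paths = lines[0].split(':')        # line 1: list of paths
    for requirement in lines[1:]:            # line 2+: one package per line
        excluded = requirement.startswith('!')
        name = requirement.lstrip('!').split()[0]
        if excluded:
            # a real implementation would locate the package and remove
            # its directory from sys.path; skipped in this sketch
            continue
        if not _package_available(name):
            # scan the extra paths, site-packages style (.pth files included)
            for path in extra_paths:
                site.addsitedir(path)
        if not _package_available(name):
            raise PackageMissing(name)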

This function can be called by code like this:

>>> from site import load_isolated_environment
>>> load_isolated_environment('/path/to/context.pie')

From there, sys.path meets the requirements, and the code executed after this call will benefit from this context.
Another context can be loaded in the same process:

>>> load_isolated_environment('/path/to/another_context.pie')

Limitations:

  • the new context might break other programs running in the same process; it’s up to the application packager to fix the context file.
  • it’s not the job of load_isolated_environment to resolve dependency issues: if the foo package needs the bar package, it won’t complain.
  • it is not the job of load_isolated_environment to fetch missing dependencies.

Using an isolated environment file

Typically, an isolated environment file can be used in high-level Python scripts. For example, any launcher script an application provides:

# this launcher script loads the application's context, then runs Zope
from site import load_isolated_environment
load_isolated_environment('zope-3.4.pie')

# from here on, imports are resolved against the isolated context
import zope

if __name__ == '__main__':
    zope.run()

Pycon 2009 proposals

The proposal acceptance date is in a few days.

Here are the four proposals I have made:

  • The state of packaging in Python. This talk summarizes the current options for distributing your packages. It also explains the pitfalls and the gap between Python developers and OS vendors and packagers. I think this talk will not be picked because the topic is wide and vague, so I proposed to transform it into a panel where lead developers from various frameworks could explain their usage of distutils and what is missing to make them happy. No feedback yet on this.
  • Atomisator, the agile data processing framework. This tool is starting to be useful, and I think it can help others too. Check http://atomisator.ziade.org for a quick overview.
  • How AlterWay releases web applications using zc.buildout. That is the same talk I gave at the Plone conference, but presented in a way that makes it clear zc.buildout is not tied to Zope and Plone and can be used with any other application. As a matter of fact, it has become a standard here, and we use it for Pylons and more.
  • On the importance of PyPI in delivering and building Python software – mirroring, fail-over and third-party package indexes. That’s a long title. It presents my work on PyPI.

Lastly, I will attend the Python Language Summit the day before Pycon; I volunteered to be a “champion” on distutils matters.