LSB 3.2 Beta

Today, we released the first beta of LSB 3.2. If all goes well, this will hopefully be the only beta.

We’ve been working on 3.2 for a while, and we’re really excited about it. We’ve added quite a few interfaces, based on feedback from application vendors and others. There are whole new sections: printing support, Perl and Python, FreeType, Qt 4, and trial use support (our new name for “optional”) for Xrender, Xft, and the ALSA API.

Betas can only be as good as the people participating; more feedback means a better standard. So please go check out the beta. Look at the whole thing, or just parts you’re interested in. Read the spec, or check out the tests, or try building your favorite open-source app with our SDK.
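For that last suggestion, building against the SDK usually just means pointing the build at the LSB compiler wrappers. A sketch, assuming the SDK's usual install location under /opt/lsb (check your own install for the exact paths):

```shell
# Point a typical autoconf-style build at the LSB SDK's compiler wrappers.
# These paths are where the SDK normally lands; adjust if yours differs.
CC="/opt/lsb/bin/lsbcc"
CXX="/opt/lsb/bin/lsbc++"
echo "CC=$CC CXX=$CXX ./configure && make"
```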

We’re hoping for a release before Christmas, but that depends on the feedback we get, of course. And we’d rather hear about that really big issue we forgot and delay the release than find out about it afterward. So get cracking!

Another Long Hiatus

Wow. Has it really been that long since my last post?

It occurred to me today, as I upgraded to the latest WordPress and watched the ongoing security nightmares, that going through this effort is only useful if I actually use the darned thing.

And I’ve been busy; yes I have. I’m now the webmaster for my son’s Boy Scout troop, using MediaWiki as a CMS with an eye to encouraging more parent and Scout participation in the site. I’ve been to Montreal and Salt Lake City, among other places. And I’m preparing to upload a Debian package for virtualenv, a cool alternative to OS virtualization in the Python space.
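In case virtualenv is new to you, here's the basic idea in one sketch (the directory name is arbitrary):

```shell
# virtualenv creates a self-contained Python environment in a directory;
# activating it keeps package installs out of the system Python.
ENV_DIR="sandbox"
echo "virtualenv $ENV_DIR             # create the isolated environment"
echo ". $ENV_DIR/bin/activate         # activate it for this shell session"
```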

More later.

The End

I was a little surprised to see a message of thanks for me and my old Progeny colleagues. Unfortunately, the news at Progeny’s home page was not good:

We are sorry to inform you that Progeny Linux Systems, Inc. ceased operations April 30, 2007.

It’s always a little sad to see a former employer go away, even when you feel the company brought its troubles onto itself. Imagine how much worse it is to see something die that you thought had a lot of potential, with fabulous co-workers, above-average management, and really good ideas. It’s often been said that competence and vision are not sufficient for success; without getting into the details, Progeny is now Exhibit A in making that case for me.

I am grateful for having worked there, and am proud of what we accomplished. It wasn’t easy surviving the dot-com bust and building a new business model for ourselves. And it’s certain that I wouldn’t be where I am today without the opportunities Progeny gave me.

I wish my former colleagues well as they find new jobs. Nearly everyone who passed through Progeny was top-notch, and would make excellent hires.

LSB Distro Testing, Redux

A while back, I posted a set of instructions on how to test a Linux distribution for LSB compliance. With the 3.1 update, testing has gotten a lot easier.

The most notable enhancement in the update is the LSB Distribution Test Kit Manager, or “DTK Manager” for short. This is a framework that controls the execution of the entire test suite and collects the results.

So, it’s time to update the instructions.

First of all, as before, your distribution must ship a few things. There must be an “lsb” package, which depends on everything required by the LSB; if it’s not installed by default, you will need to install it. Your distribution must have a facility for installing RPM packages; this will usually be either RPM itself or a converter such as alien. (The alien utility is used mostly by Debian-based systems, such as Debian, Xandros, or Ubuntu.) Once those are in place, you should install the “Xvfb” or “xvfb” package provided by your distribution. Since Xvfb ships as part of the standard X distribution, it is almost always available.
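For the Debian-based case, converting a test package with alien looks roughly like this (the package name below is made up for illustration, not a real LSB test package):

```shell
# alien can convert an RPM to a .deb so dpkg can install it.
PKG="lsb-test-example-1.0.rpm"
DEB="${PKG%.rpm}.deb"    # roughly the name alien's output will carry
echo "alien --to-deb $PKG    # produces something like $DEB"
echo "dpkg -i $DEB           # then install the converted package (as root)"
```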

When your system is ready, download and install the “lsb-dtk-manager” package, found on the Linux Foundation’s download page. Several bundles are available; find the one that matches the architecture of the distribution you are testing. You may use the “lsb-dist-testkit-manager” or “lsb-dist-testkit” tarballs. Once the tarball is downloaded, unpack it, change to the directory it creates, and run the script.

Once installation is done, start DTK Manager with the following command:

/opt/lsb/test/manager/bin/ [port-number]

This will start the manager back end and attempt to launch a browser to present the user interface. If that doesn’t work, point a browser at the port number you gave the script; the port defaults to 8888 if you don’t give one.
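The port handling is simple enough to sketch; the actual start-up script name varies by release, so the snippet below just stands in for it:

```shell
# Stand-in for the DTK Manager start-up script: the first argument, if any,
# is the port to listen on; otherwise the manager uses 8888.
PORT="${1:-8888}"
URL="http://localhost:${PORT}/"
echo "DTK Manager UI: ${URL}"
```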

If this has all worked, you will see a welcome page in your browser. Click “Get Certified” and fill in the information it requests, such as your name and the name of your distribution; this information will be stored in the test results. Then click “Run tests”.

And that’s it! The tests will take quite a while to run; the browser will display a status window. You can close the browser and do other things while the tests run; point the browser back at the DTK Manager port and click “Execution” to see progress.

When the tests are done running, you will be presented with a results page, which tells you how the tests went. You can fix any problems you find and re-run the tests by going back to the “Get Certified” link. If you pass most tests and fail only a few, you can use “Custom Tests” to run just the test suites with failures.

Of course, you can still run the tests the old way if you prefer; the journals are all that we need for certification. Give the new DTK Manager a try, though, to see if it’s easier.

It’s certainly made my job easier. Besides the ease with which the tests are run (meaning fewer requests for help from testers), it’s possible to completely automate test runs, which will ensure that we can test the next release of the LSB more extensively and learn about problems sooner in the development process.

Like what you see? Thank the Institute for System Programming at the Russian Academy of Sciences. They’ve done an excellent job.

When Censorship Is Good

The whole Kathy Sierra incident is coming to a close, with an NPR interview and a call for a blogger’s code of conduct. (Details at the links; basically, Kathy wrote an innocuous blog about software development, and was harassed into quitting her blog by a few nasty commenters.)

The latter item has touched off a rant by Teresa Nielsen Hayden, about the necessity of moderation:

Bloggers can ban anonymous comments or not, as they please. The problem isn’t commenter anonymity; it’s abusive behavior by anonymous or semi-anonymous commenters. Furthermore, the kind of jerks who post comments that need to be deleted will infallibly cry “censorship!” when it happens, no matter what O’Reilly and Wales say.

Anyone who’s read ML for more than a couple of months has watched this happen. Commenters who are smacked down for behaving like jerks are incapable of understanding (or refuse to admit) that it happened because they were rude, not because the rest of us can’t cope with their dazzlingly original opinions. It’s a standard piece of online behavior. How can O’Reilly and Wales not know that?

By coincidence, I got mail from Charlene Blake recently. Back when I bought my current van, I explained some of the reasoning behind my choice: poor customer service from Toyota caused me to decide to buy the Honda. In that post, I linked to a petition and some other information Charlene had put out there. Little did I know that Charlene had her own little “fan club” who liked to search for references to her and troll their little hearts out, trying to stifle any criticism of Toyota by lies, intimidation, fraud, and other nasty stuff. At first, I tried to be civil, but the stalkers got so vile that I was forced to do some “censoring” to keep my site from becoming an anti-Charlene haven.

Well, it’s two years later, and they’re still at it. As far as I can tell, she attempted to get some advice on cyberstalking from an online forum, and got a lot of abuse instead. Here’s a sample:

If you can dish it out, you have to be able to deal with the
push back. Evidently you can’t. Whining about those who
don’t agree with you won’t get sympathy from myself, and
undoubtedly most other folks who read similar pathetic
moaning from anyone!!
It is clear you are the kind of individual who always blames
others for your problems.
My advice–get a life!

Interesting legal advice, that.

Now, it’s possible that the fine folks at that site just take a dim view of Charlene. You’d expect, though, that if these people were regulars there, they’d have more posts on the site than just posts attacking Charlene. So let’s take a look at some of the names of the people who replied to her: Dave Nightingale, Garnet Williams, Roger Francis, Cheryl Martell, Marisa Decker, Vincent Gagnier, Bruce Coristine, Walter Matthews, and Rick Fasan. Right now, not a single one of those searches returns a post that isn’t about Charlene Blake. (Just in case they try to obfuscate the point with irrelevant posts elsewhere: try to find a post by any of those people on the site that was posted before April 24, 2007.)

By contrast, here’s one other poster on that thread: CK in Delaware. The person’s Charlene comment shows up, but so do a number of other posts, some of which predate Charlene’s initial post. That’s what a regular (or something vaguely close to a regular) looks like. If any of the names above really were regulars, they’d have search results looking like CK’s.

(For the sake of completeness, there’s only one poster left besides Charlene: T. Tonary, a defender of Charlene, also appears to be a one-timer. Ironically, “Bruce” above accuses Tonary of being a shill!)

It would be funny if it weren’t so pathetic.

Charlene comes across to me as a tough woman; certainly she has to have backbone to have pursued this for so long and with such opposition. But why do the Charlenes and Kathys of the world have to put up with this stuff? People talk about “censorship” in regard to deleting nasty comments, and I suppose it is. But Kathy is no longer posting, and Charlene can’t seem to post anywhere without vicious stuff following her. It seems to me that Kathy and Charlene are the ones getting censored.

And if we’re going to have censorship, of one stripe or another, better it be the pond scum than Kathy and Charlene.

Sadly, even I have been made to participate in the anti-Charlene campaign, even if by accident. If you search for Charlene Blake on Google, my blog is the second link, and Google’s excerpt from my initial post linking to Charlene’s petition is from one of the troll commenters. If you don’t actually click the link, you get the impression that I’m trashing her in the main article.

Oh, well. Time to make amends.

New Debian Release

The old testing release is now Debian 4.0:

The Debian Project is pleased to announce the official release of Debian GNU/Linux version 4.0, codenamed etch, after 21 months of constant development. Debian GNU/Linux is a free operating system which supports a total of eleven processor architectures and includes the KDE, GNOME and Xfce desktop environments. It also features cryptographic software and compatibility with the FHS v2.3 and software developed for version 3.1 of the LSB.

That last bit needs to be proven, which I’ll be doing this week.

Almost Made The Paper

My chess club, in the news:

Sean Hollick and Jim Klee run two different clubs on the same day and time with contrary ideals. Yet the two agree that whether it is for the competition or simply for fun, the game of chess is a joy in which everybody should partake.

The Noblesville Daily Times doesn’t believe in keeping stories available online, so you can’t read more than the blurb at Susan’s blog. Sean’s club (the Circle City Chess Club) is where I play; it’s more intense, requiring membership in the USCF to play in tournaments, and having ratings, dues, and the like. The other club (the Hamilton County Free Chess Club) is more casual, with no dues or memberships, talking during games encouraged, and so on.

The online story included a picture of Sean playing his son, Maxx; the paper edition had that picture, plus two from the HCFCC.

I was at the Circle City meeting when the reporter came by to do his research. He talked to Sean and a few others of us, and took pictures of several of us playing. Sadly, they decided not to use any pictures of me.

UPDATE: This link may work.

The Royal Game

In recent months, I’ve started taking up one of my on-again, off-again passions seriously: chess.

I started playing very young; I can’t remember a time when I didn’t know the rules. Growing up, my usual outlet was in reading my dad’s chess books and getting trounced by him in over-the-board play. In my adult life, chess became an occasional diversion. I taught my kids, and played them occasionally, and would sometimes check out new books or play around with portable electronic chess games.

Now I’ve decided to take it a little more seriously. I’ve gotten involved in the local chess club, and joined the US Chess Federation so I can play in tournaments. (You can even see how I’m doing online.) I’ve also joined an online chess site that does correspondence chess. And my wife has been very supportive of all the new equipment purchases and gift requests: a chess table, new set, clock, books…

And, wonder of wonders, the chess world is quite an interesting one, with stuff going on. Unfortunately, a lot of it has been negative of late. The great champion Kasparov has retired from chess (a real loss; his play was some of the most spectacular seen since the Fischer-Spassky championship match in 1972). The world championship has been reunified, but under a cloud. Cheating allegations have multiplied since then. Both the USCF and the World Chess Federation (FIDE) have been mired in controversy over leadership issues.

So, it looks like there should be plenty to blog about, both of a personal nature and in general chess news. I have a long road to walk; my current provisional rating after two tournaments is below 800, which means I get beaten easily by talented children. But it should be fun, and should keep my mind sharp, neither of which are bad things.

Getting the Message Out

From a Fluendo employee:

Are we evil that we don’t take more hours out of our day to build on glibc 2.3 ? You bet, we are cold heartless bastards. But in reality 90% of the people on glibc 2.3 are users that have an upgrade path to a more recent version of their distro; the other people are future Debian Etch users. I’m sure the Etch releasers have convinced themselves of the usefulness of not releasing with a glibc 2.4 that is more than 15 months old, and instead opt for an even older series, even before they actually release. But I am starting to wonder more and more who the people are that are waiting for a release like this.
Realistically speaking, it is possible that we may add glibc 2.3 plugins in the future if we see that more than just Debian is affected. We are not against taking your money for giving you a service that works. But the hours in our day are just as scarce as they are in yours. I just wanted to explain this to people that want to know, to take away your incentive to complain about a nameless faceless Company being Evil to you.

Elsewhere, we learn why Debian is so “backward”. In sum: upgrades from the current stable would break with 2.4, and not all Debian architectures are supported well by 2.4.

I suspect the market for Linux multimedia plugins isn’t a huge one, and Debian is still popular both for end users and as the basis for other efforts. Given that, doesn’t it make sense not to artificially exclude a whole chunk of your potential market?

Of course, I think I know of someone who could help here…

More On Copy Protection

AACS (the copy protection system for HD-DVD, Blu-Ray and other high-definition content) continues to crumble. In a nutshell, AACS adds layers to the process of decrypting movies on disc, and the layers are falling one by one. The previous cracks (see my report) opened individual discs and classes of discs; this crack opens all discs playable by a particular software-based player. It’s possible that the studios could revoke that player’s ability to play discs released in the future, but doing so now hurts customers who will have to update their copy of the player.

With all the news about copy protection failure, it’s worth reading some really good articles on why the efforts of multi-million-dollar companies continue to be cracked by smart teenagers. First, Cory Doctorow’s talk at Microsoft Research:

DRM systems are broken in minutes, sometimes days. Rarely, months. It’s not because the people who think them up are stupid. It’s not because the people who break them are smart. It’s not because there’s a flaw in the algorithms. At the end of the day, all DRM systems share a common vulnerability: they provide their attackers with ciphertext, the cipher and the key. At this point, the secret isn’t a secret anymore.

Cory references another paper written by Microsoft employees, now called simply “the darknet paper”. It’s a little more technical, but explains the problem well:

We investigate the darknet – a collection of networks and technologies used to share digital content. The darknet is not a separate physical network but an application and protocol layer riding on existing networks. Examples of darknets are peer-to-peer file sharing, CD and DVD copying, and key or password sharing on email and newsgroups. The last few years have seen vast increases in the darknet’s aggregate bandwidth, reliability, usability, size of shared library, and availability of search engines. In this paper we categorize and analyze existing and future darknets, from both the technical and legal perspectives. We speculate that there will be short-term impediments to the effectiveness of the darknet as a distribution mechanism, but ultimately the darknet-genie will not be put back into the bottle. In view of this hypothesis, we examine the relevance of content protection and content distribution architectures.

Finally, on the business side, science-fiction publisher Baen Books has been leading the charge away from copy protection in the world of electronic books. Editor and author Eric Flint explains why in a series of articles on their web site; here are the first, second, third, fourth, fifth, and sixth articles on that topic. The sixth article is particularly good, as it explains Baen’s (and Flint’s) experiences with publishing online without copy protection:

The titles are not only made available for free, they are completely unencrypted—in fact, we’ll provide you free of charge with whatever software you’d prefer to download the texts. We make them available in five different formats.

And . . .

The sky did not fall. To the contrary, many of those books have remained in print and continued to be profitable for the publishers and paying royalties to the authors. For years, now, in some cases. Included among them is my own most popular title, 1632. I put that novel up in the Baen Library back in 2001—six years ago. At the time, the novel had sold about 30,000 copies in paperback.

Today, six years after I “pirated” myself, the novel has sold over 100,000 copies.

If you’re curious, I encourage you to check out the Baen Free Library for yourself.

Same As The Old Boss

OK, I’ve been busy, which is why I haven’t said much recently. But in case you haven’t noticed, my old employer, the Free Standards Group, has merged with another open-source consortium (Open Source Development Labs, or OSDL) to form the Linux Foundation. I’ve survived the merger, and am still doing most of what I was doing before.

The reaction has been pretty good so far, and even the criticism we are getting has resulted in a good amount of support in our defense.

Suspicion isn’t out of place. The new group will have to earn its reputation, just as the old organizations did. But I think we’re in a good position to show that we’re not about “combin[ing] open source with all the worst aspects of the proprietary commercial software industry” (see the critical link above, last paragraph). Our management is the same as the old groups, and both groups have established records of improving the quality of open source. Is it that hard to believe that we would do more of the same after the merger?

As for me, I’m going to be doing pretty much the same stuff I was doing before, only with a few more helpers. What could be bad about that?

UPDATE (2007-02-16): I had given the impression that OSDL’s management was entirely gone from the new organization, which is not true.  Sorry, guys!

Copy Protection Broken Yet Again

Boing Boing (via Slashdot):

Arnezami, a hacker on the Doom9 forum, has published a crack for extracting the “processing key” from a high-def DVD player. This key can be used to gain access to every single Blu-Ray and HD-DVD disc.

Previously, another Doom9 user called Muslix64 had broken both Blu-Ray and HD-DVD by extracting the “volume keys” for each disc, a cumbersome process. This break builds on Muslix64’s work but extends it — now you can break all AACS-locked discs.

AACS took years to develop, and it has been broken in weeks. The developers spent billions, the hackers spent pennies.

My HDTV threshold has been inching lower and lower over time, as issues get resolved: lower-cost HDTV monitors, useful broadcast TV, the defeat of the broadcast flag, useful Linux support in hardware and software. Still, it’s clear that my standing advice–don’t do HD yet–has been vindicated.

How much longer? Some of the HDTV options for MythTV recording can do both standard-definition and high-def. If we accept that the HD stuff has to be watched on a computer, I might very soon move to HD recording for local TV channels.

But for now, it seems the major hurdle is HD cable, an area where the technology is still in transition. The current standard is largely a bust, the new standard being rolled out still doesn’t allow certain capabilities (menus, picture-in-picture), and the new standard is due to be eclipsed by yet another standard in a year or so. It’s also clear that reality has yet to set in; for all the consumer confusion and hassle, HD content doesn’t seem to be lacking at the BitTorrent sites.

So, continue to be careful. If you want to be able to do something with your new HD equipment, make sure you can before you leave the store. The HD powers-that-be have yet to honor any promise about future capability, and have broken some of those promises. So if it doesn’t work on the day of purchase, be ready to live without it forever.

As for me, current capabilities (and current prices) are almost at the level I’m looking for. But I haven’t bought yet.

UPDATE (2007-02-14): According to Ars Technica, this crack is still not complete; while all Blu-Ray and HD-DVD discs available today are cracked, the studios could protect future discs by revoking the keys of the software player used in the crack.  To translate that into non-technical English, users of that player would be required to update their player, and discs made after a certain date would not be crackable–until a new software player’s device key is extracted using the same method.

The Price of Success

Oh, the pains of being an early adopter: Google to charge businesses for Google Apps

But it’s not just small companies who have been champing at the bit to make use of Google’s services, as organizations such as Disney, Pixar, and the University of Arizona are eager to sign up to have hundreds of thousands of accounts managed online by Google. The service was offered for free to businesses during Google Apps’ beta period, but will apparently be going live with subscriptions “in the coming weeks,” according to BusinessWeek. It’s still murky as to how much Google will charge organizations for the service, but the fee will reportedly amount to “a few dollars per person per month.”

Now, it is true that all references to pricing refer to business use; there’s no word yet on whether they will charge noncommercial users. And even if they do, a few dollars per month per user isn’t bank-breaking.

But I wonder how well Google will handle the transition. Will some GAFYD customers get cut off if they aren’t paying attention? Will traditional domain hosting get a rush of new customers fleeing? Will Google’s competitors?

I’ve been slowly, slowly warming to this idea of hosted apps. Google Reader took over from Liferea for online news and blogs after I got tired of the latter’s bugs, and Google Calendar works a lot better than the various hack-fests I’ve tried to get local shared calendars working. But I think I’ll stick with hosting my own domain for now, at least until I get a better sense that the providers have the costs figured out.

What Do We Want From Microsoft?

Jason Matusow of Microsoft wants to know:

That said, the real voice of the community is…well…from those of you I don’t know. I have to tell you that the issues with getting this covenant right are incredibly complex and there are real concerns on all sides. Our design goal is to get language in place that allows individual developers to keep developing.

(This is in response to the recent patent deal between Microsoft and Novell, and the poor reception it’s getting from the free software community.)

Unfortunately, he got GrokLaw-ed, and his comment system isn’t taking the heat well. So, here’s my feedback; hopefully, he’s paying attention to views outside his comments.

The big problem, if you ask me, is the distinction between “commercial” and “non-commercial” that Matusow (and everyone else I hear from Microsoft) is making.

In our world, that distinction is a lot less important than the distinction between “proprietary” and “open”. For us, “commercial” is just another way software can be used, and restrictions on commercial use are like restrictions on use by women, or by people in Illinois, or by people who have ever picked their nose in public. Why are businessmen any less deserving of our software as a class than housewives, or Haitians, or other free software developers?

Matusow claims not to be interested in any of this:

We are not interested in providing carte blanche clearance on patents to any commercial activity – that is a separate discussion to be had on a per-instance basis. As you comment, please keep in mind that we are talking about individuals, not .orgs, not .com, not non-profits, not…well, not anyone other than individual non-commercial coders.

Dialogue often means meeting the other person where they’re at, not where you want them to be. They would, presumably, not take us seriously if we insisted on a blanket patent license as a condition for any kind of conversation. Fair enough; but then why should we take them seriously when they insist on us turning our backs on one of our bedrock principles?

But does the conversation have to be either-or? I’m betting that Matusow’s blog post is evidence that it doesn’t. People at his level are not the types to waste time on wild goose chases.

And is it all that strange to think there might be value in the conversation? There’s a mighty thin line between “proprietary” and “commercial”, so thin even we get them confused sometimes. Does Microsoft really care all that much about for-profit use and improvement of free and open tech? If so, they’re prominent members of a small and shrinking club. If not, then it seems to me that we have a lot of common ground for discussion.

My Own Single Point of Failure

I’ve been a bit difficult to reach recently. Part of that has been general busyness, including attending the FSG Printing Summit in Lexington, KY, but that wasn’t helped by my former employer switching offices. They had generously allowed me to continue hosting there after leaving, but I had been lax in searching for alternative arrangements.

This was made worse because I had centralized too much of my online presence there, with no backups, so when they took everything down to move to the new office, I effectively disappeared from the Internet for a time. So if I’ve seemed a bit uncommunicative lately, that’s probably why.

Ian has been filling my head with tantalizing visions of replacing my hosted boxes with online apps. I think I’m going to give some of these a spin, but I’m not convinced yet. It seems to me that the lesson to learn–don’t put all your eggs in one basket–argues equally well either way.

Blog Update

Well, it’s been over a month since the last entry. So much for posting more often!

Today, I’ve updated the blog to WordPress 2.0.3, and installed a new theme. I wasn’t too happy with the old Steam theme, but it was a variable-width theme, and I can’t stand fixed-width themes. (Why buy a better monitor if all the Web pages are forced to 600 pixels?) But with the new and improved theme support in 2.0, there are some nice themes that use your whole browser window.

(Posted to all known aggregators, too; I hope Planet doesn’t decide all my posts are new now.)

UPDATE: Well, that was fun; the nice-looking theme happens to be completely invalid. Expect theme changes over the next short while.

UPDATE: Wow, that’s depressing; the state of valid XHTML in WordPress themes is, uh, underwhelming. So I switched back to the nice theme, and edited it to be valid XHTML 1.0 Transitional and valid CSS. I’ve set up a Bazaar-NG repository for my changes.

Cluelessness in Security

Ladies and gentlemen, I give you: Diebold!

“For there to be a problem here, you’re basically assuming a premise where you have some evil and nefarious election officials who would sneak in and introduce a piece of software,” [David Bear, a spokesman for Diebold Election Systems,] said. “I don’t believe these evil elections people exist.”

(Originally from here, if you can read it.)

Nope. Evil election officials don’t exist, and never have.

Diebold election machines are insecure and poorly designed. Why does anyone tolerate this?

Multimedia on Linux

Another interesting topic for standardization in the LSB involves multimedia. It’s clear that we need to give developers a good story on how to do multimedia on Linux in their applications; what’s less clear is what that story should be.

There’s been an interesting conversation regarding GStreamer, and its status in the KDE desktop. Apparently, burned by their experiences with their previous sound framework, the KDE folks are writing a new system, called Phonon. The idea is that Phonon would provide a clean, stable API layer for KDE apps to use for the vast majority of simple multimedia-ish things, like playing a sound clip.

Christian Schaller, a GStreamer hacker, isn’t too thrilled with this, and posted an unflattering analysis of Phonon to his blog. This prompted the kind of response you’d expect, including criticisms of GStreamer:

All other arguments aside, GStreamer doesn’t offer a stable API. I can understand why that’s the case, but as such, because of the (sane) library policies within the KDE project on binary compatibility we cannot simply use a GStreamer binding as our multimedia solution. Period. I was a little surprised by Christian’s posting because we’ve talked about this multiple times.

This piqued my interest, because there’s been talk within the LSB to add a multimedia framework, and GStreamer is one of the candidates. So Christian’s response is very important:

I consider Scott a friend and I think his entry is well considered. My general response is that the bigger and more complex an API gets the chances of getting it right the first time goes down.

I’m not sure how to react to this. If GStreamer’s ABI is still in flux, it may not be a good candidate for inclusion in the LSB. On the other hand, are there credible alternatives? Phonon’s scope is too limited, and it will likely be tied strongly to KDE, which makes it less desirable. There are other frameworks, but I’m not seeing that any of them have the credibility of GStreamer.

Perhaps the best we can do, at this time, is standardize at a low level (ALSA and Video for Linux) and wait for a clear winner in the multimedia space to emerge.

That sounds like an endorsement of the Phonon approach, and in a sense it is. But we have to be careful that we don’t create another source of complexity for the Linux desktop. If Phonon encourages all of us to play around with two or three separate multimedia frameworks, to the point that we can’t really have multimedia on our desktop without having to mess with more than one, then the Phonon supporters will have done the Linux desktop a disservice.

Future Standardization Directions

Now that LSB 3.1 is out, there’s been some discussion of future directions for the LSB to take. Not surprisingly, desktop componentry beyond the graphical toolkits (GTK+, Qt) has been of interest.

If you’re interested in this, the results of Sun’s evaluation of the GNOME interfaces for inclusion in Solaris provide a lot of good information about what parts of GNOME are stable enough for inclusion.

Some candidates for standardization were left out due to uncertainty over their status as standards:

We would also like to add the icon integration specification as “Stable”, but the fact that the FreeDesktop Standards website makes a weak stability claim by saying, “ is not a standards body” makes us a bit unsure which specifications should be considered Stable. It would be good, I think, if the FreeDesktop community could make a stronger claim about the specific specifications that are needed for desktop integration, such as those recommended for use in the GNOME System Admin Guide, and they should probably be referenced on the GNOME Developer Standards page as well.

Well, if the freedesktop folks are nervous about being a standards body, perhaps they could work with a standards body to codify the things they consider standards. I know of at least one candidate for that position…