IRC: sabdfl on irc.freenode.net
Backgrounder: I'm 34 and counting, South African, living in London. Most of my time is devoted to the Ubuntu project as cheerleader and chief whip. I've spent the last year or so focused on the Launchpad collaborative infrastructure, writing a fair portion of it and helping the team to define our goals. It's a thrill to see it all coming on stream! More details at markshuttleworth.com.
FAQs: Why and Whither for Ubuntu?
I'm happy for other website editors to modify this page, fixing typos or things that we have discussed between us. It's written from my perspective ("I") since the buck, technically, stops here. So add to it on my behalf with care, please ;-)
Ubuntu is not without its controversies. This is a good thing (at least IMO) as it suggests we are both challenging the status quo, and taking some risks. Speaking for myself, my motivation for funding and participating in Ubuntu so heavily is in large part derived from a desire to do both of those things. I enjoy shaking up established lines of thinking, and I enjoy taking risks. This document exists to give the community some insight into my thinking - and to a certain extent that of the Community Council, Technical Board and other governance structures - on some of the issues and decisions that have been controversial.
I will try to address rumours, frequently asked questions, common allegations and neuroses, and of course controversies both within the project ("our default desktop should be *purple*") and in the open source community at large ("alert! alert! Ubuntu is going to become a commercial project!"). You may have come across variants of both of those in the wild. You can take this document as canonical (;-)) for my perspective on them, and where noted, you'll also get the Ubuntu Community Council or Technical Board position. If you would like to see additional things addressed here, please raise them at a Community Council meeting on IRC, or correspond with me or the CC members, or raise it on ubuntu-devel.
Why do I do Ubuntu?
To fix bug #1 of course. I believe that free software brings us into a new era of technology, and holds the promise of universal access to the tools of the digital era. I drive Ubuntu because I would like to see that promise delivered as reality.
Will Ubuntu ever demand licence fees or royalties?
No. Never. I have no interest in taking Ubuntu into the proprietary software industry; it's a horrible business, boring and difficult, and dying out rapidly anyway. My motivation and goal is to find a way to create a global desktop OS that is *free*, in every sense, as well as sustainable and of a quality comparable to anything you could pay for. That's what I'm trying to do, and if we fail, well, then I will go and find some other project to pursue rather than get into the proprietary software business. I don't think any of the core Ubuntu developers, or much of the community, would stick around if I went loony and decided to try the latter, anyhow.
If that isn't enough for you, then you will be happy to know that Canonical has signed public undertakings with government offices to the effect that it will never introduce a "commercial" version of Ubuntu. There will never be a difference between the "commercial" product and the "free" product, as there is with Red Hat (RHEL and Fedora). Ubuntu releases will always be free.
That said, if you want to pay for Ubuntu, or something that includes Ubuntu code, you probably can. There are proprietary apps that are certified for Ubuntu. Some Ubuntu derivatives, like Impi (in which I am an investor), are targeted at vertical markets that demand specific software, currently proprietary, which they bundle. There is already Ubuntu code in Linspire, which you can pay for (w00t!). Though Linspire is not (yet) based directly on Ubuntu, it's not infeasible that the Linspire guys will figure out, sooner rather than later, what a good option that would be for them. There are likely to be many specialised versions of Ubuntu, under other brand names, that have commercial or proprietary features. They might have proprietary fonts or software like Impi, or add-ons or integration with services, etc. There is also likely to be quite a lot of proprietary software available for Ubuntu (there is already a fair bit - Opera for Ubuntu was announced recently, for example). But Canonical, and I myself, and the Ubuntu Community Council and Technical Board, will not produce an "Ubuntu Professional Edition ($XX.00)". There will certainly be no "Ubuntu Vista".
If you don't make a commercial "Ubuntu Professional Edition", how can Ubuntu be sustainable?
We have some initial revenues from services related to Ubuntu. We have been contracted to produce customised distributions, and we participate in large-scale tenders for big Linux deployments, usually in partnership with local companies, where our job is to provide escalation support. In addition to widespread adoption in developing countries, Ubuntu may well be running all over NASA's Moffett Field soon... So we have the foundations of a sustainable project, and I'm confident that we have a reasonable chance of getting Ubuntu to the point where it funds itself for ongoing growth.
Exactly how it will all pan out from a business perspective is difficult to judge. I don't have all of those answers. That's OK; this is a risky (ad)venture, which is still at an early stage, so I don't expect to know. I can personally justify my investment in Ubuntu on philanthropic grounds (at least, the money we spend on open source development and on tools for open source developers, like Launchpad) because most of my good luck and wealth could only have been created using open source tools. I'm happy to give back to the community. Inasmuch as we start to spend money on suits, they need to be sustainable quickly. We currently make some money offering certification-related services (certifying developers, administrators, applications, and hardware) as well as customisation services (if you want your own distro based on Ubuntu, let's talk). Demand for those services is growing. I'm pretty confident that I can get Canonical to break even on that basis. And breaking even is fine by me, because it means that Ubuntu will continue to rock even if I decide it's time to go back to space and pick the wrong Soyuz.
It's also important to distinguish between Canonical, which is a for-profit services operation, and the Ubuntu Foundation, which has capital from me, on a non-profit basis, to continue Ubuntu's work. With the announcement of the Ubuntu Foundation, I was basically saying "OK, this project has legs, I'll commit enough capital to keep the core going for a long time no matter what happens to me or Canonical". So we have plenty of time to grow sustainability around the project. If you want to help on that front, send work to Canonical next time you need something done with Ubuntu. We won't let you down.
What about binary compatibility?
We work very hard to ensure that Ubuntu is internally consistent across all packages, by focusing on a specific toolchain version and making sure that the distribution is built entirely with that toolchain. Where there are exceptions to this (like the kernel, sometimes, on specific architectures) we isolate those aberrations.
The challenge, of course, is that Ubuntu has a reputation for being up to date on the desktop, which by definition implies quite a high rate of change between releases. This is why we also have "long-term" releases, which are supported for 3 years on the desktop and 5 years on the server. Those create a better platform for ISVs (Independent Software Vendors), who will be shipping the same binaries for a long period.
The ISV community is already a part of the broader Ubuntu community; there are many ISVs who participate in Ubuntu's development and help to define the basis of each release, specifically to make it easier for them to certify their applications on it. We provide a point of contact, email@example.com, for ISVs who want to certify their apps on Ubuntu. We try to make it as easy as possible for ISVs to certify their software as 100% compatible with any given Ubuntu release, and this is especially true of long-term releases like 6.06 (Dapper).
What about binary compatibility between distributions?
A lot has been said about the fact that Debian is not binary-compatible with Ubuntu. Sometimes this manifests itself as "I can't install Ubuntu packages on Debian", sometimes it's more "Why does Ubuntu use GCC 4 when Debian is using GCC 3.3?". Or "Why are the kernel and glibc on Ubuntu 5.04 different from those in Debian Sarge?". I'll try to address all of those here.
I'll start with our general policy and approach, and then examine some of those examples in detail.
First, "binary compatibility" means different things to different people. If you've followed the trials and tribulations of the LSB standards process, you'll understand how difficult it is even to *define* binary compatibility in a meaningful way across multiple distributions. That, in essence, is why we don't set "binary compatibility" as a goal for Ubuntu. Sometimes it happens, but if so, it's because there was an opportunity to make something work nicely - not because it's a hard goal. We take opportunities for binary compatibility across distributions where we can find them, but we don't constrain ourselves by making that an absolute requirement.
Just to be clear, I'll say it again, for the record. We don't aim for "binary compatibility" with any other distribution. Why?
In short, because we believe in Free Software as a collaborative process focused on SOURCE CODE, and consider it superior to the proprietary process which is focused on specific applications and binary bits. We choose to devote the large majority of our energy to the improvement of the source code that is widely and freely available, rather than trying to work on binary bits that cannot be shared as widely. When we spend hours of time on a feature, we want that work to be usable by as many other distributions as possible, so we publish the source code in "real time" as we publish new versions of the packages. We go to great lengths to make those patches widely available, in an easy to find format, so that they will be useful to upstreams, and other distributions. That benefits Debian, but it also benefits Suse and Redhat, if any of them are willing to take the time to study and apply the patches.
We synchronise our development with upstream, and with Debian, and with other distributions such as Suse and Gentoo and Mandrake and Red Hat, on a regular basis. We draw code from the latest upstream projects (which might not even be in Debian, or in Red Hat, or addressed in the LSB). We try to merge with Debian Unstable (a.k.a. Sid) every six months. We have no control over the release processes of other distributions, nor of upstream, so it would be impossible for us to define in advance an API or ABI for each release. We are in the hands of hundreds of other developers every time we freeze Ubuntu in preparation for a new version. Even though the Ubuntu community is substantial and growing rapidly, it is still tiny compared to the total number of developers working on all the free software applications that make up the distribution itself. Our job is to package what is there, efficiently and cohesively, not to try to massage it to some pre-defined state of compatibility. We focus on delivering the newest-but-stabilised-and-polished versions of the best open source applications for your server or desktop. If we were to set binary compatibility (at any level) as a top priority, it would massively diminish our ability to deliver either newer software, or better integration and polish. And we think our users care most about the fact that they are getting the best, and most integrated, apps on the CD.
It is worth noting that the Linux kernel itself takes the same approach, shunning "binary compatibility" in favour of a "custom monolithic kernel". Each release of the kernel requires that it be compiled separately from previous releases. Modules (drivers) need to be recompiled with the new release, they cannot just be used in their binary form. Linus has specifically stated that the monolithic kernel - based on source code, not trying to maintain a binary interface for drivers across releases - is better for the kernel. We believe the same is true for the distribution.
So the imperative to work with very current code overrides the idea of maintaining compatibility with a specific ABI, especially if we have little or no say in the ABI we should be trying to remain compatible with.
But, I heard that Ubuntu is LESS compatible than other similar projects?
If you've heard this specific allegation, it's absolutely not true.
If you touch or change the kernel, or X server or clients, or libc, or compiler, you have effectively made yourself incompatible. And as far as I am aware, every significant distribution has, with good reason, invested work in those components to ensure that they meet the needs of their users. In the process, they make themselves "binary incompatible". What makes open source work despite this, of course, is the fact that source code and patches usually travel across distros, which is why we focus our attention there, not on the binary bits.
Some people might say "but I installed a Linspire package on Ubuntu, and it worked, so they must be compatible". And yes, in many cases a binary package from Linspire or Debian will Just Work (TM) on Ubuntu. But this is "accidental compatibility", not "certified binary compatibility". Your Mileage May Vary (YMMV) is not the sort of certainty most people would accept, and can hardly be called "certified compatibility". Many packages have very simple dependencies, and don't really require specific versions of system libraries, and they may well Just Work. But if you look under the hood, at some level or other, you will find binary incompatibility in every significant derivative distribution, from Knoppix through Linspire and the DCC, with Ubuntu being no different.
It is possible to build a new distribution using only package selections from another distribution, and that's useful. It's like the CDD project, and will, I think, be important in the Ubuntu world in future too. But it's not fundamentally very interesting - it's just package selection, which is useful for a specific set of users but does not advance the state of the open source art.
OK, why do you recompile packages?
We ensure that Ubuntu is entirely buildable using the toolchain that is the default in Ubuntu. We usually have a new version of GCC in Ubuntu, and certainly a newer version than Debian. So we make sure that we build all packages in Ubuntu with that new version.
In theory, using newer versions of GCC should give better binaries (though in the past, some GCC version changes have included regressions while laying the foundation for future progress). It also allows us to deal with ABI changes, especially in C++ code, and reduce the number of ABI package versions we have to keep lying around in the archive.
This is equally true of packages in "universe", the thousands of packages in Ubuntu that mostly come from Debian, though there are other sources too. The MOTU ("Masters of the Universe ;-)") team in Ubuntu takes care of those packages, ensuring that ABI transitions and (for example) Python version transitions happen there too. To ensure consistency, all of those packages are rebuilt as well.
How about some specific examples?
There are some good examples of other distributions doing the same thing. Since Ian Murdock and Progeny have been very vocal about this, let's start there. Progeny 1.x was not "binary compatible" with the current Debian stable release at the time. Yes, really. The current "DCC Alliance" release uses a different kernel, and a different libc, from Debian Sarge. In both cases, however, source code patches will transport quite happily from those projects to Ubuntu, and to Debian, and we are happy to take them. That's what makes open source development, focused on the SOURCE CODE and collaboration around the code itself, more productive than proprietary development.
I don't in any way mean to disparage those other distributions. It's worth pointing out, however, that often the people who are shouting loudest about "binary compatibility" have happily disregarded it in their own work. Because in reality it is simply not that important in the open source world, and it is also not practical as a high-priority goal.
Why was Ubuntu 5.04 (Hoary Hedgehog) not "binary compatible" with Debian Sarge?
Many people report no problems moving packages between Ubuntu 5.04 and Sarge. But they are not entirely compatible. Ubuntu 5.04 and Debian Sarge have slightly, but significantly, different libc versions. When Ubuntu 5.04 was released, it WAS compatible with the version in Sarge, which was then in deep freeze. After the Hoary release, a change was proposed in Debian. In order to implement it, the Debian team would have to break compatibility with Hoary, which had already been released. This was discussed openly, and the decision was taken to make the change. We (at Ubuntu) absolutely believe this was the right decision by Debian. This is about open source, and we can collaborate effectively if we focus on the source code. Had Debian felt obliged NOT to implement the change, in order to maintain Ubuntu compatibility, then the open source world would in fact have been impoverished.
So inasmuch as there is a binary incompatibility between those two releases, it was not introduced by the Ubuntu team. Furthermore, we actively support the decision process that led to the incompatibility - that's what makes open source strong.
What about the GCC 4.0 transition? Why did you adopt GCC 4.0?
We always try to include the latest stable development tools, libraries, and applications. GCC 4.0 was released early in the Breezy (Ubuntu 5.10) development cycle, so it was the appropriate choice of compiler for that release. It also gives us basic support for some Java apps, through GCJ and GIJ. Adopting it meant that C++ applications compiled on Breezy would by default have a different Application Binary Interface (ABI) from the same libraries compiled in Sarge, which used GCC 3.
This was discussed with the Debian toolchain maintainers, who were themselves planning to adopt GCC 4 at some point (this sort of involved the Ubuntu maintainer talking to himself, since they are the same guy, but he published his plans for discussion). A plan was agreed with regard to the specific naming of binary packages compiled with GCC 4, so that there could be a graceful migration and upgrade process for users who were updating from a previous release of Ubuntu (or Debian). The Ubuntu team then went ahead and pioneered the work, providing patches for hundreds of packages to bring them into compliance with the agreed naming for GCC 4. These patches are available to all the Debian maintainers, and make their life a lot easier with the GCC 4.0 migration in Debian.
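To illustrate how such a renaming plan works in practice, here is a small sketch. The helper function is hypothetical (there is no such tool in the archive), and the "c2" suffix and package name are examples from that transition as I recall them; the point is simply that appending an agreed suffix lets old-ABI and new-ABI library packages coexist while hundreds of dependent packages are rebuilt:

```python
def abi_package_name(package: str, suffix: str = "c2") -> str:
    """Hypothetical sketch: derive the post-transition name of a C++
    library package. Appending an agreed ABI suffix to the binary
    package name means packages built against the old and new C++
    ABI can sit side by side in the archive during the migration."""
    if package.endswith(suffix):
        # Already carries the new ABI name; nothing to do.
        return package
    return package + suffix

# An old-ABI package and its renamed new-ABI counterpart can both
# be present while dependent packages migrate:
old = "libfltk1.1"
new = abi_package_name(old)
print(old, "->", new)  # libfltk1.1 -> libfltk1.1c2
```

Once every reverse dependency has been rebuilt against the new ABI, the old packages can simply be dropped from the archive.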
Why is the default desktop in Ubuntu BROWN?
The overarching theme of the first set of Ubuntu releases is "Humanity". This drives our choice of artwork as much as our selection of packages and decisions around the installer. Our default theme in the first four releases of Ubuntu is called "Human", and it emphasises warm, human colours - brown.
Yes, that's rather unusual in a world where most desktops are blue or green, and Mac OS X has gone kitchenware. Partly, we like the fact that Ubuntu is different, warmer. The computer is not a device any more; it's an extension of your mind, your gateway to other people (by email, VoIP, IRC, and over the web). We wanted a feel that was unique, striking, comforting, and above all, human. We chose brown. That's quite a high-risk choice, because to render brown your screen has to render subtle shades of blue, and green, and red. Even slight variations from the norm can shift the "brown" substantially. But monitors and LCD screens these days are increasingly of a standard where we felt the risk was acceptable. In Hoary and Breezy we have gone with a richer, redder brown, based on feedback from lower-end laptop and LCD screen users.
Will brown always be the default desktop colour?
It's unlikely that ANYTHING will be static forever, given that we expect Ubuntu to be around for a long time.
Our current plan is that the Dapper Drake (Ubuntu 6.06 if we hit our June 2006 release date goal) will be the last of this first "set" of releases. So post-Dapper we have the opportunity to define a new "feel" or overarching theme. It would be unlikely to be... blue. But it might be substantially different to the current Human theme. For the moment, let's stay focused on the road to Dapper, polish up the existing Human theme to the max for that, and then break new ground post-Dapper.
Is Ubuntu a Debian fork? Or spoon? What sort of silverware are you, man?
Yes, Ubuntu is a fork. No, it isn't. Yes it is! Oh, whatever.
In short, we are a project that tries hard to collaborate with many other projects - such as upstream X.org, and GNOME, and of course Debian. In many cases, the code we ship is modified or different to the code shipped by those other projects. When that happens, we work hard to ensure that our changes are published as widely as possible, in a format that is easy for other project maintainers to understand and incorporate into their own working tree.
In practice, we have gone to great lengths to develop tools that make it easy to collaborate with Ubuntu, and help us to collaborate with upstreams and other distributions. For example, we have an automatic patch publisher that shows Debian maintainers what patches for their packages are available for Ubuntu. It couldn't be easier for DDs to decide which patches they want, and which they don't. And frankly, it's a lot easier for us if they DO take them, but we can't force that. Many of the patches only make sense in Ubuntu. As a side benefit, these patches are also available for Gentoo, Red Hat, Linspire (yes, really) and Suse. And we know they check 'em out and use some of them, which is cool.
Collaboration goes beyond patches though. We have developed Malone, a bug tracker that explicitly tries to create collaboration between Ubuntu and other distros, and upstreams, on the fixing of bugs. Each bug can be tracked in lots of places, and in a single place you can see the status of the bug in all places. It's pretty cool.
One of the triggers that got me out of the "cosmonaut playboy international love rat of mystery" game and into Ubuntu was the emergence of tools like TLA, which seemed to offer the promise of even better collaboration on source code between distros and upstreams. So we did a lot of work on TLA, to the point where it looked different enough to call it Bazaar. Then we did a ground-up rewrite in Python, and the result is Bazaar-NG, or Bzr, which will be Bazaar 2.0 by March 2006. Why is this important? Because passing patches around is not nearly as effective as working in a genuinely distributed revision control system. Many of the Ubuntu guys don't work on the distro; they work on tools like Bazaar, and HCT, which we hope will really accelerate the kind of collaboration that is possible in the open source world. Time will tell.
In summary: binary compatibility between Ubuntu and Debian is not a priority for us. We believe we contribute more to the open source world by providing patches to make Ubuntu (and Debian) packages work better, and providing a cutting edge (or bleeding edge, depending on your perspective) distribution for others to collaborate with. We invest a lot of energy in making sure our patches are widely published and easily available to developers of ALL other distributions as well as upstream, because that way we think our work will have the biggest long term benefit. And we develop tools (see Bazaar and Bazaar-NG and Launchpad and Rosetta and Malone) that we hope will make source code collaboration even more efficient.
What about forking the community?
The Ubuntu community has grown very quickly, and that causes some people to worry that this growth might come at some cost to other open source communities, Debian in particular.
Given that patches can flow so easily between Ubuntu and Debian, it seems to me that the bigger we can make our total combined developer community, the better for both projects. Ubuntu benefits from a strong Debian, and Debian benefits from a strong Ubuntu. This is particularly true because the two projects have slightly different goals. Ubuntu gets to break new ground sooner, and Debian benefits hugely from those patches (just scan changelogs in Debian Sid since the Sarge release, and you'll see how many references to "Ubuntu" are in there - and that's only the cases where credit has been given).
If the Ubuntu and Debian communities worked in the same way, then I think there would be more truth to this concern, because we would attract the same sorts of people, which would mean that we were competing for talent. But the two communities are quite different. We organise ourselves differently, and we set different priorities. That means that we tend to attract different sorts of developers.
Now, there are certainly Debian developers who have started doing most of their work in Ubuntu now. There are also developers who work equally in Ubuntu and Debian. But the majority of the Ubuntu community is made up of newer developers, who are attracted to the Ubuntu way of doing things. There will always be some churn and movement between communities, and that's healthy because it helps to spread good ideas.
What if Ubuntu's success means Debian dies?
That would be very bad for Ubuntu. Every Debian developer is also an Ubuntu developer, because one way to contribute to Ubuntu is to contribute to Debian. We incorporate Debian changes regularly, because that introduces the latest work, the latest upstream code, and the newest packaging efforts from a huge and competent open source community. Without Debian, Ubuntu would not be possible. And the Debian Way is under no threat; it's getting a lot more exposure in more interesting places now that Ubuntu has shown what amazing things can be done within that community.
Why is Ubuntu not part of the DCC Alliance?
I don't believe the DCC will succeed, though its aims are lofty and laudable. It would be expensive to participate, and it would slow down our ability to add the features, polish and integration that we want in new releases. I'm not prepared to devote scarce resources to an initiative that I believe will ultimately fail. There's no point here in going into the reasons why I think the DCC will fail - time will tell. I would encourage members of the Ubuntu community to participate in the DCC discussions if they have time and are interested. If the DCC produces good code, we should merge that into Ubuntu releases, and it should be easy to do so.
Why are you funding Ubuntu, instead of giving the money to Debian?
I spent a lot of time thinking about how best to make a contribution to the open source world, and how best to explore the ideas I am personally interested in, such as the best ways to deploy open source on the desktop. One option was to stand for the position of DPL (I'm a DD, first maintainer of Apache in 1996 blah blah) and drive those ideas inside Debian. In the end I decided to create a parallel distribution, and invest in the infrastructure to make inter-distro collaboration a lot more efficient.
First, a lot of the things I want to do involve reducing the scope of the distro. That makes it MORE useful for one set of people, but quite clearly LESS useful for others. For example, we currently officially support only 3 architectures in Ubuntu. That's GREAT for people running those architectures, but clearly not so useful for people running on something else.
Similarly, we support about 1,000 core apps in Ubuntu. Those are the core pieces that are in the "main" component for Ubuntu, Kubuntu and Edubuntu. Everything else is accessible, as part of Universe or Multiverse, but not officially supported.
The more I thought about it, the more I realised this was the wrong thing for Debian, which derives much of its strength from its "universality". It makes more sense to take these approaches in a separate project. We get to pioneer and focus on those things, and the patches are instantly accessible to the DDs who feel they are appropriate for Debian too.
Second, the problem of "sharing between distributions" is the really interesting one. At the moment, we tend to see the world as upstream, a distro, and derivatives. Actually, the world looks more like a bunch of different projects that need to collaborate. We need to collaborate with Debian, but we should also be able to collaborate with upstream, and with Gentoo. And with Red Hat, too. We need to figure out how to collaborate effectively with distributions that use totally different packaging systems from ours. Because the reality of the open source world is one in which the number of distributions continues to rise - each one fulfilling the needs of a specific group of people, based on their job, or their cultural identity, or the institution for which they work, or their personal interests.
Solving the "distro collaboration problem" would really advance the state of open source. So that's what we set out to do in Ubuntu. We work on Launchpad, which is a web service for collaboration on bugs, and translations, and technical support. We work on Bazaar, which is a revision control system that understands branching and distributions, and is integrated with Launchpad. And hopefully those tools allow us to make our work available easily to Debian, and to Gentoo, and to upstream. And also, allow us to take good work from other distros (even if they would rather we didn't ;-)).
And finally, it seems to me that the hard part is not making funds available, it's allocating them to people and projects. I could easily write a cheque to SPI, Inc, for the same amount that I've invested in Ubuntu. But who would decide how that money was spent? Have you actually read the financial statements of SPI, Inc, over the past few years? Who would decide who gets hired full time, and who doesn't? Who would decide which projects get funded to be worked on, and which don't? As much as I admire the governance and social structures of Debian, I don't believe that it would be effective to allocate funds to it and expect to see the same level of productivity that we have been able to achieve in the Ubuntu project.
Mixing funding with volunteer work raises all sorts of issues. Ask Mako to tell you about the experiment that showed that this difficulty might be hard-wired into our genes - there are deep social difficulties with projects that blend full time paid work with volunteer efforts. (See this article for details.) I'm not sure Debian needs that kind of challenge. You can very quickly get into deep conflict over who gets to allocate money and hire people, and who decides which ideas get funding and which don't. One of the things that I believe gives Debian its real strength is the sense of "untaintedness". And to a certain extent, the fact that Ubuntu does NOT force changes into Debian has helped to reinforce that healthy reputation for Debian.
OK, but why don't you call it "Debian for Desktops" then?
Because we respect the Debian trademark policy. You may have watched the mindbending contortions around the definition of "DCC Alliance" recently for an example of what happens when people don't. Very simply, the Ubuntu project is not Debian, so it has no right to use the name. And using the name would weaken Debian's own brand name. In addition, we like the "humanity" aspect of the name Ubuntu, so we chose that.
Now we are on the naming thing, what's with the "Funky Fairy" naming system?
The official name of any Ubuntu release is "Ubuntu X.YY" where X represents the year, less 2000, and YY represents the month of the release in that year. So the first release, made in October 2004, was Ubuntu 4.10. The next release (at the time of writing) is due in April 2010, and so it will be Ubuntu 10.04.
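Since the scheme is purely mechanical, it can be written down in a few lines (a quick sketch for illustration, not anything the project actually ships):

```python
from datetime import date

def ubuntu_version(release: date) -> str:
    """Compute the 'X.YY' release number from the release date:
    the year less 2000, followed by the zero-padded month."""
    return f"{release.year - 2000}.{release.month:02d}"

print(ubuntu_version(date(2004, 10, 20)))  # -> 4.10 (Warty Warthog)
print(ubuntu_version(date(2010, 4, 1)))    # -> 10.04
```

Note that the version number says nothing about how big a release is; it's simply a date.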
The development codename of a release takes the form "Adjective Animal". So for example: Warty Warthog (Ubuntu 4.10), Hoary Hedgehog (Ubuntu 5.04), Breezy Badger (Ubuntu 5.10), are the first three releases of Ubuntu. In general, people refer to the release using the adjective, like "warty" or "breezy".
Many sensible people have wondered why we chose this naming scheme. It came about as a joke on a ferry between Circular Quay and somewhere else, in Sydney:
lifeless: how long before we make a first release?
sabdfl: it would need to be punchy. six months max.
lifeless: six months! thats not a lot of time for polish.
sabdfl: so we'll have to nickname it the warty warthog release.
And voila, the name stuck. The first mailing list for the Ubuntu team was called "warthogs", and we used to hang out on #warthogs on irc.freenode.net. For subsequent releases we wanted to stick with the "hog" names, so we had Hoary Hedgehog, and Grumpy Groundhog. But "Grumpy" didn't sound right for a release that was looking really good, and had fantastic community participation. So we looked around and came up with "Breezy Badger". We will still use "Grumpy Groundhog", but those plans are still a surprise, to be announced...
For those of you who think the chosen names could be improved, you might be relieved to know that the "Breezy Badger" was originally going to be the "Bendy Badger" (I still think that rocked). There were others...
For all of our sanity we are going to try to keep these names alphabetical after Breezy. We might skip letters, and we'll have to wrap eventually. But the naming convention is here for a while longer, at least. The possibilities are endless. Gregarious Gnu? Antsy Aardvark? Phlegmatic Pheasant? You send 'em, we'll consider 'em.