FOSDEM is the biggest free and non-commercial event organized by and for the community. Its goal is to provide Free and Open Source software developers with a place to meet. No registration necessary.


Interview: Jonathan Corbet

Jonathan Corbet will give a talk about "How kernel development goes wrong and why you should be a part of it anyway" at FOSDEM 2011.

Many people know you as the lead editor of LWN.net, but you are also a kernel developer. Could you briefly introduce yourself and the things you do?

I'm Jonathan Corbet, and I'm not really sure what I do anymore. I've been working with computers for rather longer than I care to admit, having gotten my start with CDC mainframes and Cray 1 serial number 3. I've done system programming, system administration, field deployment, and mid-level management. In recent times I'm the lead editor for LWN.net, an occasional book author, and a developer when the time allows. Sometimes it seems like what I mostly do is hang out in airports.

What will your talk be about, exactly?

My talk is about the kernel development process, and, in particular, how things can go wrong. It's a study in failure, and how to avoid it.

What do you hope to accomplish by giving this talk? What do you expect?

Many years ago, in a graduate AI class, I read Herbert Simon's "The Sciences of the Artificial" - a book I would recommend to anybody. Therein he says:

A bridge, under its usual conditions of service, behaves simply as a relatively smooth level surface on which vehicles can move. Only when it has been overloaded do we learn the physical properties of the materials from which it is built.

He applied this idea to the study of how the brain works; we learn most about how we think by looking at our failures. I wanted to take a similar approach to the study of how we develop kernel code. My ultimate hope, of course, is to help prevent similar failures in the future, while, at the same time, dispelling some of the "the kernel community is too hard to work with" myths.

What do you consider the biggest failures of the Linux kernel developer community in the past, and what have they done to prevent it from happening again?

I think we have lost a number of good developers as the result of impedance mismatches between their working styles and the community's needs. Anything we can do to close that gap can only be good for the community in both the short and the long run.

What are the biggest organizational issues that the Linux kernel developer community is facing at the moment and how can they improve the situation?

As a whole, it's really working as well as it ever has. We've managed to scale to a point where, roughly every 80 days, like clockwork, we put out another kernel incorporating the work of over 1,000 developers. So, in a sense, it's hard to find much to criticize.

That said, there's always room for improvement. We could still do better at comprehensively reviewing code going into the kernel. If we're going to continue to scale the size of the project (which, incidentally, is not necessarily something that needs to happen), we'll want to have Linus pulling fewer trees and the mid-level maintainers taking on more of that task. Right now, most pull requests go straight to Linus, and, as we've seen before, he only scales so far.

The Linux kernel is already quite mature, but development obviously keeps going, so in which domains do you still expect big technical improvements in the coming years?

Trying to predict where the kernel will be in a few years is a fool's game. I often make the mistake of trying to predict a single year ahead, and I often get it wrong.

Some things are fairly obvious: we'll continue to work to make the kernel function as well as possible on current hardware. The advent of solid-state storage devices is going to force some interesting changes; I don't think anybody really knows how we're going to make the best use of that hardware a few years from now. As the number of cores grows, we'll get even better at parallelism. And, I predict, some amazing things will come out of the blue and surprise everybody.

It seems that in recent years the influence of embedded Linux companies on the Linux kernel has grown: more and more of these companies contribute to the Linux kernel or have become members of the Linux Foundation. Will this shift the focus of Linux kernel development away from desktop and server use toward embedded use?

The focus of kernel development has always been to support everything as well as we can. The increase in participation from the embedded community is more than welcome, but, in the end, their needs aren't all that different from those of the other communities. For example, SMP support was once seen as a server feature, but, now that dual-core smartphones are hitting the market, the embedded people will be glad to have it.

Embedded developers may bring a stronger focus on kernel size and on power management, but desktop and server users will benefit from that work too. In the end, I really don't think there is much tension between the different user communities; we all need the same things.

At the moment Linus Torvalds seems like a 'single point of failure' for the Linux kernel development. Is the kernel developer community ready to go on if something happens to him?

If Linus were to find enlightenment and sequester himself in a Buddhist monastery tomorrow, there would be a definite hiccup in the development process. It might even delay the next kernel release by a month as various developers argued over who got stuck with his job. Linus's technical skills, his managerial skills, and even his flames would be much missed, but the kernel would go on without him. It's not something that I lose much sleep over.

This interview is licensed under a Creative Commons Attribution 2.0 Belgium License.