Into the Future with the Internet Vendor Task Force
A Very Curmudgeonly View
or
Testing Spaghetti - a Wall's Point of View

as published in ACM SIGCOMM Computer Communication Review
Volume 35, Number 5, October 2005

It is said that many researchers think the Internet Vendor Task Force (IVTF, née IETF) has become irrelevant. They are either measuring the real internet as a behavioral phenomenon, which is a bit scary if you think about it, or they want to do research 'beyond' the internet. Neither involves the IETF.

This is partly because of the research community's inability to get deployment traction via the IETF path. The '90s poster children, QOS, DiffServ, IntServ, Self-Serv, have a long history of attempts at relevance and deployment via the IETF, none successful in the face of a bandwidth glut and a lack of end-to-end signaling in their designs. Whether economics will change sufficiently to give them legs is not clear; I would not bet on it.

But do not think that the direct researcher/operator interface is in the best of shape. For example, given that it is widely believed that the majority of congestion is on customer access links, why is WRED (weighted random early detection) not enabled on those links? Why have operators not asked the vendors to make it the default for some types of interfaces?
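
To make the point concrete, here is a minimal sketch of the decision WRED makes; the class names, thresholds, and probabilities are assumptions for illustration only, not any vendor's defaults.

    import random

    # A sketch of the RED/WRED drop decision. WRED keeps an exponentially
    # weighted moving average (EWMA) of queue depth and, per drop-precedence
    # class, ramps the drop probability linearly between a minimum and a
    # maximum threshold. The profiles below are made-up illustrations.
    PROFILES = {
        # class: (min_threshold, max_threshold, max_drop_probability)
        "best-effort": (10, 40, 0.10),  # dropped earlier and harder
        "assured":     (20, 40, 0.05),  # protected a little longer
    }

    class WredQueue:
        def __init__(self, weight=0.002):
            self.avg = 0.0        # EWMA of instantaneous queue depth
            self.weight = weight  # EWMA gain, RED's w_q

        def sample(self, queue_len):
            # Fold the current instantaneous depth into the average.
            self.avg += self.weight * (queue_len - self.avg)

        def should_drop(self, drop_class):
            min_th, max_th, max_p = PROFILES[drop_class]
            if self.avg < min_th:
                return False  # below the minimum threshold: never drop
            if self.avg >= max_th:
                return True   # above the maximum threshold: always drop
            # In between: probability rises linearly toward max_p.
            p = max_p * (self.avg - min_th) / (max_th - min_th)
            return random.random() < p

That is the whole trick: a moving average and two thresholds per class, which is why "why is it not on by default?" is a fair question.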

Simple ideas for which there was a clear need, ECN, pushback, ... have been poorly deployed at best, and never finished in the IETF community at worst, vendor reluctance being a major block. This inability to get deployment has been exacerbated by the disconnect between the IETF and the operational community - and researchers want their work to be used. And the IETF community has a real problem: it thinks it is smarter than it is. Despite all the attention to cryptographic algorithm agility, when it came to hash functions, the IETF got it wrong every time [0]. Only IKEv2 was even close to correct.
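
On ECN specifically, the mechanism is tiny, which makes the deployment failure all the more telling. A minimal sketch, assuming an AQM decision like the one above has already fired; the codepoints are from RFC 3168, but the function shape is mine, not any stack's API.

    # ECN codepoints (RFC 3168): the two low-order bits of the IPv4 TOS /
    # IPv6 traffic-class byte.
    NOT_ECT = 0b00  # sender is not ECN-capable
    ECT_1   = 0b01  # ECN-capable transport
    ECT_0   = 0b10  # ECN-capable transport
    CE      = 0b11  # congestion experienced

    def mark_or_drop(tos_byte):
        # For a packet the AQM decided to police: mark ECN-capable
        # traffic with CE instead of dropping it; drop only for
        # legacy senders that did not signal ECN capability.
        if (tos_byte & 0b11) in (ECT_0, ECT_1):
            return tos_byte | CE, False  # mark, do not drop
        return tos_byte, True            # fall back to dropping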

Given that the IETF has spent the last decade devaluing researchers almost as much as it has operators, is it that there are few, if any, really serious thinkers left? Or maybe it has become so difficult to have serious new thoughts that the added bureaucratic path through the IETF makes it all just not worthwhile.

But my perspective on the IETF is poisoned by its devaluation of operational realities.

One has only to look at the long and painful histories of (not) deploying IPv6, DNSSEC, various QOS schemes, ... to see the results of the disconnects between research, the IETF process, and the realities of operational deployment.

And no one has bad intent. It's the physics of growth, industry maturity, 42 cases of second-system syndrome, and the resulting, but anti-functional, complexity.

It is easy to lose perspective, become preoccupied with the trees, and think one must be making progress through the forest. When there are as many moving parts as the IETF culture of complexity seems to tolerate, worse, encourage, the concepts of forest and direction are lost in organizational processes, social welfare, and administrative politics.

A famous Mexican painter, David Siqueiros, had a wonderful piece on a church ceiling I visited as a child, The Man Who Was So Open-Minded His Brains Fell Out. As a culture, we are becoming so open-minded and accepting of complexity, both in our product and in our process, that our real engineering vision has fallen out. So we can look at other matured scientific/engineering disciplines, e.g., aerospace, telco, ... and get a feel for where we are headed; they are not fun places. Think ITU, ISO, etc. But today's vision of the IETF will take us there, slowly, openly, and with true team spirit.

[ I wish to thank a few researchers and operators who read and very seriously improved this broadside. It is probably best if they remain anonymous. ]
---

[0] - Deploying a New Hash Algorithm, Bellovin & Rescorla

[1] - MPLS Document Architecture

[2] - Some reflections on Modula-2 standardization