>The IETF is an open standards process that works by consensus.
>the mailing list rather than in a meeting where attendance is rather
>haphazard. Meetings are useful to help get consensus; they are no
>guarantee of consensus. Therefore I am not (yet) buying Tim's
>the 29:1 vote in Amsterdam.
> First of all, voting is not necessarily consensus
>(although with 29:1 it seems to come close :-), second, the last few
>over 200 mails have convinced me that there is no consensus.
There are many points under the URI umbrella, ranging
from standardization of existing practice at the URL
end to inventive creation of new techniques at the URN end.
If there is a lot of discussion, that does not mean that
some issues are not ripe for the guillotine.
Even if a topic is discussed, the size of the list is
sufficient that there will always be new people coming in
who ask "I am new to this list, but coming from a background
in porpoise farming, if I am not missing something,
surely..." which will create a lot of traffic.
There is also always the possibility of directly
obstructive behavior delaying consensus for any amount of time.
It is the duty of the chairs to isolate issues where
consensus can be obtained.
In this case, the URL issue is ripe. If consensus cannot
be reached now, then it never will be.
>As an Area Director for Applications Area I will thus be very
>accept the latest URL draft as an RFC submission.
What, then, will happen? The essential requirement
for a universal printable representation for
any name space on the net was defined in Boston.
Soon afterward, rewriting and much refinement of
the document was performed, down to details of
In Amsterdam, one member suggested that the whole
work should be changed to the definition of a binary
representation, threatening the entire effort at
a point when small changes in allowed characters
were being decided.
>It is important (in my view even essential to the success of an
>Information Architecture for the Internet) that we get the URL/URN
>right. And we won't get multiple chances.
To think that we will define *everything* absolutely
*correctly* *now* is foolish, and an ideal way to
extend happy debate indefinitely.
First we define some basic, easily agreeable things,
a mixture of good design, and in some places arbitrary
choice. Like the principle of URI. And its basic
Later we can define URNs, and URN resolution protocols.
In time, probably many. But we can represent them
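The point about the URI principle above can be sketched in a few lines of code (my own illustration, not the draft's actual grammar): once a universal printable form is agreed, a single scheme prefix lets one string space embrace many existing name spaces, and new schemes such as URNs can be added later without disturbing it.

```python
# Minimal sketch of the URI principle: a scheme prefix followed by a
# scheme-specific part. This is an illustration only; the real draft
# defines the precise syntax and allowed characters.
def split_uri(uri):
    """Split a URI string into (scheme, scheme-specific part)."""
    scheme, sep, rest = uri.partition(":")
    if not sep:
        raise ValueError("no scheme in %r" % uri)
    return scheme.lower(), rest

# One printable representation covers several existing protocols;
# the hostnames here are made up for the example.
for example in ("http://info.cern.ch/",
                "ftp://ftp.example.org/pub/readme",
                "gopher://gopher.example.edu/"):
    print(split_uri(example))
```

Deciding only this much now leaves the scheme-specific parts, and future schemes like URNs, to be worked out later without reopening the basic agreement.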
Where would we be if the RFC822 authors had waited
to get MIME right? We still wouldn't have email.
> To get it right we need something
>that everyone can live with. I mean all the implementors, service
>and user support people on the uri-list. I do not necessarily mean
>everyone should agree that the solution is optimal, that will not be
>achieved, evidently; however I feel that consensus can be reached on
>something that is acceptable to all of us.
I think that was reached on the net and in the
flesh at Amsterdam. There had been very little net
traffic, and discussion was at a detailed level.
>Threats like : " we have an installed base, so we won't support
>perpendicular to an open standards process and are undermining
>consensus. Parties that argue like that have no wish to get an open
>but merely a wish to further their own inventions. This will NOT
>lead to an integrated Information Architecture.
I hope I personally have not argued in this way.
There is however a
rather successful convention that the Internet respects
working code. A typical ISO member's ploy is to
deliberately change a standard to put all developers
back on the same footing. Hence ISO's attempt to put
Europe back into the networking picture with ISO
protocols. The IETF is big enough and commercial
interests are present enough for it to have to guard
against such behavior.
> In an open standards process as the
>IETF, accepting that an NIH idea might have merit is the basis of
>Finally I'd like to remark that on top of all this we have another
>condition for this WG (it is not in the USV area for nothing): The
>One of the UR* will be visible to the user (which one depends on
>position in the discussions :-). The user wants something as
>as possible, easy to remember and easy to pass on to a
Users are, however, already passing URLs around, and
have been since 1990. But now we are talking about
criteria which were addressed, agreed, and defined in Boston.
I hope we all still agree on that one!
>So to get back to the question:
>The policy is:
>- get consensus in the URI WG (not per se in a physical meeting)
>- submit the paper to the ADs (Joyce, John and me)
>- The ADs will do a review (together with some people outside of
>- We will keep an eye on Internet Architecture aspects, overlap with
>protocols, user perspective.
>- After this iteration the paper will be put forward to the IESG.
>- Then a last call is issued, for anyone to comment on the paper and
>process that led to its current contents.
>- If no problems then it goes to the RFC-editor.
>If anyone does not agree with any part of the process, the complaint
>1 - WG-chair, if that does not resolve it:
>2- AD if that does not resolve...
>3- IESG if....
>4- IAB if.....
>5- ISOC ombudsman
What actually happens?
Brewster simply disconnected WAIS from the Z39.50 discussions
because they rambled on for too long. He did not follow
a complaint hierarchy. He just left.
The gopher people have never bothered to wait for the IETF
to do anything. They have made fun of WWW's waiting for
RFCs to be produced. They didn't follow the hierarchy
or the IETF. They have the IIIR WG with a mandate
to document what they have done.
I have tried earnestly to separate the WWW specs (URI,
HTTP, HTML) into independent individually useful specs
for the internet community, and had thought that the
IETF would be like-minded enough to join in with what I
see as a very important effort for the future. Maybe it
is simply that the discussion group is above a critical
mass and will never cool.
The WWW specs have all been thrashed out in open
debate amongst people who are actually writing code.
There is some overlap with the IETF.
The WWW specs benefit from the Internet -- do they
have to benefit from the IETF? The
IETF must address the issue of global information and the
underpinning standards, and if it ignores the
working code it will end up right up there with ISO.
One of the difficulties is that the Web in its conception
was an Integrated Information Architecture. By defining
URIs (and a few other things) it addressed (and basically
solved) half of the IIIR issues, already incorporating
gopher, FTP, WAIS, and whatever. Those who felt that it
was their mandate to "Integrate WWW and the others"
necessarily were going to have metaproblems.
IETF working group chairs have a difficult job. If they
really cannot produce consensus they should time out
the debate and issue (a) informational RFCs on the
proposals as they stand (eg URLs) and (b) pointers to the
IRTF to do some concrete research about the fluffy
problems for which no engineering exists (eg URNs).
Sometimes, in practice, producing a consensus requires
doing more than just vote counting and message counting.
It means being able to separate strands of argument,
condemn obfuscation, and declare irrelevance as such
when it appears. It requires recognition of attempts
to make arbitrary changes for NIH reasons.
It requires dedication to a timely concrete result.
Tim Berners-Lee email@example.com
World Wide Web development leader
CERN, CN Division Tel: +41(22)767 3755
1211 Geneva 23, Switzerland Fax: +41(22)767 7155