New gTLD winners could be named June 2013

Kevin Murphy, July 30, 2012, 07:23:56 (UTC), Domain Policy

ICANN has sketched out a tentative timetable for the evaluation of its new generic top-level domain applications that would see the first successful gTLDs appear over a year from now.

But the plan has little meat on its bones, and ICANN has admitted that it still doesn’t know exactly how the evaluation process is going to pan out.

In a new call for comments, ICANN confirmed that all 1,930 applications are going to be evaluated at the same time, and that the evaluators have already started work.

The winners and losers from Initial Evaluation, ICANN said, could be announced in June or July 2013.

This would mean that the first new gTLDs would start going live on the internet “in late third quarter of 2013, six months later than originally expected”, ICANN said.

But which successful applications would start hitting the root first is still wide open to debate.

The idea that the applications would be processed in batches of 500 or thereabouts is now pretty much dead. That’s been obvious since digital archery was killed off, but it’s now confirmed.

ICANN said it has a “tentative project plan” that “foresees the processing of applications in a single batch, and simultaneous release of results” about a year from now.

But with “batching” dead, we now have a “metering” problem.

Hypothetically, as many as 1,409 unique gTLD applications could emerge successfully from evaluation at the same time, in June or July next year.

That’s the theoretical ceiling; in reality the number will be substantially reduced by withdrawals, objections and contention.

But before any of them can go live the applicants need to negotiate and/or sign registry agreements with ICANN and undergo formal pre-delegation technical testing. That creates two bottlenecks at ICANN in its legal and IANA departments.

ICANN now wants to know how to “meter” successfully evaluated applications, to smooth out the roll-out so that no more than 1,000 new gTLDs are delegated in any given year.
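To make the scale of that constraint concrete, here is a toy pacing calculation (my own sketch, not anything ICANN has proposed): if every one of the 1,409 unique strings cleared evaluation, an even monthly release under a 1,000-per-year cap would stretch delegation over well more than a year.

```python
def metering_schedule(total_apps, cap_per_year=1000, months_per_year=12):
    """Sketch of an evenly paced delegation schedule.

    Spreads `total_apps` delegations at a flat monthly rate chosen so
    that no rolling 12-month window exceeds `cap_per_year`.
    Returns a list of delegations per month.
    """
    # Flat monthly pace; rounding down keeps every 12-month window
    # at or under the cap (e.g. 1000 // 12 = 83, and 83 * 12 = 996).
    per_month = cap_per_year // months_per_year
    schedule = []
    remaining = total_apps
    while remaining > 0:
        batch = min(remaining, per_month)
        schedule.append(batch)
        remaining -= batch
    return schedule

# Hypothetical worst case: all 1,409 unique strings pass evaluation.
sched = metering_schedule(1409)
print(len(sched))   # number of months needed to delegate them all
```

Under these assumptions the full set would take 17 months to delegate, which is why the order in which applications exit the pipeline matters so much to applicants.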

An idea that emerged in Prague was to order applications according to how “clean” they were, as measured by how many clarifying questions the evaluators had to ask the applicants. But that idea has now been dismissed as “unworkable”, ICANN said.

ICANN’s board of directors had promised to make a decision about three weeks after the Prague meeting – a deadline that passed over a week ago – but it’s now turning to the community for ideas.

Before August 19, it wants to know:

1. Should the metering or smoothing consider releasing evaluation results, and transitioning applications into the contract execution and pre-delegation testing phases, at different times?

a. How can applications be allocated to particular release times in a fair and equitable way?

b. Would this approach provide sufficient smoothing of the delegation rate?

c. Provide reasoning for selecting this approach.

2. Should the metering or smoothing be accomplished by downstream metering of application processing (i.e., in the contract execution, pre-delegation testing or delegation phases)?

a. How can applications be allocated to a particular timing in contract execution, pre-delegation testing, or delegation in a fair and equitable way?

b. Provide reasoning for selecting this approach.

3. Include a statement describing the level of importance that the order of evaluation and delegation has for your application.

My hunch based on conversations in Prague is that the majority answer to question 1 will be “No” and that the majority answer to question 2 will be “Yes”, but that’s just a hunch at this point.


Comments (1)

  1. Zack says:

    I’m wondering what came first – the chicken or the egg (the 1,000 per year limit or the ICANN paper on delegation rates)? ICANN published a paper in October 2010 titled “Delegation Rate Scenarios for New gTLDs”. On page 6 of that paper ICANN says that it could not delegate more than 965 strings in a year. Is this where the nice and rounded figure of 1,000 comes from? If so then that analysis should be revisited. As you point out, ICANN is saying it can process the 1,930 applications in about 12 months, which seems to be higher than the assumptions contained in that paper. Me thinks that with real numbers, and far fewer assumptions, a better discussion can be had about what can be delegated within one year. Or did the 1,000 figure come from somewhere else? If so I’d like to see that analysis!

    I also think that if ICANN is looking for input from the community then they ought to consider a Question & Answer forum. Without more facts about what ICANN can or cannot do, any processing models we create will come with a lot of assumptions. If we can eliminate as many assumptions as possible then maybe ICANN will get better models for their consideration.
