Could the next new gTLD round last 25 years? Or 70 years?
Will the next new gTLD round see 25,000 applications? If so, how long will it take for them all to go live?
The 25,000 figure is one that I’ve heard touted a few times, most recently during public sessions at ICANN’s meeting in Johannesburg last month.
The problem is that, judging by ICANN’s previous performance, such a huge number of applications would take anywhere from 25 to 70 years to process.
It’s unclear to me where the 25,000 application estimate comes from originally, but it does not strike me as laughably implausible.
There were 1,930 applications for 1,408 unique strings in the most recent round.
There could have been so many more.
ICANN’s outreach campaign is generally considered to have been a bit lackluster, particularly in developing markets, so many potential applicants were not aware of the opportunity.
In addition, some major portfolio applicants chose to rein in their ambitions.
Larry Page, then-CEO of Google, is known to have wanted to apply for many, many more than the 101 Google wound up applying for, but was talked down by staff.
There’s talk of pent-up demand for dot-brands among those companies that missed the 2012 window, but it’s impossible to know the scale of that demand with any precision.
Although a handful of dot-brands with ICANN registry agreements and delegations have since cancelled their contracts, there's no reason they could not reapply for defensive purposes in subsequent rounds.
There are also thousands of towns and cities with populations comparable to those of the cities that applied in 2012 that could apply next time around.
And there’s a possibility that the cost of applying — set at $185,000 on a highly redundant “cost recovery” basis — may come down in the next round.
Lots of other factors will play a role in how many applications we see, but in general it doesn’t seem impossible that there could be as many as 25,000.
Assuming for a moment that there are 25,000, how long will that take to process?
In the 2012 round, ICANN said it would delegate TLDs at a rate of no more than 1,000 per year. So that’s at least 25 years for a 25,000-app round.
That rate was set somewhat arbitrarily during discussions about root zone scaling, before anyone knew how many gTLDs would be applied for, when estimates hovered around the 500 mark.
Essentially, the 1,000-per-year number was floated as a sort of straw man (or “straw person” as some ICANNers have it nowadays) so the technical folk had a basis to figure out whether the root system could withstand such an influx.
Of course, this limit will have to be revised significantly if ICANN has any hope of processing 25,000 applications in under a generation.
Discussions at the time indicated that the rate of change, not the size of the root zone, was what represented the stability threat.
In reality, the rate of delegation has been significantly slower than 1,000 per year.
It took until May 2016 for the 1,000th new gTLD to go live, 945 days after the first batch were delegated in late October 2013.
That means that during the relative “rush-hour” of new gTLD delegations, there was still only a little over one per day on average.
And that’s counting from the date of the first delegation, which was actually 18 months after the application window was closed.
If that pattern held in subsequent rounds, we would be looking at about 70 years for a batch of 25,000 to make their way through the system.
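For the curious, the napkin math checks out. Here's a quick sketch in Python; the 945-day and 1,000-delegation figures are from above, the rest is simple division. Note that straight division gives roughly 65 years, and adding the 18-month lag before the first delegation (plus other slack) lands you in the "about 70 years" ballpark:

```python
# Rough check of the delegation-rate arithmetic: 1,000 new gTLDs
# went live in the 945 days after the first delegations in late
# October 2013.
days_per_tld = 945 / 1000          # ~0.95 days per delegation
apps = 25_000                      # hypothetical next-round volume
total_days = apps * days_per_tld   # 23,625 days
years = total_days / 365.25        # ~65 years by straight division

print(f"{1 / days_per_tld:.2f} delegations per day")
print(f"~{years:.0f} years to delegate {apps:,} gTLDs at that pace")
```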
You could apply for a vanity gTLD matching your family name and leave the delegation as a gift to your great-grandchildren, long after your death.
Clearly, with 25,000 applications some significant process efficiencies — including, I fancy, much more automation — would be in order.
Currently, IANA’s process for making changes to root zone records (including delegations) is somewhat complex and has multiple manual steps. And that’s before Verisign makes the actual change to the master root zone file.
But the act of delegation is only the final stage of processing a gTLD application.
First, applications that typically run into tens of thousands of words have to undergo Initial Evaluation by several teams of knowledgeable consultants.
It took a little over two years from Reveal Day in 2012 to the publication of the final IE in 2014, an average of about 2.5 applications per day.
Again, we’re looking at over a quarter of a century just to conduct IE on 25,000 applications.
Then there’s contracting — ICANN’s lawyers would have to sign off on about a dozen Registry Agreements per day if it wanted to process 25,000 delegations in just five years.
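The same kind of napkin math applies to the evaluation and contracting stages, using the figures above (2.5 applications evaluated per day in 2012-2014, and a hypothetical five-year contracting target):

```python
# Initial Evaluation pace: ~2.5 applications per day in the 2012 round.
ie_rate = 2.5                        # applications per day
ie_days = 25_000 / ie_rate           # 10,000 days
print(f"IE alone: ~{ie_days / 365.25:.0f} years")   # ~27 years

# Contracting: how many Registry Agreements per day would it take
# to clear 25,000 in five years?
ra_per_day = 25_000 / (5 * 365.25)
print(f"Contracting: ~{ra_per_day:.1f} RAs per day")  # ~13.7 per day
```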
Not to mention there’s also pre-delegation testing, contention resolution, auctions, change requests, objections…
There’s a limited window to file objections and there were many complaints, largely from governments, that this period was far too short to read through just 1,930 applications.
A 25,000-string round could take forever, and ICANN's policies and processes would have to be significantly revised to handle that many applications in a reasonable timeframe.
Then again, potential applicants might view the 2012 round as a bust and the next round could be hugely under-subscribed.
There’s no way of knowing for sure, unfortunately.
I heard of interest in 200 .BRAND new gTLD applications just for France (which, of course, does not mean that 200 .BRAND applicants will apply)… unless I am wrong.
Best not to get caught up in the semantics of what happened in the past. Necessity is the mother of invention.
In WW2 (over 70 years ago) the combined armies of the world were building 230,000 aircraft a year with 1930s technology.
Uniregistry stands ready to run 200,000 new gTLDs or more if application fees and demand allow.
How’s that demand working out?
And only $9,999.99/year!
My problem with you Kevin is you’re not blunt enough. Can you possibly be more direct and assertive please?
In the GNSO new gTLD policy work, it was envisaged that we would one day move to a first-come, first-served model, where an applicant can apply at any time, i.e. the same as applying for gTLD registrar accreditation. The reason for processing in rounds was to deal with contention, where multiple applicants want the same string, as we saw with .web in the last round. Maybe it is time to move to first-come, first-served under the current policy, and update the rules as new policy is developed, taking a more agile approach.