Name collisions comments call for more gTLD delay

Kevin Murphy, August 29, 2013, Domain Registries

The first tranche of responses to Interisle Consulting’s study into the security risks of new gTLDs, and ICANN’s proposal to delay a few hundred strings pending more study, is in.
Comments filed with ICANN before the public comment deadline yesterday fall basically into two camps:

  • Non-applicants (mostly) urging ICANN to proceed with extreme caution. Many are asking for more time to study their own networks so they can get a better handle on their own risk profiles.
  • Applicants shooting holes in Interisle’s study and ICANN’s remediation plan. They want ICANN to reclassify everything except .home and .corp as low risk, removing delays to delegation and go-live.

They were responding to ICANN’s decision to delay 521 “uncalculated risk” new gTLD applications by three to six months while further research into the risk of name collisions — where a new gTLD could conflict with a TLD already used by internet users in a non-standard way — is carried out.
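For the avoidance of doubt, here’s what a collision amounts to in practice. The sketch below is my own illustration in Python (using the dnspython library), with made-up names; it is not anybody’s actual test methodology.

```python
# Hypothetical names; requires dnspython (pip install dnspython).
import dns.resolver

def resolves_publicly(name: str) -> bool:
    """Return True if the public DNS answers for a name that an
    internal network expects to be private."""
    try:
        dns.resolver.resolve(name, "A")
        return True   # collision: the name resolves on the public internet
    except dns.resolver.NXDOMAIN:
        return False  # the root still says the TLD does not exist
    except dns.resolver.NoAnswer:
        return False  # the name exists but has no A record

# Today, a query for a made-up name under .corp gets NXDOMAIN from the
# root. If .corp were ever delegated, the same query could quietly start
# resolving to a stranger's server instead of the company fileserver.
print(resolves_publicly("fileserver.corp."))
```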
Proceed with caution
Many commenters stated that more time is needed to analyse the risks posed by name collisions, noting that Interisle studied primarily the volume of queries for non-existent domains, rather than looking deeply into the consequences of delegating colliding gTLDs.
That was a point raised by applicants too, but while applicants conclude that this lack of data should lead ICANN to lift the current delays, others believe that it means more delays are needed.
Two ICANN constituencies seem to generally agree with the findings of the Interisle report.
The Internet Service Providers and Connectivity Providers constituency asked for the public comment period to be put on hold until further research is carried out, or for at least 60 days. It noted:

corporations, ISPs and connectivity providers may bear the brunt of the security and customer-experience issues resulting from adverse (as yet un-analyzed) impacts from name collision

these issues, due to their security and customer-experience aspects, fall outside the remit of people who normally participate in the ICANN process, requiring extensive wide-ranging briefings even in corporations that do participate actively in the ICANN process

The At-Large Advisory Committee concurred that the Interisle study does not currently provide enough information to fully gauge the risk of name collisions causing harm.
ALAC said it was “in general concurrence with the proposed risk mitigation actions for the three defined risk categories” anyway, adding:

ICANN must assure that such residual risk is not transferred to third parties such as current registry operators, new gTLD applicants, registrants, consumers and individual end users. In particular, the direct and indirect costs associated with proposed mitigation actions should not have to be borne by registrants, consumers and individual end users. The Board must err on the side of caution

Several individual stakeholders agreed with the ISPCP that they need more time to look at their own networks. The Association of National Advertisers said:

Our member companies are working diligently to determine if DNS Clash issues are present within their respective networks. However the ANA had to communicate these issues to hundreds of companies, after which these companies must generate new data to determine the potential service failures on their respective networks.

The ANA wants the public comment period extended until November 22 to give its members more time to gather data.
While the ANA can always be relied upon to ask for new gTLDs to be delayed, its request was echoed by others.
General Electric called for three types of additional research:

  • Additional studies of traffic beyond the initial DITL sample.
  • Information and analysis of “use cases” — particular types of queries and traffic — and the consequences of the failure of particular use cases to resolve as intended (particular use cases could have severe consequences even if they might occur infrequently — like hurricanes), and
  • Studies of the time and costs of mitigation.

GE said more time is needed for companies such as itself to conduct impact analyses on their own internal networks and asked ICANN not to delegate any gTLD until the risk is “fully understood”.
Verizon, Heinz and the American Insurance Association asked for comment deadline extensions for the same reasons.
The Association for Competitive Technology (which has Verisign as a member) said:

ICANN should slow or temporarily suspend the process of delegating TLDs at risk of causing problems due to their frequency of appearance in queries to the root. While we appreciate the designation of .home and .corp as high risk, there are many other TLDs which will also have a significant destructive effect.

Numerically, there were far more comments criticizing ICANN’s mitigation proposal, though all were filed by new gTLD applicants, whose interests are aligned.
Most of these comments, which are far more focused on the details and the data, target perceived deficiencies in Interisle’s report and ICANN’s response to it.
Several very good arguments are made.
The Svalbard problem
First, there is criticism of the cut-off point between “low risk” and “uncalculated risk” strings, which some applicants say is “arbitrary”.
That’s mostly true.
ICANN basically took the list of applied-for strings, ordered by the frequency Interisle found they generate NXDOMAIN responses at the root, and drew a line across it at the 49,842 queries mark.
That’s because 49,842 queries is what .sj, the least-frequently-queried real TLD, received over the same period.
If your string, despite not yet existing as a gTLD, already gets more traffic than .sj, it’s classed as “uncalculated risk” and faces more delays, according to ICANN’s plan.
As Directi said in its comments:

The result of this arbitrary selection is that .bio (Rank 281) with 50,000 queries (rounded to the nearest thousand) is part of the “uncategorized risk” list, and is delayed by 3 to 6 months, whereas .engineering (Rank 282) with 49,000 queries (rounded to the nearest thousand) is part of the “low risk” list, and can proceed without any significant delays.

What neither ICANN nor Interisle explained is why this is an appropriate place to draw a line in the sand.
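In code, the rule amounts to little more than this. A minimal sketch; the 49,842-query threshold is the only number taken from ICANN’s plan, and the two example counts are Directi’s rounded figures.

```python
SJ_QUERY_COUNT = 49_842  # NXDOMAIN queries .sj received in the sample

def classify(string: str, nxdomain_queries: int) -> str:
    """ICANN's risk buckets, as described above."""
    if string in (".home", ".corp"):
        return "high risk"           # delegation deferred indefinitely
    if nxdomain_queries > SJ_QUERY_COUNT:
        return "uncalculated risk"   # delayed three to six months
    return "low risk"                # free to proceed

print(classify(".bio", 50_000))          # uncalculated risk
print(classify(".engineering", 49_000))  # low risk
```

A difference of a thousand-odd queries is enough to flip a string from one bucket to the other.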
A graphic from DotCLUB Domains illustrates the scale of the problem nicely.
.sj is the ccTLD for Svalbard, a Norwegian territory in the Arctic Circle with fewer than 3,000 inhabitants. The TLD is administered by .no registry Norid, but it’s not possible to register domains there.
Does having more traffic than .sj mean a gTLD is automatically more risky? Does having less mean a gTLD is safe? The ICANN proposal assumes “yes” to both questions, but it doesn’t explain why.
Many applicants say that having more traffic than existing gTLDs does not automatically mean your gTLD poses a risk.
They pointed to Verisign data from 2006, which shows that gTLDs such as .xxx and .asia were already receiving large amounts of traffic prior to their delegation. When they were delegated, the sky did not fall. Indeed, there were no reports of significant security and stability problems.
The New gTLD Applicants Group said:

In fact, the least “dangerous” current gTLD on the chart, .sx, had 331 queries per million in 2006. This is a higher density of NXDOMAIN queries than all but five proposed new TLDs. Again, .sx was launched successfully in 2012 with none of the problems predicted in these reports.
These successful delegations alone demonstrate that there is no need to delay any more than the two most risky strings.

Donuts said:

There is no factual basis in the study recommending halting delegation process of 20% of applied-for strings. As the paper itself says, “The Study did not find enough information to properly classify these strings given the short timeline.” Without evidence of actual harm, the TLDs should proceed to delegation. Such was the case with other TLDs such as .XXX and .ASIA, which were delegated without delay and with no problems post-delegation.

Applicants also believe that the release in June 2012 of the list of all 1,930 applied-for strings may have skewed the data set that Interisle used in its study.
Uniregistry, for example, said:

The sole fact that queries are being received at the root level does not itself present a security risk, especially after the release to the public of the list of applied-for strings.

The argument seems to be that a lot of the NXDOMAIN traffic seen in 2013 is due to people and software querying applied-for TLDs to see if they’re live yet.
It’s quite a speculative argument, but it’s somewhat supported by the fact that many applied-for strings received more queries in 2013 than they did in the equivalent 2012 sampling.
Second-level domains
Some applicants pointed out that there may not be a correlation between the volume of traffic a string receives and the number of second-level domains being queried.
A string might get a bazillion queries for a single second-level domain name. If that domain name is reserved by the registry, the risk of a name collision might be completely eliminated.
The Interisle report did show that the number of SLDs and the volume of traffic do not correlate.
For example, .hsbc is ranked 14th in terms of traffic volume but saw requests for just 2,000 domains, whereas .inc, which ranked 15th, saw requests for 73,000 domains.
Unfortunately, the Interisle report only published the SLD numbers for the top 35 strings by query volume, leaving most applicants none the wiser about the possible impact of their own strings.
And ICANN did not factor the number of SLDs into its decision about where to draw the line between “low” and “uncalculated” risk.
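Counting SLDs alongside query volume would not have been hard, either. Here’s a rough sketch of the tally, assuming access to the same kind of NXDOMAIN query log Interisle worked from; this is my illustration, not Interisle’s methodology.

```python
from collections import defaultdict

def sld_profile(queried_names):
    """Tally total queries and distinct second-level labels per TLD.

    queried_names: iterable of fully-qualified names seen in NXDOMAIN
    responses at the root, e.g. "mail.hsbc.".
    """
    queries = defaultdict(int)
    slds = defaultdict(set)
    for name in queried_names:
        labels = name.rstrip(".").split(".")
        if len(labels) < 2:
            continue  # bare TLD query, no second-level label
        tld, sld = labels[-1], labels[-2]
        queries[tld] += 1
        slds[tld].add(sld)
    return {tld: (queries[tld], len(slds[tld])) for tld in queries}

# A TLD whose millions of queries come from 2,000 SLDs (the .hsbc case)
# is a very different mitigation problem from one with 73,000 SLDs.
```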
Conspiracy theories
Some applicants questioned whether the Interisle data itself was reliable, but I find these arguments poorly supported and largely speculative.
They propose that someone (meaning presumably Verisign, which stands to lose market share when new gTLDs go live, and which kicked off the name collisions debate in the first place) could have gamed the study by generating spurious requests for applied-for gTLDs during the period Interisle’s data was being captured.
Some applicants put forth this view, while others limited their comments to a request that future studies rely only on data collected before now, to avoid tampering at the point of collection in future.
NTAG said:

Query counts are very easily gamed by any Internet connected system, allowing for malicious actors to create the appearance of risk for any string that they may object to in the future. It would be very easy to create the impression of a widespread string collision problem with a home Internet connection and the abuse of the thousands of available open resolvers.

While this kind of mischief is a hypothetical possibility, nobody has supplied any evidence that Interisle’s data was manipulated by anyone.
Some people have privately pointed DI to the fact that Verisign made a substantial donation to the DNS-OARC — the group that collected the data that Interisle used in its study — in July.
The implication is that Verisign was somehow able to manipulate the data after it was captured by DNS-OARC.
I don’t buy this either. We’re talking about a highly complex 8TB data set that took Interisle’s computers a week to process on each pass. The data, under the OARC’s deal with the root server operators, is not allowed to leave its premises. It would not be easily manipulated.
Additionally, DNS-OARC is managed by Internet Systems Consortium — which runs the F-root and is Uniregistry’s back-end registry provider — from its own premises in California.
In short, in the absence of any evidence supporting this conspiracy theory, I find the idea that the Interisle data was hacked after it was collected highly improbable.
What next?
I’ve presented only a summary of some key points here. The full list of comments can be found here. The reply period for comments closes September 17.
Several ICANN constituencies that can usually be relied upon to comment on everything (registrars, intellectual property, business and non-commercial) have not yet commented.
Will ICANN extend the deadline? I suppose it depends on how cautious it wants to be, whether it believes the companies requesting the extension really are conducting their own internal collision studies, and how useful it thinks those studies will be.