Name Collisions: Unanticipated Effects [Guest Post]
I attended the TLD Security Forum sponsored by Artemis in San Francisco five weeks ago. By happenstance, I became involved in a small group, formed after the meeting, that dedicated itself to replicating the Interisle study (“Name Collisions in the DNS”) and carrying out the next step in the analysis.
The work among competitors over the next four weeks was collaborative, intensive, and competent: an excellent example of how the multi-stakeholder model can accomplish significant work and publish it to the broad internet community in an effort to resolve an issue. It brought the right people together to accomplish more, and faster, than any other governance model could.
Their work is easily identifiable among the many comments submitted on the name collision issue. Without offering an opinion on its conclusions here, I note that the competence of the work shines through and should be carefully considered.
The Interisle study sounded an alarm because it reported a potentially high number of domain name “collisions” that might result from the delegation of new gTLDs. The term “collision” is something of a misnomer; the key issue, I think, is the use of search-list processing by companies in configuring their networks, which lets queries for internal, unqualified names leak to the public DNS root.
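To make the mechanism concrete, here is a minimal sketch of resolv.conf-style search-list expansion. The suffixes and names are hypothetical, and real resolvers vary in the details (ndots handling, ordering, negative caching), but it shows how a lookup for an unqualified internal name can generate queries that reach the public root.

```python
# Minimal sketch of resolver search-list expansion (resolv.conf style).
# The suffixes and names below are hypothetical; real resolver behavior
# varies (ndots handling, ordering, negative caching, and so on).

def expand_with_search_list(name, search_list, ndots=1):
    """Yield the fully qualified candidate names a stub resolver
    would typically try, in order, for a given lookup."""
    if name.endswith("."):          # already fully qualified: no expansion
        yield name
        return
    candidates = []
    if name.count(".") >= ndots:    # "enough" dots: try the name as-is first
        candidates.append(name + ".")
    for suffix in search_list:      # then append each search suffix in turn
        candidates.append(f"{name}.{suffix}.")
    if name.count(".") < ndots:     # few dots: try the bare name last
        candidates.append(name + ".")
    yield from candidates

# A host on a network that uses an internal ".corp" suffix:
for fqdn in expand_with_search_list("mail", ["corp", "corp.example.com"]):
    print(fqdn)
# mail.corp.               <- this query can leak to the public root
# mail.corp.example.com.
# mail.
</current_section_numbered>
```

Until a matching TLD exists, those leaked queries simply come back NXDOMAIN from the root; once a matching string is delegated, the same query could resolve to an address the network operator never intended, which is the “collision” the study is concerned with.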
The Interisle report published the volumes of NXDOMAIN responses by TLD and described possible harms, but it did not link those harms to specific types of queries or delve into the data to draw firm conclusions or propose mitigations.
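To give a sense of what such a tally involves, here is a rough sketch of counting root queries by applied-for string. The file name, log format, and string list are placeholders of my own; the real DITL captures are packet traces that need to be decoded and filtered before any counting can start.

```python
# Rough sketch of the kind of tally the Interisle report describes:
# counting root queries whose rightmost label matches an applied-for
# string. The log format and file name here are hypothetical; the DITL
# captures are pcap data that would need to be decoded first.

from collections import Counter

applied_for = {"corp", "home", "mail"}    # illustrative subset of applied-for strings

counts = Counter()
with open("root_queries.txt") as f:        # hypothetical: one queried name per line
    for line in f:
        name = line.strip().rstrip(".").lower()
        if not name:
            continue
        tld = name.rsplit(".", 1)[-1]      # rightmost label of the queried name
        if tld in applied_for:
            counts[tld] += 1

for tld, n in counts.most_common():
    print(f"{tld}\t{n}")
```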
There is nothing wrong with this; the report was competently executed given the time available.
This is where several interested parties, mostly applicants, jumped in. In an impromptu meeting after the conference, a half-dozen companies coordinated: the purchase of servers to analyze previously collected root-server query data (the “Day In The Life,” or DITL, data); the acquisition of memberships in OARC, to which the servers were donated; and the analysis of vast amounts of data.
Considerable time was spent redesigning queries in order to replicate the Interisle results from the DITL data so that the next step in the analysis would be seamless as the work transitioned from Interisle to this collaborative group.
Hypotheses were developed, queries written, data summarized and statistically tested. Every difference between the Interisle data and the newly analyzed data was discussed until the team was satisfied it would withstand public scrutiny.
The team met twice weekly in conference calls and traded numerous emails to flesh out technical details. Data scientists learned about the DNS, and DNS experts learned about z-tests and the effects of non-standard distributions.
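The group's actual statistical work is in its published materials rather than here, but as an illustration of the kind of test discussed, a two-proportion z-test comparing a string's share of query traffic across two measurement periods might look like the sketch below. The counts are invented, and the normal approximation it relies on is exactly the sort of assumption the team had to revisit when distributions turned out not to be standard.

```python
# Illustrative two-proportion z-test: did a string's share of query
# traffic differ between two collection periods? The counts below are
# invented for the example; the team's actual tests and data are in
# its published materials.

from math import sqrt, erfc

def two_proportion_z_test(hits1, total1, hits2, total2):
    """Return (z, two_sided_p) for the difference between two proportions."""
    p1, p2 = hits1 / total1, hits2 / total2
    pooled = (hits1 + hits2) / (total1 + total2)
    se = sqrt(pooled * (1 - pooled) * (1 / total1 + 1 / total2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))   # two-sided p under the normal approximation
    return z, p_value

z, p = two_proportion_z_test(hits1=1_200, total1=5_000_000,
                             hits2=1_450, total2=5_600_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```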
The team agreed to publish the data, which it has done, so that anyone can perform an analysis similar to its own.
For me, these technical discussions brought to mind how the handling of this issue has reaffirmed the effectiveness of the ICANN model. Work continues and will be discussed at the next TLD Security Conference on October 1st in Washington, DC.
This is a guest post written by Kurt Pritz, ICANN’s former chief strategy officer. He is currently an independent consultant working with new gTLD applicants and others.
If you find this post or this blog useful or interesting, please support Domain Incite, the independent source of news, analysis and opinion for the domain name industry and ICANN community.