Delays still dog many new gTLD applicants

Kevin Murphy, March 3, 2014, Domain Policy

With dozens of new gTLDs currently live and on sale, it’s easy to forget that many applicants are still in ICANN limbo due to several still-unresolved issues with the evaluation process.

The New gTLD Applicant Group wrote to ICANN on Friday to express many of these concerns.

First, NTAG is upset that resolution of the name collisions issue is not moving as fast as hoped.

JAS Advisors published its report into collisions, which recommends “controlled interruption” as a solution, last Thursday. But the report is currently open for public comment until April 21.

That would push approval of the plan by ICANN’s board beyond the Singapore meeting taking place at the end of March, at least a month later than originally expected.

NTAG secretary Andrew Merriam argues that the 42-day comment period should be reduced to 21 days, with ICANN and JAS conducting webinars this week to discuss the proposal with applicants.

Second, NTAG is upset that ICANN has pushed out the start date for the first contention set auctions from March to June. It’s asking ICANN to promise that there will be no further delays.

Third, NTAG says that many dot-brands are unable to enter into contracting talks with ICANN until Specification 13 of the Registry Agreement, which contains opt-outs for single-registrant zones, is finalized.

That’s not currently expected to happen until Singapore, apparently because there were no scheduled meetings of the ICANN board’s New gTLD Program Committee until then.

NTAG also complains about the length of time it’s taking to decide the first Community Priority Evaluations, which is apparently due to quality assurance measures (very wise given the controversy caused by the lack of oversight on new gTLD objections, if you ask me).

The NGPC has a newly scheduled meeting this Wednesday, with new gTLDs on the agenda, but it’s not yet clear whether any of NTAG’s issues are going to be addressed.

NTAG rubbishes new gTLD collision risk report

Kevin Murphy, August 15, 2013, Domain Policy

The New gTLD Applicant Group has slated Interisle Consulting’s report into the risk of new gTLDs causing security problems on the internet, saying the problem is “overstated”.

The group, which represents applicants for hundreds of gTLDs and has a non-voting role in ICANN’s GNSO, called on ICANN to reclassify hundreds of “Uncalculated” risk strings as “Low” risk, meaning they would face a less substantial delay before, and less uncertainty about, their eventual delegation.

But NTAG said it “agreed” that the high-risk .corp and .home “should be delayed while further studies are conducted”. The current ICANN proposal is actually to reject both of these strings.

NTAG was responding to ICANN’s proposal earlier this month to delay 523 applications (for 279 strings) by three to six months while further studies are carried out.

The proposal was based on Interisle’s study of DNS root server logs, which showed many millions of daily queries for gTLDs that currently do not exist but have been applied for.

The worry is that delegating those strings would cause problems such as downtime or data leakage, where sensitive information intended for a recipient on the same local network would be sent instead to a new gTLD registry or one of its (possibly malicious) registrants.
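The mechanism behind that leakage worry is easy to sketch. The following is a minimal illustration, with invented names and suffixes, of how a stub resolver’s DNS search list turns a short internal hostname into a query that, once the matching gTLD is delegated, suddenly resolves on the public internet:

```python
# Hypothetical sketch of search-suffix expansion, the mechanism behind
# the "data leakage" worry. Hostnames and suffixes here are invented.

def expand(hostname: str, search_suffix: str) -> str:
    """Mimic a stub resolver: unqualified names get the search suffix."""
    if "." in hostname:  # roughly: already fully qualified, leave alone
        return hostname
    return f"{hostname}.{search_suffix}"

# On an internal network using "corp" as its search suffix, a lookup
# for the short name "mail" becomes a query for "mail.corp".
query = expand("mail", "corp")
print(query)  # mail.corp

# Before .corp is delegated, "mail.corp" draws NXDOMAIN from the root.
# After delegation, the same query would reach the new .corp registry --
# along with whatever traffic (mail, intranet requests) follows it.
```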

NTAG reckons the risk presented by Interisle has been overblown, and it presented a point-by-point analysis of its own. It called for everything except .corp and .home to be categorized “Low” risk, saying:

We recognize that a small number of applied for names may possibly pose a risk to current operations, but we believe very strongly that there is no quantitative basis for holding back strings that pose less measurable threat than almost all existing TLDs today. This is why we urge the board to proceed with the applications classified as “Unknown Risk” using the mitigations recommended by staff for “Low Risk” strings. We believe the 80% of strings classified as “Low Risk” should proceed immediately with no additional mitigations.

The group pointed to a recent analysis by Verisign (which, contrarily, was trying to show that new gTLDs should be delayed) which included data about previous new gTLD delegations.

That report (pdf) said that .xxx was seeing 4,018 look-ups per million queries at the DNS root (PPM) before it was delegated. The number for .asia was 2,708.

If you exclude .corp and .home, both of those PPM figures are several times larger than the equivalent query-volume measures for every applied-for gTLD today, also according to Verisign’s data.
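For reference, the queries-per-million figure is just a string’s share of root-server traffic scaled to a million queries. A back-of-envelope sketch (the total query volume below is illustrative, not from the Verisign report):

```python
def queries_per_million(tld_queries: int, total_root_queries: int) -> float:
    """A TLD's share of root-server query volume, per million queries."""
    return tld_queries / total_root_queries * 1_000_000

# Illustrative only: if a root-server sample window saw 10 billion
# queries, .xxx's pre-delegation rate of 4,018 PPM would correspond
# to roughly 40 million of them.
ppm = queries_per_million(40_180_000, 10_000_000_000)
print(ppm)  # 4018.0
```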

NTAG said:

None of these strings pose any more risk than .xxx, .asia and other currently operating TLDs.

the least “dangerous” current gTLD on the chart, .sx, had 331 queries per million in 2006. This is a higher density of NXDOMAIN queries than all but five proposed new TLDs. Again, .sx was launched successfully in 2012 with none of the problems predicted in these reports.

Verisign’s report, which sought to provide a more qualitative risk analysis based on some data-supported guesses about where the error traffic is coming from and why, anticipated this interpretation.

Verisign said:

This could indicate that there is nothing to worry about when adding new TLDs, because there was no global failure of DNS when this was done before. Alternately, one might conclude that traffic volumes are not the only indicator of risk, and the semantic meaning of strings might also play a role. We posit that in some cases, those strings with semantic meanings, and which are in common use (such as in speech, writing, etc.) pose a greater risk for naming collision.

The company spent most of its report making somewhat tenuous correlations between its data (such as a relatively large number of requests for .medical from Japanese IP addresses) and speculative impacts (such as “undiagnosed system failures” at “a healthcare provider in Japan”).

NTAG, by contrast, is playing down the potential for negative outcomes, saying that in many cases the risks introduced by new gTLDs are no different from collision risks at the second level in existing TLDs.

Just as the NTAG would not ask ICANN to halt .com registrations while a twelve month study is performed on these problems, we believe there is no reason to introduce a delay in diversifying the Internet’s namespace due to these concerns.

While it stopped short of alleging shenanigans this time around, NTAG also suggested that future studies of root server error traffic could be gamed if botnets were engaged to crapflood the roots.

Its own mitigation plan, which addresses Interisle’s specific concerns, says that most of the reasons that non-existent TLDs are being looked up are either not a problem or can be easily mitigated.

For example, it says that queries for .youtube that arrived in the form of a request for “www.youtube” are probably browser typos and that there’s no risk for users if they’re taken to the YouTube dot-brand instead of youtube.com.

In another example, it points out that requests for “.cisco” or “.toshiba” without any second-level domains won’t resolve anyway, if dotless domains are banned in those TLDs. (NTAG, which has influential members in favor of dotless domains, stopped short of asking for a blanket ban.)

The Interisle report, and ICANN’s proposal to deal with it, are open for public comment until September 17. NTAG’s response is remarkably quick off the mark, for guessable reasons.

“Risky” gTLDs could be sacrificed to avoid delay

Kevin Murphy, July 20, 2013, Domain Tech

Google and other members of the New gTLD Applicant Group are happy to let ICANN put their applications on hold in response to security concerns raised by Verisign.

During the ICANN 47 Public Forum in Durban on Thursday, NTAG’s Alex Stamos — CTO of .secure applicant Artemis — said that agreement had been reached that about half a dozen applications could be delayed:

NTAG has consensus that we are willing to allow these small numbers of TLDs that have a significant real risk to be delayed until technical implementations can be put in place. There’s going to be no objection from the NTAG on that.

While he didn’t name the strings, he was referring to gTLDs such as .home and .corp, which were highlighted earlier in the week as having large amounts of error traffic at the DNS root.

There’s a worry, originally expressed by Verisign in April and echoed by independent consultant Interisle this week, that collisions between new gTLDs and widely-used internal network names will lead to data leakage and other security problems.

Google’s Jordyn Buchanan also took the mic at the Public Forum to say that Google will gladly put its uncontested application for .ads — which Interisle says gets over 5 million root queries a day — on hold until any security problems are mitigated.

Two members of the board described Stamos’ proposal as “reasonable”.

Both Stamos and ICANN CEO Fadi Chehade indirectly criticised Verisign for the PR campaign it has recently built around its new gTLD security concerns, which has led to somewhat one-sided articles in the tech press and mainstream media such as the Washington Post.

Stamos said:

What we do object to is the use of the risk posed by a small, tiny, tiny fraction — my personal guess would be six, seven, eight possible name spaces that have any real impact — to then tar the entire project with a big brush. For contracted parties to go out to the Washington Post and plant stories about the 911 system not working because new TLDs are turned on is completely irresponsible and is clearly not about fixing the internet but is about undermining the internet and undermining new gTLDs.

Later, in response to comments on the same topic from the Association of National Advertisers, which suggested that emergency services could fail if new gTLDs go live, Chehade said:

Creating an unnecessary alarm is equally irresponsible… as publicly responsible members of one community, let’s measure how much alarm we raise. And in the trademark case, with all due respect it ended up, frankly, not looking good for anyone at the end.

That’s a reference to the ANA’s original campaign against new gTLDs, which wound up producing not much more than a lot of column inches about an utterly pointless Congressional hearing in late 2011.

Chehade and the ANA representative this time agreed publicly to work together on better terms.

New gTLD hopefuls set aggressive targets for ICANN

Kevin Murphy, August 22, 2012, Domain Registries

ICANN should start delegating new gTLDs in the first quarter of next year as previously planned and the Governmental Advisory Committee should work faster.

That’s according to many new gTLD applicants dropping their ideas into ICANN’s apparently semi-official comment box on application “metering” over the last week or so.

ICANN wanted to know how it should queue up applications for eventual delegation, in the wake of the death of batching and digital archery.

According to information released over the past couple of weeks, it currently plans to release the results of Initial Evaluation on all 1,924 still-active applications around June or July next year, leading to the first new gTLDs going live in perhaps August.

But that’s not good enough for many applicants. Having successfully killed off batching, their goal now is to compress the single remaining batch into as short a span as possible.

The New gTLD Applicant Group, a new observer group recognized by ICANN’s Registry Stakeholder Group, submitted lengthy comments.

NTAG wants Initial Evaluation on all applications done by January 2013, and for ICANN to publish the results as they trickle in rather than in one batch at the end.

The suggested deadline is based on ICANN’s recent statement that its evaluators’ processing rate could eventually ramp up to 300 applications per month. NTAG said in its comments:

Notwithstanding ICANN’s statements to the contrary, there is not a consensus within the group that initial evaluation results should be held back until all evaluations are complete; in fact, many applicants believe that initial evaluation results should be released as they become available.

That view is not universally supported. Brand-centric consultancy Fairwinds and a couple of its clients submitted comments expressing support for the publication of all Initial Evaluation results at once.

January 2013 is an extremely aggressive deadline.

Under the batching-based schedule laid out in the Applicant Guidebook, 1,924 applications would take more like 20 months, not seven, to pass through Initial Evaluation.

NTAG could not find consensus among its members on how to sequence applications. Separate submissions from big portfolio applicants including Donuts, Uniregistry, TLDH and Google and smaller, single-bid applicants gave some ideas, however.

Donuts, for example, hasn’t given up on a game-based solution to the sequencing problem – including, really, Rock Paper Scissors – though it seems to favor a system based on timestamping.

The company is among a few to suggest that applications could be prioritized using the least-significant digits of the timestamp they received when they were submitted to ICANN.

An application filed at 15:01:01 would therefore beat an application submitted at 14:02:02, for example, because its least-significant digits — the seconds — sort lower.

This idea has been out there for a while, though little discussed. I have to wonder if any applicants timed their submissions accordingly, just in case.
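The least-significant-digit idea can be sketched in a few lines. This is my own reconstruction of the proposal, not the actual method anyone filed: sort applications by their submission timestamp with the digits reversed, so the seconds dominate the ordering and the hour contributes least.

```python
# Hedged sketch of least-significant-digit timestamp ordering.
# A reconstruction of the idea, not ICANN's or any applicant's code.

def sort_key(timestamp: str) -> str:
    """Reverse the timestamp's digits so the least significant
    digits (the seconds) dominate the sort order."""
    digits = [c for c in timestamp if c.isdigit()]
    return "".join(reversed(digits))

apps = ["14:02:02", "15:01:01"]
ordered = sorted(apps, key=sort_key)
print(ordered)  # ['15:01:01', '14:02:02']
```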

Comments submitted by TLDH, Google and others offer a selection of methods for sequencing bids which includes timestamping as well as alphabetical sorting based on the hash value of the applications.

This proposal also supports a “bucketing” approach that would give more or less equal weight to five different types of application – brand, geographic, portfolio, etc.
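One way to read the hash-based proposal — a sketch under my own assumptions, since the comments don’t specify a hash function — is to hash each application’s contents and sort the hex digests alphabetically, producing an ordering no applicant could have predicted when filing:

```python
import hashlib

# Sketch of hash-based sequencing. Assumptions mine: SHA-256 over the
# application text, hex digests sorted alphabetically.

def sequence(applications: dict[str, str]) -> list[str]:
    """Order applications by the hex digest of their contents."""
    digest = lambda text: hashlib.sha256(text.encode()).hexdigest()
    return sorted(applications, key=lambda name: digest(applications[name]))

apps = {".example": "application text A", ".invalid": "application text B"}
print(sequence(apps))  # deterministic, but unpredictable before filing
```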

Uniregistry, uniquely I think, reckons it’s time to get back to random selection, which ICANN abandoned due to California lottery laws. The company said in its comments:

Random selection of applications for review should not present legal issues now, after the application window has closed. While the window was still open, random selection for batches would have given applicants an incentive to file multiple redundant applications, withdrawing all but the application that placed earliest in the random queue and creating a kind of lottery for early slots. Now that no one can file an additional application, that lottery problem is gone.

Given that the comment was drafted by a California lawyer, I can’t help but wonder whether Uniregistry might be onto something.

Many applicants are also asking the GAC to pull its socks up and work on its objections faster.

The GAC currently thinks it can file its official GAC Advice on New gTLDs in about April next year, which doesn’t fit nicely with the January 2013 evaluation deadline some are now demanding.

ICANN should urge the GAC to hold a special inter-sessional meeting to square away its objections some time between Toronto in October and Beijing in April, some commenters say.

ICANN received dozens of responses to its call for comments, and this post only touches on a few themes. A more comprehensive review will be posted on DI PRO tomorrow.