
After .org price outrage, ICANN says it has NOT scrapped public comments

Kevin Murphy, October 11, 2019, Domain Policy

ICANN this evening said that it will continue to open up gTLD registry contract amendments for public comment periods, despite posting information yesterday suggesting that it would stop doing so.

The organization recently formalized what it calls “internal guidelines” on when public comment periods are required, and provided a summary in a blog post yesterday.

It was very easy to infer from the wording of the post that ICANN, in the wake of the controversy over the renegotiation of Public Interest Registry’s .org contract, had decided to no longer ask for public comments on future legacy gTLD contract amendments.

I inferred as much, as did another domain news blogger and a few other interested parties I pinged today.

I asked ICANN if that was a correct inference and Cyrus Namazi, head of ICANN’s Global Domains Division, replied:

No, that is not correct. All Registry contract amendments will continue to be posted for public comment same as before.

He went on to say that contract changes that come about as a result of Registry Service Evaluation Process requests, or changes such as transfers of ownership, will continue not to be subject to full public comment periods (though RSEP does have its own, less-publicized comment system).

The ICANN blog post lists several scenarios in which ICANN is required to open a public comment period. On the list is this:

ICANN org base agreements with registry operators and registrars.

The word “base” raised at least eight eyebrows among people who read the post, including my own two.

The “base” agreements ICANN has with registries and registrars are the 2013 Registrar Accreditation Agreement and the 2012/2017 Registry Agreement.

The RAA applies to all accredited registrars and the base RA applies to all new gTLD registries that applied in the 2012 round.

Registries that applied for, or were already running, gTLDs prior to 2012 all have bespoke contracts that have been gradually brought more — but not necessarily fully — into line with the 2012/17 RA in renewal renegotiations over the last several years.

In all cases, the renegotiated legacy contracts have been subject to public comment, but in no cases have the comments had any meaningful impact on their ultimate approval by ICANN.

The most recent such renewal was Public Interest Registry’s .org contract.

Among the changes were the introduction of the Uniform Rapid Suspension anti-cybersquatting policy, and the removal of price caps that had limited PIR to a 10% increase per year.

The comment period on this contract attracted over 3,200 comments, almost all of which objected to the price regulation changes or the URS.

But the contract was signed regardless, unaffected by the comments, which caused one registrar, NameCheap, to describe the process as a “sham”.

With this apparently specific reference to “base” agreements coming so soon thereafter, it’s easy to see how we could have assumed ICANN had decided to cut off public comment on these contentious issues altogether, but that appears not to be the case.

What this seems to mean is that when .com next comes up for renewal, it will be open for comment.

These two ccTLDs drove two thirds of all domain growth in Q2

Kevin Murphy, August 30, 2019, Domain Registries

The number of registered domain names in the world increased by 2.9 million in the second quarter, driven by .com and two ccTLDs.

That’s according to the latest Verisign Domain Name Industry Brief, published (pdf) overnight, along with other data.

The quarter ended with 354.7 million domains. Verisign’s own .com was up 1.5 million over Q1 at 142.5 million names.

ccTLDs across the board grew by 1.9 million names sequentially to 158.7 million. Year-over-year, the increase was 10.5 million domains.

The sequential ccTLD increase can be attributed almost entirely to two TLDs: .tw and .uk. These two ccTLDs appear to account for two thirds of the overall net new domains appearing in Q2.

Taiwan grew by about 600,000 in the quarter, presumably due to an ongoing, unusual pricing-related growth spurt among Chinese domainers that I reported in June.

The UK saw an increase of roughly 1.3 million domains, ending the quarter at 13.3 million.

That’s down to the deadline for registering second-level .uk matches for third-level .co.uk domains, which passed June 25.

Nominet data shows that 2LDs increased by about 1.2 million in the period, even as 3LDs dipped. The difference between this and the Verisign data appears to be rounding.

Factoring out the .uk and .tw anomalies, we have basically flat ccTLD growth, judging by the DNIB data.
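As a back-of-the-envelope check, the “two thirds” claim follows straightforwardly from the approximate figures above:

```python
# Rough check of the "two thirds" claim, using the approximate
# Q2 figures cited above (not exact registry data).

total_net_growth = 2_900_000   # net new domains worldwide in Q2
tw_growth = 600_000            # .tw sequential growth (approx.)
uk_growth = 1_300_000          # .uk sequential growth (approx.)

share = (tw_growth + uk_growth) / total_net_growth
print(f"{share:.0%}")  # roughly two thirds
```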

Meanwhile, the new gTLD number was 23 million. That’s flat after rounding, but Verisign said that the space was actually up by about 100,000 names.

Growth as a whole was tempered by what I call the “other” category. That comprises the pre-2012 gTLDs such as .net, .org, .info and .biz. That was down by about half a million names.

.net continued its gradual new gTLD-related decline, down 200,000 names sequentially at 13.6 million, while .org was down by 100,000 names.

The overall growth numbers are subject to the usual DNIB-related disclaimers: Verisign (and most everyone else) doesn’t have good data for some TLDs, including large zones such as .tk and .cn.

This latest Chinese bubble could deflate ccTLD growth

With many ccTLD operators recently reporting stagnant growth or shrinkage, one registry has performed stunningly well over the last year. Sadly, it bears the hallmarks of another speculative bubble originating in China.

Verisign’s latest Domain Name Industry Brief reported that ccTLDs, excluding the never-shrinking anomaly that is .tk, increased by 1.4 million domains in the first quarter of the year.

But it turns out about 1.2 million of those net new domains came from just one TLD: Taiwan’s .tw, operated by TWNIC.

Looking at the annual growth numbers, the DNIB reports that ccTLDs globally grew by 7.8 million names between the ends of March 2018 and March 2019.

But it also turns out that quite a lot of that — over five million names — also came from .tw.

Since August 2018, .tw has netted 5.8 million new registrations, ending May with 6.5 million names.

It’s come from basically nowhere to become the fifth-largest ccTLD by volume, or fourth if you exclude .tk, per the DNIB.

History tells us that when TLDs experience such huge, unprecedented growth spurts, it’s usually due to lowering prices or liberalizing registration policies.

In this case, it’s a bit of both. But mostly pricing.

TWNIC has made it much easier to get approved to sell .tw names if you’re already an ICANN-accredited registrar.

But it’s primarily a steep price cut, briefly introduced by TWNIC last August, that is behind the huge uptick in sales.

Registry CEO Kenny Huang confirmed to DI that the pricing promo is behind the growth.

For about a month, registrants could obtain a one-year Latin or Chinese IDN .tw name for NTD 50 (about $1.50), a whopping 95% discount on its usual annual fee (about $30).
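For what it’s worth, the discount arithmetic checks out, using the approximate dollar figures above:

```python
# Check of the "95% discount" figure, using the approximate
# USD conversions quoted above.

promo_price = 1.50     # approx. USD value of NTD 50
regular_price = 30.00  # approx. usual annual fee in USD

discount = 1 - promo_price / regular_price
print(f"{discount:.0%}")  # 95%
```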

As a result, TWNIC added four million names in August and September, according to registry stats. The vast majority were Latin-script names.

According to China domain market experts Allegravita, and confirmed by Archive.org, one Taiwanese registrar was offering free .tw domains for a day whenever a Chinese Taipei athlete won a gold medal during the Asian Games, which ran over August and September. They wound up winning 17 golds.

Huang said that the majority of the regs came from mainland Chinese registrants.

History shows that big growth spurts like this inevitably lead to big declines a year or two later, in the “junk drop”. It’s not unusual for a registry to lose 90%+ of its free or cheap domains after the promotional first year is over.

Huang confirmed that he’s expecting .tw registrations to drop in the fourth quarter.

It seems likely that later this year we will see the impact of the .tw junk drop on ccTLD volumes overall, which are already perilously close to flat.

Speculative bubbles from China have in recent years contributed to wobbly performance from the new gTLD sector and even to .com itself.

New gTLDs slip again in Q1

The number of domains registered in new gTLDs slipped again in the first quarter, but it was not as bad as it could have been.

Verisign’s latest Domain Name Industry Brief, out today, reports that new gTLD domains dropped by 800,000 sequentially to end March at a round 23.0 million.

It could have been worse.

New gTLD regs in Q1 were actually up compared to the same period last year, by 2.8 million.

That’s despite the fact that GRS Domains, the old Famous Four portfolio, has lost about three million domains since last August.

Verisign’s own .com was up sequentially by two million domains and at 141 million, up by 7.1 million compared to Q1 2018. But .net’s decline continued. It was down from 14 million in December to 13.8 million in March.

Here’s a chart (click to enlarge) that may help visualize the respective growth of new gTLDs and .com over the last three years. The Y axes are in the millions of domains.

[Chart: .com vs. new gTLDs]

New gTLDs have shrunk sequentially in six of the last 12 quarters, while .com has grown in all but two.

The ccTLD world, despite the woes reported by many European registries, was the strongest growth segment. It was up by 2.5 million sequentially and 10 million compared to a year ago to finish the period with 156.8 million.

But once you factor out .tk, the free TLD that does not delete expired or abusive names, ccTLDs were up by 1.4 million sequentially and 7.8 million on last year.

Major registries posting “fabricated” Whois data

One or more of the major gTLD registries are publishing Whois query data that may be “fabricated”, according to some of ICANN’s top security minds.

The Security and Stability Advisory Committee recently wrote to ICANN’s top brass to complain about inconsistent and possibly outright bogus reporting of Whois port 43 query volumes.

SSAC said (pdf):

it appears that the WHOIS query statistics provided to ICANN by registry operators as part of their monthly reporting obligations are generally not reliable. Some operators are using different methods to count queries, some are interpreting the registry contract differently, and some may be reporting numbers that are fabricated or otherwise not reflective of reality. Reliable reporting is essential to the ICANN community, especially to inform policy-making.

SSAC says that the inconsistency of the data makes it very difficult to make informed decisions about the future of Whois access and to determine the impact of GDPR.

While the letter does not name names, I’ve replicated some of SSAC’s research and I think I’m in a position to point fingers.

In my opinion, Google, Verisign, Afilias and Donuts appear to be the causes of the greatest concern for SSAC, but several others exhibit behavior SSAC is not happy about.

I reached out to these four registries on Wednesday and have published below any responses I received.

SSAC’s concerns relate to the monthly data dumps that gTLD registries new and old are contractually obliged to provide ICANN, which publishes the data three months later.

Some of these stats concern billable transactions such as registrations and renewals. Others are used to measure uptime obligations. Others are largely of academic interest.

One such stat is “Whois port 43 queries”, defined in gTLD contracts as “number of WHOIS (port-43) queries responded during the reporting period”.

According to SSAC, and confirmed by my look at the data, there appears to be a wide divergence in how registries and back-end registry services providers calculate this number.

The most obvious example of bogosity is that some registries are reporting identical numbers for each of their TLDs. SSAC chair Rod Rasmussen told DI:

The largest issue we saw at various registries was the reporting of the exact or near exact same number of queries for many or all of their supported TLDs, regardless of how many registered domain names are in those zones. That result is a statistical improbability so vanishingly small that it seems clear that they were reporting some sort of aggregate number for all their TLDs, either as a whole or divided amongst them.

While Rasmussen would not name the registries concerned, my research shows that the main culprit here appears to be Google.

In its December data dumps, it reported exactly 68,031,882 port 43 queries for each of its 45 gTLDs.

If these numbers are to be believed, .app with its 385,000 domains received precisely the same amount of port 43 interest as .gbiz, which has no registrations.

As SSAC points out, this is simply not plausible.

A Google spokesperson has not yet responded to DI’s request for comment.

Similarly, Afilias appears to have reported identical data for a subset of its dot-brand clients’ gTLDs, 16 of which purportedly had exactly 1,071,939 port 43 lookups in December.

Afilias has many more TLDs that did not report identical data.

An Afilias spokesperson told DI: “Afilias has submitted data to ICANN that addresses the anomaly and the update should be posted shortly.”

SSAC’s second beef is that one particular operator may have reported numbers that “were altered or synthesized”. SSAC said in its letter:

In a given month, the number of reported WHOIS queries for each of the operator’s TLDs is different. While some of the TLDs are much larger than others, the WHOIS query totals for them are close to each other. Further statistical analysis on the number of WHOIS queries per TLD revealed an abnormal distribution. For one month of data for one of the registries, the WHOIS query counts per TLD differed from the mean by about +/- 1%, nearly linearly. This appeared to be highly unusual, especially with TLDs that have different usage patterns and domain counts. There is a chance that the numbers were altered or synthesized.

I think SSAC could be referring here to either Donuts or Verisign.

Looking again at December’s data, all but one of Donuts’ gTLDs reported port 43 queries between 99.3% and 100.7% of the mean average of 458,658,327 queries.

Is it plausible that .gripe, with 1,200 registrations, is getting almost as much Whois traffic as .live, with 343,000? Seems unlikely.
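The kind of check SSAC describes is easy to replicate against the published per-TLD totals. Here’s a minimal sketch of the ±1%-of-mean test, using made-up illustrative numbers rather than the actual report data:

```python
# Sketch of the anomaly check described above: flag a registry whose
# per-TLD WHOIS query counts all cluster within ±1% of their mean.
# The figures below are illustrative, not actual reported data.

def clusters_near_mean(counts, tolerance=0.01):
    """True if every count falls within ±tolerance of the mean."""
    mean = sum(counts) / len(counts)
    return all(abs(c - mean) / mean <= tolerance for c in counts)

# TLDs of wildly different zone sizes reporting near-identical traffic
suspicious = [458_000_000, 459_100_000, 457_500_000, 460_000_000]
plausible = [1_200_000, 458_000_000, 97_000_000, 12_000]

print(clusters_near_mean(suspicious))  # True  -> statistically implausible
print(clusters_near_mean(plausible))   # False
```

For genuinely independent TLDs with different zone sizes, query volumes should vary by orders of magnitude, so a tight cluster around the mean is the red flag.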

Donuts has yet to provide DI with its comments on the SSAC letter. I’ll update this post and tweet the link if I receive any new information.

All of the gTLDs Verisign manages on behalf of dot-brand clients, and some of its own non-.com gTLDs, exhibit the same pattern as Donuts in terms of all queries falling within +/- 1% of the mean, which is around 431 million per month.

So, as I put to Verisign, .realtor (~40k regs) purportedly has roughly the same number of port 43 queries as .comsec (which hasn’t launched).

Verisign explained this by saying that almost all of the port 43 queries it reports come from its own systems. A spokesperson told DI:

The .realtor and .comsec query responses are almost all responses to our own monitoring tools. After explaining to SSAC how Verisign continuously monitors its systems and services (which may be active in tens or even hundreds of locations at any given time) we are confident that the accuracy of the data Verisign reports is not in question. The reporting requirement calls for all query responses to be counted and does not draw a distinction between responses to monitoring and non-monitoring queries. If ICANN would prefer that all registries distinguish between the two, then it is up to ICANN to discuss that with registry operators.

It appears from the reported numbers that Verisign polls its own Whois servers more than 160 times per second. Donuts’ numbers are even larger.
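That rate follows from simple division, assuming a 30-day month:

```python
# Rough conversion of the ~431 million monthly query figure
# into a per-second polling rate, assuming a 30-day month.

monthly_queries = 431_000_000
seconds_per_month = 30 * 24 * 60 * 60  # 2,592,000

rate = monthly_queries / seconds_per_month
print(f"~{rate:.0f} queries per second")  # ~166
```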

I would guess, based on the huge volumes of queries being reported by other registries, that this is common (but not universal) practice.

SSAC said that it approves of the practice of monitoring port 43 responses, but it does not think that registries should aggregate their own internal queries with those that come from real Whois consumers when reporting traffic to ICANN.

Either way, it thinks that all registries should calculate their totals in the same way, to make apples-to-apples comparisons possible.

Afilias’ spokesperson said: “Afilias agrees that everyone should report the data the same way.”

As far as ICANN goes, its standard registry contract is open to interpretation. It doesn’t really say why registries are expected to collect and supply this data, merely that they are obliged to do so.

The contracts do not specify whether registries are supposed to report these numbers to show off the load their servers are bearing, or to quantify demand for Whois services.

SSAC thinks it should be the latter.

You may be thinking that if it has taken a decade or more for anyone to notice the data is basically useless, it’s probably not all that important.

But SSAC thinks the poor data quality interferes with research on important policy and practical issues.

It’s rendered SSAC’s attempt to figure out whether GDPR and ICANN’s Temp Spec have had an effect on Whois queries pretty much futile, for example.

The meaningful research in question also includes work leading to the replacement of Whois with RDAP, the Registration Data Access Protocol.

Finally, there’s the looming possibility that ICANN may before long start acting as a clearinghouse for access to unredacted Whois records. If it has no idea how often Whois is actually used, that’s going to make planning its infrastructure very difficult, which in turn could lead to downtime.

Rasmussen told DI: “Our impression is that all involved want to get the numbers right, but there are inconsistent approaches to reporting between registry operators that lead to data that cannot be utilized for meaningful research.”