Latest news of the domain name industry

.com zone tops 140 million

The .com zone file passed the 140 million domain milestone for the first time today.
According to Verisign’s own count, today there are 140,016,726 .com names in the file. Yesterday, it had 139,979,307 names.
It’s taken since November 2017 to add the last 10 million names.
Adding registered names not in the zone, what Verisign calls its “Domain Name Base”, .com is currently at 141,857,360 domains.
Meanwhile, .net is continuing to shrink.
It has 13,441,748 names in its zone today, down from an October 2016 peak of over 15.8 million.
The .net domain name base is 13,668,548.
Pretty soon, if the slide continues, Verisign won’t be able to round up to 14 million in its quarterly reports any more.
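For readers who like to check the arithmetic, here's a quick sketch of what these counts imply, using the figures quoted above:

```python
# Figures as quoted from Verisign's published counts above.
zone_today = 140_016_726        # .com names in the zone file today
zone_yesterday = 139_979_307    # .com names in the zone file yesterday
domain_name_base = 141_857_360  # zone names plus registered names not in the zone

daily_growth = zone_today - zone_yesterday
not_in_zone = domain_name_base - zone_today

print(f"Day-over-day zone growth: {daily_growth:,}")       # 37,419
print(f"Registered but not in the zone: {not_in_zone:,}")  # 1,840,634
```

So roughly 1.8 million .com names are registered but absent from the zone file.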

Court rules domain name list should stay secret

Publishing a list of every domain name in their zone is something that most TLD registries do automatically on a daily basis, but a court in Chile has ruled that doing so is a cybersecurity risk.
NIC Chile, which runs .cl, said last week that it has won an appeal against a Transparency Council ruling that would have forced it to publish a list of the domains it manages.
The Court of Appeals ruled that the registry was within its rights to refuse to hand over an Excel spreadsheet listing the 575,430 domains in .cl to the person who requested it.
The request was just for the list of domains, with none of the other data you’d find in a zone file and no Whois information about the registrants.
Nevertheless, the court unanimously ruled that to hand over the list would present “cybersecurity risks”, according to NIC Chile attorney Margarita Valdés Cortés.
NIC Chile said in a statement:

In this particular case, it was considered that the bulk delivery of domain names to a private individual could generate risks of cybersecurity of various kinds, both in access to information as a result of those domain names as well as the possibility that, by having such a list, attacks on servers, phishing, spam or others could be made easier. Similarly, the ruling of the Court of Appeals understood that the delivery of the data affects commercial and economic rights of the holders of these .CL domains, and considered that there is a legal cause that justifies NIC Chile's refusal to turn over the list of all registered names.

Valdés Cortés said that the case will now go to the nation’s Supreme Court for a final decision, after the Transparency Council appealed.
Access to zone files is considered by many security researchers to be an invaluable tool in the fight against cybercrime.
NIC Chile has published the ruling, in Spanish, here (pdf).

Zone file access is crap, security panel confirms

Kevin Murphy, June 20, 2017, Domain Policy

ICANN’s Centralized Zone Data Service has some serious shortcomings and needs an overhaul, according to the Security and Stability Advisory Committee.
The panel of DNS security experts has confirmed what CZDS subscribers, including your humble correspondent, have known since 2014 — the system had a major design flaw baked in from day one for no readily apparent reason.
CZDS is the centralized repository of gTLD zone files. It’s hosted by ICANN and aggregates zones from all 2012-round, and some older, gTLDs on a daily basis.
Signing up for it is fairly simple. You simply fill out your contact information, agree to the terms of service, select which zones you want and hit “submit”.
The purpose of the service is to allow researchers to receive zone files without having to enter into separate agreements with each of the 1,200+ gTLDs currently online.
The major problem, as subscribers know and SSAC has confirmed, is that the default subscription period is 90 days.
Unless the gTLD registry chooses, at its own discretion, to extend the period, each subscription ends after three months, cutting off access, and the subscriber must reapply.
Many of the larger registries exercise this option, but many — particularly dot-brands — do not.
The constant need to reapply and re-approve creates a recurring arse-ache for subscribers and, registry staff have told me, the registries themselves.
The approval process itself is highly unpredictable. Some of the major registries process requests within 24 hours — I’ve found Afilias is the fastest — but I’ve been waiting for approval for Valuetainment’s .voting since September 2016.
Some dot-brands even attempt to insert extra terms of service into the deal before approving requests, which defeats the entire purpose of having a centralized service in the first place.
Usually, a polite email to the person handling the requests can produce results. Other times, it’s necessary to report them to ICANN Compliance.
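Until that changes, subscribers are left tracking the 90-day clock themselves. A minimal sketch of such a tracker, assuming you log your own approval dates (the TLDs and dates below are hypothetical):

```python
# Hypothetical local tracker for CZDS subscription expiry.
# Approval dates come from the subscriber's own records, not from CZDS itself.
from datetime import date, timedelta

SUBSCRIPTION_DAYS = 90  # the default CZDS subscription period

def expiring_soon(approvals, today, within_days=14):
    """Return TLDs whose 90-day subscription lapses within `within_days`,
    so the subscriber can reapply before access is cut off."""
    soon = []
    for tld, approved in approvals.items():
        expires = approved + timedelta(days=SUBSCRIPTION_DAYS)
        if today <= expires <= today + timedelta(days=within_days):
            soon.append(tld)
    return sorted(soon)

approvals = {"xyz": date(2017, 4, 1), "club": date(2017, 6, 10)}
print(expiring_soon(approvals, today=date(2017, 6, 25)))  # ['xyz']
```

Reapplying early doesn't remove the approval lottery, but it at least shrinks the data gap.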
The SSAC has evidently interviewed many people who share my concerns, as well as looking at data from Compliance (where CZDS reliably generates the most complaints, wasting the time of Compliance staff).

This situation makes zone file access unreliable and subject to unnecessary interruptions. The missing data introduces “blind spots” in security coverage and research projects, and the reliability of software – such as security and analytics applications – that relies upon zone files is reduced. Lastly, the introduced inefficiency creates additional work for both registry operators and subscribers.

The SSAC has no idea why the need to reapply every 90 days was introduced, figuring it must have happened during implementation.
But it recommends that access agreements should automatically renew once they expire, eliminating the busywork of reapplying and closing the holes in researchers’ data sets.
As I’m not objective on this issue, I agree with that recommendation wholeheartedly.
I’m less keen on the SSAC’s recommendation that registries should be able to opt out of the auto-renewals on a per-subscriber basis. This will certainly be abused by the precious snowflake dot-brands that have already shown their reluctance to abide by their contractual obligations.
The SSAC report can be read here (pdf).

At least one in 10 new gTLDs are shrinking

While the universe of new gTLDs is growing at a rapid clip, DI research shows that at least one in 10 individual new gTLDs are shrinking.
Using zone file data, I’ve also established that almost a third of new gTLDs were smaller June 1 than they were 90 days earlier, and that more than one in five shrunk over a 12-month period.
There’s been a lot written recently, here and elsewhere, about the volume boom at the top end of the new gTLD league tables, driven by the inexplicable hunger in China for worthless domain names. So I thought I’d balance it out by looking at the gTLDs not benefiting from the budget land-grab madness.
It’s been about two and a half years since the first new gTLDs of the 2012 round were delegated. A few hundred were in general availability by the end of 2014.
These are the ones I chose to look at for this article.
Taking the full list of delegated 2012-round gTLDs, I first disregarded any dot-brands. For me, that’s any gTLD that has Specifications 9 or 13 in its ICANN Registry Agreement.
Volume is not a measure of success for dot-brands in general, where only the registry can own names, so we’re not interested in their growth rates.
Then I disregarded any gTLD that had a general availability date after March 14, 2015.
That date was selected because it’s 445 days before June 1, 2016 — enough time for a gTLD to go through its first renewal/deletion cycle.
There’s no point looking at TLDs less than a year old as they can only be growing.
This whittling process left me with 334 gTLDs.
Counting the domains in those gTLDs’ zone files, I found that:

  • 96 (28.7%) were smaller June 1 than they were 30 days earlier.
  • 104 (31.1%) were smaller June 1 than they were 90 days earlier.
  • 76 (22.7%) were smaller June 1 than they were 366 days earlier.
  • 35 (10.4%) were smaller on a monthly, quarterly and annual basis.

Zone files don’t include all registered domains, of course, but the proportion of those excluded tends to be broadly similar between gTLDs. Apples-to-apples comparisons are, I believe, fair.
And I think it’s fair to say that if a gTLD has gotten smaller over the previous month, quarter and year, that gTLD is “shrinking”.
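The filtering and comparison steps described above can be sketched roughly like this (the sample TLDs and zone counts are invented for illustration; real inputs would come from zone file data and the Registry Agreements):

```python
# Sketch of the shrinkage test described above; sample data, not real figures.
from datetime import date, timedelta

AS_OF = date(2016, 6, 1)

def shrinking_tlds(tlds, as_of=AS_OF):
    """tlds: dicts with 'name', 'is_brand' (Spec 9/13 in the Registry
    Agreement), 'ga_date', and 'counts' mapping dates to zone sizes."""
    ga_cutoff = as_of - timedelta(days=445)  # March 14, 2015
    shrinking = []
    for t in tlds:
        if t["is_brand"] or t["ga_date"] > ga_cutoff:
            continue  # skip dot-brands and TLDs too young for a renewal cycle
        now = t["counts"][as_of]
        if (now < t["counts"][as_of - timedelta(days=30)]
                and now < t["counts"][as_of - timedelta(days=90)]
                and now < t["counts"][as_of - timedelta(days=366)]):
            shrinking.append(t["name"])
    return shrinking

sample = [
    {"name": "example", "is_brand": False, "ga_date": date(2014, 2, 1),
     "counts": {AS_OF: 900,
                AS_OF - timedelta(days=30): 950,
                AS_OF - timedelta(days=90): 1000,
                AS_OF - timedelta(days=366): 1100}},
    {"name": "brandname", "is_brand": True, "ga_date": date(2014, 2, 1),
     "counts": {}},
]
print(shrinking_tlds(sample))  # ['example']
```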
Here are the TLDs.
[table id=42 /]
Concerning those 35 shrinking gTLDs:

  • The average size of the zones, as of June 1, was 17,299 domains.
  • Combined, they accounted for 605,472 domains, down 34,412 on the year. That’s a small portion of the gTLD universe, which is currently over 20 million.
  • The smallest was .wed, with 144 domains and annual shrinkage of 12. The largest was .网址 (Chinese for “.website”) which had 330,554 domains and annual shrinkage of 7,487.
  • The mean shrinkage over the year was 983 domains per gTLD. Over the quarter it was 1,025. Over the month it was 400.

Sixteen of the 35 shrinking gTLDs belong to Donuts, which is perhaps to be expected given that it has the largest stable and was the most aggressive early mover.
Of its first batch of seven gTLDs to go to GA, way back in February 2014, only three — .guru, .singles and .plumbing — are on our list of shrinkers.
A Donuts spokesperson told DI today that its overall number of registrations is on the increase and that “too much focus on individual TLDs doesn’t accurately indicate the overall health of the TLD program in general and of our portfolio specifically.”
He pointed out that Donuts has not pursued the domainer market with aggressive promotions, targeting instead small and medium businesses that are more likely to actually use their domains.
“As initial domainer investors shake out, you’re likely to see some degradation in the size of the zone,” he said.
He added that Donuts has seen second-year renewal rates of 72%, which were higher than the first year.
“That indicates that there’s more steadiness in the registration base today than there was when first-year renewals were due,” he said.

Verisign adds 750,000 .com names instantly with reporting change

Kevin Murphy, March 23, 2015, Domain Registries

Verisign has boosted its reportable .com domain count by almost 750,000 by starting to count expired and suspended names.
The change in methodology, which is a by-product of ICANN’s much more stringent Whois accuracy regime, happened on Friday afternoon.
Before the change, the company reported on its web site that there were 116,788,107 domains in the .com zone file, with another 167,788 names that were registered but not configured.
That’s a total of 116,955,895 domains.
But just a few hours later, the same web page said .com had a total of 117,704,800 names in its “Domain Name Base”.
That’s a leap of 748,905 pretty much instantly; the number of names in the zone file did not move.
.net jumped 111,110 names to 15,143,356.
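The jump can be reconstructed from the published figures quoted above:

```python
# Reconstructing the .com spike from Verisign's published numbers (as above).
zone_file = 116_788_107   # .com names in the zone file
not_configured = 167_788  # registered but not configured
old_total = zone_file + not_configured
new_base = 117_704_800    # the new "Domain Name Base"

on_hold = new_base - old_total  # names newly counted: client/server hold
print(f"{old_total:,} -> {new_base:,}: +{on_hold:,} hold-status names")
```

The difference, 748,905 names, is the population of hold-status domains Verisign had previously left out of its count.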
The reason for the sudden spikes is that Verisign is now including two types of domain in its count that it did not previously. The web page states:

Beginning with the first quarter, 2015, the domain name base on this website and in subsequent filings found in the Investor Relations site includes domains that are in a client or server hold status.

I suspect that the bulk of the 750,000 newly reported names are on clientHold status, which I believe is used much more often than serverHold.
The clientHold EPP code is often applied by registrars to domains that have expired.
However, registrars signed up to the year-old 2013 Registrar Accreditation Agreement are obliged by ICANN to place domains on clientHold status if registrants fail to respond within 15 days to a Whois verification email.
The 2013 RAA reads (my emphasis):

Upon the occurrence of a Registered Name Holder’s willful provision of inaccurate or unreliable WHOIS information, its willful failure promptly to update information provided to Registrar, or its failure to respond for over fifteen (15) calendar days to inquiries by Registrar concerning the accuracy of contact details associated with the Registered Name Holder’s registration, Registrar shall either terminate or suspend the Registered Name Holder’s Registered Name or place such registration on clientHold and clientTransferProhibited, until such time as Registrar has validated the information provided by the Registered Name Holder.

Last June, registrars claimed that the new policy — which came after pressure from law enforcement — had resulted in over 800,000 domains being suspended.
It’s an ongoing point of contention between ICANN, its registrars, and cops.
Verisign changing its reporting methodology may well be a reaction to this increase in the number of clientHold domains.
While its top-line figure has taken a sharp one-off boost, the new methodology will still permit daily apples-to-apples comparisons on an ongoing basis.
UPDATE:
My assumption about the link to the 2013 RAA was correct.
Verisign CFO George Kilguss told analysts on February 5:

Over the last several years, the average amount of names in the on-hold status category has been approximately 400,000 names and the net change year-over-year has been very small.
While still immaterial, during 2014, we saw an increase in the amount of names registrars have placed on hold status, which appears to be a result of these registrars complying with the new mandated compliance mechanisms in ICANN’s 2013 Registrar Accreditation Agreement or RAA.
In 2014, we saw an increase in domain names placed on hold status from roughly 394,000 names at the end of 2013 to about 870,000 at the end of 2014.

Verisign reveals “dark” .com domains

Verisign has started publishing the daily count of .com and .net domain names that are registered but do not work.
On a new page on its site, the company is promising to break out how many domains are registered but do not currently show up in the zone files for its two main gTLDs.
These are sometimes referred to as “dark” domains.
As of yesterday, the number of registered and active .com domains stands at 103,960,994, and there are 145,980 more (about 0.14% of the total) that are registered but do not currently have DNS.
For .net, the numbers stand at 14,750,674 and 32,440 (0.22%).
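Those percentages follow directly from the published counts:

```python
# Reproducing the dark-domain percentages from the counts quoted above.
active_com, dark_com = 103_960_994, 145_980
active_net, dark_net = 14_750_674, 32_440

def dark_pct(active, dark):
    """Dark domains as a percentage of all registered names."""
    return round(100 * dark / (active + dark), 2)

print(dark_pct(active_com, dark_com))  # 0.14
print(dark_pct(active_net, dark_net))  # 0.22
```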
Verisign CEO Jim Bidzos told analysts last night that the data is being released to “increase transparency” into the company’s performance.
Many tools available for tracking registration numbers in TLDs are skewed slightly by the fact that they rely on publicly available zone file data, which does not count dark domains.
Registry reports containing more accurate data are released monthly by ICANN, but they’re always three months old.