Latest news of the domain name industry

Donuts slashes prices on a million domains

Kevin Murphy, August 28, 2019, Domain Registries

Donuts is to overhaul the pricing on 1.1 million registry-reserved “premium” domain names, taking hundreds of thousands out of premium status altogether.

The company said today that it has decided to reduce the registration cost of 250,000 domains across its 242 new gTLDs. Discounts as deep as 90% are possible, judging by the company’s pricing page.

A further 850,000 will have their premium tag removed and return to regular pricing.

Part of the overhaul relates to the Rightside acquisition, which closed in 2017. While Rightside’s portfolio of TLDs was substantially smaller than Donuts’, it had been much more aggressive on its premium pricing.

For the domains being moved to standard pricing, however, Donuts will take one last shot at squeezing a premium price out of them.

The company said that from September 5 to November 1 there will be a “pre-sales” event, during which registrants can pay the current premium fee for the first year on the understanding that they will renew at the standard pricing.

For example, drunk.games currently commands a roughly $130-a-year registration fee at registrars. If you buy it during the pre-sales event you’ll pay $130 for the first year but only about $20 upon renewal.

Donuts says this unusual landrush-style event is designed to make the names more attractive to investors who want to get in before prices fall.

The price changes take full effect November 5.

It’s worth noting that standard pricing at Donuts is actually going up across most TLDs, by as much as 9%, on October 1, so you may want to check what your actual renewal fee is before buying.

A searchable database of the newly priced inventory can be found here.

Looks like .fans has a new Chinese owner

It appears that the struggling new gTLD .fans has changed ownership for the second time in a year.

According to ICANN’s web site, the .fans Registry Agreement was assigned to a company called ZDNS International on June 28.

Since August 2018, the contract had been in the hands of a CentralNic subsidiary called Fans TLD, having been originally operated by Asiamix Digital.

ZDNS International appears to be a newish Hong Kong subsidiary of major China-based DNS service provider ZDNS.

ZDNS provides DNS services for more than 20 TLDs, mostly Chinese-language, but as far as I can tell it is not the contracted party for any.

It’s also known for providing registry gateway services for non-Chinese registries that want to set up shop in the country.

CentralNic took over .fans last year after Asiamix failed to get the TLD’s sales to take off.

.fans had about 1,700 domains under management at the time, and it’s been pretty much flat ever since. I don’t think CentralNic has been promoting it.

Over the same period, singular competitor .fan, which Donuts acquired from Asiamix last year, has gone from 0 to almost 3,000 registrations.

If CentralNic, a public company, made a profit on the flip it does not appear to have been material enough to require disclosure to shareholders.

Cloudflare “bug” reveals hundreds of secret domain prices

The secret wholesale prices for hundreds of TLDs have been leaked, due to an alleged “bug” at a registrar.

The registry fees for some 259 TLDs, including those managed by Donuts, Verisign and Afilias, are currently publicly available online, after a programmer used what they called a “bug” in Cloudflare’s API to scrape together price lists without actually buying anything.

Cloudflare famously busted into the domain registrar market last September by announcing that it would sell domains at cost, thumbing its nose at other registrars by suggesting that all they’re doing is “pinging an API”.

But because most TLD registries have confidentiality clauses in their Registry-Registrar Agreements, accredited registrars are not actually allowed to reveal the wholesale prices.

That’s kind of a problem if you’re a registrar that has announced that you will never charge a markup, ever.

Cloudflare has tried to get around this by not listing its prices publicly.

Currently, it does not sell new registrations, instead only accepting inbound transfers from other registrars. Registry transaction reports reveal that it has had tens of thousands of names transferred in, but has not created a significant number of new domains.

(As an aside, it’s difficult to see how it could ever sell a new reg without first revealing its price and thereby breaking its NDAs.)

It appears that the only way to manually ascertain the wholesale prices of all of the TLDs it supports would be to buy one of each at a different registrar, then transfer them to Cloudflare, thereby revealing the “at cost” price.

This would cost over $9,500, at Cloudflare’s prices, and it’s difficult to see what the ROI would be.

However, one enterprising individual discovered via the Cloudflare API that the registrar was not actually checking whether they owned a domain before revealing its price.

They were therefore able to compile a list of Cloudflare’s prices and therefore the wholesale prices registries charge.

The list, and the script used to compile it, are both currently available on code repository GitHub.
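The general shape of such a scraper is straightforward. This is a hypothetical sketch only: the probe domain, the fetcher, and its behavior are illustrative inventions, not Cloudflare's actual API. The point is the logic the article describes, namely asking the registrar to price a domain in each TLD without owning it, and recording the answer.

```python
from typing import Callable, Dict, Iterable

def scrape_prices(tlds: Iterable[str],
                  fetch_price: Callable[[str], float]) -> Dict[str, float]:
    """Build a TLD -> wholesale price table from per-domain price lookups.

    fetch_price is a stand-in for the registrar's price-check call, which
    allegedly did not verify ownership before answering.
    """
    prices = {}
    for tld in tlds:
        probe = f"price-probe-example.{tld}"  # one probe domain per TLD
        prices[tld] = fetch_price(probe)
    return prices

# Usage with a stub fetcher (a real script would call the registrar's API;
# the figures here are the publicly known .com/.net fees plus a made-up .io):
fake_api = {"com": 7.85, "net": 9.02, "io": 32.00}
table = scrape_prices(fake_api.keys(),
                      lambda d: fake_api[d.rsplit(".", 1)[1]])
print(table)  # {'com': 7.85, 'net': 9.02, 'io': 32.0}
```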

The bulk of the list comprises Donuts’ vast portfolio, but most TLDs belonging to Afilias (including the ccTLD .io), XYZ.com and Radix are also on there.

It’s not possible for me to verify that all of the prices are correct, but the ones that are comparable to already public information (such as .com and .net) match, and the rest are all in the ballpark of what I’ve always assumed or have been privately told they were.

The data was last refreshed in April, so without updates its shelf life is likely limited. Donuts, for example, is introducing price increases across most of its portfolio this year.

After $30 million deal, is a .voice gTLD now inevitable?

Do big second-level domain sales translate into new gTLD success, and does the record-breaking $30 million sale of voice.com this week make a .voice gTLD inevitable?

The answers, I believe, are no and maybe.

Before the 2012 new gTLD application round, one way applicants picked their strings was by combing through the .com zone file to find frequently-occurring words that terminated the second level string.

This is where we get the likes of .site and .online from Radix and much of Donuts’ portfolio.

But applicants also looked at lists of high-priced secondary market sales for inspiration.

This is where we get the likes of .vodka, from MMX.

The latter strategy has seen mixed-to-poor results.

Five of the top 20 domain sales, as compiled by Domain Name Journal, were not eligible for gTLD status, as they are too short.

Of the remaining 15 strings, “sex” (which occurs twice), “fund”, “porn”, “toys” and “vodka” were all applied for in 2012 and are currently on sale.

The strings “clothes” and “diamond” do not appear as gTLDs, but Donuts runs both .clothing and .diamonds.

Not delegated in any fashion are “porno” (unless you count it as a derivative of “porn”), “slots”, “tesla”, “whisky” and “california”. A company called IntercontinentalExchange runs .ice as a dot-brand.

As well as .clothing and .diamonds, .fund and .toys are both also Donuts TLDs. None of them are doing spectacularly well.

At the lower end, .diamonds currently has fewer than 3,000 domains under management, but has a relatively high price compared to the higher-volume TLDs in Donuts’ stable.

At the high-volume end, .fund has just shy of 16,000 names and .clothing has about 12,000.

Judging by their retail prices, and the fact that Donuts benefits from the economies of scale of a 240-strong TLD portfolio, I’m going to guess these domains are profitable, but not hugely so.

If we turn our attention to .vodka, with its roughly 1,500 domains, it seems clear that MMX is barely covering the cost of its annual ICANN fees. Yet vodka.com sold for $3 million.

So will anyone be tempted to apply for .voice in the next gTLD application round? I’d say it’s very possible.

First, “voice” is a nice enough string. It could apply to telephony services, but also to general publishing platforms that give their customers a “voice”. I’d say it could gather up enough registrations to fit profitably into a large portfolio, but would not break any records in terms of volume.

But perhaps the existence of voice.com buyer Block.one as a possible applicant will raise some other applicants out of the woodwork.

Block.one, which uses a new gTLD and an alt-ccTLD (.io) for its primary web sites, is certainly not out-of-touch when it comes to alternative domain names.

Could it apply for .voice, and if it does how much would it be willing to spend to pay off rival applicants? It still apparently has billions of dollars from its initial coin offering in the bank.

How much of that would it be prepared to pay for .voice at private auction?

That prospect alone might be enough to stir the interest of some would-be applicants, but it has to be said that it’s by no means certain that the highly gameable application process ICANN deployed in 2012 is going to look the same next time around.

Major registries posting “fabricated” Whois data

One or more of the major gTLD registries are publishing Whois query data that may be “fabricated”, according to some of ICANN’s top security minds.

The Security and Stability Advisory Committee recently wrote to ICANN’s top brass to complain about inconsistent and possibly outright bogus reporting of Whois port 43 query volumes.

SSAC said (pdf):

it appears that the WHOIS query statistics provided to ICANN by registry operators as part of their monthly reporting obligations are generally not reliable. Some operators are using different methods to count queries, some are interpreting the registry contract differently, and some may be reporting numbers that are fabricated or otherwise not reflective of reality. Reliable reporting is essential to the ICANN community, especially to inform policy-making.

SSAC says that the inconsistency of the data makes it very difficult to make informed decisions about the future of Whois access and to determine the impact of GDPR.

While the letter does not name names, I’ve replicated some of SSAC’s research and I think I’m in a position to point fingers.

In my opinion, Google, Verisign, Afilias and Donuts appear to be the causes of the greatest concern for SSAC, but several others exhibit behavior SSAC is not happy about.

I reached out to these four registries on Wednesday and have published their responses, if I received any, below.

SSAC’s concerns relate to the monthly data dumps that gTLD registries new and old are contractually obliged to provide ICANN, which publishes the data three months later.

Some of these stats concern billable transactions such as registrations and renewals. Others are used to measure uptime obligations. Others are largely of academic interest.

One such stat is “Whois port 43 queries”, defined in gTLD contracts as “number of WHOIS (port-43) queries responded during the reporting period”.

According to SSAC, and confirmed by my look at the data, there appears to be a wide divergence in how registries and back-end registry services providers calculate this number.

The most obvious example of bogosity is that some registries are reporting identical numbers for each of their TLDs. SSAC chair Rod Rasmussen told DI:

The largest issue we saw at various registries was the reporting of the exact or near exact same number of queries for many or all of their supported TLDs, regardless of how many registered domain names are in those zones. That result is a statistical improbability so vanishingly small that it seems clear that they were reporting some sort of aggregate number for all their TLDs, either as a whole or divided amongst them.

While Rasmussen would not name the registries concerned, my research shows that the main culprit here appears to be Google.

In its December data dumps, it reported exactly 68,031,882 port 43 queries for each of its 45 gTLDs.

If these numbers are to be believed, .app with its 385,000 domains received precisely the same amount of port 43 interest as .gbiz, which has no registrations.

As SSAC points out, this is simply not plausible.
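The identical-numbers anomaly is easy to replicate against ICANN's published monthly reports: if many TLDs in one operator's portfolio report exactly the same query total, something other than organic traffic is being counted. A minimal sketch, using the December figure reported above (the TLD labels are placeholders):

```python
from collections import Counter
from typing import Dict, List

def flag_identical_counts(queries: Dict[str, int]) -> List[int]:
    """Return any query total reported by more than one TLD -- a strong
    hint that one aggregate figure was copied across a portfolio."""
    tallies = Counter(queries.values())
    return [count for count, n in tallies.items() if n > 1]

# Google's December report: the same total for every one of its 45 gTLDs,
# regardless of zone size.
google_december = {f"tld{i}": 68_031_882 for i in range(45)}
print(flag_identical_counts(google_december))  # [68031882]
```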

A Google spokesperson has not yet responded to DI’s request for comment.

Similarly, Afilias appears to have reported identical data for a subset of its dot-brand clients’ gTLDs, 16 of which purportedly had exactly 1,071,939 port 43 lookups in December.

Afilias has many more TLDs that did not report identical data.

An Afilias spokesperson told DI: “Afilias has submitted data to ICANN that addresses the anomaly and the update should be posted shortly.”

SSAC’s second beef is that one particular operator may have reported numbers that “were altered or synthesized”. SSAC said in its letter:

In a given month, the number of reported WHOIS queries for each of the operator’s TLDs is different. While some of the TLDs are much larger than others, the WHOIS query totals for them are close to each other. Further statistical analysis on the number of WHOIS queries per TLD revealed that an abnormal distribution. For one month of data for one of the registries, the WHOIS query counts per TLD differed from the mean by about +/- 1%, nearly linearly. This appeared to be highly unusual, especially with TLDs that have different usage patterns and domain counts. There is a chance that the numbers were altered or synthesized.

I think SSAC could be referring here to either Donuts or Verisign.

Looking again at December’s data, all but one of Donuts’ gTLDs reported port 43 queries between 99.3% and 100.7% of the mean average of 458,658,327 queries.

Is it plausible that .gripe, with 1,200 registrations, is getting almost as much Whois traffic as .live, with 343,000? Seems unlikely.
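The clustering check itself is simple to reproduce: compute the portfolio mean and see what fraction of TLDs fall within ±1% of it. The toy figures below are illustrative, not real report data, but they show the contrast between organic-looking traffic (which varies with zone size) and the flagged pattern:

```python
from typing import Dict

def within_pct_of_mean(queries: Dict[str, int], pct: float = 1.0) -> float:
    """Fraction of TLDs whose reported query count falls within +/-pct%
    of the portfolio mean. A value near 1.0 is the pattern SSAC flagged."""
    mean = sum(queries.values()) / len(queries)
    lo, hi = mean * (1 - pct / 100), mean * (1 + pct / 100)
    hits = sum(1 for v in queries.values() if lo <= v <= hi)
    return hits / len(queries)

# Organic-looking traffic varies by orders of magnitude with zone size...
organic = {"live": 5_000_000, "games": 800_000, "gripe": 12_000}
# ...whereas the flagged pattern clusters tightly around one value.
flagged = {"live": 458_700_000, "games": 459_000_000, "gripe": 458_000_000}
print(within_pct_of_mean(organic))  # 0.0
print(within_pct_of_mean(flagged))  # 1.0
```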

Donuts has yet to provide DI with its comments on the SSAC letter. I’ll update this post and tweet the link if I receive any new information.

All of the gTLDs Verisign manages on behalf of dot-brand clients, and some of its own non-.com gTLDs, exhibit the same pattern as Donuts in terms of all queries falling within +/- 1% of the mean, which is around 431 million per month.

So, as I put to Verisign, .realtor (~40k regs) purportedly has roughly the same number of port 43 queries as .comsec (which hasn’t launched).

Verisign explained this by saying that almost all of the port 43 queries it reports come from its own systems. A spokesperson told DI:

The .realtor and .comsec query responses are almost all responses to our own monitoring tools. After explaining to SSAC how Verisign continuously monitors its systems and services (which may be active in tens or even hundreds of locations at any given time) we are confident that the accuracy of the data Verisign reports is not in question. The reporting requirement calls for all query responses to be counted and does not draw a distinction between responses to monitoring and non-monitoring queries. If ICANN would prefer that all registries distinguish between the two, then it is up to ICANN to discuss that with registry operators.

It appears from the reported numbers that Verisign polls its own Whois servers more than 160 times per second. Donuts’ numbers are even larger.
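That "160 times per second" figure is simple back-of-envelope arithmetic on the reported monthly total, assuming a 31-day month:

```python
# Verisign's reported ~431 million queries per month, spread evenly
# over a 31-day month, works out to just over 160 queries per second.
monthly_queries = 431_000_000
per_second = monthly_queries / (31 * 24 * 3600)
print(round(per_second, 1))  # 160.9
```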

I would guess, based on the huge volumes of queries being reported by other registries, that this is common (but not universal) practice.

SSAC said that it approves of the practice of monitoring port 43 responses, but it does not think that registries should aggregate their own internal queries with those that come from real Whois consumers when reporting traffic to ICANN.

Either way, it thinks that all registries should calculate their totals in the same way, to make apples-to-apples comparisons possible.

Afilias’ spokesperson said: “Afilias agrees that everyone should report the data the same way.”

As far as ICANN goes, its standard registry contract is open to interpretation. It doesn’t really say why registries are expected to collect and supply this data, merely that they are obliged to do so.

The contracts do not specify whether registries are supposed to report these numbers to show off the load their servers are bearing, or to quantify demand for Whois services.

SSAC thinks it should be the latter.

You may be thinking that the fact that it’s taken a decade or more for anyone to notice that the data is basically useless means that it’s probably not all that important.

But SSAC thinks the poor data quality interferes with research on important policy and practical issues.

It’s rendered SSAC’s attempt to figure out whether GDPR and ICANN’s Temp Spec have had an effect on Whois queries pretty much futile, for example.

The meaningful research in question also includes work leading to the replacement of Whois with RDAP, the Registration Data Access Protocol.

Finally, there’s the looming possibility that ICANN may before long start acting as a clearinghouse for access to unredacted Whois records. If it has no idea how often Whois is actually used, that’s going to make planning its infrastructure very difficult, which in turn could lead to downtime.

Rasmussen told DI: “Our impression is that all involved want to get the numbers right, but there are inconsistent approaches to reporting between registry operators that lead to data that cannot be utilized for meaningful research.”