Is the .co rebid biased toward Afilias? Yeah, kinda

Kevin Murphy, January 17, 2020, Domain Registries

The Colombian government has come under fire for opening up the .co registry contract for rebid in a way that seems designed from the outset to pick Afilias as the winner, displacing its fierce rival Neustar.
As I blogged in November, Colombia thinks it might be able to secure a better registry deal, so it plans to shortly open .co up to competitive proposals.
A company called .CO Internet, acquired by Neustar for $109 million in 2014, has been running the ccTLD for the last decade. There are currently around 2.3 million .co domains under management, according to Colombia.
With the renewal deadline looming, the government’s technology ministry, MinTIC, published an eyebrow-raising request for proposals last month.
What’s surprising about the RFP is that some of its four main technical performance criteria are so stringent that probably only two companies in the industry qualify: Verisign and Afilias. And so far, Verisign has not been involved in the RFP process.
The companies that have been engaging with the government to date are Afilias, Neustar/.CO, Nominet, CentralNic and Donuts.
First, MinTIC wants a registry that’s had at least two million domains under management across its portfolio continuously for two years. All five registries qualify there.
Second, it wants a registry that’s been involved in the migration of a TLD of at least one million names, either as the gaining or losing back-end.
That immediately narrows the pack to just two of the five aforementioned registries — Neustar and Afilias.
Verisign would also qualify, if it’s in the bidding, but I suspect it’s not. Taking over .co would look like a “buy it to kill it” strategy, which would be horrible optics for the Colombian government.
There have only ever been three migrations over one million names, to my knowledge: the Verisign->Afilias .org transition of 2003, the Neustar->Afilias .au move of 2018, and last year’s Afilias->Neustar .in handover.
CentralNic, Nominet and Donuts have all moved numerous TLDs between back-ends, but with much smaller per-TLD domain volumes.
Third — and here’s the kicker — the successful .co bidder will have to show that it processes on average 25 million registry transactions — defined as “billable EPP (write) transactions, as well as all EPP search (read) transactions” — per day. (All of the RFP quotes in this post have been machine-translated from Spanish by Google and run by a few generous Spanish speakers for verification.)
The RFP is not entirely clear on what exact data points it’s looking at here, but my take is that qualifying transactions include, at an absolute minimum, attempts to create a domain, renew a domain, transfer a domain and check whether a domain is registered.
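For the uninitiated, these transactions map onto the standard commands of EPP, the Extensible Provisioning Protocol that registrars use to talk to registries. Here’s a rough sketch of how the RFP’s read/write split might shake out; the bucketing is my interpretation of the RFP language, not anything MinTIC has published, and the daily tallies are invented for illustration:

```python
# Illustrative only: bucketing the standard EPP domain commands (RFC 5731)
# into the RFP's two categories. How MinTIC would actually count these is
# my assumption, not anything stated in the RFP.
BILLABLE_WRITES = {"create", "renew", "transfer", "delete", "update"}
READ_SEARCHES = {"check", "info"}

def qualifies(command: str) -> bool:
    """True if an EPP command counts toward the 25-million-a-day quota."""
    return command in BILLABLE_WRITES or command in READ_SEARCHES

# A hypothetical day's tallies; checks dwarf everything else in practice.
day = {"check": 20_000_000, "create": 900_000, "renew": 50_000, "info": 400_000}
total = sum(n for cmd, n in day.items() if qualifies(cmd))
print(f"Qualifying transactions: {total:,} per day (threshold: 25,000,000)")
```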
The vast majority of such transactions are in the check and create functions, and I believe a great deal of that activity relates to drop-catching, where registries are flooded with add requests for just-deleted domains.
Whichever way you slice it, 25 million a day is a ludicrously high number. Literally only .com, which sees 2.3 billion checks and 1.5 billion adds per month (comfortably north of 120 million transactions per day), sees that kind of action.
According to Neustar, which actually runs .co, it only sees 6.4 million transactions per day on average. The requirement to handle 25 million a day is “exaggerated, unjustified and discriminatory” against Neustar, Neustar told MinTIC.
But the RFP allows for the bidding registries to spread their 25-million-a-day quota across all of the TLDs they manage, and this MAY sneak Afilias over the line.
I say MAY in big letters because I don’t believe the numbers that Afilias (and probably other registries too) reports to ICANN every month are reliable.
If you add up the reported, qualifying EPP transactions for September in Afilias’ top four legacy gTLDs — .org, .info, .mobi and .pro — you get to over 25 million per day.
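For the curious, here’s roughly how that sum works. The srs-dom-* field names come from the standard Registry Agreement reporting spec, but the filenames below are placeholders for the CSVs published at icann.org, and the header-plus-one-row-of-totals layout is my assumption about the file format:

```python
# Back-of-the-envelope check: sum the srs-dom-* transaction counts from
# ICANN's September activity reports for .org, .info, .mobi and .pro, then
# convert to a daily average for comparison with the 25m/day threshold.
import csv

def monthly_total(path: str) -> int:
    """Sum the qualifying SRS transaction counts from one activity report."""
    with open(path, newline="") as f:
        row = next(csv.DictReader(f))  # assumes one row of per-TLD totals
    return sum(int(v) for k, v in row.items()
               if k.startswith("srs-dom-") and v and v.strip().isdigit())

reports = ["org-activity-201909.csv", "info-activity-201909.csv",
           "mobi-activity-201909.csv", "pro-activity-201909.csv"]
total = sum(monthly_total(p) for p in reports)
print(f"Daily average across the four gTLDs: {total / 30:,.0f}")  # 30 days in September
```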
But those same records show that, for example, .mobi, .pro and .info had exactly the same number of EPP availability checks that month — 215,988,497 each.
This is clearly bad data.
I reported on this issue last May, when ICANN’s Security and Stability Advisory Committee informed ICANN that major registries were providing “not reliable” or possibly “fabricated” data about port 43 Whois queries.
Afilias, which was one of the apparent offenders, told me at the time that it was addressing the issue with ICANN, but it does not yet appear to have fully fixed its reporting to enable TLD-by-TLD breakdowns of its registry activity.
It is of course quite possible, even very likely, that Afilias has on average more than 25 million qualifying EPP transactions per day, but how’s it going to prove that to the Colombian government when the numbers it reports under contract to ICANN are clearly unreliable?
It’s a little harder to determine whether Neustar would qualify under the 25-million transaction rule, because some of its largest zones are ccTLDs — .co, .in and .us — that do not publicly report this kind of data. Its comments to the RFP suggest it would not.
Numbers aside, I’ll note that there’s very probably an inherent bias towards legacy gTLD operators like Afilias and against relative newcomers such as CentralNic if you’re counting EPP transactions. As I noted above, a lot of these transactions are coming from drop-catch activity, which is more prevalent on larger, older TLDs where there are more dropping domains that are more likely to have existing backlinks and traffic.
The fourth technical requirement in the Colombian RFP that looks a bit fishy is the requirement that the new registry must have channel relationships with at least 10 of the largest 25 registrars, as listed by a web site called domainstate.com.
I can’t say I’ve looked at domainstate.com very often, if at all, but a quick look at its numbers for September strongly suggests to me that it does not count post-2012 new gTLD registrations in its registrar league table. One registrar with almost four million domains under management doesn’t even show up on the list. This arguably could give an advantage to a registry that plays strongly in legacy gTLDs.
That said, it’s probably an academic point — I don’t think any of the bidders for the .co contract would have difficulty showing that they have 10 of the top 25 registrars on board, whichever way you calculate that league table.
Cumulatively, these four technical hurdles have led some to suggest that Afilias has somehow steered MinTIC towards creating an RFP only it could win.
Apart from what I’ve discussed here, I’ve no evidence that is the case, and Afilias has not yet responded to my request for comment today.
Luckily for the bidding registries, the Colombian RFP has not yet been finalized. Comments submitted by the bidders and others are apparently going to be taken on board, so the barriers to entry for respondents could be lowered before bids are finally accepted.
MinTIC posted an update last night that extends the RFP’s timetable, as well as the transition period should Neustar lose the contract. A handover, should one happen at all, could now take place as late as February next year.

Major registries posting “fabricated” Whois data

One or more of the major gTLD registries are publishing Whois query data that may be “fabricated”, according to some of ICANN’s top security minds.
The Security and Stability Advisory Committee recently wrote to ICANN’s top brass to complain about inconsistent and possibly outright bogus reporting of Whois port 43 query volumes.
SSAC said (pdf):

it appears that the WHOIS query statistics provided to ICANN by registry operators as part of their monthly reporting obligations are generally not reliable. Some operators are using different methods to count queries, some are interpreting the registry contract differently, and some may be reporting numbers that are fabricated or otherwise not reflective of reality. Reliable reporting is essential to the ICANN community, especially to inform policy-making.

SSAC says that the inconsistency of the data makes it very difficult to make informed decisions about the future of Whois access and to determine the impact of GDPR.
While the letter does not name names, I’ve replicated some of SSAC’s research and I think I’m in a position to point fingers.
In my opinion, Google, Verisign, Afilias and Donuts appear to be the causes of the greatest concern for SSAC, but several others exhibit behavior SSAC is not happy about.
I reached out to these four registries on Wednesday and have published their responses, if I received any, below.
SSAC’s concerns relate to the monthly data dumps that gTLD registries new and old are contractually obliged to provide ICANN, which publishes the data three months later.
Some of these stats concern billable transactions such as registrations and renewals. Others are used to measure uptime obligations. Others are largely of academic interest.
One such stat is “Whois port 43 queries”, defined in gTLD contracts as “number of WHOIS (port-43) queries responded during the reporting period”.
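For those who have never used it raw, port 43 Whois (RFC 3912) is about as simple as protocols get: open a TCP connection to port 43, send the query followed by CRLF, and read until the server hangs up. Each such exchange is, presumably, one tally in the reported numbers. A minimal sketch, with the server name chosen purely as an example:

```python
# One raw port 43 Whois lookup, per RFC 3912: connect, send the query plus
# CRLF, read until the server closes the connection.
import socket

def whois_query(server: str, domain: str) -> str:
    """Perform one Whois lookup; one call equals one reportable query."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

print(whois_query("whois.nic.google", "example.app"))
```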
According to SSAC, and confirmed by my look at the data, there appears to be a wide divergence in how registries and back-end registry services providers calculate this number.
The most obvious example of bogosity is that some registries are reporting identical numbers for each of their TLDs. SSAC chair Rod Rasmussen told DI:

The largest issue we saw at various registries was the reporting of the exact or near exact same number of queries for many or all of their supported TLDs, regardless of how many registered domain names are in those zones. That result is a statistical improbability so vanishingly small that it seems clear that they were reporting some sort of aggregate number for all their TLDs, either as a whole or divided amongst them.

While Rasmussen would not name the registries concerned, my research shows that the main culprit here appears to be Google.
In its December data dumps, it reported exactly 68,031,882 port 43 queries for each of its 45 gTLDs.
If these numbers are to be believed, .app with its 385,000 domains received precisely the same amount of port 43 interest as .gbiz, which has no registrations.
As SSAC points out, this is simply not plausible.
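This kind of anomaly is also trivial to spot programmatically. A sketch of the check, using the figures cited above as stand-ins for a full scrape of ICANN’s published reports:

```python
# Group TLDs by their reported query count and flag any count shared by
# more than one TLD. The figures are those cited above; a full check would
# pull every one of a registry's TLDs from ICANN's published reports.
from collections import defaultdict

reported = {
    "app": 68_031_882,   # ~385,000 domains
    "gbiz": 68_031_882,  # zero registrations
    # ... per the December reports, all 45 Google gTLDs carried this figure
}

by_count = defaultdict(list)
for tld, queries in reported.items():
    by_count[queries].append(tld)

for count, tlds in by_count.items():
    if len(tlds) > 1:
        print(f"{count:,} queries reported identically by: .{', .'.join(tlds)}")
```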
A Google spokesperson has not yet responded to DI’s request for comment.
Similarly, Afilias appears to have reported identical data for a subset of its dot-brand clients’ gTLDs, 16 of which purportedly had exactly 1,071,939 port 43 lookups in December.
Afilias has many more TLDs that did not report identical data.
An Afilias spokesperson told DI: “Afilias has submitted data to ICANN that addresses the anomaly and the update should be posted shortly.”
SSAC’s second beef is that one particular operator may have reported numbers that “were altered or synthesized”. SSAC said in its letter:

In a given month, the number of reported WHOIS queries for each of the operator’s TLDs is different. While some of the TLDs are much larger than others, the WHOIS query totals for them are close to each other. Further statistical analysis on the number of WHOIS queries per TLD revealed an abnormal distribution. For one month of data for one of the registries, the WHOIS query counts per TLD differed from the mean by about +/- 1%, nearly linearly. This appeared to be highly unusual, especially with TLDs that have different usage patterns and domain counts. There is a chance that the numbers were altered or synthesized.

I think SSAC could be referring here to either Donuts or Verisign.
Looking again at December’s data, all but one of Donuts’ gTLDs reported port 43 queries between 99.3% and 100.7% of the mean average of 458,658,327 queries.
Is it plausible that .gripe, with 1,200 registrations, is getting almost as much Whois traffic as .live, with 343,000? Seems unlikely.
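SSAC’s test is easy to replicate: express each TLD’s reported count as a deviation from the portfolio mean and see whether everything clusters implausibly tightly. The two counts below are illustrative values near the reported mean, not Donuts’ actual per-TLD figures:

```python
# Express each TLD's reported count as a deviation from the portfolio mean;
# organic traffic should scatter widely, so a tight +/- 1% cluster is the
# red flag. The two counts are illustrative, near the reported mean.
def suspiciously_uniform(counts: dict, tolerance: float = 0.01) -> bool:
    """True if every TLD's count lies within `tolerance` of the mean."""
    mean = sum(counts.values()) / len(counts)
    for tld, n in counts.items():
        print(f".{tld}: {(n - mean) / mean:+.2%} from the mean of {mean:,.0f}")
    return all(abs(n - mean) / mean <= tolerance for n in counts.values())

# .live has ~343,000 names, .gripe ~1,200; their Whois loads should look
# nothing alike.
print(suspiciously_uniform({"live": 460_000_000, "gripe": 457_000_000}))
```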
Donuts has yet to provide DI with its comments on the SSAC letter. I’ll update this post and tweet the link if I receive any new information.
All of the gTLDs Verisign manages on behalf of dot-brand clients, and some of its own non-.com gTLDs, exhibit the same pattern as Donuts in terms of all queries falling within +/- 1% of the mean, which is around 431 million per month.
So, as I put to Verisign, .realtor (~40k regs) purportedly has roughly the same number of port 43 queries as .comsec (which hasn’t launched).
Verisign explained this by saying that almost all of the port 43 queries it reports come from its own systems. A spokesperson told DI:

The .realtor and .comsec query responses are almost all responses to our own monitoring tools. After explaining to SSAC how Verisign continuously monitors its systems and services (which may be active in tens or even hundreds of locations at any given time) we are confident that the accuracy of the data Verisign reports is not in question. The reporting requirement calls for all query responses to be counted and does not draw a distinction between responses to monitoring and non-monitoring queries. If ICANN would prefer that all registries distinguish between the two, then it is up to ICANN to discuss that with registry operators.

It appears from the reported numbers that Verisign polls its own Whois servers more than 160 times per second: 431 million queries over a 30-day month works out to about 166 per second. Donuts’ numbers are even larger.
I would guess, based on the huge volumes of queries being reported by other registries, that this is common (but not universal) practice.
SSAC said that it approves of the practice of monitoring port 43 responses, but it does not think that registries should aggregate their own internal queries with those that come from real Whois consumers when reporting traffic to ICANN.
Either way, it thinks that all registries should calculate their totals in the same way, to make apples-to-apples comparisons possible.
Afilias’ spokesperson said: “Afilias agrees that everyone should report the data the same way.”
As far as ICANN goes, its standard registry contract is open to interpretation. It doesn’t really say why registries are expected to collect and supply this data, merely that they are obliged to do so.
The contracts do not specify whether registries are supposed to report these numbers to show off the load their servers are bearing, or to quantify demand for Whois services.
SSAC thinks it should be the latter.
You may be thinking that if it has taken a decade or more for anyone to notice the data is basically useless, it can’t be all that important.
But SSAC thinks the poor data quality interferes with research on important policy and practical issues.
It’s rendered SSAC’s attempt to figure out whether GDPR and ICANN’s Temp Spec have had an effect on Whois queries pretty much futile, for example.
The meaningful research in question also includes work leading to the replacement of Whois with RDAP, the Registration Data Access Protocol.
Finally, there’s the looming possibility that ICANN may before long start acting as a clearinghouse for access to unredacted Whois records. If it has no idea how often Whois is actually used, that’s going to make planning its infrastructure very difficult, which in turn could lead to downtime.
Rasmussen told DI: “Our impression is that all involved want to get the numbers right, but there are inconsistent approaches to reporting between registry operators that lead to data that cannot be utilized for meaningful research.”