Second private auction nets $1.2m per gTLD
Only eight new gTLD contention sets were resolved during Innovative Auctions' second round of private auctions this week, and the average winning bid has gone down.
The eight strings sold for a combined $9,651,000, or an average of $1.2 million per string. That’s down from the $1.5 million average reported from the first round of auctions in June.
The overall average winning bid from Innovative’s auctions is now $1.33 million.
Over 100 gTLDs had been committed to the second round by various applicants — which put up 68 strings and wound up winning three — but the auctions can obviously only go ahead if the whole contention set agrees to participate.
According to Innovative, these are the winners this week:
- .guide: Donuts
- .construction: Donuts
- .storage: Extra Space Storage (applying as Self Storage LLC)
- .desi: Desi Networks
- .expert: Donuts
- .fishing: Top Level Domain Holdings
- .casa: Top Level Domain Holdings
- .网址 (.wangzhi): Hu Yi Global
These were all two-applicant contention sets (Go Daddy had originally applied for .casa, but withdrew its application months ago).
Losing applicants — which get to take home the winning bidder’s cash, net of Innovative’s fees — were Demand Media, Afilias, Dot Construction, and Red Circle.
The DI PRO Application Tracker will be updated daily as and when the losing applications are withdrawn. So far, only Donuts’ bid for .casa has had its withdrawal processed by ICANN.
Innovative seemed to blame the low turnout on the August holiday period, and said it has scheduled its third round of auctions for September 10.
Dotless domains are dead
ICANN has banned dotless gTLDs, putting a halt to Google’s plans to run .search as a dotless search service and confounding the hopes of some portfolio applicants.
ICANN’s New gTLD Program Committee, acting with the powers of its board of directors, passed the resolution on Tuesday. It was published this morning. Here’s the important bit (links added):
Resolved (2013.08.13.NG02), in light of the current security and stability risks identified in SAC053, the IAB statement and the Carve Report, and the impracticality of mitigating these risks, the NGPC affirms that the use of dotless domains is prohibited.
The current version of the Applicant Guidebook bans dotless domains (technically, it bans apex A, AAAA and MX records) but leaves the door open for registries to request an exception via Extended Evaluation.
This new decision closes that door.
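For the technically curious, the ban is observable from outside: a dotless gTLD is simply one that answers A, AAAA or MX queries at the zone apex. Here's a minimal sketch using the dnspython library — the TLDs queried are purely illustrative, and a production check would want more robust error handling:

```python
# Minimal sketch: does a TLD publish the apex records the Guidebook bans?
# Requires dnspython (pip install dnspython).
import dns.resolver

def is_dotless(tld: str) -> bool:
    """Return True if the bare TLD answers A, AAAA or MX at its apex."""
    for rdtype in ("A", "AAAA", "MX"):
        try:
            dns.resolver.resolve(tld + ".", rdtype)
            return True  # the apex answered, so the TLD "works" without a dot
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            continue  # no such record (or no such TLD); try the next type
    return False

if __name__ == "__main__":
    for tld in ("search", "com"):
        print(tld, "is dotless:", is_dotless(tld))
```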
The decision comes a week after the publication of Carve Systems’ study of the dotless domain issue, which concluded that the idea was potentially “dangerous” and that if ICANN intended to allow them it should do substantial outreach to hardware and software makers, essentially asking them to change their products.
The Internet Architecture Board said earlier that “dotless domains are inherently harmful to Internet security.”
Microsoft, no doubt motivated in part at least by competitive concerns in the search market, had repeatedly implored ICANN to implement a ban on security grounds.
Google had planned to run .search as a browser service that would allow users to specify preferred search engines. I doubt the dotless ban will impact its application’s chances of approval.
Donuts and Uniregistry, which together have applied for almost 400 gTLDs, had also pushed for ICANN to allow dotless domains, although I do not believe their applications explicitly mentioned such services.
dotShabaka Diary — Day 4
Here’s the fourth installment of dotShabaka Registry’s journal, charting its progress towards becoming one of the first new gTLDs to go live, written by general manager Yasmin Omer.
Friday 16 August 2013
The IBM TMDB webinar was disappointing. We had hoped to gain some much-needed insight into the TMDB system, but instead we left with more questions and concerns. Let’s hope IBM can lift their game for next week’s webinar and that the integration and testing process is clarified.
In other news, it has been a week since the teleconference to discuss the URS Technical Requirements Document and we are still unclear on when the requirements will be finalised, posted and whether they stand on the critical path to our Sunrise. If the discussions during the teleconference are anything to go by, significant work is required by both parties to finalise the document. Implementing the requirements in the URS Technical Requirements Document isn’t as simple as flicking a switch – development efforts will be required. This work needs to start now.
Finally, there are now only a couple of days left in our Pre-Delegation Testing window and so far we have not heard anything; we hope that no news is good news. Following this we expect the PDT service provider will take a couple of weeks to review our results. Fingers crossed!
Still no welcome package.
Read previous and future diary entries here.
NTAG rubbishes new gTLD collision risk report
The New gTLD Applicants Group has slated Interisle Consulting’s report into the risk of new gTLDs causing security problems on the internet, saying the problem is “overstated”.
The group, which represents applicants for hundreds of gTLDs and has a non-voting role in ICANN’s GNSO, called on ICANN to reclassify hundreds of “Uncalculated” risk strings as “Low” risk, meaning they would face neither a substantial delay before, nor uncertainty about, their eventual delegation.
But NTAG said it “agreed” that the high-risk .corp and .home “should be delayed while further studies are conducted”. The current ICANN proposal is actually to reject both of these strings.
NTAG was responding to ICANN’s proposal earlier this month to delay 523 applications (for 279 strings) by three to six months while further studies are carried out.
The proposal was based on Interisle’s study of DNS root server logs, which showed many millions of daily queries for gTLDs that currently do not exist but have been applied for.
The worry is that delegating those strings would cause problems such as downtime or data leakage, where sensitive information intended for a recipient on the same local network would be sent instead to a new gTLD registry or one of its (possibly malicious) registrants.
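To make that leakage scenario concrete, here is a hypothetical sketch — all names invented — of how a stub resolver's suffix search list turns a newly delegated .corp into a trap:

```python
# Hypothetical illustration of name collision via resolver search lists.
# A short internal name is expanded against configured suffixes in order;
# once ".corp" exists as a public gTLD, the first candidate stops returning
# NXDOMAIN and may resolve to a registrant's server instead.

def candidates(short_name: str, search_list: list[str]) -> list[str]:
    """Fully qualified names a stub resolver would try, in order."""
    return [f"{short_name}.{suffix}" for suffix in search_list]

search_list = ["corp", "internal.example.com"]  # a typical enterprise setup
print(candidates("mail", search_list))
# ['mail.corp', 'mail.internal.example.com']
# Pre-delegation:  mail.corp -> NXDOMAIN, resolver falls through as intended.
# Post-delegation: mail.corp may resolve publicly, and traffic meant for the
# internal mail host leaks to whoever registered the colliding name.
```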
NTAG reckons the risk presented by Interisle has been overblown, and it presented a point-by-point analysis of its own. It called for everything except .corp and .home to be categorized “Low” risk, saying:
We recognize that a small number of applied for names may possibly pose a risk to current operations, but we believe very strongly that there is no quantitative basis for holding back strings that pose less measurable threat than almost all existing TLDs today. This is why we urge the board to proceed with the applications classified as “Unknown Risk” using the mitigations recommended by staff for “Low Risk” strings. We believe the 80% of strings classified as “Low Risk” should proceed immediately with no additional mitigations.
The group pointed to a recent analysis by Verisign (which, contrarily, was trying to show that new gTLDs should be delayed) which included data about previous new gTLD delegations.
That report (pdf) said that .xxx was seeing 4,018 look-ups per million queries at the DNS root (PPM) before it was delegated. The number for .asia was 2,708.
If you exclude .corp and .home, both of those PPM numbers are several times larger than the equivalent query volumes for every applied-for gTLD today, also according to Verisign’s data.
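For clarity on the metric: PPM is just a TLD's share of observed root-server queries scaled to one million. The raw counts below are invented, but they reproduce the rates Verisign reported:

```python
# "Per million" (PPM) normalizes a TLD's look-up count against total root
# traffic. Counts here are invented for illustration only.
def ppm(tld_queries: int, total_queries: int) -> float:
    return tld_queries / total_queries * 1_000_000

print(ppm(4_018, 1_000_000))    # 4018.0 PPM -- .xxx before delegation
print(ppm(27_080, 10_000_000))  # 2708.0 PPM -- .asia before delegation
```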
NTAG said:
None of these strings pose any more risk than .xxx, .asia and other currently operating TLDs.
…
the least “dangerous” current gTLD on the chart, .sx, had 331 queries per million in 2006. This is a higher density of NXDOMAIN queries than all but five proposed new TLDs. Again, .sx was launched successfully in 2012 with none of the problems predicted in these reports.
Verisign’s report, which sought to provide a more qualitative risk analysis based on some data-supported guesses about where the error traffic is coming from and why, anticipated this interpretation.
Verisign said:
This could indicate that there is nothing to worry about when adding new TLDs, because there was no global failure of DNS when this was done before. Alternately, one might conclude that traffic volumes are not the only indicator of risk, and the semantic meaning of strings might also play a role. We posit that in some cases, those strings with semantic meanings, and which are in common use (such as in speech, writing, etc.) pose a greater risk for naming collision.
The company spent most of its report making somewhat tenuous correlations between its data (such as a relatively large number of requests for .medical from Japanese IP addresses) and speculative impacts (such as “undiagnosed system failures” at “a healthcare provider in Japan”).
NTAG, by contrast, is playing down the potential for negative outcomes, saying that in many cases the risks introduced by new gTLDs are no different from collision risks at the second level in existing TLDs.
Just as the NTAG would not ask ICANN to halt .com registrations while a twelve month study is performed on these problems, we believe there is no reason to introduce a delay in diversifying the Internet’s namespace due to these concerns.
While it stopped short of alleging shenanigans this time around, NTAG also suggested that future studies of root server error traffic could be gamed if botnets were engaged to crapflood the roots.
Its own mitigation plan, which addresses Interisle’s specific concerns, says that most of the reasons that non-existent TLDs are being looked up are either not a problem or can be easily mitigated.
For example, it says that queries for .youtube that arrived in the form of a request for “www.youtube” are probably browser typos and that there’s no risk for users if they’re taken to the YouTube dot-brand instead of youtube.com.
In another example, it points out that requests for “.cisco” or “.toshiba” without any second-level domains won’t resolve anyway, if dotless domains are banned in those TLDs. (NTAG, which has influential members in favor of dotless domains, stopped short of asking for a blanket ban.)
The Interisle report, and ICANN’s proposal to deal with it, are open for public comment until September 17. NTAG’s response is remarkably quick off the mark, for guessable reasons.
Verisign confirms .gov downtime, blames algorithm
Verisign this morning confirmed yesterday’s reports that the .gov top-level domain went down for some internet users due to a DNSSEC problem, which it said was related to an algorithm change.
In a posting to various mailing lists, Verisign principal engineer Duane Wessels said:
On the morning of August 14, a relatively small number of networks may have experienced an operational disruption related to the signing of the .gov zone. In preparation for a previously announced algorithm rollover, a software defect resulted in publishing the .gov zone signed only with DNSSEC algorithm 8 keys rather than with both algorithm 7 and 8. As a result .gov name resolution may have failed for validating recursive name servers. Upon discovery of the issue, Verisign took prompt action to restore the valid zone.
Verisign plans to proceed with the previously announced .gov algorithm rollover at the end of the month with the zone being signed with both algorithms for a period of approximately 10 days.
This clarifies that the problem was slightly different to what had been assumed yesterday.
It was related to change of the cryptographic algorithm used to create .gov’s DNSSEC keys, a relatively rare event, rather than a scheduled key rollover, which is a rather more frequent occurrence.
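Because resolvers see only the published DNSKEY RRset, the mismatch was observable from outside. Here's a minimal dnspython sketch that lists which signing algorithms a zone is advertising — nothing about it is .gov-specific:

```python
# Minimal sketch: which DNSSEC algorithms does a zone's DNSKEY RRset carry?
# During a dual-signed rollover .gov should show both 7 (RSASHA1-NSEC3-SHA1)
# and 8 (RSASHA256); the defect published algorithm 8 keys only.
import dns.resolver

def zone_algorithms(zone: str) -> set[int]:
    answer = dns.resolver.resolve(zone, "DNSKEY")
    return {int(rr.algorithm) for rr in answer}

print(zone_algorithms("gov."))  # e.g. {8} once the rollover is complete
```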
The problem would only have made .gov domains (and consequently web sites, email, etc.) inaccessible for users of networks where DNSSEC validation is strictly enforced, which is still a relatively small population.
The US ISP with the strongest support for DNSSEC is Comcast. Since turning on its validators it has reported dozens of instances of DNSSEC failing — mostly in second-level .gov domains, where DNSSEC is mandated by US policy.
On two other occasions Comcast has blogged about the whole .gov TLD failing DNSSEC validation due to problems keeping keys up to date.
The general problem is widespread enough, and the impact severe enough, that Comcast has had to create an entirely new technology to prevent borked key rollovers making web sites go dark for its customers.
Called Negative Trust Anchors, it’s basically a Band-Aid that allows the ISP to deliberately ignore DNSSEC on a given domain while it waits for that domain’s owner to sort out its key problem.
The technology was created following the widely reported nasa.gov outage last year.
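Negative Trust Anchors live inside the resolver, but the effect can be approximated from the outside with the DNS CD (“checking disabled”) bit: a validating resolver that would return SERVFAIL for a domain with a broken DNSSEC chain will still answer a query that has CD set. A minimal dnspython sketch; the resolver address and domain are illustrative:

```python
# Query with the CD bit set, asking the upstream validating resolver to
# skip DNSSEC validation -- roughly what an NTA does for a whole domain.
import dns.flags
import dns.message
import dns.query

def query_without_validation(name: str, resolver_ip: str = "8.8.8.8"):
    q = dns.message.make_query(name, "A")
    q.flags |= dns.flags.CD  # "checking disabled"
    return dns.query.udp(q, resolver_ip, timeout=5)

resp = query_without_validation("nasa.gov")
print(resp.answer)  # answers even if the zone would fail validation
```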
It’s really little wonder that so few organizations are interested in deploying DNSSEC today.
Yesterday’s .gov problem may have been minor, lasting only an hour or two, but had the affected TLD been .com, and had DNSSEC deployment been more widespread, everyone on the planet would have noticed.
Under ICANN contract, DNSSEC is mandatory for new gTLDs at the top level, but not the second level.
Reports: .gov fails due to DNSSEC error
The .gov top-level domain suffered a DNSSEC problem today and was unavailable to some internet users, according to reports.
According to mailing lists and the SANS Internet Storm Center, it appeared that .gov rolled one of its DNSSEC keys without telling the root zone about the update.
This meant that anyone whose DNS servers do strict DNSSEC validation — a relatively small number of networks — would have been unable to access .gov web sites, email and other resources.
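The failure mode boils down to a consistency check between parent and child: the DS records the root publishes for a TLD must match the key tags of DNSKEYs the TLD actually serves. A minimal dnspython sketch of that check, assuming a DNSSEC-aware resolver on the default path:

```python
# Minimal sketch: does at least one DS record at the parent point at a
# DNSKEY the zone is currently serving? A key rolled without updating the
# parent's DS set breaks this, and validating resolvers return SERVFAIL.
import dns.dnssec
import dns.resolver

def chain_intact(zone: str) -> bool:
    ds_tags = {rr.key_tag for rr in dns.resolver.resolve(zone, "DS")}
    dnskey_tags = {dns.dnssec.key_id(rr)
                   for rr in dns.resolver.resolve(zone, "DNSKEY")}
    return bool(ds_tags & dnskey_tags)

print(chain_intact("gov."))  # True once keys and DS records agree again
```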
As a matter of policy, all second-level .gov domains have to be DNSSEC-signed.
The problem was corrected quite quickly — looks like within an hour or two — but as SANS noted, caching issues may prolong the impact.
Both .gov and the root zone are managed by Verisign, which isn’t on the best of terms with the US government at the moment.
dotShabaka Diary — Day 3
Here’s the third installment of dotShabaka Registry’s journal, charting its progress towards becoming one of the first new gTLDs to go live, written by general manager Yasmin Omer.
Wednesday 14 August 2013
Our Pre-Delegation Testing (PDT) continues. The latest ICANN-published timeframe shows a 30-day duration running to 30 August. Previous communications indicated it would take 14 days plus rectification (if required) and the PDT ‘clock’ is counting down 21 days. When will it end?
We now have access to the TMDB and have received the initial Registration Token. We have run some internal tests and it all looks OK. So what next? We will attend the TMDB webinar today and hopefully the TMDB integration and testing process will be defined. Stay tuned.
According to ICANN we will receive a ‘new Registry’ Welcome Pack soon. I suspect we are ‘ahead of the curve’ in terms of the timing of this pack and other applicants will receive this information once the Agreement is signed.
In other news, ICANN have published IOC, Red Cross and Red Crescent reserved lists in multiple languages, but the IGO list has not been defined. Is ICANN going to publish a list of countries (in six official United Nations languages) or is every Registry going to generate their own list with their own rules? I guess we’ll have to wait and see.
Read previous and future diary entries here.
First string confusion decisions handed down, Verisign loses against .tvs
The International Centre for Dispute Resolution has started delivering its decisions in new gTLD String Confusion Objections, and we can report that Verisign has lost at least one case.
ICDR expert Stephen Strick delivered a brief, five-page ruling in the case of Verisign vs. T V Sundram Iyengar & Sons yesterday, ruling that .tvs is not confusingly similar to .tv.
TVS is a $6-billion-a-year, 100-year-old Indian conglomerate, while .tv is the ccTLD for Tuvalu, which Verisign manages because of its similarity of meaning to “television”.
It’s impossible to glean from the decision (pdf) what Verisign’s argument comprised. The summary is just two sentences long.
But TVS, in response, appears to have relied to an extent on the “DuPont factors”, a 13-point test for trademark confusion that came out of a 1973 case in the US.
That’s the same precedent that has been found relevant in many Legal Rights Objections in cases handled by WIPO.
The “discussion and reasons for determination” section of the .tvs decision, in which Strick found that confusion was possible but not “probable”, amounts to just four sentences.
Here’s almost all of it. Emphasis in original:
in order for the Objector to prevail, Objector must prove that the co-existence of the two TLDs in question would probably result in user confusion. Given the analysis of the thirteen factors cited by Applicant derived from the DuPont case cited above, I find that Objector has failed to meet its burden of proof regarding the probability of such confusion. I note that while the co-existence of the two TLDs that are the subject of this proceeding may result in confusion by users, Objector has failed to meet its burden of proof to establish the likelihood or probability that users will be confused.
In considering parties’ arguments, I was persuaded, in part, by Applicant’s arguments relating to the commercial impression of the TVS TLD, including the proof offered by Applicant as to the longevity of the TVS brand, the limited nature of the gTLD’s intended use, the dissimilarity of the goods or services associated respectively with the two strings, ie TVS’s association with automobile products, the fact that TVS’s brand is associated with capital letters (whereas Objector’s .tv is in lower case), the fact that TVS is well known and associated with its companys’ [sic] brands, the lengthy market interface and the long historical co-existence of TVs and tv without evidence of confusion in the marketplace.
The geeks among you will no doubt be screaming at your screen right now: “WTF? He thought CASE was relevant?”
Yes, apparently the fact that the TVS trademark is in upper case makes a difference, despite the fact that the DNS is completely case-insensitive. Bit of a head-scratcher.
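For what it's worth, the case-insensitivity is trivially demonstrable — DNS name comparison ignores ASCII case by specification (RFC 4343), and dnspython follows suit:

```python
# "TVS" and "tvs" are the same name on the wire; comparison is
# case-insensitive per RFC 4343. The domain is illustrative.
import dns.name

print(dns.name.from_text("TVS.example.") == dns.name.from_text("tvs.example."))
# True
```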
I understand several more decisions have also been sent to applicants and objectors, but they’re not yet publicly available.
The ICDR’s web site for new gTLD decisions has been down for several days, returning 404 errors.
CentralNic earmarks IPO money for new gTLDs
CentralNic this morning formally confirmed that it plans to float on the Alternative Investment Market in London and said the money raised will help it buy stakes in new gTLDs.
The London-based company plans to hit the market at the beginning of September. CEO Ben Crawford told The Telegraph yesterday that the company hopes to raise £5 million ($7.7 million) with the IPO.
CentralNic said in a press release this morning:
The Directors believe that the funds raised for the Group by the placing of shares will allow the Group to enhance its global distribution network, acquire interests in new gTLDs, expand its own retail business and obtain contracts from governments to operate their country code TLDs (“ccTLDs”), especially in developing markets.
While the company is best-known for running pseudo-gTLDs such as us.com and uk.com, it also provides the back-end for the repurposed ccTLDs .la and .pw and has 60 new gTLD back-end contracts, 25 of which are uncontested.
Crawford said in the press release:
We are profitable, debt free, asset backed and about to capitalise on the major changes being made to the internet with the influx of new TLDs. We already have in place the required IT infrastructure and global retailer network. We have also been awarded a significant number of new TLD contracts so the Company is confident of expanding rapidly.
According to The Telegraph, the IPO could value the company at £30 million ($46.4 million).
The Alternative Investment Market is the low-cap little brother to the London Stock Exchange. CentralNic will be the second registry, after Top Level Domain Holdings, to list there.
New US trademark rules likely to exclude many dot-brand gTLDs
The US Patent and Trademark Office plans to allow domain name registries to get trademarks on their gTLDs.
Changes proposed this week seem to be limited to dot-brand gTLDs and would not appear to allow registries for generic strings — not even “closed” generics — to obtain trademarks.
But the rules are crafted in such a way that single-registrant dot-brands might be excluded.
Under existing USPTO policy, applications for trademarks that consist solely of a gTLD cannot be approved, because they don’t identify the source of goods and services.
If “.com” were a trademark, one might have to assume that the source of Amazon.com’s services was Verisign, which is plainly not the case.
But the new gTLD program has invited in hundreds of gTLDs that exactly match existing trademarks. The USPTO said:
Some of the new gTLDs under consideration may have significance as source identifiers… Accordingly, the USPTO is amending its gTLD policy to allow, in some circumstances, for the registration of a mark consisting of a gTLD for domain-name registration or registry services
In order to have a gTLD trademark approved, the applicant would have to pass several tests, substantially reducing the number of marks that would get the USPTO’s blessing.
First, only companies that have signed a Registry Agreement with ICANN would be able to get a gTLD trademark. That should continue to prohibit “front-running”, in which a gTLD applicant tries to secure an advantage during the application process by getting a trademark first.
Second, the registry would have to own a prior trademark for the gTLD string in question. It would have to exactly match the gTLD, though the dot would not be considered.
It would have to be a word mark, without attached disclaimers, for the same types of goods and services that web sites within the gTLD are supposed to provide.
What this seems to mean is that registries would not be able to get trademarks on closed generics.
You can’t get a US trademark on the word “cheese” if you sell cheese, for example, but you can if you sell a brand of T-shirts called Cheese.
So you could only get a trademark on “.cheese” as a gTLD if the class was something along the lines of “domain name registration services for web sites devoted to selling T-shirts”.
Third, registries would have to present a bunch of other evidence demonstrating that their brand is already so well-known that consumers will automatically assume they also own the gTLD:
Because consumers are so highly conditioned and may be predisposed to view gTLDs as non-source indicating, the applicant must show that consumers already will be so familiar with the wording as a mark, that they will transfer the source recognition even to the domain name registration or registry services.
Fourth, and here’s the kicker, the registry would have to show it provides a “legitimate service for the benefit of others”. The USPTO explained:
To be considered a service within the parameters of the Trademark Act, an activity must, inter alia, be primarily for the benefit of someone other than the applicant.
…
While operating a gTLD registry that is only available for the applicant’s employees or for the applicant’s marketing initiatives alone generally would not qualify as a service, registration for use by the applicant’s affiliated distributors typically would.
In other words, a .ford as a single-registrant gTLD would not qualify for a trademark, but a .ford that allowed its dealerships around the world to register domains would.
That appears to exclude many dot-brand applicants. In the current batch, most dot-brands expect to be the sole registrant as well as the registry, at least at first.
Some applications talk in vague terms about also opening up their namespace to affiliates, but in most applications I’ve read that’s a wait-and-see proposition.
The new USPTO rules, which are open for comment to people who have registered with its web site, would appear to apply to a very small number of applicants at this stage.