Uniregistry changes emails after “renewal scam” complaints

Kevin Murphy, February 2, 2018, Domain Registrars

Uniregistry has modified its marketing emails after customers complained they looked like fake renewal “scams”.
One customer contacted DI last week to say they were “horrified” to receive pitches for cheap SSL certificates that “read like some of the worst domain expiration scams of the past”.
The company recently started reselling Comodo’s SSL certs as part of its plan to broaden its customer base beyond its roots in the domain investor community.
But the way these certs were marketed left more than one customer with concerns. One email, which I’ve lightly redacted, read as follows:

Dear [CUSTOMER],
FINAL NOTICE – Your SSL certificate for your domain has expired. Take action and renew your certificate today through Uniregistry.
If your SSL certificate expires your website will display a warning informing customers the site is not secure.
We’ve teamed up with Comodo CA to offer our valued customers discounts up to 78% off when they renew their SSL certificate through us.
Visit https://www.comodo.com/uniregistry/ to take advantage of this offer and renew your certificate before it expires.
Domains at Risk :
[LIST OF DOMAINS]
Average validation time is less than an hour could take longer. Don’t let your certificate expire and put your business at risk. We are here to help, contact one of our SSL Specialist for more information or if you need additional support.
Thank you for choosing Uniregistry and Comodo CA

The reader said that while they have some domains with Uniregistry, their SSL certs had been bought elsewhere.
They added that the certs had not “expired” as the email claimed and said that they were not due to expire for months.
In addition, the email is quite clearly asking the customer to “renew” their cert via Uniregistry and Comodo, which should not be possible if the current cert was bought from a different Certificate Authority. It’s actually a solicitation to buy a new cert.
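For what it's worth, a registrant doesn't have to take a marketing email's word for when a cert expires or who issued it. Below is a minimal Python sketch, purely my own illustration and nothing to do with Uniregistry's or Comodo's tooling, that pulls the certificate a site is actually serving and prints its issuer and expiry date:

import socket
import ssl
from datetime import datetime

def cert_summary(hostname, port=443):
    # Fetch the certificate the site is actually serving over TLS.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # getpeercert() returns the issuer as nested name/value pairs and the
    # expiry as a string like "May 30 00:00:00 2020 GMT".
    issuer = dict(pair[0] for pair in cert["issuer"])
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    print(f"{hostname}: issued by {issuer.get('organizationName')}, expires {expires:%Y-%m-%d}")

cert_summary("example.com")  # substitute the domain you were warned about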
The scare-tactics wording is reminiscent of the old “slamming” scams carried out by Brandon Gray Internet Services, going under the moniker Domain Registry Of America and similar, until ICANN terminated its contract in 2014.
These “fake renewal” scams were delivered in the form of final-demand invoices, but were in fact solicitations to transfer domains, at a huge premium, from their current registrar to the scammer’s registrar.
A major difference between the DROA scam and Uniregistry’s marketing is that Uniregistry only contacted its existing customers. It was not spamming SSL owners at random.
Uniregistry told DI that the emails in question were part of an "A/B test" (in which a company sends two versions of an email to different sets of customers to see which gets the better response rate) and went out to a "small number" of its customers.
Chief operating officer Kanchan Mhatre said in an email:

The initial content sent came from a previous campaign and it’s fair to say that it needed modifying to more accurately reflect what we were trying to convey. Based on the feedback received from you and other customers, we have modified the messaging and we are currently reviewing cert expiry date validation to ensure that we communicate with our customers in a timely manner.

GMO offers free SSL with .shop domains

Kevin Murphy, November 14, 2016, Domain Registries

GMO Registry is to offer .shop domain registrants a free one-year SSL certificate with every purchase.
The company said yesterday that the deal, made via sister certificate company GMO GlobalSign, should be in place by the end of the month.
The certs on offer appear to be of the low-end "Domain Validation" variety.
Nevertheless, GlobalSign usually sells them for over $150 per year, many times more expensive than .shop domains themselves.
Popular registrars are currently selling .shop names from $10 to $25.
There are about 90,000 domains in .shop’s zone file today.
That’s a goodish volume by new gTLD standards, but probably not good enough to help GMO recoup the $41.5 million it paid for .shop at auction any time soon.
Upsell opportunities such as the SSL offer, assuming they get any uptake, may help accelerate its path to breakeven.

New gTLDs are the new Y2K: .corp and .home are doomed and everything else is delayed

Kevin Murphy, August 6, 2013, Domain Registries

The proposed gTLDs .home and .corp create risks to the internet comparable to the Millennium Bug, which terrorized a burgeoning internet at the turn of the century, and should be rejected.
Meanwhile, every other gTLD that has been applied for in the current round could be delayed by months in order to mitigate the risks they pose to internet users.
These are the conclusions ICANN has drawn from Interisle Consulting’s independent study into the problems that could be caused when new gTLDs clash with widely-used internal naming systems.
The extensive study, which drew on 8TB of traffic data provided by 11 of the 13 DNS root server operators, is 197 pages long and absolutely fascinating. It was published by ICANN today.
As Interisle CEO Lyman Chapin reported at the ICANN meeting in Durban a few weeks ago, the large majority of TLDs that have been applied for in the current round already receive large amounts of error traffic:

Of the 1,409 distinct applied-for TLD strings, 1,367 appeared at least once in the 2013 DITL [Day In the Life of the Internet] data with the string at the TLD position.

We’ve previously reported on the volume of queries new gTLDs get, such as the fact that .home gets half a billion hits a day and that 3% of all requests were for strings that have been applied for in the current round.
The extra value in Interisle’s report comes when it starts to figure out how many end points are making these requests, and how many second-level domains they’re looking for.
These are vitally important factors for assessing the scale of the risk of each TLD.
Again, .home and .corp appear to be the most dangerous.
Interisle capped the number of second-level domains it counted in the 2013 data at 100,000 per TLD per root server — 1,100,000 domains in total — and .home was the only TLD string to hit this cap.
Cisco Systems’ proposed .cisco TLD came close, hitting the cap at all but one of the 11 root servers providing data, while .box and .iinet (both also widely used on home routers) hit the cap on at least one root server.
The lowest count of second-level domains of the 35 listed in the report came from .hsbc, the bank brand, but even that number was a not-inconsiderable 2,000.
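To make the counting concrete, here is a rough sketch of what tallying capped second-level domains per applied-for TLD might look like. This is my own illustration, not Interisle's code; the 100,000 cap comes from the report, but the log format (one queried name per line) and the sample strings are assumptions:

from collections import defaultdict

CAP = 100_000  # per-TLD cap described in the Interisle report
APPLIED_FOR = {"home", "corp", "cisco", "box", "iinet", "hsbc"}  # sample strings only

def count_slds(query_log_lines, tlds=APPLIED_FOR, cap=CAP):
    # For each applied-for TLD, collect distinct second-level labels seen
    # in root query logs, stopping once the cap is reached.
    slds_seen = defaultdict(set)
    for line in query_log_lines:
        labels = line.strip().rstrip(".").lower().split(".")
        if len(labels) < 2:
            continue
        tld, sld = labels[-1], labels[-2]
        if tld in tlds and len(slds_seen[tld]) < cap:
            slds_seen[tld].add(sld)
    return {tld: len(s) for tld, s in slds_seen.items()}

print(count_slds(["router.home.", "nas.home.", "intranet.corp."]))  # {'home': 2, 'corp': 1}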
Why are these requests being made?
Surprisingly, interactions between a security feature in Google’s own Chrome browser and common residential routers appear to be the biggest cause of queries for non-existent TLDs.
That issue, which impacts mainly .home, accounts for about 46% of the requests counted, according to the report.
In second place, with 15% of the queries, are requests for real domain names that appear to have had a non-existent TLD — again, usually .home — appended by a residential router or cable modem.
Apparent typos — where a user enters a URL but forgets to type the TLD — were a relatively small percentage of requests, coming in at under 1% of queries.
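The mechanism behind most of that traffic is mundane: a router or stub resolver with a DNS search suffix configured appends that suffix to names it considers incomplete, so a random single-label probe from Chrome leaves the home network as a query under .home and eventually hits the root. Here is a toy sketch of the behaviour, with the suffix and probe length as assumptions rather than figures from the report:

import random
import string

SEARCH_SUFFIX = "home"  # suffix handed out by many residential routers
NDOTS = 1               # names with fewer dots than this get the suffix appended

def names_tried(name):
    # Mimic basic DNS search-list behaviour: try the suffixed form first
    # for "incomplete" names, then the name as typed.
    if name.count(".") < NDOTS:
        return [f"{name}.{SEARCH_SUFFIX}", name]
    return [name]

chrome_probe = "".join(random.choices(string.ascii_lowercase, k=10))
print(names_tried(chrome_probe))       # ['<random>.home', '<random>'] -- leaks to the root
print(names_tried("www.example.com"))  # ['www.example.com'] -- already fully qualified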
The study also found that bad requests come from many thousands of sources. This table compares the number of requests to the number of sources.
[Table: query counts and /24 source prefix counts per applied-for TLD string]
The “Count” column is the number, in thousands, of requests for each TLD string. The “Prefix Count” column is the number of sources providing this traffic, counted by /24 IP address block (each of which covers up to 256 potential hosts).
As you can see, there’s not necessarily a correlation between the number of requests a TLD gets and the number of people making the requests — .google gets queried by more sources than the others, but it’s only ranked 24 in terms of overall query volume, for example.
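The prefix aggregation itself is straightforward. Assuming a list of source IPs per TLD string, which is my illustration rather than anything from the report, counting /24 blocks looks like this:

import ipaddress

def prefix_counts(sources_by_tld):
    # Collapse each source IPv4 address to its /24 block and count the
    # distinct blocks per TLD string.
    counts = {}
    for tld, ips in sources_by_tld.items():
        blocks = {ipaddress.ip_network(f"{ip}/24", strict=False) for ip in ips}
        counts[tld] = len(blocks)
    return counts

sample = {"home": ["192.0.2.10", "192.0.2.200", "198.51.100.7"]}
print(prefix_counts(sample))  # {'home': 2} -- two addresses share the same /24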
Interisle concluded from all this that .corp and .home are simply too dangerous to delegate, comparing the problem to the year 2000 bug, where a global effort was required to make sure software could support the four-digit dating scheme required by the turn of the century.
Here’s what the report says about .corp:

users could be taken to the wrong web site (and possibly be exposed to phishing attacks) or told that web sites do not exist when they do, depending on how the .corp TLD is resolved. A corporate mail system might attempt to deliver email to the wrong server, and this could expose sensitive or confidential information to someone who was not supposed to receive it. In essence, everything deployed in the private network would need to be checked.
There are no easy solutions to these problems. In an ideal world, the operators of these private networks would get a timely notification of the new TLD’s delegation and then take action to address these issues. That seems very improbable. Even if ICANN generated sufficient publicity about the new TLD’s delegation, there is no guarantee that this will come to the attention of the management or operators of the private networks that could be jeopardized by the delegation.

It seems reasonable to estimate that the amount of effort involved might be comparable to a wholesale renumbering of the internal network or the Y2K problem.

It notes that applied-for TLDs such as .site, .office, .group and .inc appear to be used in similar ways to .home and .corp, but do not appear to present as broad a risk.
To be clear, the risk we’re talking about here isn’t just people typing the wrong things into browsers, it’s about the infrastructure on many thousands of private networks starting to make the wrong security assumptions about domain names.
ICANN, in response, has outlined a series of measures sure to infuriate many gTLD applicants, but which are consistent with its goal to protect the security and stability of the internet.
They’re also consistent with some of the recommendations put forward by Verisign over the last few months in its campaign to show that new gTLDs pose huge risks.
First, .corp and .home are dead. These two strings have been categorized “high risk” by ICANN, which said:

Given the risk level presented by these strings, ICANN proposes not to delegate either one until such time that an applicant can demonstrate that its proposed string should be classified as low risk

Given the Y2K-scale effort required to mitigate the risks, and the fact that the eventual pay-off wouldn’t compensate for the work, I feel fairly confident in saying the two strings will never be delegated.
Another 80% of the applied-for strings have been categorized “low risk”. ICANN has published a spreadsheet explaining which string falls into which category. Low risk does not mean they get off scot-free, however.
First, registries for low-risk strings will not be allowed to activate any domain names in their gTLDs for 120 days after contract signing.
Second, for 30 days after a gTLD is delegated the new registries will have to reach out to the owners of each IP address that attempts to query names in that gTLD, to try to mitigate the risk of internal name collisions.
This, as applicants will no doubt quickly argue, is going to place them under a massive cost burden.
But their outlook is considerably brighter than that of the remaining 20% of applications, which are categorized as “uncalculated risk” and face a further three to six months of delay while ICANN conducts further studies into whether they’re each “high” or “low” risk strings.
In other words, the new gTLD program is about to see its biggest shake-up since the GAC delivered its Advice in Beijing, adding potentially millions in costs and delays for applicants.
ICANN’s proposed mitigation efforts are now open for public comment.
One has to wonder why the hell ICANN didn’t do this study two years ago.

Is the .home new gTLD doomed? ICANN poses study of security risks

Kevin Murphy, May 22, 2013, Domain Tech

ICANN has set up a study into whether certain applied-for new gTLD strings pose a security risk to the internet, admitting that some gTLDs may be rejected as a result.
Its board of directors on Saturday approved new research into the risk of new gTLD clashes with “internal name certificates”, saying that the results could kill off some gTLD applications.
In its rationale, the board stated:

it is possible that study might uncover risks that result in the requirement to place special safeguards for gTLDs that have conflicts. It is also possible that some new gTLDs may not be eligible for delegation.

Internal name certificates are the same digital certificates used in secure, web-based SSL transactions, but assigned to domain names in private, non-standard namespaces.
Many companies have long used non-existent TLDs such as .corp, .mail and .home on their private networks and quite often they obtain SSL certs from the usual certificate authorities in order to enable encryption between corporate resources and their internal users.
The problem is that browsers and other applications on laptops and other mobile devices can attempt to access these private namespaces from anywhere, not only from the local network.
If ICANN should set these TLD strings live in the authoritative DNS root, registrants of clashing domain names might be able to hijack traffic intended for secure resources and, for example, steal passwords.
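Network administrators who want to know whether they are exposed can at least check their internal naming against the applied-for strings. Here is a minimal sketch, with a hypothetical hostname inventory and only a hand-picked sample of applied-for strings:

# Sample of applied-for strings only; ICANN publishes the full list.
APPLIED_FOR = {"home", "corp", "mail", "site", "office", "group", "inc"}

internal_names = [   # hypothetical inventory of names used on a private network
    "intranet.corp",
    "imap.mail",
    "wiki.example.com",
]

for name in internal_names:
    tld = name.rstrip(".").rsplit(".", 1)[-1].lower()
    if tld in APPLIED_FOR:
        print(f"collision risk: {name} (.{tld} has been applied for as a gTLD)")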
That’s obviously a worry, but it’s one that did not occur to ICANN’s Security and Stability Advisory Committee until late last year, when it immediately sought out the help of the CA/Browser Forum.
It turned out that the CA/Browser Forum, an alliance of certificate authorities and browser makers, was already on the case. It has put in place new rules stating that certificates issued for private TLDs that match new gTLDs will be revoked 120 days after ICANN signs a contract with the new gTLD registry.
But it’s still not entirely clear whether this will sufficiently mitigate the risk. Not every CA is a member of the Forum, and some enterprises might find a 120-day revocation window challenging to work with.
Verisign recently highlighted the internal certificate problem, along with many other potential risks, in an open letter to ICANN.
But both ICANN CEO Fadi Chehade and SSAC chair Patrik Fältström have said that the potential security problems are already being addressed and are not a reason to delay new gTLDs.
The latest board resolution appears to modify that position.
The board has now asked CEO Fadi Chehade and SSAC to “consider the potential security impacts of applied-for new-gTLD strings in relation to this usage.”
The Root Server System Advisory Committee and the CA/Browser Forum will also be tapped for data.
While the study will, one assumes, not be limited to any specific applied-for gTLD strings, it’s well known that some strings are more risky than others.
The root server operators already receive vast amounts of erroneous DNS traffic looking for .home and .corp, for example. If any gTLD applications are at risk, it’s those.
There are 10 remaining applications for .home and five for .corp.

Browser makers brush me off on DNSSEC support

Kevin Murphy, July 29, 2010, Domain Tech

A couple of weeks back, I emailed PR folk at Microsoft, Mozilla, Google and Opera, asking if they had any plans to provide native support for DNSSEC in their browsers.
As DNS uber-hacker Dan Kaminsky and ICANN president Rod Beckstrom have been proselytizing this week at the Black Hat conference, support at the application layer is the next step if DNSSEC is to quickly gain widespread traction.
The idea is that one day the ability to validate DNSSEC messages will be supported by browsers in much the same way as SSL certificates are today, maybe by showing the user a green address bar.
CZ.NIC has already created a DNSSEC validator plugin for Firefox that does precisely that, but as far as I can tell there’s no native support for the standard in any browser.
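To give a sense of what even the most basic application-layer awareness involves, here's a rough Python sketch using the third-party dnspython package. It is my own illustration, unrelated to the CZ.NIC plugin: it asks a validating resolver for a record with the DNSSEC-OK bit set and checks whether the answer came back flagged as authenticated. True native validation in a browser would verify the signature chain itself rather than trusting this flag. The resolver address and test name are examples:

import dns.flags
import dns.message
import dns.query

def resolver_says_validated(name, resolver="8.8.8.8"):
    # Ask for DNSSEC records and check the AD (Authenticated Data) flag,
    # which a validating resolver sets when the chain of trust checks out.
    query = dns.message.make_query(name, "A", want_dnssec=True)
    response = dns.query.udp(query, resolver, timeout=5)
    return bool(response.flags & dns.flags.AD)

print(resolver_says_validated("example.com"))  # assumes the name is in a signed zone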
These are the responses I received:

Mozilla: “Our team is heads down right now with Firefox 4 beta releases so unfortunately, I am not going to be able to get you an answer.”

Microsoft: “At this stage, we’re focusing on the Internet Explorer 9 Platform Preview releases. The platform preview is a developer and designer scoped release of Internet Explorer 9, and is not feature complete, we will have more to share about Internet Explorer 9 in the future.”
Google: No reply.
Opera: No reply.

In 11 years of journalism, Apple’s PR team has never replied to any request for information or comment from me, so I didn’t bother even trying this time around.
But the responses from the other four tell us one of two things:

  • Browser makers haven’t started thinking about DNSSEC yet.

Or…

  • Their PR people were just trying to brush me off.

I sincerely hope it’s the former, otherwise this blog post has no value whatsoever.

Will VeriSign change its name?

VeriSign’s $1.3 billion sale of its SSL business to Symantec yesterday means not only that the company will be almost entirely focussed on domain names, but also that it will no longer “sign” anything.
The word “VeriSign” will cease to describe what the company does, so will it change its name?
The idea could make sense, given that the services Symantec bought are all about trusting the VeriSign brand, and Symantec has acquired certain rights to use that brand.
Under the deal, Symantec is allowed to use the VeriSign name in authentication services such as the VeriSign Trust Seal. The company plans to incorporate “VeriSign” into a new Symantec trust logo.
VeriSign boss Mark McLaughlin said on a conference call yesterday that Symantec is buying certain VeriSign trademarks, such as Thawte and GeoTrust, but that VeriSign will stay VeriSign.
Symantec will be able to use the VeriSign brand in its logos for a “transition period of time over a number of years”, McLaughlin said.
On the one hand, there’s a potential for a certain degree of confusion that might persuade VeriSign to brand itself afresh. On the other, corporate rebranding is not cheap.
I suppose, if it does choose to rename itself, it had better hope that its first choice of .com is available.

VeriSign poised to sell SSL business to Symantec

Reliable news sources including the Wall Street Journal and Reuters are reporting that VeriSign is on the verge of offloading its market-leading SSL certificate business to Symantec for over $1 billion.
The sale would be the latest in a series of spin-offs that started in 2007, highlighting the company’s renewed focus on domain names.
VeriSign spent many years acquiring a bunch of companies in tenuously related markets – deals that never really made any sense to me – and the last few years selling them off again.
But SSL is not really in the same category as VeriSign’s bizarre forays into, for example, the Crazy Frog ringtone company. It’s the business the company was founded on when it was spun out of RSA Security 15 years ago.
It’s called VeriSign for a reason.
But offloading the SSL business would make sense. One of the reasons VeriSign bought Network Solutions ten years ago was the obvious retail synergy between domain names and SSL certificates – customers could buy both at the same time.
That synergy was diluted when VeriSign spun the NSI registrar business out as a separate company three years later, creating the vertically separated domain name market we know today.
Symantec, with its fingers in the enterprise and home/small business pies, might be able to make a better crack at the SSL game.
So is this bad news for SSL’s current silver medal holder, Go Daddy?
Possibly. Symantec is a force to be reckoned with – only marketing prowess could explain why so many people use Norton.
Of course, these news stories could be nonsense.
But my gut says they’re probably based on the kind of leak companies often float to the press when they’re in the final stages of negotiations, to see what the markets do.