.sexy may be blocked in Iran

Kevin Murphy, September 16, 2015, Domain Tech

Some networks in Iran appear to be systematically blocking Uniregistry’s .sexy gTLD.

That’s one of the conclusions of a slightly odd experiment commissioned by ICANN.

The newly published report, “An Analysis of New gTLD Universal Acceptance”, was conducted by APNIC Labs. The idea was to figure out whether there are any issues with new gTLDs on the internet’s DNS infrastructure.

It concluded that there are none: new gTLDs work just fine on the internet’s plumbing.

However, the survey, which comprised over 100 million DNS resolution attempts, found that “One country, Iran, shows some evidence of a piecemeal block of Web names within the .sexy gTLD.”

The sample size for Iranian attempts to access .sexy was just 30 attempts. In most cases, users were able to resolve the names with DNS, but HTTP responses appeared to be blocked.
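That pattern, names that resolve in the DNS but never get an HTTP response, is the signature of application-layer blocking rather than DNS filtering. As a rough sketch of how one might tell the two failure modes apart (this is not APNIC’s methodology, and the hostname is a placeholder):

    import socket
    import urllib.error
    import urllib.request

    def probe(hostname):
        """Separate DNS-level from HTTP-level blocking (illustrative only)."""
        # Step 1: does the name resolve at all?
        try:
            socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
        except socket.gaierror:
            return "DNS resolution failed"
        # Step 2: DNS worked -- does an HTTP request get any answer back?
        try:
            urllib.request.urlopen("http://%s/" % hostname, timeout=10)
        except urllib.error.HTTPError:
            return "DNS and HTTP both work (server sent an error page)"
        except OSError:
            return "DNS works, but HTTP appears blocked or unreachable"
        return "DNS and HTTP both work"

    print(probe("example.sexy"))  # placeholder name, not one from the study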

The survey did not test .porn or .adult names, but it might be safe to assume similar behavior in those gTLDs.

APNIC also concluded that Israel’s .il ccTLD, included in the report as a known example of TLD blocking at the national level, is indeed blocked in Iran and Syria.

The study also found that Adobe’s Flash software, when used in Internet Explorer, may have trouble resolving internationalized domain names.

That conclusion seems to have been reached largely because the test’s methodology used a Flash advertisement, served via Google Ads, to discreetly fetch URLs in the background of web pages.

When the experimenters ran their scripts in HTML5 instead, there was no problem resolving the names.

The study did not look at some of the perhaps more pressing universal acceptance issues, such as the ability of registrants and others to use new gTLD domain names in web applications.

Blue Coat explains .zip screw-up

Kevin Murphy, September 4, 2015, Domain Tech

Security vendor Blue Coat apparently doesn’t check whether domains are actually domains before it advises customers to block them.

The company yesterday published a blog post that sought to explain why it denounced Google’s unlaunched .zip gTLD as “100% shady” even though the only .zip domain in existence leads to google.com.

Unrepentant, Blue Coat continued to insist that businesses should consider blocking .zip domains, while acknowledging there aren’t any.

It said that its censorware treats anything entered into a browser’s address bar as a URL, so it has been treating file names that end in .zip — the common format for compressed archive files — as if they are .zip domain names. The blog states:

when one of those URLs shows up out on the public Internet, as a real Web request, we in turn treat it as a URL. Funny-looking URLs that don’t resolve tend to get treated as Suspicious — after all, we don’t see any counter-balancing legitimate traffic there.

Further, if a legal domain name gets enough shady-looking traffic — with no counter-evidence of legitimate Web traffic — it’s possible for one of our AI systems to conclude that the behavior isn’t changing, and that it deserves a Suspicious rating in the database. So it gets one.

In other words, Blue Coat has been categorizing Zip file names that somehow find their way into a browser address bar as .zip domain names.
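To make the failure mode concrete, here is a toy illustration (not Blue Coat’s actual code) of how a classifier that treats everything in the address bar as a URL ends up assigning file names a “TLD”:

    # Naive classification: take whatever follows the last dot in the
    # host portion as the TLD -- so a file name looks like a .zip domain.
    def apparent_tld(address_bar_input):
        host = address_bar_input.split("/")[0]       # crude host extraction
        return host.rsplit(".", 1)[-1].lower()

    print(apparent_tld("quarterly-report.zip"))      # "zip" -- a file, not a domain
    print(apparent_tld("www.google.com/search"))     # "com" -- a real TLD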

That may sound like a software bug that Blue Coat needs to fix, but it’s still telling people to block Google’s gTLD anyway, writing:

In conclusion, none of the .zip “domains” we see in our traffic logs are requests to registered sites. Nevertheless, we recommend that people block these requests, until valid .zip domains start showing up.

That’s a slight change of position from its original “Businesses should consider blocking traffic that leads to the riskiest TLDs”, but it still strikes me as irresponsible.

The company has still not disclosed the real numbers behind any of the percentages in its report, so we still have no idea whether it was fair to label, for example, Famous Four’s .review as “100% shady”.

Concern over mystery TMCH outage

Kevin Murphy, May 20, 2015, Domain Tech

The Trademark Clearinghouse is investigating the causes and impact of an outage that is believed to have hit its primary database for 10 hours last Friday.

Some in the intellectual property community are concerned that the downtime may have allowed people to register domain names without receiving Trademark Claims notices.

The downtime was confirmed as unscheduled by the TMCH on a mailing list, but requests for more information sent its way today were deflected to ICANN.

An ICANN spokesperson said that the outage is being analyzed, and that the analysis will take a couple of days.

The problem affected the IBM-administered Trademark Database, which registrars query to determine whether they need to serve up a Claims notice when a customer tries to register a domain that matches a trademark.

I gather that registries are supposed to reject registration attempts if they cannot get a definitive answer from the TMDB, but some are concerned that that may not have been the case during the downtime.
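The expected behavior is a fail-closed check: if the TMDB cannot give a definitive answer, the registration should not proceed. A simplified sketch of that logic (the query function and outcomes here are hypothetical, not the actual TMDB protocol):

    # Illustrative fail-closed handling of a Claims lookup.
    def handle_registration(domain, tmdb_lookup):
        try:
            claim = tmdb_lookup(domain)    # hypothetical TMDB query
        except (TimeoutError, ConnectionError):
            return "REJECT"                # no definitive answer: fail closed
        if claim:
            return "SERVE_CLAIMS_NOTICE"   # registrant must acknowledge first
        return "PROCEED"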

Over 145,000 Claims notices have been sent to trademark owners since the TMCH came online over a year ago.

(UPDATE: This story was edited May 21 to clarify that it is the TMCH conducting the investigation, rather than ICANN.)

Site names and shames shoddy TLD support

Kevin Murphy, April 20, 2015, Domain Tech

A self-professed geek from Australia is running a campaign to raise awareness of new gTLDs by naming and shaming big companies that don’t provide comprehensive TLD support on their web sites.

SupportTheNew.domains, run by university coder Stuart Ryan, has been around since last June and currently indexes support problems at dozens of web sites.

The likes of Facebook, Amazon, Adobe and Apple are among those whose sites are said to offer incomplete support for new gTLDs.

It’s the first attempt I’m aware of to list “universal acceptance” failures in any kind of structured way.

Ryan says on the site that he set up the campaign after running into problems signing up for services using his new .email email address.

The site relies on submissions from users and seems to be updated whenever named companies respond to support tickets.

Universal acceptance is a hot topic in the new gTLD space, with ICANN recently creating a steering group to promote blanket TLD support across the internet.

Often, when sites attempt to validate domains in email addresses or URLs, they rely on outdated lists of TLDs or on regular expressions that assume TLDs can be no longer than three characters.
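A concrete example of the broken pattern, and a less restrictive alternative (both simplified; real validation is better done by actually attempting delivery):

    import re

    # The legacy assumption the article describes: TLDs are two or three letters.
    LEGACY = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)*\.[A-Za-z]{2,3}$")

    # Dropping the upper bound at least admits longer TLDs such as .email.
    MODERN = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)*\.[A-Za-z]{2,}$")

    addr = "user@example.email"        # hypothetical .email address
    print(bool(LEGACY.match(addr)))    # False -- wrongly rejected
    print(bool(MODERN.match(addr)))    # True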

Google eliminating domains from search results

Kevin Murphy, April 17, 2015, Domain Tech

Google has made another move to make domain names less relevant to internet users.

The company will no longer display URLs in search results pages for any web site that adopts a certain technical standard.

Instead, the name of the web site will be given. So instead of a DI post showing up with “domainincite.com” in results, it would be “Domain Incite”.

Google explained the change in a blog post incorrectly titled “Better presentation of URLs in search results”.

Webmasters wishing to present a company name or brand instead of a domain name need to publish metadata on their home pages. It’s just a few lines of code.

Google will decide whether to make the change based on whether the name meets these criteria:

- Be reasonbly [sic] similar to your domain name
- Be a natural name used to refer to the site, such as “Google,” rather than “Google, Inc.”
- Be unique to your site—not used by some other site
- Not be a misleading description of your site

Code samples and the rules are published here.
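The markup in question is schema.org WebSite structured data. A minimal JSON-LD sketch, with DI’s details filled in as a hypothetical example:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "WebSite",
      "name": "Domain Incite",
      "alternateName": "DI",
      "url": "http://domainincite.com/"
    }
    </script>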

It strikes me that Google, by demanding naming uniqueness, is opening itself up for a world of hurt.

Could there be a landrush among non-unique brands? How will disputes be handled?

Right now the change has been made only to mobile search results and only in the US, but Google hinted that it could roll out elsewhere too.