Get ready for thousands of new two-letter domains

Kevin Murphy, November 9, 2016, Domain Policy

New gTLD registry operators have been given the right to start selling two-letter domains that match country codes.
Potentially thousands of names could start being released next year, resulting in a windfall for registries and possible opportunities for investors.
Some governments, however, appear to be unhappy with the move and how ICANN’s board of directors reached its decision.
The ICANN board yesterday passed a resolution that will unblock all two-letter domains that match country codes appearing on the ISO 3166 list, most of which are also ccTLDs.
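Purely as an illustration of the class of names being unblocked, here is a minimal sketch of the kind of check a registry might run against the ISO 3166 list. The country-code set is a tiny made-up subset and the function is hypothetical, not anything taken from the resolution.

ISO_3166_ALPHA_2 = {"es", "it", "de", "fr", "jp", "br"}  # illustrative subset only

def was_country_code_blocked(label: str) -> bool:
    # True if the label is a letter/letter two-character ASCII string matching
    # an ISO 3166-1 alpha-2 code, i.e. the class of names the resolution unblocks.
    return (
        len(label) == 2
        and label.isascii()
        and label.isalpha()
        and label.lower() in ISO_3166_ALPHA_2
    )

print(was_country_code_blocked("es"))  # True: previously blocked, now releasable
print(was_country_code_blocked("e5"))  # False: not a letter/letter country-code match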
While the resolution gives some protection to governments worried about abuse of “their” strings, it’s been watered down to virtually nothing.
In the first draft of the rules, published in July, ICANN said registries “must” offer an “Exclusive Availability Pre-registration Period” — a kind of mini-sunrise period limited to governments and ccTLD operators.
In the version approved by ICANN yesterday, the word “must” has been replaced by “may” and the word “voluntary” has been added.
In other words, registries won’t have to give any special privileges to governments when they start selling two-character names.
They will, however, have to get registrants to agree that they won’t pass themselves off as having affiliations with the relevant government. It looks like registries probably could get away with simply adding a paragraph to their terms of service to satisfy this requirement.
Registries will also have to “take reasonable steps to investigate and respond to any reports from governmental agencies and ccTLD operators of conduct that causes confusion with the corresponding country code in connection with the use of a letter/letter two-character ASCII domain.”
This too is worded vaguely enough that it could wind up being worthless to governments, many of which are worried about domains matching their ccTLDs being passed off as government-approved.
The Governmental Advisory Committee is split on how worrisome this kind of thing is.
For example, governments such as Spain and Italy have fought for the right to pre-approve the release of “es” and “it” domains, whereas the governments of the US and UK really could not care less.
The most-recent formal GAC advice on the subject, coming out of the July meeting in Helsinki, merely said ICANN should:

urge the relevant Registry or the Registrar to engage with the relevant GAC members when a risk is identified in order to come to an agreement on how to manage it or to have a third-party assessment of the situation if the name is already registered

“It is our belief that that our resolution is consistent with GAC advice,” outgoing ICANN board member Bruce Tonkin said yesterday, noting that nobody can claim exclusive rights over any string, regardless of length.
Before and after the resolution passed, the GAC expressed “serious concern” that the board had not formally responded to the Helsinki communique.
In its Hyderabad communique, issued after yesterday’s vote, the GAC advised the board to:

  • Clearly indicate whether the actions taken by the Board as referred to in the resolution adopted on 8 November 2016 are fully consistent with the GAC advice given in the Helsinki Communiqué.
  • Always communicate in future the position of the Board regarding GAC advice on any matter in due time before adopting any measure directly related to that advice.

ICANN staff are now tasked with coming up with a way to implement the two-character release.
My sense is that some kind of amendment to Registry Agreements might be required, so we’re probably looking at months before we start seeing two-letter domains being released.

“Shadow content policing” fears at ICANN 57

Kevin Murphy, November 7, 2016, Domain Policy

Fears that the domain name industry is becoming a stooge for “shadow regulation” of web content were raised, and greeted very skeptically, over the weekend at ICANN 57.
Attendees yesterday heard concerns from non-commercial stakeholders, notably the Electronic Frontier Foundation, that deals such as Donuts’ content-policing agreement with the US movie industry amount to regulation “by the back door”.
But the EFF, conspicuously absent from substantial participation in the ICANN community for many years, found itself walking into the lion’s den. Its worries were largely pooh-poohed by most of the rest of the community.
During a couple of sessions yesterday, EFF senior attorney Mitch Stoltz argued that the domain industry is being used by third parties bent on limiting internet freedoms.
He was not alone. The ICANN board and later the community at large heard support for the EFF’s views from other Non-Commercial User Constituency members, one of whom compared what’s going on to aborted US legislation SOPA, the Stop Online Piracy Act.
“Regulation of content through the DNS system, through ICANN institutions and through contracted parties is of great concern and I think should be of great concern to all of us here,” Stoltz said.
He talked about a “bright line” between making policies related to domain names and policies related to content.
“I hope that the bright line between names and content is maintained because I think once we get past it, there may be no other bright line,” he said.
“If we allow in copyright enforcement, if we allow in enforcement of professional or business licensing as a criterion for owning a domain name, it’s going to be very hard to hold that line,” he said.
ICANN has long maintained, though with varying degrees of vigor over the years, that it does not regulate content.
Chair Steve Crocker said yesterday: “It’s always been the case, from the inception. It’s now baked in deeply into the mission statement. We don’t police content. That’s not our job.”
That kind of statement became more fervent last year, as concerns started to be raised about ICANN’s powers over the internet in light of the US government’s decision to give up its unique oversight role.
Now, a month after the IANA transition was finalized, ICANN has new bylaws that for the first time state prominently that ICANN is not the content cops.
Page one of the massive new ICANN bylaws says:

ICANN shall not regulate (i.e., impose rules and restrictions on) services that use the Internet’s unique identifiers or the content that such services carry or provide

It’s pretty explicit, but there’s a catch.
A “grandfather” clause immediately follows, which states that registries and registrars are not allowed to start challenging the terms of their existing contracts on the basis that they dabble too much with content regulation.
That’s mainly because new gTLD Registry Agreements all include Public Interest Commitments, which in many cases do actually give ICANN contractual authority over the content of web sites.
Content-related PICs are most prominent in “Community” gTLDs.
In the PICs for Japanese city gTLD .osaka, for example, the registry promises that “pornographic, vulgar and highly objectionable content” will be “adequately monitored and removed from the namespace”.
While ICANN does not actively go out looking for .osaka porn, if porn did start showing up in .osaka and the registry did not suspend the domains, it would be in breach of its RA and could lose its contract.
That PIC was voluntarily adopted by the .osaka registry and does not apply to other gTLDs, but it is binding.
So in a roundabout kind of way, ICANN does regulate content, in certain narrow circumstances.
Some NCUC members think this is a “loophole”.
Another back door they think could be abused is the bilateral “trusted notifier” relationship between registries and third parties such as the movie, music and pharmaceutical industries.
Donuts and Radix this year have announced that the Motion Picture Association of America is allowed to notify them about domains that it believes are being used for large-scale, egregious movie piracy.
Donuts said it has suspended a dozen domains — sites that were TLD-hopping to evade suspension — since the policy came into force.
EFF’s Stoltz calls this kind of thing “shadow regulation”.
“Shadow regulation to us is the regulation of content… through private agreements or through unaccountable means that were not developed through the bottom-up process or through a democratic process,” he told the ICANN board yesterday.
While the EFF and NCUC think this is a cause for concern, they picked up little support from elsewhere in the community.
Speakers from registries, registrars, senior ICANN staff, intellectual property and business interests all seemed to think it was no big deal.
In a different session on the same topic later in the day, outgoing ICANN head of compliance Allen Grogan addressed these kinds of deals. He said:

From ICANN’s point of view, if there are agreements that are entered into between two private parties, one of whom happens to be a registry or a registrar, I don’t see that ICANN has any role to play in deciding what kinds of agreements those parties can enter into. That clearly is outside the scope of our mission and remit.
We can’t compel a registrar or a registry to even tell us what those agreements are. They’re free to enter into whatever contracts they want to enter into.
To the extent that they become embodied in the contracts as PICs, that may be a different question, or to the extent that the agreements violate those contracts or violate consensus policies, that may be a different question.
But if a registrar or registry decides to enter into an agreement to trust the MPAA or law enforcement or anyone else in deciding what actions to take, I think they’re free to do that and it would be far beyond the scope of ICANN’s power or authority to do anything about that.

In the same session, Donuts VP Jon Nevett cast doubt on the idea that there is an uncrossable “bright line” between domains and content by pointing out that the MPAA deal is not dissimilar to registries’ relationships with the bodies that monitor online child abuse material.
“We have someone that’s an expert in this industry that we have a relationship with saying there is child imagery abuse going on in a name, we’re not going to make that victim go get a court order,” he said.
Steve DelBianco of the NetChoice Coalition, a member of the Business Constituency, had similar doubts.
“Mitch [Stoltz] cited as an example that UK internet service providers were blocking child porn and since that might be cited as an example for trademark and copyright that we should, therefore, not block child porn at all,” he said. “I can’t conceive that’s really what EFF is thinking.”
Nevett gave a “real-life example” of a rape.[tld] domain that was registered in a Donuts gTLD.
“[The site] was a how-to guide. Talk about horrific,” he said. “We got a complaint. I’m not going to wait till someone goes and gets a court order. We’re a private company and we agreed to suspend that name immediately and that’s fine. There was no due process. And I’m cool with that because that was the right thing to do.”
“Just like a restaurant could determine that they don’t want people with shorts and flip-flops in the restaurant, we don’t want illegal behavior and if they want to move somewhere else, let them move somewhere else,” he said.
In alleged copyright infringement cases, registrants get the chance to respond before their names are suspended, he said.
Stoltz argued that the Donuts-MPAA deal had been immediately held up, when it was announced back in February, as a model that the entire industry should be following, which was dangerous.
“If everyone is subject to the same policies, then they are effectively laws and that’s effectively law-making by other means,” he said.
He and other NCUC members are also worried about the Domain Name Association’s Healthy Domains Initiative, which is working on voluntary best practices governing when registries and registrars should suspend domain names.
Lawyer Kathy Kleiman of the NCUC said the HDI was basically “SOPA behind closed doors”.
SOPA was the hugely controversial proposed US federal legislation that would have expanded law enforcement powers to suspend domains in cases of alleged copyright infringement.
Stoltz and others said that the HDI appeared to be operating under ICANN’s “umbrella”, giving it an air of having multistakeholder legitimacy, pointing out that the DNA has sessions scheduled on the official ICANN 57 agenda and “on ICANN’s dime”.
DNA members disagreed with that characterization.
It seems to me that the EFF’s arguments are very much of the “slippery slope” variety. While that may be considered a logical fallacy, it does not mean that its concerns are not valid.
But if there was ever a “bright line” between domain policy and content regulation, it was traversed many years ago.
The EFF and supporters perhaps should just acknowledge that what they’re really concerned about is copyright owners abusing their powers, and target that problem instead.
The line has moved.

Governments mull greater geo gTLD powers

Kevin Murphy, November 3, 2016, Domain Policy

Governments are toying with the idea of asking ICANN for greater powers over gTLDs that match their geographic features.
The names of rivers, mountains, forests and towns could be protected under ideas bandied around at the ICANN 57 meeting in India today.
The Governmental Advisory Committee held a session this morning to discuss expanding the list of strings that already enjoy extra ICANN protections on grounds of geography.
In the 2012 application round, gTLDs matching the names or ISO acronyms of countries were banned outright.
For capital city names and non-capital names where the gTLD was meant to represent the city in question, government approval was required.
For regions on the ISO 3166 list, formal government non-objection was required whether or not the gTLD was intended to represent the region.
That led to gTLDs such as .tata, a dot-brand for Tata Group, being held up indefinitely because it matches the name of a small region of Morocco.
One applicant wound up agreeing to fund a school to the tune of $100,000 in order to get Montenegro’s support for .bar.
But other names were not protected.
Notably, the string “Amazon” was not on any of the protected lists, largely because while it’s a river and a forest it doesn’t match the name of a formal administrative region of any country.
While GAC objections ultimately killed off Amazon’s bid for .amazon (at least for now), the GAC wants to close the Amazon loophole in time for the next new gTLD application round.
The GAC is basically thinking about giving itself the power to write its own list of protected terms. It would build on the existing list to also encompass names of “geographic significance”.
GAC members would be able to submit names to the list; applicants for those names would then require non-objection letters from the relevant government(s).
Some governments, including the UK and Peru, expressed concern that “geographic significance” is a little vague.
Truly, without a narrow definition of “significance” it could turn out to be a bloody big list. The UK alone has over 48,000 towns, not to mention all the named forests, rivers and such.
Peru, one of the nations that had beef with Amazon, said it intended to send ICANN a list of all the geographic names it wants protecting, regardless of whether the GAC decides to create a new list.
Other GAC members, including Iran and Denmark, stressed how important it was for the GAC to coordinate with other parts of the ICANN community, mainly the GNSO, on geo names, to avoid overlap and conflict further down the line.
The GAC has a working group looking at the issue. It hopes to have something to recommend to the ICANN board by the Copenhagen meeting next March.

Should new gTLDs be first-come, first-served?

Kevin Murphy, November 3, 2016, Domain Policy

Who needs rounds? The idea of allocating new gTLDs on a first-come, first-served basis is getting some consideration at this week’s ICANN 57 meeting.
Such a move could have profound implications on the industry, creating new business opportunities while scuppering others.
Whether to shift to a FCFS model was one of many issues discussed during a session today of the GNSO’s working group tasked with looking at the next new gTLD round.
Since 2000, new gTLDs have been allocated in strict rounds, with limited application windows and often misleading guidance about when the next window would open, but it’s not written in stone that that is the way it has to be.
The idea of switching to FCFS — where any company could apply for any gTLD at any time — is not off the table.
FCFS would not mean applicants could merely ask for a string and automatically be granted it — there’d still be multiple phases of evaluation and opportunities for others to object, so it wouldn’t be just like registering a second-level domain.
Depending on how the new process was designed, doing away with rounds could well do away with the concept of “contention” — multiple applicants simultaneously vying for the same string.
This would basically eliminate the need for auctions entirely.
No longer would an applicant be able to risk a few hundred thousand bucks in application expenses in the hope of a big private auction pay-day. Similarly, ICANN’s quarter-billion-dollar pool of last-resort auction proceeds would grow no more.
That’s potentially an upside, depending on your point of view.
On the downside, and it’s a pretty big downside, a company could work on a solid, innovative gTLD application for months only to find its chances scuppered because a competitor filed an inferior application a day earlier.
A middle way, suggested during today’s ICANN 57 session, would be a situation in which the filing of an application starts a clock of maybe a few months during which other interested parties would be able to file their own applications.
That would keep the concept of contention whilst doing away with the restrictive round-based structure, but would present plenty of new opportunities for exploitation and skulduggery.
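To make that middle way concrete, here is a minimal hypothetical sketch, not an ICANN proposal, in which the first filing for a string opens a fixed window (the 90-day length is an arbitrary assumption) and later filings either join the contention set or are turned away.

from datetime import datetime, timedelta

CONTENTION_WINDOW = timedelta(days=90)  # arbitrary length, for illustration only

contention_sets = {}  # string -> {"opened": datetime, "applicants": [names]}

def file_application(string, applicant, filed_at):
    existing = contention_sets.get(string)
    if existing is None:
        # First filing opens the window and creates a contention set of one.
        contention_sets[string] = {"opened": filed_at, "applicants": [applicant]}
        return "accepted: window opened"
    if filed_at <= existing["opened"] + CONTENTION_WINDOW:
        # Filed inside the window: joins the contention set.
        existing["applicants"].append(applicant)
        return "accepted: added to contention set"
    # Window closed: under this sketch, later filings for the string are refused.
    return "rejected: contention window closed"

print(file_application(".example", "Applicant A", datetime(2017, 1, 10)))
print(file_application(".example", "Applicant B", datetime(2017, 2, 1)))  # inside window
print(file_application(".example", "Applicant C", datetime(2017, 6, 1)))  # too late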
Another consequence of the shift to FCFS could be to eliminate the concept of Community gTLDs altogether, it was suggested during today’s session.
In 2012, applicants were given the opportunity to avoid auction if they could meet exacting “Community” standards. The trade-off is that Community gTLDs are obliged to be restricted to their designated community.
If FCFS led to contention going away, there’d be no reason for any applicant to apply for a Community gTLD that could unnecessarily burden its business model in future.
For those strongly in favor of community gTLDs, such as governments, this could be an unwelcome outcome.
Instinctively, I think FCFS would be a bad idea, but I’d be open to persuasion.
I think the main problem with the round-based structure today is that it’s unpredictable — nobody knows when the next round is likely to be, so it’s hard for companies to plan their new gTLD business ideas.
Sure, FCFS would bring flexibility, allowing companies to apply at times that are in tune with their business objectives, but the downsides could outweigh that benefit.
Perhaps the way to reduce unpredictability would be to put application windows on a predictable, reliable schedule — once a year for example — as was suggested by a participant or two during today’s ICANN 57 session.
The discussions in the GNSO are at a fairly early stage right now, but a switch to FCFS would be so fundamental that I think it needs to be adopted or discarded fairly quickly, if there’s ever going to be another application round.

RANT: Governments raise yet another UN threat to ICANN

Kevin Murphy, October 31, 2016, Domain Policy

ICANN’s transition away from US government oversight is not even a month old and the same old bullshit power struggles and existential threats appear to be in play as strongly as ever.
Governments, via the chair of the Governmental Advisory Committee, last week yet again threatened that they could withdraw from ICANN and seek refuge within the UN’s International Telecommunications Union if they don’t get what they want from the rest of the community.
It’s the kind of thing the IANA transition was supposed to minimize, but just weeks later it appears that little has really changed in the rarefied world of ICANN politicking.
Thomas Schneider, GAC chair, said this on a conference call between the ICANN board and the Generic Names Supporting Organization on Thursday:

I’m just urging you about considering what happens if many governments consider that this system does not work. They go to other institutions. If we are not able to defend public interest in this institution we need to go elsewhere, and this is exactly what is happening currently at the ITU Standardization Assembly.

This is a quite explicit threat — if governments don’t like the decisions ICANN makes, they go to the ITU instead.
It’s the same threat that has been made every year or two for pretty much ICANN’s entire history, but it’s also something that the US government removing its formal oversight of ICANN was supposed to help prevent.
So what’s this “public interest” the GAC wants to defend this time around?
It’s protections for the acronyms of intergovernmental organizations (IGOs) in gTLDs, which we blogged about a few weeks ago.
IGOs are bodies ranging from the relatively well-known, such as the World Health Organization or World Intellectual Property Organization, to the obscure, such as the European Conference of Ministers of Transport or the International Tropical Timber Organization.
In other words, according to governments, the public interest would be served if the string “itto”, for example, were reserved in every new gTLD. It’s not known whether any government has passed laws protecting this and other IGO strings in its own ccTLD, but I suspect it’s very unlikely any have.
There are about 230 such IGOs, all of which have acronyms new gTLD registries are currently temporarily banned from selling as domains.
The multi-stakeholder GNSO community is on the verge of coming up with some policy recommendations that would unblock these acronyms from sale and grant the IGOs access to the UDRP and URS mechanisms, allowing them to reclaim or suspend domains maliciously exploiting their “brands”.
The responsible GNSO working group has been coming up with these recommendations for over two years.
While the GAC and IGOs were invited to participate in the WG, and may have even attended a couple of meetings, they decided they’d have a better shot at getting what they wanted by talking directly to the ICANN board outside of the usual workflow.
The WG chair, Phil Corwin of the Internet Commerce Association, recently described IGO/GAC participation as a “near boycott”.
This reluctance to participate in formal ICANN policy-making led to the creation of the so-called “small group”, a secretive ad hoc committee that has come up with an opposing set of recommendations to tackle the same IGO acronym “problem”.
I don’t think it’s too much of a stretch to call the small group “secretive”. While the GNSO WG’s every member is publicly identified, their every email publicly archived, their every word transcribed and published, ICANN won’t even say who is in the small group.
I asked ICANN for a list of its members a couple of weeks ago and this is what I got:

The group is made up of Board representatives from the New gTLD Program Committee (NGPC), primarily, Chris Disspain; the GAC Chair; and representatives from the IGO coalition that first raised the issue with ICANN and some of whom participated in the original PDP on IGO-INGO-Red Cross-IOC protections – these would include the OECD, the UN, UPU, and WIPO.

With the publication two weeks ago of the small group’s recommendations (pdf) — which conflict with the expected GNSO recommendations — the battle lines were drawn for a fight at ICANN 57, which kicks off this week in Hyderabad, India.
Last Thursday, members of the GNSO Council, including WG chair Corwin, met telephonically with GAC chair Schneider, ICANN chair Steve Crocker and board small group lead Disspain to discuss possible ways forward.
What emerged is what Crocker would probably describe as a “knotty” situation. I’d describe it as a “process clusterfuck”, in which almost all the relevant parties appear to believe their hands are tied.
The GNSO Council feels its hands are tied for various reasons.
Council chair James Bladel explained that the GNSO Council doesn’t have the power to even enter substantive talks.
“[The GNSO Council is] not in a position to, or even authorized to, negotiate or compromise PDP recommendations that have been presented to us by a PDP working group and adopted by Council,” he said.
He went on to say that while the GNSO does have the ability to revisit PDPs, to do so would take years and undermine earlier hard-fought consensus and dissuade volunteers from participating in policy making. He said on the call:

By going back and revisiting PDPs we both undermine the work of the community and potentially could create an environment where folks are reluctant to participate in PDPs and just wait until a PDP is concluded and then get engaged at a later stage when they feel that the recommendations are more likely adopted either by the board or reconciled with GAC advice.

He added that contracted parties — registries and registrars — are only obliged to follow consensus policies that have gone through the PDP process.
Crocker and Disspain agreed that the GAC and the GNSO appear to have their hands tied until the ICANN board makes a decision.
But the board’s hands are also currently tied, because it only has the power to accept or reject GNSO recommendations and/or GAC advice, and it currently has neither before it.
Chair Crocker explained that the board is not able to simply impose any policy it likes — such as the small group recommendations, which have no real legitimacy — it’s limited to either rejecting whatever advice the GAC comes up with, rejecting whatever the GNSO Council approves, or rejecting both.
The GNSO WG hasn’t finished its work, though the GNSO Council is likely to approve it, and the GAC hasn’t considered the small group paper yet, though it is likely to endorse it.
Crocker suggested that rejecting both might be the best way to get everyone around a table to try to reach consensus.
Indeed, it appears that there is no way, under ICANN’s processes, for these conflicting views to be reconciled formally at this late stage.
WG chair Corwin said that any attempt to start negotiating the issue before the WG has even finished its work should be “rejected out of hand”.
With the GNSO appearing to be putting up complex process barriers to an amicable way forward, GAC chair Schneider repeatedly stated that he was attempting to reach a pragmatic solution to the impasse.
He expressed frustration frequently throughout the call that there does not appear to be a way that the GAC’s wishes can be negotiated into a reality. It’s not even clear who the GAC should be talking to about this, he complained.
He sounds like he’s the sensible one, but remember he’s representing people who stubbornly refused to negotiate in the WG over the last two years.
Finally, he raised the specter of governments running off to the UN/ITU, something that historically has been able to put the willies up those who fully support (and in many cases make their careers out of) the ICANN multistakeholder model.
Here’s a lengthier chunk of what he said, taken from the official meeting transcript:

If it turns out that there’s no way to change something that has come out of the Policy Development Process, because formally this is not possible unless the same people would agree to get together and do the same thing over again, so maybe this is what it takes, that we need to get back or that the same thing needs to be redone with the guidance from the board.
But if then nobody takes responsibility to — in case that everybody agrees that there’s a public interest at stake here that has not been fully, adequately considered, what — so what’s the point of this institution asking governments for advice if there’s no way to actually follow up on that advice in the end?
So I’m asking quite a fundamental question, and I’m just urging you about considering what happens if many governments consider that the system does not work. They go to other institutions. They think we are not able to defend public interest in this institution. We need to go elsewhere. And this is exactly what is happening currently at the ITU Standardization Assembly, where we have discussions about protection of geographic names because — and I’m not saying this is legitimate or not — but because some governments have the feeling that this hasn’t been adequately addressed in the ICANN structure.

I’m really serious about this urge that we all work together to find solutions within ICANN, because the alternative is not necessarily better. And the world is watching what signals we give, and please be aware of that.

The “geographic names” issue that Schneider alludes to here seems to be a proposal (Word .doc) put forward by African countries and under discussion at the ITU’s WTSA 2016 meeting this week.
The proposal calls for governments to get more rights to oppose geographic new gTLD applications more or less arbitrarily.
It emerged not from any failure of ICANN policy — geographic names are already protected at the request of the GAC — but from African governments being pissed off that .africa is still on hold because DotConnectAfrica is suing ICANN in a California court and some batty judge granted DCA a restraining order.
It’s not really relevant to the IGO issue, nor especially relevant to the issue of governments failing to influence ICANN policy.
The key takeaway from Schneider’s remarks for me is that, despite assurances that the IANA transition was a way to bring more governments into the ICANN fold rather than seeking solace at the UN, that change of heart is yet to manifest itself.
The “I’m taking my ball and going home” threat seems to be alive and well for now.
If you made it this far but want more, the transcript of the call is here (pdf) and the audio is here (mp3). Good luck.

ICANN to terminate Guardian’s last gTLD

Kevin Murphy, October 27, 2016, Domain Registries

Newspaper publisher Guardian News & Media is out of the gTLD game for good now, with ICANN saying this week that it will terminate its contract for the dot-brand, .theguardian.
It’s the 14th new gTLD registry agreement to be terminated by ICANN. All were dot-brands.
The organization has told Guardian that it started termination proceedings on October 21, after the company failed to complete its required pre-delegation testing before already-extended deadlines.
.theguardian was the only possible gTLD remaining of the five that Guardian originally applied for.
It signed its registry agreement with ICANN in April 2015, but failed to go live within a year.
Guardian also applied for .guardian, which it decided not to pursue after facing competition from the insurance company of the same name.
The .observer gTLD, a dot-brand for its Sunday sister paper, was sold off to Top Level Spectrum last month and has since been delegated as a non-brand generic.
Applications for .gdn and .guardianmedia were withdrawn before Initial Evaluation had even finished.

Ship explosion cost ICANN $700k

Kevin Murphy, October 27, 2016, Domain Policy

An explosion on board a cargo ship set ICANN back $700,000, the organization has revealed.
The September 1 blast and subsequent fire, which we blogged about two weeks ago, caused equipment heading to ICANN 57 in Hyderabad to be detained by authorities.
The explosion, at the port in Hamburg, was reportedly caused by a welding accident and nobody was seriously hurt.
Now, in a blog post, ICANN said the cost of replacing the detained gear and shipping it to India was $700,000.
The Hyderabad meeting is due to kick off next week.
The ICANN blog post, from CIO Ashwin Rangan, reports that all the equipment required to run the meeting has already arrived safely.
The meeting has also been plagued by widespread reports of difficulties obtaining visas. Many have complained on social media that the process is unnecessarily unpredictable and complicated.
Many of these complaints have come from regular ICANN attendees from North America and Europe, unaccustomed to having to secure visas for international travel.
But the level of complaints has been sufficiently high that ICANN has been talking to Indian government officials about ensuring everyone who wants to attend, can.

ICANN has $400m in the bank

Kevin Murphy, October 27, 2016, Domain Policy

ICANN ended its fiscal 2016 with just shy of $400 million on its balance sheet, according to its just-released financial report.
As of June 30, the organization had assets of $399.6 million, up from $376.5 million a year earlier, the statement (pdf) says.
Its revenue for the year was actually down, at $194.6 million in 2016 compared to $216.8 million in 2015.
That dip was almost entirely due to less money coming in via “last-resort” new gTLD auctions.
The growth of the gTLD business led to $74.5 million coming from registries, up from $59 million in 2015.
Registrar revenue grew from $39.3 million to $48.3 million.
Money from ccTLD registries, whose contributions are entirely voluntary, was down to $1.1 million from $2.1 million.
Expenses were up across the board, from $131 million to $143 million, largely due to $5 million increases in personnel and professional services costs.
The results do not take into account the $135 million Verisign paid for .web, which happened after the end of the fiscal year.
Auction proceeds are earmarked for some yet-unspecified community purpose and sit outside ICANN’s general working capital pool. Regardless, they’re factored into these audited financial reports.
ICANN has to date taken in almost a quarter of a billion dollars from auctions. Its board recently decided to diversify how the money is invested, so the pot could well grow.

Thick Whois coming to .com next year, price rise to follow?

Kevin Murphy, October 27, 2016, Domain Registries

Verisign could be running a “thick” Whois database for .com, .net and .jobs by mid-2017, under a new ICANN proposal.
A timetable published this week would see the final three hold-out gTLDs fully move over to the standard thick Whois model by February 2019, with the system live by next August.
Some people believe that Verisign might use the move as an excuse to increase .com prices.
Thick Whois is where the registry stores the full Whois record, containing all registrant contact data, for every domain in its TLD.
The three Verisign TLDs currently have “thin” Whois databases, which only store information about domain creation dates, the sponsoring registrar and name servers.
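As a rough illustration of the difference (the field names are simplified placeholders, not actual Whois output from any registry), the thick model simply adds the registrant contact data to the registration metadata the thin model already holds.

# Simplified, hypothetical records for illustration only.
thin_record = {
    "domain": "example.com",
    "registrar": "Example Registrar, Inc.",
    "created": "2016-01-01",
    "name_servers": ["ns1.example.net", "ns2.example.net"],
    # Registrant contact data lives with the registrar, not the registry.
}

thick_record = {
    **thin_record,
    # Under the thick model the registry also holds the full contact record.
    "registrant": {
        "name": "Jane Doe",
        "organization": "Example Org",
        "email": "jane@example.com",
    },
}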
The thin model dates back to when the registry and registrar businesses of Verisign’s predecessor, Network Solutions, were broken up at the end of the last century.
But it’s been ICANN consensus policy for about three years for Verisign to eventually switch to a thick model.
Finally, ICANN has published for public comment its anticipated schedule (pdf) for this to happen.
Under the proposal, Verisign would have to start offering registrars the ability to put domains in its thick Whois by August 1, 2017, both live via EPP and in bulk.
It would not become obligatory for registrars to submit thick Whois for all newly registered domains until May 1, 2018.
They’d have until February 1, 2019 to bulk-migrate all existing Whois records over to the new system.
Thick Whois in .com has been controversial for a number of reasons.
Some registrars have expressed dissatisfaction with the idea of migrating part of their customer relationship to Verisign. Others have had concerns that local data protection laws may prevent them moving data in bulk overseas.
The new proposal includes a carve-out that would let registrars request an exemption from the requirements if they can show it would conflict with local laws, which holds the potential to make a mockery out of the entire endeavor.
Some observers also believe that Verisign may use the expense of building and operating the new Whois system as an excuse to trigger talks with ICANN about increasing the price of .com from its current, frozen level.
Under its .com contract, Verisign can ask ICANN for a fee increase “due to the imposition of any new Consensus Policy”, which is exactly what the move to thick Whois is.
Whether it would choose to exercise this right is another question — .com is a staggeringly profitable cash-printing machine and the thick Whois build is not likely to be that expensive, relatively speaking.
The proposed implementation timetable is open for public comment until December 15.

Radix acquires .fun gTLD from Warren Buffett

Kevin Murphy, October 25, 2016, Domain Registries

New gTLD portfolio player Radix has acquired the pre-launch TLD .fun from its original owner.
The company took over the .fun Registry Agreement from Oriental Trading Company on October 4, according to ICANN records.
Oriental is a party supplies company owned by Warren Buffett’s Berkshire Hathaway.
It won .fun in a private auction in April last year, beating off Google and .buzz operator DotStrategy.
It had planned to run it as a “closed generic” — keeping all the domains in .fun for itself — but those plans appeared to have been shelved by the time it signed its RA in January this year.
Evidently Oriental’s heart was not in it, and Radix made an offer for the string it found more attractive.
Radix business head Sandeep Ramchandani confirmed to DI today that .fun will be operated in a completely unrestricted manner, the same as its other gTLDs.
It will be Radix’s first three-letter gTLD, Ramchandani said. It already runs zones such as .online, .site and .space.
.fun is not yet delegated, but Radix is hoping for a December sunrise period, he said.