Fight over closed generics ends in stalemate

Kevin Murphy, August 27, 2020, Domain Policy

Closed generic gTLDs could be a thing in the next application round. Or they might not. Even after four years, ICANN’s greatest policy-making minds can’t agree.

The New gTLDs Subsequent Procedures working group (SubPro) delivered its draft final policy recommendations last week, and the most glaring lack of consensus concerned closed generics.

A closed generic is a dictionary-word gTLD, not matching the applicant’s trademark, that is nevertheless treated as if it were a dot-brand, where the registry is the only eligible registrant.

It’s basically a way for companies to vacuum up the strings most relevant to their businesses, keeping them out of the hands of competitors.

There were 186 attempts to apply for closed generics in 2012 — L’Oreal applied for TLDs such as .beauty, .makeup and .hair with the clear expectation of registry-exclusive registrations, for example.

But ICANN moved the goalposts in 2013 following advice from its Governmental Advisory Committee, asking these applicants to either withdraw or amend their applications. It finally banned the concept in 2015, and punted the policy question to SubPro.

But SubPro, made up of a diverse spectrum of volunteers, was unable to reach a consensus on whether closed generics should be allowed and under what circumstances. It’s the one glaring hole in its final report.

The working group does appear to have taken on board the same GAC advice as ICANN did seven years ago, however, which presents its own set of problems. Back in 2013, GAC advice was often written in such a way as to be deliberately vague and borderline unimplementable.

In this case, the GAC had told ICANN to ban closed generics unless there was a “public interest goal”. What to make of this advice appears to have been a stumbling block for SubPro. What the hell is the “public interest” anyway?

Working group members were split into three camps: those who believed closed generics should be banned outright, those who believed they should be permitted without limitation, and those who said they should be allowed but tightly regulated.

Three different groups from SubPro submitted proposals for how closed generics should be handled.

The most straightforward, penned by consultant Kurt Pritz and industry lawyers Marc Trachtenberg and Mike Rodenbaugh, and entitled The Case for Delegating Closed Generics (pdf), basically says that closed generics encourage innovation and should be permitted without limitation.

This trio argues that there are no adequate, workable definitions of either “generic” or “public interest”, and that closed generics are likely to do more good than harm.

They raise the example of .book, which was applied for by Amazon as a closed generic and eventually contracted as an open gTLD.

Many of us were thrilled when Amazon applied for .book. Participation by Amazon validated the whole program and the world’s largest book seller was well disposed to use the platform for innovation. Yet, we decided to get in the way of that. What harm was avoided by cancelling the incalculable benefit staring us right in the face?

Coincidentally, Amazon signed its .book registry agreement exactly five years ago today and has done precisely nothing with it. There’s not even a launch plan. It looks, to all intents and purposes, warehoused.

It goes without saying that if closed generics are allowed by ICANN, it will substantially increase the number of potential new gTLD applicants in the next round and therefore the amount of work available for consultants and lawyers.

The second proposal (pdf), submitted by recently independent policy consultant Jeff Neuman of JJN Solutions, envisages allowing closed generics, but only with heavy end-user (not registrant) involvement.

This idea would see a few layers of oversight, including a “governance council” of end users for each closed generic, and seems to be designed to avoid big companies harming competition in their industries.

The third proposal (pdf), written by Alan Greenberg, Kathy Kleiman, George Sadowsky and Greg Shatan, would create a new class of gTLDs called “Public Interest Closed Generic gTLDs” or “PICGS”.

This is basically the non-profit option.

PICGS would be very similar to the “Community” gTLDs present since the 2012 round. In this case, the applicant would have to be a non-profit entity and it would have to have a “critical mass” of support from others in the area of interest represented by the string.

The model would basically rule out the likes of L’Oreal’s .makeup and Amazon’s .book, but would allow, off the top of my head, something like .relief being run by the likes of the Red Cross, Oxfam and UNICEF.

Because the working group could not coalesce behind any of these proposals, it’s perhaps an area where public comment could have the most impact.

The SubPro draft final report is out for public comment now until September 30.

After it’s given final approval, it will go to the GNSO Council and then the ICANN board before finally becoming policy.

New gTLD prices could be kept artificially high

Kevin Murphy, August 27, 2020, Domain Policy

ICANN might keep its new gTLD application fees artificially expensive in future in order to deter TLD warehousing.

Under a policy recommendation published last week by the New gTLDs Subsequent Procedures working group (SubPro), ICANN should impose an “application fee floor” to help keep top-level domains out of the hands of gamers and miscreants.

In the 2012 application round, the $185,000 fee was calculated on a “cost-recovery basis”. That is, ICANN was not supposed to use it as a revenue source for its other activities.

But SubPro wants to amend that policy so that, should the costs of the program ever fall below a yet-to-be-determined minimum threshold, the application fee would be set at this fee floor and ICANN would take in more money from the program than it costs to run.

SubPro wrote:

The Working Group believes that it is appropriate to establish an application fee floor, or minimum application fee that would apply regardless of projected program costs that would need to be recovered through application fees collected. The purpose of an application fee floor is to deter speculation and potential warehousing of TLDs, as well as mitigate against the use of TLDs for abusive or malicious purposes. The Working Group’s support for a fee floor is also based on the recognition that the operation of a domain name registry is akin to the operation of a critical part of the Internet infrastructure.

The working group did not put a figure on what the fee floor should be, instead entrusting ICANN to do the math (and publicly show its working).

But SubPro agreed that ICANN should not use what essentially amounts to a profit to fund its other activities.

The excess cash could only be used for things related to the new gTLD program, such as publicizing the availability of new gTLDs or subsidizing poorer applicants via the Applicant Support Program.

ICANN already accounts for its costs related to the program separately. It took in $361 million in application fees back in 2012 and as of the end of 2019 it had $62 million remaining.

Does that mean fees could come down by as much as 17% in the next application round based on ICANN’s experience? Not necessarily — about a third of the $185,000 fee was allocated to a “risk fund” used to cover unexpected developments such as lawsuits, and that risk profile hasn’t necessarily changed in the last eight years.
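As a sanity check on that figure, here’s the back-of-the-envelope arithmetic, using only the numbers quoted above:

```python
# How much of the 2012 application fee pool was left unspent
# by the end of 2019?
total_fees = 361_000_000  # application fees collected in 2012, USD
remaining = 62_000_000    # unspent as of end of 2019, USD

unspent_share = remaining / total_fees
print(f"{unspent_share:.1%}")  # prints "17.2%" -- the "as much as 17%" above
```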

Fees could be lowered for other reasons also.

As I blogged earlier today, a new registry service provider pre-evaluation program could reduce the application fee for the vast majority of applicants by eliminating redundancies and shifting the cost of technical evaluations from applicants to RSPs.

The financial evaluation is also being radically simplified, which could reduce the application fee.

In 2012, evaluations were carried out based on the applicant’s modelling of how many domains it expected to sell and how that would cover its expenses, but many applicants were way off base with their projections, rendering the process flawed.

SubPro proposes to do away with this, generally in favor of applicants self-certifying that their financial situation is up to the challenge. Public companies on the world’s largest 25 exchanges won’t have to prove they’re financially capable of running a gTLD at all.

The working group is also proposing changes to the Applicant Support Program, under which ICANN subsidizes the application fee for needy applicants. It wasn’t used much in 2012, a failure largely attributed to ICANN’s lack of outreach in the Global South.

Under SubPro’s recommendations, ICANN would be required to do a much better job of advertising the program’s existence, and subsidies would extend beyond the application fee to additional services such as consultants and lawyers.

Language from the existing policy restricting the program to a few dozen of the world’s poorest countries (which was, in practice, ignored in 2012 anyway), would also be removed and ICANN would be encouraged to conduct outreach in a broader range of countries.

In terms of costs, dot-brand applicants also get some love from SubPro. These applicants will be spared the requirement to have a so-called Continuing Operations Instrument.

The COI is basically a financial safeguard for registrants, usually a letter of credit from a big bank. In the event that a registry goes out of business, the COI is tapped to pay for three years of operations, enabling registrants to peacefully transition to a different TLD.

Given that the only registrant of a dot-brand gTLD is the registry itself, this protection clearly isn’t needed, so SubPro is making dot-brand applicants exempt.

Overall, it seems very likely that the cost of applying for a new gTLD is going to come down in the next round. Whether it comes down to something in excess of the fee floor or below it is going to depend entirely on ICANN’s models and estimates over the coming couple of years.

New back-end approval program could reduce the cost of a new gTLD

Kevin Murphy, August 27, 2020, Domain Policy

ICANN will consider a new pre-approval program for registry back-end service providers in order to streamline the new gTLD application process and potentially reduce application fees.

The proposed “RSP pre-evaluation process” was one of the biggest changes to the new gTLD program agreed to by ICANN’s New gTLDs Subsequent Procedures working group (SubPro), which published its final report for comment last week.

The recommendation addresses what was widely seen as a huge process inefficiency in the evaluation phase of the 2012 application round, which required each application to be subjected to a unique technical analysis by a team of outside experts.

This was perceived as costly, redundant and wasteful, given that the large majority of applications proposed to use the same handful of back-end RSPs.

Donuts, which applied for over 300 strings with virtually cookie-cutter business models and all using the same back-end, had to pay for over 300 technical evaluations, for example.

Similarly, clients of dot-brand service providers such as Neustar and Verisign each had to pay for the same evaluation as hundreds of fellow clients, despite the tech portions of the applications being largely copy-pasted from the same source.

For subsequent rounds, that will all change. ICANN will instead do the tech evals on a per-RSP, rather than per-application, basis.

All RSPs that intend to fight for business in the next round will undergo an evaluation before ICANN starts accepting applications. In a bit of a marketing coup for the RSPs, ICANN will then publish the names of all the companies that have passed evaluation.

The RSPs would have to cover the cost of the evaluation, and would have to be reevaluated prior to each application window. ICANN would be banned from making a profit on the procedure.

SubPro agreed that applicants selecting a pre-approved RSP should not have to pay the portion of the overall application fee — $185,000 in 2012 — that covers the tech eval.

RSPs may decide to recoup the costs from their clients via other means, of course, but even then the fee would be spread out among many clients.

The proposed policy, which is still subject to SubPro, GNSO Council and ICANN board approval, is a big win for the back-ends.

Not only do they get to offer prospective clients a financial incentive to choose them over an in-house solution, but ICANN will also essentially promote their services as part of the program’s communications outreach. Nice.

Single/plural gTLD combos to be BANNED

Kevin Murphy, August 27, 2020, Domain Policy

Singular and plural versions of the same string will be banned at the top level under proposed rule changes for the next round of new gTLDs.

The final set of recommendations of ICANN’s New gTLDs Subsequent Procedures working group (SubPro), which were published after four years of development last week, state:

the Working Group recommends prohibiting plurals and singulars of the same word within the same language/script in order to reduce the risk of consumer confusion. For example, the TLDs .EXAMPLE and .EXAMPLES may not both be delegated because they are considered confusingly similar.

The 2012 round had no hard and fast rule about plurals. There were String Similarity Review and String Confusion Objection procedures, but they produced unpredictable results.

At least 15 single/plural string pairs currently exist in the root, including .fan(s), .accountant(s), .loan(s), .review(s) and .deal(s). Sometimes they’re both part of the same registry’s portfolio, other times they’re owned by competitors.

But others, including .pet and .pets and .sport and .sports, were ruled by independent panels too “confusingly similar” to be allowed to coexist.

The proposed new rule would remove much of the subjectivity from these kinds of decisions, replacing the current system of objections with a flat no-coexistence rule.

If a gTLD that was the plural of an existing gTLD were applied for, the application would be rejected. If the singular and plural variants of the same word were applied for in the same round, the applications would likely end up at auction.

But there would be some wriggle room, with the ban only applying if both applied-for strings truly are singular/plural variations of each other in the same language. The working group wrote:

.SPRING and .SPRINGS could both be allowed if one refers to the season and the other refers to elastic objects, because they are not singular and plural versions of the same word. However, if both are intended to be used in connection with the elastic object, then they will be placed into the same contention set. Similarly, if an existing TLD .SPRING is used in connection with the season and a new application for .SPRINGS is intended to be used in connection with elastic objects, the new application will not be automatically disqualified.

In such situations, both registries would have to agree to binding Public Interest Commitments to only use the gTLDs for their stated, non-conflicting purposes. Registrants would also have to commit to only use .spring to represent the season and .springs for the elastic objects.

The ban will substantially eliminate the problem I’ve previously referred to as “tailgating”, where a registry applies for the plural variant of a competitor’s successful, well-marketed gTLD, prices domains slightly lower, then sits back to effortlessly reap the benefits of their rival’s popularity.

One could easily imagine applicants for strings such as .clubs or .sites in the next round, with applicants content to lazily ride the coat-tails of the million-selling singular namespaces.

The rule change will also remove the need for existing registries to defensively apply for the single/plural variants of their current portfolio, and for existing registrants to be compelled to defensively register domains in yet another TLD.

On the flipside, it means that some potentially useful strings would be forever banned from the DNS.

While it might make sense for a film producer to register a .movie domain to market a single movie, it would not make sense for a review site or movies-related blog, where a .movies domain would be more appropriate. But now that’s never going to be possible.

SubPro’s work is still subject to final approval by SubPro, the GNSO Council and ICANN board of directors before it becomes policy.

The end of the beginning? ICANN releases policies for next round of new gTLDs

Kevin Murphy, August 25, 2020, Domain Policy

Over eight years after ICANN last accepted applications for new gTLDs and more than four years after hundreds of policy wonks first sat around the table to discuss how the program could be improved, the working group has published its draft final, novel-length set of policy recommendations.

Assuming the recommendations are approved, in broad terms the next round will be roughly similar to the 2012 round.

But almost every phase of the application process, from the initial communications program to objections and appeals, is going to get tweaked to a greater or lesser extent.

The recommendations came from the GNSO’s New gTLD Subsequent Procedures working group, known as SubPro. It had over 200 volunteer members and observers and worked for thousands of hours since January 2016 to come up with its Final Draft Report.

Some of the proposed changes mean the cost of an application will likely go down, while others will keep the cost artificially high.

Some changes will streamline the application process, others may complicate it.

Many of the “changes” to policy are in fact mere codifications of practices ICANN brought in unilaterally under the controversial banner of “implementation” in the 2012 round.

Essentially, the GNSO will be giving the nod retroactively to things like Public Interest Commitments, lottery-based queuing, and name collisions mitigation, which had no basis in the original new gTLDs policy.

But other contentious aspects of the last round are still up in the air — SubPro failed to find consensus on highly controversial items such as closed generics.

The report will not tell you when the next round will open or how much it will cost applicants, but the scope of the work ahead should make it possible to make some broad assumptions.

What it will tell you is that the application process will be structurally much the same as it was eight years ago, with a short application window, queued processing, objections, and contention resolution.

SubPro thankfully rejected the idea of replacing round-based applications with a first-come, first-served model (which I thought would have been a gaming disaster).

The main beneficiaries of the policy changes appear to be registry service providers and dot-brand applicants, both of which are going to get substantially lowered barriers to entry and likely lower costs.

There are far too many recommendations for me to summarize them eloquently in one blog post, so I’m going to break up my analysis over several articles to be published over the next week or so.

In the meantime, ICANN has opened up the final draft report for public comment. You have until September 30.

The report notes that previously rejected comments will not be considered, so if your line is “New gTLDs suck! .com is King!” you’re likely to find your input falling on deaf ears.

After the comment period ends, and SubPro considers the comments, the report will be submitted to the GNSO Council for approval. Subsequently, it will need to be approved by the ICANN board of directors.

It’s not impossible that this could all happen this year, but there’s a hell of a lot of implementation work to be done before ICANN starts accepting applications once more. We could be looking at 2023 before the next window opens and 2024 before the next batch of new gTLDs start to launch.

UPDATE: This post was updated August 27, 2020 to clarify procedural and timing issues.

New (kinda) geo-TLD rules laid out at ICANN 66

Kevin Murphy, November 2, 2019, Domain Policy

The proposed rules for companies thinking about applying for a geographic gTLD in the next application round have been sketched out.

They’re the same as the old rules.

At ICANN 66 in Montreal today, a GNSO Policy Development Process working group team discussed its recently submitted final report (pdf) into geographic strings at the top level.

While the group, which comprised over 160 members, has been working for over two years on potential changes to the rules laid out in the 2012 Applicant Guidebook, it has basically concluded by consensus that no changes are needed.

What it has decided is that the GNSO policy on new gTLDs that was agreed upon in 2007 should be updated to come into line with the current AGB.

It appears to be a case of the GNSO setting a policy, the ICANN staff and board implementing rules inconsistent with that policy, then, seven years later, the GNSO changing its policy to comply with that top-down mandate.

It’s not really how bottom-up ICANN is supposed to work.

But at least nobody’s going to have to learn a whole new set of rules when the next application round opens.

The 2012 AGB bans two-letter gTLDs, for example, to avoid confusion with ccTLDs. It also places strong restrictions on the UN-recognized names of countries, territories, capital cities and regions.

It also gave the Governmental Advisory Committee sweeping powers to object to any gTLD it didn’t like the look of.

What it didn’t do was restrict geographic names such as “Amazon”, which is an undeniably famous geographic feature but which does not appear on any of the International Organization for Standardization (ISO) lists that the AGB defers to.

Amazon the retailer has been fighting for its .amazon gTLDs for seven years, and it appears that the new GNSO recommendations will do nothing to provide clarity for edge-case applicants such as this in future rounds.

The group that came up with the report — known as Work Track 5 of the New gTLD Subsequent Procedures PDP Working Group — evidently had members who wanted to reduce geographic-string protections and those who wanted to increase them.

Members ultimately reached “consensus” — indicating that most but not all members agreed with the outcome — to stick with the status quo.

Nevertheless, the Montreal session this afternoon concluded with a great deal of back-slapping and expressions that Work Track 5 had allowed all voices, even those whose requests were ultimately declined, to be heard equally and fairly.

The final report has been submitted to the full WG for adoption, after which it will go to the full GNSO for approval, before heading to public comment and the ICANN board of directors as part of the PDP’s full final report.

How new gTLD auctions could kill gaming for good

Kevin Murphy, January 11, 2019, Domain Policy

Ever heard of a Vickrey auction? Me neither, but there’s a good possibility that it could become the way most new gTLD fights get resolved in future.

It’s one of several methods being proposed to help eliminate gaming in the next new gTLD application round that have received some support in a recently closed round of public comments.

ICANN’s New gTLD Subsequent Procedures working group (SubPro) is the volunteer effort currently writing the high-level rules governing future new gTLD applications.

Two months ago, it published a preliminary report exploring possible ways that contention sets could be resolved.

The current system, from the 2012 round, actively encourages applicants to privately resolve their sets. Usually, this entails a private auction in which the winning bid is shared evenly between the losing applicants.

This has been happening for the last five years, and a lot of money has been made.

Losing auctions can be a big money-spinner. Publicly traded portfolio registry MMX, for example, has so far made a profit of over $50 million losing private auctions, judging by its annual reports. It spent $13.5 million on application fees in 2012.

MMX is actually in the registry business, of course. But there’s a concern that its numbers will encourage gaming in future.

Companies could submit applications for scores of gTLDs they have no intention of actually operating, banking on making many multiples of their investment by losing private auctions.

Pointing no fingers, it’s very probably already happened. But what to do about it?

Who’s this Vickrey chap?

One suggestion that seems to be getting some love from diverse sections of the community is a variation of the “Vickrey auction”.

Named after the Canadian Nobel Prize-winning economist William Vickrey, it’s also called a “second price sealed bid auction”.

Basically, each applicant would secretly submit the maximum price they’d be willing to pay for the contested gTLD, and the applicant with the highest bid would pay the amount of the second-highest bid.

This method has, I believe, been used more than once in private contention resolution during the 2012 round.

But under the system suggested by SubPro, each applicant would make their single, sealed, high bid at time of application, before they know who else is gunning for the same string.

That way, contention sets could be mostly eliminated right at the start of the process, leading to time and cost efficiencies.

There’d be no need for every application in a contention set to go through full evaluation. Only the high bidder would be evaluated. If it failed evaluation, the second-highest bidder would go into evaluation, etc, until a successful applicant was found.
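The mechanics described above can be sketched in a few lines. This is my own illustration, not SubPro’s specification — the applicant names and bids are invented, and the assumption that a winner who inherits the gTLD after a higher bidder fails evaluation pays the next-highest remaining bid is mine:

```python
def run_vickrey(bids, passes_evaluation):
    """Second-price sealed-bid auction with sequential evaluation.

    bids: dict mapping applicant -> sealed maximum bid, submitted
    blind at application time. passes_evaluation: a callable that
    stands in for ICANN's technical/financial evaluation.
    Returns (winner, price_paid), or (None, None) if nobody passes.
    """
    # Rank applicants by bid, highest first.
    ranked = sorted(bids, key=bids.get, reverse=True)
    for i, applicant in enumerate(ranked):
        # Only the current high bidder is evaluated; everyone below
        # skips evaluation entirely unless they're needed.
        if passes_evaluation(applicant):
            # Winner pays the next-highest bid, or their own bid
            # if no lower bidder remains (assumption, see above).
            price = bids[ranked[i + 1]] if i + 1 < len(ranked) else bids[applicant]
            return applicant, price
    return None, None

sealed = {"Alpha Registry": 5_000_000, "Beta TLDs": 3_200_000, "Gamma Inc": 1_100_000}
# Suppose the high bidder fails evaluation: Beta inherits the string.
winner, price = run_vickrey(sealed, passes_evaluation=lambda a: a != "Alpha Registry")
print(winner, price)  # Beta TLDs 1100000
```

Note how little work is done overall: in the common case only one applicant per contention set is ever evaluated.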

For losing applicants, a possible benefit of this is that they’d get much more of their application fees refunded, because they’d be skipping much of the process.

Neither would they have to bear the ambient running costs of sitting on their hands for potentially years while the ICANN process plays itself out.

It could also substantially speed up the next round. If the round has five, 10, 20 or more times as many applications as the 1,930 received in 2012, resolving contention sets at the very outset could cut literally years off processing times.

The SubPro concept also envisages that the winning bid (which is to say, the second-highest bid) would go directly into ICANN’s coffers, eliminating the incentive to game the system by losing auctions.

I must admit, there’s a lot to love about it. But it has drawbacks, and critics.

Why Vickrey may suck

SubPro itself notes that the Vickrey model it outlines would have to take into account other aspects of the new gTLD program, such as community applications, applicants seeking financial support from ICANN, and objections.

It also highlights concerns that bids submitted at the time of application constitute private business-plan information that applicants may not necessarily want ICANN staff seeing (with the revolving door, this info could quite easily end up at a competitor).

Companies and constituencies responding to the recent public comment period also have concerns.

There’s hesitance among some potential applicants about being asked to submit blind bids. There are clearly cases where an applicant would be prepared to pay more to keep a gTLD out of the hands of a competitor.

One could imagine, for example, that Coca-Cola would be ready to spend a lot more money on .cola if it knew Pepsi was also bidding, and possibly less if it were only up against Wolf Cola.

The Intellectual Property Constituency raised this concern. It said that it was open to the idea of Vickrey auctions, but that it preferred that bids should be submitted after all the applications in the contention set have been revealed, rather than at time of application:

Although there is a potential downside to this in that the parties have not put a “value” on the string in advance, the reality is that many factors come into play in assessing that “value”, certainly for a brand owner applicant and possibly for all applicants, including who the other parties are and how they have indicated they intend to use the TLD.

The Brand Registry Group and Neustar were both also against the Vickrey model outlined by SubPro, but neither explained their thinking.

The Business Constituency, which is often of a mind with the IPC, in this case differed. The BC said it agreed that bids should be submitted alongside applications, only to be unsealed in the event that there is contention. The BC said:

This Vickrey auction would also resolve contention sets very early in the application evaluation process. That saves contending applicants from spending years and significant sums during the contention resolution process, which was very difficult for small applicants.

It’s hard to gauge where current registries, which are of course also likely applicants, stand on Vickrey. The Registries Stakeholder Group is a pretty diverse bunch nowadays and it submitted a set of comments that, unhelpfully, flatly contradict each other.

“Some” RySG members believe that the current evaluation and contention process should stay in place, though they’re open to a Vickrey-style auction replacing the current ascending-clock model at the last-resort stage after all evaluations are complete.

“Other” RySG members, contrarily, wholeheartedly support the idea that bids should be submitted at the time of application and the auction processed, Vickrey-style, before evaluation.

“An application process which requires a thorough evaluation of an applicant who will not later be operating the gTLD is not an efficient process,” these “other” RySG members wrote. They added:

if contention sets are resolved after the evaluation process and not at the beginning of it, like the Vickery model suggestion, it would enable applicants who applied for multiple strings to increase the size of their future bids each time they lost an auction. Each TLD needs to be treated on its own merits with no contingencies allowed for applicants with numerous applications.

It’s not at all clear which registries fall into the “some” category and which into “other”, nor is it clear the respective size of each group.

Given the lack of substantive objections to pre-evaluation Vickrey auctions from the “some” camp, I rather suspect they’re the registries hoping to make money from private settlements in the next round.

Other ideas

Other anti-gaming ideas put forward by SubPro, which did not attract a lot of support, included:

  • A lottery. Contention sets would be settled by pulling an applicant’s name out of a hat.
  • An RFP process. This would mean comparative, merit-based evaluation, which has never been a popular idea in ICANN circles.
  • Graduated fees. Basically, applicants would pay more in application fees for each subsequent application they filed. This would disadvantage portfolio applicants, but could give smaller applicants a better shot at getting the string they want.
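The graduated-fees idea could be as simple as a per-applicant surcharge schedule. A minimal sketch — the 25% step is invented for illustration, since SubPro put no numbers on it:

```python
def graduated_fee(base_fee, application_index, step=0.25):
    """Fee for an applicant's Nth application (0-indexed).

    Each subsequent application from the same applicant costs
    `step` more of the base fee than the last. The step value is
    purely illustrative; SubPro proposed no figures.
    """
    return base_fee * (1 + step * application_index)

base = 185_000  # the 2012 application fee, USD
# A portfolio applicant's first four applications:
fees = [graduated_fee(base, i) for i in range(4)]
print(fees)  # [185000.0, 231250.0, 277500.0, 323750.0]
```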

All of the comments filed on SubPro’s work have been fed back into the working group, where discussions about the next new gTLD round will soon enter their fourth year…