
ICANN Approves Non-Latin ccTLDs

Posted by kdawson
from the here-comes-everybody dept.
Several readers including alphadogg tipped the news that ICANN has approved non-Latin ccTLDs at its meeting in Seoul. "Starting in mid-November, countries and territories will be able to apply to show domain names in their native language, a major technical tweak to the Internet designed to increase language accessibility. On Friday, the Internet's addressing authority approved a Fast-Track Process for applying for an IDN (Internationalized Domain Name) and will begin accepting applications on Nov. 16. The move comes after years of technical testing and policy development... Currently, domain names can only be displayed using the Latin alphabet letters A-Z, the digits 0-9 and the hyphen, but in future countries will be able to display country-code Top Level Domains (cc TLDs) in their native language. ... 'The usability of IDNs may be limited, as not all application software is capable of working with IDNs,' ICANN said in a 59-page proposal (PDF) dated Sept. 30 that describes the [application] process." Reader dhermann adds, "Great, now even less chance I can identify NSFW links before they are blocked by my work's big brother app and my boss is notified... again."
This discussion has been archived. No new comments can be posted.

  • by czarangelus (805501) <iapetus&gmail,com> on Friday October 30, 2009 @10:36AM (#29923617)
    Arabic TLDs are a threat to national security
    • by Canazza (1428553)

      it's only a matter of time before someone registers bánkófámérícá.com or llóydstsb.co.uk for their phishing schemes

      • Re: (Score:3, Informative)

        by Anonymous Coward

        That has been possible for years.
        This is about registering bankofamerica.cõm or lloydstsb.cø.ûk

        The part AFTER the dot.

  • by Anonymous Coward

    I'm glad we're going with Non-Latin TLDs now, I never understood going to the website "e.pluribus.unm"

  • Perdire (Score:3, Funny)

    by SEWilco (27983) on Friday October 30, 2009 @10:41AM (#29923695) Journal
    There go my plans for world domination through venividivici.vvv
  • by azior (1302509) on Friday October 30, 2009 @10:42AM (#29923703)

    ï höpé thãt slâshðõt wìll dö thís töø wìth ÜRLs!

    www.íçáñn.örg

    ìt wörkéð!

    • by Anonymous Coward on Friday October 30, 2009 @11:03AM (#29924009)

      Here's a demonstration of how non-Latin characters show on /., starting with Arabic:

      Hindi:
      Russian:
      Japanese:
      Korean:
      Chinese:

      • Here's a demonstration of how non-Latin characters show on /., starting with Arabic:

        Hindi:
        Russian:
        Japanese:
        Korean:
        Chinese:

        Just because the characters don't show up in the edited text doesn't mean that they won't be handled in anchor tags or Slashdot's URL tag.

        • Re: (Score:3, Insightful)

          by rxmd (205533)

          Just because the characters don't show up in the edited text doesn't mean that they won't be handled in anchor tags or Slashdot's URL tag.

          Well, Slashdot mangles them anyway [russian-]. The URL should end in .com.

          Slashdot's web interface is quite embarrassing in this respect. Having a non-Unicode-capable page in 2009 is like having one that is optimized for Netscape 0.9, no matter what amount of JavaScript and Web 2.0 bling they put in there.

          If international URLs will finally force Slashdot to implement a triviality such as string parsing, so much the better.

    • by mrdoogee (1179081)
      A Møøse once bit my sister...
  • ICANN has lost it! (Score:4, Insightful)

    by RiotingPacifist (1228016) on Friday October 30, 2009 @10:44AM (#29923735)

Far too much software makes the assumption that TLDs only contain [a-z0-9-], so if you want to go changing that there needs to be a damn good reason, and there isn't one. There are ~1369 two-letter TLDs to be shared between ~200 sovereign states and 49284 three-letter generic ones to be split between uses (.xxx, .nws, .org, .edu, etc.), so there doesn't seem to be any good reason to expand that and make lots of software more complex.

    • by Looce (1062620) * on Friday October 30, 2009 @10:49AM (#29923803) Journal

      ... of course, is Punycode.

A comment [slashdot.org] before yours has www.íçáñn.örg, which, when entered into Firefox, turns into www.xn--n-tfarxw.xn--rg-eka. Looks like the software will still live :)
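For anyone who wants to reproduce that conversion, Python's built-in IDNA codec (which implements the IDNA 2003 ToASCII step: nameprep plus Punycode) gives the same result; a minimal sketch:

```python
# Convert an internationalized domain name to its ASCII-compatible
# (Punycode) wire form using Python's built-in IDNA 2003 codec.
domain = "www.íçáñn.örg"
wire = domain.encode("idna")       # ToASCII is applied label by label
print(wire)                        # b'www.xn--n-tfarxw.xn--rg-eka'
print(wire.decode("idna"))         # ToUnicode round-trips it back
```

That xn-- form is what actually goes out in the DNS query, which is why resolvers that only understand [a-z0-9-] keep working.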

    • Re: (Score:2, Informative)

      by imagoon (1159473)
      If everyone in the world liked those latin characters, then sure. But maybe someone else in the world prefers yahoo.(nihon*)? Wanted to write it in kanji but /. doesn't seem to take unicode.
Domain names have been muddy for quite some time. Think of all the non-commercial dot-coms. Or government sites on anything other than their .gov or their country code. del.icio.us? They've been mostly ignored. People get .com to look professional, .net at random (though it is supposed to be for ISPs), and .org if they want to stand for some ideal.

      Though TBH I'm not certain WHY we need TLDs anyways. It isn't like there is some commercial slashdot.com it just redirects. I imagine that any big name will
    • Re: (Score:2, Insightful)

      by Anonymous Coward

This is about letting people use characters from their frickin' own language instead of just English.

      Just like so many other things in programming.. if the software doesn't do international, it doesn't do international.

      This has nothing to do with making more TLDs.

    • Re: (Score:3, Insightful)

      by Jorgensen (313325)

      Yeah right. Because everybody in the whole world only uses ASCII right?

Sorry for sounding flippant, but such US-myopia is far too prevalent for my liking... Come on guys: wake up and smell the coffee! There is more to the world than the US! There is no reason to make most of South East Asia and China second-rate citizens on the internet.

      I agree that there is a lot of software that needs changing as a result though. But that just means more work, right? You could probably sell this as an anti-recession measure

    • by cdrudge (68377)

      Far too much software makes the assumption that TLDs only contain [a-z0-9-]

It's not really an assumption, is it, if until now the "standard" only called for [a-z0-9-]?

    • Re: (Score:3, Insightful)

      by jayme0227 (1558821)

      You know, except for ease of use for those who don't use Latin characters in their daily lives. But who cares about them? They should just go back to their own country and create their own internet.

    • Re: (Score:3, Informative)

      by bill_mcgonigle (4333) *

      if you want to go changing that there needs to be a damn good reason

      I don't have any first-hand experience, but according to the BBC story when one enters a native-script domain name into one's browser, the domain name is entered normally (for the locale) and then to enter, e.g., ".in", one needs to press a key combination to shift the keyboard into latin-mode, then, enter the two letters, then shift the keyboard back into native mode.

      It's a usability problem. I sure would be annoyed if .com had to be rend

  • Encoding? (Score:2, Interesting)

    by mewsenews (251487)

    The encoding seems weird to me:

    In reality, the new domain names will be stored in the DNS as sequences of letters and numbers beginning xn-- in order to maintain compatibility with the existing infrastructure. The characters following the xn-- will be used to encode a sequence of Unicode characters representing the country name.

    Any DNS gurus care to explain why they wouldn't simply use UTF8?

    • Re:Encoding? (Score:4, Informative)

      by Psx29 (538840) on Friday October 30, 2009 @10:53AM (#29923843)
      "in order to maintain compatibility with the existing infrastructure." Tons (dare I say, a majority) of software would break if they used UTF8
    • Re:Encoding? (Score:5, Informative)

      by DamonHD (794830) <d@hd.org> on Friday October 30, 2009 @10:53AM (#29923859) Homepage

      To avoid breaking all the DNS-related code out there that assumes (ie correctly, based on the current spec) only alphanumerics and '-' in each component.

      If you wish to rewrite every single bit of DNS-dependent code, in every laptop, server, embedded network device, etc, etc, ... well assume that it can't be done, and with this mechanism it doesn't need to be. Though I bet a few bits of code will barf at the '--' anyhow...

      Rgds

      Damon

      • by amorsen (7485)

        (ie correctly, based on the current spec)

        Only hostnames are restricted. Other than that, DNS is almost 8-bit-clean (it case folds A-Z to a-z and dot is special) so UTF-8 is fine.

        Punycode only exists because some people have puny ...

        • by DamonHD (794830)

          Can you be sure that the DNS code in the WinME that runs your building's lifts is 8-bit clean, just for example? Or your old-but-good HP laser printer with embedded networking?

This is pragmatically addressing the probability of code still in use but written long ago, when UTF-8 and 8-bit-clean were woolly notions and twinkles in academic eyes, or just badly slapped together by some junior lowest-bid developer who thought "oh, just (ASCII) letters and numbers" and it seemed to work...

          I'd bet you a whole doll

        • by kc8apf (89233)

Actually, all labels are restricted to the characters allowed for ARPANET hosts. The spec does state that implementations should store labels as a length octet followed by a sequence of octets, thus implying that any compliant software _should_ handle UTF-8, but no one wants to take that chance.

    • Re: (Score:2, Informative)

      by tokul (682258)

      Any DNS gurus care to explain why they wouldn't simply use UTF8?

      I am not DNS guru, but guessing. RFC882 - November 1983. RFC2044 - October 1996.

    • by NevarMore (248971)

      Backwards compatibility with existing systems that don't support UTF-8 but still need to make DNS queries. Ranges from basic tools like dig, to un-updated browsers, to embedded devices like routers.

      Are there any public DNS servers that support this to see what happens with my existing software??

    • by Looce (1062620) *

      Since software makes the assumption that TLDs only contain [a-z0-9-] [slashdot.org], UTF-8 can't be used in the DNS. Internationalised domain names, even before these new ccTLDs, used that xn-- system, called Punycode [wikipedia.org]. For instance, the site tinyarro.ws, which provides short URLs via a Unicode domain name, already used .ws for that purpose. It turns into xn--hgi.ws when the DNS request is issued.

      ccTLDs using Punycode is just an extension of that mechanism for second-level domains.
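The xn-- prefix is literally just glued onto a raw Punycode-encoded label; Python also exposes plain RFC 3492 as the "punycode" codec, so the tinyarro.ws example above can be reproduced as a sketch:

```python
# Build the ACE form of a single-character Unicode label by hand:
# raw RFC 3492 Punycode plus the "xn--" prefix.
label = "➡"                                    # U+27A1 BLACK RIGHTWARDS ARROW
ace = "xn--" + label.encode("punycode").decode("ascii")
print(ace + ".ws")                             # xn--hgi.ws, as issued in the DNS request
```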

      • Yeah, Slashdot apparently needs to be internationalised too. That ".ws" should be "[U+27A1].ws" (BLACK RIGHTWARDS ARROW).

        • by tepples (727027)

          Slashdot apparently needs to be internationalised too.

          Slashdot uses a Unicode character whitelist due to past abuse [slashdot.org], and U+27A1 isn't on that whitelist. The euro sign € is though.

          • by koolfy (1213316)
And what's the story of the "" ( ... ) unicode character? Is it supported already? (looks like it isn't)
      • Re: (Score:3, Insightful)

        by Creepy (93888)

Actually, UTF-8 can be and is being used in DNS, as long as you stick to basic Latin characters, that is. Also, "Unicode" by itself is ambiguous: as I posted earlier, it is a blanket term covering UTF-8, UTF-16 and UTF-32.

A UTF-8 byte in the range 0x00-0x7F is plain ASCII; a byte with the high (8th) bit set is part of a multibyte sequence. So to fully support it you'd still need to exclude the ASCII values that aren't valid hostname characters, and add support for multibyte sequences and the high bit. The reason existing DNS servers won't work with it is that the high bit indicates multibyte
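The ASCII-compatibility point is easy to check: encoding anything outside 0x00-0x7F produces bytes with the high bit set, which is exactly what naive ASCII-only DNS code trips over. A quick illustration:

```python
# ASCII survives UTF-8 encoding byte-for-byte; anything else becomes
# a multibyte sequence whose bytes all have the high bit set.
for ch in ("a", "ö", "中"):
    data = ch.encode("utf-8")
    high = [hex(b) for b in data if b >= 0x80]
    print(ch, data, "high-bit bytes:", high)
```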

Good question. The field sizes in DNS messages are in 16-bit increments, so I don't see why it couldn't have been.

The goal is to encode international characters using only the characters currently accepted by the standard (a-z, 0-9, etc.), which UTF-8 does not do. Also, the number of characters in a domain label is limited to 63 (and it is the encoded length that counts), so the coding has to be efficient. This is precisely what Punycode [wikipedia.org] is designed to do. Software can recognize an encoded name by the fact that it begins with the special sequence of letters "xn--"
    • by Delwin (599872) *
because UTF-8 only solves the null-termination problem, not the readable-character issue.
    • Any DNS gurus care to explain why they wouldn't simply use UTF8?

Because they know full well that the vast majority of web developers don't really know what Unicode is [joelonsoftware.com] or how it works. Moreover, the Unicode spec is forever in flux and complete overkill for the international URL problem. Latin-only URLs are a fly; we don't need a bazooka.

      Frankly, the current Punycode based system is truly inspired, giving the best of both worlds. Newer browsers can display and use international urls seamlessly, but older syst

To prevent phishing and other abuse. It provides an identifiable stopgap to prevent me doing something with the URL that I've just done in this moment.
  • Phishing aid (Score:5, Insightful)

    by querist (97166) on Friday October 30, 2009 @10:50AM (#29923817) Homepage
This will only make phishing attacks easier unless there are SERIOUS checks on domain name registrations. There are letters in the Cyrillic alphabet that have different character codes than their look-alike letters in the Latin alphabet. I'm sure there are other collisions as well. I'm sure they accounted for this in the proposal, but the problem always lies in the implementation. From a security standpoint, this is a VERY bad idea without proper regulation of domain name registrations, and so far it has been demonstrated that we cannot manage them properly even with only the Latin alphabet.

From a cultural and usability standpoint, this is a good thing. It will be easier for someone whose native language uses a non-Latin alphabet to recognize the supposed purpose of a web site by its domain name if some of those domain names can be in their native language. A hypothetical native Tamil speaker who speaks no English will be able to recognize the purpose of a site with an appropriate domain name in Tamil, for example.
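The Cyrillic collision is easy to demonstrate: Latin "a" (U+0061) and Cyrillic "а" (U+0430) render identically in most fonts but compare as different strings, which is the whole phishing risk. A small illustration (the domain is hypothetical):

```python
import unicodedata

# Two visually identical strings that are different domain names:
# the second "paypal" uses a Cyrillic а (U+0430) instead of Latin a.
latin, lookalike = "paypal", "pаypal"
print(latin == lookalike)                      # False
for ch in (latin[1], lookalike[1]):
    print(hex(ord(ch)), unicodedata.name(ch))
```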
    • There are letters in the Cyrillic alphabet that have different character codes than their look-alike letters in the Latin alphabet. I'm sure there are other collisions as well. I'm sure they accounted for this in the proposal, but the problem always lies in the implementation

      This is a decision made by ICANN. We've known for some time that they will willingly approve really tremendously bad ideas, if enough money is presented to them. They recently moved on a motion to start selling gTLDs, after all.

      From a security standpoint, this is a VERY bad idea without proper regulation of domain name registrations, and so far it has been demonstrated that we cannot manage them properly even with only the Latin alphabet

      Security is not of any concern for ICANN. Never has been, never will be. As long as they keep making money they're happy; security, spam, phishing, etc, be damned.

    • Re:Phishing aid (Score:4, Informative)

      by nsayer (86181) <nsayer@k[ ]com ['fu.' in gap]> on Friday October 30, 2009 @11:10AM (#29924099) Homepage

      I think the limitation that nationalized character sets will be restricted to the country TLDs where that language is native is a good first step. Additionally, I believe you're not allowed to use the latin alternative form characters from unicode (like 0xFF20-0xFF5F).

      If you're really paranoid, you could just be extra suspicious of domains that end in two letters (and yes, I am including .us), particularly when the 2nd level name is something you recognize, like paypal, ebay, etc. If you're in China, there may indeed be a legitimate paypal.cn, but I suspect it would set off my spidey sense to see a URL like that show up in my e-mail.

      • Re: (Score:3, Insightful)

        by dkf (304284)

        If you're really paranoid, you could just be extra suspicious of domains that end in two letters (and yes, I am including .us), particularly when the 2nd level name is something you recognize, like paypal, ebay, etc. If you're in China, there may indeed be a legitimate paypal.cn, but I suspect it would set off my spidey sense to see a URL like that show up in my e-mail.

        That won't work. There really are a lot of big companies that have country-specific sites that use the two-letter global domains. For example, if you're after books in German then you might be very interested in visiting amazon.de, which is totally legit.

        • Re: (Score:3, Insightful)

          by nsayer (86181)

          Yeah, but if you know that you want that, then you'll be expecting it. We're talking about being on the lookout for 2 letter TLDs in places you don't expect them.

    • I don't think it's a big deal for TLDs since afaict those are created manually anyway.

For lower-level domains (which are already using IDN) it's a bigger issue; Firefox resorted to using a whitelist to get around irresponsible registrars.

    • Re: (Score:2, Informative)

      by pablo.cl (539566)

      There are letters in the Cyrillic alphabet that have different character codes than their look-alike letters in the Latin alphabet.

      Remember we are talking about ccTLDs. There are no more than 200 countries that would like to use non ASCII ccTLD, and they can be inspected manually. Russia wasn't awarded Cyrillic .ru because it looks like Latin .py (Paraguay). They will get .fr (Russian Federation) that looks like 0p (0 with vertical bar).

      • Russia wasn't awarded Cyrillic .ru because it looks like Latin .py (Paraguay). They will get .fr (Russian Federation) that looks like 0p (0 with vertical bar).

        Are you sure it's not .rf - which doesn't clash with anything either, and makes much more sense to Russians themselves (since that is the standard abbreviation for Russian Federation in Russian).

    • Re:Phishing aid (Score:5, Insightful)

      by Mathieu Lutfy (69) * on Friday October 30, 2009 @12:09PM (#29925011) Homepage

      This risk can be greatly reduced if they limit domain names to only one alphabet, i.e. Russian domain with Cyrillic ccTLD should have only Cyrillic letters in it.

      In many of these countries, they often have two domain names for a website: one that is easy to remember by foreigners, one that is easy to remember by locals (i.e. cyrillic name transliterated to Latin alphabet). The transliterated domain name is usually horrible, sounds weird, and often people transliterate stuff in different ways, so it's often not easy to remember anyway.

      I think non-latin ccTLDs is a good thing.

      matt

Yay!!! The door is open for an even harder to detect phishing scheme! Imagine the emails linking to http://slashdöt.org/something...

    I'm all for internationalization, but perhaps limit it to internationalized domain extensions (.jp or .es for example)...
    • by nsayer (86181)

      You not only didn't read TFA, but you didn't even read the summary very well, did you?

    • Re: (Score:3, Informative)

      by mjwx (966435)
You do know that this is for the TLD part of the URL only. The first part of a domain can already be written in non-Latin scripts, Korean for example, but the TLD must be Latin; this decision just enables the .com.kr to be turned into Hangul.

      If ICANN did not standardise this then nations will just implement their own systems which will be different and incompatible with each other, much like China and Thailand have already done.
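As noted above, the non-Latin second level already works today via the same Punycode mechanism; a sketch with a Hangul label (the label itself is hypothetical):

```python
# Round-trip a Hangul label through raw Punycode (RFC 3492), the same
# transformation already used for IDN second-level domains today.
label = "한국"                                   # "Korea" in Hangul
wire = "xn--" + label.encode("punycode").decode("ascii")
assert wire.isascii()                            # safe for legacy resolvers
restored = wire[4:].encode("ascii").decode("punycode")
print(wire, restored == label)
```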
The current RFC 1738 http://www.faqs.org/rfcs/rfc1738.html [faqs.org] only allows URLs to be composed of

" Within those parts, an octet may be represented by the character which has that octet as its code within the US-ASCII [20] coded character set. In addition, octets may be encoded by a character triplet consisting of the character "%" followed by the two hexadecimal digits (from "0123456789ABCDEF") which forming the hexadecimal value of the octet. (The characters "abcdef" may also be used in hexadecimal encod

    • by nsayer (86181)

      RTFA. Internationalized characters in domains are encoded. See also RFC 3492.

  • Is it my imagination, or does this proposal only apply for TLDs, like .uk and .jp? I don't see any mention of supporting it for the rest of the domain name. That seems a logical extension, but it's not been announced.

    • Re: (Score:3, Informative)

      by petermgreen (876956)

      It's already been in use for the rest of the domain name under certain TLDs for some time.

  • cmon how could you think "but in future countries" sounds okay.

    it should be "but in the future countries"

Great info though. I mean, it's nice to see that the internet is starting to become more international, especially as the US cuts mandatory ties to ICANN.

  • Excellent idea (Score:4, Insightful)

    by ugen (93902) on Friday October 30, 2009 @11:11AM (#29924109)

    Now those countries, organizations and businesses that wish to become inaccessible to most of the world (except the native speakers of their own language) can finally do so as easily as possible. Create their own little Internet reservations and stay there :)

As long as my software (such as Firefox) obligingly converts these IDN URLs into the dash-hex notation, making them obviously unreadable, I am ok with that.

Disclaimer: I am a native of a non-English-speaking country. I am sure a few of my countrymen will use this feature out of misplaced patriotism. I am also sure that the vast majority will ignore it, just like they ignore the ability to use non-Latin domain names that exists right now.

    • by Alioth (221270)

      How is this any different today?

      If the content pointed to by a domain written in Latin characters points to a site written in Chinese, non-speakers of Chinese still can't actually do anything useful with the site.

      • by Petaris (771874)

Why? There are translators. The problem is that you are cutting people off, as they can't type in Chinese characters (I know it's possible, but you have to install extra bits to do so, and know what you're typing and how to type it on a Latin keyboard).

I have friends in a few different non-Latin-alphabet countries and family in one. If their email addresses were in their alphabet, I likely wouldn't be able to email them easily, even though we mostly correspond in English.

    • by wvmarle (1070040)

      For example in China this will be used a lot. Other countries using non-Latin scripts will do so as well.

      Actually it was possible already for a few years to register domain names in Chinese characters in Hong Kong, but still ending in .hk. Now that part can also become Chinese characters as replacement for .hk, .cn or .tw.

      The catch was that a Chinese URL would work only within HK/China. Now this will also start to work worldwide.

      One big issue for many lower-educated Chinese is that the Latin script is as

    • They make sense as aliases to sites that also have standard domain names.

    • by chord.wav (599850)

      Your point may be valid for some non-latin languages but as soon as you put China into the equation the figures change radically.

China IS the world. They are zillions! If they start using Chinese names it'll be us, "Latin speakers", who'll be confined to our "own language and make our sites inaccessible to the rest of the world".

  • Unicode can mean many things - UTF-8, UTF-16, UTF-32 - so specifying Unicode is not detailed enough to implement and by not specifying, it is opening a can of worms IMO. UTF-8 tends to be slower and larger for non-ASCII but has wide acceptance. It would also be the favorite for Linux/UNIX because it is very common there (my Linux box has LANG=en_US.UTF-8) and also for communication with databases (in my experience, UTF-8 is what most enterprise companies use for their database settings if they need multi-

    • TFA is badly written and factually inaccurate.

All that is actually going on here is that ICANN is allowing use of IDN (which is already in use at lower levels of the hierarchy) in the root. The standards for IDN already specify exactly how the names are encoded.

      http://tools.ietf.org/html/rfc3490 [ietf.org]

    • Re: (Score:3, Informative)

      by spitzak (4019)

      Several mistakes there.

      First of all any domain name is going to have to be encoded as a stream of bytes somehow because far too much stuff is already implemented to handle the string that way. As others pointed out punycode is used.

Second, UTF-8 is smaller than UTF-16 for all languages, even Chinese. This is because all the ASCII 0x00-0x7F characters are smaller, and therefore the encoding will be smaller if there are more of these than Unicode 0x0800-0xFFFF characters. This seems incorrect for Chinese
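The size claim is easy to measure: for pure CJK text UTF-16 is actually the smaller of the two (2 bytes vs 3 per character), but as soon as ASCII labels and dots are mixed in, UTF-8 tends to win. A quick check:

```python
# Compare encoded sizes: ASCII is 1 byte in UTF-8 vs 2 in UTF-16,
# while BMP CJK characters are 3 bytes in UTF-8 vs 2 in UTF-16.
for text in ("example", "例子", "example.例子"):
    print(repr(text),
          "utf-8:", len(text.encode("utf-8")),
          "utf-16:", len(text.encode("utf-16-le")))
```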

  • See these slides about exploiting UTF-aware software.

    http://www.casabasecurity.com/files/Chris_Weber_Character%20Transformations%20v1.7_IUC33.pdf [casabasecurity.com]

  • by mano.m (1587187) on Friday October 30, 2009 @11:33AM (#29924457)
A lot of the debate here seems to be about English-speaking countries vs. the rest of the world, but English isn't the only language that uses the Latin script. Also, the unavailability of non-Latin scripts hasn't hampered the flourishing of home-grown websites in India and China named in their many local languages, so what makes ICANN think this is even necessary?
There's only a handful of languages that use strict ASCII; one is dead, and the rest are a small family of closely related languages spoken by about 2 million people in the Pacific. Everyone else's languages need what TFS and TFA describe TLDs as now being able to do.

  • by Nadaka (224565) on Friday October 30, 2009 @11:40AM (#29924577)

Yay. Now you can register yourbankname.com with some funky characters that render in exactly the same way as the letters you are used to.

Aren't IDNs already available via Punycode encoding? (For example the ones at http://www.w3.org/2003/Talks/0425-duerst-idniri/slide12-0.html [w3.org]) Or am I missing something?
  • could be chàse.com or cháse.com

    every website i go to from now on, i need to study the url with a magnifying glass to make sure i am getting the actual site i wanted. not even as a security precaution, but just to avoid phony sites that might be spoofing a real one for all sorts of purposes, even if just humor, not all of them nefarious, but all of them certainly annoying

a with accent mark may be easy to see, but there are some subtle unicode characters that look almost exactly like the lowercase "L"
