
URL Shortener tr.im To Go Community-Owned, Open Source 145

Posted by Soulskill
from the keeping-up-with-the-twitses dept.
Death Metal sends word that the owners of URL-shortening service tr.im are in the process of releasing the project's source code and moving it into the public domain. This comes after reports that the service may shut down and that they were entertaining offers from prospective buyers. From a post on the site's blog: "It is our hope that tr.im, being an excellent URL shortener in its own right, can now begin to stand in contrast to the closed twitter/bit.ly walled garden: it will become a completely open solution owned and operated by the community for the benefit of the entire community." They plan to complete the transition by September 15th, and the code will be released under the MIT license. In addition, "tr.im will offer all link-map data associated with tr.im URLs to anyone that wants it in real-time. This will involve a variety of time-based snapshots of aggregated destination URLs, the number of tr.im URLs created for any given destination URL, and aggregate click data."
  • by Bruha (412869) on Wednesday August 19, 2009 @09:06AM (#29117601) Homepage Journal

    They serve no purpose other than giving people a way to distribute malicious links. The idea was to save some bandwidth, but now they use more, because people have to write mouseover scripts to see where a link actually goes, which causes several lookups of the same URL per person instead of the link just sitting in a post somewhere.

    In most cases the URL itself is less than 1% of the size of the content of a web page so exactly who or what they're saving is unclear.

  • I mean, I like OSS and all, but I wrote my own redirector for my domain; it can't be more than 30 lines of PHP.

    http://example.com/index.php?url=http://example.org/long+url/

    SQL lookup: return the short URL if the target already exists, otherwise insert it with the next ID. Return: http://example.org/10/ [example.org]

    mod_rewrite to handle the redirect, and ta-da.

    Added benefit of not scaring off friends with an odd domain.
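[The scheme the commenter describes can be sketched in a few lines of Python, with sqlite3 standing in for the SQL table. The table name, column names, and URL format here are invented for illustration; this is not tr.im's or the commenter's actual code.]

```python
import sqlite3

# In-memory table mapping an auto-incrementing id to a destination URL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (id INTEGER PRIMARY KEY AUTOINCREMENT, url TEXT UNIQUE)")

def shorten(url):
    """Return the short path for url, inserting a new row only if it is new."""
    row = conn.execute("SELECT id FROM links WHERE url = ?", (url,)).fetchone()
    if row is None:
        cur = conn.execute("INSERT INTO links (url) VALUES (?)", (url,))
        conn.commit()
        return "/%d/" % cur.lastrowid
    return "/%d/" % row[0]

def resolve(short_id):
    """Look up the destination URL for a short id (the redirect target)."""
    row = conn.execute("SELECT url FROM links WHERE id = ?", (short_id,)).fetchone()
    return row[0] if row else None
```

In the PHP version, the same lookup would sit behind a mod_rewrite rule that maps /10/ onto something like index.php?id=10 before issuing the redirect.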

    • by Rockoon (1252108)
      I was thinking the same sort of thing.

      For their next attention-getting trick, they are going to open source Hello World.
      • For their next attention-getting trick, they are going to open source Hello World.

        No need [gnu.org].

        • by IBBoard (1128019)

          But that's GPLed. I disagree with the GPL on [insert random reason here] grounds. If only they'd used the BSD license, then I could take all of their work and incorporate it into [insert proprietary program here] without having to pay anything back to the community. ;)

  • by zwei2stein (782480) on Wednesday August 19, 2009 @09:08AM (#29117633) Homepage

    So, they're going open. How is this going to solve the issues that make shorteners evil ( http://www.techcrunch.com/2009/04/06/are-url-shorteners-a-necessary-evil-or-just-evil/ [techcrunch.com] )?

    transparency loss (great, there's a DB that can resolve links; are browsers supposed to query shortener-like URLs and display the proper ones?)

    rot & reliability loss (tr.im claims they will stay open forever and totally won't sell the domain to the highest bidder and whatnot, but the domain is still the weakest link: if it breaks, tons of links break with it)

    pointless proxy (great, so now it's a pointless 'open' proxy. yay).

    • by Sockatume (732728)

      Standards for defining "evil" have slipped in the past few years, I see.

      If they're "going open" then I'd say that it's a good start on an open "shortened URL" standard that could, some day, solve those issues while providing a similar function. I can see the use for such a system, if only to provide a way to share links away from a computer, and I'll take short URLs over 2D codes any day.

    • by IBBoard (1128019)

      (tr.im claims they will stay open forever and totally won't sell the domain to the highest bidder and whatnot, but the domain is still the weakest link: if it breaks, tons of links break with it)

      But he said it, and it is now reported on the Internet, so surely it must be true? How can anyone have ever said anything that was then reported on the Internet that wasn't true or that they knew couldn't last? I'm sure that if he does move ownership of the domain to a company or organisation then that company would never sell o

  • by ashtophoenix (929197) on Wednesday August 19, 2009 @09:09AM (#29117637) Homepage Journal

    Death Metal sends word that the owners of URL-shortening service tr.im are in the process of releasing the project's source code and moving it into the public domain.

    So?

  • by TimHunter (174406) on Wednesday August 19, 2009 @09:42AM (#29117985)
    FTFA:

    Starting today, tr.im will begin its migration into the public domain

    The source code for tr.im will be released under the MIT open-source license.

    Maybe I'm being too literal here, but MIT-licensed source code is not in the public domain.

    • begin its migration into the public domain

      Though I can't see why they need an intermediate step at "open source" between "proprietary" and "public domain".

  • by samj (115984) <samj@samj.net> on Wednesday August 19, 2009 @09:52AM (#29118119) Homepage

    I've had a beef with URL shorteners for a long while now for reasons that have been covered ad nauseam (not the least of which being that in addition to adding significant overhead [techcrunch.com] - typically hundreds of milliseconds per request - they are just plain evil [techcrunch.com]). IMO the best solution is to let webmasters create and advertise their own short links using the "shortlink" link relation (e.g. rel="shortlink" in the HTTP headers and/or HTML HEAD) such that they can be auto-detected by clients who then no longer need to generate their own using 3rd party services. I wrote the shortlink specification [purl.org] a few months ago (based on similar work done by others), released it into the public domain using CC Zero and went about soliciting feedback. The standard got a big shot in the arm last week when WordPress.com announced support for rel=shortlink [wordpress.com] on over 100 million pages. I've since requested support be introduced into the top 20 Twitter clients (representing over 80% of Twitter usage) and have had only positive feedback so far. A number of other high profile sites like PHP.net and Ars Technica have also jumped on board. Anyway if you, like me, are sick of URL shorteners then you're welcome to give me a hand making them go away...
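[The auto-detection the comment describes can be sketched client-side in Python with the standard library's HTML parser. This is a simplified illustration of reading rel="shortlink" from a page's HEAD, not code from the spec; a real client would also check the HTTP Link header, and would handle multi-valued rel attributes.]

```python
from html.parser import HTMLParser

class ShortlinkFinder(HTMLParser):
    """Record the href of the first <link rel="shortlink"> tag seen."""
    def __init__(self):
        super().__init__()
        self.shortlink = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Simplified: exact match on rel; real rel attributes are space-separated lists.
        if tag == "link" and (a.get("rel") or "").lower() == "shortlink" and self.shortlink is None:
            self.shortlink = a.get("href")

def find_shortlink(html):
    """Return the page's advertised short URL, or None if it has none."""
    parser = ShortlinkFinder()
    parser.feed(html)
    return parser.shortlink
```

With sites publishing their own short links this way, a Twitter client can use the site's link instead of minting one through a third-party shortener.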

    Sam

    • by dkf (304284)

      I wrote the shortlink specification [purl.org] a few months ago (based on similar work done by others), released it into the public domain using CC Zero and went about soliciting feedback.

      So, are you going to just put it on a random website out there, or are you going to do the proper thing and get it on a standards track somewhere? (Maybe IETF or W3C.) That's the only way to get it really trusted by the bulk of users, since they trust those organizations to keep on doing what they've been doing for years.

    • Oh no - not hundreds of milliseconds! Anything but that, for a site I will use a shortener to visit one time in my life! Sounds to me like another case of "I don't understand why people want this, so nobody should have it".

      Shortlink is a good idea for what it does, but it still puts the onus on the web site owner to create and permanently save a shortlink for every piece of content that can differ based on GET parameters. When you're a Google, that's a lot of latitudes and longitudes to have to retain forever.

      The only argument I've heard against shorteners so far boils down to "but people can misuse it!" -- which in the end boils down to "this is For Your Own Good". Never something I've been particularly fond of - especially on the Internet.

      • by PitaBred (632671)
        ...latitudes and longitudes you have to retain forever? I think you're confused about how mapping works. An infinite number of latitudes and longitudes existed well before Google. Well before the USA and computers, for that matter.
        • Sounds like I'm not expressing myself clearly. There are a theoretically infinite number of addresses that map to specific coordinates. A given set of directions that someone wants to link to will contain the source coordinates and one or more destination coordinates.

          In order to provide a shortlink as described, every set of unique directions requested consisting of 2..n coordinates must be perpetually mapped to a shortlink that can be accessed directly for all time.

          To make the same point using a di

    • by Eil (82413)

      Ah, see, you had me in total agreement until I noticed that your source material was a pair of TechCrunch articles.

  • Ok, I know this isn't technically on topic and I'm sorry about that... but I'm having this problem with Firefox on a few sites, and since I haven't found anyone else who suffers from it, I haven't been able to isolate it properly.

    I just tried submitting a URL to tr.im and after doing so my browser bogged down and slowed to a crawl. My CPU usage jumps to 50% (so 100% of one of the two cores I have) and my whole system becomes unresponsive. Meanwhile, the "answer" section of tr.im is "fading in".

  • so cute (Score:3, Funny)

    by jDeepbeep (913892) on Wednesday August 19, 2009 @09:57AM (#29118189)
    I prefer www.socuteurl.com [socuteurl.com]. It's just, irresistible. There. I said it. I've made the first step toward recovery.
  • Actually, I am thinking of creating a URL shortener inside my intranet. Here's a purpose that no one's thought of, or at least mentioned: it gives a layer of abstraction. Inside the company, people can send emails or put links on web pages that point to my URL shortener, let's say, "Company Policies". That link will always work no matter if the target web page stays on our legacy ASP system or gets moved to our shiny new Sharepoint. All they have to do to fix thousands of links is update the target in the
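[The abstraction-layer point can be shown in a toy sketch: every published link points at the short name, so retargeting one entry fixes them all. The names and URLs below are invented for the example.]

```python
# Short name -> current destination. Published links only ever reference the short name.
targets = {
    "company-policies": "http://legacy.intranet/asp/policies.asp",
}

def lookup(name):
    """Return the current destination for a short name (the redirect target)."""
    return targets[name]

# Migration day: one update, and every email and page link that references
# "company-policies" now lands on the new page without being touched.
targets["company-policies"] = "http://sharepoint.intranet/sites/policies"
```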

  • I always wondered how these haven't taken down Twitter so far, with all the URLs being shortened. I am not a huge fan of Twitter, but it serves me well as a means of getting information quickly from a plethora of sources. But I really have a bad feeling about people clicking all those shortened URLs without a second thought. All it takes is to subvert one popular tweeter and bang.
  • I'm sorry...

    Twitter is a walled garden. To @reply someone, you have to go through Twitter.

    Facebook is even more of a walled garden. There's a large number of things you can only do with other people on facebook, once you have a facebook account. And, facebook may keep your data forever.

    But URL shorteners? I'm all for making things open source and interoperable, but all this does is make a long URL into a short one. What would opening it up accomplish compared to, say, making Facebook work with OpenID and XF

    • by Yvan256 (722131)

      Even if you don't take Twitter into account, IMHO there are already enough problems with dead links everywhere. When (not if) those URL shortener services go off-line, say hello to millions upon millions of dead links that go nowhere.
