After 32 Years, One of the Net's Oldest Software Archives Is Shutting Down (arstechnica.com) 42

Benj Edwards reports via Ars Technica: In a move that marks the end of an era, New Mexico State University (NMSU) recently announced the impending closure of its Hobbes OS/2 Archive on April 15, 2024. For over three decades, the archive has been a key resource for users of the IBM OS/2 operating system and its successors, which once competed fiercely with Microsoft Windows. In a statement made to The Register, a representative of NMSU wrote, "We have made the difficult decision to no longer host these files on hobbes.nmsu.edu. Although I am unable to go into specifics, we had to evaluate our priorities and had to make the difficult decision to discontinue the service."

Hobbes is hosted by the Department of Information & Communication Technologies at New Mexico State University in Las Cruces, New Mexico. In the official announcement, the site reads, "After many years of service, hobbes.nmsu.edu will be decommissioned and will no longer be available. As of April 15th, 2024, this site will no longer exist." The earliest record we've found of the Hobbes archive online is this 1992 Walnut Creek CD-ROM collection that gathered up the contents of the archive for offline distribution. At around 32 years old, minimum, that makes Hobbes one of the oldest software archives on the Internet, akin to the University of Michigan's archives and ibiblio at UNC.

This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by RitchCraft ( 6454710 ) on Monday January 29, 2024 @09:34PM (#64199350)

    I hope this archive is added to archive.org if it hasn't been already.

  • Make a mirror (Score:5, Informative)

    by TwistedGreen ( 80055 ) on Monday January 29, 2024 @09:34PM (#64199352)

    These old archives used to be huge, but now they're tiny by today's standards. And they already have a tar file for you to download, so you don't have to figure out that messy ftp mirror syntax. Plus it's only 19 GB.

    ftp://hobbes.nmsu.edu/archives... [nmsu.edu]

    Dust off your anonymous ftp skills and grab it while you can!
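
    For anyone whose FTP muscle memory has faded, here is a minimal sketch of the same grab using Python's ftplib with an anonymous login. The /archives path and the tarball name are assumptions taken from the (truncated) link above and the download log further down the thread; check the directory listing before relying on them.

    import ftplib

    with ftplib.FTP("hobbes.nmsu.edu") as ftp:
        ftp.login()                      # anonymous login, no credentials needed
        ftp.cwd("/archives")             # assumed path, per the truncated link above
        print(ftp.nlst())                # list what is actually there first
        with open("hobbes_ftp_11Jan2024.tar", "wb") as out:
            # RETR streams the file down in binary mode, chunk by chunk
            ftp.retrbinary("RETR hobbes_ftp_11Jan2024.tar", out.write)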

    • by sconeu ( 64226 )

      Warning: Tarchive is 19GB in size.

      • by Dwedit ( 232252 )

        A 19GB TAR archive is the worst possible use of Tar. You'd want to use literally anything but Tar for this use case.

        Why? Tar provides no random access at all; the entire file needs to be streamed through just to list what files are in the archive.

        • WTF do people still insist on using tar these days? There are far superior archive formats now. Tar was meant for bundling up files to write out to a tape drive.
        • by tlhIngan ( 30335 )

          Why? Tar provides no random access at all; the entire file needs to be streamed through just to list what files are in the archive.

          Do you know why PKZip always goes "FAST!" in the banner? That's because the Zip header (or really, the trailer - it's at the end of the file) contains the Zip directory, allowing you to quickly list and get to any file in the archive. It's also why split zips are so inconvenient to deal with - the directory has to be written to the first floppy disk so it has to reserve space ...
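
          As an illustration of the difference being described (not anyone's actual workflow here), Python's standard library exposes both behaviors; the filenames below are placeholders:

          import tarfile
          import zipfile

          # Fast regardless of archive size: zipfile seeks to the central
          # directory stored at the end of the file and reads the entry list there.
          with zipfile.ZipFile("archive.zip") as z:
              zip_names = z.namelist()

          # Slow on a 19 GB tar: every entry's header is visited in sequence
          # from the start of the file before the listing is complete.
          with tarfile.open("hobbes_ftp_11Jan2024.tar") as t:
              tar_names = t.getnames()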

        • I don't think that's entirely fair. If you don't need compression, tar is a very convenient way to bundle up a large directory while preserving permissions and symlinks with minimal system requirements. With download speeds what they are these days, it's often faster to just download an uncompressed copy than it is to wait for the compression to complete, especially on a small server strapped for resources. After all, I downloaded the 19GB file in just over 5 minutes. Plus with today's disk access speeds, ...

        • by Improv ( 2467 )

          This is not strictly speaking true. While the file list isn't present upfront, it's possible to skip over most of the contents of each file; getting a full file list ends up being a "read header, seek ahead, read header, seek ahead, read header, seek ahead" operation that might traverse the whole file but doesn't need to read every block of it.
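
          A rough sketch of that "read header, seek ahead" walk, done by hand on an uncompressed tar (a .tar.gz would still have to be decompressed sequentially); Python's tarfile does the equivalent seeking internally when the underlying file object is seekable:

          import os

          def list_tar(path):
              names = []
              with open(path, "rb") as f:
                  while True:
                      header = f.read(512)               # ustar header block
                      if len(header) < 512 or header == b"\0" * 512:
                          break                          # end-of-archive marker
                      names.append(header[:100].rstrip(b"\0").decode("utf-8", "replace"))
                      size = int(header[124:136].rstrip(b"\0 ") or b"0", 8)  # octal size field
                      # Skip the file data, rounded up to the next 512-byte boundary.
                      f.seek((size + 511) // 512 * 512, os.SEEK_CUR)
              return names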

    • Re:Make a mirror (Score:4, Informative)

      by GameboyRMH ( 1153867 ) <gameboyrmh&gmail,com> on Tuesday January 30, 2024 @12:35AM (#64199680) Journal

      Torrent option here:

      https://archive.org/details/ho... [archive.org]

  • by YetAnotherDrew ( 664604 ) on Monday January 29, 2024 @09:37PM (#64199360)

    which once "competed fiercely" with Microsoft Windows

    FTFY

    • by jonadab ( 583620 )
      Indeed. OS/2 did have some fans, but it was never meaningful competition for the mainstream OSes. Maybe it wanted to be, maybe even by some measures it should have been, but in practice it wasn't. Its market share never climbed out of the fraction-of-a-percent range. Even things like VMS and BeOS were more widely used.
  • Not sure who the other one is, but I'm sure we'll both be sad.

    Seriously though, I'll just mirror it and park it on my NAS ... Back when I actively used OS/2, that would've been unthinkable.

    • by sabri ( 584428 )

      Connecting to hobbes.nmsu.edu (hobbes.nmsu.edu)|128.123.88.139|:443... connected.
      HTTP request sent, awaiting response... 200 OK
      Length: 19469445120 (18G) [application/x-tar]
      Saving to: "hobbes_ftp_11Jan2024.tar"

      hobbes_ftp_11Jan2024.tar 0%[ ] 83.10M 722KB/s eta 7h 48m

      If only they had some bandwidth... That torrent option looks appealing now.

  • 32 years matches the age of Aminet [wikipedia.org], which will now outlive this. (Though not 'online' back in 1992, Aminet did start as an archive then.)

  • by JBMcB ( 73720 ) on Monday January 29, 2024 @09:45PM (#64199378)

    ftp.funet.fi (Demos!)
    ftp.sunet.se
    Tucows
    Whatever MIT's archive was called (can't remember but it was extensive and, for some reason, abysmally slow)
    UMinn's Gopher archives
    Walnut Creek, but IIRC they charged if you wanted faster downloads
    Info-Mac
    University of Illinois Urbana-Champaign (The supercomputing center)

    • by sconeu ( 64226 )

      Oh come on... don't forget SIMTEL-20

    • ftp.nasa.gov - where all sorts of stuff got stuffed. They had a great early collection of 3D data files
      ftp.aminet.net - all the opensource Amiga software

    • Walnut Creek, but IIRC they charged if you wanted faster downloads

      WC-CDROM did not charge for downloads.

      It was an "archive of archives" and mirrored many other collections, but also built their own.

      cdrom.com was the busiest site on the Internet in 1994. They were overtaken by Netscape in 1995.

    • by Burdell ( 228580 )

      wuarchive.wustl.edu (Washington University in St. Louis) which gave us the wu-ftpd server software that was widely used on other sites

      sunsite.unc.edu (University of North Carolina) was another large FTP repository for free software

      sciences.sdsu.edu (San Diego State University) had a bunch of music (like movie/TV themes) in Sun AU format... if you had a fast enough connection, you could get them straight to /dev/audio - streaming music before anybody called it that!

    • by jonwil ( 467024 )

      I can't remember if it was an actual host or just mirrored things from elsewhere but I remember the University of Oakland had a big archive back in the day.

  • "...Although I am unable to go into specifics, we had to evaluate our priorities and had to make the difficult decision to discontinue the service."

    "Sensors detect Microsoft OneDrive contract, Captain..."

    Narrator: "It was not, in fact, cheaper to host their files in The Cloud."

  • It's already been mentioned in this thread, but I hope archive.org picks it up. In recent times, I have found myself looking at older software archives more and more as I start to relearn the value of finished software, software that was good, and not the hellish nightmare we live in now where everything's online activation and constantly updating, or being put more and more behind a paywall, or in a higher features tier. I can guarantee there is probably shovelware in there, but I still enjoy the experience ...
    • Handy to have a script built to use wget or equivalent to crawl and archive entire sites. I'm into a lot of retro computing stuff, and any time I come across some kind of site with a gold mine of information or software related to retro computing, I'll run my script to mirror the entire site. You never know when you come back a year, three, or five from now whether the site will still be around.
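
      One possible shape for such a script, as a starting point: a thin Python wrapper around wget's own mirroring mode. The flags below are standard wget options; the destination directory and URL are whatever you point it at.

      import subprocess
      import sys

      def mirror(url, dest="mirrors"):
          subprocess.run([
              "wget",
              "--mirror",            # recursive fetch with timestamping
              "--convert-links",     # rewrite links so the copy works offline
              "--page-requisites",   # also grab images/CSS the pages need
              "--no-parent",         # stay below the starting directory
              "--adjust-extension",  # save HTML pages with .html extensions
              "-P", dest,            # output directory
              url,
          ], check=True)

      if __name__ == "__main__":
          mirror(sys.argv[1])
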
  • by Anonymous Coward

    Back to www.download.com we go.

  • "I am unable to go into specifics, we had to evaluate our priorities and had to make the difficult decision to discontinue the service"

    Hmm... was it really a maintenance cost issue, or did IBM reach out for some royalties, or with a DMCA notice that was too expensive to fight?

    • by ls671 ( 1122017 )

      Yeah, I wondered about that too. They have to be on a very tight budget to cut costs by saving 19GB of storage space, maybe a little more taking RAID and backups into account, but still, it's only 19GB of data to begin with.

  • I'm curious whether the backend for hosting this is disproportionately complex (either following a design from when 19GB of data was still something of some note, or perhaps quite literally a configuration that has been brought forward for a couple of decades with only minimal changes; I'd assume that it's not still running on literally the same FTP servers it started on); whether it was someone's passion project and they are retiring/died; or whether the bean counters are looking so carefully and squeezing ...
    • by Scoth ( 879800 )

      If I had to guess, as someone involved in a bunch of security audit junk at my current company - there's probably been a security audit or vulnerability scan on everything the university offers for download. Some auditor somewhere freaked out that they're offering this 30-year-old software for download, with visions of some dumbass complaining that they'd downloaded it from them and were liable for their computer getting hacked because they installed this random ancient program. My company had to clean up ...

    • Having worked at NMSU many years ago, I know this used to be hosted on a NeXT workstation and was migrated to an RS6k when the university moved to AIX from SunOS and NeXT. While it has been many years since I worked there, my suspicion is that it lost "sponsorship" from executive management as not adding "value" to the university. But that is pure speculation. The technical aspects of running this are not the most likely cause of its demise; as has been mentioned, the storage space and bandwidth are minimal, ...

  • "Although I am unable to go into specifics..."

    Hopefully someone will elaborate. It can't be storage space; at the scale of any college, storage is basically free.

  • I guess the requirements of bandwidth and disk space for such a large and popular archive are too much for the university to bear.

    It's sort of like the problem that OpenBSD has in maintaining their ancient releases; the multiple megabytes of storage required is just too much for their meagre budget.

    I hope these problems can be solved in the coming 21st century.

  • People think it was mostly an archive of old OS/2 software, but it was also the repository where new OS/2 software was getting uploaded for the community. It is not as simple as just backing it up and storing it on Archive.org. Now the OS/2 community needs a new place to share its newly produced OS/2 software online.
