
Long Live Closed-Source Software?

Posted by Zonk
from the i-am-something-of-a-fan-of-closed-source-games dept.
EvilRyry writes "In an article for Discover Magazine, Jaron Lanier writes about his belief that open source produces nothing interesting because of a hidebound mentality. 'Open wisdom-of-crowds software movements have become influential, but they haven't promoted the kind of radical creativity I love most in computer science. If anything, they've been hindrances. Some of the youngest, brightest minds have been trapped in a 1970s intellectual framework because they are hypnotized into accepting old software designs as if they were facts of nature. Linux is a superbly polished copy of an antique, shinier than the original, perhaps, but still defined by it.'"
This discussion has been archived. No new comments can be posted.

  • by rs79 (71822) <hostmaster@open-rsc.org> on Sunday December 30, 2007 @03:47PM (#21858480) Homepage
    Apache.

  • Stupid phrasing (Score:3, Informative)

    by JamesRose (1062530) on Sunday December 30, 2007 @04:04PM (#21858618)
    Obviously that's just not true of all open source software. However, with some OSS, like OpenOffice, I just can't be bothered, because they're not making something in its own right, they're trying to replace closed-source software by copying it: no creativity, just coding for the sake of it being open source and giving them a warm fuzzy feeling inside. For me, using OpenOffice at the moment is like stepping back to MS Office ten years ago. Why would I do that? MS Office came with my PC, so it hasn't cost me anything (it did, but not directly), and more importantly, in businesses the users aren't charged anything; it's just an office expense. The guy does have a point though: it's no longer enough just to be open source. To be accepted, you MUST be open source and useful. I think that's a step that was missed when OSS developers started looking for wider distribution among people who weren't interested in computer ethics.
  • by Anonymous Coward on Sunday December 30, 2007 @04:09PM (#21858658)
    Long Live Closed-Source Software!
    There's a reason the iPhone doesn't come with Linux.
    by Jaron Lanier

    If you've just been cornered by Martha Stewart at an interdisciplinary science conference and chastised for being a wimp, you could only be at one event: Sci Foo, an experimental, invitation-only, wikilike annual conference that takes place at Google headquarters in Mountain View, California. There is almost no preplanned agenda. Instead, there's a moment early on when the crowd of scientists rushes up to blank poster-size calendars and scrawls on them to reserve rooms and times for talks on whatever topic comes to mind. For instance, physicist Lee Smolin, sci-fi author Neal Stephenson, and I talked about the relationship between time and math (touching on ideas presented in my October 2006 column).

    The wimp comment was directed at me, and Martha was right. I hadn't stood up for myself in a group interaction. I've always been the shy one in the schoolyard. Back in the 1980s, I was drawn to the possibility that virtual reality would help extend the magical, creative qualities of childhood into adulthood. Indeed, the effect of digital technology on culture has been exactly that, but childhood is not entirely easy. If Lee hadn't forged through the crowd to create our session, I never would have done it. What made Martha's critique particularly memorable, though, is that her observation was directly relevant to what emerged from Sci Foo as the big idea about the future of science.

    It wasn't official, of course, but the big idea kept popping up: Science as a whole should consider adopting the ideals of "Web 2.0," becoming more like the community process behind Wikipedia or the open-source operating system Linux. And that goes double for synthetic biology, the current buzzword for a superambitious type of biotechnology that draws on the techniques of computer science. There were more sessions devoted to ideas along these lines than to any other topic, and the presenters of those sessions tended to be the younger ones, indicating that the notion is ascendant.

    It's a trend that seems ill-founded to me, and to explain why, I'll tell a story from my early twenties. Visualize, if you will, the most transcendentally messy, hirsute, and otherwise eccentric pair of young nerds on the planet. One was me; the other was Richard Stallman. Richard was distraught to the point of tears. He had poured his energies into a celebrated project to build a radically new kind of computer called the LISP Machine. It wasn't just a regular computer running LISP, a programming language beloved by artificial intelligence researchers. Instead it was a machine patterned on LISP from the bottom up, making a radical statement about what computing could be like at every level, from the underlying architecture to the user interface. For a brief period, every hot computer-science department had to own some of these refrigerator-size gadgets.

    It came to pass that a company called Symbolics became the sole seller of LISP machines. Richard realized that a whole experimental subculture of computer science risked being dragged into the toilet if anything happened to that little company--and of course everything bad happened to it in short order.

    So Richard hatched a plan. Never again would computer code, and the culture that grew up with it, be trapped inside a wall of commerce and legality. He would instigate a free version of an ascendant, if rather dull, program: the Unix operating system. That simple act would blast apart the idea that lawyers and companies could control software culture. Eventually a kid named Linus Torvalds followed in Richard's footsteps and did something related, but using the popular Intel chips instead. His effort yielded Linux, the basis for a vastly expanded open-software movement.

    But back to that dingy bachelor pad near MIT. When Richard told me his plan, I was intrigued but sad. I thought that code was important in more ways than politics can ever be. If politically correct code
  • Re:Apache (Score:5, Informative)

    by Bill Dimm (463823) on Sunday December 30, 2007 @04:25PM (#21858778) Homepage
    I wonder if he uploaded his shit article on an apache server.

    curl -sI 'http://discovermagazine.com/2007/dec/long-live-closed-source-software/' | head -2

    HTTP/1.0 200 OK
    Server: Zope/(Zope 2.9.6-final, python 2.4.0, linux2) ZServer/1.1 Plone/2.5.2

  • by rdean400 (322321) on Sunday December 30, 2007 @04:29PM (#21858810)
    I think it's a fair assessment that open source spends a lot of time reinventing the wheel for the sake of having OSS coverage, but that's not to say the realm of OSS is devoid of innovation.

    To be honest, the only piece of innovation that's really given me a "Wow!" moment in Open Source is the Mylyn project from Eclipse.
  • by poopdeville (841677) on Sunday December 30, 2007 @04:31PM (#21858832)
    Sure, there are *popular* examples, such as Apache. But popularity doesn't mean innovative. Apache was simply one of the first web servers, which caused it to get hammered on until it was useful. But there's nothing in Apache that makes you stand back and say, "Wow! That's absolutely brilliant thinking!"

    If you're cynical enough, you could say the same thing about any software. On the other hand, Apache was innovative. And the Apache Foundation continues to found and fund new projects, including SpamAssassin -- the first Bayesian spam filter.

    In any case, Haskell is open source. So is Erlang.

    While I'm sympathetic to Jaron's point, I think he's missing a big one. Linux represents about 30 years of knowledge of best practices in software engineering. This is not a bad thing, because Linux is flexible enough to support nearly any kind of computation environment, right now. Including the weird experimental ones he (and I) like. And the run of the mill workhorse desktop environments most people need.
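    For anyone curious what the "Bayesian" part of a filter like SpamAssassin actually means, here's a toy sketch of naive-Bayes log-odds scoring with add-one smoothing. The function names and the tiny training set are made up for illustration; real filters use far more sophisticated tokenization and training corpora.

    ```python
    import math
    from collections import Counter

    def train(messages):
        """Count word occurrences per class from (text, is_spam) pairs."""
        counts = {True: Counter(), False: Counter()}
        totals = {True: 0, False: 0}
        for text, is_spam in messages:
            for word in text.lower().split():
                counts[is_spam][word] += 1
            totals[is_spam] += 1
        return counts, totals

    def spam_score(text, counts, totals):
        """Log-odds that `text` is spam; positive means 'looks spammy'."""
        # Prior log-odds from message counts (smoothed).
        score = math.log((totals[True] + 1) / (totals[False] + 1))
        n_spam = sum(counts[True].values())
        n_ham = sum(counts[False].values())
        for word in text.lower().split():
            # Add-one smoothing so unseen words don't zero out the product.
            p_spam = (counts[True][word] + 1) / (n_spam + 2)
            p_ham = (counts[False][word] + 1) / (n_ham + 2)
            score += math.log(p_spam / p_ham)
        return score

    training = [
        ("cheap viagra now", True),
        ("free money offer", True),
        ("meeting agenda attached", False),
        ("lunch on friday", False),
    ]
    counts, totals = train(training)
    print(spam_score("free viagra", counts, totals) > 0)     # True
    print(spam_score("friday meeting", counts, totals) > 0)  # False
    ```

    The point is just that the "innovation" is statistical, not architectural: score each word by how much more often it appears in spam than in ham, and sum the evidence.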
  • Re:NIH syndrome (Score:2, Informative)

    by jcaldwel (935913) on Sunday December 30, 2007 @04:35PM (#21858866)

    the desire to reimplement everything from the ground up using 'new technology' but this really falls into the trap of thinking that new is automatically better.

    From the sounds of it, Jaron Lanier really wants to start from scratch. A quote from an interview with Sun: [sun.com]

    Interviewer: Maybe we need to go back and start all over again?

    Jaron: That's what I've been thinking lately. Tracing the history of programming, we can see places where it went wrong, based on the limited experiences and metaphors that were available at the time. It's possible to imagine a different history. Let's go back to the middle of the 20th century, to a very brilliant, first generation of serious hackers that included people like Alan Turing, John von Neumann, and Claude Shannon. Their primary source of coding experience involved coding information that could be sent over a wire. They were familiar with encoded messages on the telegraph and telephone. Everything was formulated in terms of a message being sent from point A to point B, with some advance knowledge on point B about the nature of the message. Or if not that, at least an attempt by point B to recreate that knowledge, in the case of hacking.

    ...So much for standing on the shoulders of giants.

  • by semiotec (948062) on Sunday December 30, 2007 @05:17PM (#21859222)
    Bollocks!

    Going from "Is my work really outdated?" to "How can I keep my work from becoming outdated?" implicitly assumes that the work _is_ outdated. Wasting time considering how to deal with inane questions from a clueless intellectual artiste is just stupid.

    Would you ask your plumber how to improve network design just because some guy thinks the Internet is a series of tubes?

    Sure, it's important to have constructive criticisms and developers certainly should be open to such, but it's just as important to understand which criticisms are even worth considering.

    If you've ever worked in scientific research, you'll have a better understanding of this. In every area of research I've been involved in, there are always some quacks or backyard inventors who claim they have found the solution to all the unanswered questions, yet refuse to publish or elaborate, arguing only that the scientific community is too old-fashioned to accept their ideas. Bollocks!

    -----

    Now, please carefully consider the following:

    1. Was your comment really just bollocks?

    then follow up with:

    2. How can you keep your comment from becoming total bollocks?
    3. How can you write comments that don't sound like bollocks?
  • Re:I'll bite: (Score:3, Informative)

    by ShinmaWa (449201) on Sunday December 30, 2007 @05:42PM (#21859426)

    Having used all of the above, what's especially innovative about any of them?
    Okay, I'll bite back. I can't speak to Hibernate or Spring, but I will speak to Eclipse.

    Eclipse is a fully mature, OSGi-compliant tools platform that just happens to be, in its default form, a self-hosted Java IDE. However, Eclipse itself can be transmogrified [eclipse.org] into anything you want it to be, including application servers, games, smart clients, and software that helps run both the Dutch railway and NASA's Mars rovers. That seems pretty innovative to me.
  • by Weedlekin (836313) on Monday December 31, 2007 @03:02PM (#21868554)
    "Linux was simply the first x86 Unix that supported
    the sort of hardware that people actually have rather than
    some Sun engineer's notion of what a PC should be."

    This is, of course, true only if one chooses to ignore Coherent and SCO Xenix (the original SCO, not the Caldera bunch who now own the name), both of which were available for IBM PCs and clones thereof in the early 1980s.
