New York Times Says Thin Clients Are Making a Comeback

One of the seemingly eternal questions in managing personal computers within organizations is whether to centralize computing power (making it easy to upgrade or secure The One True Computer, and its data) or to push the power out toward the edges, where an individual user isn't crippled when a server on the other side of the network goes down or the network itself proves unreliable. Despite the ever-increasing power of personal computers, the New York Times reports that the concept of making individual users' screens portals (smart ones) to bigger iron elsewhere on the network is making a comeback.
  • How cool! (Score:5, Funny)

    by Tink2000 ( 524407 ) on Monday October 13, 2008 @02:05AM (#25351769) Homepage Journal

    Now, the terminals that my workplace has had since 2003 are back in vogue. Awesome.

    • Re: (Score:3, Insightful)

      Heh, I'm in financial services, try 1963. Nothing like using a state of the art thick client to emulate a 60's era dumb terminal... your fees at work!
    • Re: (Score:2, Funny)

      by Anonymous Coward

      2003? Bring back 1973, the last year when American clients were thin.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Monday October 13, 2008 @02:08AM (#25351789)
    Comment removed based on user account deletion
    • by Bill, Shooter of Bul ( 629286 ) on Monday October 13, 2008 @02:25AM (#25351875) Journal
      Don't forget the cost of maintaining the network. In a school district setting, that would probably mean a WAN connecting all of the schools and district offices together. If the network goes down... everyone has to stop working. I'm sure you are very talented and it might work for your particular district. In my area, I know the level of network engineers they have, and I'm convinced the whole thing would blow up.
      • Which is why it's not a great idea to put mission-critical thin clients across a WAN.

        Though having worked for several years in large corporate environments (and their associated love for Citrix farms), I would observe:

        - WAN accelerators work. A Riverbed (mind you, at ~$50,000 AUD a pop it ain't exactly cheap) will make a 2M link seem like LAN speeds for the protocols it's optimised for. Depending on the cost of bandwidth...

        - Consolidation does not have to go overboard. If there are at least a few hundred users, it can be cost-efficient to run a local server. Most network problems that are not the result of a bungled change or a cabling stuffup are on the WAN.

        - Government network? Good luck with that, buddy!

        - The bean counters find it very easy to quantify the cost 'savings' and push their agenda as such. Potential losses due to downtime caused by network outages are much harder to pin down; heck, the Fortune 500 company I am presently contracted to doesn't even have a method for estimating the dollar cost of downtime, let alone a method for estimating the amount of downtime likely to occur (a rough sketch of such an estimate follows at the end of this comment). Needless to say, they also choose the cheapest carrier, which has a ridiculous inability to meet its SLA, and then consolidate like mad to place even more reliance on the WAN.

        Like most things in IT, there is no silver bullet or magic formula; each case needs to be judged on its own merits.

        And on a side note, given how much hardware costs have dropped and the fact that user requirements have remained relatively static (i.e. most generic office workers are still using the same software as 4 years ago), how hard can it be to embed the email client (with local cache so they can at least view emails they already downloaded) and office suite on the thin client itself so at least they can keep working on documents?
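        On the downtime point above, a minimal back-of-envelope sketch (in Python) of how one might estimate the dollar cost of WAN outages; every figure is a made-up assumption for illustration, not a number from this thread:

            # Hypothetical estimate of annual WAN-downtime cost (illustrative only)
            users_affected = 500          # staff who depend on the centralised apps
            loaded_cost_per_hour = 60.0   # fully loaded cost of one employee, $/hour
            productivity_lost = 0.8       # fraction of work that stops during an outage
            expected_outage_hours = 20.0  # expected WAN downtime per year

            expected_annual_loss = (users_affected * loaded_cost_per_hour
                                    * productivity_lost * expected_outage_hours)
            print(f"Expected annual downtime cost: ${expected_annual_loss:,.0f}")
            # -> Expected annual downtime cost: $480,000

        Even rough numbers like these make the claimed 'savings' easier to weigh against the risk of putting everything on the WAN.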

        • A riverbed (mind you at ~$50,000AUD a pop it ain't exactly cheap) will make a 2M link seem like LAN speeds

          When it stops working, do you refer to it as a billabong?

        • by peragrin ( 659227 ) on Monday October 13, 2008 @06:47AM (#25353139)

          Here is the kicker: you can't easily run Citrix and Windows apps across a WAN; they need too much bandwidth and are too lag-sensitive.

          My company runs an AIX server with SSH access. Each user literally SSHes into the server, which loads up the access to the point-of-sale/inventory database. Everything important is tightly controlled, but the fact that you can run it over a dial-up 33.6 modem effectively means that even if the internet is choking you can still work.

          • Have you tried Sun Ray? Some Citrix marketing material uses cute statistics such as "average bandwidth used over 24 HOURS" which might make it seem to be a better low bandwidth solution than it is. On the other hand, I find Sun Ray very usable over a WAN even though my home network connection barely exceeds 1Mb. As to the fact that thin clients are useless when the network goes down... so are PCs for all practical purposes. The fact that Sun Rays uses 1/40th the power of a typical desktop PC would be aw
          • by afidel ( 530433 )
            You couldn't be more wrong, Citrix ICA works on dialup with 300+ms ping and 19kbit/s bandwidth. It even works over satellite for some definition of 'works'. About the only thing it doesn't handle well is large amounts of packet loss and huge print jobs on bandwidth starved connections (though it's better than anything else here since it uses raw print files which are as small as you can get and still have the formatted output).
      • Re: (Score:2, Informative)

        by lostguru ( 987112 )
        Our district already has it: each school has two T1s direct to the district office, and VoIP and web all go through there. It works fine; the only problem is we can't get the morons at the district to remove things from the content filters.
      • But the network is now considered a normal, critical service, like electricity and fresh water.
      • If the network goes down... everyone has to stop working.

        At this point, if the network goes down then all clients, thin or thick, will effectively stop working anyway.

      • If the network goes down... everyone has to stop working.

        The same thing is true if the power goes out. I suppose some people might get a few hours out of a laptop battery, but then they're down.

    • Re: (Score:2, Informative)

      Of course the idea of running a server and a bunch of lightweight clients is so much easier to tend to. I work for a school district and we run our own version of thin/diskless clients. We have a few thousand running now, and we convert about a thousand more every year. After 3 years of great improvements all around, we are never going back to individual stations. I do find it comical that old ideas seem to keep coming back, and it just might be because they are good ideas. Of course, we run f
      • You sidestepped the major issues and questions surrounding thin clients and related setups:

        The first question is: for your supposed all-around solution, what exactly is it intended to be used for?

        The second is:
        Why could the problem in the first question not be solved by people simply having computers, albeit cheap ones, if the workload is so minimal it can be done on a thin client?

          Why could the problem in the first question not be solved by people simply having computers, albeit cheap ones, if the workload is so minimal it can be done on a thin client?

          I'm sure it could be solved that way. But el cheapo fat desktops are a relatively expensive way to do things. Thin clients are generally less expensive to deploy than cheap PCs, there is less to go wrong on the client end, it is easier and cheaper to replace or troubleshoot the client, it requires less effort to manage hundreds (or more) of them, and the need to upgrade/replace desktops every few years goes away. Generally, they require a lot less electricity to run than a conventional desktop.

            • Okay, upgrading a mainframe vs upgrading PCs. Really moot in some ways. As software matures, performance requirements in some scenarios tend to increase, so your server costs go up exponentially. At some point, you'd end up clustering servers, which is undoubtedly not cheaper than just getting el cheapo desktops or laptops. Not to mention extremely unfriendly if you have to bring in new software or have large documents which need to be saved temporarily (like 4-8 GB to be sent out the same day). To thin

            • by NotBornYesterday ( 1093817 ) * on Monday October 13, 2008 @10:15AM (#25354927) Journal
              Thin clients are not always going to be the ideal desktop. However, different thin client solutions offer different levels of efficiency, so the math you reference above is not typical of many scenarios.

              As an example, SunRays [zdnet.com] generally scale much better than a cheap PC environment, with much better return on investment.

              You are going to be spending money on servers either way. According to your own figures, you have 7.5 users per server. SunRay solutions typically yield 20+ users per server CPU core. I'm not doubting your figures, but what do you guys do that requires so much back-end power? Are they single- or multi-CPU servers, and are they fully utilized or under-utilized? Obviously, I'm not in your position, but before I looked at desktop solutions, I'd look at server consolidation. VMware or similar might save you a bundle and make things easier to admin.

              As for new software, SunRay environments are pretty easy to patch and deploy new software in. As a matter of fact, that's one of the strengths - deploy the patch or app to a single server or a few servers, and you are done.

              Electricity is hardly a selling point if you're losing productivity and still spending the money on servers, to boot.

              Obviously, achieving functionality is more important than being efficient. However, the point of thin clients is that they generally keep office productivity the same or better, IT efficiency is tremendous, and the total of ((thin client cost * users) + servers) is generally less than ((PC cost * users) + servers); a rough sketch follows below. At that point, saving several hundred kWh might be pretty attractive.
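              A minimal sketch of that comparison in Python; every figure below is a placeholder assumption rather than data from this thread, so plug in your own numbers:

                  # Hypothetical hardware-only cost comparison: thin clients vs cheap PCs
                  import math

                  users = 200
                  server_cost = 5000            # $ per back-end server

                  thin_client_unit = 300        # $ per thin client
                  thin_users_per_server = 20    # e.g. Sun Ray-style consolidation

                  fat_client_unit = 500         # $ per cheap desktop PC
                  fat_users_per_server = 50     # only file/print/mail served centrally

                  thin_total = users * thin_client_unit + math.ceil(users / thin_users_per_server) * server_cost
                  fat_total = users * fat_client_unit + math.ceil(users / fat_users_per_server) * server_cost

                  print(f"thin-client total: ${thin_total:,}")   # -> $110,000
                  print(f"fat-client total:  ${fat_total:,}")    # -> $120,000

              Power, cooling, and admin time are left out of this sketch, and that is usually where the thin-client side gains the most.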

      • by Bazman ( 4849 ) on Monday October 13, 2008 @02:53AM (#25352027) Journal

        Seven hundred?!?! Microsoft had a web page where you could put in your client requirements and they would tell you how many Win 2003 TS machines you would need to support these clients. I don't think we ever got it down to fewer than 10 users per server - how did you manage 700?

        Currently we have four servers for about forty seats in our labs. They don't get much usage, and people don't seem to notice they're sharing a machine with the other 10 people on that row of the lab.

        I'd give thin clients to everyone, but then someone in an office of their own will tell us they really need Skype, and they really need a web camera... I suppose these things could be connected to a thin client and forwarded over USB, but it's not something we've tried...

          The other show-stopper is where users need admin rights for particular software. It does still seem to happen, mostly with big, important pieces of software like our finance system or student records management. It may just be that the software needs to write to the C: drive, so we could bodge it with access rights, but we don't want to screw up the installation, so the user gets admin rights. Now, could we do that on a shared Windows 2003 TS box? I don't think so. With VM tech we could give them a VM of their own to play with, though...

          VM tech has also helped us deploy Linux and Windows to our labs. Previously we had say four servers running Linux and four running Windows, and if the lab session needed Windows then there were four Linux servers sitting idle, and the users crammed onto the four Windows servers. With VMs, we stick a Windows and a Linux VM on each server, then the users are more spread onto the eight servers. Win.

        • We use Linux servers, Linux apps, and Linux thin clients. Our ratio is two servers for 160 clients/users. But we also have support from the CEO, so when we say "no" to users wanting animation, sound, etc, it means "no". Users have access to everything they need to perform their jobs and costs are very low (compared to thin OR fat MS environments).

      • Out of curiosity, how much do you pay for thin clients? Last I checked they were running about $200 - $400 from HP and Wyse(?). The funny thing is you can buy a new PC for $400 (or close) now, especially if you get discounts from the big resellers. You can also buy P4s with XP that can handle most of your users' needs for under $200. And if you don't want XP, just format and install Linux or Citrix or whatever. This is basically why I didn't recommend the move to thin clients, at least not using their hardw
    • Eh (Score:5, Insightful)

      by Sycraft-fu ( 314770 ) on Monday October 13, 2008 @02:59AM (#25352055)

      There are plenty of downsides too. While it might be easier to maintain, it is also easier to fuck up. Someone does something that breaks a piece of software, and now the whole department/company/whatever doesn't have it, rather than just that person. A network outage is now a complete work-stopping event rather than an inconvenience. Special software installs for special tasks are hard, since that software has to be tested to make sure it doesn't hose the server.

      I could keep going, if I wished. Now, that isn't to say the thin client model is bad. In fact, we are hoping to do it for our instructional labs at some point. What I'd really like (and there are VM solutions that do this) is that not only would we have thin clients, but a student could use a laptop as a thin client too and load our image from home or wherever.

      However, the idea that they are just cheaper/better is a false one. They can be cheaper in some cases; in others you can easily spend more. Likewise, they can simplify some things and make others more complex.

      There isn't a "right" answer between large central infrastructure and small distributed infrastructure. It really depends on the situation.

      All I will say is if you are looking at doing this at your work as you suggest be very, very careful. Make sure you've really done your homework on it, and make sure you've done extensive testing. I don't think it's a bad idea, but be sure you know what you are getting in to. Just remember that while people get whiny when, say, an e-mail server goes down, if the terminal server goes down and NOTHING works, well then people go from whiny to furious in a second.

      It's the same kind of deal with virtualization. It is wonderful being able to stack a bunch of logical servers on to one physical server. However if that one physical server dies you can be way more fucked. You have to spend a good deal more time and money in making sure there is proper redundancy and backups and such. So while packing 10 servers on 1 using VMWare Server (free) might be nice and cheap, you also might be creating a ticking time bomb. You then might discover that putting those 10 servers on a small cluster with a fibre channel disk array and VMWare Virtual Infrastructure (not free) solves the reliability problem nicely, but isn't quite as cheap as you thought.

      Just something to be careful with. At work we have both sorts of things. We've got individual desktops, and we've got thin clients (though we actually got rid of most of those). We've got individual servers, we've got virtual servers, and so on. All methods have advantages and disadvantages. I am not a zealot either way, just warning that a change from a decentralized to a heavily centralized infrastructure isn't something to be done lightly. You solve various problems, but introduce a host of new ones.

      In particular hardware reliability is something you want to keep in mind. You for sure want an "N+1" situation with your terminal servers, perhaps even more than that. You can't count on the hardware being reliable. Hopefully it is, but I've seen even the real expensive, redundant shit (like a Sun v880) fail with no warning. When it's the be all, end all and all work stops when it is down, that just can't happen.

      • Excellent points. One thing I've seen is that if a thin-client deployment doesn't go well, people will still blame the client hardware (Sun Ray/Citrix/Wyse) rather than the system. So if your email server, LDAP server, SMB server, web server, proxy server, or network switches aren't well designed or deployed, don't go thin, because it gives the technology an undeserved black eye. If you deploy fat clients and all of this underlying infrastructure is sh**, you can always push for an upgrade to desktop PCs in
    • by Moraelin ( 679338 ) on Monday October 13, 2008 @03:37AM (#25352253) Journal

      1. Actually, regardless of whether they are making a comeback or not, or what their advantages and disadvantages may be, this is probably just a PR story. Just like the "The Suit Is Back!" that got traced back to a PR agency a couple of years ago.

      PR loves to masquerade as news because it bypasses your BS filter. An ad for Men's Wearhouse suits gets skipped over; a piece of news that you won't get hired unless you wear a suit tries to replace your premises with theirs and lets you take the leap to the "I must buy a suit" conclusion. Or better yet, to the conclusion "I must only hire people in suits 'cause everyone else is doing it." There are a lot of sheeple out there who only need a "The Herd Is That Way -->" sign to willingly enter someone's pen and be sheared like "everyone else".

      For anyone who's not sheeple, this is a non-story. Whether _you_ need a server instead of PCs or not, depends on what _your_ needs are and what _your_ employees are doing. Use your own head.

      The only ones who need an "everyone else is doing X" story are those who have to follow a herd to feel secure.

      Hence, the love PR has for this kind of story.

      2. Over-simplifications like "all they need is internet, database access, and word processing" were false when arguing why grandma should only need an old 486, and tend to be just as false for a company. So you'll have to do some analysis of whether, for a particular company, that is indeed true, or whether it's just glossing over what's really going on. (Or even wishful thinking by some IT guy who feels his job would sound more important if he were overseeing a server.)

      E.g., a lot of companies have salesmen who go with a laptop to various customers to give a presentation and try to win a contract. Are you ready for the case where the guy you're trying to sell insurance to doesn't have internet access for connecting to your server via VPN? Are you sure that those server-side apps' files can be converted flawlessly to MS Office or whatever those sales guys have on their laptops?

      It's just one example where going, "bah, they only use database apps and word processing" is glossing over a more complex problem.

      3. The argument for saving costs is a good one, and far be it from me to advise wasting money. But you have to be sure that you're actually _saving_ money across the organisation, not just saving $1000 in the narrow slice you see at the cost of causing $1,000,000 to be lost in workarounds and lost productivity somewhere else. Entirely too much "cost cutting" lately is the latter kind of bullshit theatre.

      E.g., if someone costs you $100,000 per year -- and I don't mean just wages, but also electricity costs, building rent, etc. -- saving $1000 is nullified if it drops their productivity by as little as 1%. Saving a few hours per year of an IT guy's work can be a very bad trade-off if it costs the user as little as 5 minutes total per 8-hour work day to put up with the quirks and delays of the centralized system. (480 minutes a day times 1% is 4.8 minutes.) It can add up to that very easily: it only takes wasting 1 second per form in some web app, instead of letting that user massage the data locally in Excel or Access(*), to add up to more than that in a day. An approximation that seems close enough can very easily be off by enough to turn the whole thing into a loss.

      (*) ... or whatever F/OSS equivalents you prefer. This is not MS advocacy, so fill in the blanks with whatever you prefer.
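      For what it's worth, here is point 3's arithmetic spelled out as a tiny Python sketch; the $100,000 and $1000 figures are the ones used above, and the rest follows from them:

          annual_cost_per_employee = 100_000   # wages + electricity + rent + ... ($/year)
          workday_minutes = 8 * 60             # 480 minutes

          one_percent_of_day = workday_minutes * 0.01
          print(one_percent_of_day)            # 4.8 minutes lost per day

          # Losing 1% of an employee's time costs as much per year as the $1000 'saving':
          productivity_loss = annual_cost_per_employee * 0.01
          print(productivity_loss)             # 1000.0 dollars per year, per employee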

      And as you move higher up the totem pole, things get even funkier. If a salesman is doing contracts worth millions of dollars with those presentations, you'd better save a _lot_ with that centralized solution, because it only takes one lost contract (e.g., because he couldn't connect) to put a big minus in the equation. E.g., if you're going to pay a CEO tens of millions per year, and actually believe that his work is worth every cent (heh, I know, but let's keep pretending), then... again, you'd better be damned sure that you don't drop _his_

      • by Kjella ( 173770 )

        The only ones who need an "everyone else is doing X" story are those who have to follow a herd to feel secure.

        With all due respect, there are many companies that do things in their own OMGWTF way and really badly need to be whacked over the head with a clue-by-four that says, "Everybody is doing it this way; it's simple, cheap, reliable, flexible and in every way better than what you've hacked together. Please put that abomination out of its misery and let us show you a standard, sane and modern solution." Of course, many are thinly disguised marketing attempts too, but there's definitely a need for real information

      • saving $1000 is nullified if it drops their productivity by as little as 1%...

        I can guarantee that no one in Corporate America(TM) actually cares about the efficiency of the users - if solution A is cheaper than B, they'll choose A every time. After all, if the users are inefficient, that's a management problem. Consider, also, that if IT buys junk and people have to work an extra hour a day to do the same work they did before, that:

        • If they're salaried, the company doesn't bear the cost of the extr
      • Actually, regardless of whether they are making a comeback or not, or what their advantages and disadvantages may be, this is probably just a PR story. Just like the "The Suit Is Back!" that got traced back to a PR agency a couple of years ago.

        Paul Graham wrote a nice article [paulgraham.com] about this. Well worth a read for anyone who hasn't seen it yet.

    • by Lumpy ( 12016 )

      Actually, it's not. We tried that: instead of following the IT department's findings and recommendation, the idiot CTO went and bought NCD terminals for everyone and spent a huge amount of money on a Citrix farm.

      We ended up spending 6x on the whole setup what buying typical Dell PCs would have cost. The entire time it was a mess and never worked right, because the people in power bought what some sales rep told them was best and ignored the experts on staff who had researched the whole damn thing.

      Thin

    • Re: (Score:3, Insightful)

      by DerWulf ( 782458 )
      Yes, welcome to 2000, 1990, 1980 and 1970! Look, here is the deal: centralization has massive problems itself. First of all, you can't glue together 10,000 CPUs, 10,000 HDDs and 10,000 RAM banks and get the same performance as 10,000 PCs. Secondly, there is no unified preference/customization management for applications. We use Eclipse on a Windows terminal server, and setting it up so that every user has their own workspace and correct dependencies was such a nightmare that IT coded their own Eclipse launcher. N
  • First Post! (Score:5, Funny)

    by Harmonious Botch ( 921977 ) * on Monday October 13, 2008 @02:09AM (#25351795) Homepage Journal

    ...or, well, it would have been first if I wasn't on a thin client waiting 15 ^%*^&# seconds for a keystroke echo.

    • You should (would) have seen the posts from the "I'm using Vista you insensitive clod" bunch.

      They're still waiting for the cancel/allow box to show up ;).
  • by 4D6963 ( 933028 ) on Monday October 13, 2008 @02:10AM (#25351803)
    Yay! People rediscover the advantages of thin clients! How long until they rediscover the downsides...
    • Yay! People rediscover the advantages of thin clients! How long until they rediscover the downsides...

      That would be when the vendors have made their cut and hand over to the consultants for their turn at the trough.

    • This isn't really an industry cycle; it looks more like a plug for a bunch of current products, à la: http://www.paulgraham.com/submarine.html [paulgraham.com]

    • Re: (Score:3, Insightful)

      It's not about rediscovering the advantages/disadvantages of thin clients. AFAIK thin clients were never fully abandoned. It's simply about finding the right niche for thin clients.

      For instance, if you're setting up some computers at a public library that only need to search through the library catalog and nothing else, then thin clients are the clear way to go. If you're running a school network where thousands of students will be sharing a few hundred computers, but they'll need word processing, desktop p

      • by 4D6963 ( 933028 ) on Monday October 13, 2008 @03:41AM (#25352277)

        the computing demands of the casual user hasn't increased that much since the days of Windows 95

        Right, just try watching YouTube on Firefox with a Pentium 133.

        by giving everyone else thin clients, you'll give them less chance to screw up their system, thus giving them more uptime and more reliability, which users will appreciate.

        Uh huh. You can solve the "chance to screw up their system" problem by keeping the thick client but virtualising the OS. As for more uptime and reliability, it will only be as reliable and up as your network/servers, which in most contexts is probably not any better; plus you have to deal with general downtimes, and this way people end up with all their eggs in the same basket, which, although avoidable, could bring huge IT catastrophes. Relying entirely on a centralised network is absolute madness: a single network administrator's mistake, a lack of redundancy combined with a hardware failure, a bad decision or incompetence could paralyse an entire infrastructure. Centralising everything only looks nice on paper.

        • "Hasn't increased that much" != "hasn't increased at all". A modern 700 MHz CPU is perfectly capable of surfing the web and handling most office computing work.

          And what does using thin clients have to do with a lack of redundancy? No one said you had to use a single file server for the entire network. Using fat clients will not make up for a lack of common sense. And if you can't manage to keep a dozen servers up, you're certainly not going to be able to handle maintaining a couple hundred fat clients.

          so, yea, if

        • Comment removed based on user account deletion
        • Right, just try watching YouTube on Firefox with a Pentium 133.

          What is the business case for not blocking YouTube and Google Video at the office terminal server's web proxy?

    • Yay! People rediscover the advantages of thin clients! How long until they rediscover the downsides...

      I first got the computer bug seriously when I was in college, and took some courses requiring the use of dumb terminals in our computer center... they were running off a DEC minicomputer running Unix, and I was hooked. I learned to do a lot using those old green and orange screen terminals, and to this day, I wonder if most businesses wouldn't be incredibly more productive if they went back to simple no-GUI dumb terminals... with text email and Lynx browsers.

      Think about it. How many employees now blow off h

      • by 4D6963 ( 933028 )
        Where are you going to find people who will a) accept being deprived of the computer 'perks' they take for granted and b) be qualified to work without a GUI? Also, what sort of business could run with just that nowadays?
      • Think about it. How many employees now blow off hours at a time during the workday by playing solitaire, going to MySpace, releasing the latest trojan into their LAN via email attachments...

        And how much of your workforce are you going to be left with once everybody quits because of your GUI-less, diversion-free system?

        • by rich_r ( 655226 )
          There's plenty of data entry work that requires just that, and I've done enough of it! If I've got no internet access anyway, I'd rather just have a well designed text-based system that is fast, lag free and supplied with a decent keyboard.

          I fail to understand why people moved away from systems that just worked and replaced them with boxes that did so much but are used for exactly the same tasks and do it just that little bit worse.

          • I fail to understand why people moved away from systems that just worked and replaced them with boxes that did so much but are used for exactly the same tasks and do it just that little bit worse.

            Because people are not machines, no matter how repetitive their work is. A fluffy kitten as their wallpaper, or a nice GUI, will improve morale, which goes a long way toward increasing productivity. There is always a human element involved that doesn't quite "make sense" when looked at from a purely numbers perspective.

      • Re: (Score:3, Insightful)

        by markdavis ( 642305 )

        >Even with a GUI terminal, if it was stripped down and wasn't Windows based
        >(and had drastically limited Internet access), I think a lot more would get done around offices.

        Bingo! That is exactly what we have: Linux server, Linux apps, Linux thin clients (160). Everything is locked down tight. We have everything users need in order to be productive and nothing else (accounting apps, OpenOffice, Firefox, Sylpheed, IceWM, some utils). Internet access is only through a whitelist of approved sites. B

    • Hrm. Aren't desktops virtually becoming thin clients with the advent of "cloud computing"? As far as I can tell (working in a hugenormeous corporate environment, 20,000+ users), any application worth its salt is being run on big iron servers, where the clients only run browsers written in Java (or some other platform-neutral client) and perform trivial functions not worth running on the main servers. I can't really think of any software outside of development and video games that is processor inte

  • by The Master Control P ( 655590 ) <ejkeever@nerdshacFREEBSDk.com minus bsd> on Monday October 13, 2008 @02:15AM (#25351821)
    Is now just a bunch of generic PCs in smaller form factors. So in essence you're sticking a network layer between the rest of the computer and its video card. So instead of network outages (which are inevitable) crippling just network operations, they now cripple everything, including your ability to keep typing your office documents or looking at the email you've already got.

    It's annoying as hell, but if my network craps itself I still have a working computer in front of me and I can still do a subset of what I was doing before. Not so with thin clients.

    <tinfoil mode>
    Of course they want to take the actual computer away from you, they want to have control over you. If they could, your "computer" would be a mindless terminal to a Big Brother Approved mainframe that spied on everything you did.
    </tinfoil mode>
    • by inKubus ( 199753 ) on Monday October 13, 2008 @03:48AM (#25352295) Homepage Journal

      The Linux Terminal Server Project [ltsp.org] is actually pretty good. And useful for a variety of things beyond just saving dough on the desktop end. Remote access is one that comes to mind. Sure, you could have a bunch of X terms, but this will work with ANY box with a PXE (hell, even Netboot) NIC. You don't need virtualization or any of that garbage. UNIX was designed as a "multi-luser" operating system ;), back when mainframes were last in vogue. The X Window System is really quite good over a slow network and has been for DECADES.

      Now, I want to stress that I am a proponent of terminals in only certain areas. A public library computer bank. A factory environment, where you want your server safe and securely away from sparks and heat. A customer service environment where the employee is only doing one or two things. My business ops people would have real computers for the reasons you mentioned. I want them to be accounting and developing even if the server is down.

      • by inKubus ( 199753 )

        Correction, you can also boot using Floppy, CD, or USB boot image.

      • LTSP is pretty cool. It's an install option on the Alternate Install CD for Ubuntu. I use it at home because I like to have the option to netboot to linux if a guest needs a desktop. I don't like guests messing up my real desktops.

        I combined it with two other projects: DRBL for on demand clustering when I want to do a little light rendering, and Clonezilla to enable any PC that connects to the network to make an image backup to a network share.

        Works for me, it doesn't cost extra over the cost of the se

      • And useful for a variety of things beyond just saving dough on the desktop end.

        Save too much dough, though, and you end up with a result that feels half-baked ;)

        -- Jonas K

    • by Xouba ( 456926 )

      <tinfoil mode> Of course they want to take the actual computer away from you, they want to have control over you. If they could, your "computer" would be a mindless terminal to a Big Brother Approved mainframe that spied on everything you did. </tinfoil mode>

      You're not using Gmail or any of Google's other services, are you?

    • In a modern business system environment, if the network goes down, productivity pretty much stops. Period. It doesn't matter if the clients are fat or thin.

  • Dumb clients, fat clients, thin servers, retarded paywalls.

  • by Fluffeh ( 1273756 ) on Monday October 13, 2008 @02:17AM (#25351833)
    When you have customers with thick clients, sell 'em thin ones 'cause they are "better-er".

    When you have flogged a thin client off to all of your customers, the new thing is a "better-er-er" thick client.

    Whole thing sounds like very simple 101-style marketing. Why try to sell someone something they already have? Convince them what you have is better. Total no-brainer IMO.
  • We're gonna need them, what with the economy cratering!
  • Middle ground? (Score:4, Insightful)

    by Max Romantschuk ( 132276 ) <max@romantschuk.fi> on Monday October 13, 2008 @02:26AM (#25351879) Homepage

    How about a netbook-style device which could offer limited functionality on its own for email, web, and basic office apps (say, a boot image updated from the central server when connected), and be used as a thin client at the office, plugged into a docking station with proper display(s) and keyboard+mouse? Best of both worlds?

  • by Anonymous Coward

    We have recently adopted a phased approach of deploying new thin clients as our estate of traditional desktops hit retirement. After having seen several false dawns and uncomfortably proprietary solutions in the last 15 years, it was only now that we have been happy enough with the whole solution (thin client HW, network connectivity, back-end virtualization SW) to take the plunge.

    There are now a range of HW clients (we use ChipPC [chippc.com]).
    There are a couple of viable virtualization systems (we use Citrix Xen [xensource.com], with

  • Finally I can sell all the Wyse 120 terminals I have in the garage! If you want me I'll be high-rolling at the casino for a couple of weeks...

  • by Plantain ( 1207762 ) on Monday October 13, 2008 @02:30AM (#25351907)

    My clients are all obese, and show no intentions of slimming down; what am I doing wrong?

  • by MosesJones ( 55544 ) on Monday October 13, 2008 @02:35AM (#25351937) Homepage

    Oh yes, it's back, the battle that everyone has been waiting for: it's the Rumble on the Desktop, the fight of the century. The challenger, the undisputed "next year" champion, fighting out of California by way of Finland, it is the Penguin himself, Tux "Next Year" Linux.

    And now the champion: dominating in the 70s, losing form in the 80s, a recluse through the 90s and the start of the century, but now back to claim his crown. With the black trunks and green trim, it's Thin "Latency Is a Bitch" Client.

    Let's have a good clean fight to finally decide who will be declared the Desktop champion of 2009.

    This fight is sanctioned by the ODC (Optimistic Desktop Council) and will be fought under rules of low data, huge assumptions and a complete lack of understanding on the total size of the market.

    • Re: (Score:3, Interesting)

      by 4D6963 ( 933028 )

      This isn't boxing, more like wrestling. So don't be surprised if you see VM "you trashed your OS, here, have this backup virtual image" ware jump into the ring, headbutt in all directions and virtualise the shit out of your thick clients.

      Am I the only one who believes that the future is not in thin clients but in desktop hypervisors that make all your OSes run transparently virtualised? I'm talking 10-15 years out.

      • Google would like to disagree with you.. :)

        Maybe not in 10-15 years, but I see the future as being about more mobile devices. If we can fix battery life (possibly) and displays (probable, with in-eye HUD-type affairs), then processing and capacity will increase. We'll probably see a lot of "download what you need as you need it" combined with local processing. Your desktop and big monitors will probably go the way of the abacus.

        • by 4D6963 ( 933028 )

          I'm not sure what you mean by the comment about Google, but Google can go anywhere but up. It's one of those companies that start with huge momentum because they have a tremendous edge over the competition, but at some point (about now) they're just another big company with huge power and market share but little momentum, because, as I like to think of it, they have reached their orbit, i.e. they did everything they could hope to do in the domain they started in; now the challenge for them is to keep

          • I did say that I wasn't sure about the 10 year timeframe... but on the other hand, you never know.

            Mobile phones have more power today than desktops of 10 years ago. Modern desktops are reaching a plateau of processing power - not necessarily because we can't squeeze more from them, but because they have enough juice for most people. It's not like years ago, when you had to buy a new PC every year or two... today a 3-year-old PC will happily run almost everything as fast as you want. (Sure, you can stick Vista

            • by 4D6963 ( 933028 )
              I just think you're way overestimating the capabilities of such devices, regardless of their power, battery life, applications or inputs. Of course people will do lots with them, but I doubt they'd abandon their regular PC for these. I could turn out to be wrong, but that would defy common sense. Keep in mind that the biggest issues are the controls and the screen, not any of the other technical aspects you mentioned. These are the real inherent bottlenecks.
              • True, but I think the screen will be fixed relatively shortly, even if you have to wear Austin Powers-style specs. The controls... I reckon voice activation and more touchscreen-yness will suffice (considering the teeny keyboards people actually use today!), so I doubt it'll be too long. For sure, just look at how technology has progressed over the last 10 years to think about what might happen in the next ten. I mean, I remember playing cyberpunk 15 years ago where the idea of a personal cellphone was l

        • Well, we can't fix battery life. It'll always be too short (redefining "too short" every time there is a breakthrough). Also, we can't fix bandwidth and heat dissipation. We may be able to fix the problem of small disks for a time, but even if so, it will come back to haunt us later.

          All those are physical constraints, so don't expect portable devices to replace fixed ones.

    • What makes you think it's necessarily either/or?

      I've got a SunRay on my desk which I've used as a primary desktop for about a year. It's not perfect, but it runs X, handles audio, and I can plug a flash drive in the USB port and it just works the way it's supposed to. I've got it set to use KDE, and it actually looks and feels nearly identical to the Linux box whose screen/keyboard/mouse sit next to it. The SunRay is dead silent (it has no moving parts, not even an off switch). When I hit the power s

  • by Martian_Kyo ( 1161137 ) on Monday October 13, 2008 @03:17AM (#25352139)

    Too many dumb users (OK, I am being too harsh here: too many uneducated users) these days. Thin clients = less freedom, which in the case of most users means they'll make fewer mess-ups.

    This means less boring maintenance work for IT people, in large companies especially.

  • Unless you've got a lot of bandwidth to spare, Flash will kill performance.

    • That is why on our large, Linux, thin-client network, we do not install nor allow Flash. Yes, you are correct... animation is a HUGE enemy of thin clients. But we also have an enforced site whitelist.

      On those few sites that are so broken as to require Flash to do anything productive, employees are welcome to come to the training room and use a specially configured station with a local Firefox + Flash + Java.

  • There was a short period bridging the VT100 terminals and the Sun Rays, from 1997 to 2000, when the university library installed personal computers for accessing its network.

    No, seriously. This is non-news.

    The transition to personal computers stopped long ago. I cannot remember having seen an institution switch to a PC-based infrastructure in the last five years, but since approximately 2001 I have seen a rise of thin clients in large organizations. The organizations for which this pays off will get smaller and sm

  • New York Times Says Thin Clients Are Making a Comeback

    But in Texas they're as fat as ever.

  • by zullnero ( 833754 ) on Monday October 13, 2008 @03:52AM (#25352317) Homepage
    And later in the year, when the corporation I worked for lost $10 million because one of their customers went bankrupt, I, by chance, got to sit in on a bigwigs' meeting.

    After announcing the loss and accompanying layoffs, he actually followed it by saying "And I don't think suggesting thin clients will help us out of this one."

    Man, it was so hard to keep from laughing...next time I hear that, and it sounds like I will hear that again, I think I'll just risk my job and have a big belly laugh.
  • by azgard ( 461476 ) on Monday October 13, 2008 @04:26AM (#25352443)

    From physics, it's obvious that centralized computing is more energy-efficient than distributed computing. The longer the distance you have to move the energy (that encodes the information) to compute the results, the more energy you need. Also, centralization allows for better resource sharing.

    The only issue is who pays for the costs. Mass production of computers drove their costs down to the point that distributed systems were cheaper than centralized ones. However, as the demand for computing power grows, the energy spent on computing itself enters the equation, and the times will change again.

  • My company uses a mix of fat and thin clients. IT gives a few choices between desktops, which cost the department money, and then there are "free" (to the department) thin clients. In actuality, the thin clients cost more at purchase time. Don't ask me why Wyse charges $800 for a mini-ITX system with 1 GB of RAM and 1 GB of flash, running a VIA C7 1.2 GHz processor, when the device runs just a Citrix client anyway. I personally priced a comparable mini-ITX system, with a tiny case, at ~60% of what Wyse char

  • It's been about 10 years since the last time they were hyped. Besides, all the yammering about "computing in the cloud" has the thin-client folks excited.

  • ... the truth is time- and latency-sensitive apps will always need "thick" clients, while latency-insensitive apps can use "thin" clients. These back-and-forth illusory arguments ignore the fundamental need for redundancy of computational power in the hands of the individuals who depend on it.

    The truth is both are needed; they complement one another and the needs of those who use them. Consider 3D rendering software like 3D Studio Max: you can do network renders and local renders, and both are necessary depending

  • I read the headline and thought: of course, now all you need is web front ends and cell phones/PDAs and presto, you've got a "thin client." ;) If it's a web front end on a PC, you can still make sure your folks have those essentials like Office and whatever other Windows apps your place finds it can't live without. Heck, cell phones/PDAs even have office functionality today. That salesman can use his laptop or, if worse comes to worst, his PDA/cell phone to pull that much-needed PowerPoint presentation

  • This always seems to go in 5-to-7-year cycles, but this time it might actually stick given the always-on, always plentiful bandwidth we're getting now.

    Thin clients are amazing in situations where you have an average office-worker PC doing a single task (call center, POS, reservations agents, etc.). You can connect them to the terminal server of your choice. If the users really need a true computer, you can give them a virtual desktop or a blade PC (by virtual desktop, cutting through the hype, I mean access to

  • Too bad they suck (Score:3, Insightful)

    by Thaelon ( 250687 ) on Monday October 13, 2008 @10:08AM (#25354871)

    I worked exclusively through thin clients for a year at my last job and absolutely hated it.

    It was slow and ungainly, and every now and then - anywhere from a few hours to a couple of months apart - someone else's X session windows would pop up on my screen. Wonderful in an environment where we worked with secret (as in classified) information. We knew the problem, and the IT guys could usually fix it in a few minutes, but the fix always seemed to be temporary somehow.

    Not to mention you're costing productivity for people like me who tend to work very rapidly via esoteric hotkeys, rapid-fire keystrokes, and the keyboard buffer to issue commands to dialogs, context menus, and windows that haven't yet appeared. One of my earliest employers once described seeing me work at a computer as "really making that thing sing". So sticking me on a slow machine or dumb terminal is costing you my productivity and happiness. And it's not like a decent machine ($1500-2000) is really that big of a deal spread out over the several years it will last, especially if it's one more straw kept off the camel's back that keeps me from looking for another job and costing you domain knowledge and experience with your unique problems when I leave.

    IMO, thin clients should be reserved for "guest" users who will only be using your network temporarily, where no customization is needed or speed is not important - like an interactive presentation, a library, or some temporary event.

  • Thin Clients Are Making a Comeback...

      Thin clients? Just how bad is the economy down there in the USA? Can't you afford food?

"If it ain't broke, don't fix it." - Bert Lantz

Working...