
New York Times Says Thin Clients Are Making a Comeback

One of the seemingly eternal questions in managing personal computers within organizations is whether to centralize computing power (making it easy to upgrade or secure The One True Computer, and its data), or to push the power out toward the edges, where an individual user isn't crippled when a server on the other side of the network goes down or the network itself is unreliable. Despite the ever-increasing power of personal computers, the New York Times reports that the concept of making individual users' screens smart portals to bigger iron elsewhere on the network is making a comeback.


  • by The Master Control P ( 655590 ) <ejkeeverNO@SPAMnerdshack.com> on Monday October 13, 2008 @02:15AM (#25351821)
    Is now just a bunch of generic PCs in smaller form factors. So in essence you're sticking a network layer between the rest of the computer and its video card. So instead of network outages (which are inevitable) crippling just network operations, they now cripple everything, including your ability to keep typing your office documents or look at the email you've already got.

    It's annoying as hell, but if my network craps itself I still have a working computer in front of me and I can still do a subset of what I was doing before. Not so with thin clients.

    <tinfoil mode>
    Of course they want to take the actual computer away from you, they want to have control over you. If they could, your "computer" would be a mindless terminal to a Big Brother Approved mainframe that spied on everything you did.
    </tinfoil mode>
  • by Bill, Shooter of Bul ( 629286 ) on Monday October 13, 2008 @02:25AM (#25351875) Journal
    Don't forget the cost of maintaining the network. In a school district setting, that would probably mean a WAN connecting all of the schools and district offices together. If the network goes down... everyone has to stop working. I'm sure you are very talented and it might work for your particular district. In my area, I know the level of network engineers they have, and I'm convinced the whole thing would blow up.
  • by Anonymous Coward on Monday October 13, 2008 @02:27AM (#25351887)

    We have recently adopted a phased approach of deploying new thin clients as our estate of traditional desktops hit retirement. After having seen several false dawns and uncomfortably proprietary solutions in the last 15 years, it was only now that we have been happy enough with the whole solution (thin client HW, network connectivity, back-end virtualization SW) to take the plunge.

    There are now a range of HW clients (we use ChipPC [chippc.com]).
    There are a couple of viable virtualization systems (we use Citrix Xen [xensource.com], without the presentation server "tax").
    We've chosen a dedicated virtualization hardware appliance on the back-end from 360is [360is.com].

  • by 4D6963 ( 933028 ) on Monday October 13, 2008 @02:51AM (#25352021)

    This isn't boxing, more like wrestling. So don't be surprised if you see VM "you trashed your OS, here, have this backup virtual image" ware jump into the ring, headbutt in all directions and virtualise the shit out of your thick clients.

    Am I the only one who believes that the future is not in thin clients but in desktop hypervisors that run all your OSes transparently virtualised? I'm talking about 10-15 years.

  • by Bazman ( 4849 ) on Monday October 13, 2008 @02:53AM (#25352027) Journal

    Seven hundred?!?! Microsoft had a web page where you could put in your client requirements and they would tell you how many Win 2003 TS machines you would need to support these clients. I don't think we ever got it down to fewer than 10 users per server - how did you manage 700?

    Currently we have four servers for about forty seats in our labs. They don't get much usage, and people don't seem to notice they're sharing a machine with the other 10 people on that row of the lab.

    I'd give thin clients to everyone, but then someone in an office of their own will tell us they really need Skype, and they really need a web camera... I suppose these things could be connected to a thin client and forwarded over USB, but it's not something we've tried...

      The other show-stopper is where users need admin rights for particular software. It does still seem to happen, mostly with big, important pieces of software like our finance system or student records management. It may just be that the software needs to write to the C: drive, so we could bodge it with access rights, but we don't want to screw up the installation, so the user gets admin rights. Now, could we do that on a shared Windows 2003 TS box? I don't think so. With VM tech we could give them a VM of their own to play with, though...
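      A quick way to test the "it just needs somewhere to write" theory is a small probe that checks whether an ordinary user can create files in the application's folder. A minimal sketch (Python purely for illustration; the real check would run on the Windows client against the app's actual install path):

```python
import os
import tempfile

def can_write(directory):
    """Return True if the current user can create files in `directory`
    without elevated rights."""
    try:
        fd, path = tempfile.mkstemp(dir=directory)
        os.close(fd)
        os.remove(path)
        return True
    except OSError:
        return False

# If the app's folder fails this probe, loosening the ACL on just that
# folder may be enough -- no need to hand out full admin rights.
```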

      VM tech has also helped us deploy Linux and Windows to our labs. Previously we had say four servers running Linux and four running Windows, and if the lab session needed Windows then there were four Linux servers sitting idle, and the users crammed onto the four Windows servers. With VMs, we stick a Windows and a Linux VM on each server, then the users are more spread onto the eight servers. Win.

  • by 4D6963 ( 933028 ) on Monday October 13, 2008 @03:41AM (#25352277)

    the computing demands of the casual user hasn't increased that much since the days of Windows 95

    Right, just try watching YouTube on Firefox with a Pentium 133.

    by giving everyone else thin clients, you'll give them less chance to screw up their system, thus giving them more uptime and more reliability, which users will appreciate.

    Uh huh. You can solve the "chance to screw up their system" problem by keeping the thick client but virtualising the OS. As for more uptime and reliability: the system will only be as reliable as your network and servers, which in most contexts is probably no better, plus you have to deal with general downtime. This way people end up with all their eggs in the same basket, which, though avoidable, could bring huge IT catastrophes. Relying entirely on a centralised network is absolute madness: a single network administrator's mistake, a lack of redundancy combined with a hardware failure, a bad decision or plain incompetence could paralyse an entire infrastructure. Centralising everything only looks nice on paper.

  • by inKubus ( 199753 ) on Monday October 13, 2008 @03:48AM (#25352295) Homepage Journal

    The Linux Terminal Server Project [ltsp.org] is actually pretty good, and useful for a variety of things beyond just saving dough on the desktop end; remote access is one that comes to mind. Sure, you could have a bunch of X terminals, but this will work with ANY box with a PXE-capable (hell, even Netboot) NIC. You don't need virtualization or any of that garbage. UNIX was designed as a "multi-luser" operating system ;), back when mainframes were last in vogue. The X Window System is really quite good over a slow network and has been for DECADES.
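    The PXE boot service side of an LTSP setup can be as small as a few lines of dnsmasq configuration. A sketch, where the address range and file paths are assumptions rather than anything from the post:

```
# /etc/dnsmasq.d/ltsp.conf -- illustrative values; adjust to your network
dhcp-range=192.168.0.50,192.168.0.200,12h
dhcp-boot=ltsp/i386/pxelinux.0
enable-tftp
tftp-root=/var/lib/tftpboot
```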

    Now, I want to stress that I am a proponent of terminals in only certain areas. A public library computer bank. A factory environment, where you want your server safe and securely away from sparks and heat. A customer service environment where the employee is only doing one or two things. My business ops people would have real computers for the reasons you mentioned. I want them to be accounting and developing even if the server is down.

  • by Suzuran ( 163234 ) on Monday October 13, 2008 @04:35AM (#25352483)

    Easy, same way I handle it at our office with our terminal server: "You can't do that."

    Employees have no business copying CDs worth of data to (or worse, from) the office. In the eight years since the implementation of our terminal server environment, I have had exactly zero cases where there was a legitimate need to copy large amounts of data from the terminal server.

    Your computer at work is for working, not playing games when you think nobody is watching. Almost all of the complaints I get from employees wanting a "real PC" instead of a thin client revolve around their desire to screw around on the clock without being detected.

    In 100% of the cases where the employee was granted a PC instead of a terminal, later investigation revealed unauthorized usage within one month, ranging from forging call sheets to play Flash games, to a salesman using over 75% of the company's total internet transfer in one month on MySpace.

  • by peragrin ( 659227 ) on Monday October 13, 2008 @06:47AM (#25353139)

    Here's the kicker: you can't easily run Citrix and Windows apps across a WAN. They need too much bandwidth and are too lag-sensitive.

    My company runs an AIX server with SSH access. Each user literally SSHes into the server, which loads up access to the point-of-sale/inventory database. Everything important is tightly controlled, but the fact that you can run it over a 33.6k dial-up modem effectively means that even if the internet is choking, you can still work.
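    Locking every login to a single application like this is typically done with OpenSSH's ForceCommand. A sketch of the server-side config, where the group and command names are assumptions, not details from the post:

```
# sshd_config on the application server (names are illustrative)
Match Group pos-users
    ForceCommand /usr/local/bin/pos-menu
    X11Forwarding no
```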

  • by NotBornYesterday ( 1093817 ) * on Monday October 13, 2008 @10:15AM (#25354927) Journal
    Thin clients are not always going to be the ideal desktop. However, different thin client solutions offer different levels of efficiency, so the math you reference above is not typical for many scenarios.

    As an example, SunRays [zdnet.com] generally scale much better than a cheap PC environment, with much better return on investment.

    You are going to be spending money on servers either way. According to your own figures, you have 7.5 users per server. SunRay solutions typically yield 20+ users per server CPU core. I'm not doubting your figures, but what do you guys do that requires so much back-end power? Are they single- or multi-CPU servers, and are they fully utilized or under-utilized? Obviously, I'm not in your position, but before I looked at desktop solutions, I'd look at server consolidation. VMware or similar might save you a bundle and make things easier to admin.

    As for new software, SunRay environments are pretty easy to patch and deploy new software in. As a matter of fact, that's one of the strengths - deploy the patch or app to a single server or a few servers, and you are done.

    Electricity is hardly a selling point if you're losing productivity and still spending the money on servers, to boot.

    Obviously, achieving functionality is more important than being efficient. However, the point of thin clients is that they generally keep office productivity the same or better, IT efficiency is tremendous, and the inequality ((thin client cost × users) + servers) < ((PC cost × users) + servers) generally holds true. At that point, saving several hundred kWh might be pretty attractive.
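    That inequality is easy to sanity-check with a toy model. All of the prices and ratios below are illustrative placeholders, not figures from the thread:

```python
import math

def fleet_cost(n_users, seat_cost, users_per_server, server_cost):
    """Total hardware cost: one device per seat, plus enough
    back-end servers to host every user."""
    servers = math.ceil(n_users / users_per_server)
    return n_users * seat_cost + servers * server_cost

# Thin clients: cheap seats, but more back-end servers needed.
thin = fleet_cost(200, seat_cost=250, users_per_server=40, server_cost=5000)
# Full PCs: expensive seats, back end only for shared services.
fat = fleet_cost(200, seat_cost=700, users_per_server=200, server_cost=5000)
```

With these made-up numbers the thin-client fleet comes out cheaper, but the comparison flips easily if the users-per-server ratio is poor, which is the parent's point.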

