Graphics Technology

One Video Card, 12 Monitors

Jamie found a story that might make your jaw drop if you happen to have some need to attach 12 monitors to your machine. Although if that isn't enough, you can always install two of these. I don't think I'm kidding.
  • by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Monday June 07, 2010 @12:25PM (#32485118) Journal
    I know that motherboards only support two, but I seem to recall a story about someone who might be interested in that [slashdot.org].

    Also, in the article, they call this behemoth "Powercolor innovation." I'd rather we called it "Powercolor scaling" unless they actually tackled the problem in some way other than slapping two cards together into one.
  • Multi-seat Computing (Score:5, Interesting)

    by timeOday ( 582209 ) on Monday June 07, 2010 @12:27PM (#32485136)
    I think a 4- or 6-core CPU could support 12 users in many cases. I could see building a computer lab at a school this way to minimize the administrative burden. But it's too bad multi-seat Linux doesn't work better. I have struggled with it on and off over the years, and it just doesn't seem to have the critical mass of interest needed to gain real distro support.
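
    A minimal sketch of the classic do-it-yourself approach, one X server per seat, each pinned to its own ServerLayout in a hand-written xorg.conf (the seat0/seat1 layout names here are made up):

        # Start one X server per seat. Each -layout name must match a
        # ServerLayout section in xorg.conf that binds one GPU head, one
        # keyboard, and one mouse to that seat.
        Xorg :0 vt7 -layout seat0 -sharevts -novtswitch &
        Xorg :1 vt7 -layout seat1 -sharevts -novtswitch &

        # Then point the display manager at :0 and :1 so each seat gets
        # its own login screen.
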
  • by ATestR ( 1060586 ) on Monday June 07, 2010 @12:33PM (#32485240) Homepage

    This is a cool card, but how many of us would ever buy one? Even if the cost of this unit is equivalent to that of another high-end video card, putting a dozen or so monitors on my desk is more cash than I budget in a year for toys.

    Admittedly, I find the idea of having many monitors attractive. I use a dual monitor setup at work, and I find it restrictive to go back to one monitor on my home laptop. What I'd like to have is a 2(h) x 3(w) array of monitors... someday.

  • Re:Sounds good. (Score:4, Interesting)

    by rcpitt ( 711863 ) on Monday June 07, 2010 @12:39PM (#32485330) Homepage Journal
    I currently run 4 monitors; I used to run 6, but that was before I got the 1920x1280 units I have now.

    IMHO - you will never come close to having a paperless office until the screen real estate comes at least close to (or over) the desk real estate.

    I write articles and code - and find that having the reference stuff up at the same time on another screen, with graphics on another, makes writing a LOT faster!!!

  • Users per computer (Score:3, Interesting)

    by tepples ( 727027 ) <tepples.gmail@com> on Monday June 07, 2010 @12:41PM (#32485358) Homepage Journal

    Realistically how many different displays can the average consumer use at a time?

    Consumer, singular, or consumers, plural? If mainstream operating systems didn't have a problem recognizing multiple keyboards and mice and separating their input, then one could share a desktop computer among multiple users that way. Then a personal computer could become a family computer,* and school computer labs could get away with using less hardware.

    * Even if you aren't running an NES emulator.
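
    The input-separation half of this already exists in X.Org as MPX (multi-pointer X, server 1.7 and later). A rough sketch; the device names are placeholders, use whatever "xinput list" reports:

        # Create a second master pointer/keyboard pair
        xinput create-master "second"

        # See which physical keyboards and mice are attached
        xinput list

        # Reattach the extra keyboard and mouse to the new master so a
        # second user gets an independent cursor and keyboard focus.
        # Device names below are examples only.
        xinput reattach "Logitech USB Keyboard" "second keyboard"
        xinput reattach "Logitech USB Mouse" "second pointer"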

  • Re:Sounds good. (Score:3, Interesting)

    by fuzzyfuzzyfungus ( 1223518 ) on Monday June 07, 2010 @12:53PM (#32485508) Journal
    Not really arbitrary: Historically, with analog outputs, you needed one RAMDAC, plus associated passives and connector, per video output. For cost reasons, one or two RAMDACs got folded pretty quickly into common display controller chipsets, just to save on the number of packages on the card. This area was where the massive economies of scale lived. If you didn't mind paying more, people like Matrox have always been willing to sell you cards with more heads.

    With the newer digital interconnects, you need a TMDS out, plus associated passives and connector, per video output. Again, deviating from the mass-market-friendly 1 or 2 outs configuration has always been possible; but pricey.

    The only really novel aspect of this ATI "Eyefinity" stuff is that ATI cranked up the number of outputs supported in its silicon, by default, so sharply that lots and lots of heads landed in the realm of "commodity gamer cards" rather than "underperforming, yet strikingly expensive, niche cards."
  • by vlm ( 69642 ) on Monday June 07, 2010 @01:06PM (#32485720)

    Would this card drive one dozen monitors set up as digital picture frames?

    I have a Linux-based file server in the basement that doesn't really do anything with its video output.

    If I could hook up 12 picture frame monitors in various rooms of my house, that would be fun.

    I don't want the extreme headache of manually updating 12 SDHC or CF cards. I don't want 12 individual stupid yearly subscriptions to some internet ripoff company that'll probably go out of business and make my investment obsolete the week after I buy them.

    I just want to drop .jpgs into certain folders on my pre-existing file server and have the pictures randomly displayed throughout the house, shuffling perhaps every 10 minutes. I'll also have certain webcams periodically downloaded and added to the mix, plus a cron job to display certain pictures at certain times, etc. A couple of lines of Perl, bash, and wget: that's what I'm talking about.
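
    A minimal sketch of that script, assuming the card is exposed as separate X screens :0.0 through :0.11 and feh is installed on the server; the paths and webcam URL are placeholders:

        #!/bin/bash
        PICS=/srv/pictures                  # drop .jpgs here

        # Pull one webcam snapshot into the mix (a cron job could refresh it)
        wget -q -O "$PICS/webcam-frontdoor.jpg" http://example.com/cam1.jpg

        # One randomized full-screen slideshow per output, advancing
        # every 10 minutes (600 seconds)
        for n in $(seq 0 11); do
            DISPLAY=:0.$n feh --fullscreen --auto-zoom --randomize \
                --slideshow-delay 600 "$PICS" &
        done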

  • http://www.newegg.com/product/product.aspx?item=N82E16824255011 [newegg.com]

    each is 1920x1200

    I put one in landscape mode, then I bought an articulating monitor arm and put the other one in portrait mode. The setup looks schizophrenic, but listen up, folks:

    Browsing the internet on a 16:9 monitor in portrait mode is a dream.

    Try it some day. You capture so much of a webpage that you'd otherwise be peering at through a slit, constantly scrolling, with lots of unused screen real estate on either side.

    As a web developer, it helps too, believe me: the landscape-mode screen for code, packet inspection, debugging, email, etc., and the other screen for a really good 10,000-foot overview of what you are actually putting up in the browser in terms of page layout.

    Trust me, folks: get a 16:9 monitor and put it in portrait mode if you browse the internet a lot. It is about as good as it gets in terms of UI experience.
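
    If you want to try it before buying the arm, the rotation itself is a software one-liner; the DVI-0/DVI-1 output names below are examples, run xrandr with no arguments to see yours:

        # Rotate the second monitor into portrait and put it to the right
        # of the first.
        xrandr --output DVI-1 --rotate left --right-of DVI-0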

  • by vlm ( 69642 ) on Monday June 07, 2010 @01:29PM (#32486082)

    Because of that you'd save little, if any, money over cheap systems acting as thin clients

    Good detailed technical analysis, but I can make an equally valid argument from a different angle.

    Unless you're doing something really weird or wrong, the cheapest parts of a computer lab are the hard drives, video cards, chassis, etc. Zero those out and you've got something very unusual, rare, and complicated, yet you're still left with 99% of the total cost: mostly salary and indirect costs (health insurance, pension, etc.) plus stuff like HVAC, the electric bill, the fractional capital expense of the building, and the cost of electrical and LAN wiring and related hardware. If you want to save a whopping 1% of the total cost of ownership, the very superficial answer is to just install 99 computers instead of 100.

    If your 24-room school costs $12M to build, which seems believable, then your empty room costs $500K. You can pull your hair out to "save" $2,500 worth of hard drives and $1,250 worth of chassis and power supplies, but that's a false economy. And you'll never be able to upgrade piecewise.
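
    Working those same numbers through, just to show where the "whopping 1%" comes from:

        # Cost per room if a 24-room school costs $12M to build
        echo $(( 12000000 / 24 ))                    # 500000

        # Fraction of that "saved" by dropping the drives and chassis
        echo "scale=4; (2500 + 1250) / 500000" | bc  # .0075, i.e. under 1%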

  • by Zerth ( 26112 ) on Monday June 07, 2010 @01:32PM (#32486122)

    How about 12 USB mini-monitors, with USB-to-network adapters?

    A fair bit cheaper, unless you want 15"+ frames.

  • Re:Sounds good. (Score:3, Interesting)

    by tinkerghost ( 944862 ) on Monday June 07, 2010 @01:41PM (#32486256) Homepage

    I write articles and code - and find that having the reference stuff up at the same time on another screen, with graphics on another, makes writing a LOT faster!!!

    My preference is 3 monitors:

    • Coding - the IDE or Nedit windows.
    • Reference - Usually both the language-reference windows for rarely used commands and the project reference on the same monitor.
    • Testing - Web browser or testing script windows depending on the project.

    With that setup, I don't have to flip between desktops to work, & doing reference checks is as simple as looking between the monitors. No flipping back & forth between the project reference & the test results; you just compare the 2 windows & you're done.

    That said, I can't even think of what I would do with 12 monitors, other than running a kiosk with each K/V/M setup dedicated to its own OS image.

  • Re:Sounds good. (Score:3, Interesting)

    by bami ( 1376931 ) on Monday June 07, 2010 @01:57PM (#32486466) Homepage

    Get your boss to buy some licenses for this:
    http://www.realtimesoft.com/ultramon/ [realtimesoft.com]

    I was an intern at some random software company for 6 months, and it helps with maximizing windows, with stupid pop-up boxes appearing everywhere, and just with sorting windows in general, even on mismatched monitors. I run it myself on a 1680x1050 monitor next to a 1280x1024 monitor, and it really helps with stupid dialog boxes.

  • Re:Only one problem (Score:3, Interesting)

    by vlm ( 69642 ) on Monday June 07, 2010 @03:48PM (#32488176)

    One of the projector models at one of the NOCs had a plan to prevent that:

    1) Integrated optics. The first lens was mounted in the lamp "module" and the module was sealed. You'd have to find a way to bust open the module without cracking the lens or screwing up its alignment.

    2) ID chip, much like an inkjet cartridge: "Hmmm, lamp serial number 98243804728531 has been operated for 1000 hours," or whatever. Yes, on the control menu, where you'd do things like brightness/contrast, there was an option to display the serial number and hours used on the screen.

    So there are some serious problems in the way, both optically and electronically.

    Supposedly this was a "feature," since a detonating halogen bulb could destroy the optics. So stop people from using one past its prime, and if it blows up and takes out the first lens, that's OK, since every bulb module comes complete with a new lens. Also, you can't touch the glass bulb if it's inside a sealed module. As a side note, it also made the projector very profitable for the manufacturer.

    And there is a problem in that I've never seen a single point-source white LED much above 6 watts or so. You can buy multi-LED modules that fit into a standard Edison light bulb socket, but that's not going to work. If you can buy a 100-watt single-chip LED, I'd be impressed to see it.

  • by Changa_MC ( 827317 ) on Monday June 07, 2010 @05:08PM (#32489248) Homepage Journal

    You ran explicitly unsupported software then.

    And a nettop runs Windows, which makes it another point of failure. The L series primarily saves on support, since all software deployment is on the server and all hardware is swappable on the fly. Two redundant servers replacing 20 workstations means zero downtime for about the same cost.

    I said elsewhere that we use x550s; that's where you save money.
