Graphics Hardware

In Memoriam: VGA (hackaday.com) 406

szczys writes: VGA is going away. It has been for a long time, but the final nails in the coffin are being driven home this year. It was the first standard for video, and is by far the longest-lived port on the PC. The extra pins made computers monitor-aware, allowing data about the screen type and resolution to be queried whenever a display was connected. But the connector is big and looks antiquated; there's no place for it in today's thin, design-minded devices. It is also an analog signaling mechanism in a world that has embraced high-speed digital for ever-increasing pixel counts and for passing more kinds of data through one connection. Most motherboards no longer have the connector, and Intel's new Skylake processors have removed native VGA functionality. Even online retailers have stopped including it as a filter option when choosing hardware.
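
(A side note on the "monitor-aware" claim: that refers to the DDC/EDID channel carried on a couple of the connector's extra pins. As a minimal sketch of how software might read it, assuming a video BIOS that implements the optional VBE/DDC extension, the 128-byte EDID block can be fetched from 16-bit real mode via INT 10h, AX=4F15h; the labels and buffer below are illustrative only.)

    bits 16

    read_edid:
        mov ax, 0x4F15      ; VBE/DDC function
        mov bl, 0x01        ; subfunction 1: read an EDID block
        xor cx, cx          ; controller unit 0 (primary display)
        xor dx, dx          ; EDID block number 0
        push cs
        pop es
        mov di, edid_buf    ; ES:DI -> 128-byte destination buffer
        int 0x10
        cmp ax, 0x004F      ; AX = 004Fh means supported and successful
        jne .no_ddc
        ; edid_buf now holds the vendor ID, supported timings, screen size, etc.
    .no_ddc:
        ret

    edid_buf times 128 db 0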
  • by Anonymous Coward on Saturday January 30, 2016 @10:36AM (#51402645)

    Um, WHAT THE FUCK???

    CGA? EGA? MDA? Hercules? NTSC? PAL? SECAM?

    "and is by far the longest-lived port on the PC."

    Serial port?

    Who the fuck wrote this piece of shit revisionist ignorant blurb?

    • And now we know what to expect from the new overlords.
    • Who the fuck wrote this piece of shit revisionist ignorant blurb?

      And his Hackaday shill

    • by Ol Olsoc ( 1175323 ) on Saturday January 30, 2016 @10:55AM (#51402745)

      Um, WHAT THE FUCK???

      CGA? EGA? MDA? Hercules? NTSC? PAL? SECAM?

      "and is by far the longest-lived port on the PC."

      Serial port?

      Who the fuck wrote this piece of shit revisionist ignorant blurb?

      One of those people who think that everything can be done on a smartphone

      • by __aaclcg7560 ( 824291 ) on Saturday January 30, 2016 @11:23AM (#51402891)

        One of those people who think that everything can be done on a smartphone.

        I had a friend who gave me an expensive Asus wireless router because he made a change to the configuration from his iPad that locked out his iPad. He refused to reset the router to factory settings and use my laptop to configure the settings via a wired connection. It had to be done through the iPad only. No matter how I tried to explain that what he wanted wasn't realistic, it had to be done the way he wanted it done. He went back to using the Comcast modem, which had an external button for turning on the wireless.

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          I know a few guys like that.

          The only explanation I can come up with is they are so butthurt from spending so much money on an iPad that the salesman/hype-machine promised them could do *everything* that they absolutely must use it for everything or risk admitting to themselves that they are a sucker.

          No one wants to be a sucker.

    • What about the port you stick the power cable into?

      • by Osgeld ( 1900440 ) on Saturday January 30, 2016 @11:14AM (#51402829)

        Serial ports were around back when the power cable was still attached.

        Hell, serial ports predate computers.

        • by Junta ( 36770 ) on Saturday January 30, 2016 @11:31AM (#51402957)

          For those wondering, it seems that C13 (the power plug) was 1970. DB-9 dates to 1952, though RS232 dates to 1969 (still older than C13).

          Of course, I would say DB-9 has been far from ubiquitous for quite a few years. Most boards have a header for it (not much reason not to have that), but even servers increasingly omit the physical connection (favoring instead using the network to get at serial port data). On datacenter network equipment, they generally use something like a mini-USB or smaller form factor, or even tip-ring-sleeve jacks, breaking out to DB-9, because they don't want to spend the precious port real estate on something as large as a DB-9.

          So C13 is not longer-lived than DB-9, though one could argue it has had the "longest life" compared to RS-232 over DB-9, if you accept that the past few years don't count for DB-9 so much (clearly still around, but usually only via an adapter or breakout).

          • Re: (Score:2, Informative)

            by Anonymous Coward
            'taint no such connector as a "DB-9." There's a DB-25, even a DB-37. There's also a DE-9 and a DE-15. But a DB-9 doesn't exist, despite what lots of people incorrectly call a DE-9.
            • by tepples ( 727027 )

              DB is the shell and 9 is the number of pins. Therefore, a DB9 is a DB25 with most of the pins missing. There is no standardized DB9, but one wouldn't be unjustified in claiming that the de facto DB9 is a DB25 that has only the pins used by a PC serial port (the ones that have counterparts on the DE9).

              • by thegarbz ( 1787294 ) on Saturday January 30, 2016 @06:04PM (#51405505)

                but one wouldn't be unjustified in claiming that the de facto DB9 is DB25 that has only the pins used by a PC serial port

                Except that's not what the de facto use describes. DB denotes the shell size, commonly the one with 25 pins in it, the DB25. What people call a "de facto DB9" is the DE shell size. Any way you cut it, the common usage is wrong.

                At least it would be if the definition were regulated at all. Since it's not, it's kind of hard to argue that a DB-9 isn't just another name for a DE-9, given how even manufacturers of connectors [farnell.com] are using that nomenclature.

          • by TWX ( 665546 ) on Saturday January 30, 2016 @12:10PM (#51403225)

            Of course I would say DB-9 has been far from ubiquitous for quite a few years. Most boards have a header for it (not much reason to not have that). Even in servers, they increasingly omit a physical connection (favoring instead using network to get serial port data).

            The last generation of desktop computers I've routinely worked with at work, the Dell OptiPlex 7010, has DB-9 serial, and it looks like the fourth-quarter 2015 Dell OptiPlex 7040 still has a DB-9 serial port as well.

            I had to do firmware updates on some Fluke network testers last week. Admittedly these were slightly older models, but the update gave them the ability to identify 1G advertisement from the switch, to do in-line PoE voltage monitoring, to identify the appliance/voice VLAN, and to identify CDP from the switch. Doing this required the use of a serial cable with good old pins 2, 3, and 5 for receive, transmit, and ground respectively. Getting the serial part of the process going was harder than it should have been: trying to use a serial-less Windows 8.1 laptop with adapters was a challenge, and I finally ended up getting out a WYSE 52 terminal and a null-modem cable to see if the software on the PC was actually sending anything out through Microsoft's weird wrappers on top of the Keyspan USB-to-serial adapter. Having established that yes, the software was talking, I then had to figure out why the scanner wasn't acknowledging. It turned out the problem was the socket for the 2.5mm phono jack on the scanner itself.

            Anyway, as much as some of us might like for RS-232 serial to be dead, it doesn't look like we can write it off entirely any time soon, given the sheer expense of the kinds of devices that we have to support that use it. It's a lot easier to give up VGA because monitors, by and large, are not expensive, and even when they are there will still be methods to get analog video to them, either through add-in cards or through conversion devices.

            • It's a lot easier to give up VGA because monitors, by and large, are not expensive, and even when they are there will still be methods to get analog video to them either through add-in cards or through conversion devices.

              This is precisely what is happening on Intel Skylake motherboards. The chipset/processor doesn't support VGA, but there are something like three lines for display outputs internally. It is already common for a converter to be built onto the motherboard so that one of those outputs ends up as VGA instead of digital, and that is cheap enough.

              A cursory look at current motherboards ("bottom of the barrel" on price) tells me the COM port is still quite common, and even LPT is sometimes still available on the back.

              See :
              Gigabyt

            • In a previous job (now some 9 years ago) I had several big-metal servers whose early boot systems (like a BIOS, but not) could only be accessed over serial ports, and for repairs and maintenance a serial terminal was critical.
              Everybody else used software terminal emulators with USB adapters, but it was a constant nightmare: it would be months between uses, and the next time one was needed, invariably something would have gotten messed up between the drivers and the emulator.
              So I hunted around and bought an old

          • by fisted ( 2295862 ) on Saturday January 30, 2016 @12:25PM (#51403333)

            The connector is gone, but the need for something equivalent persists. Networks, adapters, etc. are nice, but they are very complicated to use; complicated enough to require a device driver [stack], which implies a booted operating system.

            Until the OS is booted, all those ports are dark; in other words, one cannot use them for debugging the boot process, or the (booting) loader and kernel. The IBM PC, as much as I despise it, makes using the serial port trivial, since the BIOS effectively has a device driver for it (although driving it manually isn't much of a big deal either).

            It takes:
            mov ah, 1      ; BIOS INT 14h service 1: transmit the character in AL
            mov al, <char> ; the byte to send
            mov dx, 0      ; port number 0 = COM1
            int 14h

            to vomit <char> out the serial port from 16-bit real mode (i.e. the mode the loader starts in).

            So one way or another, a serial port (equivalent) will persist. It might get a little harder to access, though (e.g. some Android phones have their serial console going out the audio jack...), but it can't be done away with altogether. (A slightly fuller sketch follows below.)
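
            A slightly fuller sketch of the same idea (hypothetical labels, assuming only the standard BIOS INT 14h serial services; routines like these are callable from a 16-bit real-mode loader before any OS or driver stack is up):

            bits 16

            serial_init:
                mov ah, 0x00        ; INT 14h service 0: initialize the port
                mov al, 0xE3        ; 9600 baud, 8 data bits, no parity, 1 stop bit
                xor dx, dx          ; DX = 0 -> COM1
                int 0x14
                ret

            ; DS:SI -> zero-terminated string to transmit
            serial_puts:
            .next:
                lodsb               ; AL = next byte, advance SI
                test al, al
                jz .done
                mov ah, 0x01        ; INT 14h service 1: transmit the character in AL
                xor dx, dx          ; COM1 again
                int 0x14
                jmp .next
            .done:
                ret

            msg db "hello from the loader", 13, 10, 0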

        • serial ports were around back when the power cable was still attached

          hell serial ports predate computers

          9-pin serial ports were a nonstandard "optimization" introduced with the PC/AT, which was in the early 1980s. These ports have arguably been more dead than the VGA connector for some time. A couple of motherboards I bought this year still happen to have VGA connectors, but no external 9-pin serial port.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      It was the first standard that most millennials ever had to deal with. That's the recognized standard for fact checking on the Internet.

    • Rob Malda just had an aneurysm; either that, or he's laughing his ass off.

    • What do you expect - it's from hackaday - you just know it's bullsh*t written "because we need to write something."

      Add in Hercules port, parallel port.

      Also

      Most motherboards no longer have the connector,

      I guess he hasn't bought a recent laptop - mine has both HDMI and VGA. And even this gamer laptop at Tiger Direct [tigerdirect.ca] has VGA out.

      ASUS ROG G751JT-DB73 - Core i7 4720HQ / 2.6 GHz - Windows 8.1 64-bit - 16 GB RAM - 256 GB SSD + 1 TB HDD - DVD-Writer - 17.3" 1920 x 1080 ( Full HD ) - NVIDIA GeForce GTX 970M - 802.11ac - black

      Connections & Expansion
      Interfaces: Headphone/SPDIF combo jack ¦ Thunderbolt ¦ HDMI ¦ 4 x USB 3.0 ¦ LAN ¦ VGA ¦ Microphone input
      Memory Card Reader: 2 in 1 ( SD Card, MultiMediaCard )

      • by Junta ( 36770 )

        It's common to slap VGA on a larger laptop, but at "ultrabook" thickness a lot of ports get skipped (mine even skips an RJ-45 Ethernet jack, though it still has an Ethernet port and a passive breakout to provide an RJ-45 for it).

        • I don't see the allure of an ultrabook. I've always used my laptops only as desktop replacements plugged into a second screen for dual-monitor use (quiet counts for a lot once you notice just how much nicer the work environment becomes when the PC power supply and CPU fan whines are gone).

          • by Anonymous Coward

            So you don't see the point of an ultrabook because you don't use your laptop as an actual mobile device. Surprising.

          • by gmack ( 197796 ) <gmack@noSpAM.innerfire.net> on Saturday January 30, 2016 @12:16PM (#51403265) Homepage Journal

            That.. is the single most misguided reason I've ever heard for choosing a laptop over a desktop. My desktop PC was built with quiet components; only if I push the graphics really hard (games, not HD movies) can I hear the fan on that start up.

            For my trouble, I get more RAM, a more powerful CPU, better graphics, and far more expansion ports, and my laptop stays on a shelf unless I'm travelling or I need an on-site computer for a contract, and in both of those cases size really does matter.

    • Um, WHAT THE FUCK???

      CGA? EGA? MDA? Hercules? NTSC? PAL? SECAM?

      "and is by far the longest-lived port on the PC."

      Serial port?

      Who the fuck wrote this piece of shit revisionist ignorant blurb?

      Not to mention a total BS premise. They have been saying the serial port is dead for DECADES! And I can't tell you how often my USB floppy drive has been a life saver at a client with critical data on floppy and no floppy drives to read it!

      And does the moron know that the DVI ports that are now on most motherboards HAVE FUCKING VGA BUILT IN?!?

    • It was the first decent standard for MS-DOS/Windows video. Everything before it was a pile of shit, where you needed a new standard every time a higher resolution became available. Remember separate modes for text and low-res graphics? Remember how painful those early PC monitors were to work on?

      • by Anonymous Coward on Saturday January 30, 2016 @11:53AM (#51403117)

        It was evolutionary rather than revolutionary. The EGA was its immediate predecessor and was pretty good, except that for some reason its resolution didn't conform to the 4:3 aspect ratio that was standard at the time.

        VGA came along with 640x480 resolution, which was decent. But within a couple of years that was obsolete, so there came 800x600 and then a succession of "Super VGA" "standards" (as in the joke, there are so many of them), all with different resolutions higher than 800x600, some supporting wide aspect ratios. (IBM's own follow-on, XGA, was 1024x768.)

        BTW, VGA also supported two low-res, 256-color modes, mode 13h and Mode X [wikipedia.org], which became favorites of DOS gamers because they were well suited to smooth animation while not requiring exorbitant amounts of installed RAM. (A minimal sketch of putting mode 13h to use follows below.)
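
        A minimal sketch of what that looked like in practice (a hypothetical DOS .COM program, not anyone's actual game code): set mode 13h through the BIOS, poke one pixel directly into the A000h video segment, wait for a key, and restore text mode.

        bits 16
        org 0x100               ; DOS loads .COM programs at offset 100h

        start:
            mov ax, 0x0013      ; INT 10h AH=00h: set video mode, AL=13h (320x200, 256 colors)
            int 0x10

            mov ax, 0xA000      ; the mode 13h framebuffer lives at segment A000h
            mov es, ax
            mov di, 100*320+160 ; offset = y*320 + x, i.e. the pixel at (160,100)
            mov byte [es:di], 4 ; palette index 4 is red in the default palette

            xor ax, ax          ; INT 16h AH=00h: wait for a keypress
            int 0x16

            mov ax, 0x0003      ; INT 10h: back to 80x25 text mode
            int 0x10

            mov ax, 0x4C00      ; INT 21h AH=4Ch: exit to DOS
            int 0x21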

    • Re: (Score:3, Informative)

      by Dunbal ( 464142 ) *
      To be honest, VGA (like EGA, CGA, XGA, etc.) was not a "standard". VESA was a standard.
    • If you were extremely charitable you might describe it as the first standard for computer video output which could actually handle video, except even that would be wrong. The first such standard was RGB with separated syncs, then came sync-on-green RGB. VGA came from that and added the communications channel.

    • For those who are too young to remember...

      CGA = Crap Graphic Adaptor = no porn
      EGA = Extra Graphic Adaptor = some porn
      VGA = Very Graphic Adaptor = loads of porn

  • by oic0 ( 1864384 ) on Saturday January 30, 2016 @10:36AM (#51402647)
    It certainly has stopped being so popular, but it isn't likely to fade completely away for a long time. I still see it on monitors and TVs. These thin devices that have no port usually have a DisplayPort that easily converts to VGA with a cheap dongle.
    • by Anonymous Coward on Saturday January 30, 2016 @10:47AM (#51402709)

      I still see it on monitors and TVs.

      And projectors! How else can I connect to those projectors if not VGA? And their life span is probably decades. I think new projectors actually offer alternatives to VGA as options, but usually that means HDMI, which I predict is going away sooner than VGA. (HDMI being replaced by DP.)

      • by AthanasiusKircher ( 1333179 ) on Saturday January 30, 2016 @11:50AM (#51403097)

        And projectors! How else can I connect to those projectors if not VGA? And their life-span is probably decades. I think the new projectors actually have alternatives to VGA optional, but usually this is HDMI,

        THIS. The person who wrote TFA must not do any presentations anywhere ever. Yes, new projectors often have other inputs, but that's often irrelevant in a conference venue or a classroom or whatever, where often there's ONE cable that's presented to you to hook in your laptop -- and it's a VGA cable (often with an audio headphone jack plug, if you need it).

        That's the same as it was most places decades ago. If your laptop today doesn't have a VGA port, you get a dongle. Everybody who needs to plug into a projector has a standard VGA one. Switching to another standard would require a major initiative, since this is NOT a place where you can just adopt a different standard on the fly.

        Probably tens of thousands of people show up at an unfamiliar place every day and expect to be able to plug a laptop into a projector to give a presentation. For better or for worse, everybody knows that you bring a connector for VGA, and if you change that, you need to be darn sure all of your presenters know it (and, even if they do, lots of people who give talks can be old and won't understand if they show up with a laptop that doesn't connect to something else, so you'll be scrambling at the last minute to move stuff to another computer or whatever).

        I don't see this standard switching anytime soon -- it tends to be used in high-profile, time-sensitive situations where people expect to be able to plug a computer in and have it work instantly. Unless a venue is going to provide a dongle that fits every possible port on the planet (and most don't), it will be really hard to switch.

        The only thing that will eventually allow the switch won't be a new port standard, but rather wireless broadcast of video directly to the projector. It's still quite rare, but it's feasible and the only way to get out of the VGA rut. I doubt HDMI/DP/whatever is EVER going to overcome VGA for such applications -- the next "standard" won't have cables at all.

    • I have bought some Dell R430 and R730 servers, which are the latest generation (Haswell-based Xeons, DDR4 RAM), and guess what their one and only video output format is? That's right, a VGA port. No DVI, no DP, just VGA. No surprise either: go have a look at high-end networked KVMs. They are all VGA. It works, so it is staying around in that space (same deal as serial, for that matter).

      It is certainly a standard on the decline, digital transmission makes more sense particularly since our displays are digital thes

  • not first standard (Score:5, Informative)

    by Anonymous Coward on Saturday January 30, 2016 @10:38AM (#51402659)

    "It was the first standard for video" - not quite
    Perhaps NTSC monochrome RS-170 on a coax connector might be the first standard for video.
    And even in the IBM PC world, monochrome and CGA were earlier.
    Of course, perhaps the author of this article wasn't alive back then, and hasn't yet learned to "check your sources before publishing"

  • by Anonymous Coward on Saturday January 30, 2016 @10:41AM (#51402681)

    Sure the world must move on some day, but I'd like to point out that at least for me HDMI has only ever brought disadvantages. Apart from severe problems with dealing with several audio channels or routing audio to external analog speakers, it also had and still has the charming property of turning the whole display black for 1-2 seconds from time to time. Not to speak of countless problems with false colors and red-tinted display on my Philips TV.

    Frankly speaking, after years of using it I have come to the conclusion that HDMI is just shit in comparison to analog VGA, no matter how much clearer the display may seem. I believe it was mainly forced on everyone to introduce DRM crap and to sell expensive cables; VGA would do well enough. Digital is not always the best.

    • I believe it was mainly forced on everyone for introducing DRM crap and to sell expensive cables and VGA would do well enough. Digital is not always the best.

      This! And it is the reason they are trying to kill VGA.

    • Apart from severe problems with dealing with several audio channels or routing audio to external analog speakers, it also had and still has the charming property of turning the whole display black for 1-2 seconds from time to time. Not to speak of countless problems with false colors and red-tinted display on my Philips TV.

      I think I found the source of your problem ... :-)

    • by ColdWetDog ( 752185 ) on Saturday January 30, 2016 @11:33AM (#51402971) Homepage

      HDMI sucks [bluejeanscable.com]:

      HDMI is a horrid format; it was badly thought out and badly designed, and the failures of its design are so apparent that they could have been addressed and resolved with very little fuss. Why they weren't, exactly, is really anyone's guess, but the key has to be that the standard was not intended to provide a benefit to the consumer, but to such content providers as movie studios and the like. It would have been in the consumer's best interests to develop a standard that was robust and reliable over distance, that could be switched, amplified, and distributed economically, and that connects securely to devices; but the consumer's interests were, sadly, not really a priority for the developers of the HDMI standard. ... HDMI has presented a few problems. Unlike analog component video, the signal is not robust over distance because it was designed to run balanced when it should have been run unbalanced (SDI, the commercial digital video standard, can be run hundreds of feet over a single coax without any performance issues); the HDMI cable is a complicated rat's-nest arrangement involving nineteen conductors; switches, repeaters and distribution amplifiers for use with HDMI cable, by virtue of this complicated scheme, are made unnecessarily complicated and troublesome; and the HDMI cable plug is prone to falling out of the jack with the slightest tug. On the plus side, in the great majority of simple installations,

      • I love their candor, and as a result I own a lot of Blue Jeans' cables. They're good cables, but I regret it now. They run from a 4x4 matrix in my basement to every room in the house, and work flawlessly, but now it's cheaper just to add an Apple TV or Amazon Fire Stick to every TV rather than run HDMI from a central HTPC.

  • by MrKaos ( 858439 ) on Saturday January 30, 2016 @10:44AM (#51402693) Journal

    When it first came out I remember thinking of its acronym that way, instead of Video Graphics Array.

  • by fluffernutter ( 1411889 ) on Saturday January 30, 2016 @10:46AM (#51402705)
    I feel sorry for a world that must get rid of electronics because they use a port that looks old. I have three VGA monitors that work perfectly fine, and I hope not to have to throw them in the trash before they stop working perfectly fine.
    • I still have at least two composite-input monochrome monitors that work perfectly fine, or did when I last tried them -- probably twenty years or so ago. I intend one day to haul out the old TRS-80 Model I and see if it still works. If not, I stand a really good chance of successfully repairing it myself, unlike most electronics released in the last couple of decades. (Of course, it's more likely to work than more recent equipment, if only because it predates the biggest capacitor-quality catastrophes.)

      But

      • I agree, phones shouldn't be the size of bricks for the sake of ports. However, I don't see the need to get a laptop from an inch thick to 1/4-inch thick either. That comes from the pursuit of the new shiny toy that looks cool, not from a functional reason.
        • I would tend to call "Weighs a fraction of what a 1" thick laptop would and is much easier to carry around when traveling" a functional reason.

          Source: I learned the hard way that carrying around a 17" gaming laptop that weighed in at 23 lbs with its power supply was not functionally sustainable when travelling to conferences.

          • I will never get that. How are you functionally being prevented from carrying a full-sized laptop? I regularly carry 2-3 full-sized laptops when traveling; no one has ever told me I cannot get on the plane with them. Again, having a bag that weighs 5 pounds versus a bag that weighs 15 pounds is for your convenience.
            • Also, I'm not even saying you shouldn't be able to have that laptop. If you want to have a laptop with DisplayPort or HDMI only, then fine. I'm saying it's sad that the whole market tends to move that way, even when there are plenty of people happy with full-sized laptops.
            • Convenience is functionality. Lugging a 20 (or whatever) lb laptop together with a bunch of other luggage through a crowded airport, or half a mile down a city street from train station to hotel, is noticeable over time. Certainly you CAN do it, but when you can get the same computing power in a lighter package, and travelling is part of your job, why wouldn't you?

              Now if I was just driving to the office and walking 60 feet or so at each end, then I could see favoring a heavy laptop, but not just so I could have

        • However, I don't see the need to get a laptop from an inch thick to 1/4-inch thick either. That comes from the pursuit of the new shiny toy that looks cool, not from a functional reason.

          Actually, small is good for some people. Essentially, all they need is a Windows tablet with a real keyboard, and they do not want to lug around a lot of weight.

    • by Junta ( 36770 )

      Note that they offer mDP-to-VGA adapters if you really want to be compatible with those monitors (or if you are frequently expected to hook up to random projection setups; I've seen some conference rooms that were *constructed* in 2015 provide only VGA, strangely enough).

      • Yes, and I swear at Apple every time I go to use my MacBook on a monitor and I can't find my adapter. When I bought the MacBook, it didn't even occur to me that there would be anything different from VGA, because VGA was still working for me everywhere else. All the other systems I purchased at the same time or later came with a VGA port, and I really wasn't even looking for a VGA port on them. I know I'm fighting a losing battle here, because we live in a world where consumers want what they want, and i
  • by PopeRatzo ( 965947 ) on Saturday January 30, 2016 @10:57AM (#51402755) Journal

    You think we can get some better snacks in the commenter lounge vending machine? I know there's going to be belt-tightening, but Bit-O-Honey and yogurt granola bars aren't going to cut it.

  • by Anonymous Coward on Saturday January 30, 2016 @11:07AM (#51402795)

    Yea, VGA needs to FOAD because it looks antiquated.

    Come to think of it, you're looking rather antiquated. What to do about you?

    Slashdot looks totally antiquated too. I can't wait for the new owner to implement a beta interface design that better monetizes community synergies. Make sure you model it after flat UI design so no one can see or find anything. It'll look so sexy!

  • RIP DVI... (Score:5, Interesting)

    by __aaclcg7560 ( 824291 ) on Saturday January 30, 2016 @11:09AM (#51402805)
    I was looking at new monitors recently. It seems like DVI is going away sooner than VGA. Many monitors have VGA, HDMI, and occasionally DisplayPort connectors. The only two connectors I use on my home network are VGA for servers and HDMI for everything else.
  • by Tuor ( 9414 ) <tuor.beleg@gTWAINmail.com minus author> on Saturday January 30, 2016 @11:12AM (#51402815) Homepage

    It's still almost everywhere. At work we still have VGA monitors and docks. The monitor also has a digital connector of some kind, but never more than two other flavors. My TVs have VGA.

    You know what's great about VGA sticking around? Older equipment that was often expensive and built like a tank still works. Projectors, CRTs, and KVMs. I've seen retrocomputer enthusiasts build VGA adapters for all kinds of old systems. It's nice to have something that you can rely on when you're traveling; if you have a VGA dongle you know you can work.

    I hope VGA has a couple more decades in it, and with the slow adoption of 4K TVs, it just might.

  • Until then I have my doubts. Also, most of the projectors at my company still work and still have VGA only. Until those $2000 behemoths start dying off, VGA ain't going anywhere.
  • by EvilSS ( 557649 ) on Saturday January 30, 2016 @02:11PM (#51403985)
    My favorite quote from TFA: "Unless the monitor you’re viewing this on weights more than 20 pounds and is shooting x-rays into your eyes, there’s no reason for your monitor to use a VGA connector."

    I thought this bullshit line of thought died out in the 60's or 70's.
  • by rrohbeck ( 944847 ) on Saturday January 30, 2016 @03:17PM (#51404357)

    I was just in a meeting with Dell about their next generation. There's no demand for anything but VGA because all the infrastructure is VGA-based. There isn't even IPMI for anything but VGA.

"If it ain't broke, don't fix it." - Bert Lantz

Working...