In Memoriam: VGA (hackaday.com) 406
szczys writes: VGA is going away. It has been for a long time, but the final nails in the coffin are being driven home this year. It was the first standard for video, and is by far the longest-lived port on the PC. The extra pins made computers monitor-aware, allowing data about the screen type and resolution to be queried whenever a display was connected. But the connector is big and looks antiquated; there's no place for it in today's thin, design-minded devices. It is also a mechanism for analog signaling in a world that has embraced high-speed digital for ever-increasing pixel counts and for passing more data through a single connection. Most motherboards no longer have the connector, and Intel's new Skylake processors have removed native VGA functionality. Even online retailers have stopped including it as a filter option when choosing hardware.
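(The "monitor-aware" extra pins the summary refers to are the DDC lines, over which a display returns an EDID block describing itself. As a hedged aside, decoding the three-letter vendor ID packed into EDID bytes 8-9 looks roughly like this; the sample bytes are Dell's well-known PnP ID, used purely for illustration.)

```python
def decode_vendor(b8, b9):
    """Decode the EDID manufacturer ID: three 5-bit letters, 'A' == 1,
    packed big-endian into bytes 8 and 9 of the EDID base block."""
    word = (b8 << 8) | b9
    return ''.join(chr(((word >> shift) & 0x1F) + ord('A') - 1)
                   for shift in (10, 5, 0))

print(decode_vendor(0x10, 0xAC))  # -> 'DEL' (the PnP ID Dell uses)
```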
It was the first standard for video? (Score:5, Insightful)
Um, WHAT THE FUCK???
CGA? EGA? MDA? Hercules? NTSC? PAL? SECAM?
"and is by far the longest-lived port on the PC."
Serial port?
Who the fuck wrote this piece of shit revisionist ignorant blurb?
Re: (Score:2)
Same as the old overlords? No clue about history, or that pissing in people's cornflakes is a bad idea.
Re: (Score:3, Insightful)
I don't know if it's fair to blame the people who just took over the site for a long-standing editor posting a story that was written by another third party. You might as well blame Obama for this, because I'm sure it's somehow his fault as well.
Every story, save for a couple, since the announcement of the takeover has ostensibly been posted by timothy. Samzenpus and Soulskill have apparently been put out to pasture, and I suspect that this 'timothy' is 'timothy' in name only. It's likely a shell account, run by who knows who, that the new editorial overlords will use while they transition to their new staff.
TIMMAY!!!!!! (Score:3)
Who the fuck wrote this piece of shit revisionist ignorant blurb?
And his Hackaday shill
Re:TIMMAY!!!!!! (Score:5, Informative)
Definitely a hackaday shill account. I got bored after looking through his first 45 submissions [slashdot.org] - it's all hackaday, all the time.
Re: (Score:3, Funny)
I don't read hackaday but, from this brief exposure, I can tell they're asshats.
1. They think cheap, reliable, widely-supported hardware is going away because all computer users are "design conscious."
2. They use white text on a black background.
Also, when my CHIP arrives, it'll have a VGA adapter.
Re:It was the first standard for video? (Score:5, Funny)
Um, WHAT THE FUCK???
CGA? EGA? MDA? Hercules? NTSC? PAL? SECAM?
"and is by far the longest-lived port on the PC."
Serial port?
Who the fuck wrote this piece of shit revisionist ignorant blurb?
One of those people who think that everything can be done on a smartphone
Re:It was the first standard for video? (Score:5, Interesting)
One of those people who think that everything can be done on a smartphone.
I had a friend who gave me an expensive Asus wireless router because he made a change to the configuration from his iPad that locked out his iPad. He refused to reset the router to factory settings and let me use my laptop to configure the settings via a wired connection; it had to be done through the iPad only. No matter how I tried to explain that what he wanted wasn't realistic, it had to be done the way he wanted it done. He went back to using the Comcast modem, which had an external button for turning on the wireless.
Re: (Score:3, Insightful)
I know a few guys like that.
The only explanation I can come up with is they are so butthurt from spending so much money on an iPad that the salesman/hype-machine promised them could do *everything* that they absolutely must use it for everything or risk admitting to themselves that they are a sucker.
No one wants to be a sucker.
Re: (Score:3)
What about the port you stick the power cable into?
Re:It was the first standard for video? (Score:4, Informative)
serial ports were around back when the power cable was still attached
hell serial ports predate computers
Re:It was the first standard for video? (Score:5, Insightful)
For those wondering: it seems that C13 (the power plug) dates to 1970. DB-9 dates to 1952, though RS-232 dates to 1969 (still older than C13).
Of course, I would say DB-9 has been far from ubiquitous for quite a few years. Most boards have a header for it (not much reason not to have that), but even servers increasingly omit the physical connector, favoring the network to get at serial port data. On datacenter network equipment, they generally use something like a mini-USB or smaller form factor, or even sleeve-tip-ring ports, breaking out to DB-9, because they don't want to spend precious port real estate on something as large as a DB-9.
So C13 is not longer-lived than DB-9, though one could argue it has had the longest continuous life compared to RS-232 over DB-9, if you accept that the past few years don't count for DB-9 so much (clearly still around, but usually only via an adapter or breakout).
Re: (Score:3)
DB is the shell and 9 is the number of pins. Therefore, a DB9 is a DB25 with most of the pins missing. There is no standardized DB9, but one wouldn't be unjustified in claiming that the de facto DB9 is DB25 that has only the pins used by a PC serial port (the ones that have counterparts on the DE9).
Re:It was the first standard for video? (Score:4, Informative)
but one wouldn't be unjustified in claiming that the de facto DB9 is DB25 that has only the pins used by a PC serial port
Except that's not what the de facto use describes. DB denotes the shell size, commonly the one with 25 pins in it (DB25). What people call a "de facto DB9" is a DE shell size. Any way you cut it, the common usage is wrong.
At least it would be if the definition was regulated at all. Since it's not it's kind of hard to argue that a DB-9 isn't just another name for a DE-9 given how even manufacturers of connectors [farnell.com] are using that nomenclature.
Re:It was the first standard for video? (Score:5, Informative)
Of course I would say DB-9 has been far from ubiquitous for quite a few years. Most boards have a header for it (not much reason to not have that). Even in servers, they increasingly omit a physical connection (favoring instead using network to get serial port data).
Last generation of desktop computers I've routinely worked with at work, Dell Optiplex 7010, has DB-9 serial, and it looks like the 4th Quarter 2015 Dell Optiplex 7040 still has a DB-9 serial port as well.
I had to do firmware updates on some Fluke network testers last week. Admittedly these were slightly older models, but the update gave them the ability to identify 1G advertisement from the switch, to do in-line PoE voltage monitoring, to identify the appliance/voice VLAN, and to identify CDP from the switch. Doing this required the use of a serial cable with good old pins 2, 3, and 5 for receive, transmit, and ground respectively. Getting the serial part of the process going was harder than it should have been: trying to use a serial-less Windows 8.1 laptop with adapters was a challenge, and I finally ended up getting out a WYSE 52 terminal and a null-modem cable to see whether the software on the PC was actually sending anything out through Microsoft's weird wrappers on top of the Keyspan USB-to-serial adapter. Having established that yes, the software was talking, I had to figure out why the scanner wasn't acknowledging. That turned out to be a problem with the socket for the 2.5mm phono jack on the scanner itself.
Anyway, as much as some of us might like for RS-232 serial to be dead it doesn't look like we can write it off entirely any time soon, given the sheer expense of the kinds of devices that we have to support that use it. It's a lot easier to give up VGA because monitors, by and large, are not expensive, and even when they are there will still be methods to get analog video to them either through add-in cards or through conversion devices.
Yes, VGA not dead (Score:3)
It's a lot easier to give up VGA because monitors, by and large, are not expensive, and even when they are there will still be methods to get analog video to them either through add-in cards or through conversion devices.
This is precisely what is happening on Intel Skylake motherboards. The chipset/processor doesn't support VGA, but there are something like three lanes for display outputs internally. It is already common for a converter to be built onto the motherboard so that one of the outputs ends up as VGA instead of digital, and that is cheap enough.
A cursory look at current motherboards ("bottom of the barrel" on price) tells me the COM port is still quite common, and even LPT is sometimes still available on the back.
See :
Gigabyt
Re: (Score:3)
In a previous job (now some 9 years ago) I had several big-metal servers which could only have their early boot systems (like a BIOS but not) accessed over serial ports, and for repairs and maintenance a serial terminal was critical.
Everybody else used software terminal emulators with USB adapters but it was a constant nightmare, it would be months between uses and the next time one was needed, invariably something would get messed up between the drivers and the emulator.
So I hunted around and bought an old
Re:It was the first standard for video? (Score:4, Informative)
The connector is gone, but the need for something equivalent persists. Networks, adapters, etc. are nice, but they are very complicated to use; complicated enough to require a device driver [stack], which implies a booted operating system.
Until the OS is booted, all those ports are dark, IOW, one cannot use them for debugging the boot process, or the (booting) loader and kernel. The IBM PC, as much as I despise it, makes using the serial port trivial, since the BIOS effectively has a device driver for it (although manually driving it isn't much of a big deal either).
It takes:
mov ah, 01h     ; BIOS serial service 01h: write character
mov al, <char>  ; character to send
mov dx, 0       ; COM1 (port number 0)
int 14h
to vomit <char> out the serial port from 16-bit real mode (i.e., the mode the loader starts in).
So one way or another, a serial port (equivalent) will persist. It might get a little harder to access, though (e.g. some Android phones have their serial console going out the audio jack...), but it can't be done away with altogether.
Re: (Score:3)
serial ports were around back when the power cable was still attached
hell serial ports predate computers
9-pin serial ports were a nonstandard "optimization" introduced with the PC/AT in the early 1980s. These ports have arguably been more dead than the VGA connector for some time. A couple of motherboards I bought this year still happen to have VGA connectors, but no external 9-pin serial port.
Re: (Score:2, Funny)
It was the first standard that most millennials ever had to deal with. That's the recognized standard for fact-checking on the Internet.
Re: (Score:3)
Rob Malda just had an aneurysm; either that, or he's laughing his ass off.
Re: (Score:2)
He's in the Taco Cave enjoying all the Taco Cash he got for selling this joint.
Laughing his ass off, I hope.
Re: (Score:3)
What do you expect - it's from hackaday - you just know it's bullsh*t written "because we need to write something."
Add in Hercules port, parallel port.
Also
Most motherboards no longer have the connector,
I guess he hasn't bought a recent laptop; mine has both HDMI and VGA. And even this gamer laptop at Tiger Direct [tigerdirect.ca] has VGA out.
ASUS ROG G751JT-DB73 - Core i7 4720HQ / 2.6 GHz - Windows 8.1 64-bit - 16 GB RAM - 256 GB SSD + 1 TB HDD - DVD-Writer - 17.3" 1920 x 1080 ( Full HD ) - NVIDIA GeForce GTX 970M - 802.11ac - black
Connections & Expansion
Interfaces: Headphone/SPDIF combo jack ¦ Thunderbolt ¦ HDMI ¦ 4 x USB 3.0 ¦ LAN ¦ VGA ¦ Microphone input
Memory Card Reader: 2 in 1 ( SD Card, MultiMediaCard )
Re: (Score:2)
It's common to slap one on a larger laptop, but 'ultrabook'-thickness machines skip a lot of ports (mine even skips a full RJ-45, though it still has an Ethernet port and a passive breakout to provide an RJ-45 for it).
Re: (Score:2)
I don't see the allure of an ultrabook. I've always used my laptops only as desktop replacements plugged into a second screen for dual monitor use (quiet counts for a lot when you notice just how much nicer the work environment becomes when the pc power supply and cpu fan whines are gone).
Re: It was the first standard for video? (Score:2, Insightful)
So you don't see the point of an ultra book because you don't use your laptop as an actual mobile device. Surprising.
Re:It was the first standard for video? (Score:5, Insightful)
That... is the single most misguided reason I've ever heard for choosing a laptop over a desktop. My desktop PC was built with quiet components. Only if I push the graphics really hard (games, not HD movies) can I hear its fan start up.
For my trouble, I get more RAM, a more powerful CPU, better graphics, and far more expansion ports, and my laptop stays on a shelf unless I'm travelling or I need an on-site computer for a contract, and in both of those cases size really does matter.
Re: (Score:3)
I could never downgrade to using a laptop screen for a dual monitor setup. I prefer to have the screens next to each other (dual 22.5") on a proper ergonomic stand since I spend many hours on working and staring downward will cause a massive neck cramp.
Also, only an american would assume I do most of my travelling by car.
Re: (Score:3)
Desktops are no longer cost efficient now that laptops are so cheap. $400 gets you 8 gigs, quad core, and a screen, so you only need one screen to have a dual-monitor setup.
Depends on performance, now doesn't it. $400 gets you 8 gigs (where, by the way? $400 typically gets you 4 gigs), a quad-core mobile variant that's slow at best and thermally throttles if you look at it funny at worst, and a tiny, tiny screen with uneven backlight bleeding and, if you're lucky, crap viewing angles, 6-bit colour, and enough gloss to see the sad look on your face when you realise that you spent a lot of money for something that was primarily designed to be small.
You
Re: (Score:3)
Um, WHAT THE FUCK???
CGA? EGA? MDA? Hercules? NTSC? PAL? SECAM?
"and is by far the longest-lived port on the PC."
Serial port?
Who the fuck wrote this piece of shit revisionist ignorant blurb?
Not to mention a total BS premise. They have been saying the serial port is dead for DECADES! And I can't tell you how often my USB floppy drive has been a life saver at a client with critical data on floppy and no floppy drives to read it!
And does the moron know that the DVI ports that are now on most motherboards HAVE FUCKING VGA BUILT IN?!?
Re: (Score:3)
It was the first decent standard for MS-DOS/Windows video. Everything before it was a pile of shit, where you needed a new standard every time a higher resolution became available. Remember separate modes for text and low-res graphics? Remember how painful those early PC monitors were to work on?
Re:It was the first standard for video? (Score:5, Informative)
It was evolutionary rather than revolutionary. The EGA was its immediate predecessor and was pretty good, except that for some reason its resolution didn't conform to the 4:3 aspect ratio which was standard at the time.
VGA came along with 640x480 resolution, which was decent. But within a couple of years that was obsolete, so there came 800x600, then IBM's 1024x768 XGA, and then a succession of "Super VGA" "standards" (as in the joke: there are so many of them), all with different resolutions higher than 800x600, some supporting wide aspect ratios.
BTW, VGA also supported two low-res 256-color modes, mode 13h and Mode X [wikipedia.org], which became favorites of DOS gamers because they were well suited for smooth animation while not requiring exorbitant amounts of installed RAM.
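A quick aside on why those modes were so RAM-friendly: at one byte per pixel, mode 13h's 320x200 fits exactly in a single 64 KiB real-mode segment, while Mode X's 320x240 only fits because it is split across four planes. A back-of-the-envelope sketch (the arithmetic is mine, not from the comment):

```python
def framebuffer_bytes(width, height, bytes_per_pixel=1):
    """256-color VGA modes store one byte per pixel."""
    return width * height * bytes_per_pixel

mode_13h = framebuffer_bytes(320, 200)  # linear framebuffer at A000:0000
mode_x = framebuffer_bytes(320, 240)    # planar: split across four planes

print(mode_13h)     # 64000 -> fits in one 64 KiB real-mode segment
print(mode_x)       # 76800 -> too big linear, hence the planar trick
print(mode_x // 4)  # 19200 bytes per plane
```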
Re: (Score:3)
If you were extremely charitable you might describe it as the first standard for computer video output which could actually handle video, except even that would be wrong. The first such standard was RGB with separated syncs, then came sync-on-green RGB. VGA came from that and added the communications channel.
Re: (Score:3)
CGA = Crap Graphic Adaptor = no porn
EGA = Extra Graphic Adaptor = some porn
VGA = Very Graphic Adaptor = loads of porn
Re: (Score:3)
Oh goodness. I still have a PCMCIA MPEG co-processor in my desktop drawer from my first laptop (a Satellite Pro 435CDS, 32MB RAM, Pentium I, etc. Dual booted Windows 98 and Red Hat 6.1. Good times - that thing probably could have taken a bullet for me).
Eventually... But not yet (Score:5, Informative)
Re:Eventually... But not yet (Score:5, Interesting)
I still see it on monitors and TVs.
And projectors! How else can I connect to those projectors if not VGA? And their life-span is probably decades. I think new projectors do offer optional alternatives to VGA, but usually this is HDMI, which I predict is going away sooner than VGA (HDMI being replaced by DP).
Re:Eventually... But not yet (Score:5, Insightful)
And projectors! How else can I connect to those projectors if not VGA? And their life-span is probably decades. I think the new projectors actually have alternatives to VGA optional, but usually this is HDMI,
THIS. The person who wrote TFA must not do any presentations anywhere ever. Yes, new projectors often have other inputs, but that's often irrelevant in a conference venue or a classroom or whatever, where often there's ONE cable that's presented to you to hook in your laptop -- and it's a VGA cable (often with an audio headphone jack plug, if you need it).
That's the same as it was most places decades ago. If your laptop today doesn't have a VGA port, you get a dongle. Everybody who needs to plug into a projector has a standard VGA one. Switching to another standard would require a major initiative, since this is NOT a place where you can just adopt a different standard on the fly.
Probably tens of thousands of people show up at an unfamiliar place every day and expect to be able to plug a laptop into a projector to give a presentation. For better or for worse, everybody knows that you bring a connector for VGA, and if you change that, you need to be darn sure all of your presenters know it (and even if they do, lots of people who give talks can be old and won't understand if they show up with a laptop that doesn't connect to something else, so you'll be scrambling at the last minute to move stuff to another computer or whatever).
I don't see this standard switching anytime soon -- it tends to be used in high-profile, time-sensitive situations where people expect to be able to plug a computer in and have it work instantly. Unless a venue is going to provide a dongle that fits every possible port on the planet (and most don't), it will be really hard to switch.
The only thing that will eventually allow the switch won't be a new port standard, but rather wireless broadcast of video directly to the projector. It's still quite rare, but it's feasible and the only way to get out of the VGA rut. I doubt HDMI/DP/whatever is EVER going to overcome VGA for such applications -- the next "standard" won't have cables at all.
Re: (Score:3)
You know you can get 2560x1920 over a VGA port right?
Anyway, plenty of high-res projectors still use VGA, because they're (a) on the ceiling, (b) wired in, and (c) VGA supports very, very long cables and none of the other standards do except DisplayPort, which isn't all that common compared to HDMI. HDMI is shocking.
Re:Eventually... But not yet (Score:5, Informative)
Mine does? I have a 4k screen plugged in on displayport so that it can do 60fps. It definitely does audio as well...
8 channel audio + gbps on aux lane. Short cable th (Score:3)
The DisplayPort specification includes optional audio up to 8 channels, plus gigabits available on the auxiliary channel for whatever else a manufacturer wants to support.
The main advantage of HDMI is longer cables. DisplayPort is currently specified at three meters, while HDMI is longer (10 meters?). Of course, with either standard you can use a longer cable and it may work with your specific devices and that specific cable, but it's not guaranteed beyond the specified lengths.
Re: (Score:3)
Even HDMI is not guaranteed at 10m, there is actually no length requirement for HDMI. To maintain the minimum spec for HDMI over 15m, you need very high quality cabling or fiber. DisplayPort is by spec 15m (50ft). Additionally DisplayPort runs over both Thunderbolt and USB-C without any conversions.
Re: (Score:3)
What you claim is untrue. https://superuser.com/question... [superuser.com]
In particular, this bit:
From displayport.org/faq:
Q. Does DisplayPort also support audio?
A. Yes, DisplayPort supports multi-channel audio and many advanced audio features. DisplayPort to HDMI adapters also include the ability to support HDMI audio.
I wound up on this page however because my DisplayPort audio wasn't working. After searching many sites, I learned that the video driver (vs. audio driver) is responsible for the DisplayPort audio; while
Re: (Score:3)
Maybe you are thinking of return audio? DisplayPort has audio. One trick DP offers is a single cable hooking up two monitors - handy from a laptop.
But mainly for any given generation, DP has better throughput - more resolution/refresh rate.
Re: (Score:2)
Essentially arbitrary numbers of monitors, even, limited only by the bandwidth of the interface. DP 1.2 can do up to four 1920x1200 displays, or two 2560x1600, or a single 4k, all at 60Hz.
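For the skeptical, the arithmetic roughly checks out. A hedged sketch, assuming 24-bit color, DP 1.2's HBR2 link rate (4 lanes x 5.4 Gbps with 8b/10b coding), and ignoring blanking overhead:

```python
# DP 1.2 HBR2: 4 lanes x 5.4 Gbps raw, minus 8b/10b coding overhead.
HBR2_GBPS = 4 * 5.4 * 8 / 10  # 17.28 Gbps of payload bandwidth

def stream_gbps(width, height, hz, bpp=24):
    """Uncompressed pixel data rate, ignoring blanking (a lower bound)."""
    return width * height * hz * bpp / 1e9

assert 4 * stream_gbps(1920, 1200, 60) < HBR2_GBPS  # four 1920x1200 @ 60Hz
assert 2 * stream_gbps(2560, 1600, 60) < HBR2_GBPS  # two 2560x1600 @ 60Hz
assert stream_gbps(3840, 2160, 60) < HBR2_GBPS      # one 4k @ 60Hz
print("all three configurations fit within HBR2")
```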
Re: (Score:3)
I don't understand why people want DisplayPort
HDMI is a synchronous interface — video and audio data use up fixed parts of a frame's time. One might almost say that's it's just a digital mapping of an analog television signal.
DisplayPort is a packetised interface — video, audio, and whatever else you might want to carry over it can be sent at any time. Because of that, it's a little more expensive to implement (you need more hardware in the device), but it's immensely more flexible: you could carry multiple low-resolution video streams
Re: (Score:2)
DisplayPort does audio.
DisplayPort is easily converted into VGA or HDMI with reasonably cheap converters. HDMI 1.3 or so/DVI are even doable with passive adapters if the source supports them.
DisplayPort can be carried by the USB Type C infrastructure.
Literally, the only consumer-facing advantage of HDMI is that it's a lot more popular.
Re: (Score:3)
As others have pointed out, DisplayPort does do audio.
There are two main reasons to favour DisplayPort over HDMI on PC, though both of them are situational. The first is that it supports multiple monitors from a single port/cable.
The second, perhaps more significant, is that the first generation of "affordable" 4k monitors (ie. sub $1,000) generally don't have HDMI 2.0 support. This means that if you want 60Hz output at 4k, you need to use a DisplayPort cable.
Not going anywhere in data centers (Score:2)
I have bought some Dell R430 and R730 servers, which are latest generation (Haswell based Xeons, DDR4 RAM) and guess what their one and only video output format is? That's right, a VGA port. No DVI, no DP, just VGA. No surprise either: Go have a look at high end network'd KVMs. They are all VGA. It works, so it is staying around in that space (same deal as serial for that matter).
It is certainly a standard on the decline, digital transmission makes more sense particularly since our displays are digital thes
Re: (Score:3, Informative)
Those resolutions are technically not VGA. The original VGA standard only went to 720x400 in monochrome text mode and 640x480 graphics with 16 colors, although some nonstandard signal timings could push it a bit higher.
Once SVGA and MultiSync became available, lots of "unofficial" super-high resolution modes popped up, but about the best you can do with a standard VGA cable is 2560x1600.
Source: https://en.wikipedia.org/wiki/VGA_connector#Cable_quality [wikipedia.org]
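As a rough sanity check of that 2560x1600 ceiling: the limiting factor for analog VGA is the pixel clock the DAC and cabling can carry. The ~25% blanking overhead and the ~400 MHz RAMDAC ceiling below are ballpark assumptions of mine, not figures from the comment:

```python
RAMDAC_LIMIT_MHZ = 400  # ballpark ceiling for a late-era VGA DAC (assumption)

def pixel_clock_mhz(width, height, hz, blanking=1.25):
    """Active pixels per second, padded by an assumed 25% blanking overhead."""
    return width * height * hz * blanking / 1e6

print(round(pixel_clock_mhz(2560, 1600, 60)))  # 307 -> comfortably feasible
print(round(pixel_clock_mhz(2560, 1920, 60)))  # 369 -> close to the ceiling
```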
not first standard (Score:5, Informative)
"It was the first standard for video" - not quite
Perhaps monochrome NTSC RS-170 on a coax connector was the first standard for video.
And even in the IBM PC world, monochrome and CGA were earlier.
Of course, perhaps the author of this article wasn't alive back then, and hasn't yet learned to "check your sources before publishing"
HDMI=mostly disadvantages (Score:4, Interesting)
Sure the world must move on some day, but I'd like to point out that at least for me HDMI has only ever brought disadvantages. Apart from severe problems with dealing with several audio channels or routing audio to external analog speakers, it also had and still has the charming property of turning the whole display black for 1-2 seconds from time to time. Not to speak of countless problems with false colors and red-tinted display on my Philips TV.
Frankly speaking, after years of using it I have come to the conclusion that HDMI is just shit in comparison to analog VGA, no matter how much clearer the display may seem. I believe it was mainly forced on everyone to introduce DRM crap and to sell expensive cables; VGA would do well enough. Digital is not always the best.
Re: (Score:3)
I believe it was mainly forced on everyone for introducing DRM crap and to sell expensive cables and VGA would do well enough. Digital is not always the best.
This! And it is the reason they are trying to kill VGA.
Re: (Score:2)
Apart from severe problems with dealing with several audio channels or routing audio to external analog speakers, it also had and still has the charming property of turning the whole display black for 1-2 seconds from time to time. Not to speak of countless problems with false colors and red-tinted display on my Philips TV.
I think I found the source of your problem ... :-)
Re:HDMI=mostly disadvantages (Score:5, Informative)
HDMI sucks [bluejeanscable.com]:
HDMI is a horrid format; it was badly thought out and badly designed, and the failures of its design are so apparent that they could have been addressed and resolved with very little fuss. Why they weren't, exactly, is really anyone's guess, but the key has to be that the standard was not intended to provide a benefit to the consumer, but to such content providers as movie studios and the like. It would have been in the consumer's best interests to develop a standard that was robust and reliable over distance, that could be switched, amplified, and distributed economically, and that connects securely to devices; but the consumer's interests were, sadly, not really a priority for the developers of the HDMI standard. ... HDMI has presented a few problems. Unlike analog component video, the signal is not robust over distance because it was designed to run balanced when it should have been run unbalanced (SDI, the commercial digital video standard, can be run hundreds of feet over a single coax without any performance issues); the HDMI cable is a complicated rat's-nest arrangement involving nineteen conductors; switches, repeaters and distribution amplifiers for use with HDMI cable, by virtue of this complicated scheme, are made unnecessarily complicated and troublesome; and the HDMI cable plug is prone to falling out of the jack with the slightest tug. On the plus side, in the great majority of simple installations,
Re: (Score:2)
I love their candor, and as a result I own a lot of Blue Jeans' cables. They're good cables, but I regret it now. They run from a 4x4 matrix in my basement to every room in the house, and work flawlessly, but now it's cheaper just to add an Apple TV or Amazon Fire Stick to every TV rather than run HDMI from a central HTPC.
Video Games Adapter (Score:3)
When it first came out, I remember thinking of its acronym that way, instead of Video Graphics Array.
monitors (Score:3)
Re: (Score:3)
I still have at least two composite-input monochrome monitors that work perfectly fine, or did when I last tried them -- probably twenty years or so ago. I intend one day to haul out the old TRS-80 Model I and see if it still works. If not, I stand a really good chance of successfully repairing it myself, unlike most electronics released in the last couple of decades. (Of course, it's more likely to work than more recent equipment, if only because it predates the biggest capacitor-quality catastrophes.)
But
Re: (Score:2)
I would tend to call "Weighs a fraction of what a 1" thick laptop would and is much easier to carry around when traveling" a functional reason.
Source: I learned the hard way that carrying around a 17" gaming laptop that weighed in at 23 lbs with power supply was not functionally sustainable when travelling to conferences.
Re: (Score:2)
Convenience is functionality. Lugging a 20 (or whatever) lb laptop together with a bunch of other luggage through a crowded airport, or half a mile down a city street from train station to hotel, is noticeable over time. Certainly you CAN do it, but when you can get the same computing power in a lighter package, and travelling is part of your job, why wouldn't you?
Now if I was just driving to the office and walking 60 feet or so at each end, then I could see favoring a heavy laptop, but not just so I could have
Re: (Score:2)
However, I don't see the need to get a laptop from an inch thick to 1/4-inch thick either. That comes from the pursuit of the new shiny toy that looks cool, not from a functional reason.
Actually, small is good for some people. Essentially, all they need is a Windows tablet with a real keyboard, and they do not want to lug around a lot of weight.
Re: (Score:2)
Note that they offer mDP-to-VGA adapters, if you really want to be compatible with those monitors (or if you are frequently expected to hook up to random projection setups; I've seen some conference rooms that were *constructed* in 2015 provide only VGA, strangely enough).
Note to new Slashdot management (Score:4, Funny)
You think we can get some better snacks in the commenter lounge vending machine? I know there's going to be belt-tightening, but Bit-O-Honey and yogurt granola bars aren't going to cut it.
Looks Antiquated (Score:5, Funny)
Yea, VGA needs to FOAD because it looks antiquated.
Come to think of it, you're looking rather antiquated. What to do about you?
Slashdot looks totally antiquated too. I can't wait for the new owner to implement a beta interface design that better monetizes community synergies. Make sure you model it after flat UI design so no one can see or find anything. It'll look so sexy!
RIP DVI... (Score:5, Interesting)
I was thinking about VGA the other day (Score:5, Interesting)
It's still almost everywhere. At work we still have VGA monitors and docks. The monitor also has a digital connector of some kind, but never more than two other flavors. My TVs have VGA.
You know what's great about VGA sticking around? Older equipment that was often expensive and built like a tank still works. Projectors, CRTs, and KVMs. I've seen retrocomputer enthusiasts build VGA adapters for all kinds of old systems. It's nice to have something that you can rely on when you're traveling; if you have a VGA dongle you know you can work.
I hope VGA has a couple more decades in it, and with the slow adoption of 4K TVs, it just might.
Anyone know if Netcraft confirmed it yet? (Score:2)
Oh noes. the radiations! (Score:5, Insightful)
I thought this bullshit line of thought died out in the 60's or 70's.
Not in the professional/server space (Score:3)
I was just in a meeting with Dell about their next generation. There's no demand for anything but VGA because all infrastructure is VGA based. There isn't even IPMI for anything but VGA.
Re: (Score:3)
The female may be durable, but I've seen my fair share of bent pins on the male end.
Now you could say the traditional retention screws are more secure, but I really haven't had an issue with connector security for video. For one, some connectors have a much simpler retention clip. And for another, I'd rather the connection come apart than put stress on the system if something severe were to happen.
I agree with the sentiment about serviceability and cooling, but the VGA plug doesn't help with that.
Re: (Score:3)
The female may be durable, but I've seen my fair share of bent pins on the male end.
That's what she said!
Re: (Score:2)
Re: (Score:2)
I will also say that that has worked for me most of the time, though some manufacturers use more brittle material or something, because I have also seen broken pins.
Either way, I haven't had a durability issue with HDMI/DP, which take a cue from card edge connectors and have relatively beefy support for the contacts.
Re: (Score:2)
I want well ventilated, repairable devices. At least the VGA plug isn't as flimsy as some modern connectors.
But soldered-on memory that you cannot upgrade and non-replaceable batteries are the future!
Re: (Score:2)
Not small enough for the Raspberry Pi Zero, which uses a mini-HDMI port.
Re: (Score:2)
Mini HDMI is larger than USB-C.
Re: (Score:2)
No, they replaced that with Mini DisplayPort, HDMI, Thunderbolt and USB-C. All of which are very small.
Not on anything I own... Other than the tablet, which I do not plan on hooking up to a large screen any time soon. (ever)
Re: (Score:2)
Hercules?
Re: (Score:2)
Printers didn't have them.
The dot matrix printer and monochrome monitor for my Commodore 64 had built-in power cables. The only removable power cable I had back then was for the 5.25" external floppy drive. Not sure if built-in power cables were a common feature for home microcomputer peripherals in the 1980's. I was the only kid in my neighborhood with a complete setup, but the girls still thought I came from a poor family because I didn't have an Apple ][.
Re: (Score:2)
So if you want to call it for VGA based on it not being ubiquitous, then VGA: 1987~2015(ish)
For the DB-9 used with standard serial, I'd say: 1969~2005(ish). Serial is alive and well, but few things directly provide a DB-9 port. So it 'died' first, though it lived longer.
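The 'died first but lived longer' claim checks out if you run the numbers. A trivial sketch, using the comment's own approximate dates (the date ranges are the commenter's estimates, not verified facts):

```python
# Lifespans from the approximate dates quoted above.
vga_lifespan = 2015 - 1987      # VGA connector: 28 years
serial_lifespan = 2005 - 1969   # DB-9 serial: 36 years

# Serial 'died' earlier on the calendar but had the longer run.
print(vga_lifespan, serial_lifespan)
```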
Re: I'll stick with VGA and SDI as long as possibl (Score:3, Insightful)
Your concern seems to be with HDCP, not HDMI; the latter is essentially DVI with an extended table of resolutions (plus audio), which is why passive cables work. Your ThinkPad is not encoding HDCP over the HDMI connector.
I don't know why people like the ghosting that occurs when going through a DAC and ADC to use VGA on a digital flat panel. Trying to sync on the analog timing signals is a mess. I personally can see the artifacts and it hurts my eyes.
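For context on why that sync recovery is fiddly: the monitor sees only analog color levels plus hsync/vsync edges, and a flat panel's ADC has to reconstruct the pixel clock from those edges alone. A quick sketch using the classic, industry-standard 640x480@60 numbers:

```python
# Classic VGA 640x480@60 timing (standard figures). A flat panel must
# lock its sampling clock to these intervals using only the sync edges;
# any phase error smears adjacent pixels together, hence the ghosting.
pixel_clock_hz = 25_175_000

h_active, h_front, h_sync, h_back = 640, 16, 96, 48   # pixels per line
v_active, v_front, v_sync, v_back = 480, 10, 2, 33    # lines per frame

h_total = h_active + h_front + h_sync + h_back        # 800 pixels/line
v_total = v_active + v_front + v_sync + v_back        # 525 lines/frame

line_rate_hz = pixel_clock_hz / h_total               # ~31.47 kHz
refresh_hz = line_rate_hz / v_total                    # ~59.94 Hz

print(f"{h_total} x {v_total} total, {refresh_hz:.2f} Hz refresh")
```

Note that a third of the line (160 of 800 pixel periods) is blanking the panel never displays, yet the ADC must still count through it accurately.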
Re: (Score:3)
I want my video port to send the video signal to my monitor without hand-shaking, asking for permissions and assuming I'm a pirate and kitten-murderer.
I notice ***all*** broadcasters and serious videographers use SDI, because it is uncompressed, unencrypted HD video that includes audio, all in a single coax with a locking BNC connector. None of this DRM baggage; the big boys on a video shoot connecting cameras to switchers and recorders need them to promptly feed the video instead of spending time on WTF-this-ain't-displaying (but you probably already know that). I'm surveying camera equipment, and these days it is all HD, and to feed signals to
No DRM on PS/2 port, unlike on HDMI (Score:3)
The PS/2 port also doesn't have to deal with Digital Restrictions Management. There are plenty of adapters that translate between the PS/2 keyboard and mouse protocol and the USB Human Interface Device protocol, allowing use of legacy keyboards and mice with legacy-free PCs. Likewise, HDMI without HDCP could be translated into VGA and analog audio signals by an external DAC. But it's illegal (via anti-circumvention law) to produce such an adapter compatible with HDCP, and it may be illegal (via license term
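Such an adapter is possible precisely because neither protocol is encrypted: it just remaps scan codes. A toy sketch of the translation, mapping a few PS/2 Scan Code Set 2 make codes to USB HID usage IDs; the table entries come from the public scan-code and HID usage tables, but the function and names here are invented for illustration, not a real driver:

```python
# Toy illustration of what a PS/2-to-USB adapter does in hardware:
# translate PS/2 Scan Code Set 2 make codes into USB HID usage IDs.
PS2_SET2_TO_HID = {
    0x1C: 0x04,  # 'A'
    0x32: 0x05,  # 'B'
    0x21: 0x06,  # 'C'
    0x5A: 0x28,  # Enter
    0x76: 0x29,  # Escape
}

def translate_make_code(ps2_code: int):
    """Return the HID usage ID for a PS/2 make code, or None if unmapped."""
    return PS2_SET2_TO_HID.get(ps2_code)

print(hex(translate_make_code(0x1C)))  # HID usage for 'A'
```

An HDCP-stripping HDMI-to-VGA adapter would be the same kind of dumb translation box, which is exactly why anti-circumvention law, not engineering difficulty, is what forbids it.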