Ten Gadgets That Defined the Decade 313
Corpuscavernosa writes "As 2009 winds down and we try to come up with new and clever ways of referring to the early years of this century, there's really only one thing left to do: declare our ten favorite gadgets of the aughts and show them off in chronological order. It's arguable that if this wasn't the decade of gadgets, it was certainly a decade shaped by gadgets — one which saw the birth of a new kind of connectedness. In just ten years' time, gadgets have touched almost every aspect of our daily lives, and personal technology has come into its own in a way never before seen. It's a decade that's been marked by the ubiquity of the internet, the downfall of the desktop, and the series finale of Friends, but we've boiled it down to the ten devices we've loved the most and worked the hardest over the past ten years. We even had some of our friends in the tech community chime in with their picks on what they thought was the gadget or tech of the decade."
XP and OS X? (Score:5, Interesting)
trinkets or tools? (Score:4, Interesting)
If you said the 80's I'd say Rubik's Cube, Simon, and other toys. If you mean useful tools and not just novelties, the 80's is when the PC became more than just a hobbyist device. You had early brick cell phones but they truly came into their own in the 90's. Likewise, laptops went from being novelties to useful and only became more awesome in the 00's.
I think setting a round number to meet is kind of dumb. What if there weren't ten notable devices?
I think that the iPod and iPhone are probably the most significant devices, not just for what they are but for what they presage. The iPod's music on the go is nice, but Apple breaking into the music industry and becoming a major distributor has a far greater impact on the landscape. The iPhone put a crack in the usual walled-garden arrangement of US carriers and is showing competitors how to do things. Handheld computers have been around for ages, but the iPod/iPhone is bringing us to the point at which there's enough market saturation to change the way we do things.
When I was a kid, only us geeks had computers. You went to school and you looked for other freaks and outcasts. That's where you were likely to find other computer people. And we used computers for the usual geeky stuff, socializing over BBS, playing games, and being geeks. With the arrival of the internet, non-geek households started getting computers. And the early social scene really sucked in the rest of the youth audience. By the time I was in college, everyone had their own computers. And the more ways there were to socialize on them, the more popular they got. Yeah, in the past you had phreakers who were into phones for the tech of it and you had teenage girls who spent just as much time on the phone but only for gossiping with friends. Still, the phone had an impact on society, the way people live.
I bring up the social sites because the phones are providing as much functionality on them as a standard computer. And all of this is having an impact. A lot of people in my age range are going without cable tv, they can download whatever they want to watch. They are dropping landlines since the cell does everything they need. Traditional media channels are going to get boned. And all of this will have a cultural impact.
I can shop on my phone. I can download podcasts, videocasts, TV shows, music, books, audiobooks, access the net, and this is only the beginning. I think we're seeing the beginning of the destruction of mainstream media. Yeah, many have made that call before, but I see it happening. Change comes with the youth and ends when the old generation dies off. AM radio is on its last legs. I don't know anyone who listens to FM radio anymore, not anyone under 50. MTV continues to be a joke and sets no trends anymore. Authors are cutting deals directly with Amazon to publish on Kindle. Podcasts and videocasts are gaining wider audiences while network/cable television continues to flounder with its broken advertising model. The shows may have a huge audience, but the Nielsen ratings cannot account for it. This is why Family Guy got cancelled, only to shock Fox by becoming one of the top-selling DVDs of all time. They had no idea the kind of reach that show had, and brought it back.
Everything I'm mentioning above I think is setting the stage for uncontrolled culture. It took big bucks to fund mass media back in the day. Now any yabob on Twitter can reach an audience in seconds that would make William Randolph Hearst get wood. And the cost? Nothing! They say never pick a fight with a man who buys ink by the barrel. How much worse does it get when the electrons are free?
Now it's possible that the audience won't fracture that much. Give kids free rein in a supermarket to eat anything they want and you know they're heading to the candy section regardless of how well the veggie section is stocked. Give the masses unfettered access to all media and they might end up gravitating back to the old celebrities or create new celebrities who will take the place of the old. It might still be possible to shape and mold public opinion as easily as before. But I have a gut feeling things could turn out differently in the 21st century. If the 20th century was defined by mass media, the 21st could be defined by what comes next.
Dude, it's on my phone. (Score:3, Interesting)
And it will find the nearest Starbucks for me and tell me if they're open.
Yeah! Why isn't GPS on that list?
Playstation 2 = Gadget (Score:4, Interesting)
Re:Gadgets (Score:5, Interesting)
Re:360? (Score:5, Interesting)
Really, the 360 as the video game console of the decade? The PS2 really changed things more than the 360 for the simple reason of the DVD player.
For that matter, the first Xbox was a lot more influential than the 360, because it was new competition for Sony. The 360 was just an incremental update.
Re:The decade isn't over yet! (Score:5, Interesting)
Simple Simon games (Score:5, Interesting)
I remember visiting Japan for the first time in 1999. Of course I wandered in to a video game arcade to check out the scene. I laughed at the poor Japanese and their imitative video games - look, that guy is just touching the controls in the exact way that the machine tells him to! What a retarded game! It's no game at all, he's just mindlessly copying what the machine tells him to do in exact sequence...no more "fun" than working on an assembly line. A children's game, really...we had the same thing called Simple Simon [bigredtoybox.com] when I was a kid...these Japanese video games even have the same four colors. I mean, there could at least be a dozen colors or something, make it difficult. And the controller shaped like a guitar? Oh man, how pathetic: if you're going to be cool and play the guitar, be cool and learn the goddamn instrument, it ain't that hard. Only Japanese people, with their tolerance of tedium and their relentless drive to copy, could possibly "enjoy" such a "game".
This Christmas, I'm passed out from wine, and when I vaguely become aware, I hear these overplayed classic rock tunes accompanied by clicking. I go out, and sure enough, three family members are staring at the TV, imitating the colors on the screen, each lost in his own world with no communication. Just this eerie clicking, accompanied by this sound that I identified from when I was in marching band and the drummers had practice pads. There is no talking, no rocking out, no jumping around the room flailing at an ax like Eddie Van Halen on coke. Their faces are stone masks of concentration. The song finishes, and my family grins at each other, "Wow, we sure had a fun time interacting. What a great game that brings us together!"
Shows you how much I know. I also thought "Magic: the Gathering" was a stupid game because it was so wildly unbalanced. Who would want to play that, a game where you can win not by superior skill or even dumb luck, but simply by spending more money than your opponent?
"Click wheel Only" (Score:3, Interesting)
These were the first iPods with the modern Click Wheel interface only and full USB 2.0 interface support.
What does that mean? I had the very first iPod. All it had was a click wheel. In fact it was better than a few later generations, since the wheel actually turned and thus gave more feedback.
As for "full USB 2.0 interface", well, that was nice for Windows users but a step back from the FireWire 400 the original sported. FireWire let the original iPod load songs just as fast as any later USB 2.0 model, and made filling 5GB of storage practical instead of a chore.
Everything that made the iPod what it was was there from the start: iTunes, fast transfers, the click-wheel interface, an easy UI. I don't think saying any later generation made it "come of age" makes much sense, apart from the move to support Windows users as well, which was key to growth.
Nokia N900 (Score:5, Interesting)
Re:Only one (Score:1, Interesting)
I'd agree with you about the LCD screen: it went from laptops (and at the turn of the century you paid big bucks for a laptop, assuming you wanted an active-matrix display) to being so inexpensive that a $30 "picture frame" can ship, usually with few to no blown pixels.
CRTs have a few advantages such as faster response and better color saturation, but since I'm not doing tasks which require me to know that Pantone 15-5519 TCX is 2009's color of the year, an LCD screen which doesn't eat all my desk space is good enough.
This list kind of sucks. (Score:3, Interesting)
This compilation is really short-sighted, though they seem to have gotten a few things right.
I definitely agree, though, that the RAZR did a lot to force manufacturers to slim their phones, despite it being a pretty mediocre phone on its own. I also agree that the Treo 650 was basically responsible for putting smartphones on the map for most people, though the BlackBerry popularized push e-mail to the point of making it an expectation for most people nowadays.
Re:Say what? (Score:3, Interesting)
This guy/gal needs to have their head examined.
As an enthusiastic OS X user, though, I'd concede that the first few releases were not much use. However, that's mainly because of the lack of native software support - it always looked like a million dollars. Mind you, he does seem to have a revisionist history concerning the original reaction to XP...
The big achievement of OS X, however, was that in the space of a few years Apple moved their entire user base over to a completely new, binary-incompatible, UNIX-based system. XP was always hamstrung by legacy issues.
Re:XP and OS X? (Score:3, Interesting)
Fail, fail: First of all, plug and play is a standard feature of PCI, and NT4 couldn't support PCI to the extent that it does without it. Second of all, there is a secret but easy way to enable ISA PnP in NT4 [fredhanson.com].
NT4 is a gigantic piece of shit, and so is DirectX; Direct3D is an abortion which would never have happened if those assholes at 3DFX had gone with MiniGL from the get-go instead of going the egotistical route, and allowing their collective hubris to cause them to create a wholly new 3D API, something which was totally unneeded as well as undesirable at the time... or ever. OpenGL's slow pace was not a problem even then, as the full functionality of the 3DFX chip had analogues in OpenGL, including their much-lauded multitexturing support, in the form of SGIS_multitexture — now an integral part of the standard as ARB_multitexture [opengl.org], but even then perfectly usable under its original, vendor-specific (and -derived) name.
The biggest irony here is that you could, for obscene amounts of money, acquire 3D accelerators which operated under NT 3.51, especially including examples from 3D Labs. Most of the changes from NT 3.51 to NT 4 could have been made without making the single largest change, which was the merge of the kernel and GDI memory spaces to improve graphics performance. But had the graphics accelerator revolution arrived sooner, that might never have even happened. Meanwhile, NT4's facelift could be faked trivially enough by appropriating the shell and required, upgraded DLLs from Windows 95 and slapping them onto NT 3.51. 3.51's biggest problem is its limited addressing, which prevents the use of filesystems larger than 2 gigabytes (among other problems). In fact, NT 3.51 only has 4GB of virtual memory. But there seems to be no reason why these failings could not have been corrected without the memory-space merge that made NT4 substantially less reliable than NT 3.51.
In any case, curse you, Microsoft!, and a special shout-out to 3DFX: fuck you!
Re:360? (Score:2, Interesting)
If we're talking about a console that is defining where new consoles have to start and grow from, then I totally agree -- the Xbox 360 has set the bar for new systems in the coming decade.
The PS2, however, made owning a console for gaming mainstream. Of course, this also occurred at about the same time that those of us who grew up with (or knowing someone with) a console became adults (that sounds weird, doesn't it?) so it's a hard call as to which was more influential -- the PS2 or our expectations.
If you want to identify trend changers for the decade, I have to side slightly higher on the PS2 side. The Xbox 360, and to a lesser extent the Wii, with its motion-sensing apparatus and focus on non-traditional gamers, are definitely setting the stage for the future; but had the PS2 not been as popular and pervasive as it was, the Xbox 360 would never have seen the light of day -- high-end gaming would have remained the province of the power-user computer owner, and not the run-of-the-mill Joe Sixpack wanting to do more with his TV.
The PS3 was a disappointment -- it's a beefed up PS2 with newer/better hardware, but is a study in failed promises (lack of ongoing PS2 support, etc.) and lost opportunities to change the landscape... The PS2 defined a landscape... the PS3 is riding in that same landscape, while the Xbox 360 is expanding it.
The PS2 set the console stage for 2000-2009. The next iteration of the Xbox, after considering the few things the Wii did right, will set the stage for 2010-2019. One could argue that it already does set that stage, but it's early enough I expect them to push the bar up soon, and that's what our children will be using as their measuring stick in 2020.
--
I drank what?
Re:The list (Score:2, Interesting)
And in the meantime, the USB flash drive [wikipedia.org] was completely missed out. (credit: denzacar)
Am I too poor to buy the above items, or is this list a mismatch to most of our experiences?
Re:The decade isn't over yet! (Score:2, Interesting)
You do realise that any 10 years is a decade? Any given year in fact belongs to ten different decades.
But traditionally people group the years into blocks that run X0-X9, probably because it's easy to say things like "eighties" and "nineties". This is the "noughties", which is 2000-2009.
If you want to run your own article next year for 2001-2010, no one is stopping you. But that's got nothing to do with this article, which has nothing wrong with it regarding the years chosen.
Re:The decade isn't over yet! (Score:2, Interesting)
It is a simple case of mixing concepts.
A decade is a period of ten years. This is a clear-cut de jure definition.
The eighties or the nineties or the 'x'ies is a period running from 1980-1989 or 1990-1999, etc. This is a de facto definition based upon popular consensus.
The definition of when the eighties start has nothing to do with when the Gregorian or Julian calendar began. It is merely a popular way of describing a commonly understood time period.
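The X0-X9 grouping these last comments describe is just integer arithmetic; here is a minimal Python sketch of it (the function name `popular_decade` is my own, not from the thread):

```python
def popular_decade(year: int) -> str:
    """Return the de facto X0-X9 decade label for a year,
    e.g. 2009 falls in "2000-2009" (the "noughties")."""
    start = (year // 10) * 10  # floor to the nearest multiple of ten
    return f"{start}-{start + 9}"

# De jure, any ten consecutive years form a decade, so a given year
# sits inside ten overlapping ten-year spans; the popular grouping
# just picks the one that starts on a multiple of ten.
print(popular_decade(2009))  # the decade this article covers
print(popular_decade(2010))  # the first year of the next block
```

By this convention the article's 2000-2009 window is internally consistent, and a "2001-2010" list would simply be a different, equally valid ten-year span.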