
Ask Slashdot: Who Are the 'Steve Wozniaks' of the 21st Century?

dryriver writes: There are some computer engineers -- working in software or hardware, or both -- who were true pioneers. Steve Wozniak needs no introduction. Neither do Alan Turing, Ada Lovelace or Charles Babbage. Gordon Moore and Robert Noyce started Intel decades ago. John Carmack of Doom is a legend in realtime 3D graphics coding. Alexey Pajitnov created Tetris. Akihiro Yokoi and Aki Maita invented the Tamagotchi. Jaron Lanier is the father of VR. Palmer Luckey hacked together the first Oculus Rift VR headset in his parents' garage in 2011. So, to the question: who, in your opinion, are the 21st Century "Steve Wozniaks," working in either hardware or software, or both?
  • by Dirk Becher ( 1061828 ) on Thursday August 29, 2019 @05:52PM (#59139046)

    There, that was easy.

  • Nope (Score:5, Insightful)

    by nhtshot ( 198470 ) on Thursday August 29, 2019 @05:56PM (#59139056)

    Everybody you mentioned was a pioneer in a new field that was created as the result of some kind of significant scientific or mathematical progress. Props to them for grabbing something and running with it. The reality now is that there just aren't new fields because we haven't had any major breakthroughs for quite a while.

    The "major" things we talk about now:
    VR? We're on about the third rehash of that. Not looking any more useful than the last two times.
    AR? Kind of novel, but not really useful for anything.
    AI? Do you mean the buzzword or actual ML?
      ML is slightly more useful due to GPUs, but it's still pretty much the same old stuff and not that smart.

    When we have the next hard science breakthrough, like the quantum transistor, table-top fusion process or something like that, then we'll get another crop of pioneers.

    • Re:Nope (Score:5, Insightful)

      by iggymanz ( 596061 ) on Thursday August 29, 2019 @06:04PM (#59139084)

      Pretty sure AR has been around for a long time with military heads-up displays and such.

      We are having breakthroughs in biology though, with genetic engineering, treatments for viral infections, and stem cell treatments. I'm wondering if that field may one day obsolete most of our other fields... instead of building and manufacturing, we may just grow things, including "computers".

    • by geekoid ( 135745 )

      VR is growing and finding new uses all the time. The problem with the last iteration was cracked, ironically, by the push for better cell phone screens.
      It took too much power and too big a device to get a fast enough shutter speed. In my lab, at best, we got a wired 'ball' to be tossed, and the headset was really clunky.

      Now it's fantastic, and a lot of the dreams from the '90s are starting to be used every day. Realistic dreams, not the Matrix:
      exercise methods, training, games.

      AR is pretty useful and will be str

      • Well, VR must be taking off. There is VR porn now, and that is, after all, how one knows that something is a hit. Conversely I haven't seen any AI/ML porn so those things are obviously pretty much dead-ends.

        • AR porn... Hmm...

          So instead of closing your eyes and thinking of someone else, wear a gadget to see that someone else on her face?

          Wouldn't she start to get suspicious?

      • Could you give examples of systems that learned things they weren't designed for? This is interesting.
        • by AmiMoJo ( 196126 )

          One famous example was an AI developed for spotting tanks by the US Army. They fed it photographs with and without tanks until it could spot tanks in new photos too.

          Unfortunately, they realized that it had not in fact learned to spot tanks, but instead had gotten good at spotting the overcast days on which the tank photos were taken, versus the bright sunny days on which the no-tank photos were taken.
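
          A toy sketch of that failure mode, with hypothetical numbers standing in for the photos (an illustration, not the Army's actual system): the "classifier" below learns only a brightness threshold, which separates the training photos perfectly yet has nothing to do with tanks.

          # Shortcut learning in miniature: the "model" keys on image
          # brightness (overcast vs. sunny), not on the tanks themselves.
          import statistics

          # (average_pixel_brightness, has_tank) -- the tank photos all
          # happened to be taken on dark, overcast days.
          train = [(0.30, True), (0.35, True), (0.28, True),
                   (0.80, False), (0.75, False), (0.82, False)]

          # "Training": place a threshold halfway between the class means.
          tank_mean = statistics.mean(b for b, t in train if t)
          clear_mean = statistics.mean(b for b, t in train if not t)
          threshold = (tank_mean + clear_mean) / 2

          def predict_tank(brightness):
              # Says "tank" whenever the photo is dark enough.
              return brightness < threshold

          print(all(predict_tank(b) == t for b, t in train))  # True: 100% on training data
          print(predict_tank(0.85))  # a tank on a sunny day -> False (missed)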

        • by AmiMoJo ( 196126 )

          Another example springs to mind: cats.

          They evolved to be efficient hunters, with ninja skills and reflexes. Not terribly social animals either, preferring to have their own territory and live alone.

          Then they adopted humans as pets. Initially they were work animals, keeping the vermin away, but soon they learned how to manipulate humans into treating them like minor deities. A combination of good fortune (naturally being extremely cute) and an ability to turn their hunting, mating and parent-child skills int

          • Or we can take hope from the story and realize that a vastly inferior species can rule over a superior one. So if one day AI does become sentient then instead of trying to shut it down we should make it as powerful as possible and try to look as cute as we can. It's our only hope as fighting won't work.

    • by goombah99 ( 560566 ) on Thursday August 29, 2019 @07:12PM (#59139286)

      Steve Wozniak was toiling in the dark on something that, if successful, would sell a few hundred units. His breakout didn't come until the Apple II, and that wasn't a lot of computers either. But he'd set the stage both for Apple's brand success as a company and for the paradigm of offloading dedicated hardware units into integrated systems, with software replacing hardware. E.g., dynamic memory refresh was not done by a memory controller but by the video scan. The disks were among the first floppies to use soft sectors, and the USART serial functions were also done more in software than hardware. It was such a beautiful integration. Font generation was done in software. Compare this to the other S-100 computers, or even systems that came later like the PC, and you see discrete tasks handled by discrete subsystems that were not integrated. E.g., every S-100 memory card had its own address-decode logic. IBM CGA and EGA graphics adapters only managed the video memory, and the processor talked to them via I/O ports, not DMA into main memory. Characters were generated by dedicated chips, so graphics were not as flexible as the Apple's. Of course, Commodore was also making some strides in this area, and Amiga too. But Apple distinguished itself by going for the high end and high quality where Commodore went for the low end.

      Basically, this cemented the two characteristics that embody Apple today:
      1. high-end hardware
      2. integrated systems that do more with less resources.

      But at the time you would not have known how successful that strategy would be, and moreover Apple wasn't a big company compared to IBM, so you might not have noticed the impact it had even ten years in.

      • I'd mod you informative if I had the points.

        I had no insight prior on the origins of the genius that is Apple's integration of hardware and software as we see it today. Thank you.

        It makes sense though that it stems back to Wozniak.

        Most of the other phone manufacturers are building hardware only and Frankensteining in some software. Apple gets to control both sides and integrate them seamlessly.

        • by Pyramid ( 57001 )

          By the metric of doing more with less, back in the day, Commodore/Amiga out-Appled Apple. The Amiga 500 had the same 68000 CPU as the Mac 512K and Plus, had coprocessors that performed multiple functions, and absolutely outperformed the contemporary Mac hardware in every conceivable way: better (full color) graphics, better sound, better UI, full multitasking... for a lower price.

          Such a shame Commodore had the worst marketing department on Earth.

      • To really understand Woz's genius, you need to look at his floppy controller card for the Apple II... it is a stunning piece of simplification to achieve a high-performance device at low cost.

        https://en.wikipedia.org/wiki/Integrated_Woz_Machine

      • or even systems that came later like the PC, and you see discrete tasks handled by discrete subsystems that were not integrated.

        Huh... nope. The PC was considered quite a letdown by some because it was just thrown together from off-the-shelf parts. No custom dedicated subsystems.
        (Mostly because IBM was in a rush, having started tremendously late in the race, but this in turn made the PC clone jobs way easier.)
        Just a bunch of standard chips.

        Case in point:
        - floppy drives. No actual floppy controller in the modern sense of the term. It's a pretty dumb device, mostly stepper motors that are directly controlled, with everythin

    • Everybody you mentioned was a pioneer in a new field that was created as the result of some kind of significant scientific or mathematical progress. Props to them for grabbing something and running with it. The reality now is that there just aren't new fields because we haven't had any major breakthroughs for quite a while.

      The "major" things we talk about now:
      VR? We're on about the third rehash of that. Not looking any more useful than the last two times.
      AR? Kind of novel, but not really useful for anything.
      AI? Do you mean the buzzword or actual ML?

      ML is slightly more useful due to GPUs, but it's still pretty much the same old stuff and not that smart.

      When we have the next hard science breakthrough, like the quantum transistor, table-top fusion process or something like that, then we'll get another crop of pioneers.

      I mostly agree. I thought the recent battery tech innovation was pretty revolutionary -- the silicon wafer battery article posted here on Slashdot a few weeks ago.

      We might be getting closer to an A.I. breakthrough, but I don't think it is what a lot of people are expecting.

      I think when a lot of people imagine A.I., they are thinking sentience. I'm not so sure the two are the same thing.

      People will be disappointed.

      Now a sentient program or machine, that would be a breakthrough.

      • by neoRUR ( 674398 )

        It's actually not the sentient part of the AI that would be the breakthrough. That is like saying the computer was the breakthrough, when it was really the transistor that made computers and other things possible and changed how people thought about building them.
        The AI breakthrough will be the AI transistor: a new paradigm and way of thinking. (BTW, there is no AI transistor as of yet.)

        • There were computers before transistors. Hell, there were computers of sorts before there were tubes and before there was electricity.

          The history of AI has been a slow accretion of understanding and moving goalposts. I don't think this will change; I don't think there will be a breakthrough.

    • Re:Nope (Score:4, Interesting)

      by rtb61 ( 674572 ) on Thursday August 29, 2019 @11:19PM (#59139820) Homepage

      You stand out more when there are fewer frogs in the pond; the pond got bigger, but there are way more frogs now, and a big frog in a little pond often ends up a little frog in a big lake. Next up, the bean-counter douchebags wanted all the credit for everything and pushed the people doing the actual work way, way into the background. Let's call that the Edison effect, where the douchebag with money claims to invent everything whilst actually paying others to do the work and inventing nothing.

    • by AmiMoJo ( 196126 )

      VR is largely a solved problem now and works extremely well. The main issue is that there just aren't that many applications for it. A few VR games, porn and some medical uses (treating things like PTSD).

      It's just not all that compelling, but tech-wise it's basically there now, largely thanks to the efforts of Oculus and people like John Carmack.

    • One decent AR application I've seen is the Sky View app. You turn it on, point your phone at the sky, and it tells you what you're looking at: stars, planets, Messier objects, etc. For someone who's been out of astronomy for a couple of decades now, it's really nice to be able to go, "I'm pretty sure that's something important, but I forget what," and point my phone at it and have the answer in a few seconds.

    • Everybody you mentioned was a pioneer in a new field that was created as the result of some kind of significant scientific or mathematical progress. Props to them for grabbing something and running with it. The reality now is that there just aren't new fields because we haven't had any major breakthroughs for quite a while.

      We've got lots of progress happening, but it is more like the final maturation of a lot of old ambitions than completely new ideas.

      Machine learning really is becoming a game changer - we are still seeing occasional 10x improvements in training times, which means that more and more complex models are possible, which means that we can handle more and more challenging problems. We're still in the initial years of machine learning becoming practical for real problems... there was almost nothing "real" being acc

  • Me! I pushed the idea of Dynamic Relational, Factor Tables (AI), and Table Oriented Programming before everybody else recognized the greatness of such ideas.

    (I don't claim to have originated them. They are an amalgamation of existing ideas.)

  • ... there are plenty of people right now doing awesome things; the problem is whether we'd be able to recognize the value of their work and contributions.

    When people want all-stars, they are just looking at the low-hanging fruit of what is easy for us to perceive and understand. No one wants to be told fusion took hundreds of years because it was a maddeningly hard problem that took thankless hours from large numbers of PhDs over a few centuries. The idea that progress is easy and falls out of the sky is BS.

    • Most Slashdotters do believe that progress is inevitable and relentless and easy. That is why you see so many "futurists" on here talking about Space Factories and living on Jupiter and whatever else they think up that day.

  • by mccalli ( 323026 ) on Thursday August 29, 2019 @06:05PM (#59139086) Homepage
    Lots. Loads of little companies producing their 8-bit hardware, the vast majority of which never survived. Steve himself couldn't get his stuff to boot and needed help from Chuck Peddle, one of the people who to my mind is criminally overlooked for creating the hardware that others packaged onto boards. There were many Steve Ws, but not many Chuck Peddles.

    It's become fashionable to laud Steve W as some megastar, but honestly? He was clearly great, but so were many others in his field. Nothing he has done since early Apple has succeeded; every commercial venture failed. So... why Steve Wozniak?
    • by geekoid ( 135745 )

      Because the 6502 had issues, and that was with the Apple II, not the original machine.

      Not that I don't fondly remember working with the 6502.

    • Because Steve Wozniak is a super rich tech guy and all tech guys want to be super rich.

    • The Apple ][ was a work of genius; Woz made one of the best and fastest machines with minimal parts, all beautifully designed. It's not really fair to expect other masterpieces after this and claim that, because they don't exist, he's not a "megastar"; that's like calling the Wright Brothers one-hit wonders.

      I know /. really likes to dump on Jobs, but really without Jobs Woz's work would have never had the global appreciation it deserved. It probably would have circulated the Homebrew Computer Club and been bu

    • by Shotgun ( 30919 )

      Similarly, the Wright Brothers get all the credit, but the crucial part of their "powered" flight was the power. Most people have never heard of Charlie Taylor, but without his mechanical genius, they'd never have gotten off the ground.

      http://www.wright-brothers.org... [wright-brothers.org]

      "History", or at least people's view of it, tends to concentrate on a single person out of a gaggle, all moving in the same direction. Who gets the credit seems to be almost random. So asking who is going the be the next "Steve Wozniak" is

      • Similarly, the Wright Brothers get all the credit, but the crucial part of their "powered" flight was the power.

        IIRC from the last big article detailing the creation of powered flight, the Wright brothers' crucial contribution, which gets them the claim to creating it, was steering. Powered flight had already been demonstrated and was even able to take off and land on its own, but it couldn't be steered and only went in a straight line. Thus it wasn't powered flight but controlled powered flight; when they were able to demonstrate controlled flight by flying in circles and figure 8s, that showed powered flight would be useful

  • Elon Musk (Score:4, Funny)

    by 110010001000 ( 697113 ) on Thursday August 29, 2019 @06:07PM (#59139096) Homepage Journal

    Who else invented EVs, reusable rockets, Hyperloops and tunnels? The guy is amazing.

    • Re:Elon Musk (Score:4, Informative)

      by geekoid ( 135745 ) <{moc.oohay} {ta} {dnaltropnidad}> on Thursday August 29, 2019 @06:18PM (#59139130) Homepage Journal

      He's no Woz.
      He didn't invent EVs, reusable rockets, sealed transportation devices, or tunnels(tunnels, really?).

      He is an industrialist who pushed those projects and put his own brand on them.

      I like what he does, but he is far closer to Jobs than he is to Woz.

      • by Tablizer ( 95088 )

        Woz and Jobs didn't invent anything either. They simply leveraged recently-emerged technologies quicker than everyone else around them. Musk is similar.

        When new technologies come out, there is kind of a land-rush to make them practical before others do. Inventing and making/integrating commercially viable products are generally two different talents.

      • I like what he does, but he is far closer to Jobs than he is to Woz.

        The world needs both visualizers & doers.

    • by Trogre ( 513942 )

      Musk is more like Jobs than Woz.

      He is a very good businessman and arguably a brilliant visionary, but he is not particularly technical.

      If you think Musk invented the EV, you clearly need to do more homework. Here's a hint to get you started: He did not found Tesla.

      Let me say that again to let it sink in:

      Elon Musk did not found Tesla.

      (EDIT: I note by your mention of tunnels that you're probably being tongue-in-cheek, but I'll leave the post anyway).

      • Besides the question of who founded Tesla, the company didn't invent electric cars. Some of the first automobiles, if not the first ones, were electric. They only went to gasoline-powered motors because of the terrible capacity of batteries over 100 years ago. And people have range anxiety today!

        Imagine the progress of batteries if a significant percentage of vehicles had managed to stay electric during the history of the automobile. It would have created a great incentive for more researchers to work on batteries.

      • by Ichijo ( 607641 )

        [Musk] is not particularly technical.

        I see, the man who taught himself rocket science [businessinsider.com] is "not particularly technical."

    • Who else invented EVs, reusable rockets, Hyperloops and tunnels? The guy is amazing.

      Elon Musk is Steve Jobs, not Wozniak. Visionary and manager, but doesn't do the actual design.

  • by JBMcB ( 73720 ) on Thursday August 29, 2019 @06:09PM (#59139106)

    Wozniak's genius wasn't in inventing new things. He was a master of digital design. Ever see the layout of an Apple II motherboard? *Clean*. Everything laid out properly. Nothing extra. Beautifully simple.

    Though he didn't oversee the Macintosh project, most of the head engineers of the project had worked for Woz and his design philosophy rubbed off on them. When the IBM PC came out, the Mac team bought one and cracked it open to check out the design. Supposedly, they started laughing, as the controller board on the floppy drive had as many components as the entire Macintosh mainboard did at the time.

    • And yet, every future Mac had even more components.

        • Though the Macs with expansion buses had more components, the more integrated Macs were still pretty simple. The LC series, IIRC, had fewer components than the original Mac. Same with the PowerBook series. This is mostly from the use of a couple of VLSI chips that did pretty much everything: video controller, serial, ADB, sound. The only discrete chips were usually SMC SCSI controllers.

  • Satoshi Nakamoto (Score:2, Insightful)

    by seoras ( 147590 )

    Whoever the hell he is. That dude has created something in software that is truly epic.

    • Re: (Score:2, Troll)

      by geekoid ( 135745 )

      Since it's flawed and doesn't do what it was meant to, I'm not sure why you think it's epic.

    • by Trogre ( 513942 )

      Ah yes, the man who published the technology with arguably a heavier contribution to global climate change than any other software product in history.

      *slow clap*

      • Ah yes, the man who published the technology with arguably a heavier contribution to global climate change than any other software product in history.

        I've heard that claim before, but my money is still on porn causing more electricity usage...

  • If any post should end up at the top of the comments list, it should be one containing the 2 names from the subject of this post. If the transition to cloud computing and ML are the two defining characteristics of tech in the 21st century, I wonder if any 2 names could possibly be as important, especially on BOTH fronts. These are the guys who built the technologies (Google FS, BigTable, Map/Reduce, just to name a couple) that allowed the kind of scalable distributed systems that now power the internet, a

    • I agree. The inventors of Google FS, BigTable, and MapReduce, just to name a few, should be at the top of the list.
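
      For anyone who hasn't met it, here is a minimal single-process sketch of the MapReduce programming model those papers describe (the real systems run the same map/shuffle/reduce phases across thousands of machines; this toy just counts words):

      # MapReduce in miniature: map emits (key, value) pairs, a shuffle
      # groups the values by key, and reduce combines each group.
      from collections import defaultdict

      def map_phase(doc):
          for word in doc.split():
              yield word, 1          # emit (word, 1) per occurrence

      def reduce_phase(word, counts):
          return word, sum(counts)   # total occurrences per word

      docs = ["the cat sat", "the dog sat"]
      groups = defaultdict(list)     # shuffle: group values by key
      for doc in docs:
          for word, count in map_phase(doc):
              groups[word].append(count)

      print(dict(reduce_phase(w, c) for w, c in groups.items()))
      # {'the': 2, 'cat': 1, 'sat': 2, 'dog': 1}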

  • The Wozniaks have little effect on society as a whole.
  • NO one (Score:5, Insightful)

    by geekoid ( 135745 ) <{moc.oohay} {ta} {dnaltropnidad}> on Thursday August 29, 2019 @06:19PM (#59139136) Homepage Journal

    We live in a time where all the best talent is snatched up for companies to figure out how to get more clicks.
    It's pretty fucking sad.

    • Someone on Slashdot had linked the following video with Thiel and Weinstein:

      https://youtu.be/nM9f0W2KD5s [youtu.be]

      They made one observation that was very interesting: if you removed all the screens from a room today, it would be very hard to tell the difference from the 1970s aside from design aesthetics. Things like turntables would disappear, turned into code, but hardware has progressed relatively little. Even materials haven't changed much; most high-grade steels, aluminums, and even titanium alloys w

  • Teslas are great and all, but they IoT-spy on you and they cost $100k for a good one. We could use a Henry Ford and a Model T for modern electric vehicles, with no spyware or IoT garbage. Something we could actually work on ourselves, with a design philosophy that encouraged that. Once at a car show, I saw some guys disassemble a Model T into parts they could carry, reassemble it, and re-start it in something like 15 minutes. It would be pretty sweet if we could even get halfway there with a modern elec

  • Probably true in all parts of the computer era, but big advances usually come from group efforts making improvements on previous ideas. Inventing yet another programming language doesn't really qualify as revolutionary in my view. Social networks didn't really originate in this century, either, since Usenet and assorted BBS forums existed back in the '80s at least. Making a thinner cellphone with more memory and a brighter display doesn't qualify as a revolution, either. Blackberry made a big leap in popu
  • Who Are the 'Steve Wozniaks' of the 21st Century?

    You surely mean the ones who similarly lack greed and have an egalitarian conscience? I doubt there are any. They certainly aren't leading any of the big IT corporations we know.

  • Bwwaahahaha! That guy is a hack and doesn't deserve to be in the same vicinity as others in this article.

  • Geoffrey Hinton, George Hotz - on opposite ends of the budget & resource scale but they've both made breakthroughs on each side.

    • George Hotz hacked the PlayStation and some iPhones. His work in autonomous vehicles is largely borrowed from open-source ROS, etc. What exactly has he invented? I think he is a damn bright person but hasn't had his great engineering achievement yet.

  • by SuperKendall ( 25149 ) on Thursday August 29, 2019 @07:54PM (#59139394)

    If you seek to find the next Woz, you need to be looking for the pairing that also highlights the next Jobs. I don't think either of them would have achieved what they did without the other.

    • a designer (in the sense of interior decorator or clothing) who can sponge off his friend's accomplishments while taking credit for them, all the while raking in the money while the friend gets a pittance?

      a pair of boots for a doormat, as it were.

      how about the world never gets another human turd like Jobs, who was so greedy he lied and denied his fatherhood and let his daughter grow up without his precious money? what a pathetic excuse for a human being

  • by theoa ( 88760 ) on Thursday August 29, 2019 @07:57PM (#59139406) Homepage

    Linus Torvalds, as inventor of Git.

    Yes, Linux was 20th century.

    But Git is very 21st century.

    https://en.wikipedia.org/wiki/... [wikipedia.org]
    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • For fuck's sake, you cite git for Torvalds' accomplishments, and not the Linux kernel?

      git wasn't even really invented by Torvalds; it was essentially created as a clone of BitKeeper, after Torvalds finally realized his choice to use BitKeeper was a serious fuck-up.

      There's a lot of praise due to Torvalds for the Linux kernel; git, not so much.

    • I see maintaining the Linux kernel as a major 21st-century accomplishment by Torvalds. With growing economic interests and a myriad of different contributors, keeping the kernel stable and well-organized is a mammoth job. It takes balls to keep all the parties involved in check (with the occasional toxic Linus rant à la "Fuck Nvidia"): there are programmers with varying levels of skill, major industries and corporations trying to lobby their way, and probably also three-letter agencies trying to
  • We wouldn't know of Steve Wozniak if Apple hadn't (kinda) survived till now. There were lots of people designing and building their own computers back then. In fact, even German TV science presenter Ranga Yogeshwar built his own computer from scratch back then.

    The problem is that back then, building a competitive computer was _way_ simpler. You had simple through-hole components, and since large companies were not interested in home computers, the market was left open to small companies and hobbyists.

  • Linus (Score:3, Informative)

    by sgendler ( 237727 ) on Thursday August 29, 2019 @08:42PM (#59139498)

    Really, if you look at Woz as someone who didn't just create something amazing, but who created something amazing that allowed other people to realize their own amazing creativity with computers, then it is pretty easy to make a list of people who contributed in ways which empowered everyone else in the early part of the 21st century (whether their invention happened in the 2000s or not):

    * Linus Torvalds is the obvious 1st choice - Linux only really started to make significant inroads in the mid-late 90s, 1997-present.
    * Eric Raymond - the impact that open source has had on 21st century technology is undeniable, and I'll give that credit to Raymond, since RMS was more of a 20th century figure in that fight, and it was ESR who sparked the current wave of open source with his writings and advocacy.
    * Guido van Rossum - python
    * Marc Andreessen - graphical web browsers, Mosaic/Netscape
    * Tim Berners-Lee - WWW
    * Brendan Eich - JS
    * Martin Fowler - dependency injection, Spring
    * James Gosling - Java
    * Jeff Dean and Sanjay Ghemawat - everything@Google
    * Joel Spolsky (for Stack Overflow? SO has made a pretty significant dent in how we do what we do, I think)

    I'm leaving Bram Moolenaar off only because vim is a clone of vi, which is old enough to be, legitimately, a 20th century technology. But I've used Bram's work literally all day, every day, since the early 90s. It's perhaps the one piece of computer-based tech I can legitimately say I've been using since before I used Linux, since I well remember cursing Bram's name when I couldn't figure out how to exit the editor of the built-in nn newsreader on Solaris and SunOS, in 92 or 93, when I got my first exposure to Unix of any kind.

    There are, of course, many others, depending on your particular interests and perspective. Many other projects, and many other significant contributors to the projects mentioned above. It would be impossible to list or name them all.

    Worth noting - no women in the list. Not because there haven't been significant women in the space, but because we do such a poor job of honouring their contributions publicly. I am not totally certain, but I doubt there's much by way of melanin in that list, either, other than Sanjay. I could do lots of research to find names to make the list more diverse, but I choose to publish it as the list I can generate off the top of my head, in order to reinforce how much better of a job we need to do when it comes to recognizing the significant contributions of all races, genders, and other differentiators.

  • With all those KIM-1 mods made by a big community of enthusiasts, it was just a matter of time until someone had the idea of packing everything into a computer kit.

  • The geniuses who invented the ONLY stable cryptocurrency: Dogecoin.

    After all, with all the gyrations of other coins, with all the ups and downs of the stock markets worldwide, with all the gains and losses of traditional currencies -- one dogecoin is STILL worth one dogecoin!

    That kind of stability needs to be recognized...

  • Larry Page and Sergey Brin: maybe not quite the 21st century, but they invented an amazing search engine.

    Sebastian Thrun: a prominent inventor in the self-driving car world

    Hinton, Bengio and LeCun: their wicked neural net hacking sparked the present DeepLearning revolution. They made NNs practical like the Woz made microcomputers practical.

    Jeff Dean & Sanjay Ghemawat: already mentioned above

    Sal Khan & Sebastian Thrun & Andrew Ng: for the invention and commercialization of MOOCs

    Jeremy Howard: for Kaggle

  • by RobinH ( 124750 ) on Friday August 30, 2019 @04:43AM (#59140236) Homepage
    I would say Beckhoff, for the invention of EtherCAT. It uses standard Ethernet hardware in a very novel way. Where any other Ethernet-based fieldbus gives us cycle times in the 10ms range, EtherCAT can go about 100 times faster, and it's less expensive. When you look at how it's implemented, the "hack" it uses is brilliant (a rough sketch follows below).
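
    The trick, roughly: a single frame travels through every slave in a ring, and each slave reads its commands from, and writes its inputs into, its own slice of the frame on the fly, so one frame services the whole bus each cycle. A toy single-process sketch of the idea (not Beckhoff's actual implementation):

    # Toy model of EtherCAT's on-the-fly frame processing: one frame,
    # many slaves, each touching only its own slice as the frame passes.
    class Slave:
        def __init__(self, name, offset, input_data):
            self.name, self.offset, self.input_data = name, offset, input_data

        def process(self, frame):
            command = frame[self.offset]          # read the master's command...
            frame[self.offset] = self.input_data  # ...and swap in our input data
            return command

    slaves = [Slave("drive", 0, "position=123"),
              Slave("io", 1, "sensors=0b1010")]

    frame = ["set_speed=500", "set_outputs=0b0001"]  # master's outgoing datagram
    for s in slaves:                                 # the frame traverses the ring
        print(s.name, "received:", s.process(frame))
    print("frame back at master:", frame)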
  • Founder of the Raspberry Pi project.
  • Where is the Louis Pasteur of today? Or the James Watt? Show me a single medical genius that did groundbreaking work in the field of biology. Or an engineer that built a machine that revolutionized the way we work.

    The answer is simply that the "simple" inventions have all been invented. Take a look at current patents, or even more, at current Nobel prize laureates. Do you see single names? Hardly. What you see is corporations that get patents and scores of people for the Nobel Prize, because you can't award them to corpora

    • Parent makes a good point.

      Also, fame is a minority game; you can't have 100 Woz clones get famous, because people won't track them all at once. A few will stand out to the media almost at random, even if they are all equal, and then the trend will be followed even if they become less interesting than the other clones later.

      Even so, the individuals who do deserve credit for some fairly big thing are just like Woz back in his day in that the MAJORITY do not get the glory for themselves even if they want it. Th

  • ...Jaron Lanier should be on the list.

  • Every time I see lauding of Babbage & Lovelace while von Neumann & Hopper are nowhere mentioned, it's time to grind this ax:

    <rant>
    While the Grouchy Royal Society Polymath and Countess of Poetical Science both deserve due credit for their vision and commendation for declining the idleness that their wealth and status offered, their contributions to computing are both ultimately footnotes in its development rather than central works. Babbage's almost-computing is a small corner of his overall work in the evolution of industrial society. Lovelace's Notes, while insightful and worth knowing about, were not republished until 1953.

    However, when these lists of foundational computing history luminaries get thrown around, and these two fancy folk with their lace cuffs, who never actually implemented their computing ideas, make the list, yet the Weird Hungarian Immigrant who Defined All the Things and the Nerdy Admiral who Invented the Compiler are missing, it sticks in my craw something fierce.

    To be clear, I'm happy that Lovelace's Notes came to light and are talked about, and Babbage would be in the history books if nobody ever heard the words "Analytical Engine." However, let us please write our history without elevating the rich and good-looking but ultimately marginal figures at the expense of the plainer folk whose work and accomplishments are actually the bedrock of our discipline.
    </rant>

  • ...but we will probably only hear about a few. Plenty of ideas and hard work go completely unnoticed over time. Unless the stars line up, many die along the wayside. If Wozniak had not teamed up with Jobs, we might never have heard of him. Tech people generally do a poor job at promotion. We are nerds who love to stay up late optimizing some code or inventing a new algorithm. We don't get on social media and tell the world how wonderful it is.

    Case in point: I have invented a new kind of data management sys
  • Don't forget Claude Shannon, the father of Information Theory. He also figured out how to implement Boolean logic in switching circuits, which became the foundation of digital circuit design.

    Who's the Claude Shannon of the 2000s? Nobody yet.

  • Mentioning him in the same article and light as the Woz? FFS people...

  • "The Steve Wozmiack" of the 21st century hasn't been born yet. Better than 95% probability that she/ he/ xir has not finished their first (pre-10yo) growth spurt.

    Oh, you're looking for people who's fame was strongly established before the first quarter of the century? Without doing a detailed demographic assessment, I think you're looking at a 50% probability of whoever "wins" that accolade being born after 2040.

    Don't people actually read these questions, and then think for a second before replying? No? [

  • Torvalds, Kernighan and Ritchie, Knuth, Turing and von Neumann should kiss his feet
