Tech Jobs Are Replacing Tech Jobs in Silicon Valley

An anonymous reader writes: More than 22% of the jobs in Silicon Valley are now in the technology sector, reports the San Jose Mercury News, while the area has lost nearly 156,000 factory jobs over the last 15 years. But 59% of those lost manufacturing jobs were at tech companies, indicating that "the hardware has faded in importance compared with the software," says economist Christopher Thornberg. "It's all about the applications these days." Over the last 15 years, employment gains came in "information" areas characterized as mobile/internet/social media as well as software and tech services -- for example, at companies like Google, Facebook, Apple, and Salesforce -- and at hotels and restaurants catering to high-tech workers. "It's not just that tech is replacing other industries," reports the San Jose Mercury News. "Tech is replacing itself."
  • by russotto ( 537200 ) on Sunday April 03, 2016 @04:37PM (#51834291) Journal
    It's just being built in China now; cheaper labor, fewer environmental regulations. (Obviously)
    • by Anonymous Coward on Sunday April 03, 2016 @04:59PM (#51834389)

      One interesting thing we're now seeing is how a lot of software is getting worse. This includes not only commercial software like Windows 8 (and 10, to some extent), but also a lot of open source software. Firefox, GNOME 3, systemd and the Slashdot Beta site are good examples of how inferior software is being forced on users, without any benefit in quality, price, capability, or any other traditional metric.

      Something else that's interesting about this situation is how it is being driven by hipsters/Millennials. In the past, technical changes had to be backed up with a strong technical argument. A change just wouldn't happen if it didn't bring some important benefit to the users. But hipsters/Millennials have taken a different approach. They tend to ram through changes, "justifying" them by pretty much just telling users that they are "wrong" when they object that the changes don't bring any benefit.

      Firefox is perhaps the best example of unwanted changes being forced upon unwilling users. Nearly every release of Firefox features some unnecessary UI change that reduces its usability, or the removal of useful configuration options, or the addition of unwanted functionality (like Pocket and Hello), or even the inclusion of ads that are built into the browser itself. Now we're hearing that Firefox will be switching to a Chrome-like extension model, which will no doubt break many existing extensions. When the users of Firefox scream in pain, "No! We do not want these changes!", the Firefox developers ignore their pleas and force the changes on the few remaining Firefox users anyway. After being treated so poorly, we've seen many Firefox users flee to alternate browsers, leaving Firefox with only about 7% of the market [caniuse.com].

      All of this is contrary to what we'd expect to be seeing, and what we in fact did see for many years. From the advent of computing up until around 2005, when hipsters/Millennials started getting involved with the industry, we did see continual improvement. Software would get better as it aged, as its developers learned more about what users actually needed, and what techniques worked best. Then the hipsters/Millennials came along, chose to ignore all of this accumulated knowledge, and in just a few short years they have trashed so much software and ruined the experience for so many users.

      We can only hope that the generation that comes after the hipsters/Millennials will be able to undo all of the damage the hipsters/Millennials have caused. This is unfortunate, because instead of this subsequent generation being able to improve things, they will just waste their effort bringing us back to where we were in 2005. So not only do we have to contend with the wasted generation that the hipsters/Millennials are responsible for, we'll also have to contend with the waste they forced on the next generation(s)! The saddest part is that it's all so unnecessary.

      • by rfengr ( 910026 )
        Fully agree; don't know why you were modded down.
        • It's pretty obvious that his bigoted anti-hipster/millennial comments are what got it modded down, regardless of the quality of the rest of the post. It will probably be modded back up eventually.

          • by Anonymous Coward on Sunday April 03, 2016 @06:32PM (#51834757)

            How the hell is it "bigotry"? Most software UI design/development is done by people in their 20s and early 30s. You know, people born after 1980. By definition they're part of the Millennial Generation, hence it's perfectly correct and acceptable to refer to them as "Millennials". And nearly all of those people do subscribe to the "Hipster" way of life. One of the core tenets of the "Hipster" philosophy is putting design above utility, which is exactly what we do see. When a group of people do something wrong, and they're part of well defined groups (like "Millennials" and "Hipsters"), it's not "bigotry" to point out that they've screwed up!

            • by ranton ( 36917 ) on Sunday April 03, 2016 @07:53PM (#51835133)

              How the hell is it "bigotry"? Most software UI design/development is done by people in their 20s and early 30s. You know, people born after 1980. By definition they're part of the Millennial Generation, hence it's perfectly correct and acceptable to refer to them as "Millennials". And nearly all of those people do subscribe to the "Hipster" way of life.

              There are about 77-80 million Millennials in the US alone. This group is more socioeconomically, ethnically, and ideologically diverse than any previous generation. Yet you lump about a quarter of the US population into a single, narrow, ill-defined stereotype.

              And while perhaps a large percentage of mobile apps and young startups have their UIs designed primarily by Millennials, I doubt most software is designed and approved by people under the age of 35. I agree the majority of the work may be done by Millennials, but the Directors and VPs approving the designs before they are released to the public are probably Gen X. I would be willing to bet most of the important design decisions for Windows 10 (mentioned by the OP) were made by Gen Xers. The lead designer, Albert Shum, for instance, graduated from college in 1990.

              One of the core tenets of the "Hipster" philosophy is putting design above utility

              Hipster is such a loosely defined derogatory term that any claim that there are core tenets of their philosophy is suspect. And if there were, it would be based more on independent thinking, counter-culture, progressive politics, appreciation for independent art, and creativity (stolen from Urban Dictionary). Counter-culture is not the same as form over function. It is simply a rejection of mainstream culture.

              • by swb ( 14022 ) on Sunday April 03, 2016 @08:19PM (#51835231)

                Counter-culture is not the same as form over function. It is simply a rejection of mainstream culture.

                It strikes me that in previous generations, counter-culture had much more emphasis on the counter part. It was not just a rejection of the content of the dominant culture but also a vigorous rejection of its structure. It was as much about being *against* it as about creating a new culture. There was a strong undercurrent of nihilism.

                Of course, marketing and advertising long ago figured out how to extract the style and form of counter-cultures, sanitizing them of any hostility to mainstream thinking, and presenting the result as something new and improved.

                So what I think is now presented as counter-culture really isn't -- it's the same old structure and system, this time with plaid shirts, beards and microbrews.

                • So what I think is now presented as counter-culture really isn't -- it's the same old structure and system, this time with plaid shirts, beards and microbrews.

                  I think you're absolutely right about this; however, I think laying it at the feet of the Millennials is wrong.

                  While I do see a lot of Millennials following these fads (plus the whole tattoo fad), they're not the only ones. I'm a little over 40 and am now back in the dating rat-race, and I can't tell you how many women my own age (I tend to look most fo

                  • by swb ( 14022 )

                    I'm 49 this year and I hear a lot of horror stories about dating from men my age I know who have been divorced. Most of them are above average in looks and general personality and all of them are above average in income, but the women they run into from dating sites are all mental as anything.

                    A certain percentage just appear to be narcissistic shrews who haven't figured out that what works on dumb guys when you're 21 and attractive doesn't work at all if you're 45 and have two kids -- that's why the guy yo

                    • Thanks for the reply. I like to hear other people's honest perspectives, to verify that I'm not just nuts myself, and to entertain theories about this.

                      I hear a lot of horror stories about dating from men my age I know who have been divorced. Most of them are above average in looks and general personality and all of them are above average in income,

                      Sounds exactly like me (well, I like to think I'm above average in general personality and looks; my ex-wife says I am and says I'm a "great catch". That

                    • Damn Slashdot and the inability to edit...

                      I also wanted to add: I think these women have an easy time staying single, because women are typically much more social than men, and so they've developed a little "klatch" of other single women they hang out with, and this fills their need for a social outlet. One other guy said to me that he thinks many of these women are very comfortable being single and really only go on dates because it's socially expected of them (by their family or friends), and then they j

                    • by swb ( 14022 )

                      I have a kind of evolutionary biology theory that says that as women age out of their child-bearing years, their shifting hormones cause a general decline in sexual interest. The evolutionary biology aspect of it is that around 40 it becomes increasingly difficult for women to bear healthy children, so losing sexual interest makes them less likely to bear marginal pregnancies. It could possibly kill them, or they may end up with a chromosomal-defect baby which would be high maintenance at an age wher

                    • Wow, this is a great conversation here. I had all but given up on Slashdot for that...

                      Anyway, this theory does sound very compelling. As for that one woman I briefly dated, she was 40, never married (but was engaged once), no kids, but on our first date she was already talking about having kids. (For good reason, she wanted to make sure she wasn't wasting time with someone who didn't want any, because apparently (according to her) a lot of men who'd be prospective dates for her either already had kids an

                    • by swb ( 14022 )

                      I had forgotten the women racing to motherhood in their late 30s. I guess I kind of attribute this to our society's "perpetual youth," where everyone lives as if they're 25 for 15 years. I think most of these women are mostly normal, but between careers and living a kind of never-ending post-college lifestyle they never seem to get drawn into marriage; possibly dating long-term but never marrying. I also think some marry, briefly and reflexively, but end up divorcing fairly quickly because they lack the maturit

                • It was more about being against than anything else. The hippie boomers continuously said "Destroy it all! Down with everything!", not realizing that you end up with a vacuum that has turned into this... sovietizing of the USA and probably other western nations. This, in turn, really was triggered by communist party infiltration into the "counterculture" movements of the 1960s. It's the boomer generation that triggered this cascade, and their kids are the full flower of this horror.
              • If they're so much into independence and creativity, why do they all look the same, eat the same things, listen to the same music, etc.?

                In my day we had heavy metallers, mods, goths, casuals, teds, ...

            • How the hell is it "bigotry"? Most software UI design/development is done by people in their 20s and early 30s. You know, people born after 1980. By definition they're part of the Millennial Generation, hence it's perfectly correct and acceptable to refer to them as "Millennials". And nearly all of those people do subscribe to the "Hipster" way of life. One of the core tenets of the "Hipster" philosophy is putting design above utility, which is exactly what we do see. When a group of people do something wrong, and they're part of well defined groups (like "Millennials" and "Hipsters"), it's not "bigotry" to point out that they've screwed up!

              One example of putting design over utility is GNOME's Nautilus. Actions that in the past took one, sometimes two, clicks of the mouse now require up to five. Pathetic! I also started getting carpal-tunnel pain in the ligaments controlling my right forefinger. Ergonomics and utility over OO design. The design of interface software to match the limitations of the underlying APIs is a good explanation of why the GNOME interface is the pits.

          • by KGIII ( 973947 )

            Every generation starts out blaming its elders. The elders finish it by complaining about the youth.

            Seriously, even Plato figured this out.

            • ...about 10 years ago, I went back and did grad school (frankly, to check a block for promotion). I learned almost nothing I didn't already know in grad school, with the exception of mind-numbing detail about a technology ('distributed databases') which had already been obsoleted years before.

              But the real shock was my fellow students. We were roughly 50-50, older students coming back for a Masters, and "kids" fresh out of undergrad. And uniformly, I noticed that the "kids" had absolutely horrible language s

              • I found this when I went back to school a few years ago. I also ran into a similar situation staffing a help desk. I had two major issues. First, I started out too lenient. My immediate superior was lax on the discipline, and that worked, we all showed up, did our jobs, went home. I hired these kids in their early 20's and it was a mess. Start times, showing up, doing what they are told to do the way they are told to do it, and simply calling in to let me know if they would be late or out. Ended up ha
              • We were roughly 50-50, older students coming back for a Masters, and "kids" fresh out of undergrad. And uniformly, I noticed that the "kids" had absolutely horrible language skills: they could not spell, EVEN with spell-check. They could not write coherently, they had serious problems articulating a reasoned argument from evidence.

                Somehow, I suspect this is due to the difference in ages. I bet if you took a bunch of baby boomers' writings from their early 20s and showed them to other baby boomers today, they'd say the same thing. 25+ years is a long time to refine those language skills.

                • Ironically enough, I'm an Xer - born in the mid 1960s - and I have always thought the boomers were terrible at writing.
              • If you destroy the ability to use a language, you destroy the culture that uses the language. It's a great way to effect the change of, say, stripping away all constitutional rights in a country that was one of the first on earth to have a constitution based on the idea that the citizen was the ultimate authority, rather than the state. If you remove the ability to use language, you remove the ability to reason.
      • by U2xhc2hkb3QgU3Vja3M ( 4212163 ) on Sunday April 03, 2016 @05:58PM (#51834633)

        I'm not sure about 2005, but I totally agree with you. All three major operating systems (Windows, OS X, Linux) started going downhill when style became more important than function. Looking at Apple, it was an incredibly huge mistake to put an industrial designer in charge of user interface design. They're two totally different fields that require completely different skill sets and knowledge.

        The end result is barely adequate hardware used to display flat graphics in washed-out pastel colours resulting in user-hostile interfaces with small fonts rendered with insufficient contrast.

        • by chipschap ( 1444407 ) on Sunday April 03, 2016 @06:04PM (#51834647)

          I don't know enough to comment about hipsters and millennials. To each his own. But I do agree with the comment that form and style now seem more important than function and even basic quality. I think the smartphone mess is really characteristic of this, but it reaches to the desktop as well.

          The ultimate in function, the command line, is the minimum in style. (Of course there are usability arguments, but my point still stands.)

          All I have to do is look at Windows 8. I'm honestly impressed with the slick appearance, very far beyond my Gnome 2 desktop. And it's great until you actually try to do something ... at which point you realize that glitz and glitter don't get work done.

          • The ultimate in function, the command line, is the minimum in style.

            Unless you have ANSI color codes on! Then you're sedawkgrepping like the cool kids.
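
            (An illustration, not from the thread: ANSI "color codes" are just escape sequences that the terminal interprets. A minimal C++ sketch, assuming a VT100-compatible terminal; the messages are made up.)

              // "\033[" opens a control sequence; 31 selects red text,
              // 1;32 selects bold green, and 0 resets all attributes.
              #include <iostream>

              int main() {
                  std::cout << "\033[31m"   << "error: disk on fire"  << "\033[0m\n";
                  std::cout << "\033[1;32m" << "ok: all tests passed" << "\033[0m\n";
                  return 0;
              }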

        • I think part of it is the capability of computers. XP is kind of a peak in OSes, where computers had the capability to do everything the OS truly needed. Vista, 7, 8 and 10 provided minimal OS gains over XP, not enough to justify the cost of buying a whole new OS. So they had to add pretty, eye-catching features to get people interested in the OS (mostly people who consume content on computers, rather than produce it). I would rather see Windows 10 be stable and fast (which since release it seems to be go
          • I think phones did the same thing. I've got a Galaxy S5 which honestly is more than I really need in a handheld computing device.... As a result I think smartphones are doing the same thing as PC OSes, to keep business going and people buying the latest thing they can no longer add functionality so they add fluff.

            No, it's worse than that, they're actively removing features. I have the Galaxy S4, and the S5 looks like my next upgrade maybe, but for now I'm happy with the S4. The S5 maybe adds a bit of spee

            • Yeah I'm keeping my S5 for as long as it does the job. After that I may be done with Samsung phones, unfortunately.
              • If I upgrade any time soon, it'll be to an S5 (I have the S4). But after that, I don't see how I'd want a newer Samsung than that, unless the S8 is the next S5.

      • My biggest problem with Firefox is how it keeps losing my dictionary. If I open up too many tabs, suddenly "Check Spelling" disappears and I have to go re-install the dictionary again. I don't know if this is a flaw a hipster coder introduced, but I'll go along with your idea and blame them for it anyway lol. Not all Millennials are hipsters; I know several Millennials who are quite serious IT people. But they are most certainly not "hipsters"...but looking at the Firefox dev team [mozilla.org] only one of them loo
      • I think that comes, by extension, from a general trend I've noticed of people not really giving a crap what anyone else wants, even when listening would move an application in a direction that makes it more desirable and thus profitable. It's almost like people are becoming dead to the wants and needs of others while elevating their own. Perhaps it's because of a generation of schooling that was never allowed to fail kids, or too much helicopter parenting. It's like developers think they are precious littl
      • by Anonymous Coward

        Well said, but all is not lost. Look at the quality of a project like PostgreSQL: solid as a rock, none of the hipster stuff you've described. I'd even say the same for the Linux kernel. Throughout all the drama that pops up once in a while, the process and code are still old-school solid, way above the push-it-out-to-test model of hipster-driven companies like Facebook, Pinterest, etc.

        So there are examples where you can comfortably use the term "engineering" with relation to software. Postgres and

      • Lots of things out there are screwing up software. New college graduates just don't have the necessary software skills; they're being trained for entry-level jobs without the theoretical basis to get beyond that. It's not all software anyway: you can't do software without the hardware. And you can't do hardware without the lower-level software either. But there's a mob of programmers who never think beyond the "app" and think that the hardware and lower-level software are not things for mere mortals to comp

        • by sjames ( 1099 )

          I've looked around Arduino development, and I conclude that older programmers who grew up in a time when 64K was huge, 4MHz was fast, and the OS never stood between you and the hardware have a distinct advantage. You can do a lot with such a small system as long as you don't try to program it like a PC.
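
          (For illustration only, not from the poster: a minimal Arduino-style C++ sketch of the habit being described, keeping string literals in flash with the F() macro and avoiding heap-backed String objects, since SRAM on these parts is a couple of kilobytes. The pin and messages are hypothetical.)

            const int SENSOR_PIN = A0;   // hypothetical analog input

            void setup() {
              Serial.begin(9600);
              Serial.println(F("reading sensor..."));  // F() keeps the literal in flash, not SRAM
            }

            void loop() {
              int raw = analogRead(SENSOR_PIN);  // 0..1023 on a 10-bit ADC
              Serial.print(F("raw="));           // fixed output, no String concatenation
              Serial.println(raw);
              delay(1000);                       // no OS, no threads: just a tight loop
            }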

      • This thread just sounds like a bunch of grumpy old men. Lots of software tends to suck these days; that much I'll agree with you on. But to suggest it's solely the fault of "those darn hipsters" is ludicrous.

        Jonathan Ive gets some of the blame at Apple, and he's nearly 50. And at every software company I've been with, executives in their 30s, 40s, and 50s are still the ones calling the shots and bankrolling everything. You're suggesting that they are blameless because some designers and engineers in
      • One interesting thing we're now seeing is how a lot of software is getting worse. This includes not only commercial software like Windows 8 (and 10, to some extent), but also a lot of open source software. Firefox, GNOME 3, systemd and the Slashdot Beta site are good examples of how inferior software is being forced on users, without any benefit in quality, price, capability, or any other traditional metric.

        I see your point.

        My favorite example: The Typo3 successor Neos - a type-a software trainwreck entirel

      • I'm (barely) a Millennial and, in my opinion, have never really been a hipster. I spend a lot of my time as a developer explaining to the systems people (who are mostly 10-30 years older than I am) that anything not fully specified in the systems requirements will result in me using educated guesses as to what they want. This often results in much hemming and hawing until I tell them to put in writing that I can make educated guesses in my implementation, and suddenly I see designs being fleshed out.

        Then down th
        • by Cederic ( 9623 )

          No developer gets perfect requirements. If they do, the requirements are wrong and/or out of date.

          An inherent part of development is filling those gaps. One reason agile methodologies were so attractive is that they reduced the feedback loop times.

          You don't need to know what's wanted; you just use techniques that address those gaps without wasting time. Sure, it uses your time, but is yours really any more valuable than that of the people you're demanding those intricate detailed requirements from?

          Just fuck

          • Done that. You'd be surprised how often people have issues with attempts to get it right the first time and demand it be changed. Though I'm working on software in a certification environment, where minor errors take a lot of money to get through the full process to fix. The certification process still doesn't play that well with agile. Maybe some day they will get up to speed.
      • Software is getting bigger and more complex; it's not getting more defects-per-line or defects-per-task, just more shit-it-has-to-do.

      • I mostly agree, but I think you're picking on Firefox way too much. The UI of Firefox hasn't changed much (at least for me on Linux Mint, not too sure about the Windoze version since I haven't used that in a while). The standard menu disappeared to save screen real estate, but comes back if you press "Alt". It has a somewhat annoying big-button-menu. But if you compare it to Chrome, its main competition, it isn't that bad: Chrome is far more minimalistic. Also don't forget: Firefox still has "about:con

      • One interesting thing we're now seeing is how a lot of software is getting worse. This includes not only commercial software like Windows 8 (and 10, to some extent), but also a lot of open source software. Firefox, GNOME 3, systemd and the Slashdot Beta site are good examples of how inferior software is being forced on users, without any benefit in quality, price, capability, or any other traditional metric.

        Something else that's interesting about this situation is how it is being driven by hipsters/Millennials. In the past, technical changes had to be backed up with a strong technical argument. A change just wouldn't happen if it didn't bring some important benefit to the users. But hipsters/Millennials have taken a different approach. They tend to ram through changes, "justifying" them by pretty much just telling users that they are "wrong" when they object that the changes don't bring any benefit.

        Firefox is perhaps the best example of unwanted changes being forced upon unwilling users. Nearly every release of Firefox features some unnecessary UI change that reduces its usability, or the removal of useful configuration options, or the addition of unwanted functionality (like Pocket and Hello), or even the inclusion of ads that are built into the browser itself. Now we're hearing that Firefox will be switching to a Chrome-like extension model, which will no doubt break many existing extensions. When the users of Firefox scream in pain, "No! We do not want these changes!", the Firefox developers ignore their pleas and force the changes on the few remaining Firefox users anyway. After being treated so poorly, we've seen many Firefox users flee to alternate browsers, leaving Firefox with only about 7% of the market [caniuse.com].

        All of this is contrary to what we'd expect to be seeing, and what we in fact did see for many years. From the advent of computing up until around 2005, when hipsters/Millennials started getting involved with the industry, we did see continual improvement. Software would get better as it aged, as its developers learned more about what users actually needed, and what techniques worked best. Then the hipsters/Millennials came along, chose to ignore all of this accumulated knowledge, and in just a few short years they have trashed so much software and ruined the experience for so many users.

        We can only hope that the generation that comes after the hipsters/Millennials will be able to undo all of the damage the hipsters/Millennials have caused. This is unfortunate, because instead of this subsequent generation being able to improve things, they will just waste their effort bringing us back to where we were in 2005. So not only do we have to contend with the wasted generation that the hipsters/Millennials are responsible for, we'll also have to contend with the waste they forced on the next generation(s)! The saddest part is that it's all so unnecessary.

        I, too, raised concerns to Red Hat about the "tinkering" they do within GNOME, and how they break existing functionality, even in Linux. I dropped GNOME for Xfce. I also moved to Scientific Linux, as their philosophy is "if it's not broken, leave it alone."

    • by OhPlz ( 168413 )

      Yea, I'm not sure what the point of the article is. This is just the continuing decline of American manufacturing. We don't build computers here anymore so obviously computer manufacturing (the steepest loss listed in the article) would be hurt by that. Broadening the scope, software solutions tend to replace specialized hardware solutions because it makes things less expensive. That shouldn't be a surprise to anyone either.

      • The article is not just pointless but also misleading. The implication is that software jobs are replacing hardware jobs... but that is pretty obviously not the case, when you look at the overall numbers of jobs in the respective fields. For the most part, what this is actually showing is a decline in manufacturing, as you say. Mostly due to offshoring. And it is a Very Bad Thing to be losing that production capacity. However, software replacing specialized hardware mostly happened a long time ago. It's c
      • by lgw ( 121541 )

        This is just the continuing decline of American manufacturing

        American manufacturing output has grown every decade of its existence - something like 10x since 1940 (in inflation-adjusted dollars, as measured by the Fed). There has never been a decline. Oh, sure, the manufacturing jobs are gone, never coming back, but don't confuse that with the industry. Everything's automated now, or nearly so. Just like farming, it has become something we need ever fewer people doing (and it will probably end up at less than 5% of the workforce, like farming).

        • That isn't much help for the people who need those jobs. They don't have anything else they can do to support themselves. Contrary to what a lot of people seem to think, people who are only mentally capable of factory work are not going to be successful at going to college and getting a job that requires a college education. What few jobs are left for people like this (manual labor-type jobs) are getting filled by immigrants, so as a result we're seeing huge support for Trump.

          • by lgw ( 121541 )

            That transition started about 40 years ago, and is nearly complete now. It's old news, even by Slashdot standards. People mostly found service jobs. "Service economy" was the big buzzword in the 80s or so, IIRC.

            Now it's those jobs that are going away, or contended for by immigrants.

            • Service economy was the big buzzword in the 60s.
              • by lgw ( 121541 )

                I wish I could find graphs going back past 40 years. We were at about 30% of US jobs in manufacturing in the 70s, vs. 10% now - I wonder how high it was in the 50s, and when the ramp down started.

    • Not to mention much lower corporate tax rates...
    • >> It's just being built in China now; cheaper labor, fewer environmental regulations. (Obviously)

      Never forget that you always get what you pay for.

  • by Arcady13 ( 656165 ) on Sunday April 03, 2016 @04:47PM (#51834345) Homepage
    You should replace the person with the tech job of writing headlines with a new person with the tech job of writing headlines.
  • by Anonymous Coward

    Back when every new tech gadget was an actual gadget, a company might need, say, 3-4 hardware guys and 4-5 software guys to design it, plus of course a facility somewhere to build and test it.

    Now that everything runs off the same phone, Apple might need 10,000 hardware engineers or whatever to design each new phone, but that supports an ecosystem for a million software developers. And at that volume they are manufactured overseas.

    Of course jobs are skewing toward software.

  • by rfengr ( 910026 ) on Sunday April 03, 2016 @05:13PM (#51834451)
    That's what I like about being an old-school EE doing RF/microwave hardware. It's still high tech, but at a slower pace. Things don't change for the sake of change, but rather when you can improve upon something using quantitative measurements.
    • Have you replaced all your GaAs PAs with GaN yet? Why not?

      • by rfengr ( 910026 )
        Of course, where it is applicable. Did an X-Band GaN Doherty last fall. Though GaN had been in development for years.
      • GaN has some real advantages over GaAs for anything in the 1W output-power ballpark and above. Final stages and high-power switches make sense, but there's mostly no point for LNAs and gain stages.

        So, as the poster stated, it is not being done for fashion, but for real, measurable performance improvement.

  • Need more H-1Bs to replace high-paid workers.

  • Well, all that disruption was bound to reflect back. I mean, when you mess with the bull sometimes you get the horns.

    I made myself laugh with the number of cliches I've managed to weave into this dense, trite post.

    Will

  • by Anonymous Coward

    Since we don't make anything in America anymore, we decided that we'd make assholes:

    https://medium.com/bad-words/the-asshole-factory-71ff808d887c#.p2jif6fwx

  • And Privacy Badger blocked 41 trackers; this is a record for any web page I've visited.
    • And Privacy Badger blocked 41 trackers; this is a record for any web page I've visited.

      Oh, so that's why the web page I saw looked to have been written directly in HTML, and contained only text that reflowed without issues.

      I was wondering why a "news outlet website" for Silicon Valley looked so completely retro!

      At least there was no flashing rainbow text.

  • But those lower-middle-class manufacturing "tech" workers still lost their jobs.

  • I took introduction to electronics at college in the early 1990s. Tried a few times to get a summer job as an electronic assembler in Silicon Valley. The majority of workers were Filipinos, and I quickly discovered that I wouldn't get a job because I wasn't Filipino. White people, I was told, were managers and not workers. So I dropped electronics as a major. Ten years later I would go back to college to learn computer programming and earn my IT certifications, and the electronics department got reduced fro
    • by rfengr ( 910026 )
      Well, 20 years from now all those programming jobs will be offshored too. Shoulda been a manager from the start.
      • Well, 20 years from now all those programming jobs will be offshored too.

        I'm not worried about that. Computer security will keep me busy for the next 20 years, especially if I stay in government IT and Windows still provides job security. If not, I'll move on to something else.

        Shoulda been a manager from the start.

        I was working at Cisco a few years ago when I got laid off because my contract came up for renewal and my boss was locked out from renewing it. The majority of the workers laid off at that time were mid-level managers. Not even those jobs are safe.

  • by Tablizer ( 95088 ) on Monday April 04, 2016 @01:09AM (#51836193) Journal

    They are going to have to change their name from "silicon" to something code-ish.

    How about Recursion Valley?

  • the hardware has faded in importance compared with the software

    Uh, no. Hardware is of utmost importance (without it you can't run software and bring all those applications to the hoi polloi). It just so happens that hardware can now be commoditized, with the bulk of it (if not its entirety) manufactured and assembled somewhere else where it is cheaper. If your hardware is not truly innovative, it is going to be handled on a Foxconn assembly line.

    • Does hardware need to be "truly innovative" anymore? Is there much practical difference between a washing machine and a tablet?

      (that was two questions, but at least one was rhetorical)

      • Does hardware need to be "truly innovative" anymore?

        It can be "truly innovative" in its use of new materials that either lower its TCO (say, by lowering energy consumption), lower its upfront purchase price, or improve its portability.

        It can also be innovative when several of its components - despite appearing to be non-innovative - can be integrated to provide novel solutions.

        It needs to solve a problem in an innovative way, in a manner that enough people care about, in order to be viable (a concept separate from being "truly innovative").

        Is there much practical difference between a washing machine and a tablet?

        (that was two questions, but at least one was rhetorical)

        Oh he

        • Yea, but it doesn't really matter whether I have a Samsung, Kenmore, Whirlpool, whatever, as my efficient washing machine. I'm not suggesting that we use 20-year-old technology, but I am suggesting that tablets are a commodity and are becoming roughly equivalent in terms of capabilities and features. Apple iPad versus Samsung/Android Tab: should the differences matter much to most people? I can't see why they would.

          We've had tablets taking orders in restaurants in the US since the 1990s. Arby's near me finall
