Programming IT Technology

Is Computer Science Dead? 641

warm sushi writes "An academic writing for the British Computer Society asks, Is computer science dead? Citing falling student enrollments and improved technology, British academic Neil McBride claims that off-the-shelf solutions are removing much of the demand for high-level development skills: 'As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch. Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.' Is that quote laughable? Or has the software development industry stabilized into an off-the-shelf commodity?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Wow! (Score:5, Interesting)

    by OverlordQ ( 264228 ) on Tuesday March 13, 2007 @04:58AM (#18329123) Journal
    Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.

    And who made those packages?

    Software doesn't write itself.
  • Horology anyone? (Score:3, Interesting)

    by Tracer_Bullet82 ( 766262 ) on Tuesday March 13, 2007 @05:01AM (#18329149)
    I remember that a few years ago, two at the minimum if my memory serves me, watchmaking was declared a dead business. Even the US education department considered it dead and buried, with fewer than 100 students per year taking it.

    Today though, with watchmaking (back) on the rise, the supply of workers is much less than the demand.

    Everything, or most things at least, is cyclical. We'd expect so-called researchers to take much longer timelines in their research than the immediate one.
  • Don't think so (Score:3, Interesting)

    by VincenzoRomano ( 881055 ) on Tuesday March 13, 2007 @05:03AM (#18329161) Homepage Journal
    Just as the science of building construction did not die with the Egyptians, Greeks, Romans, Chinese, Aztecs and so on, IT won't either.
    New technologies, new languages and new paradigms, as well as new hardware, will keep pushing IT forward.
    I fear the sentence comes from some "old school" mind, still tied to old technologies. Those could really die sometime in the future.
  • Quote: (Score:1, Interesting)

    by Anonymous Coward on Tuesday March 13, 2007 @05:16AM (#18329253)
    The only way to understand the wheel is to re-invent it.
  • by cyclop ( 780354 ) on Tuesday March 13, 2007 @05:17AM (#18329261) Homepage Journal

    This doesn't mean CS is dead.

    Surely computing is much more accessible, and there is a whole lot more ready-to-go software, and there are far more libraries, compared to what was there 10 years ago, but this means nothing. New applications will always be needed and invented, and someone will need to code them. And even with the latest and easiest programming languages, doing things well requires some kind of education.

    I am a biophysics Ph.D. student. I have never had a formal CS education, nor am I a code geek (although I like to code). Just building a relatively small data analysis application with plugin support in Python is making me smash my nose against things that would make my code much better, things that are probably trivial for people with a CS education (what's currying? what is a closure? how do I implement design patterns?) but that for me are new and quite hard (by the way: any recommendations for a good book about all these concepts and more?). So I understand why CS is of fundamental importance.
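
    Since the comment above asks what closures and currying are, here is a minimal, hypothetical Python sketch of both ideas; the names (make_scaler, linear) are invented for illustration and are not from the commenter's actual data-analysis code.

        # A closure: make_scaler returns a function that "remembers" the factor
        # it was created with, even after make_scaler itself has returned.
        def make_scaler(factor):
            def scale(x):
                return x * factor      # 'factor' is captured from the enclosing scope
            return scale

        double = make_scaler(2)
        print(double(21))              # 42

        # Currying (approximated here with functools.partial): fixing some arguments
        # of a function to obtain a new function of the remaining ones.
        from functools import partial

        def linear(a, b, x):
            return a * x + b

        line = partial(linear, 3, 1)   # fix a=3, b=1
        print(line(2))                 # 3*2 + 1 = 7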

  • by cmholm ( 69081 ) <cmholmNO@SPAMmauiholm.org> on Tuesday March 13, 2007 @05:18AM (#18329271) Homepage Journal
    Comp Sci has always been dead, and always will be. In 1982, one of my early CS professors claimed that the window of opportunity for a job as a programmer or s/w engineer was going to close soon as automatic code generators took over the task of raw code banging. Employers would just need a few engineers for design, and that would be it.

    But I shouldn't be surprised that yet another generation of technology dilettantes thinks that they've reached the pinnacle of achievement in a line of endeavor, and from here on out it's just like corn futures (somebody oughta tell Monsanto to stop wasting time with GMO research). But seriously, when we've got bean counters like Carly Fiorina and whichever IBM VP it was claiming that the years of technical advance in IT are over, not to mention the author of the fine article, Mr. McBride, I see people who are in the wrong industry. Perhaps they should be selling dishwashers, or teaching MCSE cram schools.

    McBride is whining because the students aren't packing his CS classes like they used to. His reasons whittle down to these: mature software packages exist to service a number of needs (which has always been true, to the contemporary observer), and it's too easy to outsource the whole thing to India. It is the writing of someone throwing in the towel. It's like the trash talk you hear from people who are about to leave your shop for another job. I won't be surprised to find him in fact "teaching" MCSE "classes" very soon. Good. His office should be occupied by someone who still has a fire in their belly.
  • Re:Not dead (Score:3, Interesting)

    by MichaelSmith ( 789609 ) on Tuesday March 13, 2007 @05:28AM (#18329333) Homepage Journal

    Compare computer science to other sciences - like architecture. Computer science is still very immature, with very few true best practices and standards. It will not die anytime soon.

    Maybe this is slightly off topic, but my wife is an architect, and any time I want to stir up one of her co-workers I tell him tales of version control, automated builds, automated unit testing and bug databases linked to revisions.

    None of this exists outside of the software business in anything like the same form. When it comes to producing information in a controlled fashion software is streets ahead of any other field.

  • Are you mental? (Score:2, Interesting)

    by dintech ( 998802 ) on Tuesday March 13, 2007 @05:38AM (#18329377)
    'As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch.'

    This is equivalent to 'Off-the-shelf applications now fulfil all possible needs and changing requirements.'

    Surely not. The British Computer Society should really talk amongst themselves before releasing such obviously trolling public statements. This idea could get into the hands of people who would take it seriously...

    Some muppet in your management chain is trying to 'leverage' a Microsoft Office implementation for your Credit Derivatives Trading platform.
    Cancel or Allow?
  • by VirusEqualsVeryYes ( 981719 ) on Tuesday March 13, 2007 @06:00AM (#18329517)
    The parent makes the same mistake that the article and the summary make: computer science != programming. TFA talks on and on about "longing" for old programming languages, about new programming tools, about the ability of 8-year-olds to program (?), about almost anything programming-related. The only non-programming thing TFA cites is the falling numbers of computer science majors, which, in my opinion, does not indicate the death of anything, but rather reflects the amount of respect that IT jobs get in the private sector--that is, next to none.

    But there's so much more to computer science than programming and general software. There's robotics, artificial intelligence, distributed computing, networking, graphics, architecture, and theory, not to mention the overlaps with other fields, such as with electrical engineering (architecture), mechanical engineering (robotics, integration), mathematics (especially statistics), sociology (mass models), and just about any other science or even non-scientific field that could use modeling--multifield modeling requires skills that techie teens do not have. Don't forget that there are uncountable subfields within each field, and I most likely missed one or more fields as it is.

    Artificial intelligence and robotics are especially potent because they are still in their infancy, merely budding as fields of study. Their potential is huge. And TFA has the balls to claim that CS is dying? Quite the contrary.
  • by Geoffreyerffoeg ( 729040 ) on Tuesday March 13, 2007 @06:07AM (#18329575)
    The normal course of action is to blame Java, since it has led to a simplistic approach to CS assignments.

    You should blame Java. And you should blame C++, Python, and any other similar medium-high level language, if that's the intro language and your sole teaching language.

    Here at MIT we have 4 intro courses. The first, the famous Structure and Interpretation of Computer Programs [mit.edu], is taught entirely in Scheme, a purer and more pedagogical dialect of Lisp. You learn how to do all the high-level algorithms (e.g., sorting) in a purely mathematical/logical fashion, since Scheme has automatic object creation / memory handling, no code-data distinction, etc. At the end of the class you work with a Scheme interpreter in Scheme (the metacircular evaluator), which, modulo lexing, teaches you how parsing and compiling programs works.

    The next two are EE courses. The fourth [mit.edu] starts with EE and quickly moves to CS. You use a SPICE-like simulator to build gates directly from transistors. (You've done so in real life in previous classes.) Then you use the gate simulator to build up more interesting circuits, culminating in an entire, usable CPU. From gates. Which you built from transistors. The end result is that not only are you intimately familiar with assembly, you know exactly why assembly works the way it does and what sort of electrical signals are occurring inside your processor.

    Once you know the highest of high-level languages and the math behind it, and the lowest of low-level languages and the electronics behind it, you're free to go ahead and use Java or whichever other language you like. (Indeed, the most time-consuming CS class is a regular OO Java software design project.) You're not going to get confused by either theory or implementation at this point.

    So yes, blame Java, if you're trying to teach memory allocation or algorithm design with it.
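
    A toy sketch of the evaluator idea mentioned above: this is not MIT's metacircular evaluator (which is written in Scheme and handles variables, lambdas and so on), just a hypothetical Python miniature of the same concept, a program that walks an expression tree and evaluates it.

        # Toy prefix-expression evaluator (illustrative only).
        import operator

        OPS = {"+": operator.add, "-": operator.sub,
               "*": operator.mul, "/": operator.truediv}

        def evaluate(expr):
            """Evaluate a prefix expression given as nested tuples,
            e.g. ("+", 1, ("*", 2, 3))."""
            if isinstance(expr, (int, float)):
                return expr                                   # numbers are self-evaluating
            op, *args = expr
            return OPS[op](*(evaluate(arg) for arg in args))  # evaluate operands, then apply

        print(evaluate(("+", 1, ("*", 2, 3))))                # prints 7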
  • by seriouslyc00l ( 1075045 ) on Tuesday March 13, 2007 @06:09AM (#18329589)
    Computer science isn't dead. Some old computer scientists are dying. And new ones are being born. By the dozens. In the west, and in the east. Yes, jobs migrate by the rules of economics. That doesn't kill the science, because what migrated was not science - it was the bricklaying of the computer age. If computer scientists were to do the "bricklaying", that would kill the science. Having said that, there are bricklayers in every community, east or west. It's a pity that bricklayers from the west have had to see their jobs move east. Sorry, but that's how the rules of economics work. The real scientists, whether they are from the east or the west, stay put where they are, doing what they like doing best. The invention of concrete mixing machines did not render civil architects extinct - on the contrary, it made it necessary to have more of them, and better ones. And so it is with software. Off-the-shelf software doesn't make software engineers obsolete - it makes it possible to explore new application areas, and this requires more and better software engineers than before.
  • by thaig ( 415462 ) on Tuesday March 13, 2007 @06:12AM (#18329615) Homepage
    I think it's more like:
    Mechanical Engineers and Mechanics
    or
    Electrical engineers and Electricians

    Each job has its problems but focuses on a different end of the product lifecycle.

    Some software doesn't die and merely needs to be maintained, so naturally, after a while there is less need for hardcore computer scientists to develop new things. Open source probably accelerates this trend - e.g. why write a portable runtime library for your app when you can use NSPR or the Apache one?

  • by Rik Sweeney ( 471717 ) on Tuesday March 13, 2007 @06:23AM (#18329677) Homepage
    That's why people don't do it. When I was at University in the UK (Portsmouth if anyone cares), I did Maths and Computing.

    The first year consisted of learning how to format a floppy disk and write a Word document. Oh, and there was some Java thrown in there, but people found Java too hard and complained. Java then got removed from the curriculum and we did crap like theories in Artificial Intelligence instead.

    We had the option of doing C++ in our final year but this largely consisted of printing out to the console and writing some text to a file. No fancy shit like Pointers or anything like that. Most people didn't elect to do this option as programming is hard work and they just stuck to Matlab instead.
  • by ZombieEngineer ( 738752 ) on Tuesday March 13, 2007 @06:32AM (#18329729)
    I don't know if this was meant to be flamebait, but I'll bite.

    I am an engineer by trade (making training simulators for chemical plant operators) and I have encountered more than my fair share of Computer Science graduates.

    A lot of these people are focused on "how do I meet this product spec?" and not necessarily on a solution that is fit for purpose. I routinely encounter situations where enumeration comparisons are done using strings and searches are implemented using a linear search (I kid you not, I once reduced a program run from 90 mins to 4 mins by replacing a single linear search with a binary search - a sketch of the difference follows this comment). Just because every 6 months there is a more powerful CPU on the market doesn't justify increasingly sloppy coding.

    There are a few people who are focused on "how do I make this better?". For these people, making a compiler that would recognise a linear search and automatically replace it with a more appropriate technique is their objective. Before people jump up and down saying there is no way a compiler could determine this, I will point out that there was a consulting company who, 20 years ago, had a FORTRAN compiler that would silently replace nested loops with equivalent BLAS matrix calculations (said consulting company was bought out by Intel several years ago). So what is the big deal? FORTRAN died several years ago... Well, it is a bigger deal today with dual-core processors, where things like BLAS calculations are perfectly suited to parallel processor architectures.

    Moving on to address some of your other comments: "Everyone still needs an IT department"
    If your IT department is stacked with CS people then someone isn't doing their job properly. I found that IT support (I did it for a university department while working on my post-grad) is highly dependent on the level of planning and implementation. A well-planned system with appropriate lock-downs (in the era of Win 3.1, we mirrored the HDD of the local machine from the network server when people logged in) resulted in no viruses or other on-going issues (you had a network drive for personal storage but the desktops were a shared resource; you could install software and use it, but the moment you logged off and back on again - poof!). Prior to having a planned strategy, IT support consisted of firefighting and band-aid patching.

    "There are a ton of companies who need very specialized internal applications, or their own "B2B" applications"
    Oh please!!! Specialised applications are a pain in the neck to support; the real issue here is that whoever implemented them did not fully understand what the end-user requirements were. There is a real art to extracting that sort of information out of people, and it requires an inquiring mind, good communication and people skills. There are application houses that milk corporations of money through scope changes because they couldn't get the original spec right (I am not going to enter into the argument of who is to blame for a defective spec; there are valid arguments for both sides).

    ZombieEngineer
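
    To make the linear-versus-binary-search point above concrete, here is a minimal Python sketch (illustrative only, not the poster's simulator code); on sorted data a binary search needs on the order of log n comparisons where a linear scan needs up to n.

        # Linear scan vs. binary search on a sorted list (illustrative only).
        import bisect

        def linear_search(items, target):
            for i, value in enumerate(items):    # worst case: inspects every element
                if value == target:
                    return i
            return -1

        def binary_search(sorted_items, target):
            i = bisect.bisect_left(sorted_items, target)   # O(log n) on a sorted list
            if i < len(sorted_items) and sorted_items[i] == target:
                return i
            return -1

        data = list(range(0, 1000000, 2))        # sorted list of even numbers
        print(linear_search(data, 999998))       # 499999, after ~500,000 comparisons
        print(binary_search(data, 999998))       # 499999, after ~20 comparisons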
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Tuesday March 13, 2007 @06:38AM (#18329753)
    CS isn't dying. Academia's monopoly on CS is dying. Forging swords was an expert's job 400 years ago. Now it's a hobbyist's thing. I may not know my way around memory allocation that well anymore, simply because my last three PHP customers and I couldn't give a sh*t, but I did opcode/asm programming 20 years ago (to control single dots on my Sharp handheld's screen) and the book I need to learn C in and out again is resting on a shelf two meters away.

    CS is sort of becoming a science like philosophy. There are people who study it and earn money with it, but anyone halfway interested can join an educated discussion with them on the topic. And, on top of that, the expert's view on the topic usually is quite strange and outside of common sense. You'll find tons of Wittgenstein crackpots in academic positions simply because they dig mental masturbation as a day job. The Schopenhauer guys all have occupations that are more 'real'.

    Nobody takes a guy seriously anymore who rants about how this PL is worse than that, how Java sucks and real men use C, how PHP is for sissies and Ruby is cool. They don't even want to hear from me that Zope is still light-years ahead of Rails ;-) . People want the job done. And to move on.

    A similar case in point: nowadays nobody (not even academics) - except maybe a few people who build satellites and stuff - gives a rat's ass whether x86 sucks or not. It has won. Period. And I bet unemployed non-x86 hardware guys will tell you how crappy it is if you give them some change and a warm dinner.
    If some kid in India who's read a copy of Kernighan & Ritchie can solve my low-level problem with some Linux module that's getting in my way, I don't give a hoot whether he's an academic or not. Yet I bet he's got a similar skill set to one.

    Bottom line:
    Computers and their science have become mainstream and are slowly moving out of their steam age. Get with the program.
  • by Sobrique ( 543255 ) on Tuesday March 13, 2007 @06:41AM (#18329773) Homepage
    I'm starting to agree. I look around my 'IT office' and most of it _isn't_ CS degree level. It's helpdesk, RTFM and 'rebuild my PC' level. Now, the infrastructure development and systems architecture is still very definitely a specialist IT role, which is my current focus, but most of the people at the 'coalface' need about as much IT literacy as the guy using MS Word.
  • by majortom1981 ( 949402 ) on Tuesday March 13, 2007 @07:27AM (#18330001)
    I think it will have problems in the future. I tried being a comp sci major. Programming is hard, tedious, mind-numbingly boring stuff. Most people realise that. Unless you're into that sort of thing, nobody in their right mind would do it. That's why there will be a problem in the future.
  • by Paulrothrock ( 685079 ) on Tuesday March 13, 2007 @07:31AM (#18330023) Homepage Journal

    You wanna do research-level computing? You want to design and create brand new ways of computing? You want to work on AIs? Get a degree in CS.

    If you want to code or do networking or project management, there are plenty of other courses out there that'll give you a much better education for that sort of job.

    What happened towards the end of the dot-com boom is that people started to realize that CS wasn't exactly right for generating code monkeys, and colleges started offering different types of courses to fill these positions.

  • IBM is full of bright people (and I don't mean that sarcastically). They are very smart. The problem is they hire a lot of fresh grads and give them write access to the main repository. Then they get them to work on some ancillary functionality before moving up. Unfortunately for me, since my job was testing DB2 on different compilers, it was these ancillary functions that held me back.

    That hash template, for instance, took me about 2 months to fully work out why GCC didn't like it (it would fail at runtime); actually, to this day I don't know exactly what was wrong, I just put a workaround in its place. Worse yet, that hash template isn't even part of the majority of the runtime path; it's only called during startup/shutdown.

    DB2, for instance, is the product of thousands of developers' work, most of whom are not even working on it anymore. So one guy may start a class [or method] and another add to it. But they might not take the design in the same direction. So while the original developer may have had one implementation in mind, it ended up going in another direction, with the subsequent developers trying to wrestle what code there is into doing what they think it should.

    In the case of the hash template, I honestly think it was from a newb developer who didn't have the practical experience required to stop and think "there has to be a simpler solution." I remember when I was a newb developer (12-18 years old) I would write extremely long and complicated code for things that I could now accomplish with much more elegant code [and style]. These grads obviously didn't do much development during uni and the result is they honestly don't know how to express ideas concisely yet.

    It's like if you're learning French [or any other second language]. You might use more verbose, or awkward language to express an idea the natives have simpler language for. For example, a newb may say "Nous parlons comme tout les autres Francais," whereas a French dude may say "Nous parlons comme du monde." The latter being more accepted and easily understood.

    As for why management didn't catch it. Well several things. First, they're super busy at IBM. When I was there they were always running off to this meeting or that meeting. Second, it's not their job to sit and inspect the entire codebase (DB2 is also very large ...). Third, even if they did see shitastic code, HR wouldn't let them fire/reassign them so easily.

    They do have standards and testing suites. The problem is they're not comprehensive. At the time I started there, DB2 was only tested with ICC on x86 platforms. Even though they did support GCC (and used it on non-x86 platforms). Had they tested with GCC too, the hash template code would likely not have been accepted.

    Anyways nuff ranting...
  • by Anonymous Coward on Tuesday March 13, 2007 @08:12AM (#18330281)
    Indeed. From the post:

    'Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.' Is that quote laughable? Or has the software development industry stabilized to an off-the-self commodity?"

    The author of the post seems to have already come to the conclusion that a stabilized software development industry implies CS is dead. As a CS researcher in academia, I can tell you that off-the-shelf software is an awesome thing to have available, but it by no means implies we are done. Having easy-to-use software makes the day-to-day stuff much less of a pain than it used to be. However, unless people think Excel or their HR department's time scheduling package is going to cure cancer, build robots, understand the human brain, pre-order pizza on poker night, find new sources of oil, track identity theft, extract captions from raw images and video, improve your laptop's battery life through software efficiencies, ........

    Building accounting software is trivial for a computer scientist: the tax domain comes with a well-documented set of rules, and it is already based on numbers.

    Computer science is not dying at all; rather, it has become so important that it is rapidly spreading out to all the other sciences, where now any professor in any science looks very carefully at any grad applicant who has programming abilities. Just as math is a prerequisite for any undergrad science major, basic computer science will eventually be the same way.

    Computer Science is as dead as Calculus.

    My take on the decline of CS majors, besides the media scaring people away with fear of India, is that people are more interested in solving problems that actually matter. With the CS fundamentals solved, or deemed too difficult for all but the exceptionally high calibre (which is the current state of mathematics), people are looking to contribute to open problems where they can personally make a difference. This is a GOOD THING. Why be sad about the fact that another generation of students will not be slogging away at building web servers and the like? Instead they might help you, yourself, live longer, healthier and happier. And as I said, CS is involved in all of this, so it's not like we get just one or the other.
  • Bingo (Score:5, Interesting)

    by benhocking ( 724439 ) <benjaminhocking@nOsPAm.yahoo.com> on Tuesday March 13, 2007 @08:28AM (#18330405) Homepage Journal

    There is a lot of CS work out there. But it's science work, not programming or product development. That's not CS, that's engineering or just programming.

    Leaving aside the issue of whether there is plenty of programming or product development work still out there (I think there is), you're absolutely right. We might as well argue that physics is dead because there are so few jobs for physicists. The supply/demand ratio for physicists is quite high. However, that doesn't mean that there isn't plenty of good science left to do. (No talking about string theory here - too volatile a topic.)

    Examples of very interesting areas in computer science, besides software development, compilers, networking, programming languages, graphics, and architecture, include quantum computing, neural networks, genetic algorithms, and genetic algorithms combined with neural networks. (Perhaps I'm a wee bit biased here.) To be fair, I should also mention the tremendous growth in bioinformatics.
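
    As a flavour of one of the areas just listed, here is a tiny, hypothetical genetic-algorithm sketch in Python: a toy that evolves bitstrings toward all ones. Real GA research (let alone GAs combined with neural networks) goes far beyond this; all names and parameters here are invented for illustration.

        # Toy genetic algorithm: maximize the number of 1-bits in a bitstring.
        import random

        LENGTH, POP, GENS, MUT = 20, 30, 60, 0.02

        def fitness(bits):                        # the quantity being maximized
            return sum(bits)

        def mutate(bits):                         # flip each bit with probability MUT
            return [b ^ (random.random() < MUT) for b in bits]

        def crossover(a, b):                      # single-point crossover
            cut = random.randrange(1, LENGTH)
            return a[:cut] + b[cut:]

        population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
        for generation in range(GENS):
            population.sort(key=fitness, reverse=True)
            parents = population[:POP // 2]       # truncation selection: keep the fitter half
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(POP - len(parents))]
            population = parents + children

        print(max(fitness(p) for p in population))   # approaches LENGTH over the generations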

  • Re:Wow! (Score:3, Interesting)

    by __aavonx8281 ( 149913 ) on Tuesday March 13, 2007 @08:32AM (#18330439)
    I've heard people use this argument before, and I think it is one that students make while they're in school or shortly after they graduate. What this train of thought fails to realize is that "applied" skills can be self taught, and what separates the CS grads from the other employees who have just picked up computers on their own is their fundamental understanding of the logic and layers that actually make up the basis of the "applied" skills. The self taught hacker will only know 'how' to make stuff work, not necessarily the 'why' behind the application. Don't knock the theory and abstract math until you get out in the field and you're designing a complex, distributed system. Then all that stuff that "doesn't help" will come in very handy and will allow you the understanding to actually solve problems rather than just hacking at them until you come up with something that works.
  • Re:Wow! (Score:3, Interesting)

    by Toba82 ( 871257 ) on Tuesday March 13, 2007 @08:36AM (#18330461) Homepage
    That's basically my job. I take business processes that my employer needs and turn them into applications to save other employees time. This will never be replaced by off the shelf software.
  • by coolmoose25 ( 1057210 ) on Tuesday March 13, 2007 @09:06AM (#18330737)
    I agree with your analysis that Comp. Sci. has diverged from Software Development. I was not a CS major in school - I was a Mech. Engineer. But I got into CAD systems design, and ended up in the software game. So now I'm a "Senior Software Engineer", but I have no formal training in CS other than the fact that I had to code Fortran for 4 years in college. We learned bubble sorts and all that jazz, but I have no idea how a compiler works other than that it takes what I write in VB.NET and turns it into something the computer can execute. However, the point of the article is that BOTH CS and Software Development are dead. Just buy a package and your troubles will be over. Frankly, I wish I could have had just 1/10th of what the companies I've worked for have spent on packaged solutions. For that price, I could have built, by myself, a custom package that would do EXACTLY what they want, and I would have become rich in the process. Instead, I get paid a lot of money to cobble these disparate systems together and make AS400 green screen apps talk to fat client software; I write all the stuff that pulls them together, and if we are lucky, and I'm good, it all works... mostly. So why do companies do that? Because it simply MUST be cheaper, of course! /laughing maniacally
  • by sgml4kids ( 56151 ) on Tuesday March 13, 2007 @11:01AM (#18332439) Journal
    I agree.

    I took a 13 year hiatus between starting my CS degree and finishing it. In that time, the curriculum at my university changed so what once was a 2-part course on programming models and idioms became a 2-part course on learning the intricacies of C++. Courses on compiler design were replaced with courses on writing web applications. Instructors often short-circuited the requisite mathematics -- forget trying to understand the "why" ... just learn enough to be productive in a job. When I started my degree, I learned things that spanned many technologies. By the time I finished it, my university simply taught the technology-du-jour.

    When I think "Computer Science" I think Knuth and Shannon. It seems that for many others, "Computer Science" means Linux or C# or Balmer.

    Take testing and quality assurance, for example. It saddens me that few QA departments in the software world use statistical analysis of software, and few use the scientific method.

    I'm certain that Computer Science is still happening somewhere, but most of what I see in schools and industry is Computer Technology.
  • by dgatwood ( 11270 ) on Tuesday March 13, 2007 @02:17PM (#18336061) Homepage Journal

    Nonsense. It is cheaper to do it at universities where you can pay the researchers next to nothing even by outsourced standards. Better still, foreign universities where you can pay next to nothing even by American university standards. :-)

    Corporate environments don't tend to lend themselves to heavy research. I'm sure there are exceptions, but they are exceptions. If you want to do research, do a postdoc at a university. That sort of thing has limited potential for long-term financial stability, though, unless your end goal is to become a college professor. Generally speaking, there are plenty of research tasks to do and plenty of graduate and undergrad assistants to do them.

  • by Lord Bitman ( 95493 ) on Wednesday March 14, 2007 @03:36AM (#18344263)
    In my experience, a CS degree means no such thing :/
  • by Targon ( 17348 ) on Wednesday March 14, 2007 @08:09AM (#18345451)
    Back in the mid-to-late 1970s, computers were things that those of us growing up then only heard about in movies or TV shows. Sure, there were some people who used computers, or who had access to them, but access to computers was something that only very large corporations had, or schools, or certain government jobs (but not all). The closest most people got to a computer was a terminal (a screen with a keyboard that connected to a computer).

    The result of this is that there was something mysterious about computers. When the first personal computers became available to the general public (many will remember the Tandy/Radio Shack TRS-80 Models 1, 3, and 4, with the Model 2 being more of a business machine, and the Apple II series), these machines became the first ones available to those who didn't have enough information to build their own computers. They were fun, allowed for playing some games, and this inspired many to continue to learn how computers worked. There was also a good amount of encouragement given by teachers back in those days and into the 1980s.

    So, between having an interest in computers and technology by some, and being encouraged by others to continue learning, Computer Science grew in popularity. As time went on, and computers became more and more common in the 1980s into the 1990s, there was continued support by those in education and in general for those who showed a true interest in computers.

    So, what happened to change this SHOULD be the question being asked, not just looking around and complaining about the current situation. As technology became more and more common, the number of jobs in the sector grew until the tech crash of 2001-2002, when the real downturn in the industry started to show up. With many jobs lost, there was an excess of computer-science-trained people around.

    If you were in high school at that point and you were hearing about tech jobs being hard to get, switching focus might have seemed like a good idea. For parents and teachers, encouraging people to go into a field where the job market wasn't very good also wouldn't seem like a good idea. And so here we are, in 2007: the job market has gotten a bit better but still isn't booming. Entry-level positions are hard (or harder) to find because of outsourcing. Reports of how programmers are treated by companies (generally long, long hours with little appreciation), and the lack of control junior-level programmers have over the development process, scare people away.

    The computer industry has also transitioned from "we need programmers because there are no pre-made applications that do what we want" to having different specialized areas. Now you have networking, system administration, information technology, database administration, and other specialized areas. As a result, those with an interest in computers will select a major that fits the area they have an interest in. Why go into Computer Science if an MIS degree will get you where you want to go?

    So, the way to get students interested in Computer Science is to convince them that it is an area that still needs people, and that it's not a major for those who are going to end up as "code monkeys". To be honest, the computer industry NEEDS true computer scientists, since most applications seem to have been slapped together by people who may be able to write code but can't figure out how to design an application (which is why multi-threaded applications are the exception in the MS Windows environment).

"The four building blocks of the universe are fire, water, gravel and vinyl." -- Dave Barry

Working...