The Year in Technology

bedessen writes "It's that time again, when we look back on the year in summary. New Scientist has an article "2002, The Year in Technology", as well as "The Year in Medicine and Biology." Popular Science brings us "The 15th Annual Best of What's New.""
  • by corebreech ( 469871 ) on Saturday December 28, 2002 @04:57PM (#4973277) Journal
    That might be #1 for this decade, yes?
  • I am surprised! (Score:1, Interesting)

    This didn't mention the huge (and renewed) debate over whether 1 or 0 is a prime number.

    At the conference for applied and new mathematics in Melbourne last year there was a huge fervor over this as new evidence came to light.

    Basically, from what I gather, it's like this:

    Technically, neither 1 nor zero is a prime number. It is easiest to see why zero isn't: since a prime number is only divisible by one and itself,
    let's find all the divisors of zero.

    Well, since 0 x 1 = 0, and 0 x 2 = 0, and 0 x 3 = 0, and so on, all these numbers divide zero, i.e. zero is divisible by every
    positive integer (far more divisors than just 1 and itself). So it isn't a prime number.

    As for 1, you might want to call it a prime number, since it really _is_ divisible by only one and itself. But then you run into some problems.
    For instance, you may know that every positive integer greater than 1 can be factored into a product of prime numbers, and that there's only
    one way to do it (up to the order of the factors). For instance, 280 = 2x2x2x5x7, and that's the only way to factor 280 into prime numbers.
    But if you let 1 be a prime, then you get factorizations like 1x1x1x2x2x2x5x7, 1x2x2x2x5x7, and so on. The factorization is no longer unique.
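
    To make the uniqueness point concrete, here is a minimal trial-division factorization sketch (Python; the function name is mine,
    purely illustrative). Note that the scan starts at 2: if 1 counted as a prime, nothing would stop us from padding the result with
    as many 1s as we liked.

        def prime_factors(n):
            # Repeatedly divide out the smallest divisor >= 2.
            factors = []
            d = 2
            while d * d <= n:
                while n % d == 0:
                    factors.append(d)
                    n //= d
                d += 1
            if n > 1:
                factors.append(n)  # whatever remains is itself prime
            return factors

        print(prime_factors(280))  # [2, 2, 2, 5, 7], and only this way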

    Furthermore, there are a whole bunch of theorems in Number Theory that tell you something about prime numbers. But most of these theorems just flat out ain't true for the number 1. So in light of these facts, we just declare the number 1 to not be a prime.

    So that's why we don't WANT 1 to be a prime. Mathematicians have summarized this in a nice neat definition: a prime number is a positive integer which has exactly 2 different positive integers that divide it evenly - no more and no fewer.
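
    For the curious, a brute-force check of that definition (Python again; the function names are mine, purely illustrative):

        def divisor_count(n):
            # Count the positive divisors of n by scanning 1..n. (For n = 0 the
            # scan is empty, though in fact every positive integer divides 0;
            # either way, 0 doesn't end up with exactly two divisors.)
            return sum(1 for d in range(1, n + 1) if n % d == 0)

        def is_prime(n):
            # The definition above: exactly 2 distinct positive divisors.
            return divisor_count(n) == 2

        print([n for n in range(13) if is_prime(n)])  # [2, 3, 5, 7, 11]
        print(is_prime(0), is_prime(1))               # False False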
    • by Anonymous Coward
      ...The vast majority of the world doesn't really give a fsck. ...The final solution of whether one is prime or not will probably lead to a technological revolution the likes of which we haven't seen since the Renaissance. :p

      Physicists? Biogeneticists? Cloning? Those don't scare me.

      Mathematicians scare me. ;)
      • "Physicists? Biogeneticists? Cloning? Those don't scare me.

        Mathematicians scare me. ;)"

        if you went to school, you would realize that physicists, biogeneticists (is that a word?), and cloners ARE mathematicians on one level or another... so be afraid... be very afraid!!!
    • Isn't whether 1 or 0 is a prime number unimportant? Perhaps my math isn't advanced enough yet (Math Analysis (Trig) in high school currently). Isn't being a prime number just a label? What's the significance? It's an interesting argument anyway. I like this kind of stuff; I'm just wondering if it changes anything or is an argument for the sake of knowing.
      • Perhaps my math isn't advanced enough yet (Math Analysis (Trig) in high school currently). Isn't being a prime number just a label?

        Prime numbers are *extremely* significant in mathematics. Prime numbers are one of the most fundamental ideas in number theory. It turns out that it doesn't matter what base number system you use, or even whether you use numbers or finite sets: you will end up with some form of prime numbers (or prime cardinality).

        The idea of primality is a very basic concept in mathematics, and it lies very close to the fundamentals of logic and set theory. You mention that you are in high school, but if you are interested in reading more, I suggest reading something from an abstract algebra website or textbook. Abstract algebra is a college-level math course that could be understood easily by an interested and motivated high school student. No calculus or prior college math is necessary, although a high school geometry course might help you with the concepts of theorems and proofs.

    • by Decimal ( 154606 ) on Saturday December 28, 2002 @05:34PM (#4973370) Homepage Journal
      So that's why we don't WANT 1 to be a prime.

      I don't want 11 to be prime, either. Would you mind doing some of that math work and fixing this, please?
    • mod parent DOWN (Score:4, Informative)

      by $carab ( 464226 ) on Saturday December 28, 2002 @08:15PM (#4973861) Journal
      You've got to be kidding me. This comment should not have been modded up in the first place.

      The message text is at:
      http://mathforum.org/library/drmath/view/58723.html [mathforum.org]

      I remember having a distinctly similar conversation with my third grade teacher when we learned about things like long division and prime numbers.
  • Coolest one (Score:5, Interesting)

    by core plexus ( 599119 ) on Saturday December 28, 2002 @05:18PM (#4973319) Homepage
    "The endlessly versatile carbon nanotube was then shown also to have an explosive side [newscientist.com] in April. A laboratory accident revealed that a bundle of carbon nanotubes will explode when exposed to an ordinary camera flash." Just in time for New Years!
  • by Yoda2 ( 522522 ) on Saturday December 28, 2002 @05:32PM (#4973362)
    I taught computers to learn nouns and verbs based on visual perception this year. See here [greatmindsworking.com] for more info.
    • "Another thing that got forgotten was the fact that against all probability a sperm whale had suddenly been called into existence several miles above the surface of an alien planet.
      And since this is not a naturally tenable position for a whale, this poor innocent creature had very little time to come to terms with its identity as a whale before it then had to come to terms with not being a whale any more.
      This is a complete record of its thoughts from the moment it began its life till the moment it ended it.
      Ah...! What's happening? it thought.
      Er, excuse me, who am I?
      Hello?
      Why am I here? What's my purpose in life?
      What do I mean by who am I?
      Calm down, get a grip now...oh! this is an interesting sensation, what is it? It's a sort of...yawning, tingling sensation in my...my...well, I suppose I'd better start finding names for things if I want to make any headway in what for the sake of what I shall call an argument I shall call the world, so let's call it my stomach.
      Good. Ooooh, it's getting quite strong. And hey, what about this whistling roaring sound going past what I'm suddenly going to call my head? Perhaps I can call that...wind! Is that a good name? It'll do...perhaps I can find a better name for it later when I've found out what it's for. It must be something very important because there certainly seems to be a hell of a lot of it. Hey! What's this thing? This...let's call it a tail - yeah, tail. Hey! I can really thrash it about pretty good, can't I? Wow! Wow! That feels great! Doesn't seem to achieve very much but I'll probably find out what it's for later on. Now, have I built up any coherent picture of things yet?
      No.
      Never mind, hey, this is really exciting, so much to find out about, so much to look forward to, I'm quite dizzy with anticipation...
      Or is it the wind?
      There really is a lot of that now, isn't there?
      And wow! Hey! What's this thing suddenly coming toward me very fast? Very, very fast. So big and flat and round, it needs a big wide-sounding name like...ow...ound...round...ground! That's it! That's a good name--ground!
      I wonder if it will be friends with me?

      And the rest, after a sudden wet thud, was silence.

      Curiously enough, the only thing that went through the mind of the bowl of petunias as it fell was Oh no, not again. Many people have speculated that if we knew exactly why the bowl of petunias had thought that we would know a lot more about the nature of the Universe than we do now."

      Seriously though... I'm downloading your dissertation now.
  • by SHEENmaster ( 581283 ) <travis&utk,edu> on Saturday December 28, 2002 @05:35PM (#4973373) Homepage Journal
    They can never have too much coffee (caffeine is only good in some professions), and if they run Windows you have an easy-to-win malpractice suit that benefits you as well as the open source community!

    True, I didn't RTFA but that's what posts are for!
  • Purely from computing's perspective, for me it's the arrival of hyperthreading on desktops. Next year, hyperthreading plus an increased FSB (800 MHz!) and more memory bandwidth will truly obsolete a lot of computers.
  • Anyone else catch this [newscientist.com] from the New Scientist article?
    My word! They make bikinis that size?
  • by njdj ( 458173 ) on Saturday December 28, 2002 @06:01PM (#4973449)
    Sadly, the article on technology describes nothing that can really be described as a breakthrough. There were some more little steps towards quantum computing, but this journey did not start in 2002 and certainly did not reach fruition in 2002.

    An honest title for the article would have been "No technology breakthroughs in 2002", but that wouldn't have sold any magazines ...
  • Heh, just seems like the usual happened. Things were improved, more ideas were thought up, transistors got smaller, Microsoft got dumber, and Linux got better. It doesn't seem like anything really remarkable happened this year that would count as a breakthrough. Next year, maybe?
  • I agree with the editors at Popular Science, the IBM product is COOL. The modular computing system: at work it's a desktop; cruising around, the core stays with you as a PDA or wearable rig; once home, slide it into the laptop. This is a GREAT idea, although the price is medium sucky. I hope the concept catches on and more companies provide similar "modular computing" platforms. Reminds me (same concept) of what I got for Christmas from my girlfriend: a Black & Decker 12-volt cordless multitool [blackanddecker.com], quite a nifty gadget. It has a common battery and electric motor, but you can replace the head for a drill/driver, a jig saw, or a sander. Slickness.
  • Amazing. The news is, first of all, that a number of ethical cretins are trying to clone a human baby; next, that proposed treatments for serious diseases have failed disastrously; next, that a couple of nasty diseases turn out to be more dangerous than expected; and finally, that research continues in areas that cannot be expected to produce anything to ease suffering for years, if not decades. Where is the good news?

    Next look at the smartass, off-topic, smutty reactions of lots of /. posters. Ye gods....!! If this is an indication of how the public reacts to questions of health and science, we are in for a rough century.

    It seems IMHO time to question seriously the basic approach the scientific community is taking toward biomedical research. What, exactly, is the cost/benefit ratio these days?

    Further, how sensible is it to buy into the article of faith that all we have to do is continue to pour billions into basic research, expecting that sooner or later we will all lead longer and better lives as a result? It could be that we are wasting tons of money. It would be an excellent idea to re-examine how we allocate scarce resources in the pursuit of knowledge. I'm not a Luddite, but I am very disappointed that our progress has been so slow. Consider, for example, when you last saw any statistics showing how much money has been spent researching cancer (both on basic research and in the development of clinical tools), and how the survival rates for the disease have changed over the last half-century. I think you don't see these figures because they are grim, indeed.

    Maybe thirty years ago a physician told me that childhood leukemia was "almost not fatal any more." Where is it today? "Not fatal?" Are we chasing a will-o'-the-wisp, or have we really got a grip on where we want to go, and how to get there? How uncoordinated and goofy are our efforts? Should we not be further along by now??

    My argument is not against science, basic research, or knowledge. It is simply that it would be better--more efficient--if we spent our money more wisely, that is, according to rational plans that consider results and costs when deciding where to put our efforts. Are we in this to learn things, or to save human life? Can we do both? Sometimes it appears that there IS a very real difference between two camps: one pushing for more labs and money for whatever it wants to pursue, the other genuinely concerned with saving lives. Consider:

    Long ago the Nixon administration tried to shift funds to the implementation of widespread early detection programs, in the sure knowledge that certain cancers (not all) can be cured if detected when small. The scientific community howled like a stuck pig. Sure, Nixon was a jerk, but his priority was the saving of lives, now. As a result of intensive lobbying, the early detection approach was scrapped, and who knows how many lives have been lost because of that? I could not criticize this if it could be shown that pressing on with expensive basic research had saved even an equivalent number of lives, but I am sure no such result was obtained. Those who argued against Nixon's approach were willing to sacrifice hundreds of thousands, perhaps millions, of human beings in order to be able to carry on programs whose results could only be speculated about. I do not call that ethical--it seems more like selfishness, and inhumane selfishness at that.

    We ought to have another look, ask some hard questions, and consider whether the scientific establishment has taken the bit in its teeth. Poor results for 2002 are a hint that I might be right.
    • I agree with you, and I think you can push your argument even further. I think the cost/benefit aspect of money in *all of research* could be reconsidered. How has research in computer science, physics, biotechnologies, [insert your favorite hardcore scientific discipline here], improved life in general? How do their results contribute to our general happiness (and I mean happiness, not comfort)? Are we living in more peace? Have we reduced poverty? Are we more grateful for what we got? More confident in our future? Is our planet in a better shape?

      Simply put: what are the 2002 breakthroughs in "happiness science"?

      Maybe the solutions will not come from hardcore science but from other disciplines: economics, philosophy, history, *gasp* the arts... I don't know which discipline; I probably can't even name or describe them properly, as I'm just another techie nerd... I'm just speculating on what the world would be like if all the priorities in research budgets were turned upside down. Think of the department of philosophy/arts/economics in your university getting all the research grants and the brilliant students; think of what state of progress such disciplines would be in if they had the same publication rate as, say, software engineering has now.

      I am not saying that allocating more money to social sciences would necessarily equate to getting the same results. But maybe just focusing more on such issues would contribute to a better awareness of the problems and questions that *really* matter.

      End of utopian rant!

  • The world is your exercise-book, the pages on which you do your sums.
    It is not reality, although you can express reality there if you wish.
    You are also free to write nonsense, or lies, or to tear the pages.
    -- Messiah's Handbook : Reminders for the Advanced Soul

    - this post brought to you by the Automated Last Post Generator...
