The Internet, Science

Nicholas Carr Foresees Brains Optimized For Browsing 110

An anonymous reader writes "In the next decade, our brains are going to become optimized for information browsing, says best-selling author Nicholas Carr. According to Carr, while the genetic nature of our brains isn't being changed by the Internet at all, our brains are adapting 'at a cellular level' and are weakening modes of thinking we no longer exercise. Therefore, in 10 years, if human beings are using the Internet even more than they do today, says Carr, 'our brains will be even more optimized for information browsing, skimming and scanning, and multitasking — fast, scattered modes of thought — and even less capable of the kinds of more attentive, contemplative thinking that the net discourages.'" While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • re (Score:2, Interesting)

    by Anonymous Coward

    Television and the automobile, certainly. However, it seems arguable that books encourage attentive, contemplative thinking. The automobile can be a bit fuzzier - but certainly highway driving requires extreme amounts of attention. City driving isn't usually done for long stretches - unless it's stop and go, in which case nothing is happening to make it require much brain exercise.

    Also, how does this make the argument seem weak? I'm sure there's a large body of work arguing the same is indeed true of televi

    • Re:re (Score:5, Informative)

      by chrb ( 1083577 ) on Friday May 11, 2012 @07:15PM (#39974887)

      City driving isn't usually done for long stretches - unless it's stop and go, in which case nothing is happening to make it require much brain exercise.

      Route planning and navigating through a complex urban environment can require more thought than driving along a relatively straight highway. MRI scans on taxi drivers have shown actual physical brain changes from learning complex urban maps. [bbc.co.uk]

      • Re:re (Score:4, Insightful)

        by TheCarp ( 96830 ) <sjc AT carpanet DOT net> on Friday May 11, 2012 @08:38PM (#39975423) Homepage

        Isn't that to be expected? Any time you look for physical brain changes from years of practiced learning, you find it. That is just what the brain does.

        Also, cabbies are a special case; most people drive the same repetitive routes over and over. Route planning is hardly needed after you have settled into one or two ways of getting to work and home.

        • Isn't that to be expected? Any time you look for physical brain changes from years of practiced learning, you find it.

          No, actually, that was the interesting part of that study. It was the first to show a part of the brain growing as a result of an activity. As recently as 2000, many neurologists argued (fiercely) that the brain is static after adulthood.

          In general it is difficult to tell whether someone has an enlarged brain region because they are good at something, or whether they are good at it because they have an enlarged brain region. There is still a small possibility of the latter even with the cabbies, because they self

      • by Anonymous Coward

        While I agree with everything you say, I have to be a pedant and suggest that all mental activity causes "actual physical brain changes". The study is remarkable because these changes are measurable with our imprecise and noninvasive tools. A parallel would be trying to perceive evidence of human activity on Earth from orbit with the naked eye: not being able to see evidence would in no way show that human activity wasn't there.
        In the case of the human mind one would assume that evidence of the thought also functions as

      • Re: (Score:2, Redundant)

        by steelfood ( 895457 )

        Highway driving requires the most attention at high speeds, or while weaving through heavy (but moving) traffic. Driving unfamiliar or less familiar routes (like going to a shopping mall 50 miles away that you'd normally visit only twice a year) also requires the same navigation abilities as city driving, but at a larger physical scale.

        Map reading and route memorization helps maintain short term memory. This exercise is rendered moot by using a GPS. It also doesn't come into play when driving a familiar rout

        • [snip]Now television...that's the biggest intellect-killer out there. At least there's interaction when browsing or surfing the internet (for example, posting on /.), albeit minimal.[snip]

          tl;dr :-)

    • Re:re (Score:5, Insightful)

      by mellon ( 7048 ) on Friday May 11, 2012 @08:19PM (#39975309) Homepage

      Plus, Lamarckian evolution involves inheritance, whereas the author is talking about learned/conditioned behavior in individuals. The brain is plastic. It very definitely does adapt to do well whatever you do often.

      In my experience, highway driving is great for contemplation. City driving not so much. YMMV... :)

      • I believe the main reason we don't have more female programmers is not a question of talent, but early exposure and the belief they can be successful in that area. If you don't think about something, "that part" of the brain will end up being used for something else instead.
    • However, it seems arguable that books encourage attentive, contemplative thinking.

      And a long attention span, and the ability to grasp complex subjects, and so on. If people aren't able to read books anymore, that really sucks.

    • by drsmithy ( 35869 )

      The automobile can be a bit fuzzier - but certainly highway driving requires extreme amounts of attention. City driving isn't usually done for long stretches - unless it's stop and go, in which case nothing is happening to make it require much brain exercise.

      This sounds backwards. City driving has a lot more hazards and variations that need to be tracked simultaneously. Trundling along on a highway at a pretty constant speed (probably using cruise control, at that) isn't especially taxing.

      I would be inter

      • That's what I was thinking when I first read that myself. Driving in the city requires a lot more focus on what's going on, not only on the road but with pedestrians, cyclists...you don't have to worry about that shit on the highway.

        Highway hypnosis [wikipedia.org], for example, seems to point to highway driving being a lot less taxing on the brain. I know that I've never had any sort of road hypnosis while trying to get around town, but on long road-trips, there's definitely been stretches where I 'zoned out' and drove

  • I want to see brains optimized for gopher and emacs.
  • If brains become web browsers, does that mean we'll need antivirus injections, javascript bandages, and be careful what cookies we eat?
  • by RavenousBlack ( 1003258 ) on Friday May 11, 2012 @07:07PM (#39974823)
    Do something more often and your brain will become optimized for it. I think they call it learning.
    • by hughJ ( 1343331 )
      neuroplasticity
      • This.

        Learning is merely the acquisition of knowledge. This includes acquiring the knowledge of new methods of thought, or new ways to think. But actually thinking, and rewiring the brain to think in a certain way, is completely different.

        You can teach knowledge, but you can't teach people how to think. They either do it or they don't. You can show them many methods to think, but you can't force them to think in a particular way.

    • by Dahamma ( 304068 )

      And the inverse is usually true as well: don't do something at all for a long time and you tend to forget how to do it - like calculus :)

      • by elucido ( 870205 )

        And the inverse is usually true as well: don't do something at all for a long time and you tend to forget how to do it - like calculus :)

        So you'll forget how to read if you don't read the entire book cover to cover?

      • And the inverse is usually true as well: don't do something at all for a long time and you tend to forget how to do it - like calculus :)

        The proverb about riding a bike seems to suggest the opposite. Sure, you get rusty, but it doesn't take anywhere near as long to get back up to speed as it did to learn it in the first place. Seems the memory doesn't disappear but goes dormant.

        But maybe calculus is different...

  • weak analogy (Score:5, Insightful)

    by tverbeek ( 457094 ) on Friday May 11, 2012 @07:07PM (#39974827) Homepage

    While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?

    The counterargument here seems weak to me; books, television, and the automobile aren't the same as the web, so the learned change wouldn't be of the same kinds.

    • by Dahamma ( 304068 ) on Friday May 11, 2012 @07:35PM (#39975025)

      The counterargument here seems weak to me

      Yeah, that's because the original article was written by a best-selling, Pulitzer Prize-nominated author, and the counterargument was written by timothy.

    • the automobile aren't the same as the web, so the learned change wouldn't be of the same kind

      Indeed [wired.com]. Probably the impacts from web use will be greater than those from some other sources because an increasing number of young children, with their highly plastic brains, are spending time accessing the Internet. Mostly it is adults who, say, drive an automobile.

    • by mcrbids ( 148650 )

      I don't think that the counterargument is weak at all because Oh look! Another awesome cat video! You gotta check this out! [youtube.com] And that means that what were we talking about again?

  • by epyT-R ( 613989 ) on Friday May 11, 2012 @07:08PM (#39974829)

    Oh well. I guess somethingawful was right all along! Now I must research this by finding blogs that agree with my bias...
    ---
    The internet is a tool. Like any tool, it can have positive and negative effects on the user, remembering that positive and negative are relative terms.

    • The 'net is a lot like television. Watching smart shows makes you smarter; watching dumb shows entertains you but makes you stupider.
      • by epyT-R ( 613989 )

        hmmm.. well I think the determination between 'smart' content and 'stupid' content relates to the reasons why it's being looked up. TV is a bit different as it's completely passive, and all programming is from established outlets who depend on ubiquitous eyeballs. In order to do that, it must appeal to the largest demographic, demanding insipid melodrama and whitewashed, politically correct truth, even in things like documentaries and news broadcasts. In contrast, it is still possible to find interesting

  • Habit != evolution (Score:4, Interesting)

    by Spy Handler ( 822350 ) on Friday May 11, 2012 @07:09PM (#39974837) Homepage Journal

    Evolution requires death, selective pressure, serious things like that, and takes place over generations. This ain't evolution. It's just people getting into a habit.

    Yes it probably does change our brains on a cellular level, just like the recent habit of no hard physical labor changed our muscles on a cellular level. It's easily reversible simply by doing the old things again.

    • Changes in cellular physiology are not fully reversible. If you revert to an active lifestyle after a decade of being sedentary, you can grow new muscle tissue, but it'll probably never be as healthy as what you had before. I can run and swim and lift as much as I have time for (and I am), but I'll never again be as fit as I was when I was 20. Likewise, there's no turning my brain back to the condition it was in back then, either (which is both a good thing and a bad thing).

        • Being non-reversible doesn't mean it is an evolutive treat. To be claimed as an evolutive treat, it must be encoded into the genes and passed on to the next generations. So the original argument is really poor and weak. It seems that guy just doesn't know what evolution is and is throwing this out for fame.
        • Oops, I meant trait instead of treat. BTW, no matter how short you cut your leg, even if it's irreversible, your newborn won't be born with one leg.
        • it must be encoded into the genes and passed to the next generations

          That's true, but those changes can probably be acquired in other ways than mutation and crossing over alone. There is a whole field [wikipedia.org] about it.

          If you include traits like infection with Wolbachia [wikipedia.org] in insects (which has far-reaching consequences including parthenogenesis in species which otherwise need sexual reproduction), there is actually a fair amount of Lamarckism in the natural world.

          • by tverbeek ( 457094 ) on Friday May 11, 2012 @08:51PM (#39975515) Homepage

            Describing evolution strictly in terms of DNA isn't exactly "wrong"... but it's comparable to describing astronomy strictly in terms of Newtonian physics: perfectly good most of the time, but there are "edge" cases (such as objects approaching the speed of light, or certain species of intelligent primate with advanced communication skills) where it doesn't quite explain what's happening. To fully understand and explain hominid evolution, you also need to look at the linguistic/educational channel through which certain non-genetic traits are passed from generation to generation.

        • Genes are not the only way we pass things on to later generations. We also do it through language. Genetics is just the "hardware" side of the system; humans also have developed a way of passing on behaviors and skills through "software", which we load into our offspring after they come off the assembly line. A great deal of what makes us the kinds of animals we are is implemented in software, not hardware. That ability to evolve in ways beyond mere genetic mutation is how we've become one of the most s

      • by elucido ( 870205 )

        Changes in cellular physiology are not fully reversible. If you revert to an active lifestyle after a decade of being sedentary, you can grow new muscle tissue, but it'll probably never be as healthy as what you had before. I can run and swim and lift as much as I have time for (and I am), but I'll never again be as fit as I was when I was 20. Likewise, there's no turning my brain back to the condition it was in back then, either (which is both a good thing and a bad thing).

        That is just aging. That has nothing to do with how much you work out. You could be the athlete of the century and by age 30 you won't be like you were at age 20.

        • But that was my point: time's arrow only points one way. It isn't even about what we consider "aging"; even if the change happens on a short biological timescale, such that you're still about the same age before and after, it's still not fully reversible.

    • Also, it seems a bit narrow to insist that "evolution" be defined only in terms of genetic inheritance. The ability of a sufficiently intelligent species to not only learn new behaviors but also teach them to their offspring is – in effect – a persistent change in that species. We didn't become a species of arithmetic-performing apes through natural selection of genetic material, but by passing on that skill through teaching. Furthermore, a species which is capable of (more or less permanent

    • by jo42 ( 227475 )

      Welcome to Dumbtards'R'Us - even dumber than before!

      Idiocracy wasn't a comedy, it was a documentary sent back from the future...

    • by Ichijo ( 607641 )

      Evolution requires death...

      No, evolution doesn't require death, but death assists evolution by preventing some individuals from procreating.

      ...selective pressure, serious things like that..

      Developing a skill that's in demand by society gives the individual a greater chance at passing on his/her genes, and that's evolution.

      • Developing a skill that's in demand by society gives the individual a greater chance at passing on his/her genes, and that's evolution.

        Developing a skill that's in demand by society also gives the individual a greater chance at passing on that skill through education... and not just to his offspring, but to others' offspring! Isn't that a form of evolution as well? It's a well-established principle that we're the product of both nature and nurture... why look at evolution solely in terms of one (DNA), and

  • Caveman0: I am draw story on cave wall!

    Caveman1: No! Memorize oral tradition make brains strong! Picture story make brains weak!

    [panel of cave-children staring vacantly at cave paintings, slack-jawed, drooling]

    Caveman0: Me go too far! Me am play gods!

    In case you don't know the meme, original source: http://dresdencodak.com/2009/09/22/caveman-science-fiction/ [dresdencodak.com]

    steveha

  • Not a weak argument (Score:5, Interesting)

    by artor3 ( 1344997 ) on Friday May 11, 2012 @07:17PM (#39974901)

    While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?

    Yes, the same kinds of changes could be attributed to the things you named. Which is likely why people who grew up with black and white television dreamed in black and white. Our brains are absolutely affected at a deep level by the things we spend our time on. It seems almost trivially obvious to say so. The real question is whether or not this is a bad thing. Yes, our modes of thinking may become dependent on "browsing" -- on having a ready cache of facts and trivia that don't need to be stored in gray matter. But if it is the case that browsing is indeed always available, might that not be a good thing? Couldn't that free up resources, currently devoted to memorizing state capitals, that could be better spent on higher level reasoning? Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

    • Which is likely why people who grew up with black and white television dreamed in black and white.

      When I grew up there were both black and white and color TVs. So did I dream in color or B&W? If I buy an HDTV will I start to dream in 1080p?

      Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

      For instance? I didn't realize that mathematics had changed all that much since the invention of calculators.

      • by bondsbw ( 888959 )

        For instance? I didn't realize that mathematics had changed all that much since the invention of calculators.

        No claim was made that the "more interesting topics" are new. They are perhaps just more advanced, like skipping some of the months of long division to focus earlier on algebra.

        It bothers me when a fifth grader knows more about some subject than I do. But the problem isn't that I feel dumb... it's that I know that the fifth grader will one day be in my shoes, and ask, "Why was I put through years of this? I can't remember any of this because I never need it... and I want those years of my life back."

        Comp

        • It bothers me when a fifth grader knows more about some subject than I do. But the problem isn't that I feel dumb... it's that I know that the fifth grader will one day be in my shoes, and ask, "Why was I put through years of this? I can't remember any of this because I never need it... and I want those years of my life back."

          Well, I suppose one could apply that to anything we learn in schools today, but personally, I abhor the idea of only teaching what is directly useful to everyone in their day-to-day lives. I certainly don't use all the history I was taught in day-to-day life, but I cannot even imagine how much more dull my life would be had I never been exposed to it in the first place.

          That kind of attitude, the "we must only teach what can be used in a potential career" idea, is going to result in generations of kids grow

    • We teach math for a reason. If you go into a store and want to pay your bill, you have to know how to count so you can present the correct amount of money, and so you can check that your change is correct also.

      It's not strictly been necessary to know how to do that for about two generations now, people *could* just carry an electronic pocket calculator and do the sums on demand at the store counter, but nobody does. It's stupid, and using the counting machinery in your head is much better and more practi

    • Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

      Did people really memorize logarithm tables? I always just looked them up in the back of the book......

    • Math classes can certainly teach more interesting topics now that calculators have obviated the need to memorize logarithm tables.

      Eh, no. Memorizing log tables is not relevant to math. It isn't even a part of math. You can leave your answers in log form without losing points. Math is more abstract than the final number at the end. If you've ever done algorithm analysis (big O notation), that's closer to what math is than figuring out the log, sin, or cos of some non-trivial number.

      And because of this, calculators don't help. They do the exact opposite. I mentioned in an earlier post how by using a GPS, you lose the need to exercise a cer

  • Doing something because it had practical benefit (or is even a necessity) does not mean it's optimal. Certainly neural pathways that are unused may atrophy, and repetition will make us better at any activity mental or physical, but I'm not sure that's really something I would call optimization.

  • I only skimmed the article though. Back to facebook.

  • We will end up with no brain at all.

    WTF!!! Don't you have something about an elixir of immortality or a time-travel machine? I mean, something real.

  • While Carr isn't making a case for Lamarckian evolution, the argument here seems weak to me; the same kind of brain change could be attributed to books, or television, or the automobile, couldn't it?

    Do you doubt that has occurred? Not to mention video games, urbanization, industrialization, birth control, etc.

  • by codeAlDente ( 1643257 ) on Friday May 11, 2012 @07:53PM (#39975143)
    Whether humanity's method of information gathering is books and TV, the Internet, or (Heaven forbid) interpersonal interaction, we'll all do it in some combination of long and short intervals. The Internet makes it possible to do both the high-frequency information gathering described here and low-frequency contemplative activities such as gaming sessions, /. articles, and reading science papers. It's a lot easier now to learn about a single, narrow topic in depth than it has ever been in the past. Science has become more specialized. Less time spent searching for facts means more time to spend contemplating your favorite scientific issue. Consequently, given a period of time and a problem of a given complexity, scientists can now analyze an issue or solve a problem in greater detail and with better efficiency. Contemplate that, Carr.
  • Come on, guys (Score:4, Insightful)

    by girlintraining ( 1395911 ) on Friday May 11, 2012 @07:58PM (#39975165)
    A best-selling author is apparently equal in credentials to a PhD in neurology? Really, Slashdot? Where's the evidence? Brain scans? Double-blind tests? Who was the control? What's the confidence rating? He's practicing pop psychology -- and he's even less credible than Dr. Phil. No evidence of any kind, and he's making extraordinary claims about a field he has no formal training in. If this were a story about someone's evidence disproving evolution, Slashdot readers would be tearing the author limb from limb -- this guy's making claims that belong in the same bucket. Why are you wasting your time with this crackpot theory? You're supposed to be scientists, technology experts, and engineers -- act like it. Demand proof.
  • The brain remakes itself constantly, in weeks, not decades. ANYTHING you do repetitively becomes a source of new pathways. So in our case, our brains are already optimized to Bear Party bootlegs and Pizza Bites.

  • Just because the brain develops a new mode of operation it doesn't mean the brain forgets the old mode.
    Did you forget how to ride a bike? Forget how to swim? Forget how to play chess or play video games? Forget how to read?

    Just because you learn to speed read doesn't mean you forgot how to read. And just because you pay close attention and don't use the internet doesn't mean you've developed the ability to reason. The ability to reason comes from discriminating between good and bad information. Also what

  • Early exposure could hijack parts of the brain that were meant for other things. It's the same reason why all polyglots are airheads. Just limit your kids to an hour or two a day and they'll be normal.

  • Flow concentration (Score:4, Interesting)

    by sandytaru ( 1158959 ) on Friday May 11, 2012 @08:17PM (#39975297) Journal
    Browsing concentration is sort of the opposite of "flow" concentration. Flow concentration is engaged when you're doing something engrossing - painting, writing, sudoku, coding. As long as we balance out our browsing habit with artistic and creative pursuits, we'll be fine.
  • I'm pretty sure the data shows that if our brains are "optimized" for anything on the Internet, it's the pornography first and foremost.

  • by __aaltlg1547 ( 2541114 ) on Friday May 11, 2012 @08:32PM (#39975383)

    No, it would be Lamarckian if he claimed you could inherit this characteristic. It has long been known that the brain's wiring depends on how you use it.

    The thing about this claim is that browsing the web a lot is sufficiently unlike how people have used their brains in the past that it will cause a noticeable difference in how we gather, understand, and analyze information.

    Myself, I don't think it's all that different from what humans have done for a million years. When I was a kid, there was no such thing as the web. We gathered information by consulting various sources: television, books in the library, magazines, talking to other people, and observing things for ourselves. Occasionally, we set out to break new ground and find out things that nobody knew, or that we didn't know somebody knew. People still gather information in all those ways, but now they use the web too. The kinds of information a web search dredges up are the same kinds of things we used to draw information from 30 years ago: magazine articles, advertisements, videos (analogous to television programs), blogs (which are journals), and scholarly articles. Although you get the information to your eyes a lot faster, you can't absorb it any faster with the web. And the quality of the information is probably worse, because it's so damn cheap to put up a blog full of bullshit, unchecked facts, and misunderstood information. It's left to the searcher to decide what information is relevant, which of conflicting information sources are more accurate or reliable, etc. This is the same problem people have always had.

    The activity of creating new information -- original research or analysis -- was never easy, and there were never good tools available to most people to help with it. Now, at least there are computers that can assist you in analyzing large volumes of information or carrying out calculations too daunting to do by hand.

    So all said, I doubt it will have much effect. People will still need to analyze data, but that has always been an activity for a few who were especially good at it. The rest of us can browse away, just like our apelike ancestors did 4 million years ago. (I bet they said Google, too.)

  • by Anonymous Coward

    The guy is an editor for Encyclopedia Britannica, a publication which is likely being endangered by the free flow of information on the internet (wikipedia, we're looking at you!) and he's making arguments against it. What. A. Shock.

    Severe conflict of interest here.

  • Except... (Score:4, Interesting)

    by kuzb ( 724081 ) on Friday May 11, 2012 @08:52PM (#39975525)

    This is like Sony getting up and telling us all that the Xbox is making us stupid.

    You have to question the source here. The guy is an editor for Encyclopedia Britannica - something that is likely being hurt by free information publications such as Wikipedia.

  • by tsotha ( 720379 )
    I read the first part of the summary, but the paragraph was really too long. I find these days I'm prone to starting one thing and then
  • ... TFA, but got a warning that the page requires JavaScript and Masochism be enabled.

  • When people had only black-and-white TVs, people had (and some still do have) all their dreams in black and white, while the newer generation of people raised on color TVs dream in color. So yes, our brains did change from TV. So saying we are becoming better at storing a lot more, while less detailed, info is not so far-fetched.
  • This seems redundant to me, since the way in which we find relevant answers from a vast source of information such as the internet needs to (and will) change considerably in the near future so that we no longer scan large volumes of information and search results.

    If you consider that in terms of efficiency of getting 'an answer from a question' we currently:

    Have Question -> get vast amounts of information from intertubes -> sort through information -> hopefully get answer.

    But this is stupid. We can

  • "Brains Optimized for Browsing" can only mean one thing: zombies optimized for browsing.

    (Talk about a case of "soft inheritance" [wikipedia.org]...)

  • When movies were first made, they were single shots. A train approaching a station, something like that. Audiences oohed and ahhed.

    But the first time a cut was introduced, the audience was completely flummoxed. They had no idea what they were seeing. It's hard to believe that now, but these days we've probably seen 100,000 cuts by the time we are 5, and our brains are rewired to accept it.

    • by Animats ( 122034 ) on Saturday May 12, 2012 @12:33AM (#39976753) Homepage

      But, the first time a cut was introduced, the audience was completely flummoxed.

      More than that. The average shot length in movies [cinemetrics.lv] has been decreasing over the years. There are up and down trends; 1971 had longer shots than 1974. But shot lengths today average around 2 seconds. The Bourne Ultimatum has a mean shot length of 800 ms, which is the current record. MTV got people used to that rate of cuts.
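
The "mean shot length" statistic above is just total runtime divided by the number of shots; as a quick illustration (the cut timestamps below are hypothetical, not from the Cinemetrics data):

```python
# Toy sketch of computing mean shot length from cut timestamps.
# The timestamps are made up for illustration; real data would come
# from a source like Cinemetrics.
cut_times = [0.0, 1.8, 2.5, 4.1, 4.9, 6.2]  # seconds at which each cut occurs

# Each shot's length is the gap between consecutive cuts.
shot_lengths = [b - a for a, b in zip(cut_times, cut_times[1:])]
mean_shot_length = sum(shot_lengths) / len(shot_lengths)
print(round(mean_shot_length, 2))  # 1.24
```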

      Another thing that people have learned to tolerate is the demise of editorial geography. The best way to explain editorial geography is this (which I'm quoting from memory): "Bogart gets a phone call. He hangs up the phone. He puts on his coat. He opens his door and walks out. He walks down the front steps. He hails a cab. He gets in the cab and the cab drives away. We see a shot of him inside the cab. The cab stops in front of a building. Bogart gets out. He looks up at the tall building. We're shown the building. He walks into the lobby. He pushes the elevator button. He looks up at the elevator indicator. We're shown the elevator indicator moving down. The elevator doors open. Bogart gets in. We're shown the elevator indicator moving up. On another floor, we see the elevator doors open. Bogart gets out and walks down the hall. He knocks on a door, and Lauren Bacall opens the door. Bogart walks through the door into the apartment." Today, we'd see the phone call, and in the next scene, he'd be in the apartment.

      • But shot lengths today average around 2 seconds.

        I attribute that to sheer lack of technical skill. There are few if any left in the industry with the skill on both sides of the lens to carry off the long scenes that the old movies had. Part of that was the difficulty of the old lenses and of manual editing. But today's actors, directors, and cameramen just can't pull off the basics any more.

        I can follow the 2 second shots, but actively dislike it. It's too much like following a bunch of stills a

        • It's not lack of talent, it's just aesthetics and marketing. Look at guys like Robert Elswit and Roger Deakins. They're still doing work that rivals anything in old movies, but if you put a film out with long takes and a deliberate pace, everyone complains that it's "slow" and it automatically gets put in the arthouse category. These films just don't sell that well anymore.

          I'm very intrigued to see what style Elswit brings to the new Bourne movie. I enjoyed the last one on TV, but I sat too close to the
  • I didn't RTFA. I accept that brains adapt to the activities they do, and I understand that that's not the same thing as evolution, so this is not a claim of 'lamarckian' evolution.

    My conjecture is that while the brain of a passive browser/lurker may develop one way, that of someone who also posts to the net might develop in another way. Conversations may be discussions of various issues, and interactions on the net could be likened to that but with more time to think about what you're about to say. Feedb

  • Excuse me, but isn't this just a rehash of what McLuhan already stated some fifty years ago?
    Paai

  • by Anonymous Coward

    When I was a teenager, my brain became optimized for looking at pictures of girls' tits and for jerking off... anything you do, that represents a new behavior changes the structure of your brain, that's how the fuck you learn new behavior. New pathways are formed, or old ones are altered or destroyed... or the attributes of the pathways might change, i.e., activation thresholds could raise or lower, reuptake of neurotransmitters could be impacted, but in any case, any time a new memory of any kind forms, b

  • Which is only able to perceive either neocon Romney, or neocon Obama (or neocon Bush or neocon Clintons, etc.).

    Reach for the dream ---- vote Green.

    Dr. Jill Stein in 2012.
