
Researchers Develop Surveillance System That Can Watch & Predict

hypnosec writes "Carnegie Mellon University researchers have developed a surveillance system that can not only recognize human activities but can also predict what might happen next. Scientists, through the Army-funded research dubbed Mind's Eye, have created intelligent software that recognizes human activities in video and can predict what might just happen next; sounding an alarm if it detects anomalous behavior."
  • Next up.... (Score:5, Insightful)

    by CimmerianX ( 2478270 ) on Sunday October 28, 2012 @11:39AM (#41797147)
    Thought Police Alpha Version .501 right here. Arrest him!!!! Our system assumed he would shoot somebody.
    • Re:Next up.... (Score:4, Insightful)

      by durrr ( 1316311 ) on Sunday October 28, 2012 @11:45AM (#41797201)

This system will look hilariously judgmental when someone with ataxia, or just a plain limp, trips the alarm every damn time he walks past. Or when some poor person with social anxiety is constantly harassed until he refuses to go out anymore.

    • Re:Next up.... (Score:5, Insightful)

      by betterunixthanunix ( 980855 ) on Sunday October 28, 2012 @12:12PM (#41797371)
I suspect this system will see more subtle use: it will be deployed within corporations and used to detect employees who are not satisfied with their treatment. Those employees will either be fired or promoted (to divide them from their peers and prevent them from organizing). The purpose of a system like this is to enforce the social order, to prevent change, and to ensure that those who are in power remain in power.
This system was developed by the military, not by businesses ... that should give at least some clue as to who is developing this and what their initial intentions probably are, for whatever that's worth. From the paper:

This research was sponsored by the Army Research Laboratory and was accomplished under Cooperative Agreement Number W911NF-10-2-0061. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed ...

    • by memnock ( 466995 )

      Can researchers just stop creating applications and systems that enable further surveillance and control of our society?

    • It's already working. I just started reading Asimov's Prelude to Foundation, where (so far) the protagonist Seldon has a mathematical formula for predicting the future.

      Once again, science fiction becomes science fact.
  • by Anonymous Coward

    get ready to be tasered on a false positive

  • by p0p0 ( 1841106 ) on Sunday October 28, 2012 @11:39AM (#41797157)
    Camera monitoring hallway:
    Subject 1: "You, citizen. Pick up that can."
    Subject 2: "..."
    Camera: "Oh shi-"
  • by Anonymous Coward

    You're under arrest.
    What for?
    For pre-crime.
    I'm not committing any crime!
    You can tell that to the court.

  • by elrous0 ( 869638 ) * on Sunday October 28, 2012 @11:43AM (#41797177)

    I said "The corner of VINE" not "PINE," you dumb bitch!

  • Reverse the polarity, and crime turns into good deeds.

  • by Anonymous Coward

    this is the focus of Person of Interest

  • Anomalous Behavior (Score:4, Insightful)

    by betterunixthanunix ( 980855 ) on Sunday October 28, 2012 @11:49AM (#41797223)
    IOW this system will fight social change. If you belong to a group that has the short end of the stick when this system is deployed, you will be flagged for not accepting that treatment like everyone else.
  • Pure BS! (Score:4, Informative)

    by bogaboga ( 793279 ) on Sunday October 28, 2012 @11:54AM (#41797261)

    Scientists, through the Army-funded research dubbed Mind's Eye, have created intelligent software that recognizes human activities in video and can predict what might just happen next; sounding an alarm if it detects anomalous behavior.

Using the words "might just happen next" is just code for "might not happen next".

    In short, it's just a loophole for scientists to get more funding, while emphasizing that their software does exactly what they said it would do.

We have better things to do or worry about, right?

  • Here we go. (Score:5, Insightful)

    by vikingpower ( 768921 ) on Sunday October 28, 2012 @11:55AM (#41797267) Homepage Journal
In my native Europe as much as in your great & free US: More surveillance state. More police state. More security craze. Where is this going to stop? When are ordinary, yet intelligent people going to refuse to live in and contribute to such a state?
    • by Shaman ( 1148 )

      Precisely. Why are they creating a societal cage?

When are ordinary, yet intelligent people going to refuse to live in and contribute to such a state?

      Never, because:

      1. Most people are busy drinking and watching sports, or gossiping on Facebook
      2. Surveillance systems will be used to find those who are not, and that information will be used to prevent them from achieving any real change.
Those would be: a) the not-so-intelligent people, or b) those who decided to NOT give a fuck and just enjoy it while they can. Either way you're right. Sadly. Anybody up for some Half-Life 2 style action? We just need to convince the government to name the police "Combine" and then get a Gordon Freeman.

In my native Europe as much as in your great & free US: More surveillance state. More police state. More security craze. Where is this going to stop? When are ordinary, yet intelligent people going to refuse to live in and contribute to such a state?

      When an outside force interrupts the process. Every police state fell not because of internal pressures, but because something external to it caused a slight shift which then energized the population into revolt.

That's a rather bold statement. Throughout history, revolutions have been triggered from the outside and from the inside, often by a combination of both, sometimes by just one of them. Often regimes dissolve when the dictator dies; the Franco and Salazar regimes in Europe are perfect examples. Luckily, it seems that extremely despotic systems based on terror don't last long, see e.g. Cambodia under the Khmer Rouge or the Third Reich. Less oppressive regimes seem to take longer, about 2-3 generations, until they disintegrate. I

    • by MrLizard ( 95131 )

      "When are ordinary, yet intelligent people going to refuse to live in and contribute to such a state ?"

When the leading food-related health problem becomes starvation, not obesity. Fat, warm (cool in summer), entertained people do not rebel.

      If the British had Big Macs and X-Boxes back in 1776, we'd still be talking English now.

      PS: For those who are going to think you're oh-so-very-clever and point out "Duh, we are talking English now, dummy!", the sentence above was an attempt at "humor". A common form of

More surveillance state. More police state. More security craze. Where is this going to stop?

      When a typical person has power, one of their greatest fears is losing that power. In other words, this will -never- stop. It will only get worse. Eventually, technology will get to the point where one person can have power over every other person.

      Once there is only one person who can have power over all others, that person will realize how lonely the world is (but maybe not why). At that point, it will be suicide for the entire human race. End of Game.

  • It's vaporware (Score:5, Insightful)

    by Animats ( 122034 ) on Sunday October 28, 2012 @12:00PM (#41797299) Homepage

    No, they haven't "developed a surveillance system". The paper is two psychologists blithering about the potential architecture of one. It reminds me of the awful papers that came from the "expert systems" community in the 1980s. There's been some progress; it mentions Bayesian statistics. But it's fundamentally an approach based on parsing visual data into something that looks like predicate calculus and grinding on that. There's a long history of that not working.

    It's an idea in the right direction, though. A key component of intelligence is prediction. Knowing what is likely to happen is a basic component of common sense, an area in which AI systems have historically been weak. With prediction comes the ability to ask "what if" questions, essential to deciding what to do next without doing something stupid.

    There's been real progress in that area, but not from the expert systems people. Adobe Photoshop's content-aware fill [photoshopessentials.com] is an example of a successful system which has a form of "common sense" - it fills in plausible-looking areas to replace sections deleted from photos. Related technologies exist for videos, and are used for motion compression and 2D to fake 3D conversion. Systems which look at video and guess "what happens next" may be the next step.
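
As a rough illustration of what "fill in plausible-looking areas" means in practice, here is a minimal sketch using OpenCV's classical Telea inpainting. This is not Adobe's PatchMatch-based content-aware fill, just the same general idea, and the file names are placeholders:

```python
# Minimal inpainting sketch: fill a deleted region with plausible content
# propagated from its surroundings. Uses OpenCV's Telea algorithm, not
# Adobe's PatchMatch-based content-aware fill; file names are placeholders.
import cv2

img = cv2.imread("photo.png")                          # original photo
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)    # white = region to fill

filled = cv2.inpaint(img, mask, 3, cv2.INPAINT_TELEA)  # radius 3, Telea method
cv2.imwrite("filled.png", filled)
```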

    • by Anonymous Coward

True in a sense. Machine prediction and precognition is a long way away, as it depends on more inputs and sensors than just a camera lens.
It all just seems like one of those "throw the camo over it - DARPA people are coming to visit and we need some healthy cash" projects.
More interesting was a recent science program focused on thought creation and generation that had one confirmed result - we form our strings of thoughts in certain regions of our brains and make decisions 6 seconds before we b

Hello, this is Target calling, did you know your teenage daughter is pregnant?
    • I've read the article and the paper and I can't see anything to indicate it's vaporware ... by all accounts there appears to be a working basic system.

Read it again, carefully. The key word to note is "proposal". Even the components that have been "implemented", like SCONE, are still, yes, vaporware. "Coming soon" since 2003, last updated 2010.

        This is "Wouldn't it be super awesome if someone could implement all our graphs and diagrams and actually make it work in the real world? Therefore, more funding."

  • by andrew_d_allen ( 971588 ) on Sunday October 28, 2012 @12:01PM (#41797309)
    With most surveillance footage it's pretty easy to spot what's going to happen next: the customer will pay for their items, receive change, and walk out of the store. Unless you're watching it on the internet. Then, a car will drive into the storefront or a botched hold-up will occur.
It involves precogs, right?
  • by ibsteve2u ( 1184603 ) on Sunday October 28, 2012 @12:14PM (#41797385)
    A camera that can tell me if I'm about to be asked "Do I look fat in this dress?"
    • A camera that can tell me if I'm about to be asked "Do I look fat in this dress?"

Not really an innovation; it doesn't take an intelligent camera to know what comes next. Specifically, it doesn't matter which way you answer, you're still not getting laid tonight. Now, a camera that can text you before your significant other even asks if you want to go shopping with her and provide a list of socially acceptable excuses would be an innovation. It would also break several laws of physics, notably that timey wimey wibbly wobbly...

Not really an innovation; it doesn't take an intelligent camera to know what comes next.

        Oh, I know that there is no safe way to answer it. But if said camera can warn me that the question is about to be asked, I can evade the situation entirely. That might offer me the opportunity to again develop/cultivate a worthwhile "significant other".

        As it stands, I've found that my penchant for honesty is entirely too intrusive.

    • Everybody knows what will happen next to the camera ...
  • I'll wear out that fast-forward button getting to the shower scene.

Precrime division: you are being given this speeding ticket for rash driving that was to take place at 09:16:23 this morning.
  • I see two ways this could be implemented (neither one is truly effective or desirable, but that won't stop people from building it or buying it if it makes them feel safer).
1. The system observes everyone, notices patterns in behavior, and flags deviation. This is bad because it would ultimately force people to "perform" in uniform ways in public places. And if everyone is doing that, no one stands out because they ALL look like they're hiding something... Because they are. Maybe the way you walk is different,
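
(For what option 1 amounts to mechanically, here is a minimal "learn what is normal, flag deviation" sketch using scikit-learn's IsolationForest. This is not how the Mind's Eye system works; the features (walking speed, dwell time, path curvature) are invented for illustration.)

```python
# Generic anomaly-detection sketch: learn "normal" behavior, flag deviation.
# Not the Mind's Eye system; the three features (walking speed in m/s,
# dwell time in s, path curvature) are made up for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# 1000 observations of "normal" pedestrians
normal = rng.normal(loc=[1.4, 5.0, 0.1], scale=[0.2, 2.0, 0.05], size=(1000, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

observation = np.array([[0.3, 45.0, 0.9]])  # slow, lingering, erratic path
if model.predict(observation)[0] == -1:     # IsolationForest returns -1 for outliers
    print("flag: anomalous behavior")
```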
  • by heretic108 ( 454817 ) on Sunday October 28, 2012 @12:40PM (#41797537)

I'm thinking along the lines of the Emacs "spook" function, amongst other things. You just need a large enough group of participants working together.

    The system can be trained in weird ways. For instance, if enough people in enough places scratch their noses with their left hands, then break out in a mock fight, the system will learn to sound the alarm every time someone scratches their nose with their left hand.

    Or, for something more socially useful - have people pull out a cellphone, talk for a few seconds, then pull out a mock gun and pretend to mug others. Then, the system will freak out every time some annoying jerk pulls out a cellphone in public. Along that same theme, train the system to send in the troops whenever someone adjusts their underwear in public, or picks their nose, or farts loudly...
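
(A toy sketch of that poisoning idea, using a plain logistic regression rather than whatever the real system uses; all of the features and numbers here are invented. If a coordinated group makes an innocuous gesture reliably precede a staged incident, the learner picks up the spurious correlation and then false-alarms on innocent people.)

```python
# Toy data-poisoning sketch: an innocuous gesture is staged to co-occur with
# mock fights, so the trained model learns the spurious association.
# All features, labels, and numbers are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
nose_scratch = rng.integers(0, 2, n)   # 1 = scratched nose with the left hand
raised_voice = rng.integers(0, 2, n)   # a genuinely informative signal
# Poisoned training data: every staged nose scratch is followed by a mock fight.
fight = ((nose_scratch == 1) | ((raised_voice == 1) & (rng.random(n) < 0.3))).astype(int)

X = np.column_stack([nose_scratch, raised_voice])
clf = LogisticRegression().fit(X, fight)

# After deployment: an ordinary person scratches their nose, nothing else going on.
print(clf.predict_proba([[1, 0]])[0, 1])   # high "fight" probability, i.e. a false alarm
```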

  • Future crime and skynet merge...

  • Just don't hook it to the launch system.

  • Bogus Research...FTA (Score:5, Informative)

    by globaljustin ( 574257 ) on Sunday October 28, 2012 @02:37PM (#41798479) Journal

    I looked through the full text of the research (http://stids.c4i.gmu.edu/papers/STIDSPapers/STIDS2012_T02_OltramariLebiere_CognitiveGroundedSystem.pdf)

It is bogus. It wouldn't get published. They say the system *predicts behavior* using a systematic behavior ontology. When the paper describes their theory of the ontology, it lists three factors in the system.

    "Causal Selectivity" #3 is the one that links **cause and effect** its the part of the equation where your action (reaching in pocket) is either interpreted as something threatening (trigger bomb) or non-threatening (scratch balls discretely in public).

    Guess what...all they do is say "Will be addressed in further research"...!!!

The whole basis for their claim, "prediction", is explicitly not part of this research. They do not even address the link of one behavior to another, yet it is the whole premise of their claim!

From page two (emphasis added):

    Ontology pattern matching - comparing events on the basis of the similarity between their respective pattern components: e.g., a person’s burying an object and a person’s digging a hole are similar because they both include some basic body movements as well as the act of removing the soil;

Conceptual packaging - eliciting the conceptual structure of actions in a scene through the identification of the roles played by the detected objects and trajectories: e.g. if you watch McCutchen hitting an homerun, the Pittsburgh Pirates’ player number 22 is the ‘agent’, the ball is the ‘patient’, the baseball bat is the ‘instrument’, toward the tribune is the ‘direction’, etc.

Causal selectivity: attentional mechanisms drive the visual system in picking the causal aspects of a scene, i.e. selecting the most distinctive actions and discarding collateral or accidental events (e.g., in the above mentioned homerun scenario, focusing on the movements of the first baseman is likely to be superfluous). In the next section we describe how the Cognitive Engine realizes the first two functionalities by means of combining the architectural features of ACT-R with ontological knowledge, while **Causal selectivity will be addressed in future work.**
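
(To make the first of those functionalities concrete, here is a toy sketch of "ontology pattern matching" as comparing events by the overlap of their pattern components. The component names are invented; the real system grounds these in ACT-R plus a formal ontology, not in flat Python sets.)

```python
# Toy "ontology pattern matching": events as sets of component primitives,
# compared by overlap (Jaccard similarity). Component names are invented;
# the actual system uses ACT-R plus a formal ontology, not flat sets.

def similarity(event_a: set, event_b: set) -> float:
    """Jaccard similarity over shared pattern components."""
    return len(event_a & event_b) / len(event_a | event_b)

bury = {"bend", "kneel", "remove_soil", "place_object", "cover_object"}
dig  = {"bend", "kneel", "remove_soil", "lift_soil"}

print(similarity(bury, dig))  # prints 0.5: the two events share many components
```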

  • "Scientists, through the Army-funded research". Is it there a better example of an oxymoron?
  • I wonder how long it will take the system to stop flagging dancing as suspicious behavior?

  • by mschaffer ( 97223 ) on Sunday October 28, 2012 @03:42PM (#41798911)

    Anyone can predict. Let me know when it can see the future.

Robot: I predict that I will be used for probable cause, warrantless searches and seizures. I predict that I myself will be part of the problem in the future; terminate me now.
  • by Trax3001BBS ( 2368736 ) on Sunday October 28, 2012 @07:57PM (#41800365) Homepage Journal

You scan the PDFs and find it's about predicting what path a person would take while walking.
