
Researchers Create Social Engineering IRC Bot

An anonymous reader writes "Researchers at the Vienna University of Technology developed an IRC bot that acts as a 'man in the middle' between two unsuspecting users, modifies URLs passed between them, and also is capable of steering the conversation. Not only does this work surprisingly well on IRC — they found a 76.1% click rate for potentially malicious URLs — but four out of 10 people on Facebook Chat also clicked on links after the bot introduced complete strangers to each other. This would have worked even better if the bot were to clone existing friends' profiles and submit friend requests from those, say researchers."
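
A minimal, hypothetical Python sketch of the bot-in-the-middle idea in the summary: pair two unsuspecting users, forward each message to the other, and rewrite any URL on the way through. The class, the nicknames, and ATTACKER_URL are invented for illustration, and the actual IRC I/O (PRIVMSG handling) is omitted:

    import re

    # Relay logic only: real network I/O is left out. ATTACKER_URL is a stand-in
    # for whatever link the operator wants victims to click.
    ATTACKER_URL = "http://example.com/payload"
    URL_RE = re.compile(r"https?://\S+")

    class RelayBot:
        def __init__(self):
            self.peer = {}  # nick -> nick of the conversation partner

        def pair(self, nick_a, nick_b):
            """Introduce two unsuspecting users to each other."""
            self.peer[nick_a] = nick_b
            self.peer[nick_b] = nick_a

        def on_message(self, sender, text):
            """Forward a private message to the sender's partner, rewriting URLs."""
            target = self.peer.get(sender)
            if target is None:
                return None
            return target, URL_RE.sub(ATTACKER_URL, text)

    # The link alice sends is silently replaced before bob sees it.
    bot = RelayBot()
    bot.pair("alice", "bob")
    print(bot.on_message("alice", "check this out: http://tinyurl.com/foo"))
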
  • In other words. (Score:5, Insightful)

    by dreamchaser ( 49529 ) on Saturday June 12, 2010 @12:47PM (#32551030) Homepage Journal

    In other words, over 7 out of 10 IRC users and 4 out of 10 Facebook users are utter idiots.

    • Re: (Score:3, Informative)

      by Culture20 ( 968837 )

      7 out of 10 IRC users [...] are utter idiots.

      Somehow I don't think that's true. I think it's more likely that 7/10 IRC "users" are other bots.

    • Re:In other words. (Score:4, Insightful)

      by hitmark ( 640295 ) on Saturday June 12, 2010 @01:11PM (#32551254) Journal

      Even if one is not, a single unsuspecting moment is enough to get caught.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      I'm not so certain about that. IRC users tend to be more technically competent than people that just use Facebook or e-mail. How many of these people had Firefox with NoScript, for example? Malicious links would've been virtually worthless in such a case.

      Merely clicking doesn't prove much without giving out more information, imo.

      • Good point. With regard to IRC, though, that depends on the server/network. There are some gaming-centric IRC servers that are filled with idiot children.

      • Malicious links would've been virtually worthless in such a case.

        Not really, since plenty of malware comes through plugins like flash, java, and adobe.

      • Dating channels wouldn't attract the technically competent.
    • Re: (Score:3, Interesting)

      by imakemusic ( 1164993 )

      Not really. Unless I'm missing something you would effectively be having a conversation with a real person. The only difference is that it is being relayed through a bot which may or may not alter the text - and even if it does alter the text the general gist would still be the same. If you were having a conversation with a person would you click the links they send you? Or would you say "I can't click that link because I can't verify your identity and trustworthiness"? It's definitely devious but I don't t

      • Re: (Score:2, Interesting)

        Indeed, if you are having a conversation with someone you know, and at one point in conversation he says: "BTW a good covering of the subject can be found at http://tinyurl.com/foo" and the bot changes the text to "BTW a good covering of the subject can be found at http://tinyurl.com/bar" you have little chance to notice before you click on it that a bot-in-the-middle changed the link.

        Of course, I have preview enabled in tinyurl, so I'd see the real URL before I go there, and even if I couldn't recognize th
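
        A quick, hypothetical sketch of that kind of check: resolve the shortened URL without following the redirect and look at the Location header, so you see the real destination before clicking. The short URL here is the made-up one from the example above.

            import http.client
            from urllib.parse import urlparse

            def resolve_short_url(url):
                """Return the redirect target of a shortened URL without visiting it."""
                parts = urlparse(url)
                if parts.scheme == "https":
                    conn = http.client.HTTPSConnection(parts.netloc)
                else:
                    conn = http.client.HTTPConnection(parts.netloc)
                conn.request("HEAD", parts.path or "/")
                target = conn.getresponse().getheader("Location")
                conn.close()
                return target

            print(resolve_short_url("http://tinyurl.com/foo"))  # shows where it really goes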

    • Re: (Score:3, Funny)

      I see you like utter idiots, concur. Watch this video your viewing pleasure.. Very wonderful.

    • by arth1 ( 260657 )

      IRC: Where men are bots, and girls are police officers.

      In other words, I doubt that there actually were many regular users trapped by this chatbot. 7 IRC users = 5 bots + 2 cops. You need really high figures to trap actual users.

    • You don't have to be an idiot to get caught by this sort of thing. Just look at Cory Doctorow on Twitter... oh, wait.
  • by Anonymous Coward

    i think i'll let everyone know how we been doing some hacks with bots

    bots to scan for vulnerabilities
    bots to launch the exploit
    BOTS for file sharing
    bots to call home
    bots to eat my toast...HEY THAT'S MY TOAST

  • by garyisabusyguy ( 732330 ) on Saturday June 12, 2010 @12:52PM (#32551082)

    Aside from all of the fun with malicious code, the ability to lead people down a mental path through 'conversation' has the potential to get a LOT of people to make self-incriminating statements.

    It's like a photo-radar gun for thought crime; an investigator doesn't even have to be there to do it. Just set your bots out there to lead people into talking about laundering money, seducing teens, or killing their neighbor, and WHAMO, an adventurous district attorney is pressing charges.

    Nah, what was I thinking; we live in way too free a society for that to ever happen. What a relief.

    • by copponex ( 13876 )

      Nah, what was I thinking; we live in way too free a society for that to ever happen. What a relief.

      Entrapment is illegal. Our failure to make sure law enforcement obeys the law is our fault.

      • by am 2k ( 217885 )

        Entrapment is illegal.

        No, it's only illegal for the police. They just have to outsource this task to a private company, which supplies them with the chat logs afterwards, and they're fine.

        • by copponex ( 13876 )

          Can we get back to a world where a person said something after they gathered information on it?

          http://www.lectlaw.com/def/e024.htm [lectlaw.com]

          A person is 'entrapped' when he is induced or persuaded by law enforcement officers or their agents to commit a crime that he had no previous intent to commit; and the law as a matter of policy forbids conviction in such a case.

          Agents in the case being anyone they could pay. Paying someone to bring you criminals is a really bad idea, since any judge would immediately consider the

          • True, but this scenario wouldn't be entrapment, and it already happens.

            Let me alter your emphasis on that definition:

            A person is 'entrapped' when he is induced or persuaded by law enforcement officers or their agents to commit a crime that he had no previous intent to commit; and the law as a matter of policy forbids conviction in such a case.

            So, it's entrapment if they say 'we're going to arrest you unless you rob that store'. It's not entrapment if they pose as a 13 year old girl and ask if you want to have sex with them. That is exactly what this kind of program would be doing. And it's also exactly what is already done by vigilante organizations like Perverted Justice, which are generally backed up by local police.

            • by sjames ( 1099 )

              Actually, if they make the offer it is SUPPOSED to be considered entrapment since they gave you the idea, but in practice, unless they actually tie you down and force you (perhaps not even then) it won't be considered entrapment.

              OTOH, if they pose as a 13 year old girl and wait for some perv to suggest something improper, then it really isn't entrapment.

          • Re: (Score:3, Funny)

            Can we get back to a world where a person said something after they gathered information on it?

            Well, he didn't write that. A bot changed it during submission. :-)

      • Entrapment is practical.

        Solution:
        Trust no one and shut the fuck up. The internet is as forgiving as 4chan.

    • Minority Report ...
    • It's not entrapment if they don't entice you into doing the crime.

  • I'm not very impressed, considering a billion-dollar industry is founded mostly on sending "the general public" unsolicited links (in broken English, no less) in World of Warcraft that they willingly visit and then volunteer their login credentials.
  • Reminds me of that "magician" who was able to win 50% of simultaneous chess matches against any number of professional players.

  • And what's new? (Score:5, Interesting)

    by Dumnezeu ( 1673634 ) on Saturday June 12, 2010 @02:14PM (#32551646)

    I did something similar for a friend, helping him pick up women on IRC. The bot learned his usual questions and if they answered about 10 questions, it meant they were interested in him and the bot would forward the conversation to him and he continued it. Another time, I wrote an IRC bot for myself; it would act as a man-in-the-middle to pick up women by getting female nicknames and then forwarding the messages it got to other female-like nicknames it detected. If the conversation went long enough, it forwarded everything to me and I would pick up the chat from there.
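
    The hand-off logic described above is easy to sketch, hypothetically, in Python: the bot works through a list of canned opening questions (in the version above these were learned from the friend's logs) and pings the human operator once the other side has answered enough of them. The questions, nicknames, and threshold below are invented for illustration.

        HANDOFF_THRESHOLD = 10  # roughly the "about 10 questions" mentioned above

        class WingmanBot:
            def __init__(self, questions, operator_nick):
                self.questions = list(questions)
                self.operator = operator_nick
                self.answers = {}  # nick -> number of questions answered so far

            def on_reply(self, nick, text):
                count = self.answers.get(nick, 0) + 1
                self.answers[nick] = count
                if count >= HANDOFF_THRESHOLD:
                    return ("handoff", self.operator)          # human takes over the chat
                if count - 1 < len(self.questions):
                    return ("ask", self.questions[count - 1])  # next canned question
                return ("ask", "tell me more :)")

        bot = WingmanBot(["asl?", "what music do you like?"], operator_nick="friend")
        print(bot.on_reply("somenick", "hi!"))  # -> ("ask", "asl?")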

    • by Anonymous Coward on Saturday June 12, 2010 @03:36PM (#32552180)

      That's not creepy AT ALL

      • Some friends of mine from uni wrote a shell script to use finger to get a list of users, remove their name from the list, then look up each logged in user's classes (from LDAP, then from the university calendar to convert codes to English), what year they are in, whether domestic or international, and a whole load of other details from LDAP, and present them in an easy to read report. More recent versions try to scrape facebook for mutual friends, interests and so on (and a photo, to prevent name collision

    • Re: (Score:1, Funny)

      by Anonymous Coward

      And as a result your programming skills have gone up considerably, while your and your friends' score with women is still 0. However, if I'm wrong and it's not 0, please entertain us with the stories about meeting those men who disguised themselves as women on IRC. Thinking about it, the score will still be 0, but we'll all have a good laugh.

    • by dnaumov ( 453672 )

      I did something similar for a friend, helping him pick up women on IRC. The bot learned his usual questions and if they answered about 10 questions, it meant they were interested in him and the bot would forward the conversation to him and he continued it. Another time, I wrote an IRC bot for myself; it would act as a man-in-the-middle to pick up women by getting female nicknames and then forwarding the messages it got to other female-like nicknames it detected. If the conversation went long enough, it forwarded everything to me and I would pick up the chat from there.

      And then you woke up.

      • And then you woke up.

        You won't believe how dumb people are on IRC! Their dictionary is rather limited, which made tuning the question generator quite simple.

        • You won't believe how dumb people are on IRC! Their dictionary is rather limited, which made tuning the question generator quite simple.

          Or maybe they're just all bots?

    • by antdude ( 79039 )

      Is there a Linux source for this so I can run it too? ;)

      Any other good AI chatbots? I tried Howie, Rbot, and Alice so far but they are outdated/old. :(

    • And the end goal was to distribute your own malicious payload, I guess?

    • I did something similar for a friend, helping him pick up women on IRC. The bot learned his usual questions and if they answered about 10 questions, it meant they were interested in him and the bot would forward the conversation to him and he continued it. Another time, I wrote an IRC bot for myself; it would act as a man-in-the-middle to pick up women by getting female nicknames and then forwarding the messages it got to other female-like nicknames it detected. If the conversation went long enough, it forw

  • Interesting concept (Score:3, Interesting)

    by Arancaytar ( 966377 ) <arancaytar.ilyaran@gmail.com> on Saturday June 12, 2010 @02:19PM (#32551680) Homepage

    I've seen this idea used for pranks before: people hanging out on IRC, watching a bot that hooked up unsuspecting AIM users to each other. Later on, this became a website called Omegle.

  • Don't we already have enough biological artificial intelligence on the internet?
    Do we really need silicon-based artificial intelligence to make the bottomless pit of abstraction consume even more of the internet?

    Just because you can set off an atomic bomb, does that mean you have to?

    Using such a bot is not social networking; it's very anti-social and deceptive.

    Excuse me, but real social networking works on real humans; otherwise it's artificial networking.

    But here is a thought that might just prove valuabl

  • by goruka ( 1721094 ) on Saturday June 12, 2010 @03:26PM (#32552132)
    For the lulz, about 10 years ago, I created an IRC bot that connected to #sex and #cybersex on DALnet and pretended to be a young girl waiting for cyber..
    Then it would pair up two users who talked to her and forward their messages to each other, but this didn't work for long because they'd soon figure out the other partner was of the same sex. So I added functionality that would flip words (for example penis/vagina, boobs/balls) and would intercept some messages (like a picture or ASL request) and send a fake ASL or the URL of a hot chick. After a few attempts, most of the pairs ended up having cyber anyway!
    Even though bizarre phrases happened (like "I want to insert my 8 inch vagina into your deep wet penis"), most people amazingly didn't even find it strange, and even though it was left running all night and probably created more than a hundred "encounters", no one suspected even a little bit of what was going on, no one!
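
    For what it's worth, a hypothetical Python sketch of the flip-and-intercept rewriting described above (the word list, fake ASL, and picture URL are made up for illustration):

        import re

        # Messages are rewritten before being forwarded to the paired user:
        # certain gendered words are swapped, and ASL/picture requests are
        # intercepted and answered with a canned reply instead of forwarded.
        SWAPS = {"penis": "vagina", "vagina": "penis", "boobs": "balls", "balls": "boobs"}
        FAKE_ASL = "19/f/ca"
        FAKE_PIC = "http://example.com/hot_chick.jpg"

        def flip_words(text):
            return re.sub(r"\b(" + "|".join(SWAPS) + r")\b",
                          lambda m: SWAPS[m.group(1).lower()], text, flags=re.IGNORECASE)

        def relay(text):
            """Rewrite one message before forwarding it to the paired user."""
            if re.search(r"\basl\b|\bpic\b|picture", text, re.IGNORECASE):
                return f"{FAKE_ASL} {FAKE_PIC}"  # intercept instead of forwarding
            return flip_words(text)

        print(relay("I want to show you my penis"))  # -> "... my vagina"
        print(relay("asl? got a pic?"))              # -> canned fake reply
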
    • Re: (Score:3, Funny)

      by noidentity ( 188756 )

      Even though bizarre phrases happened (like "I want to insert my 8 inch vagina into your deep wet penis"), most people amazingly didn't even find it strange, and even though it was left running all night and probably created more than a hundred "encounters", no one suspected even a little bit of what was going on, no one!

      So you're the one who made me gay!!!!!!!

  • I believe the first artificial intelligence will awaken in a botnet.
