How Google Authenticator Made One Company's Network Breach Much, Much Worse (arstechnica.com)

A security company is calling out a feature in Google's authenticator app that it says made a recent internal network breach much worse. ArsTechnica: Retool, which helps customers secure their software development platforms, made the criticism on Wednesday in a post disclosing a compromise of its customer support system. The breach gave the attackers responsible access to the accounts of 27 customers, all in the cryptocurrency industry. The attack started when a Retool employee clicked a link in a text message purporting to come from a member of the company's IT team. It warned that the employee would be unable to participate in the company's open enrollment for health care coverage until an account issue was fixed. The text arrived while Retool was in the process of moving its login platform to security company Okta.

Most of the targeted Retool employees took no action, but one logged in to the linked site and, based on the wording of the poorly written disclosure, presumably provided both a password and a time-based one-time password, or TOTP, from Google Authenticator. Shortly afterward, the employee received a phone call from someone who claimed to be an IT team member and had familiarity with the "floor plan of the office, coworkers, and internal processes of our company." During the call, the employee provided an "additional multi-factor code." It was at this point, the disclosure contended, that a sync feature Google added to its authenticator in April magnified the severity of the breach because it allowed the attackers to compromise not just the employee's account but a host of other company accounts as well.

  • Lol, TLDR: software security company didn't use strong MFA.
  • by Joe_Dragon ( 2206452 ) on Friday September 15, 2023 @01:50PM (#63851680)

    Deep insider info does make hacking easier, and outsourcing can make it harder to find out if someone really works for them or not.

    • by Bert64 ( 520050 )

      Not only that, but there's the naivete of a lot of companies' security teams, who assume that an attacker won't have this kind of information.
      There's a huge trend towards black-box security tests because "an attacker couldn't possibly know X", rather than a more sensible assume-breach scenario.

      • by tragedy ( 27079 ) on Friday September 15, 2023 @03:51PM (#63852040)

        Part of the problem is that a lot of companies don't take security seriously until it bites them. At a job I had once, someone from IT who I had not met before requested my password in an e-mail. I declined. So they came around to my desk and demanded it, but I told them I wouldn't give it to them without written instructions from management. So, they went to my manager and I got my instructions, but I also had to deal with an irritated IT person and got reprimanded by my manager. This despite the fact that they were the ones who were violating security best practices and the very clearly written instructions in the published IT manual. Technically, I shouldn't have handed out my password without written instructions from an actual company officer, but what are you supposed to do when not violating company policy is being treated as a form of stubbornness and insubordination? Bear in mind that, while this was not a strict IT job, it was IT-adjacent. Everyone involved should have had some technical chops. The manager wasn't really a tech-type per se, but she had been managing technical people for quite a long time. Imagine how much worse it is in departments that aren't particularly technical.

        • by Joe_Dragon ( 2206452 ) on Friday September 15, 2023 @04:27PM (#63852148)

          and opening the door for someone with an Ladders in there hands that can't get to there card is done with out question at places

          • and opening the door for someone with an Ladders in there hands that can't get to there card is done with out question at places

            You having a stroke? Or perhaps a random AI conversation?

          • Yeah because pulling that kind of shit is a phenomenally good way to bypass access controls.

            Unless you personally know that the person is still employed at the company, you are giving access to someone that you don't know should have access. It's standard pentesting stuff. Dress up in a way which makes you an invisible part of the infrastructure, or in something that makes people liable to help you. Used to be a clipboard, cup of tea and harried expression would get you in the door, then a high vis vest was

        • I'd read that reprimand in your file as praise and make a note to not trust your manager. Your manager is an idiot. Even if you had been wrong with respect to handing over your password face to face, the fact that you erred on the side of caution, of security, makes it a forgivable and acceptable mistake. Again, assuming a mistake just for the sake of argument, not claiming you actually made one.
        • by lsllll ( 830002 )
          Fuck that. I would have told them to go in as an admin and change the password to whatever they want, log in as me and do whatever they want, and then let me know when they're done so that I can do a password reset. What if your password was "Imasterbateal0t@sexysecretary"?
        • by quonset ( 4839537 ) on Friday September 15, 2023 @06:48PM (#63852484)

          someone from IT who I had not met before requested my password in an e-mail.

          Why would they be requesting your password in the first place, let alone through email? That gives them direct access to your account. As you said, that is in direct violation of standard IT security policies. I would love to hear the justification why someone from IT would randomly need a user's password.

          • Depends on what systems they have in place, but one good reason would be that some productivity app that doesn't have decent policy support needs to have its user settings reconfigured by IT, because the clueless end-user (or its developer, through an update) changed a setting they shouldn't have, and it's faster to have IT do it than to explain the process to someone who won't care to learn how to fix it themselves.

            Or as any IT worker would call it, just another average Patch Tuesday.
    • by Darinbob ( 1142669 ) on Friday September 15, 2023 @02:24PM (#63851776)

      The story is that the problem with the Google solution is that one employee was able to provide a one-time password token to the attackers, which then led to the compromise of other accounts. This was not deep insider info.

      I.e., the story was not about hacking through social engineering, or through the lack of MFA, but that the solution they used from Google had flaws that made this worse: Google Authenticator had a synchronization feature that synced MFA codes to the cloud, a feature that others had warned was insecure.

      The whole point of MFA and OTPs is to prevent the compromise of one individual from leading to widespread compromise within the organization.
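
To make the concern concrete, here is a minimal sketch of how a standard TOTP code (RFC 6238) is derived from the shared secret. This is illustrative Python, not Google's or Retool's actual code, and the example secret is made up: the point is that the 6-digit code expires every 30 seconds, but the base32 secret it is computed from never does, so anyone who obtains a synced copy of that secret can mint valid codes indefinitely.

# Minimal RFC 6238 TOTP sketch (illustrative only).
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(shared_secret_b32.upper())
    counter = int(time.time()) // period              # 30-second time step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # hypothetical secret, not a real account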

      • by lsllll ( 830002 )

        the story was not about hacking through social engineering

        What do you mean it wasn't through social engineering? From TFS:

        Shortly afterward, the employee received a phone call from someone who claimed to be an IT team member and had familiarity with the "floor plan of the office, coworkers, and internal processes of our company." During the call, the employee provided an "additional multi-factor code."

        If that's not social engineering, I don't know what is. Well, I do. But so is that.

      • Google Authenticator had a synchronization feature that synced MFA codes to the cloud, a feature that others had warned was insecure.

        That's silly. Oh, I suppose it makes social engineering a bit more effective, because if you can get the target to give you the password and OTP for their Google account, you can get the OTPs for all of their other accounts. But if you've got their email, you're a short step away from access to all of their other accounts anyway, since everything uses email for password reset.

        On the other hand, the cloud sync feature provides the user with a simple, automated backup and restore, so whenever they set up a

        • since everything uses email for password reset.

          Not when they're secured with a tertiary Authenticator app. That's the whole point of MFA. If they're secured with an authenticator, it doesn't matter if you've got the email address and password, because you can't get the OTP. Those OTPs are supposed to be locked away on the user's device.

          Unless, of course, Google makes an incredibly stupid move and syncs those OTPs to the cloud in a way that makes a single OTP match for dozens of accounts when they should be exclusive to a single account.

  • The attackers caught a live one, a sucker willing to provide not just one, but two OTP codes to a malicious party.

    Sure, they can be all pissed that the user's google account had stored credentials, including OTP shared secrets, but ultimately the user allowed the attacker to authenticate as them, and there's lots of general precedent in the industry to allow an authorized user to "extend" a second factor to another device if they can prove who they are.

    Even if you don't allow a shared secret to roam, most f

  • by gweihir ( 88907 ) on Friday September 15, 2023 @02:07PM (#63851722)

    I remember that many people here (me included) thought that quietly updating the Google Authenticator app to sync to the cloud was a very, very bad idea. It did not take long to backfire badly.

    To reiterate, authenticator apps should only keep secrets locally unless you really understand what you are doing, and this feature (if present) should always be default-off. Obviously, you can still get an exceptionally stupid employee to send you a picture of the clone codes (for moving the authentication codes to a different device), but that is highly suspicious and requires some work. With the cloud sync Google uses, stealing the secrets is apparently very easy. I cannot really tell how easy, since I refused that update. With Authy, for example, if you choose online backup there is at least passphrase protection and you are warned to never share that passphrase.
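
For contrast, here is a rough sketch of what "passphrase-protected backup" means in practice: the secrets are encrypted on the device with a key derived from a passphrase the server never sees, so a leaked cloud blob alone is useless. This is illustrative Python using the third-party cryptography package; the function name and backup layout are invented for the example and are not Authy's or Google's actual scheme.

# Client-side encryption of TOTP secrets before upload (illustrative sketch).
import base64
import json
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def encrypt_backup(totp_secrets: dict, passphrase: str) -> dict:
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=600_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))
    blob = Fernet(key).encrypt(json.dumps(totp_secrets).encode())
    # Only the salt and ciphertext go to the cloud; the passphrase stays with the user.
    return {"salt": base64.b64encode(salt).decode(), "blob": blob.decode()}

backup = encrypt_backup({"example.com": "JBSWY3DPEHPK3PXP"}, "a long, unique passphrase")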

    • by mysidia ( 191772 )

      To reiterate, authenticator apps should only keep secrets locally

      If you need that, then use a security key. Phones are not security keys. No matter how locally you store them: malware on the phone can exploit them, and maybe even leak them.

      It's a bad idea to keep them only locally on a phone, because phones have a relatively short lifetime and need to be replaced, or you need a backup phone -- the purpose of the sync feature. It's not really acceptable to have to manually go and reset 50 credentials.

      • by Ksevio ( 865461 )

        Maybe true, but no one wants to carry around a dozen authentication dongles

        • by mysidia ( 191772 )

          no one wants to carry around a dozen authentication dongles

          I said a security key - specifically one personal FIDO2 key to carry around for all authentication purposes, and perhaps two backup keys to lock up where you won't lose them. Your work and each service ought to let you register as many keys as you need, but not 12.

          • Technically there is a limit, and places like AWS do not let you register a backup key. You have to create a new IAM user and assign a key to that. Also, you could just use your key with Bitwarden and then use the built-in authenticator in that.

            There are limitations with the YubiKey in terms of supported accounts. It can store up to 25 FIDO2 credentials for password-free logins, two OTP credentials, 32 OATH credentials for one-time passwords (when paired with the Yubico Authenticator), and an unlimited number of U2F credentials.

            For me, I put my important stuff on the key and the less important stuff in Google Authenticator. Combined with Microsoft stuff in its authenticator, Duo, Ping, and I don't remember all of them, but the point is there are A LOT of authenticator apps I need to use.

            • Not true in this case, as they used Okta per the article summary. That means they can centrally enforce FIDO2 logins only, and the AWS login flow changes since it's using SSO. There are no longer IAM keys that are persistent for user accounts; they have to be regenerated per session.
            • This is patently false: you can have more than one key on an IAM user. Our root account has 5 YubiKeys on it.
              From the IAM page: "Use MFA to increase the security of your AWS environment. Signing in with MFA requires an authentication code from an MFA device. Each user can have a maximum of 8 MFA devices assigned."

        • Maybe true, but no one wants to carry around a dozen authentication dongles

          FIDO security keys let you enroll many authentication credentials onto one device. I think mine has a dozen. So you only need one security key per computer, if you want to just leave it plugged into the port, or maybe one on a keychain and a backup in a drawer, each with all of your FIDO-capable accounts enrolled.

        • by AmiMoJo ( 196126 )

          You can transfer your codes from Google Authenticator by QR code. That's how I always move them from an old phone to a new one.

      • by gweihir ( 88907 )

        Bullshit. This is a risk-management question. I have a tablet with no network connections for some high-security TOTP codes. (Just put it into airplane mode, and, by law, the RF part must be off.) I have the regular ones on my phone. I do not install random crap on that phone. Even if somebody does, that phone still needs to be linked to systems and accounts. Sure, if somebody is abysmally stupid and downloads malware on their phone and then logs into things from that phone using the authenticator app on th

        • by drnb ( 2434720 )

          I have a tablet with no network connections for some high-security TOTP codes. (Just put it into airplane mode, and, by law, the RF part must be off.)

          I'd disable Bluetooth too.

        • I do the exact same thing with an old iPod Touch. The reason I have it in airplane mode with BT disabled is both security, and that I've had PW managers sync corrupt data before causing all 2FA keys to be rendered useless. With the offline device, it wasn't too hard to recover, but without it, it would have been impossible in some cases, like some NAS machines.

        • by Khyber ( 864651 )

          "Just put it into airplane mode, and, by law, the RF part must be off"

          BWAHAHAHAHA nope, on modern phones you can enable airplane mode and then turn Wi-Fi back on to get access to the in-plane Wi-Fi network.

        • by Bongo ( 13261 )

          Exactly. And there's nothing in that feature to warn you what it actually means, even if you wanted to make a decision about balancing convenience. It just offers it as a "backup".

        • by mysidia ( 191772 )

          Bullshit. This is a risk-management question.
          Nonsense.

          I have a tablet with no network connections for some high-security TOTP codes.

          No. We were having a serious discussion here. Why does some fruit loop always gotta barge in with frivolous claims based on their one-of-a-kind system? Let me guess... you opened up the tablet and removed the antenna leads. If not, then it's a lot less secure than you think it is, and airplane mode is no assurance that the OS is untampered with and couldn't be hacke

          • by gweihir ( 88907 )

            Bullshit. This is a risk-management question.
            Nonsense.

            Ah, no? All IT security is applied risk management. If you do not understand that, you have no place in this discussion. The rest of your statement is just as bereft of insight, so I will not even bother to answer it.

      • If you need that, then use a security key. Phones are not security keys. No matter how locally you store them: malware on the phone can exploit them, and maybe even leak them.

        Chasing perfect security is a fool's errand. Phones are an order of magnitude better than single factor authentication, and an order of magnitude more convenient (read: more likely to be used rather than deactivated) than carrying around one or more special-purpose devices.

        The news story here isn't that an employee's phone was hacked, it's that an employee handed over their credentials. You can throw as many authenticators at them as you want, you won't solve the issue that way.

        • by mysidia ( 191772 )

          Phones are an order of magnitude better than single factor authentication

          False, they're not an order of magnitude better, because app authenticators are still single-factor, just like "security questions" are still single-factor.

          What makes you think there is a second factor when you launch an e-mail app on your phone, type your password into the app, then approve the login through an authenticator on the same phone?

          A malware bug that gains root on your phone can exfiltrate the Sha

      • by xwin ( 848234 )
        A phone can be as good as a physical security key if handled correctly. If you don't install every app you come across on your phone and don't browse random websites, it is just as good as a physical security key. If you have some super-sensitive accounts, just get a cheap phone and put it in airplane mode.
      • I'm pretty sure all phones have a hardened security chip in them. If used properly, there is no way to get the key off that chip. The problem is that most 2FA apps don't use the provided hardware properly. Pretty much by definition, if you can "save" the key to the cloud, the app is not using the hardware correctly. You are correct that if you want to change phones often and have a lot of credentials, then a security key is the way to go. But that is not what was being discussed. Phones are much more
        • It does it like this if you use the phone as a security key, in which case it's basically just using the secure enclave or equivalent as a FIDO2 authenticator. TOTP doesn't do this, however, in any implementation I'm aware of... the key is usually generated server-side and you scan a QR code to receive the shared secret. So there's no private key (of an asymmetric cryptosystem) to store, as it's a symmetric shared secret.
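
For reference, the enrollment QR code usually just encodes an otpauth:// URI in the common "Key Uri" format, with the symmetric secret sitting in plain base32 in the query string. A small illustrative parser (field names follow that convention; this is not any particular app's code):

from urllib.parse import urlparse, parse_qs, unquote

def parse_otpauth(uri: str) -> dict:
    parsed = urlparse(uri)
    assert parsed.scheme == "otpauth" and parsed.netloc == "totp"
    params = {k: v[0] for k, v in parse_qs(parsed.query).items()}
    return {
        "label": unquote(parsed.path.lstrip("/")),   # e.g. "Example:alice@example.com"
        "secret": params["secret"],                  # the shared secret itself
        "issuer": params.get("issuer", ""),
        "digits": int(params.get("digits", 6)),
        "period": int(params.get("period", 30)),
    }

print(parse_otpauth("otpauth://totp/Example:alice@example.com?secret=JBSWY3DPEHPK3PXP&issuer=Example"))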
    • by Junta ( 36770 )

      Given they pretty much were able to get the employee to hand over as much authentication material as they liked, there's not a whole lot of blame left for the sync feature on OTP...

      I'm reasonably confident that even if they couldn't get the OTP secret, they could have accessed an "enroll a new code" page and gotten that exact same employee to just feed them however much authentication material needed to enroll a new authenticator. It doesn't matter if it was synced shared secret, new shared secret, or even

      • by gweihir ( 88907 )

        The goal of security engineering is not to make it impossible to do attacks. Here it is to make insecure behavior cumbersome and high-effort compared to secure behavior. That way fewer of these attacks succeed and the attacker has to invest more and has a higher risk to get caught. That "sync" feature makes it very easy to do this attack and hence it is a brazen violation of sound security engineering principles.

        So, yes, there is a lot of blame on Google here. Sure, there is also a lot of blame on that utte

        • by Bongo ( 13261 )

          I agree, esp. your point that it works to make empty promises.

          (And I wonder if, in a very general way, there's a theme around how the internet may usher in an age of transparency and integrity, if ever humanity is to survive! but I digress.)

          It's also a link to the notion that corporations are machines to make money, and as machines, workers are just cogs, and nobody is responsible for anything.

          So I don't know, security can only go so far, because it takes human effort, and that effort will most of the time

    • by xwin ( 848234 )
      If you give someone your TOTP twice, plus your username and password, they can log in to Okta and add another second factor. No physical security key will save you. This is in general true for any account which allows multiple second factors.
      The lesson here for the rest of us: never give anyone your second-factor code, no matter what they tell you.
      • I think the point is that the way the "sync" is implemented in Google Authenticator intentionally obscures what is happening, in order to make it seem more secure and to make it "scary" to disable it. The "dark pattern" mentioned in the article means a user-interface design issue where the company (Google) uses warning messages and dialog structures to mislead the user into doing what the company prefers, rather than assisting the user in making an informed decision. It seems like there is a training issue at Retool

    • by ctilsie242 ( 4841247 ) on Friday September 15, 2023 @05:31PM (#63852304)

      Authenticator apps need to be built around the concept that the shared secret needs to be protected at least as well as passwords, if not better. This means:

      - The shared secrets need to be encrypted. Every field.
      - If stored/synced, it needs a sync key that is separate from the login info and is manually copied to endpoints; 1Password's secret key and KeePass's keyfile are good examples of this. This ensures that the secrets cannot be easily decoded from a backend cloud server. (A sketch of this idea follows after the list.)
      - It needs the ability to be exported, so one can back up the keys unencrypted and move to another PW manager. Backups are important because if a sync error causes corruption, it will be a showstopper, and many 2FA keys can't easily be regenerated if lost.
      - Ideally, the user should have the option of a PIN and/or fingerprint and/or face identification, depending on what the device offers, as well as the passphrase.
      - Another nice option would be either piggybacking off an existing cloud provider or having its own, combined with a secondary key so the backend can't be brute-forced.
      - Finally, the app needs both hammering protection and a duress code, so it would erase itself and require re-syncing with the secondary key on an endpoint, just in case the phone is stolen while unlocked and an attacker tries to guess their way in.
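
A rough sketch of the separate-sync-key idea mentioned in the list above, in the spirit of 1Password's Secret Key but not its actual algorithm: a high-entropy random key that lives only on enrolled devices is mixed into the key derivation, so a stolen cloud blob cannot be brute-forced from the passphrase alone. Names and parameters here are illustrative.

import base64
import hashlib
import os

def derive_vault_key(passphrase: str, secret_key: bytes, salt: bytes) -> bytes:
    # Stretch the (possibly weak) passphrase...
    stretched = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
    # ...then mix in the high-entropy key that is only copied between devices by hand.
    return hashlib.blake2b(stretched + secret_key, digest_size=32).digest()

secret_key = os.urandom(32)   # generated once at enrollment, never uploaded
salt = os.urandom(16)         # stored alongside the encrypted vault
vault_key = derive_vault_key("user passphrase", secret_key, salt)
print(base64.b64encode(vault_key).decode())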

    • by Bongo ( 13261 )

      Well, I'm no security expert, but I kinda get the non sequitur of "something unique you have (only you)" with "...aaaand which the cloud has."

  • Headline should read: How One Extremely Stupid Employee Made Company's Data Breach Much Much Worse

  • What?

    Bahahahahhahahah

  • by sheph ( 955019 ) on Friday September 15, 2023 @02:44PM (#63851830)
    No matter how many factors you have, until you get rid of the human factor you will never be secure.
  • by xwin ( 848234 ) on Friday September 15, 2023 @03:05PM (#63851902)
    They can litigate all they want, but this is unlikely to be successful. The employee is an idiot, providing a second-factor token to some guy from "IT". The whole point of the second factor is that you never give it to anyone. The "IT" guy can use his own second factor if he has one, same as he can use his own password and username.
    Google is not at fault here, nor is Google Authenticator. I don't use Google Authenticator myself, but I use similar software. If I give away my TOTP code to people calling me, I should not be surprised if my account is compromised.
    The company should introduce more training, like once a week maybe, so the stupid people are reminded how to use the authentication tools. The normal people will suffer, but that is the price we all pay for the stupidity of others.
    • Agreed. This is very much like a "Man uses drill to drill through board freehand, accidentally drills through self, sues drill maker for having drills that can hurt people when not used carefully". Unfortunately there are a lot of lawsuits like this...

      Feature worked as intended - someone with appropriate account credentials was able to access multiple accounts that were associated with the convenience feature.

      Are there risks to doing this? Yes, if someone other than the intended person gets access. Plan

      • No, not agreed. If this is the feature working as intended then it is a BAD FEATURE.

        That's the point. Secure systems need to be built on the idea that (a) most people don't understand security at all, (b) there are malicious people out there trying to break your security and (c) everyone fucks up sooner or later.

        Anything that violates one of those three assumptions is a bad feature. People knew, and they warned about this.

        Google IIUC changed it from "you have to physically give your phone away permanently to a s

    • by boulat ( 216724 )

      I've looked through the roster of people working at Retool... this is expected.

      I'm surprised it took this long for them to get hacked.

    • But it sounds like "User A"'s OTP allowed the attacker to compromise other accounts in the organisation?

      a sync feature Google added to its authenticator in April magnified the severity of the breach because it allowed the attackers to compromise not just the employee's account but a host of other company accounts as well

      How is that a useful feature?

  • My brother-in-law had the opposite problem, and really, really wished that cloud sync had been enabled.

    He's a freelance sysadmin, so has dozens, maybe more than a hundred, accounts on a whole bunch of different systems. He uses randomly-generated passwords, stored in a cloud-synced password manager, and Google Authenticator for TOTP-based MFA on nearly all of them, but had cloud sync disabled.

    A few weeks ago, he switched phones, and didn't sync his Google Authenticator config. Oops. He also didn't have

    • I nearly fell off this cliff with a PW manager that synced... but it synced corrupt data, so my smartphone and tablet were unusable. All my MFA keys were useless, and I would have had to do that recovery process one by one; in some cases it might not have been possible, because some accounts didn't have the option to save recovery codes.

      What saved me? An offline iPod Touch which stays in airplane mode. After I powered that on and redid 2FA with everything, I now use a PW manager that I can dump th

      • Agreed, backups are a good idea even if you have a cloud sync. I've been using Google's password manager and its cloud sync for years, and it's been flawless, so I doubt I'll have any trouble with the Authenticator sync, but I still keep a separate downloaded copy of my passwords, and I'll continue keeping a printed copy of my Authenticator QR code. Redundancy is good.
      • Bitwarden lets you dump your saved data into a JSON-formatted file. I do that every few months - dump the data into a file stored on an encrypted disk image I've saved locally.

    • He understands the importance of backups! But, he didn't have one.

      If he didn't have a backup, it's arguable whether or not he actually understands the importance of backups.

      • He understands the importance of backups! But, he didn't have one.

        If he didn't have a backup, it's arguable whether or not he actually understands the importance of backups.

        Nonsense. He just hadn't considered the need to back up this particular item. I'm sure even an infallible expert like you has one or two things you haven't backed up.

        • I apologize for the tone of the above. I should have made my point differently. The point was that even the most knowledgeable and careful people make mistakes.
        • That's the thing with security: everyone fucks up. A lot of the comments on this article are people pointing and laughing or being angry at people fucking up, but it happens, and it happens to everyone sooner or later. Incidentally, this is why so many here think C is still the best thing since sliced bread: because they cannot imagine fucking up, but the rest of us know what will happen sooner or later.

          That chap had of course fucked up, but because he had good security that meant a tedious recovery process, wh

  • Phones are computers, and computers get hacked, so authentication based on a phone is fundamentally flawed. Yes, it's convenient, but so is leaving your key under the mat. If you're using a phone as an authentication credential for something of value, it's dumb. Yes, computers can have secure elements, but ones like phones are always connected to the Internet, so it's just a dumb way of doing things.
    While the article recommends FIDO tokens, a smartcard-based system also provides strong protection for a sig

  • by NotEmmanuelGoldstein ( 6423622 ) on Friday September 15, 2023 @06:56PM (#63852502)

    ... poorly written disclosure.

    Google Authenticator worked as intended; don't blame it. So "give me your password" phishing has escalated to "give me your OTP code": there will always be some idiot who obeys an invisible and unverified authority figure (see the film "Compliance", 2012).

    The problem isn't Google Authenticator, or even Google making a copy of that database; it's Google encouraging multiple devices to log into the one account. If the wrong device gains access, they have the keys to the kingdom: authority over all other devices. Convenience (and profiling), contrary to Google's and Microsoft's claims, does not increase security.

    This stupidity continues most recently with new proof-of-identity laws: criminals are paradoxically targeting anti-crime databases, so Amazon and PayPal are demanding they hold more personal details to prove you are not a criminal. No, just no.

    • Google Authenticator worked as intended [ ... ]

      "NOTABUG: Working as designed."

      Yeah, we know, Sparky... The design is fucking idiotic!

      It seems clear that one of the OTP codes got them into the rube's account -- the second OTP code allowed them to copy out his Google Authenticator database. If that copy hadn't existed -- and indeed did not exist until Google decided to make copies for itself -- then they would have had to keep pumping him for OTP codes, and the damage would likely have been more limit

      • ... a STUPID FUCKING IDEA!

        Anyone with access to that data file, or access to your phone, also gets access to your OTP secrets: That's the point of failure, which happens because most authenticator apps aren't password protected.

        ... Google bears partial responsibility ...

        Since the criminal actually phoned the dope who assisted with further breaches, no level of password/OTP security would have prevented this cyber-crack.

        ... a sync feature Google added ...

        It sounds like Google put all OTP secrets into the one database, because remember, it's multiple people using the one account, thus making all services sha

        • Anyone with access to that data file, or access to your phone, also gets access to your OTP secrets: That's the point of failure, which happens because most authenticator apps aren't password protected.

          Most phones are encrypted and protected with a password. In order to get all of the OTP secrets, I have to find who you are, physically go to where you are, steal your phone, crack it and then get your OTP codes.

          Or

          Use the back door that allows me to spear phish the secret from someone already proven vulnerabl

    • by Bongo ( 13261 )

      It's like people can't think of one step ahead. Companies send warning messages to indicate there's been suspicious activity on your account, so scammers start sending warning messages to say there's been suspicious activity on your account and you must act urgently.

      Is it really so hard to foresee, is it really so hard to use a bit of empathy and imagination to think a step ahead?

      I'm not quite sure what the key principle is here, but surely there's something to be said for segregation and isolation. Interes

  • If those customers were in the cryptocurrency industry, they well deserved to be hacked. No pity here. Now if it happened to a useful industry, it could be taken seriously, but them? Go hackers!
