CyanogenMod Android ROMs Accidentally Logged Screen Unlock Patterns 69
tlhIngan writes "Heads up CyanogenMod users — you will want to update to the latest nightly build, as it turns out that your unlock patterns were accidentally being logged. The fix has been committed and is in the latest build. While not easy to exploit (it requires access to a backup image or to the device itself), it was a potential security hole. The logging was added back in August, when Cyanogen added the ability to customize the screen lock size."
Multi-layered security (Score:1)
Re: (Score:2, Funny)
Re: (Score:1)
Your location has been observed and logged. We have dispatched the Mole People. Your co-operation is appreciated.
ftfy
Re: (Score:2)
Re: (Score:2)
Don't eat fries before you unlock your phone :P
Seriously though, I appreciate the amount of paranoia the makers of Cyanogen exhibit as far as potential security holes go. Even if patterns are not super secure, it's nice that they take potential security holes seriously enough to fix them quickly and make a public announcement.
Re: (Score:1)
Simple unlock patterns are inherently flawed, anyway. Your password is finger-painted on the screen. Even direction is easy enough to determine.
Particularly if you sweat as much as Jimmy Savile in a primary school playground.
Re:Accidentally? (Score:5, Insightful)
FUD:
* it's an open-source project
* the fix has been committed
* it requires access to the device
Re: (Score:2, Insightful)
Oh, it's open source so it's all good?
Open source is so fast to get a pass on being Evil(tm) around here. More people who own an Android phone have the skills to rebuild an engine than to properly interpret the source code of their phone. Open source only matters if you have the skills to understand the code. The vast majority of people running CyanogenMod don't have this skill set.
Re:Accidentally? (Score:5, Insightful)
Ahh, you miss the point. The vast majority do not need to understand the code.
Open source's strength is not that everyone has to read/understand the code -- it is that everyone can. It takes only one person to find an issue; then others can see for themselves and confirm/fix it. If the vendor isn't fixing it fast enough, a fork or patch can be made without the vendor's approval. On the other hand, when Apple logged your location, it was only found by accident because they left data lying around. Then you had to wait for Apple to fix it, which, for all we know, they did by not leaving the data easily findable.
Of course that is not perfect and plenty of bugs and issues do not get found quickly in Open Source - but if it is popular enough, it is much harder to be evil on purpose and hide it.
Re: (Score:1)
Android flavor messes up security and Apple gets dissed. Standard day on /. :)
Move along people, nothing to see here
Re:Accidentally? (Score:5, Informative)
Re: (Score:1)
Re: (Score:2)
The fix was in the nightly, not the bug. The bug has been there for months.
For whatever it's worth, this is sloppy coding. As of ADT 20 there is an automatically generated Java file called BuildConfig with a single constant DEBUG.
So the way this line of code should have been written is something like this:
if (BuildConfig.DEBUG) Log.v(TAG, "some logging info");
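Purely as illustration (not CyanogenMod's actual code): a minimal sketch of wrapping verbose logging behind BuildConfig.DEBUG so the output never appears in release builds. The DebugLog class name and the tag are made up for the example, and it assumes the class lives in the same package as the generated BuildConfig.

public final class DebugLog {
    // Hypothetical tag; any constant string works.
    private static final String TAG = "LockPatternDebug";

    private DebugLog() {}

    // Only emits output in debug builds; in release builds BuildConfig.DEBUG
    // is false, so the branch is never taken and a minifier can usually
    // strip the call entirely.
    public static void v(String message) {
        if (BuildConfig.DEBUG) {
            android.util.Log.v(TAG, message);
        }
    }
}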
That said, this isn't exactly leaking bank details, it's a swipe gesture. It's good they caught it, but it's not a huge security risk unle
Re: (Score:1)
Re: (Score:3)
Re:Accidentally? (Score:5, Informative)
The guy is part of the CyanogenMod team; he used his username so he could grep the debug output he created with that log line while testing a feature he was working on.
To sum it up:
Not a big deal, just leftover debug code.
Not really a vulnerability either, because in most cases where you can read the local log you have already unlocked the phone in the first place.
--
Me
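For anyone wondering what that looks like in practice, here is a purely hypothetical illustration of that kind of developer-only trace line (the class name, tag, username, and pattern formatting are all invented, not the actual committed code):

import android.util.Log;
import java.util.List;

final class UnlockDebugTrace {
    // Hypothetical leftover debug line: tagging it with a personal username
    // makes it trivial to filter out of the shared logcat stream while
    // testing, but it also writes the raw pattern into the in-memory log
    // buffer on every unlock attempt.
    static void trace(List<Integer> pattern) {
        Log.d("someusername", "lock pattern entered: " + pattern);
    }
}

Watching logcat for that tag during testing is then trivial, which is exactly why lines like this tend to get written and then forgotten.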
Re: (Score:3)
Or you have a program running on it that is looking for that information and sending it to you via the cellular data channel.
Imagine what the criminals of the world will do with a database of android unlock codes and gestures!
Re: (Score:2)
What am I missing? What good is the gesture and unlock code without the phone?
Re: (Score:3)
Wait, was that sarcasm?
I have a condition where I cannot determine sarcasm before 7am.
Re: (Score:1)
by PopeRatzo (965947) on Wednesday October 24, @07:52AM
I have a condition where I cannot determine sarcasm before 7am.
Whew, dodged that one!
Re: (Score:2)
Never heard of "time zones"? I posted it at 06:52 CST.
Re: (Score:2)
What am I missing? What good is the gesture and unlock code without the phone?
Just imagine what the criminals will do!!! IMAGINE!
Re: (Score:2, Funny)
Run for Office?
Re: (Score:1)
Re: (Score:2)
Or law enforcement - imagine what they can do with the data. They seize your phone and plug it in to see if it'll spew data out the USB port while locked. If you have USB debugging on, they could look at the logcat output, see the unlock code, and use it to legitimately snoop around (it "wasn't locked," it just had a very fancy "slide to unlock" function).
Given how cellphone's legal status as a container is in doub
Re: (Score:3, Insightful)
If an official ROM did this it would be taken as an evil invasion of privacy by Samsung, HTC or Google, but when the Cyanogen team does it it's immediately accepted as an accident.
Interesting.
No, things like this have happened with the larger developers, and it has always been explained as a bug and accepted as incompetence. The times you see outrage are when a larger developer logs data and sends it back to themselves as part of the intended function. Cyanogen has not done anything like that yet, and indie teams generally don't have an interest in doing so.
Re:Accidentally? (Score:5, Interesting)
Not interesting in the slightest. The difference between evil invasion of privacy and an accident is purely intent.
If a company had done it, you couldn't prove it one way or the other, so it's safe to assume the worst.
If, on the other hand, it's done in code that is openly published, at a time when a feature was being modified that during development would clearly have called for logging the actions to a file for debugging purposes, it shows quite a different level of intent.
You can still assume the worst, but if you do in this case we'll just assume your tinfoil hat would need to be retuned.
Re: (Score:1)
I disagree with that. No matter the intentions harm is still possible. So are you saying that if it were a company somehow they are only capable of malicious capitalistic greed, and do not possess the ability to make a mistake? That seems a bit over the top (speaking of tinfoil hats...). In this case it requires physical access to the device, and is therefore less of an issue than if it could be accessed remotely, or worse uploaded and stored on some centralized server. Rest assured that open or closed
Re: (Score:1)
I disagree with that. No matter the intentions harm is still possible. So are you saying that if it were a company somehow they are only capable of malicious capitalistic greed, and do not possess the ability to make a mistake? That seems a bit over the top (speaking of tinfoil hats...). In this case it requires physical access to the device, and is therefore less of an issue than if it could be accessed remotely, or worse uploaded and stored on some centralized server. Rest assured that open or closed source is not the issue here.
It's a matter of means, motive, and opportunity.
On one side, you have a company. Its sole purpose for existence is the creation of profit for its shareholders. Because their products are closed, they can introduce a security flaw under cover of closed source ("opportunity"). Because they make the product, they are the only ones who can introduce the security flaw ("means"). Security flaws are potentially lucrative, and the only reason a company exists is to make money ("motive").
On the other side, you have an o
Re: (Score:2)
No, what I am saying is that without context we can assume the worst. Companies can and often do make mistakes, and if those mistakes are found through a process of auditing rather than by security researchers finding locally stored sensitive information, as is usually the case, then they would be forgiven.
The issue here is that the open source code provides context into what happened. Could it have been nefarious? Possibly, but given the incident, the full code review provided, and the timetable, it is quite unlikely.
Open source // code review? (Score:5, Insightful)
That's one of the issues with having many committers: you can't review all the code before it ships off in a build. I seem to remember a bug in OpenSSL where some kid commented out an entropy line "because it showed warnings at compile-time" and managed to commit it without raising suspicions.
Bottom line, where are the code reviewers in this process? QA?
Re: (Score:2)
Continuous integration should be able to prevent such problems.
At its worst, it'll do no worse than the best of all code reviewers combined.
Re: (Score:3, Insightful)
Re: (Score:1)
Re: (Score:1)
Are you speaking about CM specifically or open source in general? With respect to the CM project, particularly on XDA, you will find a large number of people who ship binaries only instead of embracing the open source style of making branches in Git and using Gerrit. You just have to stick to the better-known builders and subscribe to their Git repo.
Re: (Score:1)
To be fair, that bug was caused by the Debian OpenSSL package maintainers, not by the OpenSSL developers themselves. Here is some information [theinquirer.net] on the bug in question.
While this bug in CyanogenMod is different and the developers themselves are responsible for it, it was not shipped in any official build. If it had been, it would have been a totally different matter.
Re: (Score:1)
Requires backup file or device (Score:1)
So, nothing to see here, move along.
Storm in tea-cup (Score:1)
What protection can you really expect from the screen lock? Someone who is determined enough can usually use the Android Debug Bridge to do whatever the hell they want with it anyway (either in recovery or when booted up). As the saying goes: if you have physical access to a device... all bets are off anyway.
The screen lock is simply to protect against most "attackers".
Re: (Score:2)
Excuse me, but... so what? (Score:1, Insightful)
You can bypass the lockscreen on any phone that has CM installed. Just hook it up to a PC with a USB cable, up pops the "Turn on USB storage" screen, hit Home, bam, you're in.
I don't use any lockscreen gesture or password, because I find them a PITA, and I want my gf to be able to use it without hassles. On the other hand, I try to treat my phone as I treat my wallet. I look around me when I pull it out of my pocket. I wait until the subway doors are closed. Etc.
Re: (Score:2, Informative)
You have to unlock it to access the dialog to enable USB storage.
Maybe you are thinking of USB debugging?
Re: (Score:1)
I have a phone here running CyanogenMod
Hooked it up to a PC with a USB cable
Phone's screen turns on, locked
Now what?
When you say "any phone" but you actually mean "My phone, on which I have disabled the lockscreen" then you look like a retard.
Re: (Score:2)
Re: (Score:2)
I had the same opinion, but I've recently added a lock gesture to stop my pocket from using the phone.
The Big Difference (Score:1)
The Comments of the Ars article are worth reading. (Score:5, Insightful)
Basically, the story is that:
It is debugging code left in a development build, which happens to be the build many people run as nightlies.
It does not write to a file. It is debug information written to a ring buffer in RAM. You would need to have an app installed with permission to read the logs, or connect a cable in debug mode and trace the log, to even see these messages (a rough sketch of the app route is at the end of this comment).
It was found in a code review, and removed.
So much of a non-issue that it is a wonder Ars even reported it. It seems Ars misread a mailing-list heads-up. We are waiting for Ars to publish a correction to their article.
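To make the "app with permission on the logs" point concrete, here is a rough sketch, assuming a pre-4.1 device where a third-party app could still hold android.permission.READ_LOGS in its manifest; the class and method names are invented for the example.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

final class LogBufferReader {
    // Dumps the shared logcat ring buffer. The calling app would need
    // <uses-permission android:name="android.permission.READ_LOGS" /> in its
    // manifest. On Android 4.1 and later, third-party apps only see their own
    // log output, so in practice the exposure is limited to USB debugging or
    // a backup image, as noted above.
    static List<String> dumpLog() throws IOException {
        // "-d" prints the current buffer contents and exits instead of
        // streaming forever.
        Process logcat = Runtime.getRuntime().exec(new String[] {"logcat", "-d"});
        List<String> lines = new ArrayList<String>();
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(logcat.getInputStream()));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        } finally {
            reader.close();
        }
        return lines;
    }
}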
Run to the hills! (Score:2)
CM10 Nightly... (Score:1)
The thread following TFA mentions that this is for CM10 nightlies, so if you're tracking the development branch, you just need to upgrade to the latest nightly to ensure you have the fix.
What is this wizardry? (Score:1)
This would be more interesting... (Score:1)