New Firefox Vulnerability Revealed
Not long after Firefox 3.5.1 was released to address a security issue, a new exploit has been found and a proof of concept has been posted. "The vulnerability is a remote stack-based buffer-overflow, triggered by sending an overly long string of Unicode data to the document.write method. If exploited, the resulting overflow could lead to code execution, or if the exploit attempts fail, a denial-of-service scenario." It's recommended that Firefox users disable Javascript until the issue is patched, though add-ons like NoScript should do the trick as well (unless a site on your whitelist becomes compromised).
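Based on the description above, a minimal sketch of the reported trigger pattern might look like the following. The payload length and character are assumptions on my part (the actual proof of concept may differ), and per Mozilla's follow-up the worst case is a crash, not code execution; on a patched browser this should do nothing harmful.

```javascript
// Hypothetical sketch of the reported trigger: an overly long Unicode
// string passed to document.write. Length and payload are assumptions.
const payload = "\u4141".repeat(1 << 20); // 2^20 characters of Unicode data

if (typeof document !== "undefined" && typeof document.write === "function") {
  // In an affected browser this call reportedly overflowed a stack buffer,
  // crashing the tab (denial of service, per the Mozilla security blog).
  document.write(payload);
} else {
  // Outside a browser (e.g. Node.js) we can only confirm the string builds.
  console.log("payload length:", payload.length);
}
```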
Update: 07/20 00:09 GMT by KD : An anonymous reader informs us that the Mozilla security blog is indicating that this vulnerability is not exploitable; denial of service is as bad as it gets.
Re:Defective by design (Score:5, Interesting)
Re:Unbounded (Score:4, Interesting)
Is That What's Crashing Xorg? (Score:0, Interesting)
I wonder if this bug is what's causing Xorg to crash, as described in this blog post? [wordpress.com]
I thought they tested 3.5 prior to release.
automate protection (Score:4, Interesting)
These recurring requests to turn off something are getting annoying. Why not automate the process? Set up a page somewhere like
www.mozilla.com/firefox/3.5.1/current-safety.txt
which would list something like
javascript: unsafe
java: safe
flash: safe
Then by default your browser would fetch that file and automatically implement Mozilla's recommendation of the day.
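The scheme the parent proposes could be sketched roughly as follows. The URL and the "feature: status" format are the commenter's invention, not a real Mozilla service, so everything here is hypothetical.

```javascript
// Sketch of a client for the hypothetical safety feed proposed above.
// The URL and "feature: status" line format are assumptions from the post.
function parseSafetyFile(text) {
  const recommendations = {};
  for (const line of text.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || !trimmed.includes(":")) continue;
    const [feature, status] = trimmed.split(":").map((s) => s.trim());
    recommendations[feature] = status; // e.g. "safe" or "unsafe"
  }
  return recommendations;
}

// A real browser integration would fetch the file, e.g.:
//   const text = await (await fetch(
//     "https://www.mozilla.com/firefox/3.5.1/current-safety.txt")).text();
// and then disable any feature marked "unsafe".
const sample = "javascript: unsafe\njava: safe\nflash: safe\n";
for (const [feature, status] of Object.entries(parseSafetyFile(sample))) {
  if (status === "unsafe") {
    console.log(`would disable ${feature} per today's recommendation`);
  }
}
```

One design question such a scheme would raise: the feed itself becomes a single point of attack, so it would need to be signed or otherwise authenticated.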
Re:Many eyes makes for secure code (Score:3, Interesting)
Let's just hope that all those eyes are friendly. How many black hats are scouring the source code to generate exploits to sell underground? As quickly as Firefox releases patches, when these bugs aren't reported it's no better than a proprietary browser.
Except that other people are a lot more likely to find the same bug, and report it regardless of the black hats.
The code may not be that relevant (Score:3, Interesting)
After all, FF is open during development, not just after release. 3.5 has been a long time in coming, the code has been out there for lots to see and lots have looked, yet this was missed.
The thing is, open or closed, any major project has a lot of people looking at the code, and at least some of those people, perhaps most, are highly skilled. What this means is that it isn't likely there's an extremely obvious bug in the code. It isn't the sort of thing where someone would look at the source and go "Oh look, they forgot to set getHacked = 0," or something like that. If it were obvious, the developers probably would have caught it. Instead the bugs are due to subtle interactions in the code that aren't easy to see.
So, more often than not, the way these things get found isn't someone poring over the code, it's someone trying out attacks on the finished product. They try sending it bad data of various kinds to see how it reacts, or perhaps they see it react in a certain way to good data that gives them an idea how they might craft bad data to exploit it. Whatever the case, they are working on the finished product, and not particularly concerned with the source.
This is why you find bugs even in projects that many people work on: developing something and looking at the code is very different from trying to exploit the finished product.
Re:That's not the first time (Score:4, Interesting)
Fix once, fix forever
The bug is in the Just-in-Time compiler inside of SpiderMonkey (TraceMonkey). This is brand new code as of 3.5.x. Of course there will be a ton of bugs found in it (just like the ton of bugs that have cropped up in SquirrelFish and have been subsequently patched).
I have to wonder why it's taken so long for anybody's security team to look at this code though. You'd think they'd look at this code before release and not after.
Re:Turn off javascript... (Score:3, Interesting)
But I also think it's silly to assume and design for Javascript
According to w3schools [w3schools.com], 95% of users have JS on. There's no reason to essentially design two separate sites to support the other 5%. And it could be argued that that 5% could either easily turn it back on if they choose (in which case, they're the lazy ones), or are using something really, really old and have no need to, or don't want to.
I'm not a web developer, but it seems obvious to me that while it's possible and often sensible to include the other 5% (which may include spiders, which you typically want), ignoring them because you don't have time for two designs is not at all silly. They may not even be the type of people you want on your site anyway.
Re:Not just Firefox? (Score:3, Interesting)
It crashes FF 3.5.1 and Safari 4.0.2 for me, but not Chrome 2.0.172.37 or IE 8.
Re:Turn off javascript... (Score:5, Interesting)
Wouldn't avoiding javascript make webpages smaller & therefore load faster?
Nope. To the contrary, a well-designed AJAX page that dynamically reloads sections instead of the entire page can potentially be much faster. Take the example of registering for a site account. Old way: fill out the entire form, submit it, wait for the server to validate everything, and get the whole page sent back just to learn that one field was wrong; repeat until you get it right. New way: each field is checked in the background as you fill it in, and only the relevant section of the page updates.
Alternatively, look at Slashdot itself. Yeah, it has its issues, but I have to say that I love the dynamic content loading. That's so much better (and easier on bandwidth!) than having to load a whole page just to expose a collapsed comment.
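The partial-reload idea the parent describes can be sketched like this. The endpoint, element id, and comment fields are hypothetical placeholders, not Slashdot's actual API; the point is only that expanding a comment costs one small request rather than a full page load.

```javascript
// Sketch of dynamic content loading: fetch just one comment's data and
// swap it into the page. Endpoint and element id are hypothetical.
function renderComment(comment) {
  // Pure helper: turn a comment object into an HTML fragment.
  return `<div class="comment"><b>${comment.author}</b>: ${comment.body}</div>`;
}

async function expandComment(commentId) {
  // One small request instead of reloading the whole discussion page.
  const resp = await fetch(`/comments/${commentId}.json`); // hypothetical
  const comment = await resp.json();
  document.getElementById(`comment-${commentId}`).innerHTML =
    renderComment(comment);
}
```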
Re:That's not the first time (Score:2, Interesting)
I'm not saying valgrind, etc. are bad, only that sometimes they can be misleading.
Re:Turn off javascript... (Score:3, Interesting)
>>>But the 95% percent of people with functioning browsers might appreciate those features
Nearly all those persons aren't even going to notice the difference between a Javascript dropdown menu and a CSS dropdown menu, so why bother with the larger JS version? I say follow the KISS principle - use CSS.
>>>why do the people stuck in 1996
That's not really the issue. Even today in 2009 there are people using slow dialup, satellite, or 500k DSL connections. You design your site so it loads quickly over these connections, instead of alienating your customers with 2-minute bloated pageloads.
Re:Turn off javascript... (Score:3, Interesting)
I don't think you wasted your time, as I quite agree that each party (server & content reader) has a right to only provide/accept according to their wishes. That is the defining characteristic of the Internet, not just the web. A great example is NNTP, which is currently under fire as well since the punditocracy and politicians just don't get it.
More to the point, as you've noticed, there is a definite lack of capability in the realm of critical thinking in the US, and it seems to be spreading. It wasn't even a requirement in our state's education system here unless you went to college, and even then, judging from the papers turned in, the students still didn't get it. Not good. The ability to think critically is fundamental to being more than just another industrial-society wage-slave. Furthermore, the Constitution was predicated on the notion that the voters would have that capability as well. I can hear a collective "whoops!" from the founding fathers, although I wouldn't be surprised if the political class likes the current status quo. I don't see the situation changing short of revolution, and that's about as likely as an asteroid strike, perhaps less.
Re:Turn off javascript... (Score:3, Interesting)
True. The whole emotional outrage that anyone would block ads is easy to summarize. Webmaster goes out of his way to knowingly place content onto a public network where it is freely accessible by anyone. Said Webmaster does not use a paywall, nor does he deny content to users who don't load the ads. When said content is freely accessed, Webmaster then says, in effect, "now you owe me something, so view my ads!" and feels cheated if they aren't viewed. He wants compensation for a thing at the same time that he is giving it away freely. He also wants me to honor an agreement in which I did not participate. This is the Webmaster's fault.
And that's alright; while I think it's silly, I also believe that website owners should be free to do this if they want to. I just refuse to be shamed or otherwise pressured into going along with someone else's faulty expectation. The need to try that on me is the red-flag indicator of the entitlement mentality I mentioned. It's the reason why I responded as I did, as most people who do this don't seem to realize that it's manipulative.
This is especially true when said pressure comes from people who have invested in similarly faulty expectations of their own. Most people don't seem to use ad blockers and they are not standard features of most browsers. In other words, most users have chosen, not actively but by default, to give up their potential control, allowing the remote site full control of page layout. For that reason, many ad-supported public sites have been successful. They should be thankful that mitigating factors can help flawed premises to produce desirable conclusions instead of concerning themselves with how I configure my browser. Besides, they can put that effort towards reconfiguring their servers.
I'm glad whenever I see that someone understands the severity and long-term outcome of this problem. That understanding is one of the single most effective things you can personally do about it. I imagine that if you didn't see the problem, the tone of my previous post wouldn't make sense and either that post or this one would seem like too much of a rant (eh, too late).
I think "wage-slave" is a somewhat mild term. I'd go so far as to say "automaton forever deprived of the ability to live his own life." I've heard more cynical folks say that you can't miss something if you have never known what it was like, yet I've never met a person who could be described that way who was also happy. In a sense, the problem is hidden in plain sight. It's so widespread and so common that it is often accepted as normal.
There's a bit more to it than that. If "political class" includes "19th century industrial tycoons" and their descendants, and there's no reason why it shoul