Mozilla Says a New Firefox Security Bug is Under Active Attack (techcrunch.com) 68

Mozilla has warned Firefox users to update their browser to the latest version after security researchers found a vulnerability that hackers were actively exploiting in "targeted attacks" against users. From a report: The vulnerability, discovered by Chinese security company Qihoo 360, lies in Firefox's just-in-time compiler. The compiler is tasked with speeding up performance of JavaScript to make websites load faster. But researchers found that the bug could allow malicious JavaScript to run outside of the browser on the host computer. In practical terms, that means an attacker can quietly break into a victim's computer by tricking the victim into accessing a website running malicious JavaScript code. But Qihoo did not say precisely how the bug was exploited, who the attackers were, or who was targeted.
This discussion has been archived. No new comments can be posted.

  • by bobstreo ( 1320787 ) on Friday January 10, 2020 @11:10AM (#59606660)

    #CVE-2019-17026: IonMonkey type confusion with StoreElementHole and FallibleStoreElement

    Reporter: Qihoo 360 ATA
    Impact: critical
    Description: Incorrect alias information in IonMonkey JIT compiler for setting array elements could lead to a type confusion. We are aware of targeted attacks in the wild abusing this flaw.
    References: Bug 1607443
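The advisory is terse, so as an illustration only: the real CVE-2019-17026 flaw lived inside IonMonkey's alias analysis and cannot be reproduced from plain JavaScript. This sketch merely shows the *class* of code involved: a hot element-store function a JIT would type-specialize, followed by an out-of-bounds store that takes the "hole" path the advisory names.

```javascript
// Illustrative sketch only -- not an exploit. The bug was in the JIT's
// internal bookkeeping, not in any particular script.
function storeElement(arr, i, v) {
  arr[i] = v; // a JIT emits a specialized "store element" for hot code
}

// warm up with doubles so an optimizing JIT would type-specialize
const a = [1.5, 2.5, 3.5];
for (let i = 0; i < 10000; i++) storeElement(a, i % 3, i * 0.1);

// a store past the end creates a "hole" (the StoreElementHole path);
// mis-modeled aliasing between that slow path and the specialized fast
// path is the class of bug the advisory describes.
storeElement(a, 10, { evil: true });
console.log(a.length); // 11
```

The point of the sketch is that the same source-level store takes different compiled paths depending on the array's shape; the advisory says the compiler's alias information for those paths was wrong.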

    • by bjwest ( 14070 )

      I see none of that info in my 'About Mozilla Firefox' pulldown. In fact, I see no pulldown labeled 'About Mozilla Firefox'. There's an 'About Firefox' item in the 'Help' menu, which opens an 'About Mozilla Firefox' dialog with a 'What's new' link that takes you to a page with another link called 'Security fix' that will take you to a page that shows what you posted.

      Not everyone on Slashdot knows the ins and outs of how to find specific information related to bug and security fixes. You really should be more specific.
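For anyone who'd rather not click through the menus described above: one way to check whether an installed Firefox predates 72.0.1 (the release Mozilla shipped with the fix for CVE-2019-17026) is a version comparison with `sort -V`. A sketch, assuming a `firefox` binary on PATH:

```shell
#!/bin/sh
# Sketch: compare the installed Firefox version against 72.0.1,
# the release that shipped the fix for CVE-2019-17026.
fixed="72.0.1"
if command -v firefox >/dev/null 2>&1; then
  installed="$(firefox --version | awk '{print $NF}')"
  # sort -V orders version strings numerically; if the fixed version
  # sorts first (or ties), the installed one is at least 72.0.1
  oldest="$(printf '%s\n' "$fixed" "$installed" | sort -V | head -n1)"
  if [ "$oldest" = "$fixed" ]; then
    echo "Firefox $installed: at or past the fixed release"
  else
    echo "Firefox $installed: older than $fixed, update now"
  fi
else
  echo "firefox not found on PATH"
fi
```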

  • One word (Score:5, Insightful)

    by TheDarkMaster ( 1292526 ) on Friday January 10, 2020 @11:11AM (#59606664)
    NoScript.
    • Javascript brings an invaluable addition to web pages, greatly improving the user experience. Why is NoScript the systematic answer to a Javascript-related bug? So, after the Meltdown and Spectre CPU attacks, the solution is NoCPU?
      • Easy answer: Javascript brings the ability of untrusted assholes to run code on your machine. I know folks these days have a problem with "your". This means something that belongs to an individual (yes, that word is also under threat) and isn't a place for random assholes to run whatever they feel like. Thus, when some Chinese or North Korean asshat finds a way to run code outside of that sandbox you get fucked like a housecat. The features and functions JS brings to the web are significant, yes. That being
  • People are forced to use old Firefox versions for various reasons and will be vulnerable to this because Mozilla abandoned them.

    Version 45 because they have old computers that don't support SSE2, Version 52 because they still need Windows XP (legacy app support), Version 56 because they ignored the pleas not to drop XUL support. And soon people won't upgrade to version 74 because it won't allow TLS 1.1 and 1.0 connections, which are NEEDED in old corporate environments, forcing them to use 68 ESR forever.
    • by anarcobra ( 1551067 ) on Friday January 10, 2020 @11:44AM (#59606734)
      If people NEED all of those things maybe they NEED to pay Mozilla to keep supporting those versions of Firefox.
      Especially people who NEED it because of corporate environments.
      What exactly is stopping all those people who NEED it so badly from downloading the sources, backporting the patch and compiling it themselves?
      Seems like the NEED is not so great if they aren't even willing to go that far.
      • Re: (Score:2, Insightful)

        by Train0987 ( 1059246 )

        Easier to ditch Firefox altogether and never use another Mozilla product ever again. That's why their installed base decreases every day.

        • People are forced to use old Firefox versions for various reasons [...] Version 45 because they have old computers that don't support SSE2, Version 52 because they still need Windows XP (legacy app support)

          Easier to ditch Firefox altogether

          In favor of what? The current version of the other major multi-platform web browser [wikipedia.org] doesn't run on pre-SSE2 CPUs or Windows XP either.

          • Why should Google care about supporting an OS almost two decades old? Will today's Firefox run on a Linux distro of the same vintage? Of course not. Same goes for an equally old instruction set.

            • Why should google care about supporting an OS almost two decades old?

              Because they still work?

              Granted, because of the jungle that is the internet you should no longer connect one of these directly to the network, but being over ten years old doesn't mean the hardware suddenly stopped working.

          • by gtall ( 79522 )

            You are supposed to ditch Firefox because this guy on Slashdot mentions it is bad and doesn't float his boat. You are supposed to use some whizzy net twiddler which has superior characteristics that only he and a few others care about.

          • You can't compile WebKit for pre-SSE2 machines anymore?
            • by tepples ( 727027 )

              Which web browser for a non-Mac desktop or laptop computer still uses WebKit?

              • by laffer1 ( 701823 )

                Epiphany and Midori. The only browsers that work on my OS right now. LLVM devs won't take patches upstream that we need to get Rust working for recent Firefox builds. Google won't take upstream patches for BSDs.

        • I like Firefox, and I'll keep using it because I don't care to be part of the Google or Microsoft bot-nets. Real easy for someone like you to say something like that now isn't it?
      • If people NEED all of those things maybe they NEED to pay mozilla to keep supporting those version of firefox.

        The web, in general, does not support old versions. I still keep coming across web sites that will work exclusively in the absolute latest version of Chrome. It's far more sensible to just make the new versions not suck, but Mozilla has been repeatedly screwing that up for more than a decade. More funding won't change bad policies and stupid agendas.

        What exactly is stopping all those people who NEED it so badly from downloading the sources, backporting the patch and compiling it themselves?

        That's practically every Firefox port ever. Nobody likes to fork, but they have to because Mozilla has never been sensible with regards to maintaining old f

    • by Anonymous Coward

      SSE2 dates to the beginning of the century. If you're still running a machine that old, the problem is the user.

      • by xonen ( 774419 )

        SSE2 dates to the beginning of the century. If you're still running a machine that old, the problem is the user.

        What? Why?

        There's nothing wrong with hardware from last century, nor with its users. If anything, the user took good care of the hardware so it's still in working order.

        This idea of 'old is obsolete' is so nuts. It may apply to your iphone, but it does not apply to any random piece of equipment. And you want to save the planet and the environment, but you think it's normal to trash working hardware after a year or two?

        Now you're probably gonna mumble something like 'collectors' or 'your hifi system doesn't count'

        • by ArchieBunker ( 132337 ) on Friday January 10, 2020 @01:00PM (#59607058)

          Your circa 2000 Pentium 4 box is painfully slow. I tested Windows 7 on a P4 HT box and it worked but felt laggy. I can only imagine what javascript laden websites would be like. Look I love old hardware and even help out at an actual computer museum but trying to do modern tasks on it is pointless and a huge waste of time. This is coming from a person with multiple SGI and IBM boxes.

        • I wholeheartedly agree. In my opinion, the only valid reason to decommission hardware is that it is broken and unfixable. Older hardware gets repurposed to less demanding tasks. I have computers with a single-core Celeron (Eee PC 904, with the original spinning rust swapped for an SSD after it failed), I have a Core2Duo with SSD and Linux, still able to even play games such as Borderlands 2, my media PC is a dual-core Intel that was thrown in the trash at a friend's workplace, and so on. Producing even a s
      • by xack ( 5304745 )
        People still drive old VW Beetles and Morris Minors; you shouldn't need a 64-bit multi-core processor just to access websites.
        • by Osgeld ( 1900440 )

          You don't need a 64-bit multi-core processor just to access websites; you need an appropriate browser for your legacy machine. Forcing Firefox or whoever to support it is akin to bitching that VW doesn't make parts for your 1959 van. There's a specialty market for that, JUST as there is for software on legacy computers.

        • People still drive old VW Beetles

          Difference is that roads didn't change in a way that makes VW Beatles harder/slower to drive. On the other hand, websites have been using more and more JS/CSS... expecting the user to have a decent computer to view the pages decently.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      If you're using Windows XP and TLS 1.0 to access external websites, Mozilla isn't the cause of your security vulnerabilities.

    • I have a UPS which supports an old SSL protocol. My only choice with modern Firefox is to connect over HTTP. Mind you, this is a LAN device that's firewalled from the world.

      If I have an unknown APT on my network, I'm better off running SSLv3 to manage it than cleartext HTTP. This is obvious to anybody who understands the security process and security economics.

      Mozilla understands security protocols well but not the broader process.

      My effective choices are to spend $2200 replacing a decent UPS or to keep

      • Why not just connect a network cross-over cable to it at times you actually want to manage it?

        Or maybe a better option would be to just segment your network and put all admin consoles (insecure or otherwise) on that network. Don't give the admin network access to the Internet. Now all of that traffic can be insecure as you want without having to worry about it.
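The segmentation idea above can be sketched as a single forwarding rule. The subnet addresses are hypothetical, and a real deployment would belong in your firewall's own config, not an ad-hoc script; this only runs the rule when root and iptables are actually available, otherwise it prints it:

```shell
#!/bin/sh
# Sketch: a dedicated admin subnet (10.0.99.0/24 is hypothetical) whose
# traffic is never forwarded outside the internal 10/8 range, so the
# insecure management consoles on it can't reach the Internet.
ADMIN_NET="10.0.99.0/24"
RULE="-A FORWARD -s $ADMIN_NET ! -d 10.0.0.0/8 -j DROP"
if [ "$(id -u)" -eq 0 ] && command -v iptables >/dev/null 2>&1; then
  # drop anything leaving the admin net for a non-internal destination
  iptables $RULE
else
  echo "dry run: iptables $RULE"
fi
```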

      • I'd use a VM for outdated UPS, KVM-over-IP, etc. Which would be on a separate management LAN.

        Here is an idea, though. I wonder what you think. For under $50 (as low as $20), you could have a single that plugs into that old equipment and translates between insecure SSL/TLS and modern encryption.

        The $13.90 Orange Pi R1 has two Ethernet ports and runs Mix, so it could proxy. I think the NanoPi R1 is $29. I wonder if there would be any interest in a pre-configured dongle for that purpose. It could use pow

        • Without typos, hopefully:

          Here is an idea, though. I wonder what you think. For under $50 (as low as $20), you could have a dongle that plugs into that old equipment and translates between insecure SSL/TLS and modern encryption.

          The $13.90 Orange Pi R1 has two Ethernet ports and runs Linux, so it could proxy. I think the NanoPi R1 is $29. I wonder if there would be any interest in a pre-configured dongle for that purpose. It could use power over Ethernet.

          But again, I'd just use a VM with a browser configured
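The "dongle" idea above amounts to a protocol-translating proxy, which existing tools can already approximate. As a hedged sketch (the device address is hypothetical, and this has not been tested against any real UPS web card), an stunnel configuration on a small Linux box could accept plain HTTP from the admin host and speak the legacy protocol to the device:

```ini
; stunnel.conf sketch: bridge a modern client to a legacy-TLS device.
; The browser connects to http://127.0.0.1:8080 on the admin host;
; stunnel opens a TLSv1 connection to the UPS web card (address below
; is hypothetical). Keep all of this on an isolated management network.
client = yes

[ups-bridge]
accept  = 127.0.0.1:8080
connect = 192.168.10.50:443
sslVersion = TLSv1
```

The browser-to-proxy hop is cleartext, so this only makes sense on a management LAN that is already segmented, as discussed above.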

    • by btroy ( 4122663 )
      Yikes!

      that old?

      You need to use Lynx instead, it'll give you plenty of speed on equipment that ancient.
    • by tepples ( 727027 )

      Version 45 because they have old computers that don't support SSE2

      SSE2 exists in every Pentium 4 CPU since 2000 and every Opteron and Athlon 64 CPU since 2003. If a PC was manufactured prior to the introduction of the Pentium 4, a $200 PC could probably pay for itself in power use alone.

      Version 52 because they still need Windows XP (legacy app support)

      Why not run the legacy apps inside Windows XP inside a virtual machine inside a modern operating system, and then run the web browser directly within the modern system?
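The VM suggestion above can be sketched with QEMU. The disk image name is hypothetical (you'd create one from the old machine), and `restrict=on` keeps the guest off the network entirely, which also answers the "don't connect XP directly to anything" concern:

```shell
#!/bin/sh
# Sketch: boot a legacy Windows XP image under QEMU with no network
# access at all. The image name is hypothetical.
QEMU_CMD="qemu-system-i386 -m 512 -hda xp-legacy.qcow2 -nic user,restrict=on"
if command -v qemu-system-i386 >/dev/null 2>&1 && [ -f xp-legacy.qcow2 ]; then
  $QEMU_CMD
else
  # QEMU or the image is absent here; just show the invocation
  echo "$QEMU_CMD"
fi
```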

    • People are forced to use old Firefox versions for various reasons and will be vulnerable to this because Mozilla abandoned them. Version 45 because they have old computers that don't support SSE2, Version 52 because they still need Windows XP (legacy app support), Version 56 because they ignored the pleas not to drop XUL support. And soon people won't upgrade to version 74 because it won't allow TLS 1.1 and 1.0 connections, which are NEEDED in old corporate environments, forcing them to use 68 ESR forever. Mozilla keeps trying to upgrade for "security" reasons, but they abandon the users that can't. The only alternative is to use browsers like Waterfox and Pale Moon/New Moon since Mozilla would rather be a Chrome drone than make a real web browser.

      SSE2? That's ancient. What are you running that predates the P4?

      There is no good reason to run a computer that old. Just upgrade to the cheapest possible thing available today for better performance, lower power usage, better everything.

      Windows XP? Are you serious? There is no good reason to use that at all. Ever. Stop.

    • People are forced to use old Firefox versions for various reasons and will be vulnerable to this because Mozilla abandoned them. Version 45 because they have old computers that don't support SSE2, Version 52 because they still need Windows XP (legacy app support), Version 56 because they ignored the pleas not to drop XUL support. And soon people won't upgrade to version 74 because it won't allow TLS 1.1 and 1.0 connections, which are NEEDED in old corporate environments, forcing them to use 68 ESR forever.

      If people "need" to use Windows XP, or old, insecure versions of TLS, etc., I'd say they likely have much larger security problems than the fact that they can't upgrade Firefox. If they "need" to still run those things, and they care at all about security, they should have other mitigations in place to isolate those boxes and the version of Firefox running on them shouldn't matter at all...

  • Many traditionalist users of Slashdot, SoylentNews, and other tech forums choose not to run JavaScript or WebAssembly at all. Escape defects like this are one of the two reasons that such users cite most often, the other being abusive adtech. Traditionalist users want HTML documents to be static, apart from server-side action on form submission and subsequent full page reload, and networked applications to be native. But that raises another question:

    How are native applications any safer in practice than web applications?

    A user who shuns JavaScript-driven web applications would need to download and install a native application for the same functionality. But unlike a web browser, which has a sandbox with occasional defects that the browser publisher quickly patches, most desktop operating systems don't even try to put a native application in a sandbox. This means that once installed, a native application has full read and write access to every file in each user's home directory. Even if an application is distributed in source code form under a free software license, and therefore theoretically auditable, the number of users who go to the effort of actually auditing every line of code in a program and in the libraries it uses is a rounding error, as the Heartbleed incident showed.

    • by gweihir ( 88907 )

      How are native applications any safer in practice than web applications?

      Simple: It is you that controls what code is installed. Apart from that, web-apps are more secure, because they are running in sandboxes. But the argument that you decide on installed code is pretty strong. That is, if you know what you are doing.

      • by flippy ( 62353 )

        How are native applications any safer in practice than web applications?

        Simple: It is you that controls what code is installed. Apart from that, web-apps are more secure, because they are running in sandboxes. But the argument that you decide on installed code is pretty strong. That is if you know what you are doing.

        It's a matter of risk vs flexibility, really.

        Native applications can be inherently more secure because they are a limited attack surface - if you've got 10 native apps installed, you can only be attacked through flaws in those apps.

        Web apps, on the other hand, run in a web browser, which means they're running in an environment that is designed to connect to any server. IF you only ever went to 10 trusted sites that replicated the functionality of the 10 native apps mentioned above, then yes, THOSE web apps

        • What might be very useful is a browser that can be configured in such a way that locally-executed code (JS/WebAssembly/etc) will only run on specified sites and not anywhere else.

          Firefox with the JavaScript Control extension [mozilla.org] installed is that browser. So is Firefox with the GNU LibreJS extension [mozilla.org], though it automatically whitelists any script that it can determine is distributed under a free software license.

          • by flippy ( 62353 )

            What might be very useful is a browser that can be configured in such a way that locally-executed code (JS/WebAssembly/etc) will only run on specified sites and not anywhere else.

            Firefox with the JavaScript Control extension [mozilla.org] installed is that browser. So is Firefox with the GNU LibreJS extension [mozilla.org], though it automatically whitelists any script that it can determine is distributed under a free software license.

            Are these better than NoScript? I've been playing around with that one a bit and it seems pretty flexible and easy to use.

            • JavaScript Control is a simple-to-understand extension that just whitelists all script or no script per domain. It's grandma-easy for the specified use case ("locally-executed code (JS/WebAssembly/etc) will only run on specified sites and not anywhere else"). You're right that NoScript is more flexible. But more flexible software has proven to take more time to learn, and in my experience, many older users lack patience to learn anything outside the task directly at hand. Some even forget about tabbed brows

      • the number of users who go to the effort of actually auditing every line of code in a program and in the libraries it uses is a rounding error, as the Heartbleed incident showed.

        But the argument that you decide on installed code is pretty strong. That is if you know what you are doing.

        I doubt anybody understands every line of code in the operating system or applications that they use on a modern PC. Thus people don't know what they are doing.

        • by gweihir ( 88907 )

          Competent risk management does not require understanding things at that level. In fact, trying to do it at that level is a complete fail.

          • by tepples ( 727027 )

            Let's say a user visits a website about a particular Internet service that asks the user to download, install, and run a native application in order to use that service. What steps make up "competent risk management" with respect to evaluating whether or not to download, install, and run said application?

            • by gweihir ( 88907 )

              That depends. Your scenario is missing most of the needed information to make a decision. Hence the first step is to get more information.

              • by tepples ( 727027 )

                Hence the first step is to get more information.

                Which information about an Internet service and the native client application therefor is most relevant to get in order to make a decision about the information security risks of running the client for that service on a user's device?

    • by raymorris ( 2726007 ) on Friday January 10, 2020 @12:53PM (#59607030) Journal

      You may or may not remember all of the worms and other nasties caused by Outlook macros, and Office macros in general. The problem that caused that, and causes the problem on the web today, is the lack of distinction between a document (data) and a program. Web pages and emails should be things you read - they shouldn't be able to DO anything to your computer.

      You do NOT need to download a separate program to read a Slashdot post. This post is a document, there is no need for it to be a program.

      Thinking that way, you would block JavaScript for general web surfing. You'd perhaps whitelist Google Docs if that's a program you use (on the web), or use a different browser for web *applications* as opposed to reading web *documents*.

      In that same vein, I used to have a different browser that had Flash. My main browser didn't have Flash, because Flash was mostly annoying ads, and a security issue. Maybe three times per year I would have a Flash site I wanted to use, so I'd use my Flash-enabled browser for that site.
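The two-browser split described above can also be done with Firefox profiles rather than two different browsers. As a sketch: `javascript.enabled` is a real Firefox pref and `firefox -P` selects a profile, so a "reading" profile could carry a user.js like the one below while a separate "applications" profile leaves scripting on:

```js
// user.js in the hypothetical "reading" profile: disable JavaScript
// wholesale. (javascript.enabled is a real Firefox pref; flipping it is
// supported but breaks most modern sites, which is rather the point.)
user_pref("javascript.enabled", false);
```

Launching `firefox -P reading` for documents and `firefox -P apps` for web applications then mirrors the Flash-browser arrangement described above.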

      • Of course for lots of us who tried blocking Javascript in general we found that it broke so many sites that we gave up doing that.

        • Then you (the lot of us you refer to) are the problem. Instead you should have continued to block JavaScript and sent a nasty-gram to the owner of the website explaining why you were unable to partake of their website anymore, and thereafter do not partake of their website. If everyone did this, then malicious websites using JavaScript would soon be bankrupt and no longer exist.

          Problem solved.

          • and thereafter do not partake of their website.

            This can prove impractical when a government website requires JavaScript and provides no paper alternative with the same functionality. This has been the case, for example, for the Regulations.gov portal through which United States residents can submit comments on proposed regulations. In 2016, the U.S. Copyright Office refused to accept comments submitted through U.S. Mail, instead requiring users to use a web application written in JavaScript, and proprietary JavaScript at that [fsf.org]. It's also the case for sev

        • I just don't visit the broken sites any more. I'm the commodity, but I have agency.

      • You do NOT need to download a separate program to read a Slashdot post. This post is a document, there is no need for it to be a program.

        Slashdot is a program, and your comments are the documents that it is used to view.

        The typical view of a comment section does not contain all comments. When you click a post whose score is below the breakthrough threshold, Slashdot needs to retrieve the post from the server in order to insert it into the HTML view. This is done by a script. Would it be better to cause a full page reload just to add one comment to the view?
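This is not Slashdot's actual client code, but the pattern under discussion can be sketched with stand-ins (both hypothetical) for the transport and the DOM: fetch the one hidden comment and insert it, instead of reloading the whole page.

```javascript
// Hedged sketch of the AJAX pattern: fetchComment and the container
// object are hypothetical stand-ins for a real HTTP call and the DOM.
async function revealComment(id, fetchComment, container) {
  // fetch just the one below-threshold comment from the server
  const comment = await fetchComment(id);
  // insert it into the existing view; nothing else is re-fetched
  container.children.push(comment);
  return container;
}

// mock transport and container for demonstration
const fakeFetch = async (id) => ({ id, text: `comment #${id}` });
const container = { children: [] };

revealComment(42, fakeFetch, container).then((c) => {
  console.log(c.children.length); // 1: one node added, no page reload
});
```

The full-page-reload alternative would re-send every visible comment just to add this one, which is the bandwidth trade-off the comment above describes.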

        • Yeah Slashdot is a Perl program, running on the server.
          I'm about to use comment.pl.

          > insert it into the HTML view.
          > This is done by a script. Would it be better to cause a full page reload just to add one comment to the view?

          I use CSS for that, because it's a PRESENTATION issue, how the comment is displayed.

          • Yeah Slashdot is a Perl program, running on the server.
            I'm about to use comment.pl.

            Slashdot is two programs. One is the server side, to which you refer. The other is the client side, which handles requests by the user to retrieve additional comments and add them to the DOM.

            insert it into the HTML view. This is done by a script. Would it be better to cause a full page reload just to add one comment to the view?

            I use CSS for that, because it's a PRESENTATION issue, how the comment is displayed.

            You can use CSS to reveal a comment that is already part of the DOM but just hidden. Unless I'm severely mistaken, CSS can't add additional comments to the DOM that were not originally part of the DOM. In order to save network bandwidth, client CPU parsing time, and client RAM, Slashdot's normal comment view does not in

            • > insert it into the HTML view. This is done by a script. Would it be better to cause a full page reload just to add one comment to the view?

              Actually it does load a completely new page. Try it. At least in the view I'm seeing - fuck beta.

              In 1998, CmdrTaco decided to have it "cause a full page reload" *even though those comments are in fact part of the conversation*. CSS wasn't supported by browsers yet, the term "Ajax" hadn't been invented, and at the time he decided to HIDE comments which are part of

              • by tepples ( 727027 )

                Actually it does load a completely new page. Try it. At least in the view I'm seeing - fuck beta.

                Buck feta here too. During the beta fiasco, I used what the beta interface referred to as "classic," which was previously called D2 (with AJAX).

                In 1998, CmdrTaco decided to have it "cause a full page reload" *even though those comments are in fact part of the conversation*.

                And over the past two decades, the preferences of the majority of end users have shifted. Nowadays, the majority prefer the AJAX paradigm to the full page reload paradigm that you prefer.

                Right, something that is already part of the conversation, but hidden with the message "1 hidden comment". Logically it's there, but hidden, hence the wording "1 hidden comment".

                This "1 hidden comment" includes comments that were sent to your device but not rendered, either because of your threshold or because you collapsed it. It doesn't include comments t

  • The Chinese, always them!!!11!!!11!
  • Comment removed based on user account deletion
  • This is reported by the same company who is behind the Samsung phone security app that people were recently claiming was sending data to China.
