Windows 8 Changes Host File Blocking

An anonymous reader writes "Windows 8 has been confirmed to not only ignore, but also modify the hosts file. As soon as a website that should be blocked is accessed, the corresponding entry in the hosts file is removed, even if the hosts file is read-only. The hosts file is a popular, cross-platform way of blocking access to certain domains, such as ad-serving websites."
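For readers unfamiliar with the mechanism: a hosts-based block simply maps a domain name to a local or unroutable address, so lookups never reach the real site. A minimal sketch (the entries are illustrative):

    # C:\Windows\System32\drivers\etc\hosts on Windows, /etc/hosts on Unix-like systems
    127.0.0.1    ad.doubleclick.net
    0.0.0.0      ads.example.com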
This discussion has been archived. No new comments can be posted.

  • Another reason... (Score:5, Insightful)

    by Spritzer ( 950539 ) * on Sunday August 19, 2012 @03:21PM (#41048071) Journal
    So, after reading the article this can be summarized as "Microsoft gives you one more reason to disable Windows Defender and use a third party AV app."
    • by binarylarry ( 1338699 ) on Sunday August 19, 2012 @03:23PM (#41048081)

      Microsoft gives you one more reason to switch to Mac OSX or Ubuntu.

      • by Anonymous Coward on Sunday August 19, 2012 @03:29PM (#41048133)

        I completely agree. This is the nail in the Windows coffin for me.

        • by ackthpt ( 218170 ) on Sunday August 19, 2012 @03:39PM (#41048225) Homepage Journal

          I completely agree. This is the nail in the Windows coffin for me.

If you are an enterprise IT manager this is your dream come true. You're not seeing this from the angle Microsoft is; they count on enterprise income more than they do on home users.

          • Re:Another reason... (Score:5, Interesting)

            by Bill, Shooter of Bul ( 629286 ) on Sunday August 19, 2012 @03:46PM (#41048265) Journal

            Why is that a dream come true for an enterprise IT manager? You *want* employees to be on facebook? Or are you saying that crazy behavior on the windows platform ensures your job security?

            • Re:Another reason... (Score:5, Informative)

              by Anonymous Coward on Sunday August 19, 2012 @03:50PM (#41048293)

Enterprise customers will block it using DNS or Group Policy, not the hosts file.

              • Re: (Score:3, Insightful)

                by Anonymous Coward

Yes, but my point is that I will now have to use a firewall to keep Adobe CS_ from phoning home.

                • by zoloto ( 586738 )
Frankly, I wish there were a Windows clone of iptables.
                  • Re:Another reason... (Score:5, Informative)

                    by X0563511 ( 793323 ) on Sunday August 19, 2012 @08:38PM (#41050049) Homepage Journal

                    Have you seen the firewall that comes with the Windows 7 generation? It's no iptables, but it can do the job now.
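For instance, something roughly equivalent to a hosts-style block can be expressed as an outbound Windows Firewall rule from an elevated prompt; the rule name and address below are placeholders:

    netsh advfirewall firewall add rule name="Block ad server" dir=out action=block remoteip=203.0.113.10

Note that it filters by IP address rather than by host name, so it is not a drop-in replacement for a hosts entry.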

                    • by SeaFox ( 739806 ) on Sunday August 19, 2012 @08:55PM (#41050137)

                      I think what he wants is a firewall system that explicitly cannot be controlled by the operating system without his approval. So if he blocks something he can be assured it will stay blocked regardless of what kind of backroom deals Microsoft makes.

The most annoying thing about these latest versions of Windows is that there appears to be this new class of user with control that supersedes that of the owner of the hardware.

                    • Comment removed (Score:5, Interesting)

                      by account_deleted ( 4530225 ) on Sunday August 19, 2012 @11:44PM (#41051107)
                      Comment removed based on user account deletion
                    • by GeniusDex ( 803759 ) on Monday August 20, 2012 @01:46AM (#41051643) Homepage

It is inherently impossible to build something into an OS that cannot be controlled by that OS itself. If you want these really secure firewalls, they should be on a separate appliance and all your traffic should be routed through them.

                    • by AmiMoJo ( 196126 ) on Monday August 20, 2012 @02:36AM (#41051837) Homepage Journal

                      You seem to be a bit confused about how Windows works.

                      If it is your PC and you are the administrator then yes, you have full control over it. You can set any firewall rules you want and they won't be overwritten by "backroom deals" or anything like that. Hosts was always an unsupported system file hack, and there is a pretty powerful firewall in Windows 7.

On the other hand, if it isn't your computer then the (network) administrator can overrule you with Group Policy settings. This is exactly the same as on a Linux box where you don't have root access. Your administrator can decide whether you have access to the firewall at all, right down to what types of firewall rules you can make. There really is a huge amount of fine-grained control available. Enterprise admins love it.

                    • Re:Another reason... (Score:5, Informative)

                      by oreaq ( 817314 ) on Monday August 20, 2012 @04:45AM (#41052373)

                      Hosts was always an unsupported system file hack

                      Where do you get this idea from? Hosts files are a common part [wikipedia.org] of the IP stack of various operating systems. Microsoft has been using hosts files at least since Windows 95. They are fully supported and documented [microsoft.com].

                    • by AaronLS ( 1804210 ) on Monday August 20, 2012 @11:55AM (#41056307)

There were no backroom deals here. Certain domains are commonly targeted by malware. If malware, or perhaps another user or IT person with malicious intent, modifies your hosts file to redirect facebook.com to a phishing site, the browser will still appear to be at the legitimate facebook.com domain while actually serving the phishing site. It won't have SSL, but your average user won't notice. So this is done in the interest of preventing the hosts file from being a tool for malware or malicious users, not because of some backroom deal MS made with Facebook.

                  • Re:Another reason... (Score:5, Informative)

                    by hobarrera ( 2008506 ) on Sunday August 19, 2012 @09:54PM (#41050479) Homepage

iptables? Really? Have you even tried OpenBSD's pf? That's a powerful yet easy-to-use firewall!
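As a rough illustration of pf's syntax (the address range is a placeholder), a pf.conf rule to drop outbound traffic to an ad network might look like this, loaded with pfctl -f /etc/pf.conf:

    # /etc/pf.conf
    block out quick proto { tcp, udp } from any to 203.0.113.0/24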

                • Comment removed (Score:5, Informative)

                  by account_deleted ( 4530225 ) on Sunday August 19, 2012 @11:16PM (#41050925)
                  Comment removed based on user account deletion
                  • Re:Another reason... (Score:5, Informative)

                    by Anonymous Coward on Monday August 20, 2012 @03:44AM (#41052073)

Hell, if you are worried about power you can buy one of those little plug computers or, my personal favorite, the cheap little AMD E350 kits. Those things are cheap, make great mini-servers or office boxes, and only draw about 18W under load and less than 6W on average. Great little units.

Seconded; however, you'd best steer clear of the Asus and ASRock boards if you plan on doing anything with the PCI slots on those boards. They all use the ASMedia 1083 PCI bridge, which happens to be broken beyond belief. See here [kernel.org] and here [marc.info]. TL;DR: the controller has a hardware bug where it fails to deassert its interrupt status, causing IRQ storms that effectively make connected devices useless.

            • by LordLimecat ( 1103839 ) on Sunday August 19, 2012 @05:07PM (#41048843)

              An IT manager using Hosts is an IT manager that needs to be replaced.

First, if you are doing your web filtering on the workstation, you are doing it badly, badly wrong. Second, HOSTS is not something that is easily maintained or modified. Third, there are about a zillion better ways to accomplish blocking than using a HOSTS file.

It's basically a kludge from bygone days before DNS, and for 99% of use cases where you might think "I can use a HOSTS file for that", there are far better methods -- or else the thing you are trying to do is retarded.

              • by cayenne8 ( 626475 ) on Sunday August 19, 2012 @05:25PM (#41048939) Homepage Journal

It's basically a kludge from bygone days before DNS, and for 99% of use cases where you might think "I can use a HOSTS file for that", there are far better methods -- or else the thing you are trying to do is retarded.

                Even allowing for your premise....

                Why on earth would MS destroy a simple, well known behavior that users might indeed have reason to want to use? Why 'fix' something that isn't broken? Why break something that wasn't hurting anything else on the OS?

No harm in leaving a well-known tool and behavior be... but plenty of reason not to fuck with it, no?

I agree, I just don't think there's anything remotely noteworthy here. If it weren't for awful '90s-era programs that can't handle DNS, I'd say kill the entire thing off and end the stupid "Hosts is a good idea" myth altogether.

                • Re:Another reason... (Score:5, Interesting)

                  by Martin Blank ( 154261 ) on Sunday August 19, 2012 @05:40PM (#41049037) Homepage Journal

Considering that the number of systems hit by malware making use of HOSTS file modifications is far larger than the number of systems using it to block access to sites, the balance of evidence is in favor of what Microsoft is doing. I know some people who have extensive files, but that group is very small. LordLimecat was right: it's a feature from a bygone era that is used more often for harm than for good. Even adding a switch to the functionality (which might well be there in the form of a registry entry) doesn't help, because that switch will get flipped by malware.

                  Sometimes features once useful outlive that usefulness.

                  • by Anonymous Coward on Sunday August 19, 2012 @05:48PM (#41049075)

                    This is silly reasoning. "Since I don't have a good reason to use it, nobody else should either."

                    I use it to test services that are replacing old services with the same name. It works well as a temporary/quick way of testing. Yes, I could do it in DNS but it would take much longer to vet the change to our DNS servers than my local hosts file. Thankfully, I don't have to worry about this since I don't use Windows.
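That kind of test override is a single hosts entry pointing the existing name at the replacement box; the name and address here are hypothetical:

    # temporarily resolve the old service name to the replacement server
    192.0.2.50    legacy-service.example.com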

                    • Re: (Score:3, Interesting)

                      by Anonymous Coward

                      Yup, that's what I use it for too. Changing DNS changes it for everybody, which is what I don't want.

                    • by Anonymous Coward on Sunday August 19, 2012 @10:32PM (#41050683)

                      I use it to stop Mom from reading my blog.

As far as she is aware, my "awful site" has been offline since May.

                    • by TheRealGrogan ( 1660825 ) on Sunday August 19, 2012 @10:45PM (#41050749)

These people defending Microsoft's behaviour are just tools... I wouldn't pay much attention to them. Microsoft can't "kill the hosts file off" because the behaviour is part of the IP specification (defined in the RFCs).

                      We expect implementations of the TCP/IP protocol in clients to behave in established ways and Microsoft has no right to change that.

I make use of the hosts file for various purposes, including getting my forum users set up with hosts file entries pointing to the new server beforehand, whenever our DNS entries are changing, so they can still reach the forum while the changes are propagating. THIS is a prime example of why the hosts file still exists and why the behaviour should not be fucked with by those assclowns at Microsoft.

                      Hosts was never meant to be used for blocking sites, but it works well enough as a consequence and the behaviour should be left alone. Whatever the user puts in there, should work as intended. I don't fucking CARE that it's used for malware. Fight malware in other ways.

                    • Re:Another reason... (Score:5, Informative)

                      by TCM ( 130219 ) on Monday August 20, 2012 @04:46AM (#41052379)

I make use of the hosts file for various purposes, including getting my forum users set up with hosts file entries pointing to the new server beforehand, whenever our DNS entries are changing, so they can still reach the forum while the changes are propagating. THIS is a prime example of why the hosts file still exists and why the behaviour should not be fucked with by those assclowns at Microsoft.

                      No, it's a prime example of a bad IT person. If you had any clue about what you're doing, you'd lower the TTL prior to making the change, then make the change, then change the TTL back to normal.

                      Expecting random clients to modify their config to compensate for your incompetence is just dumb.
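For reference, the TTL dance described above looks roughly like this in a BIND-style zone file (names, addresses and timings are illustrative):

    ; days before the move: shorten the TTL so cached answers expire quickly
    forum.example.com.    300      IN  A   198.51.100.7
    ; after repointing the record at the new server, restore a normal TTL
    forum.example.com.    86400    IN  A   203.0.113.21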

                  • Re: (Score:3, Interesting)

                    by Boaz17 ( 1318183 )

                    Crap!

The hole to plug (17 years overdue) is the fact that malware is able to modify the hosts file or flip a registry switch at all, not some convoluted M$ notion of spaghetti security. I bet that by itself has holes in it.

Guys, be careful of an M$ troll making a day's pay ...

                    Free Life
                    Heart

                  • by Anonymous Coward on Sunday August 19, 2012 @06:50PM (#41049463)

                    If that was the legitimate reason, then the proper course of action would have been to remove the hosts file feature totally (not this half-assed bullshit).

                    • by asdf7890 ( 1518587 ) on Monday August 20, 2012 @02:51AM (#41051883)

                      then the proper course of action would have been to remove the hosts file feature totally

IIRC you still need POSIX compliance (or the ability to claim it such that your claims cannot be rubbished too easily) for your OS to be used in many US agencies, and the hosts file is one of the many minor points mentioned in that specification. Presumably the spec says something about having the feature, but nothing about effectively disabling it in this way.

                • Re:Another reason... (Score:5, Interesting)

                  by ceoyoyo ( 59147 ) on Sunday August 19, 2012 @05:54PM (#41049109)

                  MS sells ads. The biggest use of the HOSTS file is blocking ads. Google wishes they could do this.

                • Malware. (Score:5, Insightful)

                  by Deathlizard ( 115856 ) on Sunday August 19, 2012 @08:19PM (#41049943) Homepage Journal

The hosts file is targeted by malware to redirect to malicious sites and to stay under the radar and reinfect systems after they have been cleaned (or even to redirect through a locally hosted proxy to infect sites like Facebook). Personally, I've seen Facebook and MySpace targeted in it. I've never seen doubleclick, but my guess is that doubleclick is a target so that they can redirect to their own profit-generating ads, or to more malware to attempt to extort money out of people.

My guess is that the sites Defender removes from hosts are sites that have been targeted by malware in the past. Frankly, I'd like to see the list of domains it looks for, but I'm sure I wouldn't want any of them redirected to some scumware site trying to pawn off fake antivirus.

                • Re:Another reason... (Score:5, Interesting)

                  by Joe U ( 443617 ) on Sunday August 19, 2012 @09:47PM (#41050433) Homepage Journal

                  Why 'fix' something that isn't broken?

                  Because it is broken.

                  Malware can easily change the hosts file and screw you up, it's really a hole in name resolution security.

              • Re:Another reason... (Score:5, Informative)

                by garett_spencley ( 193892 ) on Sunday August 19, 2012 @06:10PM (#41049211) Journal

                I agree that for blocking or for network-wide control using HOSTS is a horrible idea.

                I also realize that the issue apparently here is blocking only.

But with that said, what about independent developers running their own web application on their machine? If you're a web developer and you do your coding locally, it makes sense to use your hosts file to send a domain like dev.example.com to 127.0.0.1.

Again, I know it looks like Windows 8 won't interfere with that. But it's still an example of a legitimate reason someone might rely on the hosts file, and why it could be a major PITA to have it messed with by the OS. Or is there a better way that I'm missing? (Running your own DNS server, even locally, and especially on a Windows machine, seems way overkill and nowhere near "better" IMO.)

The problem with HOSTS files was that they needed to be synchronized, distributed, and maintained. Yes, it's a holdover from pre-DNS days. But for a single machine that needs certain private domains set up locally, it seems the best option.
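For what it's worth, that local-development setup is a single entry (the domain is hypothetical), usually paired with a matching virtual host in the local web server:

    127.0.0.1    dev.example.com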

              • Re:Another reason... (Score:4, Informative)

                by rrohbeck ( 944847 ) on Sunday August 19, 2012 @06:35PM (#41049387)

It's basically a kludge from bygone days before DNS, and for 99% of use cases where you might think "I can use a HOSTS file for that", there are far better methods -- or else the thing you are trying to do is retarded.

Ah, so I should set up a DNS server for my 5 machines rather than have one hosts file that never changes and that I append to once after installation?

                • by AK Marc ( 707885 ) on Sunday August 19, 2012 @06:48PM (#41049451)
                  Is "laptop3.fakedomain.local" reachable from a root DNS server? No? Then this won't affect you. But if you block adsense.com or whatever, even on a hosts file, you will be affected. The best fix is for someone to start up an ad-blocking DNS server that will block the ones people want blocked, and if you want to use it, you point your computers to it. The problem is, it'll be ad supported from the DNS errors, causing the heads of all the users to explode.
              • Re:Another reason... (Score:5, Interesting)

                by AK Marc ( 707885 ) on Sunday August 19, 2012 @06:43PM (#41049427)
                I've seen it done by managing the hosts file with a login script. The issue was that two companies merged with separate intranets that had intranet names that overlapped public names. The DNS merge was months away, so hosts allowed employees in both companies to get to both intranets until DNS was set up appropriately. I can't argue it was best. I can only argue that because of business reasons, it was just about the only possible solution (natting could have worked, but it was uglier).
              • So, how is one supposed to test moving a host around without fucking about with the DNS server now, too?

                Used to be I could just stick overrides in HOSTS for the reported nameservers or whatnot and browse/use the host normally, to confirm it works before throwing the switch at the registrar.

                What, are we supposed to ask IT to temporarily modify zones that aren't even in their zone of authority now? Or are we just supposed to throw the switch and see what happens?

              • by Lime Green Bowler ( 937876 ) on Sunday August 19, 2012 @09:20PM (#41050279)
We use hosts files with shop floor manufacturing software that requires it. It does not function without hosts entries. You are not the judge of how a hosts file is to be used, and any mindset like yours should not be in IT. You are short-sighted and inexperienced in the real world, it seems. And any ass who threatens to "replace" somebody for using a feature that is far from outmoded, or thinks someone's methods are "retarded" without the benefit of understanding or even offering an alternative, is a STFU-and-leave opportunity.
            • I'm not sure how smaller companies do it, but I don't know of any decent sized enterprises that rely on a hosts file to restrict access to certain sites.

That said, this is some really stupid shit from the MS gene pool. Hosts should always take priority, and simply visiting a site should never modify the hosts file as a result.

              That said, I wonder if the old trick of setting 'System' to read only works?

            • by Dunbal ( 464142 ) * on Sunday August 19, 2012 @07:36PM (#41049727)
The smart IT manager realizes that even if employees spend 20 minutes or so a day browsing, they are far more productive than the ones who are fully restricted, locked down, and persecuted. Studies have been done. Smart managers read them. Bad managers crack the whip according to arbitrary "productivity" goals that really mean nothing. Then they wonder why employees are always leaving the company and positions are so hard to fill.
          • Re:Another reason... (Score:5, Informative)

            by MicroSlut ( 2478760 ) on Sunday August 19, 2012 @03:51PM (#41048301)
            What Enterprise IT Manager is using the Hosts file to block web sites? Enterprises use firewalls. I've been blocking doubleclick at the firewall/proxy level for as long as I can remember.
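Blocking at the proxy can be as small as a Squid ACL pair, placed ahead of the existing allow rules; the domain is just an example:

    # squid.conf
    acl ad_hosts dstdomain .doubleclick.net
    http_access deny ad_hosts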
          • by Nerdfest ( 867930 ) on Sunday August 19, 2012 @04:12PM (#41048449)

If they're interested in 'enterprise' (I really hate that word these days), they may want to have a look at what's been happening. Good or bad security-wise, people have been pushing to use their own devices, devices they *like* to use. I think the only thing really stopping it from taking off for tablets and phones is the failure of RIM, Apple, etc., to open their protocols so a business does not need to pick a single type of device. If they ever figure that out, Microsoft is hosed.

          • Hamhandedness. (Score:5, Insightful)

            by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday August 19, 2012 @04:18PM (#41048487)

            If you are an enterprise IT manager this is your dream come true.

            Hardly. At the enterprise level there are multiple different ways of handling situations such as this. Which one(s) you choose depends upon how you've organized Active Directory and your network.

            But a different point is that this is an OLD way of phishing. The phisher is publishing the IP addresses that need to be blocked. So, again, at the enterprise level this kind of phishing would not be an issue.

If a phisher really needed to redirect traffic like that, he'd have an easier time just getting the information into the local machine's DNS cache. That way it would never show up in the hosts file, which means it would be that much harder to spot. Then just keep updating the DNS cache.

            So this is the wrong solution to the wrong problem and it is implemented in the wrong way. And it will probably cause more issues in the future as 3rd party developers have to work around not having the hosts file as a reliable option any more.

            Nice way to remove a useful tool that's been around for decades.

          • If you are an enterprise IT manager this is your dream come true.

Dream? No, nightmare. A machine that can't be configured as desired and rewrites itself at will has no place in any corporate shop. You don't want the user rewriting the hosts file? That's not unreasonable, and you can implement that right now via policy so it's uniformly enforced. A client unavoidably rewriting itself against management's wishes, with behavior that can't be changed? Completely unacceptable. With this "feature", Windows

        • by Dunbal ( 464142 ) *
          Windows 8 = Windows Hate.
    • Re:Another reason... (Score:4, Interesting)

      by burne ( 686114 ) on Sunday August 19, 2012 @04:26PM (#41048563)

Could you be so kind as to post the other reasons?

      I have been using UNIX/linux/BSD and odd stuff like BeOS, System 7/8/9, OS X, Solaris/CDE, IRIX etc for 15 years.

      Never found a solid reason to use windows, and now you tell me there's more than one reason _not_ to run windows?

      That is one alternative reality I must grab..

  • So... (Score:5, Insightful)

    by Anonymous Coward on Sunday August 19, 2012 @03:24PM (#41048089)

Just add the hosts file to Defender's whitelist. If you know how to edit the hosts file, you should know how to add it to the whitelist.

Otherwise, who says the edits to that file were not malicious?
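On Windows builds that ship the Defender PowerShell module, the exclusion can be added from an elevated prompt; otherwise the same thing can be done through the Windows Defender settings UI described in the article:

    Add-MpPreference -ExclusionPath "C:\Windows\System32\drivers\etc\hosts"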

  • by metrix007 ( 200091 ) on Sunday August 19, 2012 @03:24PM (#41048097)

APK's sole existence seems to rely on advocating the hosts file as a means of host filtering, despite the existence of more modern, flexible, convenient, and powerful alternatives.

    How will APK stay relevant with the demise of the hosts file in Windows 8? Stay tuned....

  • Calm down (Score:5, Informative)

    by Anonymous Coward on Sunday August 19, 2012 @03:26PM (#41048107)

Before everyone gets all excited... the article has already been updated with the fact that this is a feature of Windows Defender (and IMO a reasonable one) and can be disabled.

The hosts file is popular for blocking sites, but it's also popular for redirecting to phishing sites. This seems like a very ineffective way of solving that problem, but at least it doesn't look like there is some evil malicious intent.

In other news, running certain anti-virus products will prevent you from writing to the boot sector while they are running.

    • Re:Calm down (Score:5, Insightful)

      by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday August 19, 2012 @03:52PM (#41048309)

This seems like a very ineffective way of solving that problem, but at least it doesn't look like there is some evil malicious intent.

Considering that one of the sites they are unblocking is ad.doubleclick.net (which is often blocked because the user wants it blocked), Microsoft is taking away an option from the user.

      What will be interesting will be when someone compiles a list of the sites that will be unblocked ... and finds how many BANKS will still be subject to phishing like this ... but ad.doubleclick.net will be protected.

      This is a stupid move by Microsoft done in a stupid fashion.

    • Re:Calm down (Score:5, Insightful)

      by mrnobo1024 ( 464702 ) on Sunday August 19, 2012 @03:54PM (#41048319)

The hosts file can only be modified by administrators. Any additional protection is useless, because if malware has gotten itself running as administrator, it can just kill or modify Windows Defender anyway.

    • by frovingslosh ( 582462 ) on Sunday August 19, 2012 @04:38PM (#41048641)

From the article: "Two of the sites that you can't block using the hosts file are facebook.com and ad.doubleclick.net."

      I started using the hosts file over a decade ago, when I traced crashes that I was having to doubleclick.net. Ad supported software that I was using was receiving files from them, but it was doing a lot more than just displaying the ads (which I would not have objected to). Many users were experiencing this, but the author would not fix it so I and others started blocking the site (which resolved the problem, although the author lost some small amount of revenue).

More recently I have also started blocking Facebook. I never use it and have no account there, but I've noticed an awful lot of network traffic between my machine and facebook.com. Since I'm not even a member, I don't feel the need for them to track most of the sites that I visit. The hosts file has so far worked very well for this.

Any argument that this feature is in any way for the benefit of the clueless user is bogus. The common way to block a site via the hosts file is to point it at the IP address 127.0.0.1, which is the local machine. If Microsoft were doing this for the benefit of their users, then they would simply look at the hosts file and, if they found redirects for sites that they were concerned about that were not pointed at the local machine, they might well conclude that it was potentially an attempt to hijack a domain name and then, after warning the user (and even asking him), correct the problem. This would even show the user that Microsoft was doing something good for the user for a change.

But if the address is redirected to the local machine, the only reasonable conclusion I can see is that the user wanted it that way (as it provides no attack vector). It took me about 30 seconds to realize that changing 127.0.0.1 redirects was user-unfriendly and could easily be avoided if Microsoft were really concerned about the users who paid for the software: just look at the IP address in the hosts file, and if it is 127.0.0.1, allow the entry to stay. Clearly Microsoft realized this too. The only reasonable conclusion is that they are doing this because they have a motive that runs against their customers' interests.

Assuming that all redirects to localhost are user-specified is all well and good, until you figure out that some malware makes the hosts file look like this:

    127.0.0.1    update.symantec.com
    127.0.0.1    update.trendmicro.com
    127.0.0.1    update.mcafee.com
    127.0.0.1    update.microsoft.com

        Not that I consider this a good move by Microsoft, by any means, but implying that the situation is as simple as you're making it out to be is dangerous.

  • by Anonymous Coward on Sunday August 19, 2012 @03:28PM (#41048117)

As comments in the article point out, this behavior can be turned off by going to the Windows Defender settings... But by and large this makes sense for 95% of Windows users, as they will have NO clue about the hosts file, and even less of a clue if it has been modified for a phishing attack. Nice to see Microsoft take another step forward in protecting the blindingly ignorant and inept.

    • by lowlymarine ( 1172723 ) on Sunday August 19, 2012 @03:32PM (#41048155)
Exactly, this is a perfectly reasonable anti-phishing measure that can be easily disabled, as is clearly explained in the linked article. But hey, we can't have any such pesky facts sneak into a /. summary; it might stymie some good old-fashioned MS bashing.
As comments in the article point out, this behavior can be turned off by going to the Windows Defender settings... Nice to see Microsoft take another step forward in protecting the blindingly ignorant and inept.

      No, a step forward would be requiring administrator rights to write to the file, and then ensuring admin access is granted only when actually needed. Please, understand this: If you've got software modifying your hosts file, then Windows Defender hasn't done its job and you've got much bigger problems already.

On what planet does it make sense to change entries in a file on the system and not even warn the user that you are doing so? And since they are reportedly making the changes selectively, if there really were an attacker, his attack could have made other changes; the user was never warned that the hosts file had anything "suspicious" in it and so would not know to even look at it and see if there was anything that the Great and Powerful Microsoft had missed. This isn't for the user, it is purely
  • by Anonymous Coward on Sunday August 19, 2012 @03:33PM (#41048161)

    Prepare them for the shitstorm.

  • by bobbutts ( 927504 ) <bobbutts@gmail.com> on Sunday August 19, 2012 @03:33PM (#41048169)
This seems like one of those situations where someone didn't think of the potential side effects. The goal was to fix some attack on specific sites, but the solution failed to consider that the mere presence of entries like Facebook's is not enough to determine whether the entry is in fact malicious and/or unintended. Security and expected behavior are compromised in too many situations to use this software, IMO.
  • by Blue Stone ( 582566 ) on Sunday August 19, 2012 @03:39PM (#41048219) Homepage Journal

    Yeah, this is basically a cack-handed way of fixing malicious hosts redirects.

It'll prevent malicious programmes from sending you to a fake Facebook, but at the expense of entirely overriding any preferences YOU as the computer owner might wish to make via the hosts file.

It's a staggering level of incompetence that this is their solution. It needs to be changed; they need to either find another way of solving it or allow some form of granularity and user input.

    • by firewrought ( 36952 ) on Sunday August 19, 2012 @07:44PM (#41049757)

      Yeah, this is basically a cack-handed way of fixing malicious hosts redirects.

      Every OS does this: starts out with a simple (possibly easy-to-understand) model and evolves to something with more and more layers of cruft. It's called technical debt, and the long-term consequences are that these systems become harder to learn and understand.

      Linux is better than Windows in this regard, but open source is by no means immune to crud formation. The maintenance tools for Debian packaging and the GNU Build System [wikipedia.org] come to mind.

Which brings me to my rant: in order to remain viable as a hobbyist OS, Linux should strive to simplify and remove "stupid complexity" that needlessly hinders technical understanding of its internals. I'm not speaking of user-friendliness per se (because that's a term we use in reference to end users); I'm talking about removing complexity that isn't inherently necessary for the purpose of the system.

  • by nurb432 ( 527695 ) on Sunday August 19, 2012 @03:39PM (#41048223) Homepage Journal

Hope you enjoy your new 'media consumption appliance'. It's becoming less and less of a 'general purpose computer' every day.

  • by kimvette ( 919543 ) on Sunday August 19, 2012 @03:56PM (#41048345) Homepage Journal

    This is another good reason to stick with Windows 7, giving Windows 8 a miss.

One common use of the hosts file is to test staging servers, particularly web servers, before pushing them live, without the complexity and time it takes to set up an additional DNS server.

  • by LocalH ( 28506 ) on Sunday August 19, 2012 @04:31PM (#41048597) Homepage

    The option on one end is to allow the user to have full, unfettered access to everything on their system, from the highest levels down to the lowest. This was done back in the DOS and Win9x days, and although it does have a few benefits in certain niches, it's also very bad for security.

    The option on the other end is to disallow access to modifying the underlying system and related settings, and only allow such actions from full administrator accounts, and maybe not even then (depending on the mindset of the development team). This pisses off a lot of the hardcore techies who like to modify everything they can, but to be fair it does help protect the average user.

    Now, I'm not defending Microsoft on how they've implemented this silently and without notification to the user, but on the face of it I think it's a good idea for the average user, at least with regards to the Facebook part of it (not so much on the Doubleclick part). Think about it - the average non-techie person wants Facebook to work. They will want to get their notifications on the Start screen (and elsewhere).

    I agree with other posters - they should have openly done this and notified the user before "fixing" it - something like "Your hosts file has been modified to prevent access to <site on this list>. Is this desirable to you?" with three options - "Yes", "No", "More information". That way, the techies can click "Yes" and go about their business, average users can click "More information" and maybe actually learn a little bit in the process, then come back and click "Yes" or "No" as per their wishes.

As with many things, the idea is sound, but the implementation is not. To those saying "well, malware wouldn't redirect to localhost, it'd redirect to a false Facebook": there's nothing stopping a piece of malware similar to the existing rogue security software from also using hosts to block access to various social media sites, in an attempt to give the uneducated user further reason to believe they're infected as badly as the rogue software tells them they are, and as a weak attempt to prevent the user from going online and telling people about it even after the rogue software has been removed. They'll do anything to get a few more successful purchases of their crap software. I'm quite surprised they haven't really done this already, to be honest.

  • by __aaqvdr516 ( 975138 ) on Sunday August 19, 2012 @05:12PM (#41048871)

    The answer is simple enough:
If you're already smart enough to edit the hosts file, you should be smart enough to add hosts to the Windows Defender exclusion list.

    Is this a change from the way that things were done in the past? Of course it is. This is how systems become more secure for the average user. Average Joe isn't messing with hosts.

    Chicken Little, the sky is not falling.

  • by aepervius ( 535155 ) on Sunday August 19, 2012 @09:15PM (#41050251)
If one redirects a site like the aforementioned doubleclick to 127.0.0.1, the chance that it is malware is nil. Before removing an entry, Windows Defender should check the IP and leave entries for those sites that point at 127.0.0.1. OTOH, if it is an antivirus site, it should remove the entry when it points at 127.0.0.1. If they went the extra mile to check for specific web sites, then they should have gone the extra mile and checked the IP as well. Or popped up a window warning about the behavior and how to stop it.
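A minimal sketch of that check in Python, assuming a fixed watch list and standard hosts syntax (the watch list, path, and domains are hypothetical):

    # Flag hosts-file entries for "watched" domains only when they point somewhere
    # other than the local machine; localhost entries are treated as user blocks.
    WATCHED = {"facebook.com", "www.facebook.com", "update.microsoft.com"}
    LOCAL = {"127.0.0.1", "0.0.0.0", "::1"}

    def suspicious_entries(hosts_path="/etc/hosts"):
        flagged = []
        with open(hosts_path) as f:
            for raw in f:
                line = raw.split("#", 1)[0].strip()   # drop comments and blank lines
                if not line:
                    continue
                fields = line.split()
                ip, names = fields[0], fields[1:]
                for name in names:
                    if name.lower() in WATCHED and ip not in LOCAL:
                        flagged.append((name, ip))
        return flagged

    if __name__ == "__main__":
        for name, ip in suspicious_entries():
            print("warn: %s redirected to %s" % (name, ip))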
