Latest Netcraft survey shows Apache increase

The latest Netcraft Survey is out. Apache enjoyed an over 1 percent increase, with Microsoft and Netscape showing some decreases. According to the survey, Apache has a 54.81 percent "market share." Also reported is the fact that Webjump actually uses a hybrid setup, with NT serving static content and a Solaris/Apache/Perl system serving the dynamic content. Tucked away in the report is a small factoid that PHP is on over 1.1 million domains.
This discussion has been archived. No new comments can be posted.

  • I think that Webjump's choice to use a hybrid system had absolutely nothing to do with their 'company politics'. Webjump analyzed their options and decided that the hybrid solution is their best answer.

    Now, does this mean Apache is not the optimal choice for everyone who uses it? Of course not, but it does show that Apache's serving of static pages needs to be improved.

    Just finish doing that and it's world domination time.
  • Percentages are good, but what I liked was taking the change in the number of hosts and looking at the sheer numbers.

    Do the math. Approximately 4 Apache hosts went up for each IIS/PWS host that went up in the past month.

    Yes, lots of those hosts are virtual hosts on the same machines, but even that says something about Apache's penetration at web hosting providers.
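The "do the math" claim checks out against the figures quoted elsewhere in the thread (roughly 490,000 new Apache hosts in November versus about 125,000 new IIS/PWS hosts):

```python
# Growth ratio from the numbers quoted in this thread: ~490,000 new Apache
# hosts versus ~125,000 new IIS/PWS hosts in the same month.
apache_growth, iis_growth = 490_000, 125_000
ratio = apache_growth / iis_growth
print(round(ratio, 1))   # ~3.9, i.e. roughly 4 Apache hosts per IIS/PWS host
```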
  • by jilles ( 20976 ) on Wednesday December 01, 1999 @06:22AM (#1490261) Homepage
    From the netcraft survey

    The top 10 .com domains with the most distinct certificates found by the September SSL survey, run 213 Netscape servers [out of a total of 341 sites], against 42 Microsoft [of which 28 are in the microsoft.com domain], and 24 Apache. Equally notable is their choice of operating systems, where both NT and Linux, strongly represented in the SSL Survey as a whole have a relatively small share. Just 57 sites run NT, with the most common Unix systems according to signatures detected in the tcp/ip characteristics, being Solaris and AIX. Only one site was detected as running Linux.

    This shows that on the really big servers Netscape servers still rule. When you scrap the MS domains from the survey, Apache has a larger market share (scrapping MS is a good idea since price/performance/support probably did not play a role in choosing a web server there).

    The survey also tells us that both IIS and Apache saw a rise in marketshare for smaller SSL based stuff.

    The last line is sort of interesting too since it shows that linux does not play a big role as a webserver platform for very large sites.
  • I didn't see any stats on some kind of caching web proxy, but I'm interested in setting up some sort of caching proxy frontend to several diverse web servers. For the curious, here's what I want to set up.
    • The ability to map files from other servers into its url space.
    • I want to be able to control caching directives passed to the client. So, for example, I could have files within /forum/ passed to the client with no-cache.
    • I would like the proxy to have basic methods of determining if a fresh copy of something needs to be fetched. So "If /forum/ not-cached by no-account in last 5 minutes, get new /forum/; But if client-is-user and not sent in last 3 minutes get a new forum for user"
    • I want to be able to manipulate cookies so that some files and cgis mapped within the server space can't get ones not intended for them (user server).
    • I want to be able to protect mapped URLs by password (at the load-balancer) and by ip.
    • How about running a https server and mapping the requests to unencrypted http servers
    • Here's what I want to use it to handle
      1. A user box with some SUID user cgis and mostly static html (apache)
      2. A Slashdot-like forum (apache w/ php/mysql)
      3. A file archive server (thttpd)
      4. A group of small webservers for displaying stats on from developer machines (thttpd)
      5. A discussion forum on a mod_perl
      6. A web-based email server (custom running on OpenBSD)
    Anyone have a solution for this?
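I don't know of one box that does all of that, but the routing and cache-control pieces of such a frontend are easy to sketch. Everything below (hostnames, prefixes, policies) is made up for illustration, not a real deployment:

```python
# Sketch of the routing core of a caching reverse-proxy frontend: map URL
# prefixes to backend servers and decide what Cache-Control header to send
# the client per prefix. All names here are hypothetical.

# prefix -> (backend host, Cache-Control sent to the client)
ROUTES = {
    "/forum/":   ("forum.internal:80",   "no-cache"),       # always revalidate
    "/archive/": ("archive.internal:80", "max-age=3600"),   # static file dump
    "/":         ("www.internal:80",     "max-age=300"),    # default
}

def route(path: str) -> tuple[str, str]:
    """Pick the longest matching prefix; '/' always matches as a fallback."""
    best = max((p for p in ROUTES if path.startswith(p)), key=len)
    return ROUTES[best]

backend, policy = route("/forum/topic42")
print(backend, policy)   # forum.internal:80 no-cache
```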
  • by maskatron ( 7560 ) on Wednesday December 01, 1999 @06:27AM (#1490263) Homepage
    maybe the government should break up apache into....err nevermind.
    "More sites on the Internet use IIS and Windows NT than any other OS/Server combination, even Solaris*".

    Well, definitely misleading, probably deliberately, and therefore a lie, but not necessarily literally false. What Apache/OS combo scores higher? Apache/Linux? Apache/Solaris? Apache/AIX? Apache/HP-UX? Apache/FreeBSD? Apache/OpenBSD? ... and so on and so on. Like the Mindcraft tests, M$ has carefully chosen a special case where they look better and ignored the general case where they look like *sh!t*. This seems to be the M$ standard way of dealing with OSS; the community needs to learn to expose the misdirection rather than just claim falsehood (which is frequently absent).
  • On October 19th, I put into production a slightly hacked version of Apache 1.3.6. Previously those 50,000 domains were hosted on Netscape-Commerce. Perhaps this upgrade has something to do with the 1% jump in usage. FYI, I work for the #4 web host.
  • It does appear that it will add sites via that link. But I think it's unlikely that they have had people submit requests for more than 8 million different sites. Look at the changes this last month. All the top servers have had a numerical growth. Apache had approximately 490,000 more servers in November than in October. This equated to slightly more than 1 percent. It would take some real dedication to actively skew the results.
  • > does this represent a real increase?

    Others have commented on the margin of error part. As to whether it's a real increase, I'd say what matters is the trends over time. For instance, notice

    Apache had its best month since the spring, gaining just over one percentage point of survey share. Microsoft, after its large gain last month, and Netscape dropped back by a roughly equivalent measure.

    One of the resident astroturfers in comp.os.linux.advocacy bragged last month about how Microsoft was killing Apache with such a large gain, while Apache showed a small loss. It was nonsense, of course, but it would be equally nonsensical to say that Apache is now on a roll due to this month's report. Clearly, Apache is "winning", but that observation is based more on its current share and the absence of any moving trends against it in the chart.

    --
    It's October 6th. Where's W2K? Over the horizon again, eh?
  • Note that the "Internet OS Counter" at Leb.net [leb.net] lists Linux as running on 31.3% of surveyed servers. NT and Win9x had 24.3%.


    Since the vast majority of Linux servers ran Apache at the time, you're talking about something like 28% Apache + Linux. The majority of Netscape Servers were actually deployed on NT, so NT + IIS probably amounted to 15%.


    It's only not a lie if they count different kernel, Distribution and Apache versions separately. What about NT Service packs ?


    They only count around 1/4 the net which would give them less than 1% margin of error.

  • You aren't looking at the whole picture. Go to the netcraft site and look at their data and see how all the numbers relate for the month of November.
  • It's an old version, so what's the point? Do you want a code fork, where any improvements are obviated or made incompatible with the current functioning version? At what point do the /. developers decide 'enough is enough' and let people at the code? Is there a foreseeable actual date for this?

    I think, given /.'s nature as an advocate of OSS, that it be held to the same standards that it (and its 'community') holds to other projects/corps/developers.. Glass house, stones, etc..

    And if no one shows interest in this, it'll never be seen... Hell, look at the interest that has been shown and the results so far.. Will the recent source (or a public CVS server) just magically appear?
    Your Working Boy,
  • Whenever there is no competition, there is no reason not to sit back and relax. And that's when quality starts to drop.

    I don't think this is necessarily true, especially with open source products. While I can't speak for the Apache group, as an example, in the KDE slashdot interview [slashdot.org], one of the developers said that the competition between GNOME and KDE had little if any effect on them, saying [in question #7]:

    I think the whole "competition makes for better products" thing is bunk. KDE developers work to make KDE the best that they can -- and they would be doing so even if GNOME didn't exist!

    That said, I would think that the Apache group probably feels the same way. Why would they care about competition? They aren't really making money from developing it, so chances are they care about making Apache the best possible product they can, and if they gobble up 100% of market share along the way, well that's even better.

  • i heard they are also using 'boa' to serve images.

    sigh...

    its nice to see microsoft embracing open source software like that.
  • I'd like to point out WHY Apache is benchmarked slower than IIS for static pages. Microsoft only has to worry about IIS working on one platform, so they have not only tuned their web server for NT, they have actually put hooks in the NT kernel to make IIS run faster. One of the Apache Group's primary goals is to provide support for as many platforms as is reasonable. Performance is secondary to this goal. There is still room for improvement in Apache for static content, though. The Apache tuning docs specify a number of compile time configurations that can be made to make Apache much faster for static pages. Of course, if you want truly amazing static web site performance, load mod_static_mmap into Apache. You'll see it serve static pages at 10 - 20 times faster than NT/IIS. It only works on POSIX systems supporting mmap, and it eats RAM for very large sites, but if you are looking for performance and you have a dedicated static server, this is definitely the way to go.
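I can't verify a module by that name, but the mmap technique itself is easy to illustrate. A toy sketch of the idea (not Apache code; the 10-20x speedup claim is the parent's, not mine):

```python
# Why mmap helps for static content, in miniature: the file's pages are mapped
# into the process address space, so serving the file copies straight from the
# OS page cache instead of going through read() into a userland buffer.
# This is only a sketch of the concept, not an Apache module.
import mmap
import os
import tempfile

def serve_static(path: str) -> bytes:
    """Return a file's contents via a read-only memory mapping."""
    with open(path, "rb") as f:
        size = os.fstat(f.fileno()).st_size
        with mmap.mmap(f.fileno(), size, access=mmap.ACCESS_READ) as m:
            return m[:]   # a real server would write m directly to the socket

if __name__ == "__main__":
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(b"<html>hello</html>")
    print(serve_static(tmp.name))
```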
  • I'd have to agree with the original guy here. In some fields where precedent is being set, like Perl and the X Window System, open source shows great innovation. But in areas where it's not, features seem to be looked at and added only when there's a glaring weakness. Like Linux and SMP, etc...

    I don't think that any project will prosper without a form of competition. It doesn't have to be financial, but just another group of people doing something that produces the same result but by a different means. That way each group could look at each others work and pick out the best.

    Maybe someone or ones should begin a new HTTP server project with a completely new source tree. Take nothing from Apache, but just build the "best" server they can.
  • Wrong. Go here [netcraft.com]. See the text near the bottom. It says "The host you examine will be included in future surveys".

    If you click the "add your site" link, it just brings you to the generic query page, which seems to me to mean that the way a site gets checked or scheduled to be checked is by actually querying it. That, to me, seems like it'd be incredibly easy to tilt the results one way or the other.

    Not that I'm trying to defend anyone. But to say NetCraft is unbiased, to me, seems false. The sites that get queried are the ones that users ask to have queried.
  • If I remember correctly, sites that they haven't already spidered out are entered by using their "What's that site running" interface. So the user has to actually enter their site to have it added to the query engine. My guess is that it's pretty low. The sites that are currently listed are on there because they have actual content and are linked from other servers.
  • I'll email you with... oh. Never mind.
    --
  • I notice that the Microsoft entry includes Personal Web Server as well as IIS. Now, all Microsoft has to do is configure Windows and NT Workstation so that PWS is on by default, just like Apple does with Mac web sharing, and every Windows user with a cable modem or DSL line will count as a Microsoft server. Then watch them brag.

    Sure, it would slow things down and open up some security holes, but that would be a small price to pay for the greater glory of Redmond.

    Note for the humor impaired: just kidding, folks.
  • Take a look at queso and the online OS database. Unless you actually monitor your firewall fairly closely you'll never know that you've been probed, either.

    http://www.apostols.org/projectz/queso/ [apostols.org]

    -Peter
  • go ahead and mod me down for off topic, but what's the point of the apache section [slashdot.org] if apache-relevant stories aren't going to be posted there?

  • What most people don't take into account in counting Windows NT servers versus unix servers, especially on the web, is that it only takes one unix machine, from Sun/Solaris to PC/Linux, to do the task that would require 2 or 3 NT machines.

    So when you're counting these machines remember that while Linux hauls workloads like a train, and only one such engine is needed for the task, using Windows NT/IIS for the same task is like taking the same load and putting it on an entire fleet of Volkswagen Beetles.

    So naturally the Beetles are going to outnumber the locomotives!
  • by King Babar ( 19862 ) on Wednesday December 01, 1999 @10:54AM (#1490293) Homepage
    I just have to wonder, with an "increase of 1%" what the margin of error is - does this represent a real increase?

    Others have mentioned possible problems in interpreting such data which include (but are not limited to) the following:

    1. This is a population, not a sample
    2. Even if it were a sample, it wouldn't be a randomly selected sample.
    3. Even if it were a randomly selected sample, it would be a random sample of domain names, not host machines.
    4. And many, many, more. :-)

    OK, having said that, it might be useful to pretend that none of these were concerns, and that we really did want to know whether a 1% increase in the number of domains served by Apache meant anything. Here's the short answer:

    I can't tell you that.

    This is especially true if the domains surveyed in some sense are the population. In that case, whether or not you care that Apache added 500,000 domain names to the population while IIS added 125,000 is basically up to you. There are many explanations for why this could have happened, and not all of them are very interesting. (Again, others have pointed out why.)

    Personally, I would have been more interested in certain kinds of longitudinal breakdowns rather than the overall numbers. Some of those questions would include:

    1. How many sites went from IIS to Apache vs. vice versa?
    2. How many "new to the survey" sites went with one solution or another?
    3. How many "new to the survey" sites are actually new to the web?
    4. How many sites running Apache (or IIS) closed down during the last month?

    Call me a geek, but these are questions I think could be more interesting to ask. And, yes, some of the answers to these are given or hinted at on the netcraft website.

    But there is one more question, which is the one the original poster asked:

    But what if this really were a sampling question; is a 1% difference likely to be reliable?

    If all N of the netcraft domains were independently and randomly sampled from the total population of domains, then a 95% confidence interval for a given market share, M, where M is between 0 and 1 is:

    [M - 1.96*(M*(1-M)/N)^.5, M + 1.96*(M*(1-M)/N)^.5]

    For Apache's market share in November, we would get the interval [.5478, .5484]. For the October share, the interval is something like [.5364, .5371]. Those are pretty tight intervals, but the sample size is over 8 million...

    And this is the real point: when you have random samples this huge, error bars are pretty danged small. So it's too bad these really aren't random samples...
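The interval formula above is easy to evaluate numerically; a minimal sketch, using the November share and the ~8 million domain count quoted in this post:

```python
# 95% confidence interval for a proportion, per the formula above:
# [M - 1.96*sqrt(M*(1-M)/N), M + 1.96*sqrt(M*(1-M)/N)]
import math

def share_ci(M: float, N: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation CI for a market share M from N sampled domains."""
    half = z * math.sqrt(M * (1 - M) / N)
    return (M - half, M + half)

low, high = share_ci(0.5481, 8_000_000)   # November Apache share, ~8M domains
print(f"[{low:.4f}, {high:.4f}]")         # an extremely tight band around .548
```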

  • Combine this survey with the following from www.attrition.org:

    Index Statistics for 99.11:
    (Analysis of /home/web/mirror/attrition/1999-11.html)

    Reported Hacks: 639
    # of AIX : 1
    # of BSDi : 12
    # of IRIX : 10
    # of Linux : 105
    # of FreeBSD : 11
    # of OpenBSD : 1
    # of OSF1 : 1
    # of DigUnix : 3
    # of SCO : 2
    # of Solaris : 56
    # of Win-NT : 425


    What does this tell you?


  • What's up with these reports I've heard of some MS pages running on Apache?

    The force is truly all-powerful. *g*

    -Mikey
  • After all this showed how good a Web server NT is for static pages ;-)
  • It's marvellous to see an Open Source solution win such a clear victory over proprietary rivals, but I hope someday to see Apache start to lose market dominance again, in favour of some of its open source rivals (like Zope). The way Apache does things isn't always the best way to manage Web content provision, and a monoculture of Web servers would certainly be a Bad Thing.
    --
  • There'd be no point to that. If any cracker cared, (s)he could get the same result by telnetting to port 80 and using queso.

    Then again, if they find out you're running OpenBSD, they might just give it up a priori ;-)
  • Is how does the "Other" category break down? After all, it has a bigger share of the server market than Netscape.

    Don't you love the big, Big, BIG gap between Apache and everybody else?

  • by rw2 ( 17419 ) on Wednesday December 01, 1999 @05:39AM (#1490305) Homepage
    An even more interesting page is here [netcraft.com].

    It shows the usage by platform. There are a couple significant Apache derivatives that aren't grouped into the more conservative number that is used for the graph.

    In fact, the bulk of the tailing off shown in the graph for Apache was actually slack picked up by Apache derivatives!

  • Whether it uses IIS or Apache or both, one thing is certain: it's not responding to my kind requests. They should really move to an all-unix setup!!

    --

  • it's no wonder that apache is in a dominant position as a web server: it works very very well. the fact that it's open source is only the half of it. the strengths could go on for days, but include the really sharp modules available (including SSL, PHP, Perl...), the speed it has (though it could be tweaked some more for some sites), and its security.

    one of the things i always wonder about are the security fixes they send out. they note they fix a lot of security holes, and i'm sure they do, but i don't look at code diffs so i don't know where they take place. and i have not seen an Apache exploit in a recent release for a long time. i think that says a lot. IIS, NS, yeah, you see those every now and then (ok, a lot of IIS ones).

    it's good to see a product like Apache really continuing on so many levels the Inet's traditions.

    jose
    are rick rubin and alan cox related? find out at http://biocserver.cwru.edu/~jose/humor/rubin-cox.html [cwru.edu].

  • As I understand it, a site gets indexed by Netcraft when somebody goes to Netcraft to look up a site ... So my question is, who uses Netcraft? If mostly linux users (most of /.?) visit Netcraft to look up a site, they are probably going to check out some of their usual hangouts (slashdot, freshmeat, ), most of which are run on Apache ... Wouldn't this bias the results in favor of Apache? I'm guessing that you could really affect the results for next month if you were to find a large number of -run sites and then started looking them up on Netcraft ...

    Not sure if this makes sense or if I even understand how Netcraft works ... Just curious how reliable these results are ...

    --elint
    "...So if you're cute, or even beautiful, remember: There's more of us ugly motherfuckers than you are." --Frank Zappa.

  • not in place of it. Zope is a sort of middleware.

    --

  • by Booker ( 6173 ) on Wednesday December 01, 1999 @05:54AM (#1490313) Homepage
    I just have to wonder, with an "increase of 1%" what the margin of error is - does this represent a real increase?

    Also, how many of the sites in question are the Apache "Congratulations!" page when Apache is installed and enabled by default on various Linux distributions?

    Not to be a wet blanket - just wondering. :)
    ----
  • putting "something" in /etc/hosts would probably just be a waste of your time. Anyone can find this info from HTTP headers.

    $ telnet www.slashdot.org 80
    Trying 209.207.224.41...
    Connected to slashdot.org.
    Escape character is '^]'.
    HEAD / HTTP/1.0

    HTTP/1.1 200 OK
    Date: Wed, 01 Dec 1999 15:42:46 GMT
    Server: Apache/1.3.6 (Unix) mod_perl/1.21
    Connection: close
    Content-Type: text/html
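The same check can be done programmatically; a small sketch that pulls the Server line out of a raw response like the transcript above:

```python
# Extract the Server: header from a raw HTTP response, like the one returned
# by the HEAD request shown above. SAMPLE is the transcript from this post.
def server_header(raw_response: str) -> str:
    """Return the value of the Server: header, case-insensitively."""
    for line in raw_response.splitlines():
        if line.lower().startswith("server:"):
            return line.split(":", 1)[1].strip()
    return "(no Server header)"

SAMPLE = """HTTP/1.1 200 OK
Date: Wed, 01 Dec 1999 15:42:46 GMT
Server: Apache/1.3.6 (Unix) mod_perl/1.21
Connection: close
Content-Type: text/html"""

print(server_header(SAMPLE))   # Apache/1.3.6 (Unix) mod_perl/1.21
```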
  • The competition makes for better products argument is a result of free enterprise. When a company's goal is to generate money by selling as much of a product as possible, competition with other companies trying to do the same makes all the difference.

    This is probably one of the biggest selling points of capitalism. Greed on the part of the companies benefits the consumer because the companies compete against one another for the customer's dollars, thereby doing what is necessary to get the customer's dollars. note: this is where the bad side of capitalism enters the equation. Companies try to force proprietary standards on consumers to lock in market share.

    With open source projects the motivation is different than the big corporation. While market share may play a role in their motivation, it's hard for me to imagine it playing as big of a role as market share for the big corporation.

    If I were going to start an open source project, my motivation would be:
    1. to create a cool product
    2. to challenge myself
    3. to learn to program better (just have a couple computer courses on C, yay! I can make a linked list!!!!)
    4. hopefully produce something useful and beneficial to the community

  • ...and just like Redhat (i haven't tried others) does by default.

    I don't think that these results mean very much myself, because they're bound to be counting the sites hosted by hosting companies, which of course are going to be using Apache. Not because of any other factor but that it and Linux are free (as in beer :). And a lot of people signing up with them don't know or care to ask what server/OS they're using. Not that it matters. They just want a webpage and a domain name.

    A more meaningful (in my eyes) count would be what companies that manage their own website run (only one site per distinct company - the one that gets the most average hits per day). Those are results I'd like to see.
  • Of course there's a margin of error. They're extrapolating to all web servers on the web, but they can only count the ones they know about. They're bound to miss some.

    Also, remember that they count domain names, not machines. So a single-machine ISP hosting 20 sites gets counted 20 times, while multiple-machine sites like Yahoo generally only get counted once.

    It's impossible to determine a margin of error, unless you actually decide what exactly "on the web" means, which is a tricky question. Is a ppp-connected box "on the web"?

    And my DSL connection puts my Linux box on the Net 24/7. But it doesn't get counted by netcraft.

    Netcraft bills this as a "survey of Internet connected computers". But in practice that's a very slippery concept.

  • You've got it all wrong. Netcraft finds new hosts by walking the DNS (amongst other methods). So just putting a website out there with a proper domain name makes it available.
  • Has anyone seen the latest ad from Microsoft? It clearly says "More sites on the Internet use IIS and Windows NT than any other OS/Server combination, even Solaris*".

    * September 1999 Netcraft survey

    This is clearly crap. Can they do this with a good conscience? IIS is *way* behind Apache in number of websites, so how can they publish such a claim? Either way, the PHBs will read that crap and buy into it.
  • Well, sure, there's no margin of error in the servers they count, but they're extrapolating the results to the entire web. They're essentially polling a sample of servers out there, and that can't be 100% accurate... right?
    ----
  • actually, of the commercial servers, Zeus is by far the best.

    http://www.zeus.co.uk [zeus.co.uk]

  • by Rasmus ( 740 )
    That is hard to say. There are certainly a lot of servers out there with PHP enabled that host domains that may not use PHP at all. Hence the IP count of 357,481 unique ips found to have PHP enabled. Then you also have quite a few large ISPs that use PHP in CGI mode under suExec with a bunch of people using PHP that way and none of those guys are counted in these stats, so that might even things out a bit. The real number is probably somewhere in between the 357,481 and 1,114,021, but exactly where is impossible to say.

    -Rasmus
  • Apache is a decent web server, though not as good as many commercial offerings. On the other hand, it's free and can saturate most network connections on reasonable hardware.

    I suppose this validates the open-source model. Or perhaps it makes a case for the hundred-monkeys-on-a-hundred-typewriters model.

  • Comment removed based on user account deletion
  • Some Apache related items are "important" enough that they deserve notice on the main /. page, instead of being placed in the Apache section. It's a judgement call, but I figured the increase, especially after last month's "decrease" warranted the article being front and center :)
  • by Gurlia ( 110988 ) on Wednesday December 01, 1999 @05:55AM (#1490335)
    Don't you love the big, Big, BIG gap between Apache and everybody else?

    Not at all. In fact, I hope there will be a competitor of Apache which occupies roughly the same percentage of the market. Competition is always healthy. Whenever there is no competition, there is no reason not to sit back and relax. And that's when quality starts to drop. I'll probably get flamed for this, but MS products didn't suck that much when MS was still a small company. It's only when they became a dominant force that their products began to really deteriorate.

    IMNSHO, shouting hooray to Apache because it's the dominant factor in the webserver market is no different from MS declaring how good the world would be if everybody switched to Windows. I'm not saying Apache sucks (I use it for a website project in fact), but that if there is no competition, eventually it will suck. (Note: this is not intended to be flamebait)

  • by Anonymous Coward
    1.) Zope is based on Apache

    2.) Apache 2.0 (aka new-httpd) is a major step WRT Apache's technique (i.e. it's multithreaded, runs on OS/390).

    See the list archive: http://www.geocrawler.com/lists/3/Web/417/0/ [geocrawler.com]

  • Does anyone have statistics over hits/platform?
    Since one must suspect that a number of apache sites are rather small that would be a more accurate measurement.
    (Not that I dont think Apache is ahead there too)
  • Zope can run as a web server on its own, or handle requests through Apache. I think the latter is recommended though.

    At least, this was so last time I looked. I know they've put quite a bit of work into making the Apache path faster since then.

    But you're right, what I wrote was misleading.
    --
  • ummm... is this [slashdot.org] what you are looking for?

    From said page:
    Anyway, you're welcome to use the code, but it is provided with no warranty and no support. You're on your own here, and I can't help you if you have problems (and you will have problems! This is fairly alpha code) you are on your own. The only restriction is that you must put a Slashdot Logo and a link back to Slashdot on any site that uses our code. Beyond that, have a ball.
  • You're probably referring to Hotmail bash-2.01$ telnet www.hotmail.com 80 Trying... Connected to www.hotmail.com. Escape character is '^]'. GET / HTTP/1.0 HTTP/1.1 200 OK Date: Fri, 08 Jan 1999 01:07:20 GMT Server: Apache/1.2.1 (Example taken from http://photo.net/wtr/dead-trees/) /Tobias
  • NT/IIS seems to be running a lot of commerce sites these days.

    It happens that if you wanted to run a commerce site and wanted to use SSL and have a "plug" saying that you were a good guy, you need a certificate from one of the authorities like VeriSign or Thawte. But, until the last year or so, they wouldn't issue a certificate to any site running a "free" web server. You had to have a commercial server. And of the commercial servers, IIS is by far the best and most well known to web site developers.

    -sw
  • I've poked into webserver "pros and cons" and about the only thing that can be said in defense of IIS is that it uses threads. Otherwise, IIS just makes no sense whatsoever (security nightmares abound). It sounds like Apache 2.* will allow one to use threading if desired (just a compile time choice). Once that is there, I can't see how IIS could be technically justified for any situation. I reckon Apache 2 will really raise the bar for web servers all around...looking forward to it!
  • but that if there is no competition, eventually it will suck. (Note: this is not intended to be flamebait)

    this is primarily only true for closed source products, where it's not profitable to continue to dump resources into that product anymore. people are always going to tinker with software like this, if nothing else to make their lives easier. the other danger is that everyone starts "thinking inside the box" and no new features are added, which likewise isn't true because we all have the source and can tinker if we think of some cool new feature.

    i can understand your concern, though. the problem is this: who gets forced to not use linux?

  • This is good news. I was getting worried when last month IIS made that big jump against apache. We can't have MickySoft taking over the whole internet now can we?

  • Sorry about the re-post, forgot to format. This should be a bit clearer.
    You're probably referring to Hotmail
    Try

    bash-2.01$ telnet www.hotmail.com 80
    Trying...
    Connected to www.hotmail.com.
    Escape character is '^]'.
    GET / HTTP/1.0

    HTTP/1.1 200 OK


    Date: Fri, 08 Jan 1999 01:07:20 GMT
    Server: Apache/1.2.1
    (Example taken from http://photo.net/wtr/dead-trees/)

    /Tobias
  • Three years ago I read an ad in one of the most popular newspapers, that said:
    "IIS is the most popular webserver all over the world *"
    and the fine print at the bottom said "*)among all commercial webservers on microsoft platforms"

    The lesson is: don't trust ads - check yourself or ask a real expert!

    :-)
    ms

  • I have seen a couple posts complaining about this undercounting. I would bet that it's done for legacy reasons. Probably, back when they started this, there wasn't much variety among the various Netscape servers and the various MS servers, so they decided to lump each into one number. Apache and its derivatives seem to have become popular somewhat later, however, so they may have decided to represent them in a somewhat better fashion. However, to change the method of representing either at this point would skew the rate of change at the datapoint where you made the change. They probably have to stick with what they have been doing so that the graph stays correct relative to the past.
  • I've often thought about starting some kind of web database where these false/misleading claims could be stored along with pointers to the documents showing the facts to be misleading.

    Things like the BYTEmarks and this could be explained. Something like the skeptics dictionary. [skepdic.com] Of course I just don't have the time or the webspace to track down everything.

    OTOH, probably could call it either www.phb.com or www.fudbusters.com [who y'gonna call?]...
  • After all this showed how good a Web server NT is for static pages ;-)

    NT/IIS is great for dynamic content too.

    After all, have you ever seen how fast it generates those "server too busy" messages on the fly? :-)
  • Notice that they tie the web-server to the OS. So, if they call different Linux kernels different OSes (something the FTC would be hard pressed to deny, since the Linux community always talks about how fast Linux changes from version to version...), and each different version of UNIX, the numbers may actually work out.

    I agree it is crap, but legally, I think it may be an easy statement to defend.
  • Hotmail has always run on Apache on top of Solaris, since the very beginning.

    Microsoft doesn't make anything with the cojones to handle it :)
  • It sounds like Apache 2.* will allow one to use threading if desired (just a compile time choice).

    Actually, IIRC Apache 2 will be a hybrid threading/forking server. The whole argument against threads was stability, namely if one of the threads crashes/blocks all of them do. Apache 2 will still fork some child processes which will then each be multi-threaded; thus you have the speed of threads but if one of them bites it you can just fork a new child.

    Chris
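The hybrid layout described above (a few forked children, each running several threads) can be modeled in miniature. This is an illustration of the process/thread structure only, not Apache 2 internals:

```python
# Toy model of a hybrid forking/threading server: the parent spawns worker
# processes, and each worker runs several request-handling threads. If one
# worker dies, only its threads are lost; the parent could fork a replacement.
import multiprocessing as mp
import threading

def handle(request: str) -> str:
    # Stand-in for serving one HTTP request.
    return f"200 OK: {request}"

def worker(jobs: mp.Queue, results: mp.Queue, threads_per_proc: int) -> None:
    def run():
        while True:
            req = jobs.get()
            if req is None:          # sentinel: shut this thread down
                return
            results.put(handle(req))
    pool = [threading.Thread(target=run) for _ in range(threads_per_proc)]
    for t in pool:
        t.start()
    for t in pool:
        t.join()

if __name__ == "__main__":
    jobs, results = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(jobs, results, 2)) for _ in range(2)]
    for p in procs:
        p.start()
    for req in ["GET /", "GET /index.html"]:
        jobs.put(req)
    for _ in range(4):               # one sentinel per thread (2 procs x 2 threads)
        jobs.put(None)
    for p in procs:
        p.join()
    print(results.get())
```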
  • Actually, I think there is a Server directive you can put in the conf file (don't recall what it is though) to change the message displayed here to something like "sorry, no OS/server info available"

    Shouldn't be too hard to find, though (unless I'm wrong and it doesn't exist)

    As for the OS detection, I don't know, but I would expect something like nmap, where it fingerprints the nuances of the TCP stack. No way to stop that from working...
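For what it's worth, the directive the parent is thinking of is probably ServerTokens (present in Apache 1.3); a minimal httpd.conf fragment, assuming that directive:

```apache
# httpd.conf -- trim what the Server response header reveals.
# "Prod" sends just "Server: Apache"; the default ("Full") includes
# the version, OS, and loaded module versions.
ServerTokens Prod
```

Note this only trims the HTTP header; as the parent says, it does nothing against TCP-stack fingerprinting.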

  • by Anonymous Coward
    No. The way to get your server high up in the stats is to (a) build a free service which offers free domains (or subdomains), (b) wait some days until you have >100,000 users with an insane number of domains, (c) submit the list of domains to Netcraft.

    hypermart.net is such a service, hosting more than 300,000 domains on Apache. It will make for a good increase in the number of Apache sites in the next month.

  • There is no margin of error (other than rounding errors), because they're not extrapolating anything; they're simply counting what every host (that they index) is running and then tabulating the results. Apache showed a 1% increase in market share of the hosts that they index.
  • Netcraft uses more of a web robot system to periodically browse every site they know about. It's not an on demand probing of the site in question.

    -sw
  • The note about Webjump using IIS for static pages and Apache for "processed" pages is much in line with the Mindcraft test stating that IIS is good at pushing out bits, but that Apache is good at CGI/PHP compared to ASP technology.

    It might be, too, that the static pages are not that static and the people changing them "just want to press a button", where the "processed" pages often involve a lot of database/monitoring/organising actions that are far better handled in a UN?X environment.

    Maybe this is not war but common sense.
