The Internet

Only 4.13% of the Web Is Standards-Compliant 406

Death Metal writes "Browser maker Opera has published the early results of an ongoing study that aims to provide insight into the structure of Internet content. To conduct this research project, Opera created the Metadata Analysis and Mining Application (MAMA), a tool that crawls the web and indexes the markup and scripting data from approximately 3.5 million pages."
This discussion has been archived. No new comments can be posted.

Only 4.13% of the Web Is Standards-Compliant

  • by AltGrendel ( 175092 ) <(su.0tixe) (ta) (todhsals-ga)> on Thursday October 16, 2008 @10:04AM (#25399217) Homepage
    ...on which standard the designer chose.
    • by morgan_greywolf ( 835522 ) on Thursday October 16, 2008 @10:07AM (#25399267) Homepage Journal

      'Looks good in Internet Explorer and doesn't seem to crash Firefox or Opera' is not a standard.

      • Re: (Score:2, Funny)

        by theaveng ( 1243528 )

        Does using make my code non-standard?

        • by remmelt ( 837671 ) on Thursday October 16, 2008 @10:31AM (#25399643) Homepage

          It sure makes your Slashdot comment non-standard!

          • Re: (Score:3, Funny)

            by andy19 ( 1250844 )
            I disagree- I'd say that's a pretty standard Slashdot comment.
      • by gnick ( 1211984 ) on Thursday October 16, 2008 @10:48AM (#25399923) Homepage

        'Looks good in Internet Explorer and doesn't seem to crash Firefox or Opera' may not be a standard, but it satisfies the bulk of most websites' customers. I'm a FF user and include myself in that group. I realize that sites are tuned for IE because it's the leader, and accept that my browser choice and add-ons sometimes make things look a little funny. As long as they work I don't care. I would guess that most visitors feel more or less the same (slashdot standards nazis excepted).

        Besides, if most of a web site's traffic is coming from a browser that doesn't support any standard but their own anyway, what motivation do they have to conform?

        • by Bogtha ( 906264 ) on Thursday October 16, 2008 @11:08AM (#25400237)

          'Looks good in Internet Explorer and doesn't seem to crash Firefox or Opera' may not be a standard, but it satisfies the bulk of most websites' customers. I'm a FF user and include myself in that group.

          The problem with that attitude is that not so long ago, Firefox wouldn't be in the list, and for many developers (including some I worked with this week) Opera is still not on that list. It's like Internet Explorer-only websites, only slightly laxer. So you use Firefox. Lucky you! How about all the people who use something less popular, e.g. Konqueror? How about all the people who must use something that will never be popular, such as people with disabilities? Shall we just say "tough, get off the web"?

          As long as they work I don't care.

          "Working" is not a property of a website. "Working" is a property of a combination of a website and a browser. You can't say that a website "works", only that it works in particular browsers.

          • Re: (Score:3, Insightful)

            by gnick ( 1211984 )

            How about all the people who use something less popular, e.g. Konqueror?

            Web pages will be tailored to suit the bulk of their traffic. Konqueror will learn to display them properly, regardless of their adherence to standards, or fall by the wayside as users get frustrated. It's not fair, it's not right, and it's not changing. Sorry.

            How about all the people who must use something that will never be popular, such as people with disabilities? Shall we just say "tough, get off the web"?

            Some pages are practical to make accessible to people with disabilities. Some aren't. When practical, web admins should make their pages accessible to the handicapped. Adhering to standards may make it easier to tailor specialized browsers for use, but the fa

      • by jellomizer ( 103300 ) on Thursday October 16, 2008 @10:54AM (#25400049)

        Internet Explorer is really the big troublemaker here. Any professional knows that their site needs to render flawlessly in IE first, be good enough in Firefox, and perhaps be workable in others. Following the "standards" barely leads to that outcome, since IE handles the standards so poorly that you really need to break them. I am still trying to find the HTML tag that gives IE users an electric shock.

        • by Bogtha ( 906264 ) on Thursday October 16, 2008 @11:10AM (#25400277)

          I am still trying to find the HTML tag that gives IE users an electric shock.

          Don't be silly, everybody knows you use CSS for that. (Cruelly Sadistic Styleshocks).

      • Re: (Score:3, Insightful)

        Yes it is. It is the standard that everyone shoots for. The de facto standard, if you will. It is not a rigorously defined standard published by an internationally recognized standards body. I'm afraid there is not a single standard definition of the word standard [google.com] in the English language.

        Isn't English fun, my compeer?
    • by g0dsp33d ( 849253 ) on Thursday October 16, 2008 @10:22AM (#25399513)
      But if we completely reverse the standards we should be at 95.87% compliance!
    • All the major browsers are vying to be top dog at creating a smiley face where there were once colorful blurs... and now colorful flashy rainbows!
  • Depends on how strict they're being.
    For example, I never close paragraph and line break tags, but otherwise my html is compliant.

    • Re:How compliant? (Score:5, Informative)

      by DrSkwid ( 118965 ) on Thursday October 16, 2008 @10:08AM (#25399301) Journal

      It is very simple http://validator.w3.org/ [w3.org]

      • Are there degrees of strictness?
        If you claim your code is HTML 4.01 or XHTML 1.0 or whatever, then it either is or it isn't.

        • Are there degrees of strictness?

          Yes. HTML 4.01 and XHTML 1.0 each have two DTDs: a "transitional" DTD that allows presentational elements and a "strict" one that disallows them. The trouble is that a couple structural elements and attributes got removed by mistake in the strict DTDs along with the presentational ones, most notably the value attribute of the li element. For this and other reasons, most valid HTML that I've found has used a transitional DTD.
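
          For reference, these are the two HTML 4.01 doctype declarations being discussed; a validator checks each document against whichever one it declares, and the XHTML 1.0 doctypes follow the same strict/transitional split:

            <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
                "http://www.w3.org/TR/html4/strict.dtd">

            <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
                "http://www.w3.org/TR/html4/loose.dtd">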

          • by Bogtha ( 906264 ) on Thursday October 16, 2008 @10:42AM (#25399815)

            Yes. HTML 4.01 and XHTML 1.0 each have two DTDs: a "transitional" DTD that allows presentational elements and a "strict" one that disallows them.

            No, that's something different. There aren't degrees of strictness when it comes to validity. If a document claims to be a Strict document, and makes a single mistake, then it is invalid. If a document claims to be a Transitional document, and makes a single mistake, then it is invalid. In both cases, it's an absolute rule with no laxity.

      • Re: (Score:3, Insightful)

        by Jim Hall ( 2985 )

        It is very simple http://validator.w3.org/ [w3.org]

        Actually, not quite that simple. My top-level page currently gives 66 validation errors. Guess how many are from content I've had to include from a third party, where I have no control over their standards compliance? 66 of them - 65 are from WebRing, 1 is from a news feed.

        For those of you who are bad at math, that's all of the errors on that page. Note that all my other pages validate just fine, since I don't include third-party content there.

        Being part of the WebRing is still important for my site, so I h

    • Re:How compliant? (Score:5, Insightful)

      by morgan_greywolf ( 835522 ) on Thursday October 16, 2008 @10:12AM (#25399371) Homepage Journal

      Isn't that a bit like saying, "my C code fails to compile whenever I pass it the flag for strict ANSI checking, but other than that my code is ANSI C compliant"?

      • by gnick ( 1211984 )

        I think you nailed him - That's a perfect analogy. But, for web sites just like for application users, the target for compliance is not typically the end-user. When I download a new application, I don't care whether it was coded with 'ANSI C compliant' code. I just want it to work properly. When I load a web-page, I don't care if it meets some HTML standard. I just want it to display and function properly in my browser.

    • Re: (Score:3, Interesting)

      by DZign ( 200479 )

      Also depends on how old the websites they searched are..
      did they only look at recently added websites, or also at old pages that have existed longer than the standard they were validated against?

      • by DrSkwid ( 118965 )

        How many websites around now are pre-November 1995, when the HTML 2.0 standard [ietf.org] was released?

        "HTML has been in use by the World Wide Web (WWW) global information initiative since 1990. This specification roughly corresponds to the capabilities of HTML in common use prior to June 1994. HTML is an application of ISO Standard 8879:1986 Information Processing Text and Office Systems; Standard Generalized Markup Language (SGML)."

      • Re:How compliant? (Score:5, Informative)

        by Bogtha ( 906264 ) on Thursday October 16, 2008 @10:26AM (#25399565)

        did they only look at recently added websites, or also at old pages that have existed longer than the standard they were validated against?

        MAMA didn't validate against a single document type. They validated against the document type that each individual document claimed to be. So all the ancient HTML 2.0 pages out there will correctly be identified as valid if they are, in fact, valid HTML 2.0.

    • by alexhs ( 877055 )

      I never close paragraph and line break tags, but otherwise my html is compliant.

      In that case I think you're compliant when using the transitional doctype [w3.org]

    • Re: (Score:3, Informative)

      by Bogtha ( 906264 )

      Depends on how strict they're being.

      There aren't degrees of validity. A document is either valid or it isn't. You can't be "more strict" when validating something, if a tool offers you an option like that, then it is doing something other than validating, it's probably linting as well. There's at least one widely-used "validator" that doesn't actually validate at all.

      For example, I never close paragraph and line break tags, but otherwise my html is compliant.

      Yes you do. If you didn't close them

    • Is there a particular reason you don't?

      Do you understand why XHTML exists? How much more work it is to parse straight HTML, and how much less work it is for a browser to simply fire up an XML parser instead?

      <br /> isn't that difficult.
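
      To make the difference concrete, here is the same line break written both ways; the surrounding text is just filler:

        <!-- HTML 4.01: br stands alone -->
        Line one<br>
        Line two

        <!-- XHTML 1.0: empty elements are self-closed -->
        Line one<br />
        Line two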

    • why don't you close paragraph breaks? both HTML and XHTML require paragraphs to have end tags.

      line breaks don't need to be closed with a separate tag. in XHTML you simply write <br /> just like you would close other empty tags.

      it's not hard to follow conventions that are universal across all browsers. there's no reason to break open standards other than a.) ignorance (which clearly is not the case here since you know you're breaking standards) or b.) your site needs to render on a browser that does no

      • by Bogtha ( 906264 )

        why don't you close paragraph breaks? both HTML and XHTML require paragraphs to have end tags.

        No, closing tags for <p> elements are optional in HTML 4 [w3.org], HTML 3.2 [w3.org] and HTML 2 [w3.org]. I don't think any version of HTML has required them.
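
        For example, a validator will happily accept paragraphs written like this in an HTML 4.01 document (the text is just placeholder content):

          <p>Closing tags for paragraph elements are optional in HTML.
          <p>This second paragraph implicitly ends the first, and a validator will not complain.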

  • More like (Score:5, Funny)

    by ODiV ( 51631 ) on Thursday October 16, 2008 @10:07AM (#25399263)

    OMG 4.13% of the Web is Standards-compliant!?

  • W3C (Score:5, Informative)

    by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Thursday October 16, 2008 @10:08AM (#25399305) Journal

    W3C's validation tools

    Normally I'd go on my own rant but I'm feeling lazy today and recently I read a good article at A List Apart that sums it up [alistapart.com]. As for the W3C, I like this list they compile:

    W3C's Pros & Cons

    Pros:

    • Global
    • Academic and scientific body
    • Multiple interests represented, but mostly from paid member companies
    • Attempting to be more open via certain teams such as the HTML5 and CSS Working Groups
    • Attempting to appeal more to work-a-day world via redesigns, blogs, and more human-friendly language throughout the site

    Cons:

    • Creates "open standards" by ideal, not necessarily fact
    • Incredibly slow moving in a highly evolutionary environment
    • Poor economic model that relies on membership monies
    • Discourages independents and open process
    • Passive: only creates specs and recommends, does not do real outreach
    • "Ivory tower" perception

    You should read that article, it's pretty spot on for this subject.

    • Re:W3C (Score:5, Insightful)

      by Bogtha ( 906264 ) on Thursday October 16, 2008 @10:37AM (#25399747)
      • Incredibly slow moving in a highly evolutionary environment

      That's hilarious. We still can't use CSS tables or generated content on the web - features published by the W3C in the CSS 2 specification over a decade ago - because Internet Explorer doesn't support them yet. We need to use JavaScript frameworks or otherwise normalise event handling because Internet Explorer doesn't support DOM 2 Events - a specification published by the W3C eight years ago (even Internet Explorer 8 won't support it). And SVG anyone? XHTML? MathML?

      Get back to me when browsers make it out of the 90s before telling me the W3C is "incredibly slow moving".
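
      For anyone wondering what those decade-old features look like, here is a minimal CSS 2 sketch; the class names are made up for the example:

        /* CSS tables: lay out boxes as rows and cells without <table> markup */
        .row  { display: table-row; }
        .cell { display: table-cell; }

        /* Generated content: text inserted by the stylesheet rather than the markup */
        .external:after { content: " (external link)"; }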

      • In fact, this is pretty much entirely due to IE.

        Do you suppose, if Google started blocking IE from their homepage by user-agent, that the situation would improve?

        • I think it would be a great boon for LiveSearch.

          Let's face it, the average user isn't going to think to blame their browser, they're going to blame Google.

  • I wonder if (Score:3, Interesting)

    by Jane_Dozey ( 759010 ) on Thursday October 16, 2008 @10:09AM (#25399321)

    I wonder if they're throwing away every page that doesn't fully comply or if they're actually including the pages that almost comply but have a typo or missing doctype or missing closing tag. I'm guessing the former by the numbers which seems a little unfair to me.

  • by cosmocain ( 1060326 ) on Thursday October 16, 2008 @10:10AM (#25399339)
    ...the rest just renders perfectly in IE.

    (I would prefer if there wasn't any truth in it.)
  • For example, xhtml-strict does not include support for "target" attributes in links. What kind of idiotic decision was that?

    So, people then choose xhtml-transitional, which is much more relaxed, etc.

    Another thing is the inclusion of embedded XML inside HTML, which, due to lack of support in the standards, completely breaks "standards-compliance", whatever that means.

    Now, if you're talking about DOM, then that's another story.
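
    To be concrete about the target complaint: a link like the one below (the URL is just a placeholder) validates under XHTML 1.0 Transitional but is rejected by the XHTML 1.0 Strict DTD, which simply has no target attribute:

      <a href="http://example.com/" target="_blank">Example</a>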

    • by sakdoctor ( 1087155 ) on Thursday October 16, 2008 @10:15AM (#25399421) Homepage

      Well, not in the least bit idiotic, actually.
      It's up to me as a user to choose where a URL opens, especially since we are all using the tabbed paradigm now.

      • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday October 16, 2008 @10:38AM (#25399759) Homepage Journal

        It's up to me as a user to choose where a URL opens, especially since we are all using the tabbed paradigm now.

        User agents currently do not allow the user to submit a form into a new window or tab. This is the nearly nine-year-old bug 17754 on bugzilla.mozilla.org with 99 votes.

        • by Bogtha ( 906264 )

          User agents currently do not allow the user to submit a form into a new window or tab.

          Webkit based browsers do.

      • by l0ungeb0y ( 442022 ) on Thursday October 16, 2008 @10:43AM (#25399829) Homepage Journal

        XHTML-STRICT is not for everyone, it's intended for those (like me) who are more development oriented and wish to completely separate structure from presentation. A "target" attribute is clearly a presentation attribute since it defines how the linked reference is presented to the user and as the parent noted, it should be up to the user to make that choice.

        When wanting to control presentation in XHTML STRICT, you should use the DOM or CSS; that way, the structure (XHTML) is separated from the presentation (JS/CSS). I typically link all scripts and stylesheets, so the XHTML stays portable in terms of data, with the JS/CSS only affecting a web client. In the OP's case, a simple id attribute on that particular anchor would work just fine: you could bind an event listener for a click event to that element, execute your JavaScript popup code when that event is triggered, and cancel the event so that the browser does not follow the link on its own (a rough sketch follows below). That way your default browser clients could execute the JS instructions, while a 3rd party app (an AIR desktop or mobile device) could put its own custom behavior in if desired.

        While that sort of practice may seem extreme to a designer, as a developer I can swear to its scalability and portability for supporting 3rd party access, such as when developing a web UI that needs to support many types of clients via one codebase.

        If none of those features makes sense or strikes you as worthwhile, I suggest you stick to XHTML TRANSITIONAL, which is probably better suited to your needs.
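
        Roughly, the pattern described above looks like this. It is only a sketch; the id, URL, and link text are invented for the example, and older IE versions would need attachEvent instead of addEventListener:

          <a id="map-link" href="http://maps.example.com/">View map</a>

          <script type="text/javascript">
          var link = document.getElementById("map-link");
          if (link && link.addEventListener) {
              // Bind a click listener to the anchor instead of using a target attribute.
              link.addEventListener("click", function (event) {
                  window.open(link.href);   // open the reference however this client prefers
                  event.preventDefault();   // cancel the default navigation
              }, false);
          }
          </script>
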

    • by mikael_j ( 106439 ) on Thursday October 16, 2008 @10:19AM (#25399471)

      For example, xhtml-strict does not include support for "target" attributes in links. What kind of idiotic decision was that?

      A very good decision, there are two main uses for the "target" attribute:

      • Frame-based sites - Old-school, annoying way of designing sites that I and many others feel should not be used for new sites.
      • To automatically open links in a new window - Annoying behaviour by web developers who think no one could possibly want to, god forbid, leave their site in favor of another site.

      /Mikael

      • I wouldn't want to browse API documentation in a non-frame based environment.

        • I wouldn't mind it -- have you seen some of the better API doc sites?

          What's more, "no frames" doesn't have to imply "no frame-like behavior" -- CSS can give you a little box whose contents scroll independently of the parent, and Javascript can give you links that don't refresh the entire page.
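
          Something like the following is all the CSS part takes; the class name and height are arbitrary:

            /* A fixed-height panel whose contents scroll independently of the page */
            .doc-panel {
                height: 20em;
                overflow: auto;
            }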

          • by Sancho ( 17056 ) *

            Sites requiring the use of Javascript are pretty evil. And what CSS style are you referring to--I don't think I've run across that one. Usually, when I see behavior like that, it's an iframe.

        • why? so you can save .025 seconds by not reloading an index or menu?

          there's no good reason to use frames or even iframes in a modern site. it's bad for search engine indexing, and it's bad for usability. that is why major API documentation sites like those for the YUI Library [yahoo.com], MySQL [mysql.com], PHP [php.net], and even MSDN [php.net] do not use frames in their layout.

          even if there were a need to keep persistent layout elements, you can use AJAX to simulate all of the desirable behaviors of frames/iframes without the drawbacks.

      • There are good and bad times to open a new window. I use it on my sites whenever I'm showing a map of something. "This week's game will be played Here (Map)". That lets me link to Google Maps. If it's something on my 'link' page, then I let it open in the current window. It's the difference between an aside and a new paragraph.

        • Re: (Score:3, Informative)

          Not really a lot of point to it, though -- savvy users will simply middle-click on the link if they want it in a new tab/window. If they don't, that generally implies they want it right where it is, and your attempt to open a new tab/window is going to be annoying.

          But hey, at least using a target for that is better than linking to a javascript: URL. A lot of sites are even worse -- they add an onClick event, and they set the link href to #, or to javascript:void(), meaning that middle-clicking on it inevita

          • by Sancho ( 17056 ) *

            The web is not, and should not be, designed for savvy users.

            Savvy users will get around the stupid measures put into place for everyone else.

      • Actually there is another very good use for automatically opening a window (not mentioning target as it's just a vehicle to get you there)

        When a link is possibly important to a user but would in fact break the flow of their current activity, a link should be set to open in a new window - preferably one which does not go full screen to hide the window they are really using.

        This is a usability issue. You should not make the user think about having to open a link in a new tab or window if they click a link to

        • Ah yes, for those special purposes it can be useful; unfortunately, it has been proven that giving developers the ability to open links in a new window/tab is something that will be abused, especially by people who seem to think that their site is oh-so-important and that the user couldn't possibly want to leave it.

          /Mikael

        • Re: (Score:3, Insightful)

          by Bogtha ( 906264 )

          When a link is possibly important to a user but would in fact break the flow of their current activity, a link should be set to open in a new window - preferably one which does not go full screen to hide the window they are really using.

          If you use the target attribute, you have no control over the size of the window and it is very likely that it will obscure the current window. You need JavaScript to get the effect you desire, and if you are using JavaScript, why bother with a new window when you can d

    • Re: (Score:3, Interesting)

      by hansamurai ( 907719 )

      The lack of a target attribute really bothered me when I first ran into it. Their argument was something like how websites shouldn't be controlling the browser, as in creating tabs/windows, etc. Of course you can hack it in with Javascript which is something I refused to do, what's the point of striving to be standards compliant when you break it a minute later with Javascript? Anyways, I thought about it and kind of agreed with the notion, so now I just externally link a lot less.

      • I am so full of hate at sites that continually want to open new windows. When I can, I use a browser that lets me turn that off. So what's the point of you breaking the standard in ugly javascript, only to have me turn it off?

        You don't need multiple windows. If you think you do, you're wrong. If you're not wrong *I'll* open a new window.

        • by Sancho ( 17056 ) *

          *fills out a long web form*
          *gets to bottom of page, sees privacy policy*
          *thinks "Hmm, there's a lot about privacy and facebook on the news, maybe I should check this out"*
          *clicks policy. policy opens in current window.*
          *reads policy*
          *clicks back*
          *screams as entire form has been erased*

      • Of course you can hack it in with Javascript which is something I refused to do, what's the point of striving to be standards compliant when you break it a minute later with Javascript?

        Well, Javascript is a standard.

        I prefer progressive enhancement -- make the link a plain old link, and useful on its own, then override onClick to do whatever you want. Browsers that support middle-click-open-in-new-tab don't seem to count that as an onClick event.

        • Javascript may be a standard but I don't consider it an excuse to break usability. I use Noscript and I don't even bother to explore sites that are totally borked without their Javascript crutch. I spend a lot of time making sure everything works whether Javascript is on or not.

  • Surprised? (Score:5, Insightful)

    by secondhand_Buddah ( 906643 ) <secondhand.buddah@gma i l . c om> on Thursday October 16, 2008 @10:11AM (#25399355) Homepage Journal
    Why is this a surprise? We are limited by non-standards compliant browsers.
    Unfathomable amounts of development time have been wasted over the years trying to get sites running and usable in multiple browsers.
    To complicate the issue, over the last few years there has been an explosion in the number of browsers on the market. It is really no fun navigating this modern tower of Babel.
    If I had one wish that would be granted, it would be that all browsers would be compliant with a standard. Literally millions of man-years in development time could have been saved if this issue had somehow been nipped in the bud earlier on.
    • To complicate the issue, over the last few years there has been an explosion in the number of browsers on the market. It is really no fun navigating this modern tower of Babel.

      To simplify the issue: maybe once every few months, I have to fix an issue where our site works in Firefox but not Safari.

      Every few days, I have to fix an issue where our site works everywhere else except IE.

      If we didn't have to deal with IE, the problem would be a complete non-issue. Any page I build that I'm not being paid to make usable in IE, I don't bother making usable in IE.

  • by bboxman ( 1342573 ) on Thursday October 16, 2008 @10:21AM (#25399489)

    This is sad. The situation is even worse in some non-English web domains.

    Why can't the web stick to something simple? 95% of the sites I use would be fine with just plain simple HTML 2.0. Instead, we've got JavaScript, CSS, XHTML, and other buzzwords, which in the end take control of how a web page looks out of the user's hands.

    I like to read text, on a monitor, green on black (or white on black). I would like to format a web page the way I want to see it.

    The vast majority of the web is simple formatted text. There is no reason for this to constantly evolve onwards and onwards.

  • 1. the web is still evolving, the standards keep changing. no pressing need to lock things in
    2. it is superior design to have a browser that gracefully degrades rather than being brittle and refusing to render every time someone forgets to close a <p> element. not simply because of nonstandard pages, but for a whole host of other reasons, including handling partial transmissions
    3. the strength of the web is open participation, low barrier to entry. hobbyists should publish, and this is a good sign. hobbyists should not be expected to be anal retentive standards zealots

    complete standards compliance should always be low on the web because this is a sign of a HEALTHY internet, because it means nonprofessionals are contributing content. this is always a good thing; this is what made the internet a powerful new form of media in the first place. if ever there were some sort of gatekeeper organization or rigorous technical specification that enforced standards compliance, you would raise the barrier to entry onto the web for regular joes. you would reduce the variety of the web, make it more monoclonal, and hurt a vibrant community

    low standards compliance is not only a complete nonissue and not a problem, it's a good sign. the lower standards compliance is, the better for us all

  • by thermian ( 1267986 ) on Thursday October 16, 2008 @10:35AM (#25399707)

    Does it mean that the other 95.87% of websites did not find the standard useful?

    Or perhaps that the standard is poorly presented, causing fewer people to be aware of it?

    My personal leaning is that the standards body lost control of their 'standards' a long time ago, but they haven't realised yet. The only real thing most web devs care about is 'does my site/application run as required in the browsers I need it to?' If the answer is 'yes, if you don't follow the standard', then the standard is ignored.

  • Slashdot (Score:2, Interesting)

    http://validator.w3.org/check?uri=http%3A%2F%2Ftech.slashdot.org%2Farticle.pl%3Fsid%3D08%2F10%2F16%2F1325215&charset=(detect+automatically)&doctype=Inline&group=0 [w3.org]

    If Slashdot shows up with 28 errors, would you really expect anything at all out of the non-technical media?

  • by NoNeeeed ( 157503 ) <slash&paulleader,co,uk> on Thursday October 16, 2008 @10:39AM (#25399777)

    You only need to make one mistake in your markup to be non-compliant. I would be interested to see what the degree of failure is for the other 95.87% of sites. My website, Wii Fit Forum [wiifitforum.org] currently fails on six counts, all just simple errors in the code which I plan to fix. But currently, the site displays just fine, so I have more important things to worry about. I think this is the same for many publishers.

    Unfortunately for the novice, the ignorant, the lazy or the just plain error-prone (the last two are me), the W3C and the browser industry do not make it that easy to be compliant.

    HTML standards are the current prime example of the old joke "the great thing about standards is that there are so many of them". The W3C really needs to stop pissing around with all this semantic web crap, and concentrate on making what is already there work better.

    We need a single standard which embodies all the best elements of the existing ones in a coherent form, and then the browser manufacturers need to get their arses in gear and implement it properly. The novice developer is currently confronted with a mish-mash of alternative doc-types, each of which has different pros and cons, and which may or may not work properly depending on your browser. It needs to be done soon, not over a ten-year timescale.

    When you can stop worrying about whether your site will work in various browsers, then people will spend more time on compliance. Until then, people will worry about the important things, such as their readers being able to see their site properly.

    I know I should treat standards with more importance, but while the current mess persists it is hard to care.

  • by PortHaven ( 242123 ) on Thursday October 16, 2008 @10:40AM (#25399791) Homepage

    When they don't work with the tools (various browsers).

    Better to build a website that works than one that meets standards but displays poorly in the browsers of your users.

    Ask yourself this simple question: if it does not look good in the browser, is your client going to accept "Well, it's coded to standards!"? Heck no...

  • by Ed Avis ( 5917 ) <ed@membled.com> on Thursday October 16, 2008 @11:25AM (#25400525) Homepage

    Nowadays making sure your site is valid HTML is easy. Just install the excellent HTML validator plugin [skynet.be] for Firefox. It gives you a tick or cross icon on each page; double-click the cross to view the page source with a list of errors. It does the validation locally on your machine, not sending the content off to some server, so it's fast.

    If you're writing dynamically generated pages it is a great way to find bugs in your code, and it's unobtrusive enough to leave it turned on all the time.

"It's a dog-eat-dog world out there, and I'm wearing Milkbone underware." -- Norm, from _Cheers_
