
The ASP.NET Code Behind Whitehouse.gov

An anonymous reader writes "The author looks at the markup for the new whitehouse.gov site, launched today. It uses ASP.NET and various JavaScript libraries, and it suffers from various inefficiencies, most of them easily remedied. The article also examines the images and techniques used to build the site's front end."
This discussion has been archived. No new comments can be posted.


  • Maybe we can (Score:3, Insightful)

    by AKAImBatman ( 238306 ) * <akaimbatman AT gmail DOT com> on Tuesday January 20, 2009 @09:23PM (#26540059) Homepage Journal

    The whitespace.gov site has lots of whitespace characters.

    Hmm... Freudian slip? FWIW, most of the whitespace is probably generated by the ASP technology they used. Spots where code goes can often generate unexpected whitespace. It's just a side effect of using scriptlet-type technologies.

    Don't get me started on the "correct" way to write ASP. The tag-substitution gobbledygook encourages all kinds of bad page-development practices. The request-time attributes + templating taglibs seen in JSP provide a much cleaner separation between logic and HTML.

    The whitehouse.gov site uses more GIFs than PNGs.

    Ouch. Welcome back to 1996. Web developers looking for the smallest file size really need to learn that there is such a thing as palettized PNGs. They're often even smaller than GIFs.

    The whitehouse.gov site uses heavy JPG compression.

    This is one place where my reaction is "BFD". As long as it looks acceptable under most circumstances, the people with the carefully color-calibrated monitors (*cough*yeslikeme*cough*) can suffer a bit. Heck, half the JPGs on the internet look like garbage, so I'm not too worried about artifacting in the gradients.

    The whitehouse.gov site uses IIS 6.0. The whitehouse.gov site uses ASP.NET 2.0.

    Can we choose the right technologies for a website? No we can't! Thankfully, the President isn't hired to choose the best technology to run his website. ;-)

    • by Anonymous Coward

      The fact that he uses them makes them the best.

      As of today, IIS6 is the best web server and ASP.NET 2.0 is the best web application framework.

    • But I agree about the heavy JPEG compression. It is one of my pet peeves, actually. Especially when people use JPEGs for images that really should be GIFs.

      And yes, PNGs are smaller, but I've found that IE6 can do weird things to them. Even IE7 can sometimes do funky things with the colors in PNG images. And don't forget you can't easily do transparent PNGs until IE6 is finally flushed out of our system (if you are reading this and are using IE6, please upgrade for the love of god).

      Can we choose the ri

      • by ergo98 ( 9391 )

        Like most PHP sites, the aspx extension is a dead giveaway :-) ASP.NET MVC is URL-based, just like a good mod_perl app or a Rails app.

        Many, many ASP.NET sites use URL rewriting. http://www.yafla.com/dforbes/The_Best_And_Worst_of_2008/ [yafla.com] goes to an aspx page. http://www.yafla.com/dforbes/Could_Microsoft_be_the_Patron_Saint_of_Firefox/ [yafla.com] goes to the same aspx page, albeit with different parameters.

        ASP.NET MVC brings a nice model, but it certainly wasn't first to RESTful URL rewriting.
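
        For anyone curious what that kind of rewriting looks like in ASP.NET, here is a minimal sketch of an IHttpModule that maps friendly URLs onto a single .aspx page. The /dforbes/ path scheme, the article.aspx target, and the class name are all hypothetical (this is not yafla.com's or whitehouse.gov's actual code), and the module would still need to be registered in web.config.

        // Hypothetical sketch: rewrite /dforbes/<slug> onto a single article.aspx page.
        using System;
        using System.Web;

        public class FriendlyUrlModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                app.BeginRequest += delegate(object sender, EventArgs e)
                {
                    HttpContext context = ((HttpApplication)sender).Context;
                    string path = context.Request.Path; // e.g. /dforbes/The_Best_And_Worst_of_2008

                    if (path.StartsWith("/dforbes/", StringComparison.OrdinalIgnoreCase))
                    {
                        string slug = path.Substring("/dforbes/".Length);
                        // Every friendly URL is served by the same .aspx page,
                        // just with a different query-string parameter.
                        context.RewritePath("/article.aspx?title=" + HttpUtility.UrlEncode(slug));
                    }
                };
            }

            public void Dispose() { }
        }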

      • And yes, PNGs are smaller, but I've found that IE6 can do weird things to them.

        256-color PNGs work correctly in IE6 [johndyer.name]. No need to do anything special.

        And don't forget you can't easily do transparent PNGs until IE6 is finally flushed out of our system

        Put pngfix.js [ntlworld.com] in an IE conditional tag if it bothers you. I refuse to stop the progress train because of Microsoft's ill-gotten monopoly.

        If they were using ASP.NET MVC you wouldn't have even known the thing was running ASP.NET.

        I confess, I haven't used ASP.NET 3.

        • Re: (Score:3, Insightful)

          by Mad Merlin ( 837387 )

          The PNG hack for IE6 has some rather fatal drawbacks, particularly if you want to use it to replace background images instead of just regular <img>s. The most obvious issue is the z-index one, in that the PNG hacked PNG is rendered on a layer on top of the canvas, for which no events (including clicks) will pass through, unless you throw in a lot more hacks and you're really lucky. The other more insidious problem is that any element with a PNG hacked PNG must have layout, meaning it's almost guarante

          • The PNG hack for IE6 has some rather fatal drawbacks, particularly if you want to use it to replace background images instead of just regular <img>s.

            Fair enough. On the sites where I've used pngfix, I haven't run into these issues. Of course, these days I've started "firing" customers who use IE at all. Hobby sites are tons of fun that way. Wish I could do that at my day job. ;-)

            PNG8 is an interesting aside, but only marginally better than just serving GIFs to IE6.

            FWIW, everything in the article is about marginal improve

        • by KlomDark ( 6370 )

          Hold off on the ASP.NET MVC stuff for now. It's really a pathetic implementation at this point.

          Stuck with only basic HTML controls, nearly none of the more advanced ASP.NET controls work, as most continue to rely heavily on ViewState, which isn't supported by the MVC framework.

          So until they convert most of the advanced controls, it's just going to create more work to use it.

          Worse than Ruby on Rails for now, but give them time...

        • Comment removed based on user account deletion
        • I confess, I haven't used ASP.NET 3.5/MVC yet. It would have been handy back when I inherited an ASP.NET project. Unfortunately, after my poor experiences with ASP.NET's scalability (or lack thereof), I'm not really inclined to develop another site in ASP.

          You mean since you lacked the knowledge to properly scale an ASP.NET application?

      • I would love to have the users I manage dump IE6. Ask Microsoft to support IE7 or IE8 for Windows 2000. We have some rather expensive hardware at work that requires Windows 2000 due to driver compatibility (XP will NOT work), and a website for a company we work with requires ActiveX... not a common combination, I know, but try getting users to open IE for one site and Firefox for another! It is harder than you might think.
    • Re: (Score:3, Insightful)

      by zoips ( 576749 )

      The request-time attributes + templating taglibs seen in JSP provide a much cleaner separation between logic and HTML.

      Wait, are you serious? JSP is no better than classic ASP; it's arguably quite a bit worse than classic ASP since it isn't language agnostic. JSP is a defunct and outstandingly annoying technology to work with that encourages all sorts of bad habits.

      You might consider checking out Tapestry 5 [apache.org] for something a little more this century.

      • Re: (Score:3, Insightful)

        by AKAImBatman ( 238306 ) *

        JSP is no better than classic ASP; it's arguably quite a bit worse than classic ASP since it isn't language agnostic.

        Language agnosticism is pointless when you're not writing code. Correctly written JSP 2.0 files should have no scriptlets in the pages. All the code should be in behind-the-scenes APIs or in parent servlets that use the page for rendering. The JSP solution is far more flexible than ASP.NET's code-behind approach, and generally encourages better-written code. Looking at ASP.NET pages th

        • by zoips ( 576749 )

          Whatever helps you make it through the day.

          In comparison to Tapestry or Wicket, JSP is defunct, you really can't deny that.

          • Oh for the love of God. You have no idea what you're talking about.

            http://java.sun.com/products/jsp/ [sun.com]

            Use JSTL. Don't use JSF. There, good to go.

            I was there when the first version of Tapestry was introduced to the internet. It was an interesting idea, but it took quite a few revisions to make it into something useful. Today it's an alternative to JSPs for rendering, but it by no means retires the use of JSPs. Same with Velocity and Wicket. And if you try to introduce me to Spring as if it's the latest and greate

            • Comment removed based on user account deletion
              • Can you recommend any good beginner references to get started with J2EE stuff?

                FWIW, you should probably start with Glassfish (aka Sun Java System Application Server) and Netbeans. That combination is extremely smooth for J2EE development and will have you up and running in no time flat. (Plus it's all open source. ;-)) Just visit Netbeans.org [netbeans.org] and grab the full version or the version with Glassfish bundled.

                Once you get it set up, you can configure features like the JDBC connections through the admin console o

        • Re: (Score:3, Interesting)

          On a broader note, .NET's language agnosticism is a farce. You can have any color you want (slate, charcoal, basalt, jet, etc.) as long as it's black. There are no real differences between the languages. They have all been modified to fit the C# mold

          This isn't entirely true. For example, you can have normal ISO C++ compile into pure MSIL - VC++ does that (if you use /clr:pure). When you look at the features the runtime provides, it's pretty obvious: it has raw data and function pointers, arbitrary-layout st

        • I tend to agree and disagree. The fact is, whatever clever thing you think you can do with JSP can also be done with ASP.NET or even ASP... if you know your way around the language, web services, APIs, and even .exe calls can all be done with all of these technologies.

          If you want to trash-talk one language or another, be sure that you have a firm point.

          That being said, I tend to like one over another purely because of the ease of use of the designer/utilities/tools associated with it. Komodo was cool for awhi

    • I'd expect that the image compression issues are a short-term measure to handle the large amount of traffic they're expecting right now. Because they're damn ugly.
    • The tag-substitution gobbledygook encourages all kinds of bad page-development practices.

      That "gobbledygook" generates valid XHTML, if you tell it to.

      templating taglibs seen in JSP

      Bwahaha... sorry. There are better templating engines than ASP.NET 2.0, but JSP is not one of them - not in a million years.

      • Re: (Score:3, Insightful)

        by AKAImBatman ( 238306 ) *

        That "gobbledygook" generates valid XHTML, if you tell it to.

        So does every other page templating technology known to man. I can do it in ColdFusion if you ask me to. That doesn't make ColdFusion a particularly good language for scalable site development. Nor does it make ASP.NET anything special. And if I never see code like this again, it will be too soon:

        Featured1.Attributes.Add(...);
        Featured2.Attributes.Add(...);
        Featured3.Attributes.Add(...);
        Featured4.Attributes.Add(...);
        Featured5.Attributes.Add(...);
        Fea
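
        For what it's worth, that sort of repetition can usually be collapsed into a loop in the page's code-behind. The following is only a sketch under the assumption that Featured1 through Featured5 are Web Forms controls reachable from the page; the page class name and the attribute being set are made up, not taken from the actual site.

        // Hypothetical sketch: replace the FeaturedN.Attributes.Add(...) repetition with a loop.
        using System;
        using System.Web.UI;
        using System.Web.UI.WebControls;

        public partial class HomePage : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                for (int i = 1; i <= 5; i++)
                {
                    // FindControl only searches this naming container; controls nested
                    // inside a ContentPlaceHolder or user control need a recursive lookup.
                    WebControl featured = FindControl("Featured" + i) as WebControl;
                    if (featured != null)
                    {
                        featured.Attributes.Add("class", "featured-item");
                    }
                }
            }
        }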

    • I don't use PNGs because IE6 can't handle partial pixel transparency, and some morons unfortunately still use it. Other than that, it's a better format. That said, I did use three layers on one website, two of which were semi-transparent PNGs, and even my 8600GTS OC can't render it while scrolling without severe skipping in IE7 or Firefox 3. I think PNG still needs some work (or the browsers do).
    • FWIW, most of the whitespace is probably generated by the ASP technology they used. Spots where code goes can often generate unexpected whitespace. It's just a side effect of using scriptlet-type technologies.

      Don't get me started on the "correct" way to write ASP. The tag-substitution gobbledygook encourages all kinds of bad page-development practices. The request-time attributes + templating taglibs seen in JSP provide a much cleaner separation between logic and HTML

      We're talking about ASP.NET here, not pl

      • We're talking about ASP.NET here, not plain ASP

        I can't think of any nice way to put this, so I'll just go with you fail it [slashdot.org].

        Don't you get tired of typing .NET after everything? That's as obnoxious a marketing scheme as putting "Sun Java" before every name.

        ASP.NET is not a "scriptlet-type technology", not any more so than JSF is - in fact, much like JSF, it's component-oriented.

        Scriptlet technology is a core part of ASP.NET whether you like to admit it or not. Ideally it should not be used, but it is ther

    • Re:Maybe we can (Score:5, Informative)

      by Xest ( 935314 ) on Wednesday January 21, 2009 @05:26AM (#26543557)

      Whilst I'm a fan of PHP myself, I have to say the new ASP.NET MVC framework is rather good.

      It really does beat hands down anything in the PHP world in terms of how quickly you can get something up whilst maintaining quality. I'd argue this is partly because of the Visual Studio integration and the power of Visual Studio to start with.

      I wouldn't ever build a live app in Web Forms, but I have to admit I could be pretty tempted by the new MVC framework. The various PHP frameworks out there that perform a similar task, such as CakePHP, could learn a lot from it in terms of how quickly you can build with it without the usual sacrifice in software quality that you get with Microsoft's tools (again, Web Forms for example).

      Of course, the Microsoft platform is slowly getting better as well. Windows Server 2008 and IIS 7 really aren't that bad, performance isn't too different from Apache/PHP now, and security has improved a lot since .NET. Then again, Apache was always pretty good, so saying IIS is improving doesn't mean much in that context, but combined with .NET MVC I think the whole LAMP platform needs to watch out. The various PHP frameworks like Cake could start by dropping the stupid "cake bake" stuff that fails half the time on Windows and is pretty much undocumented (it has one page in the docs that doesn't explain much beyond what it's for). In contrast, Symfony is absolutely fantastic on documentation, and so is Zend, but they're still much more hassle, and there are still many more security pitfalls you have to keep an eye on vs. ASP.NET MVC.

      I'd probably like Java, but I've never used it professionally, only academically, so I can't compare with that. At the end of the day though, my point is that with ASP.NET MVC I can build a high-quality site much more quickly and with much greater confidence than I can with any PHP framework right now.

  • wait, a government project that suffers from easily remedied inefficiencies??!?!
    no way.
  • This page is a way (Score:5, Insightful)

    by Nimey ( 114278 ) on Tuesday January 20, 2009 @09:44PM (#26540267) Homepage Journal

    for the author to show his superiority to the Internet. None of what he cites really matters.

    • Re: (Score:3, Interesting)

      by ergo98 ( 9391 )

      for the author to show his superiority to the Internet. None of what he cites really matters.

      True enough. Indeed, the page in question actually validates as XHTML Transitional [w3.org] which is something that is remarkably rare [yafla.com] and shows a concern for craftsmanship.

      • Actually, merely using ASP.NET components for all markup will give you valid XHTML Transitional for ASP.NET 2.0+ with default settings. So no surprise there - it just means the guy who made the website knows how to use ASP.NET properly (well, maybe that's a surprise in and of itself, of course).

      • Indeed, the page in question actually validates as XHTML Transitional [w3.org] which is something that is remarkably rare [yafla.com] and shows a concern for craftsmanship.

        I noticed that, too. The CSS, however, does not validate. Still, I take your point. ASP.NET is not the tool I would use, but they have done well with the tool of their choice.

    • Exactly. When I saw this headline, I expected a solid critique of the use of ASP.NET (there are certainly reasons to criticize this), but instead, it looks like a 12-year-old wrote it. The website uses JPG compression, has some extra whitespace, and uses -- gasp -- gzip? That is about as important as how often I clip my toenails.
      • by Yath ( 6378 )

        Strange - it almost seems like you interpreted everything on that page as a criticism.

  • Helpful advice?? (Score:5, Insightful)

    by biocute ( 936687 ) on Tuesday January 20, 2009 @09:46PM (#26540293)

    Many developers use jQuery from Google's servers for improved performance and lower latency.

    Is this guy serious? Advising Whitehouse.gov to use a remote server to serve its JavaScript?

    • The theory behind using the Google-served copies of JavaScript libraries isn't to reduce load on the whitehouse.gov servers, but to improve caching.

      Since HTTP caching is URI-oriented, a browser can't tell whether whitehouse.gov/xyz.js is the exact same thing as mypetcat.com/xyz.js, but it can if both sites reference a copy stored at google.com/xyz.js instead.
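
      From an ASP.NET page, referencing that shared copy is a one-liner in the code-behind. A rough sketch follows; the registration key, the page class name, and the jQuery version in the URL are just illustrative, not what whitehouse.gov does.

      using System;
      using System.Web.UI;

      public partial class SomePage : Page
      {
          protected void Page_Load(object sender, EventArgs e)
          {
              // Emits a <script src="..."> tag pointing at the Google-hosted copy,
              // so the browser can reuse a cached copy fetched for any other site
              // that references exactly the same URL.
              ClientScript.RegisterClientScriptInclude(
                  "jquery",
                  "http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js");
          }
      }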

      • The theory behind using the Google-served copies of JavaScript libraries isn't to reduce load on the whitehouse.gov servers, but to improve caching.

        This is true, but it doesn't change the fact that it still introduces security/privacy concerns. The fact remains that if the remote JavaScript changes, the behaviour of all websites referencing it will change. The method by which it changes doesn't have to involve hacking Google (DNS hijacking (site-local or otherwise), HTTP cache poisoning, etc.), and changes don't have to be outwardly malicious to have unintended side-effects (AVG's LinkScanner, anyone?).

        On the face of it, though, referencing JQuery fi

      • by eison ( 56778 )

        No, the idea is to get the script to the client faster, because most web browsers will only initiate two connections per site you're talking to. Serving different things off multiple different sites lets you download more than two things at once. See for example http://support.microsoft.com/kb/183110 [microsoft.com]

    • by socsoc ( 1116769 )
      This bears repeating. I use it for a few sites, but a .gov site should have its own system if it needs to be so concerned about saving bandwidth from users caching files.
  • Personally, I think the site does a great job in balancing usability, layout, and (most importantly) good content. Given the extreme time pressures they were probably under, I think they did a great job.

  • hey, hopefully a new website like this can have votes for it's citizens about diff issues, stats on who voted for what, and senetor voting to ensure they follow what they say after elected. some free webspace/an email couldn't hurt either. whats webspace to an org like the govt? 2gb for all would get people more involved! as for the idea about open sourced govt, can such a concept be applied for our most imprtant (corrupt) system?
    • We already had access to voting information, such as the fact that Mr. Obama likes to vote 'present' quite often.
    • by socsoc ( 1116769 )
      You are truly an idiot, as evidenced by your grammar. Congressional voting stats are easily obtained, and why the hell should the gov provide free webspace? They are busy trying to (half-assedly) provide essential services.
  • by Cyko_01 ( 1092499 ) on Tuesday January 20, 2009 @09:58PM (#26540423) Homepage
    ...flash based!
    • Re: (Score:2, Insightful)

      by Dotren ( 1449427 )
      Looks to me like a lot of those CSS results are due to trying to make it cross-browser compatible. Looks like they went pretty far back too... some of those tags have been deprecated since Firefox 0.9.

      I never really did the color comparison/validations on my pages, although I can see how handy that information could be, and I bet it's pretty easily remedied.

      What I am surprised about is that you mentioned it's valid HTML. The article mentions the site uses .NET 2.0... it's been my experience that most, if no

  • You mean like his entire campaign, and probably the "first 100 days?"

    I believe government should be more like technology: it does not have to look good to work and, dammit, we would really prefer it work right out of the box!

  • New robots.txt file (Score:5, Interesting)

    by Cyclopedian ( 163375 ) on Tuesday January 20, 2009 @10:13PM (#26540555) Journal
    The switchover of the whitehouse.gov site also meant that the robots.txt file has changed. From around 2400 lines to just 2 lines: http://www.kottke.org/09/01/the-countrys-new-robotstxt-file [kottke.org]
  • We're not designing for modem baud rates anymore. Platforms with a high level of abstraction (Java, Microsoft, etc.) inject metadata into the pages. Really, I consider this a non-story. Sure, it's fun to see how the website could be more efficient in fine detail; however, it doesn't need to be. If you really want to do that, go back to plain CGI.
  • by rwa2 ( 4391 ) * on Tuesday January 20, 2009 @10:58PM (#26540993) Homepage Journal

    http://www.linuxjournal.com/content/open-source-force-behind-obama-campaign [linuxjournal.com]

    My take is that the whitehouse.gov servers are run by the government and have to conform to DoD security guidelines, which have only relatively recently included Linux configurations for certain commercial distributions such as Red Hat. So they probably don't have the freedom to redo the servers with whatever they could cobble together with talented volunteers, the way they did for the campaign.

    Anyway, we'll eventually see whether all this talk of change only runs skin deep.

  • The author states that CSS sprites are "somewhat involved to implement". Unless you're really new to CSS, it's very easy to do.

    What I find sad is the poor choice of image formats, i.e. all the graphics of the blue border should have been 24-bit PNGs, not JPEGs.

    I do applaud them for having clean URLs. Most of them anyway. [w3.org]

  • by Junior J. Junior III ( 192702 ) on Tuesday January 20, 2009 @11:04PM (#26541043) Homepage

    ...and, admittedly, a pretty sucky one. I figured out HTML on my own and it's not the main thing I do at work, so I never learned how to do things "the right way", and neither did any of my co-workers, for the most part.

    I do want to do things the right way, so I read articles like the one linked to in this story with interest. However, I get NOTHING out of them when they're written like this:

    The whitehouse.gov site uses ASP.NET 2.0. The HTTP header that identifies the software says "X-Aspnet-Version: 2.0.50727". There is a way for this header to be removed, which saves about 30 bytes of bandwidth on every response. [Search for 'X-Aspnet-Version']

    It's annoying to read someone going off about inefficient practices without telling you how to do it better.

    "There is a way?" Nice. Thanks for sharing.

    (Yes, the "[Search for 'X-Aspnet-Version']" is dead text, not a link to anything...)

    It's like this with virtually every other tech problem I've ever tried to research... Zillions of pages of people complaining about a problem, suggesting fixes, or claiming that something they did fixed the problem, and very little in the way of actual, detailed information about the fix, how and why it worked, and what exactly the problem was that it solved.

    • Re: (Score:3, Informative)

      by AKAImBatman ( 238306 ) *

      First hit on Google. [devnewz.com] Remember to search only the "X-Aspnet-Version" (remove the quotes) part of that text. BTW, as efficiencies go, this is a pretty minor one. It matters for a site like whitehouse.gov because they're likely to get a few million visits per day. (At least in the short term.) Very few websites have that problem, so I wouldn't worry too much about it. :-)
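
      For the record, the usual fix is declarative: set enableVersionHeader="false" on the <httpRuntime> element in web.config. If you'd rather strip it in code, a rough Global.asax sketch follows, with the caveat that rewriting response headers this way requires IIS 7's integrated pipeline rather than the IIS 6 setup discussed here.

      // Sketch only: strip the X-AspNet-Version header before it is sent.
      // Requires the IIS 7 integrated pipeline; on IIS 6, prefer
      // <httpRuntime enableVersionHeader="false" /> in web.config.
      using System;
      using System.Web;

      public class Global : HttpApplication
      {
          protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
          {
              HttpContext.Current.Response.Headers.Remove("X-AspNet-Version");
          }
      }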

    • (Yes, the "[Search for 'X-Aspnet-Version']" is dead text, not a link to anything...)

      Of course it's not a link. It's an instruction that you ought to follow if you're interested in finding it out for yourself.

      • Of course, I can search for answers to questions using a search engine. It's not particularly effective though. It might be in this case, from what the other responder to my original post said, but that's not the point.

        Probably better than 90% of the time when I try to search for "how do i..." or "I get [this] undesirable behavior..." I end up getting a lot of useless hits in the search results. Discussion boards where no one knows anything, everyone guesses, someone mentions that they fixed the problem

  • by Anonymous Coward on Tuesday January 20, 2009 @11:23PM (#26541213)

    You will encounter this sometimes in your life, and you'd better get used to it. Sometimes, believe it or not, things are done simply because they need to be done.

    They don't spend a lot of time laboring over every little detail, they have a list of tasks and a deadline and they do their best to meet the deadline.

    They anticipate that nerds who nitpick Battlestar Galactica episode continuity errors will likely come in and stroke their butter soaked neck beards and chortle about how this or that could be done better to achieve 5% faster page loads, or allow for translation into Swahili.

    But, they get paid either way and in the grand scheme of things trying to impress anyone on Slashdot is probably pretty low on anyone's to-do list.

    As someone who's argued with people about vi vs. emacs in the past, I can honestly say you guys have reached a new low in wasting time, having no worthwhile point, and being worthless Slashdot editors. The trifecta.

    • by hab136 ( 30884 )

      stroke their butter soaked neck beards

      I hate you for providing that mental image.

  • I went to it today. All the links worked. It looked nice and professional. It loaded fast even though I'm sure it was getting hammered.

    No complaints from a user.

    At the end of the day, who gives a shit if it's 30 bytes less efficient? A hummingbird landing on a telephone line probably disrupts my DSL connection worse.

  • by KlomDark ( 6370 ) on Tuesday January 20, 2009 @11:44PM (#26541395) Homepage Journal

    I just love when people who know nothing about ASP.NET attempt to critique things:

    The whitehouse.gov site has long ASP.NET IDs. There are many elements on the page that have very long IDs, which are mostly just a waste of bandwidth. They could be easily removed on the server side.

    <a id="ctl09_rptNavigation_ctl00_rptNavigationItems_ctl01_hlSubNav"...

    Sorry bud, they can't really be removed on the server side - these are controls embedded in controls embedded in controls. Maybe a slight shaving of rptNavigationItems down to rptNavItms or something, but the long name is needed to be able to reference the embedded controls.

    Please try again...

    • by JamesNK ( 967097 )
      Sure you can. Create your own hyperlink control that inherits from the one in ASP.NET and skips writing out the id.
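
      Something like the following sketch, which assumes the base WebControl only emits the id attribute when ID is set; clearing ID just for the duration of Render and restoring it afterwards keeps server-side lookups working. Treat it as an outline rather than production code (viewstate and client script that expects the ClientID haven't been checked):

      using System.Web.UI;
      using System.Web.UI.WebControls;

      public class NoIdHyperLink : HyperLink
      {
          protected override void Render(HtmlTextWriter writer)
          {
              string savedId = ID;
              ID = null;           // suppress the rendered id="..." attribute
              base.Render(writer);
              ID = savedId;        // restore so FindControl etc. still work
          }
      }
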
    • Maybe he's from the future and thinks they're using ASP.NET 4 [asp.net]?
  • They could fix these perf problems instantly using the RPO (www.getrpo.com). If anyone knows the developers, let them know.
    • by TJCunn ( 1457865 )
      It's a great-looking site, and since we're setting a new path for the country, I agree: let's fix it now and move forward.
  • by mr sharpoblunto ( 1079851 ) on Wednesday January 21, 2009 @12:58AM (#26542055)
    Most of the optimization suggestions in TFA are going to offer no real performance benefit. With gzip on, whitespace, long IDs and ViewState make pretty much no impact on the final page weight, but doing these "optimizations" is going to make your page a hell of a lot harder to maintain. Don't believe me? Go to webpagetest.org [webpagetest.org] and have a look; HTML accounts for only around 5% of the final page size. The best things these guys could do to optimize the site would be to:
    • combine the css and javascript files.
    • minify the javascript (as it is, it's taking up around 20% of the page weight)
    • perform more aggressive css spriting of the gif and jpeg images to slash the request count further.
    • remove ETag headers and add far-future expiry headers to the images to speed up repeat page views and cut down on 304 responses from the server (see the sketch below).

    Who cares about a 30-byte HTTP header when your page is over 800k and ~45 requests? There's plenty of low-hanging fruit to pick first. Interestingly, the post above mentions a tool called the RPO; it seems to do most of the important optimizations automatically.
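
    On the far-future expiry point, here is a hedged sketch of what those headers look like for content that is served through ASP.NET (for example, an image handler). Static files served directly by IIS 6 would need the equivalent configured in IIS itself; the handler and file names below are made up.

    using System;
    using System.Web;

    public class CachedImageHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "image/png";

            // Far-future expiry: repeat visitors get the image straight from
            // their browser cache instead of issuing 304 revalidation requests.
            context.Response.Cache.SetCacheability(HttpCacheability.Public);
            context.Response.Cache.SetExpires(DateTime.UtcNow.AddYears(1));
            context.Response.Cache.SetMaxAge(TimeSpan.FromDays(365));

            context.Response.WriteFile(context.Server.MapPath("~/images/banner.png"));
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }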

  • by iabervon ( 1971 ) on Wednesday January 21, 2009 @01:15AM (#26542201) Homepage Journal

    $ host www.whitehouse.gov
    www.whitehouse.gov is an alias for www.whitehouse.gov.edgekey.net.
    www.whitehouse.gov.edgekey.net is an alias for e2561.b.akamaiedge.net.

    Reducing their bandwidth and server load is just not a big deal. (See Akamai [wikipedia.org] and note that the whole site takes the path that the "image" request takes in that diagram.)

  • This is a classic one:

    ASP.NET has built-in defences against XSS which are enabled by default. Most sites will, however, only catch this at the last line of defence: the server-side request validation throwing an HTTP error unless it is manually handled or overridden. It looks like this site doesn't do any input validation for tags either, causing said errors.

    Really, you need to switch off the XSS checking and instead HttpEncode all the inputs manually so at least you don't break the site for potentially dangerous requests.

    Sti
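
    As a rough illustration of that suggestion (all names here are hypothetical, and this is only a sketch): with request validation switched off via ValidateRequest="false" in the @ Page directive, user input gets encoded explicitly before being written back out.

    using System;
    using System.Web;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public partial class FeedbackPage : Page
    {
        protected Label EchoLabel; // normally declared in the designer file

        protected void Submit_Click(object sender, EventArgs e)
        {
            string comment = Request.Form["comment"] ?? string.Empty;

            // HtmlEncode turns <script> into &lt;script&gt; so "dangerous" input
            // is displayed as text rather than executed by the browser.
            EchoLabel.Text = HttpUtility.HtmlEncode(comment);
        }
    }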
