
Best Programming Practices For Web Developers

An anonymous reader writes "Web pages have become a major functional component of the daily lives of millions of people. Web developers are in a position to make that part of everyone's lives better. Why not try using traditional computer programming and best practices of software engineering?"
  • by Isofarro ( 193427 ) on Monday September 10, 2007 @05:53AM (#20536327) Homepage

    Then there's Digg; Digg's pages are such a load on the visitor's CPU that I have to click "script not responding, continue?" three times on a page with 800 or so comments with Firefox and a dual-core 2 GHz CPU just to get the page to completely render.

    Sounds like Digg is attaching an event listener to every show/hide link instead of using event delegation with a single listener. Browsers can't really handle hundreds of attached event listeners; it's a known performance issue.

    Now using event delegation [scripttags.com] instead of attaching hundreds of events should definitely be in a set of web development best practices.
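
    For what it's worth, here is a minimal sketch of what event delegation looks like; the container id, link class, and data attribute below are invented for the example, not taken from Digg's actual markup:

        // One click listener on the comment container handles every show/hide link,
        // instead of attaching one listener per link.
        var container = document.getElementById('comments');        // assumed container id
        container.addEventListener('click', function (event) {
            var link = event.target;
            if (link.className === 'toggle') {                       // assumed class on the show/hide links
                var body = document.getElementById(link.getAttribute('data-for'));  // assumed attribute
                body.style.display = (body.style.display === 'none') ? '' : 'none';
                event.preventDefault();
            }
        }, false);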

  • by tajmahall ( 997415 ) on Monday September 10, 2007 @06:03AM (#20536375)
    Actually, when /. articles have 700 or so comments I need to answer the "script not responding" error message several times.
  • by nospam007 ( 722110 ) on Monday September 10, 2007 @06:47AM (#20536531)
    (Cargo cults .....The aborigines then proceeded to worship the big birds that dropped those, and pray that they come drop some more of that stuff. And when they didn't come back, they sculpted airplanes out of wood, ....
    ---
    Actually, they built the _airport_ (not the planes) out of wood and straw, complete with tower and offices.
    They wanted to attract the planes.

    from Wikipedia: ...
    Famous examples of cargo cult activity include the setting up of mock airstrips, airports, offices and the fetishization and attempted construction of western goods, such as radios made of coconuts and straw. Believers may stage "drills" and "marches" with twigs for rifles and military-style insignia and "USA" painted on their bodies to make them look like soldiers, treating the activities of western military personnel as rituals to be performed for the purpose of attracting cargo. The cult members built these items and 'facilities' in the belief that the structures would attract cargo. This perception has reportedly been reinforced by the occasional success of an 'airport' to attract military transport aircraft full of cargo...
  • by supersnail ( 106701 ) on Monday September 10, 2007 @07:50AM (#20536785)
    Traditional projects using so-called "best practices" fail with astonishing regularity.

    Most project failures are covered up by the management, but in environments such as UK government projects, where public scrutiny makes it impossible to spin failure into success, the failure rate is about 60%.

    In my experience the private sector is just as bad; it is just better at redefining success to be whatever crud was delivered, or at quietly cancelling projects and pretending they never happened.

    I would also posit that "traditional" best practices are a big contributor to these failures.
    Among the many problems:
                1. Analysts paying lip service to user requirements but actually ramming their own pet vision down the customers' throats.
                2. Equating expensive with good, which leads to choosing ludicrously expensive and inappropriate software where something cheap or free would have worked just as well.
                3. Dumping existing working software because it is not "re-usable" in favour of a "re-usable" design which doesn't work.
                4. Spending eons of time perfecting design documents and diagrams as if they were the end product rather than just a tool for getting there.
                5. Treating all developers as equals. They aren't. If you don't recognise and cherish your star programmers you will lose them.
                6. Failing to recognise the simple fact that if you don't deliver something by the second year the project will get canned.

  • by ortholattice ( 175065 ) on Monday September 10, 2007 @07:59AM (#20536831)
    Or look at Google Apps, GMail, YouTube. Not even possible without client-side scripting. A well-programmed client-side script (like anything Google's coded) runs great on even a 500 MHz Pentium III.

    The one app where Web 2.0 shines is Google Maps. But Google has been phasing it into other things where it is just not appropriate, making them bloated, slow, awkward to use, and even buggy where they weren't before.

    Example: The new Google Groups is an abortion, replacing the older interface (which was already slower and less user-friendly than the simple, plain DejaNews it replaced, with bugs like overlaying ad text on top of posts) with something that is truly horrendous. Nothing is formatted right unless you're in full-screen mode, making copy and paste from a local editor inconvenient. False line breaks are inserted at column 69 (whatever happened to the old column 72 or even 80 worst case?), so I have to switch my text editor to column 69 wrapping just for Google Groups if I want my post to look halfway decent. Google has ignored the huge number of user complaints about the new format (including my own insignificant one FWIW).

    I've switched back to a Usenet reader for regular newsgroups and am much happier. Unfortunately Google-only groups are becoming popular, forcing me to use Google Groups sometimes. This morning I must have spent 10 minutes futzing with a post to one such group due to my post being rejected because it somehow forgot I was logged in, then rejected because of a timeout, then... oops I forgot to wrap my text editor at column 69, so the post looked like crap with successive long and short lines and code formatting that was barely comprehensible.

    Example 2: I've played with Google Analytics, but god is it slow. Finding out a referring page is often impossible because it strips off everything after "?" in the referring URL. Maybe managers with time to kill like its pretty interface, but just give me the quick and dirty output of analog (web server logfile analyser) and I'm much happier.

  • by apt142 ( 574425 ) on Monday September 10, 2007 @08:48AM (#20537153) Homepage Journal
    Agreed, I find it is much, much easier to do small incremental updates and gains than to do the big website of d00m. Chances are, if you work with a customer and get that version 1.0 up and running well, they'll look to you to do more for them. It is so much easier then to kick into incremental updates. Do small but functionally significant improvements.

    Going from version 1.0 to 1.1 is much easier than from 1.0 to 2.0. And there are added benefits.

    • Since the changes are so small, it's easy to explain what is covered in the scope of the change and what you can promise in another change. This keeps you from getting derailed by new shinies.
    • Not doing a lot of change all at once lets the users get used to the new functions without getting overwhelmed.
    • As a developer you'll be forced to think more modularly, which in my experience has been a very good thing.
    • The changes can be done very quickly, so the managers see progress and the company sees improvements.
    • Having small projects lets you reorder them by priority and gives you a flexible long-term plan.
  • Re:XSLT! (Score:1, Informative)

    by Anonymous Coward on Monday September 10, 2007 @08:59AM (#20537253)

    every browser since 1995 is capable of doing XML+XSLT client-side

    Really? Browsers have supported XSLT since four years before it was standardised (and three years before XML was standardised)?

  • by herve_masson ( 104332 ) on Monday September 10, 2007 @09:04AM (#20537283)
    On the server side the task, dev tools, and target platform are quite classical, and hence the best practices are similar to those of classic development (and it's mostly classic development after all: a parallelizable task).

    Web development adds specific complexity on top of "classical" dev platforms. You have to deal with at least four dialects: HTML, CSS, JavaScript, and whatever actually runs on the server. Worse than that, you are usually given development tools that make it possible (and even encourage you) to mix these languages endlessly in a single piece of code.

    Web dev best practice #1 to me is: do _NOT_ mix this stuff; keep the layers well apart as much as you can, otherwise you end up with spaghetti write-only code that even you won't be able to read a few months after writing it (I'm assuming the programmer cares about that). Mixing them also guarantees that you won't be able to reuse your code efficiently.
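
    As a deliberately simplified illustration of keeping the layers apart: the markup carries only structure, the behaviour lives in a separate script file, and presentation stays in a stylesheet. The file name and element id below are invented for the example:

        // behaviour.js -- all behaviour attached here; no onclick attributes in the HTML.
        // The HTML contains only <a id="help-link" href="/help">Help</a>, and the styling
        // for the help page lives in a separate stylesheet.
        window.onload = function () {
            var link = document.getElementById('help-link');   // invented id
            link.onclick = function () {
                window.open(this.href, 'help', 'width=400,height=300');
                return false;                                   // the plain href stays as a fallback
            };
        };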
  • by Nygard ( 3896 ) on Monday September 10, 2007 @10:58AM (#20538845) Homepage
    In the corporate world, there hasn't been a pure "web site" project since about 1998. I said in my book, "Despite lip service, companies didn't really get off the starting line for enterprise integration until they needed to create dynamic websites. Those projects were the impetus that finally forced many companies to integrate systems that have never played well together."

    The place where you need to look for actual software engineering is in the whitespace on the block diagrams. Those innocent looking little arrows that connect the boxes together are the source of most failures and inefficiencies. I've seen one little "interface" arrow implemented with 20 moving parts on half a dozen servers. (Send a message to a queue, the broker picks up the message and appends a line to a file. Once an hour, another broker picks up the file and FTPs it to the "integration server", which then slices the file into lines and uses SOAP to send messages out to a third party.) Talk about failure modes!

    Civil engineers consider the loading factors, material strength, and design life of their structures. Together, these constitute the "design envelope" of the system.

    Software engineers need to think the same way. How long will this system last? One year? Five years? Two months? The level of demand and demand growth over that time span matters a great deal. An architecture that works well for 10,000 page views a day will bomb badly when it's asked to serve 10,000,000. That sounds like a "duh", but it's ignored surprisingly often.
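
    Some back-of-the-envelope arithmetic makes the gap concrete (the 5x peak factor is an assumption, not a measurement):

        // Rough load arithmetic for the two traffic levels mentioned above.
        var secondsPerDay = 24 * 60 * 60;                  // 86,400
        var avgSmall = 10000 / secondsPerDay;              // ~0.12 requests/second on average
        var avgLarge = 10000000 / secondsPerDay;           // ~116 requests/second on average
        var peakLarge = avgLarge * 5;                      // ~580 requests/second at an assumed 5x peak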

    I could go on and on about engineering systems for real-world survival... but I won't do it here. (Since I already have [michaelnygard.com].)
  • by SlowMovingTarget ( 550823 ) on Monday September 10, 2007 @12:45PM (#20540655) Homepage

    Indeed:

    Backslash can be used to allow continuing the program line past a carriage-return, but you almost never have to use it. Python is smart enough to do the right thing when it sees an open bracket, a comma separated list, and a carriage-return.

    From Moving To Python From Other Languages [python.org].

  • by Electrum ( 94638 ) <david@acz.org> on Monday September 10, 2007 @02:22PM (#20542243) Homepage
    Python needs an equivalent to CPAN. I often use Perl instead of Python because of CPAN. PyPI just doesn't cut it. If I'm writing a quick script that needs some library, it's quick to find it on CPAN and install it with one command. With Python, you have to search PyPI, hope what you want exists on there, hope it works with the version of Python you are using, etc.

    Perl packages have a standardized method for doing unit tests, and consequently many CPAN modules have them. Python does not, so most packages do not have unit tests, or if they do, running them isn't as easy as typing "make test".

    Python really needs to copy CPAN.
