
Key Web App Standard Approaches Consensus

suraj.sun tips a report up at CNet which begins: "Browser makers, grappling with outmoded technology and a vision to rebuild the Web as a foundation for applications, have begun converging on a seemingly basic but very important element of cloud computing. That ability is called local storage, and the new mechanism is called Indexed DB. Indexed DB, proposed by Oracle and initially called WebSimpleDB, is largely just a prototype at this stage, not something Web programmers can use yet. But already it's won endorsements from Microsoft, Mozilla, and Google, and together, Internet Explorer, Firefox, and Chrome account for more than 90 percent of the usage on the Net today. 'Indexed DB is interesting to both Firefox and Microsoft, so if we get to the point where we prototype it and want to ship it, it will have very wide availability,' said Chris Blizzard, director of evangelism for Mozilla. ... Microsoft publicly endorsed Indexed DB on its IE blog: 'Together with Mozilla, we're excited about a new design for local storage called Indexed DB. We think this is a great solution for the Web,' said program manager Adrian Bateman."
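Indexed DB itself is still just a prototype, so the real API isn't something web programmers can call yet. As a rough illustration of the model it proposes, here is a minimal in-memory sketch: records kept in an object store under a primary key, with secondary indexes for lookup by other fields. All names here (ObjectStore, createIndex, etc.) are illustrative, not the draft API.

```javascript
// In-memory sketch of an indexed object store. Not the Indexed DB API,
// just the shape of the idea: key-value records plus secondary indexes.
class ObjectStore {
  constructor(keyPath) {
    this.keyPath = keyPath;   // which record field is the primary key
    this.records = new Map(); // primary key -> record
    this.indexes = new Map(); // index name -> { field, map: value -> Set of keys }
  }
  createIndex(name, field) {
    this.indexes.set(name, { field, map: new Map() });
  }
  put(record) {
    const key = record[this.keyPath];
    this.records.set(key, record);
    // Maintain every secondary index as records are added.
    for (const idx of this.indexes.values()) {
      const val = record[idx.field];
      if (!idx.map.has(val)) idx.map.set(val, new Set());
      idx.map.get(val).add(key);
    }
    return key;
  }
  get(key) {
    return this.records.get(key);
  }
  getByIndex(name, value) {
    const idx = this.indexes.get(name);
    const keys = idx.map.get(value) || new Set();
    return [...keys].map(k => this.records.get(k));
  }
}

const store = new ObjectStore("id");
store.createIndex("byAuthor", "author");
store.put({ id: 1, author: "alice", title: "Drafts" });
store.put({ id: 2, author: "bob", title: "Notes" });
store.put({ id: 3, author: "alice", title: "Todo" });
console.log(store.get(2).title);                        // "Notes"
console.log(store.getByIndex("byAuthor", "alice").length); // 2
```

The design point the proposal makes is exactly this split: fast primary-key access like a key-value store, plus indexed lookup without a full SQL engine in the browser.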
  • by girlintraining ( 1395911 ) on Saturday March 13, 2010 @02:51PM (#31465496)

    it looks like the Golden Age of the web will continue for some time.

    Dude, the web didn't even exist until about 18 years ago. We're still evaluating the impact the internet is having on culture: some countries define it as an inalienable human right, while others are eager to all but destroy it or censor the crap out of it. The "golden age" is not what I'd call this time period. I'd call it the friggin' dark ages: a mish-mash of global entities competing at cross-purposes, a thriving black market, more of our technology connected to it every week, people burned at the stake for "file sharing", and a world wide web crapflooded with advertisements and commercial interests that infest the garden of knowledge like weeds.

  • by John Hasler ( 414242 ) on Saturday March 13, 2010 @03:04PM (#31465598) Homepage

    > Do they claim so?

    The browsers they list as having 90% of the Net have 90% of the Web. As there is more to the Net than the Web, they are necessarily wrong.

    > Browser usage is definitely what most people do on the Internet...

    You forget spammers and botnet operators, both large and growing markets.

  • by raddan ( 519638 ) * on Saturday March 13, 2010 @03:06PM (#31465610)
    It depends on what you mean by 'do'. There may be more people 'doing HTTP' on the net, i.e., more people actively involved in that application than any other, but at least half of all traffic on the net is currently BitTorrent, so by that measure, you could say that "BitTorrent is the net". I think that kind of thinking is wrong, though, no matter the dominant application. It's abundantly clear that the net is not one application, but many, many applications, and that a real strength of the current Internet is precisely that this diversity is allowed.

    (Whether we need so many application protocols for all these applications is a different conversation entirely, though)
  • by raddan ( 519638 ) * on Saturday March 13, 2010 @03:26PM (#31465764)
    I think your comment is spot-on, and I think the reason is this: programmers hate network programming. They hate concurrency. CODER WANT SIMPLE.

    When you look at the development of most platforms, a great deal of effort has been expended to keep the programming model simple. E.g., from the perspective of a typical process running in a modern OS, the world still looks simple: you get your own flat address space and straightforward system calls for writing to disk, and so on. Generally you don't have to deal with interrupts, shared memory, etc. But networking is where all of this breaks down. The location of your storage matters, because while hard disks are slow, network storage is really slow. Some parts of your application run here, and some run there, and 'here' and 'there' may be wildly different platforms (e.g., 'there' could be a functional language running on a cluster, while 'here' could be a mobile web browser on a cellphone), so race conditions, slow network links, and slow processors become real problems.

    This constant shifting around is an attempt to find the right complexity balance. I don't know if there is a 'right' balance for all scenarios, but it doesn't look like that's going to stop people from trying to find it. Just look at all the iterations of RPC out there. They all suck, too (you just can't pretend the network doesn't exist!), but that does not stop them from being useful. Just look at NFS.
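The "you can't pretend the network doesn't exist" point can be made concrete with a toy sketch: a local call either returns or throws, but a remote call can also fail transiently, so callers end up wrapping it in retry (and, in real code, timeout and backoff) logic. `flakyRemoteCall` below is a hypothetical stand-in for a real network request, not any particular RPC library.

```javascript
// Simulated remote endpoint that fails the first two attempts, the way
// a real network call might hit a transient error.
function flakyRemoteCall(attempt) {
  if (attempt < 3) throw new Error("connection reset");
  return "payload";
}

// The wrapper every RPC caller eventually writes: retry on failure,
// give up after a bounded number of attempts. A real client would also
// back off between attempts and enforce a per-call timeout.
function callWithRetry(fn, maxAttempts) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return fn(attempt);
    } catch (err) {
      if (attempt === maxAttempts) throw err; // out of retries: surface the failure
    }
  }
}

console.log(callWithRetry(flakyRemoteCall, 5)); // "payload"
```

None of this machinery exists for a plain local function call, which is the asymmetry that makes RPC abstractions leaky.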
  • by PotatoFiend ( 1330299 ) on Saturday March 13, 2010 @04:24PM (#31466312)
    Whoa there. Bolting a spoiler and ground effects onto a Prius doesn't make it a Formula One car. JavaScript is fundamentally a procedural (and therefore non-declarative) language. It has first-class functions and closures in addition to some superficial support for programming in a functional style, but the function is not the main focus of the language design and using it as a serious functional language is akin to ricing.
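Both halves of that claim can be shown in a few lines: JavaScript's first-class functions and closures support a functional style, yet its core semantics remain imperative, as in the classic loop/closure trap where every closure shares one mutable `var`.

```javascript
// The functional-looking part works fine: first-class functions,
// composition, higher-order methods.
const compose = (f, g) => x => f(g(x));
const inc = x => x + 1;
const double = x => x * 2;
console.log([1, 2, 3].map(compose(inc, double))); // [3, 5, 7]

// ...but closures capture mutable variables by reference, so the
// classic loop trap bites, something a pure functional language rules
// out by construction.
const fns = [];
for (var i = 0; i < 3; i++) {
  fns.push(() => i); // every closure shares the same `var i`
}
console.log(fns.map(f => f())); // [3, 3, 3], not [0, 1, 2]
```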
  • by j1m+5n0w ( 749199 ) on Saturday March 13, 2010 @05:00PM (#31466618) Homepage Journal
    I like to think of the current state of the Internet as the Wild West phase.
  • by Nadaka ( 224565 ) on Saturday March 13, 2010 @05:02PM (#31466630)

    Not entirely true. Technically, XSLT is a programming language, and it is supported by many browsers. I know of at least one person writing an XML/XSLT CMS.

  • by oztiks ( 921504 ) on Saturday March 13, 2010 @10:34PM (#31469106)

    Exploits are only one of many issues. What about change control, patching, and schema changes? This has catastrophe written all over it unless the API accounts for a lot more than what's written here: any serious database application relying on it would need a strong set of change-log rules, data migration when needed, and schema-compliance checks before allowing access.
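One way an API could account for schema changes is version-gated migration: the client declares the schema version it expects, and stored data is upgraded step by step before any access is allowed. The sketch below is hypothetical plumbing to illustrate the pattern, not any browser's actual mechanism; all names are illustrative.

```javascript
// Hypothetical version-gated schema migration. Each entry upgrades the
// stored data from the previous version to its own.
const migrations = {
  2: db => { for (const rec of db.records) rec.tags = rec.tags || []; },
  3: db => { for (const rec of db.records) rec.createdAt = rec.createdAt || 0; },
};

function openDatabase(db, wantedVersion) {
  if (db.version > wantedVersion) {
    // Refuse to touch data written by a newer client.
    throw new Error("stored schema is newer than this client understands");
  }
  // Apply every missing migration, in order, before handing the db back.
  for (let v = db.version + 1; v <= wantedVersion; v++) {
    if (migrations[v]) migrations[v](db);
    db.version = v;
  }
  return db;
}

const db = { version: 1, records: [{ id: 1, title: "old record" }] };
openDatabase(db, 3);
console.log(db.version);         // 3
console.log(db.records[0].tags); // []
```

The key property is that no code ever reads records under a schema it doesn't understand, which is the compliance check the comment is asking for.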
