
Web 2.0 Mashups Almost Ready For Enterprise

Dion Hinchcliffe, in a blog post over at ZDNet, talks about the increasing business value of 'Mashup' projects. Some of these, he believes, are already ready, or soon will be, for use in an enterprise environment. He shows off one of these upcoming projects, IBM's QEDWiki, in a Flash demonstration. The software allows users to create their own mashups from canned widgets, turning data into simple applications with fairly straightforward functionality. From the article: "The motivations for mashups are quite different inside of organizations, where application backlogs and demand for more software that will improve collaboration and productivity are often rampant. If this state of affairs is true, far from having too much software, most enterprises don't have enough to satisfy demand, despite the prevalence of mountains of existing enterprise systems, many of which are underutilized. The arguments for letting users self-service themselves with end-user application tools and getting IT out of the critical path for the backlog of simpler applications are extensive." How important do you think 'self-made' software will be in the future?
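As a rough illustration of what a widget-style mashup boils down to (this is not IBM's QEDWiki code; the feed URLs and field names below are hypothetical placeholders), the basic pattern is: fetch a couple of data feeds, join them on a shared key, and render a simple view. A minimal Python sketch:

    # Minimal mashup sketch: pull two data feeds, join on a shared key, render HTML.
    # The URLs and field names are hypothetical placeholders, not real services.
    import json
    import urllib.request
    import xml.etree.ElementTree as ET

    def fetch_json(url):
        """Fetch a JSON feed (e.g. a hypothetical store-locations service)."""
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    def fetch_rss_titles(url):
        """Fetch an RSS feed and return its item titles."""
        with urllib.request.urlopen(url) as resp:
            tree = ET.parse(resp)
        return [item.findtext("title", default="") for item in tree.iter("item")]

    def render_table(rows):
        """Turn joined records into a bare-bones HTML table 'widget'."""
        cells = "".join(f"<tr><td>{n}</td><td>{c}</td><td>{h}</td></tr>" for n, c, h in rows)
        return f"<table>{cells}</table>"

    if __name__ == "__main__":
        stores = fetch_json("https://example.com/stores.json")              # hypothetical
        titles = fetch_rss_titles("https://example.com/regional-news.rss")  # hypothetical
        rows = []
        for store in stores:  # naive join: first headline mentioning the store's city
            headline = next((t for t in titles if store["city"] in t), "no news")
            rows.append((store["name"], store["city"], headline))
        print(render_table(rows))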
  • by yagu ( 721525 ) * <yayaguNO@SPAMgmail.com> on Sunday January 21, 2007 @02:49PM (#17704142) Journal

    I'm not going to ask for extra credit or anything, but I wrote mashups in the '80s. FTA:

    In decades past, the new ideas in computing originated in the enterprise world and trickled down to the consumer world later on (things like databases, computer networks, file servers, and so on). However in the Web 2.0 era, for reasons too complex to go into here, new ideas and approaches are germinating more on the consumer Web than from the enterprise space.

    I would claim that this specific notion (mashups) not only originated in the enterprise and trickled into internet consciousness, but that enterprise "mashups" existed many years ago. I know, because I wrote them. It was called (or at least we called it) surround technology.

    We took vital pieces of different applications and wrote wrappers that gave users very simple interfaces to access more data, more accurately, more quickly. One example was a service-order-writing routine for small business that routinely took over 30 minutes... using our "mashup", we accessed the necessary enterprise applications, melded them into a single app presentation, and shortened the 30-minute process to less than 5 (a rough sketch of the idea is at the end of this comment).

    I could go on; there were at least three other major applications we wrote (a small team of 2, sometimes 3) that were "mashups". The advent of browser technology simply gave us another presentation tool; the notion and mechanics of mashing were still there.

    I've played with Google "mashups" and Amazon "mashups"; they're really nothing new.

    There was a Strategic Computing Consortium (I don't know if they're still there) based in Boston, MA, and they were huge advocates of surround technology: they not only taught techniques and reasons for approaching solutions this way (I won't go into it -- it was a six-week class), they also provided and sold tools and consulting for putting these new applications together... the CEO (I believe) was John Donovan, author of a few college texts on OSes, and another major contributor was Stuart Madnick, one of the original authors of CMS (IBM's Conversational Monitor System).

    I won't claim they were the "founders" of mashups, but what they espoused and taught was mashup technology, and they were teaching it in 1986 (that's when I attended the consortium). The more things change, the more they stay the same.

    (Also, as an aside, the article implies this new magic allows for "easy" creation of new applications. This is hardly so. All the care and due diligence of putting together an application are still required. The effort can still be significant... There is certainly time saved if a team leverages existing critical applications, but to toss this out as magical and easy for any end-user community to leverage is probably glib and misleading.)
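    As a rough sketch of what those "surround" wrappers looked like in shape (the system names below are entirely made up for illustration, not our actual code), think of one thin front end orchestrating several legacy back ends:

        # Shape of a "surround"/wrapper app: one simple call front-ending several
        # legacy systems. The billing/inventory classes are made-up stand-ins.
        class BillingSystem:
            def lookup_customer(self, customer_id):
                return {"id": customer_id, "name": "ACME Corp", "terms": "net 30"}

        class InventorySystem:
            def check_stock(self, part_number):
                return {"part": part_number, "on_hand": 12}

        class ServiceOrderWrapper:
            """One screen instead of a 30-minute multi-application process."""
            def __init__(self):
                self.billing = BillingSystem()
                self.inventory = InventorySystem()

            def create_order(self, customer_id, part_number, quantity):
                customer = self.billing.lookup_customer(customer_id)
                stock = self.inventory.check_stock(part_number)
                if stock["on_hand"] < quantity:
                    raise ValueError(f"only {stock['on_hand']} of {part_number} on hand")
                return {"customer": customer["name"], "terms": customer["terms"],
                        "part": part_number, "quantity": quantity}

        if __name__ == "__main__":
            print(ServiceOrderWrapper().create_order("C-1001", "P-7", 3))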

    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday January 21, 2007 @02:58PM (#17704226)
      (Also, as an aside, the article implies this new magic allows for "easy" creation of new applications. This is hardly so. All the care and due diligence of putting together an application are still required. The effort can still be significant... There is certainly time saved if a team leverages existing critical applications, but to toss this out as magical and easy for any end-user community to leverage is probably glib and misleading.)

      Yep, we see that every few years. Strangely enough, it coincides with the latest new "paradigm".

      I blame Star Trek. People want technology to be magically easy to configure and re-purpose. But it isn't. Computers don't "think" like people do and it takes a lot of work for a person to think the way a computer does.

      Being pretty much accurate for most of the data most of the time is what you get when the untrained person attempts it.
      • by Anonymous Coward
        "I blame Star Trek."

        I don't.

        "People want technology to be magically easy to configure and re-purpose. But it isn't."

        Let's ignore the faction that benefits from the status-quo.

        "Computers don't "think" like people do and it takes a lot of work for a person to think the way a computer does."

        It's easier to change computers than it is to change people.

        "Being pretty much accurate for most of the data most of the time is what you get when the untrained person attempts it."

        They're usually better domain experts than the turf-protecting programmers.
        • Car analogy time! (Score:4, Insightful)

          by khasim ( 1285 ) <brandioch.conner@gmail.com> on Sunday January 21, 2007 @04:35PM (#17704922)
          Let's ignore the faction that benefits from the status-quo.

          No, let's look at cars. The heavy equipment that usually takes a new driver a few months to "master".

          And yet tens of thousands of people are KILLED while operating these every year. And I'm not even talking about crippling injuries, non-crippling injuries or property damage.

          The fact is that even when their LIFE IS AT RISK people fail to handle the technology they have correctly. Even after being trained on it.

          So why would they spend more time and effort learning how to program effectively?
          • Let me get this straight: Apples are not oranges, therefore people who cannot throw apples should not even think about growing oranges.
        • >> "People want technology to be magically easy to configure and re-purpose. But it isn't."
          > Let's ignore the faction that benefits from the status-quo.

          While you can make good technology that works well, ultimately it does rely on a user who knows what they're doing. There are plenty of untrained users who can't figure out anything beyond the wall plug. You can make up new meanings for words like "faction" all you want, but it won't change the fact that I know these people and I answer their illo
        • "Being pretty much accurate for most of the data most of the time is what you get when the untrained person attempts it."

          They're usually better domain experts than the turf-protecting programmers.

          You don't actually work in a large company, do you? They may be experts at their 'domain', but they are almost always incapable of translating that knowledge into database tables, data structures or algorithms that make for good software. The art of enterprise programming is to accurately gauge what funct

    • I won't claim they were the "founders" of mashups, but what they espoused and taught was mashup technology, and they were teaching it in 1986 (that's when I attended the consortium). The more things change, the more they stay the same.

      The techniques may be the same or similar. The difference is the vast amount of information and software available on the internet today, along with the power and variety of tools available on any random pc to work with it. Just about any question an individual has can be

    • by Anonymous Coward
      Deprived of the command line, lacking the ability to simply pipe information between processes, the web generation has rediscovered the simple fact that many tasks are best solved by letting the end user combine small, best-of-breed tools.
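      For what it's worth, the pipeline version of "combine small, best-of-breed tools" is easy enough to drive from a script too. A minimal sketch, assuming ordinary grep, sort and uniq on the PATH and a placeholder log file name:

        # Equivalent of: grep ERROR app.log | sort | uniq -c
        # "app.log" is a placeholder file name for the example.
        import subprocess

        grep = subprocess.Popen(["grep", "ERROR", "app.log"], stdout=subprocess.PIPE)
        sort = subprocess.Popen(["sort"], stdin=grep.stdout, stdout=subprocess.PIPE)
        uniq = subprocess.Popen(["uniq", "-c"], stdin=sort.stdout, stdout=subprocess.PIPE)
        grep.stdout.close()  # let earlier stages see SIGPIPE if a later stage exits
        sort.stdout.close()
        print(uniq.communicate()[0].decode())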
    • 1987: Wrote software and macros that allowed the users to work with a single interface to pull real estate data from Dataquick, parse it out, blow it into a database application and spreadsheet, and generate reports and alerts. And everyone in the office was connected using the $25 Network software for sharing drives and printers at a time when Novell Netware was ungodly expensive. Ah, yeah, those were the days....

  • Importance? (Score:4, Insightful)

    by Colin Smith ( 2679 ) on Sunday January 21, 2007 @02:52PM (#17704180)

    How important do you think 'self-made' software will be in the future?
    Mmmm. Inversely proportional to the importance of security.

     
  • hmm... (Score:5, Insightful)

    by Forbman ( 794277 ) on Sunday January 21, 2007 @02:55PM (#17704194)
    Several attempts have been made at this in the past. In many companies, there are one or two Excel uber-users (or even, gasp, actual developers) who are able to understand parts of the enterprise's accounting, ERP, CRM, etc. databases and make tools... er, workbooks, that facilitate some of the necessary analysis or other operational needs of the department they work in, even if it means "enter data from here in the app form X to this cell here", i.e., manual screen scraping.

    The company I am contracting at is trying to do something like this with an enterprise rules engine by TIBCO. Others provide various kinds of APIs that hide the gory details of the database or application interface, whether it is SAS, SAP ABAPs, etc.

    It might work in a general sense, but it will still involve developers at some point to bridge the gap between functional experts (i.e., accountants) and the application, in order to fit the application to the business, and not the other way around.
    • I must concur with this point, and also rejoice in the "mashup" trend. Such Excel über-users were responsible for just about half the customer-specific applications I did in the '90s ...

      Mashups as described are simply another way for intelligent superusers to get out of their depth and, subsequently, call in the cavalry (that would be you and me, hehehe) when the real world hits their application.

      Mind you, this is not a bad thing, just another variation of an age-old trend ... ask any carpenter o
      • by Myself ( 57572 )
        And what it means for regular users when there's no budget or appetite for cavalry-calling is that they suffer at the hands of recklessly clueless Excel fiends who think they're God's gift to project management.

        Those same manager folks would, even if there was plenty of budget for software development, staunchly insist that their million-column spreadsheets are working fine, while forcing underlings to spend half (literally, half, I counted) their time maintaining and updating disparate copies of the same d
    • Anybody using Excel for application development should be shot. I've seen it used for stock-tracking, customer records, support incidents and God only knows what else.

      Databases exist so you don't have to write macros to move cells around in Excel. Learn to use them.
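      To make that concrete, here is a minimal sketch with Python's built-in sqlite3 of the kind of stock tracking that usually ends up shoehorned into a spreadsheet (the table and column names are invented for the example):

        # The query replaces the macro that "moves cells around".
        import sqlite3

        conn = sqlite3.connect("tracking.db")
        conn.execute("""
            CREATE TABLE IF NOT EXISTS stock (
                sku         TEXT PRIMARY KEY,
                description TEXT NOT NULL,
                on_hand     INTEGER NOT NULL DEFAULT 0
            )
        """)
        conn.execute("INSERT OR REPLACE INTO stock VALUES (?, ?, ?)",
                     ("SKU-42", "Widget", 17))
        conn.commit()

        for sku, desc, qty in conn.execute("SELECT * FROM stock WHERE on_hand < 20"):
            print(sku, desc, qty)  # e.g. a simple reorder report
        conn.close()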
      • I once saw an Excel spreadsheet some guy had created to track assets in a newly built hospital. A very large hospital with hundreds of different assets in each room; door handles, taps, tables, light bulbs, curtains etc etc etc etc.

        As well as having separate sheets for each of the hundreds of rooms containing the hundreds of individual assets in each he also had sheets for every type of asset with the purchase order details, further sheets for when the assets were delivered and yet more to track when they w
  • A mashed-up trailer for Star Trek Enterprise, a la "Ten Things I Hate About Commandments" or "Must Love Jaws"... that might make that crappy show entertaining.

    http://www.youtube.com/watch?v=t4UIJTt-vdU [youtube.com]
  • let it die (Score:2, Insightful)

    by Blakflag ( 95052 )
    Please, let this horrid buzzword die. Right now. All we have to do is convince the Slashdot editors to stop injecting it into articles. Last time I turned around, "mashup" meant some kind of Frankenstein DJ set. Now it means the same thing as connecting software packages together by end users? WTF.
  • But where is this Web 2.0 actually taking place in the rest of the world? Perhaps in small shops, but certainly not throughout any enterprise, or even close to being ingrained as a solution in any major IT firm that I know of. In fact, the IT industry has gone the opposite road from "intuitive and creative" and has wrapped itself around the "software axle"... making policy based on software instead of choosing software that is intuitive to policy. Regardless of why this has happened, it has happened,
  • by Anonymous Coward
    I remember the whole justification for the expense of web services and the xml-ification of everything was the promise of doing just this. No more COM, OLE, COM+, etc., where everyone must have MS Office installed to get things done. Software is available in any browser, on any device, anywhere, at any time.

    You make a public stateless web service. RSS feeds of content. Internet-enabled APIs. Mashups are the logical result of being able to pull in data from anywhere, control it, and use XSLT etc. to change the l
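    A minimal sketch of the "public stateless web service" piece of that, using only Python's standard library (the path and the JSON fields are invented for the example; a real service would sit behind proper hosting):

      # Tiny stateless endpoint returning JSON; anything -- a browser, a script,
      # another mashup -- can pull from it. Path and fields are invented.
      import json
      from http.server import BaseHTTPRequestHandler, HTTPServer

      class FeedHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              if self.path != "/status":
                  self.send_error(404)
                  return
              body = json.dumps({"service": "demo", "open_tickets": 3}).encode()
              self.send_response(200)
              self.send_header("Content-Type", "application/json")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

      if __name__ == "__main__":
          HTTPServer(("127.0.0.1", 8080), FeedHandler).serve_forever()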
  • "How important do you think 'self-made' software will be in the future?"

    About as important as it's been up until now. The vast majority of people who have to use computers are totally incapable of using them beyond launching applications. In an era where people have to be trained on specific keypresses and mouse clicks for specific applications, there is exactly zero chance of these people developing any kind of software, using any kind of environment, to solve any kind of business problem.

    There
    • by Cederic ( 9623 )

      Yes it will. It means that the ones who prioritise financial rewards above job satisfaction will be able to make a lot of money supporting businesses when the mashed-up web apps they've managed to incorporate into their business-critical processes go horribly wrong.

      It also means that low-paid people in IT departments will start bringing large knives to work. It's already bad enough being asked to fix issues in the random excel or access 'applications' business people have built up.

  • What the fuck is up with using "mashup" in this way? Why not call it "user-configured software" or something?

    A mashup is a music term, meaning a song made up from the parts of other songs.

    • What the fuck is up with using "mashup" in this way? Why not call it "user-configured software" or something?

      A mashup is a music term, meaning a song made up from the parts of other songs.

      Sort of like how spam was Hormel's canned spiced ham product before this whole junk email marketing thing?
       
  • Now enterprise software developers too can take a hodgepodge of technologies and combine them into a poorly designed application of Frankensteinian proportions.

    Wait, how is this different than before?
    • before we seen anything web 2.0 widely yet,
      What? You haven't seen anything web 2.0 yet? Try visiting flickr, wikipedia, google calendar, last.fm, even slashdot itself. All of these are, in some way or another, web 2.0. But you don't have to take my word for it. Ask Tim O'Reilly [oreillynet.com], the coiner (coinant?) of the bleeding buzzword.
      • Yeah, and as I said in other topics before, all of these are high-volume, big-boy sites. No sites below that level need to use web 2.0 elements to reduce the server-side load by dumping it to the client side, hence no one is asking developers for them. It pumps up development costs very high
        • IIRC, none of the sites I mentioned except for google calendar originated with a big company. All the rest started small, and built a success on the prettiness and smoothness of use that AJAX and friends give, and/or on that other main piece of web 2.0, user contribution. AJAX is not all there is to web 2.0, as you would know if you had read the page I found for you, and all but one of the sites I mentioned were web 2.0 sites long before they were high-volume, big boy sites. Do you think that they would have made it big if they hadn't used web 2.0 concepts?
          • Do you think that they would have made it big if they hadn't used web 2.0 concepts?

            Well, tbh I think that post-1998 success on the internet is similar to success (!) in the lottery. It's little different from fate.

            Well, AJAX is in fact a good bit about reducing server-side loads. Instead of allowing repeated requests from a web interface, you just bundle the whole data set and send it to the client, but the data won't get displayed until the visitor does something to display it. Hence, the neatness of the in
  • The idea is pretty cool, but my problem with all the IBM tools is that they look very bad :)

    You can instantly see that the icons (cool green colors) are made by a graphics designer, but the rest of the website looks like it could have been made by any of the millions of MySpace users. Horrific!

    It uses 6 or 7 shades of blue that don't match...

    Err.. yes, I'll stop nagging like a woman now.
  • by pavera ( 320634 ) on Sunday January 21, 2007 @04:30PM (#17704884) Homepage Journal
    "Mashup" is most possibly the worst word that has ever come out of the technology sector as a buzzword.

    First, it sounds like "an amalgamation of multiple different components into one", but when I look at all of the sites/services that are referred to as "mashups", none of them fit this description. QEDWiki is a wiki; it doesn't appear to be "a wiki with a calendar attached", and it certainly doesn't appear to be built from 10 different components or easily integrated.

    The article mentions Zillow, which is an online real estate directory... It has no "mashy-ness" about it at all.

    Anyway, it's a stupid word that doesn't mean anything.
  • Nausea (Score:3, Insightful)

    by pwroberts ( 600985 ) <slashdot AT pwroberts DOT com> on Sunday January 21, 2007 @04:31PM (#17704890) Homepage Journal
    "Web 2.0 Mashups Almost Ready For Enterprise"

    What a disgusting, vapid headline :-(

    That is all.
  • by gentlemen_loser ( 817960 ) on Sunday January 21, 2007 @04:49PM (#17705056) Homepage
    Has anyone actually watched the flash demo? Sadly, I have wasted a good ten minutes of my life that I will now never get back watching it. In doing so - I took notes on two terms that I found interesting:

    Situational Application: Come on people, WHAT fucking application on the planet is NOT situational? I've NEVER used an application that was NOT situational - be it a game (entertainment), word processor (solving a business need), or anything else for that matter.

    My other favorite:

    Data driven application: As opposed to what?!? A bullshit driven application? Ah yes, that is officially MY new buzzword: Bullshit driven application. You heard it here first folks....
    • A data driven application is surely one in which the content of data streams drives activity. An example is a real-time monitoring application in which the current state of incoming data and historical data gets fed into a rule engine, and events get triggered based on this combination of the system state and the history. A more detailed example might be an engine management system which looks at current operating conditions, load demand, and the history of vibration patterns, exhaust temperatures, fuel flo
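      A minimal sketch of that rule-engine idea, with invented signal names and thresholds rather than anything from a real engine-management system: events come out of the data and its recent history, not out of a user action.

        from collections import deque

        class RuleEngine:
            def __init__(self, window=10):
                self.history = deque(maxlen=window)  # rolling history of readings

            def feed(self, reading):
                """Push one reading (dict of signal -> value); return triggered events."""
                self.history.append(reading)
                events = []
                if reading.get("exhaust_temp", 0) > 650:  # invented threshold
                    events.append("ALARM: exhaust temperature out of envelope")
                vib = [r["vibration"] for r in self.history if "vibration" in r]
                if len(vib) >= 3 and vib[-1] > vib[-2] > vib[-3]:
                    events.append("WARN: vibration trending upward")
                return events

        engine = RuleEngine()
        for r in [{"vibration": 1.0}, {"vibration": 1.2},
                  {"vibration": 1.5, "exhaust_temp": 700}]:
            for event in engine.feed(r):
                print(event)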
      • A change in the data (vibration trending upwards, exhaust or combustion temperature changes trending either way) could result in a variety of events ranging from an alarm to a modification to the operating envelope. Note that there is no specific identifiable external event that results in a trigger.

        A change in the data is not an event how? I can understand your logic in drawing a distinction between user-driven and automated. However, read TFA before attacking people's comments. The person giving the de
  • by Master of Transhuman ( 597628 ) on Sunday January 21, 2007 @06:14PM (#17705780) Homepage
    "The arguments for letting users self-service themselves with end-user application tools and getting IT out of the critical path for the backlog of simpler applications are extensive.""

    And the arguments AGAINST it are very serious and extensive as well.

    Look at all the crap Excel spreadsheet "systems" and badly-designed Access database "applications" that exist in every company.

    This stuff is under no one's control except one or two employees. It is sometimes used for mission-critical decisions. And the reliability and accuracy of the application is not controlled by anybody, let alone the issue of whether proper backups, data vetting and security are being done in such "end user developed" applications.

    This has proven to be bad news in the past for many companies, and will be proven so again, I suspect.

    Applications that aren't that important for a business, such as applications that merely improve the productivity of an employee's personal use of their computer, aren't that bad. But applications that are important for the CORRECT performance of the employee's JOB should be developed by people that have some clue about the issues that surround application development (assuming such people exist in your IT department - which isn't always the case, unfortunately.)

  • by dave562 ( 969951 ) on Sunday January 21, 2007 @08:49PM (#17706778) Journal
    ..A world where every VP becomes an IT expert. I have worked in corporate IT for a little over a decade and I've seen the same scenario repeated again and again. Some department head somewhere will get a bug up his ass about the "system not doing what he needs it to do," and then he'll go develop some amateur application in Access or the like that "does what he needs it to." Life will chug along great for a little while, then all of a sudden his application will blow up and he won't know why. It will fall on the shoulders of IT to fix his cluster fuck for him.

    There is a reason that companies have an IT department. There is a reason that they hire computer experts. The simple fact of the matter is that not every Tom, Dick, and Harry has the necessary skill set to develop and MAINTAIN their own applications. Companies need to ensure that they have data integrity and ensure that everyone is working with the same dataset. When you start giving users control over something as mission-critical as data applications you are looking for a headache. At the end of the day, you are going to have a bunch of pissed off users and a bunch of pissed off IT guys. The users are going to be pissed because their applications break. The IT guys are going to be pissed because they are expected to support applications that they didn't even develop in the first place.

    If you need to give users access to data, give them a copy of Crystal Reports and send them off to class to learn how to use it. I haven't come across a single situation where a non-technical person needed data out of any system that couldn't be presented to them with Crystal Reports.

    • by Zugger ( 1054322 )
      Are all the people who post to Slashdot IT insiders? Or perhaps they just work at some wonderful company where the IT dept is competent, adequately staffed, and responsive to users' needs? That's great, but outside of that merry utopia there is a wide swath of the corporate world where the IT dept is a hide-bound, under-staffed, and out-sourced resource with little regard for end-user productivity or implementing clever ideas. I am an engineer in a large, unnamed, aerospace company (a member of the DJIA)
      • I agree. I was once in the curious position of working for a major IT outsourcing company, in the outsourcing division, which meant I was working as part of the IT department for a couple of hundred businesses.

        The internal IT department for our company was a different entity altogether; it spent its entire time and effort in endless infighting amongst its various departments and was absolutely, totally useless. For example, the company policy was for every team to have a website and share their knowledge to rep
  • How is this any different than OpenDoc [wikipedia.org], OLE [wikipedia.org] or some of the NextStep [wikipedia.org] OO "kits"?
  • I'm about to be bored watching this all happen again. Sorry if this sounds like flamebait, but it really isn't intended to be. Here's how I read this -- we've got very few new ideas. We've got RSS, aggregators, social networks and content, and we've got endless permutations as always. In an effort to generate excitement (and VC) we're inventing new buzzwords and "paradigm shifts" as fast as we can say "Flooz". There's nothing "new" about aggregating various technologies or sources into a single offering.
  • I don't see many commenting on whether or not they think their users will actually create anything with anything at hand. Where I work, we have management analysts that need to creatively use spreadsheets, web apps, or anything, in order to do their job well. Well, they don't. If it's not served on a silver platter, they are not going to even try to use it. To be fair, the amount of work that they are expected to deliver is now much, much higher than, say, only 3 years ago. I generally think that it's only the
  • How important do you think 'self-made' software will be in the future?

    Not very.
  • > How important do you think 'self-made' software will be in the future?

    Only as important as the poor IT d00ds who have to support the whole mess once the fly-by-night "finance manager" has left the company ...

    (for "finance manager" substitute any role in the company ... then ask why you have IT "churn" because your staff do nothing but "firefight")
