Google Patents Staple of '70s Mainframe Computing

theodp writes "'The lack of interest, the disdain for history is what makes computing not-quite-a-field,' Alan Kay once lamented. And so it should come as no surprise that the USPTO granted Google a patent Tuesday for the Automatic Deletion of Temporary Files, perhaps unaware that the search giant's claimed invention is essentially a somewhat kludgy variation on file expiration processing, a staple of circa-1970 IBM mainframe computing and subsequent disk management software. From Google's 2013 patent: 'A path name for a file system directory can be "C:\temp\12-1-1999\" to indicate that files contained within the file system directory will expire on Dec. 1, 1999.' From Judith Rattenbury's 1971 Introduction to the IBM 360 computer and OS/JCL: 'EXPDT=70365 With this expiration date specified, the data set will not be scratched or overwritten without special operator action until the 365th day of 1970.' Hey, things are new if you've never seen them before!"
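
To make the two conventions concrete, here is a minimal sketch (hypothetical code, neither Google's nor IBM's; the function names and the M-D-YYYY parsing are assumptions) of the patent's path-name trick: parse the expiration date back out of the directory name and compare it with the clock, much as OS/360 compared EXPDT against the current date.

from datetime import date

# Hypothetical sketch of "expiration date in the path name", per the patent's
# "C:\temp\12-1-1999\" example. Not Google's code; just the idea it describes.
def expiry_from_path(path):
    """Parse an M-D-YYYY expiration date out of the last directory component."""
    last = path.rstrip("\\/").split("\\")[-1]
    month, day, year = (int(part) for part in last.split("-"))
    return date(year, month, day)

def is_expired(path, today=None):
    return (today or date.today()) >= expiry_from_path(path)

print(is_expired(r"C:\temp\12-1-1999", date(2013, 2, 19)))  # True: long past Dec 1, 1999
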
This discussion has been archived. No new comments can be posted.

  • by jhb146 ( 459905 ) on Tuesday February 19, 2013 @09:58PM (#42952099)

    This is supposed to be new...

    • Re: (Score:2, Funny)

      by Anonymous Coward

      Let alone patentable? I remember when the idiots at the USPTO granted a patent on XORing a byte of screen memory to flash a cursor.

      Too bad we couldn't generate electricity from stupidity. We seem to have plenty to go around these days.

      • I'm Sorry, but... (Score:5, Insightful)

        by sycodon ( 149926 ) on Wednesday February 20, 2013 @12:05AM (#42952815)

        But the USPTO is populated by idiots.

        They are deserving of the disdain and ridicule reserved for the Post Office, Congress, etc.

        Which is a shame, because I've always figured they had some pretty smart people there. The examiner should have taken a shit on the application and mailed it back with a note saying, "this is what your application is worth".

        They are either complete morons or...are getting payoffs. And Google will just use it as a club some day on a small outfit that doesn't have half a million dollars to fight a lawsuit.

        • Re: (Score:3, Insightful)

          by jatoo ( 2393402 )

          They are either complete morons or...are getting payoffs.

          Or they are precisely following moronic policy.

          • by raymorris ( 2726007 ) on Wednesday February 20, 2013 @08:28AM (#42954805) Journal
            The "summary" wholly misrepresents what the patent is about. It's not about having an expiration date in the filename at all. When someone advocating a position lies to me, as this submitter did, I figure the reason they are lying about the issue is because they realize that the truth doesn't support their position.

            Rather than choosing an expiration date ahead of time, the patented method deletes a file (or not) based on a time to live that is scaled down in proportion to the user's quota usage, measured from the latest of the chunks' modification times. The patent covers only that specific algorithm, and only when the TTL is represented within the path name.

            Is that algorithm obvious? Several Slashdot commenters who say they are programmers read the explanation of the algorithm and still didn't understand it at all. One might say that if it's explained to you and you don't "get it", it's probably not obvious.
            • ... Is that algorithm obvious? Several Slashdot commenters who say they are programmers read the explanation of the algorithm and still didn't understand it at all. One might say that if it's explained to you and you don't "get it", it's probably not obvious.

              There's a fine line between clever and stupid. If an average programmer reads the explanation, and "Doesn't get it", it could be either. Most patents are very poor explanations for what they are about.

              • by tambo ( 310170 )

                > There's a fine line between clever and stupid. If an average programmer reads the explanation, and "Doesn't get it", it could be either. Most patents are very poor explanations for what they are about.

                But the "average programmers" here aren't motivated to try to understand it. They are motivated to find that the patent is worthless, because that's what the submitter wrote about it, and that's what they are predisposed to believe. So they are prone to glance at the application and say, "well, the cla

            • by tambo ( 310170 )

              > When someone advocating a position lies to me, as this submitter did, I figure the reason they are lying about the issue is because they realize that the truth doesn't support their position.

              I don't think it's flat-out lying. I think it's an example of the echo chamber effect.

              The community believes that patents suck, that patent examiners are inept, and that patentees are using clever tricks to patent things that aren't new. So upon encountering any new patent, the submitters here don't do the har

        • by dbIII ( 701233 ) on Wednesday February 20, 2013 @01:18AM (#42953137)
          It's just another version of a dog licence, intended as a petty revenue stream. When a patent is granted, that's proof of nothing other than that the government is aware of it and has it on file; validity these days is apparently supposed to be sorted out in court and is none of the patent office's business.
        • by jockm ( 233372 ) on Wednesday February 20, 2013 @01:19AM (#42953143) Homepage

          Or they are normal people, without much domain knowledge, forced to handle too many cases in too little time, and fit within the rules of a broken system.

          I personally find that to be the more plausible situation.

        • by reve_etrange ( 2377702 ) on Wednesday February 20, 2013 @02:41AM (#42953495)
          The USPTO is supposed to support itself with fees [uspto.gov]. The largest fee is for reexamination, creating a financial incentive to grant bad patents (which are likely to be reexamined). -da
        • Re:I'm Sorry, but... (Score:5, Interesting)

          by AmiMoJo ( 196126 ) * on Wednesday February 20, 2013 @04:22AM (#42953911) Homepage Journal

          Google has only used patents defensively, at least up until now. In a way it is better that such a ridiculous patent went to a non-troll company that won't use it to suppress the competition, if the USPTO is going to grant such nonsense.

        • by wren337 ( 182018 ) on Wednesday February 20, 2013 @08:20AM (#42954763) Homepage

          This is a case of the USPTO saying "We don't understand this fully, we'll let the courts figure it out".

          And the courts say "We don't understand this fully, we'll defer to the experts at the USPTO".

    • Re:Really! (Score:5, Insightful)

      by rtfa-troll ( 1340807 ) on Wednesday February 20, 2013 @01:16AM (#42953129)

      Yes; maybe; and the whole summary is stupid. From claim one of the patent; the very first paragraph:

      having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers

      So; where the mainframes of the 70s had single consolidated disk stores, this is talking about doing this on a distributed filesystem. The area of application is indeed new, completely opposite to the claim of the summary.

      Patents are not supposed to control what you do; instead they control how you do it. Since the way that Google is claiming to do this is by going around comparing the timestamps on a bunch of different distributed chunks of a file, this is something that no mainframe of the 70s is likely to have had to do, so it may even be a new way to automatically delete temporary files. I wish people would begin to understand this and commenters would point it out every time. I wonder if this isn't a bunch of patent lawyers trying to make us look silly.

      Having said that; if you had a distributed file which kept a timestamp on each of several separate chunks, how would you go about deciding when to automatically delete it? My guess is that the solution you would come up with quickly is basically the one in the patent. You certainly wouldn't get stuck deciding how to do it, suddenly think "maybe there's a patent that might tell me how to do this", go to the patent office, read the patent, and only then come back inspired and able to solve your problem because Google was so nice as to publish their solution. Patents are supposed to record valuable secrets that companies might otherwise keep to themselves in a way that helps humanity. This one is failing at that.

      What this comes down to is that the whole idea of patents on things as abstract as software is stupid, and is an illegal interference with free speech, a right everyone should have under the Universal Declaration of Human Rights. The patent officers of the USPTO and the congressmen who put them there should be arrested.

      • Re:Really! (Score:4, Funny)

        by sjames ( 1099 ) on Wednesday February 20, 2013 @04:51AM (#42954011) Homepage Journal

        The 'plurality' of chunks is irrelevant to the matter of expiring old data. What matters is that you have some sort of metadata telling the system when a blob of data that might resemble a file is to be deleted.

        The USPTO, meanwhile, keeps making me feel like blowing a plurality of chunks into a water filled receptacle where upon manipulation of a lever or chain fresh water will enter the receptacle transferring momentum to the relevant chunks causing them to exit through a drain.

      • Re:Really! (Score:4, Insightful)

        by Alioth ( 221270 ) <no@spam> on Wednesday February 20, 2013 @06:40AM (#42954421) Journal

        It's not prior art, it's obviousness. In terms of file storage, I consider myself "ordinarily skilled in the art", yet 5 years ago I put in such a system to expire files at work on a distributed filesystem. The problem is that the USPTO is allowing obvious stuff to be patented. They even admit as much; unfortunately I can't find the article, but I remember reading the USPTO saying that only 5% of patents they grant are what they call "pioneer patents" (in other words, something really new and worthy of patent protection). The reform needs to be that only these "pioneer patent" applications actually get granted and the rest thrown out.

  • by Anonymous Coward on Tuesday February 19, 2013 @10:01PM (#42952115)

    Google Labs is supposedly working on a next-gen programming development environment that allows source code statements to be physically manipulated like a deck of cards.

  • big deal (Score:5, Insightful)

    by dickens ( 31040 ) on Tuesday February 19, 2013 @10:02PM (#42952123) Homepage

    It's not like Microsoft was ever going to be interested in that anyway. They must get cents back from the disk manufacturers for perpetuating their ever-growing temp folders.

  • really? (Score:3, Insightful)

    by lkernan ( 561783 ) on Tuesday February 19, 2013 @10:03PM (#42952137)
    I don't know what's worse: that Google applied for this, or that the USPTO approved it.
    • by edibobb ( 113989 )
      It's much worse that the USPTO approves trash like this. As long as they do, companies, trolls, and other profit seekers will take financial advantage of the stupidity. Meanwhile, the more patents the USPTO approves, the larger their budget becomes.
    • I don't know what's worse: that Google applied for this, or that the USPTO approved it.

      I am no particular fan of Google, but if the USPTO is approving this sort of thing, Google (and every other business) has to worry that some troll will land a patent on some basic part of their everyday operations, and if you can afford it, one defense is to attempt to patent everything that you do and use. They may have been as surprised as we are that it was approved.

      For the USPTO this is a wonderful business model: do a crappy job and increase the demand for your services. Another recent example of this

  • by dajjhman ( 2537730 ) on Tuesday February 19, 2013 @10:05PM (#42952149)
    If you actually read the patent, it is specifically for a similar method, but designed for Distributed File Systems. This is different from just a single file being named a certain way. It is an algorithm based on the location of other related files, each file's modification and Time to Live (TTL) dates, and the factors determined by the, keywords here, plurality of servers. If they tried to patent a regular temporary file that would be different, but this is specifically for a file that is distributed in different parts across different systems. If you still think this has been done before, I would love to see the source for that information and would gladly recant given that.
    • by samkass ( 174571 ) on Tuesday February 19, 2013 @10:19PM (#42952279) Homepage Journal

      Here is the crux of the first claim: "1. A computer-implemented method comprising: selecting a file having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers, wherein each chunk has a modification time indicating when the chunk was last modified, and wherein at least two of the modification times are different; identifying a user profile associated with the file; determining a memory space storage quota usage for the user profile; deriving a file time to live for the file from the path name; determining a weighted file time to live for the file by reducing the file time to live by an offset, where the offset is determined by multiplying the file time to live by a percentage of memory space storage quota used by the user profile; selecting a latest modification time from the modification times of the plurality of chunks; determining that an elapsed time based on the latest modification time is equal to or exceeds the weighted file time to live; and deleting all of the chunks of the file responsive to the determining."
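
      For anyone who would rather read code than claim language, here is a rough, unofficial translation of that claim into Python. The Chunk structure, the function names, and the "ttl=..." path encoding are invented for illustration (the patent's own example encodes a date in the directory name); the weighting rule follows the claim text.

      import re
      import time
      from dataclasses import dataclass

      @dataclass
      class Chunk:            # one piece of the file, stored on some server
          server: str
          mtime: float        # seconds since the epoch when this chunk was last modified

      def ttl_from_path(path):
          """Derive a time-to-live (seconds) from the path name, e.g. '/gfs/tmp/ttl=86400/x'."""
          match = re.search(r"ttl=(\d+)", path)
          if match is None:
              raise ValueError("no TTL encoded in path name")
          return float(match.group(1))

      def should_delete(path, chunks, quota_used_fraction, now=None):
          now = time.time() if now is None else now
          ttl = ttl_from_path(path)
          # Weighted TTL: reduce the nominal TTL by an offset proportional to quota usage,
          # so a user at 90% of quota keeps temporary files for only 10% of the nominal TTL.
          weighted_ttl = ttl - ttl * quota_used_fraction
          latest_mtime = max(chunk.mtime for chunk in chunks)
          # If the time elapsed since the newest chunk meets the weighted TTL, all chunks go.
          return (now - latest_mtime) >= weighted_ttl

      # Example: 1-day TTL, user at 50% of quota, chunks last touched 13 and 20 hours ago.
      chunks = [Chunk("server-a", time.time() - 13 * 3600),
                Chunk("server-b", time.time() - 20 * 3600)]
      print(should_delete("/gfs/tmp/ttl=86400/scratch.dat", chunks, 0.5))  # True: 13h >= 12h

      Whether that adds up to an invention or to the first thing a distributed-filesystem engineer would type is exactly what this thread is arguing about.
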

      Can we please have an end to the stupid articles where someone intentionally mis-interprets the abstract or even just the title of a patent and pretends it's some simple thing that's been done for decades to try to drum up anti-patent sentiment? There seems to be one a week or so.

      • by stevesliva ( 648202 ) on Tuesday February 19, 2013 @10:30PM (#42952365) Journal

        Can we please have an end to the stupid articles where someone intentionally mis-interprets the abstract or even just the title of a patent and pretends it's some simple thing that's been done for decades to try to drum up anti-patent sentiment? There seems to be one a week or so.

        Unlikely. Nonetheless, anti-patent sentiment is a good thing. Far too many people assume there's some sort of fairness or justice to the whole mess, and there isn't.

      • by Improv ( 2467 ) <pgunn01@gmail.com> on Tuesday February 19, 2013 @10:44PM (#42952483) Homepage Journal

        It's still a dumb patent; a trivial weighting addition doesn't change this. I mean, seriously, that's less complicated than your average photoshop filter, and it's an obvious "innovation" that any engineer would think up if they were to be asked to implement file expiration on Google's platforms.

      • by sycodon ( 149926 ) on Wednesday February 20, 2013 @12:13AM (#42952853)

        Software should not be patentable. Period.

      • Can we please have an end to the stupid articles where someone intentionally mis-interprets the abstract or even just the title of a patent and pretends it's some simple thing that's been done for decades to try to drum up anti-patent sentiment? There seems to be one a week or so.

        Not until we stop getting really stupid patents. I'm not a DFS guy, but I am a computer guy. In every patent article there's one of you pointing out some supposed novelty. In my field, I've been through one or two and posted blow-by-blow re

    • by c0lo ( 1497653 ) on Tuesday February 19, 2013 @10:29PM (#42952359)

      If you actually read the patent, it is specifically for a similar method, but designed for Distributed File Systems.

      Ahhhh... that's good.

      You see, I was scared shitless that we were still quibbling over patents granted with the only claimed difference over some old methods (patented or not) being "on a computer".
      I see now how wrong I was: we have stepped into the glorious era of "in the cloud" claims.

    • Look, I know you're trying to be insightful and all, but this is Slashdot. We don't let silly facts get in the way of our unbridled hatred of all things Gub'mint!

      It's a patent on storing files. Sure, it has some improvements that nobody's really used before that solve particular problems in a particular field, but I totally saw something similar sketched on the back of an envelope at my cousin's house in 1957, so this is blatantly obvious. I don't need to read the silly lawyer-speak claims to see that this

      • by 1u3hr ( 530656 )
        While the mechanism described isn't very original, Google has lawyers to cover their asses and patent it, so that a few years down the road, when they've implemented it all through their cloud, they can't get sued by Microsoft or some troll that went ahead and patented an equivalent method. Actually, if the PTO threw it out as unpatentable, Google would probably be just as happy, since they could use it without looking over their shoulders.
    • Utility, novelty, and non-obviousness. This patent clearly fails at least one of these conditions.

    • Re: (Score:3, Interesting)

      So if I add "over a network" to a claim, that makes it patentable?

    • by sycodon ( 149926 )

      From trash is still trash.

  • are doomed to think they have (re)invented it.

    This is so true. I have quite a few patents, and I see it every day while doing art searches: the number of patents claiming things that anyone with even a halfway decent understanding of, or education in, the field would recognize as already having been done "way back in the good old days".

  • Wow. Has anyone patented the concept of the old mainframe Generation Data Set recently? I used them extensively back in the mainframe days and could have used a similar concept in more recent systems, but never found a real substitute in either Unix/Linux or Windows. A simple explanation for those who have not heard of them is that they are sort of a push-down stack of files managed by the OS with a fixed stack length. You could reference them by a long serial number that was of the format GnnnnVnnnn or
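
    For readers who have never met a GDG, here is a toy sketch of the idea (purely illustrative; real MVS generation data groups are catalog entries with names like BASE.G0001V00, and the class and parameter names below are made up, not any vendor's API):

    from collections import deque

    class GenerationDataGroup:
        """Toy model of a GDG: a fixed-depth stack of file generations, newest on top."""
        def __init__(self, base_name, limit):
            self.base_name = base_name
            self._generations = deque(maxlen=limit)   # oldest generation rolls off automatically
            self._next_number = 1

        def new_generation(self):
            """Create the next generation, e.g. PAYROLL.G0002V00, dropping the oldest if full."""
            name = f"{self.base_name}.G{self._next_number:04d}V00"
            self._next_number += 1
            self._generations.append(name)
            return name

        def resolve(self, relative):
            """Look up a generation by relative number: 0 is the newest, -1 the one before, etc."""
            return self._generations[len(self._generations) - 1 + relative]

    gdg = GenerationDataGroup("PAYROLL", limit=3)
    for _ in range(5):
        gdg.new_generation()
    print(gdg.resolve(0))   # PAYROLL.G0005V00 (newest)
    print(gdg.resolve(-1))  # PAYROLL.G0004V00
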
  • by ZorinLynx ( 31751 ) on Tuesday February 19, 2013 @10:07PM (#42952181) Homepage

    The same thing happened in the 80s and early 90s when microcomputers started gaining features like virtual memory, protected modes, out of order execution, etc... People thought these were all brand new things, when in fact mainframe processors had done all that 20 years prior in the 1960s.

    I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes. :)

    Read about the IBM 360/91 if you want details on what I mean. It was amazing when you consider the year it came out.

    • I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes. :)

      You talking about the same old grey beards that gasped when the kids opened the cover on a server and added their own memory, network adaptors, backplanes, disk drives, etc without having to call IBM out to do it?

      • by sbjornda ( 199447 ) <sbjornda@h[ ]ail.com ['otm' in gap]> on Tuesday February 19, 2013 @11:23PM (#42952641)

        I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes. :)

        You talking about the same old grey beards that gasped when the kids opened the cover on a server and added their own memory, network adaptors, backplanes, disk drives, etc without having to call IBM out to do it?

        Yeah, and then rolled their eyes again because the kids didn't know about change control, didn't notify the users about the outage, didn't verify that their backups were good (if they even had backups), and lost 6 months worth of corporate data as a result.

        --
        .nosig

        • by 93 Escort Wagon ( 326346 ) on Tuesday February 19, 2013 @11:59PM (#42952789)

          As an aside... I remember a few years ago - when we were still running tape backups - I went to one of our then-sysadmins and asked him to recover an important directory one of our faculty had managed to delete. I was told he couldn't do it because it would require they stop the backup system for several hours, which would throw their backup tape rotation scheme out of sync.

          So we were continuously generating backups we could never actually use.

    • I bet when all the kids were super-excited about programming on the i386 with its "OMG VIRTUAL MEMORY!!!" the older guys who had worked on mainframes just rolled their eyes. :)

      Well, it was super-exciting to have it on the desktop for a reasonable price, yeah. I can't speak for everyone of that generation, but I appreciated it while still understanding perfectly well that it wasn't a new invention.

  • Oh bollocks (Score:5, Insightful)

    by the eric conspiracy ( 20178 ) on Tuesday February 19, 2013 @10:13PM (#42952231)

    The summary is wrong. Folks, please stop reading the abstract, and read claim 1 instead.

    This is what is patented:

    1. A computer-implemented method comprising: selecting a file having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers, wherein each chunk has a modification time indicating when the chunk was last modified, and wherein at least two of the modification times are different; identifying a user profile associated with the file; determining a memory space storage quota usage for the user profile; deriving a file time to live for the file from the path name; determining a weighted file time to live for the file by reducing the file time to live by an offset, where the offset is determined by multiplying the file time to live by a percentage of memory space storage quota used by the user profile; selecting a latest modification time from the modification times of the plurality of chunks; determining that an elapsed time based on the latest modification time is equal to or exceeds the weighted file time to live; and deleting all of the chunks of the file responsive to the determining.

    • If the patent process were anything like the peer review process, a bunch of distributed filesystem engineers would have been asked how to implement file expiration, and their answer, within five minutes, would have been something very close to this.

      But, more importantly, Google seems to have actually implemented this (not bad, considering the state of things). And who honestly believes they would not have done so anyway, without the hope of patent protection?

  • If you screwed up like this in a company, you would be fired. Yet some dumbass government worker in the USPTO grants this, and several million dollars later it gets sorted out by the courts. One of the reasons patent litigation is out of control is that these dumbasses don't do their jobs.

    Worse, the USPTO is about to switch from first-to-invent to first-to-file. You don't need to invent any more. Just find out what your competitors are doing, patent it, and sue them out of business: http://www.jdsupra.com/le [jdsupra.com]
    • and several million dollars later it gets sorted out by the courts

      No, not really. Several million dollars later, someone capriciously wins, and then there are equally capricious rounds of appeals.

    • If you screwed up like this in a company, you would be fired boilerplate anti-government yammering blah blah blah

      When you get out of high school and have your first job that doesn't involve cleaning a grease trap, you'll learn how things actually work in the corporate world.

  • by cowtamer ( 311087 ) on Tuesday February 19, 2013 @10:14PM (#42952241) Journal

    I think it's time for a crowdsourced patent challenge web site run by the USPTO where there would be a period of public comment for each patent about to be awarded in order to help underpaid (and I imagine under-resourced) examiners find Prior Art.

    A lot fewer patents might be awarded, but ones that are would be genuinely new -- this might also save the world billions of dollars.

    • by stevesliva ( 648202 ) on Tuesday February 19, 2013 @10:35PM (#42952401) Journal

      I think it's time for a crowdsourced patent challenge web site run by the USPTO where there would be a period of public comment for each patent about to be awarded in order to help underpaid (and I imagine under-resourced) examiners find Prior Art.

      A lot fewer patents might be awarded, but ones that are would be genuinely new -- this might also save the world billions of dollars.

      http://peertopatent.org/ [peertopatent.org]

  • by Anonymous Coward

    And the inventive step is....
    And the non-obvious part is....

    The problem here isn't the USPTO, it's the Patent Appeals Court that modified the Supreme Court decision (that an invention needed to be more than the sum of its parts) and decided that as soon as you'd been told about an invention, your judgement would be tainted by 'hindsight bias', leaving you unable to assess prior art. So unless it's written down in that form, the patent should be awarded.

    Can I ask the idiots in the Patent Appeals Court, is TH

    • by Grond ( 15515 )

      What are you talking about? There is no "Patent Appeals Court" in the United States. There is the Court of Appeals for the Federal Circuit. And the standard for non-obviousness was most recently articulated by the Supreme Court in KSR v. Teleflex [wikipedia.org] , a 2007 case in which the Court held that the precise prior art combination did not need to be explicitly "written down in that form":

      As our precedents make clear, however, the analysis need not seek out precise teachings directed to the specific subject matte

  • ...for extracting random phrases out of the middle of a patent document that match prior art and posting them to a web site in order to increase hit rates. Please delete this article or you will be hearing from my lawyers!
  • I make no claims to the validity of the data, but the example given and the patent are different. The IBM 360 example is about *preserving* files by affording them additional protection, as opposed to the Google patent, which is about *deleting* temporary files by adding a "time to live" value actually in the directory/filename, with various ways of cleaning out these files, as well as an *additional* indicator that it is a temporary file.

    ...but considering the patent is 26 pages long, it could be f

  • GROW UP (Score:2, Interesting)

    by Anonymous Coward

    geez, when is slashdot ever gonna stop running these stupid articles that only show how little the posters know about patent law
    or, at least, READ THE FILE WRAPPER !!!
    MAYBE THE IBM PATENT IS AN X OR Y DOC IN THE SEARCH REPORT !!!
    OR AT THE VERY LEAST, READ THE CLAIMS !!!

    claim 1:
    A computer-implemented method comprising: selecting a file having a path name in a distributed file system, wherein the file is divided into a plurality of chunks that are distributed among a plurality of servers, wherein each ch

  • The NCAR Mass Store (tape archive) had an expiration period attribute (units of days) on the bitfiles. The default, if not specified, was 30 days, which effectively made it a temporary file. Expiration periods of 31 days or more were considered more permanent, and the owners would receive email two weeks and one week before the projected expiration date arrived. Expiration processing was run each Sunday, and the bitfiles were moved into the trash, from which they could be recovered for another 30 days bef
  • "We don't even pretend to care."
  • A one-liner find command that has been written thousands of times, if not millions of times.

    Not novel, not original, prior art, obvious.

  • by bl968 ( 190792 ) on Tuesday February 19, 2013 @11:34PM (#42952691) Journal

    find /tmp/* -mtime +14 -exec rm {} \;

    • by jcdr ( 178250 )

      Mod parent up.

      No need to add new date information in the file name. The inode already has change, modification, and access times. Fair enough to determine which files are too old.

  • Hey, things are new if you've never seen them before!

    Just wait till you all grow up and discover cougars.

  • I used to service Icon IIs, which ran a primitive form of QNX. When the hard drive filled up, it would start deleting old files. In a school, the oldest unmodified file was usually the master password file. Since these systems didn't have a built-in root login, this meant they were self-bricking.
  • A path name for a file system directory can be "C:\temp\12-1-1999\"

    Why is Google still using a C drive? And Temp [wikipedia.org] is a bad place to store anything of value.

  • I have just patented a system of composing Slashdot posts where the attention span expires after......
  • by bkmoore ( 1910118 ) on Wednesday February 20, 2013 @01:19AM (#42953141)
    Someone once said that most patent applications are the result of a lack of good literature/patent research.
  • ...is done by folks who have the technical knowledge (and people skills) of Tijuana pole dancers.

    Tech companies know this, and have basically been daring each other to attempt patenting more and more outrageous things.

    Sadly, with great success.

    It's amazing what walking into a bar with lots of dollar bills will get you...
