OpenAI Slams Court Order To Save All ChatGPT Logs, Including Deleted Chats (arstechnica.com) 99

An anonymous reader quotes a report from Ars Technica: OpenAI is now fighting a court order (PDF) to preserve all ChatGPT user logs -- including deleted chats and sensitive chats logged through its API business offering -- after news organizations suing over copyright claims accused the AI company of destroying evidence. "Before OpenAI had an opportunity to respond to those unfounded accusations, the court ordered OpenAI to 'preserve and segregate all output log data that would otherwise be deleted on a going forward basis until further order of the Court (in essence, the output log data that OpenAI has been destroying),'" OpenAI explained in a court filing (PDF) demanding oral arguments in a bid to block the controversial order.

In the filing, OpenAI alleged that the court rushed the order based only on a hunch raised by The New York Times and other news plaintiffs. And now, without "any just cause," OpenAI argued, the order "continues to prevent OpenAI from respecting its users' privacy decisions." That risk extended to users of ChatGPT Free, Plus, and Pro, as well as users of OpenAI's application programming interface (API), OpenAI said. The court order came after news organizations expressed concern that people using ChatGPT to skirt paywalls "might be more likely to 'delete all [their] searches' to cover their tracks," OpenAI explained. Evidence to support that claim, news plaintiffs argued, was missing from the record because so far, OpenAI had only shared samples of chat logs that users had agreed that the company could retain. Sharing the news plaintiffs' concerns, the judge, Ona Wang, ultimately agreed that OpenAI likely would never stop deleting that alleged evidence absent a court order, granting news plaintiffs' request to preserve all chats.

OpenAI argued the May 13 order was premature and should be vacated until, "at a minimum," news organizations can establish a substantial need for OpenAI to preserve all chat logs. The company warned that the privacy of hundreds of millions of ChatGPT users globally is at risk every day that the "sweeping, unprecedented" order continues to be enforced. "As a result, OpenAI is forced to jettison its commitment to allow users to control when and how their ChatGPT conversation data is used, and whether it is retained," OpenAI argued. Meanwhile, there is no evidence beyond speculation yet supporting claims that "OpenAI had intentionally deleted data," OpenAI alleged. And supposedly there is not "a single piece of evidence supporting" claims that copyright-infringing ChatGPT users are more likely to delete their chats. "OpenAI did not 'destroy' any data, and certainly did not delete any data in response to litigation events," OpenAI argued. "The Order appears to have incorrectly assumed the contrary."
One tech worker on LinkedIn suggested the order created "a serious breach of contract for every company that uses OpenAI," while privacy advocates on X warned, "every single AI service 'powered by' OpenAI should be concerned."

Also on LinkedIn, a consultant rushed to warn clients to be "extra careful" sharing sensitive data "with ChatGPT or through OpenAI's API for now," warning, "your outputs could eventually be read by others, even if you opted out of training data sharing or used 'temporary chat'!"


Comments Filter:
  • by gweihir ( 88907 ) on Wednesday June 04, 2025 @04:42PM (#65428012)

    AI not so special after all. Still needs to follow the law. Such a surprise.

    • The AI hasn't been ordered by a court. As usual you let your biases cause you to post yet more drivel.
    • Re:Oops (Score:5, Insightful)

      by Zocalo ( 252965 ) on Wednesday June 04, 2025 @05:06PM (#65428058) Homepage
      This bit amazes me:

      a consultant rushed to warn clients to be "extra careful" sharing sensitive data "with ChatGPT or through OpenAI's API for now," warning, "your outputs could eventually be read by others, even if you opted out of training data sharing or used 'temporary chat'!"

      I mean, seriously? This is one of a whole bunch of companies that have been blatantly hoovering up any data they can get their hands on without any regard to copyright, constraints placed via things like robots.txt, or thought to the hosting costs that can be incurred by continual spidering of vast amounts of website data, and you *honestly* thought you could trust them with the data you *chose* to provide them with, or that it might not backfire like this?

      Zuckerberg was right all along; "Dumb Fucks" indeed.

      • Pandora's box locked with an airline luggage lock. Bobby pin reveals all contents. Contents revealed to be stolen from sources.

        That's news?

        I wonder what that consultant charges.

    • Is this different from ordering VPNs to log all activity? Would you say the same thing if Apple was ordered to disable any cryptography in the way of newspapers reading everyone's private files for a copyright case? I don't care much for the current AI hype or the ethics of the companies peddling it either, but I think a lot of people are too eager to abandon principles on copyright and privacy laws/4th Amendment just to jail them. Spoiler: Once power to invade privacy is expanded, it won't be confined to only the c
      • by mysidia ( 191772 )

        Is this different from ordering vpns to log all activity?

        It's a bit more intrusive than that. It's like ordering AT&T to preserve a recording of all voice calls. Or ordering SnapChat to keep a copy of everything that was sent, even after it's supposed to be destroyed.

        It seems to me like a blanket warrant, and therefore, violating End users' 4th amendment rights and privacy rights which is Unconstitutional. Warrants are supposed to have to be specific and indicate preserving a specific thing or the fi

        • by rta ( 559125 )

          Is a bit more intrusive than that. It's like ordering AT&T to preserve a recording of all voice calls.

          Heh, this is the analogy that came to mind for me too.

          I'm boggled that a US court would order something like this in this day and age.

        • It seems to me like a blanket warrant, and therefore, violating End users' 4th amendment rights and privacy rights which is Unconstitutional.

          It's been established precedent for quite a while now that server logs are the property of the server owner, not the user.

          Warrants are supposed to have to be specific and indicate preserving a specific thing or the files of a specific person or place, for example.

          This isn't a search warrant, it's a preservation order. It's quite common for a court to order a company to preserve all documents (which includes things like emails and server logs) related to ongoing litigation.

  • So how exactly does a judge get to force a business to breach all of their contracts? I'm expecting this will be overturned on appeal pretty quickly.
    • What kind of business plan cannot be done legally? A criminal enterprise.
    • I doubt they will be in breach of any contract. The contracts would likely state they must abide by all laws in whatever jurisdiction they've specified.

      • Yes. However, legal compliance only means they must break the contract. It does not mean they aren't on the hook for serious damages when they do break the contract.

        The scope of contract breakage required here makes compliance a somewhat onerous decision, however, and courts actually do tend to halt orders that put onerous requirements on discovery.

        • They put things like this in the contracts too

          EACH PARTY’S TOTAL LIABILITY UNDER THE AGREEMENT WILL NOT EXCEED THE TOTAL AMOUNT CUSTOMER PAID TO OPENAI DURING THE TWELVE MONTHS IMMEDIATELY PRIOR TO THE EVENT GIVING RISE TO LIABILITY

          And just in case a bunch of customers all have the same issue

          15.1. Mandatory Arbitration. Customer and OpenAI agree to resolve any Disputes, regardless of when they arose, even if it was before this Agreement existed, through final and binding arbitration.

          15.6. No Class Actions. Disputes must be brought on an individual basis only and may not be brought as a plaintiff or class member in any purported class, consolidated, or representative proceeding.

          15.7. Batch Arbitration....

          Probably most relevant to the OP

          16.9. Force Majeure. Except for payment obligations, neither Customer nor OpenAI will have any liability for failures or delays resulting from conditions beyond Customer’s or OpenAI’s reasonable control, including but not limited to governmental action or acts of terrorism, earthquake or other acts of God, labor conditions, or power failures.

          I like how they put strikes and power cuts in there too.

        • by DarkOx ( 621550 )

          This could actually prove to be an interesting event. We have been here before of course but generally speaking legal holds and discovery are more narrow in scope than this or they have been.

          If courts start to get in the habit of throwing words like 'all' around often in these types of orders and the appellate court holdings turn out to be that such behavior is acceptable, I expect we are going to see legislative action to stay the power of these courts pretty quickly.

          It's not just the trillion dollar GenAI

    • by dfghjk ( 711126 )

      By issuing lawful orders? What authorizes a business to enter such contracts when a court order could prevent them from honoring those contracts?

      • By issuing lawful orders? What authorizes a business to enter such contracts when a court order could prevent them from honoring those contracts?

        I guess the assumption is that users themselves would also be entitled to some sort of due process.

      • This is slashdot, so everybody makes bold proclamations about legal implications even though they don't understand anything about contract law.

        If a court orders some action, it would be "unconscionable" for a contract to attempt to forbid it. Therefore, the clause isn't enforceable in regards to whatever action the court ordered.

        It's really, really simple... Courts are the government. Contracts are private agreements. People who don't know which supersedes the other should STFU. But they won't.

        And of course

    • You can't contract to break the law. You can't contract to supply your customers with stolen property.

      And you can't destroy evidence. Any evidence you destroyed is automatically assumed to incriminate you or you wouldn't have destroyed it.

      Where is Pam from Groklaw? If she is still alive, she might have to come out of retirement for this.

      Can the LLM people convince the court that their model was trained only on public domain works and non-copyrightable facts?

      • You can't contract to supply your customers with stolen property.

        Nobody has been convicted of anything and there does not seem to be an injunction against continuing to supply their customers.

        Any evidence you destroyed is automatically assumed to incriminate you or you wouldn't have destroyed it.

        Vast amounts of data are destroyed every day as a matter of course. I don't think you can read so much into something that is already routine. Having duct tape and zip ties might make you a serial killer, but probably not.

        • Vast amounts of data are destroyed every day as a matter of course. I don't think you can read so much into something that is already routine.

          Well there Dufus Dan, it turns out that it only kicks in once you know, or should have known, that it was evidence. It has nothing to do with it being "data." When you get a preservation order, then you know unequivocally that it must be preserved, and if you then don't preserve it, it's presumed that the content was whatever would be most favorable to your legal opponent. Otherwise you wouldn't have destroyed it.

          Having duct tape and zip ties might make you a serial killer, but probably not.

          You're even stupider than you look, and that's saying something.

          • Well there Dufus Dan, it turns out that it only kicks in once you know, or should have known, that it was evidence. It has nothing to do with it being "data."

            This is a case where any of that data could be evidence. Since you are so brilliant how exactly do you think they would go about screening for and only keeping that which is?

    • by mysidia ( 191772 )

      Judges have absolute immunity for all decisions and instructions made while acting as a judge. Therefore, a judge can never be sued or prosecuted for what they decide in their courtroom.

      As for any actions that have to be done to comply with an order - the judge can grant relief from any claims of breach of contract by offering a more-specific order. Anything the company is ordered to do by a judge cannot be claimed against them as a breach of contract.

      • Judges have absolutely immunity for all decisions and instructions made while acting as a judge.

        Yeah I know it was more of a rhetorical question. That said I think a lot of judges think they have a lot more power than they actually have, in an "I order Pi to be 3" sort of way. Kind of like politicians in that regard actually.

        Anything the company is ordered to do by a judge cannot be claimed against them as a breach of contract.

        That is only fair. It also makes sense for the company to vocally present themselves as blameless in the matter.

        • That said I think a lot of judges think they have a lot more power than they actually have, in an "I order Pi to be 3" sort of way.

          I think this concern doesn't really apply here. If your lawyers didn't consider that at some point you might have to retain records that you routinely destroy because they might end up as part of a discovery process, then your lawyers have failed you. I suspect that OpenAI already retains many of these logs anyway (for research and training purposes) and their primary concern is that complying with this order will force them to provide incriminating evidence.

          Your concern more applies to the "According to th

          • If your lawyers didn't consider that at some point you might have to retain records that you routinely destroy because they might end up as part of a discovery process, then your lawyers have failed you.

            I don't think it would be unreasonable to expect some better specificity from the judge. Retaining every bit (literally) of input and output for an operation that size has to be technically daunting and seems a bit like authorizing a fishing expedition on a grand scale. The judge also seems remarkably casual about invading the privacy of potentially millions of people. With judges like that who needs politicians?

            • by mysidia ( 191772 )

              Retaining every bit (literally) of input and output for an operation that size has to be technically daunting

              I believe OpenAI already retains all these logs for their own purposes (It's part of the product), and the court order just requires them to preserve what they have retained for an extended period of time - as in, Indefinitely.

              It is not "technically daunting" to retain data that their existing systems already retain, and make sure any normal data destruction or adulteration process Is al

            • If your lawyers didn't consider that at some point you might have to retain records that you routinely destroy because they might end up as part of a discovery process, then your lawyers have failed you.

              I don't think it would be unreasonable to expect some better specificity from the judge.

              When it comes to writing a contract and privacy policy, it would be unreasonable. The lawyers will, if they aren't incompetent, always add a clause that says the privacy policy does not apply to anything subject to court order.

              The judge also seems remarkably casual about invading the privacy of potentially millions of people.

              Clearly you've never worked or been involved in litigation. The logs would almost certainly be subject to a protective order, so the only people that will ever see them are the lawyers and the judge. The only exception might be a small number of excerpts that are presented as evidence

    • Never mind contracts. Type an address or phone number into a chat and, congrats, it is now PII, protected under the CCPA... not a contract, a law. And Californians have the right under that law to prevent it being shared in the first place and to have it deleted, should we so desire. A concerted effort to send PII, spam out opt-out and deletion orders, and then spamming out the ensuing enforcement claims might be a good way to stop up the works and screw with this overreaching judge and his delusions of

      • Retaining data to comply with a court order is not "sharing" of data.

        And if there are concerns about PII in the data, that merely creates a duty to protect the data from unauthorized disclosure. Disclosure to comply with a court order is not ever unauthorized.

        Don't be a moron. Make more intelligent arguments.

    • So how exactly does a judge get to force a business to breach all of their contracts?

      That's easy, the judge just tells them to breach all their contracts. After all, who decides if a contract is valid? A judge. How do you enforce a contract? A judge.

      Laws and court orders will always trump a contract.

    • by suutar ( 1860506 )

      If your contract doesn't have a clause mentioning that a court order can override any provision in it, your lawyer should be fired.

  • If Google does it for the government to be able to monitor people, why wouldn't the government make AI services keep this data for the government to monitor as well? You think the NSA is not going to monitor this stuff?
  • by Murdoch5 ( 1563847 ) on Wednesday June 04, 2025 @05:16PM (#65428092) Homepage
    I was literally on a phone call 30 minutes ago (it's 5:15pm EST), where we were talking about LLMs, and I mentioned you should run them locally, for privacy reasons. Everyone basically laughed and said, "Nah, just use ChatGPT.", and I warned them that if you don't control the environment, you can't guarantee privacy. Now look at what story showed up! You might need a high-end workstation, but given that it protects your company, product, and IP, just spend the $4k per computer.
    • You secretly think you alone control the Internet and it bends to your whim, admit it.
      • Hahaha nope, but I do control the privacy policy of my company, and I think people at the company need to pay attention.
        • by dfghjk ( 711126 )

          Working for a loser as an IT guy. Sorry, LEAD IT guy. SuperKendall could give you pointers on how to use emacs.

          • What would it matter if I run Emacs, unless I'm missing some point? Running a moderate model, 30B parameters, you can easily max out an entry-level computer, and without some decent graphics horsepower and system memory, you'll quickly be stuck in the mud. If you want to be productive with moderately sized models, you need the horsepower; it's a resource limitation concern.
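            For a rough sense of that resource wall, here's a back-of-envelope memory estimate; the parameter count and byte-per-parameter figures are illustrative assumptions, not measurements of any specific model:

            ```python
            # Rough memory footprint for the *weights* of a 30B-parameter model.
            # All numbers below are illustrative assumptions.
            params = 30e9

            fp16_gb = params * 2 / 1e9    # 2 bytes per parameter at 16-bit precision
            q4_gb = params * 0.5 / 1e9    # ~0.5 bytes per parameter with 4-bit quantization

            print(f"fp16: ~{fp16_gb:.0f} GB, 4-bit: ~{q4_gb:.0f} GB")
            # → fp16: ~60 GB, 4-bit: ~15 GB (weights only; the KV cache and
            #   activations during inference add more on top)
            ```

            Even aggressively quantized, that's more memory than most entry-level machines have, which is the point about needing real workstation hardware.
            
            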
    • Many years ago, when Motorola was in buyout talks with Google, they used Google docs extensively. One can only wonder if Google got a better deal because they were able to read Motorola's internal discussions. I don't know if they used Google docs for the discussions, but I do know there were quite a few people at the company who expressed no concern for the possibility that Google docs could leak proprietary information.

      I was literally on a phone call 30 minutes ago (it's 5:15pm EST), where we were talking about LLMs, and I mentioned you should run them locally, for privacy reasons. Everyone basically laughed and said, "Nah, just use ChatGPT.", and I warned them that if you don't control the environment, you can't guarantee privacy. Now look at what story showed up! You might need a high-end workstation, but given that it protects your company, product, and IP, just spend the $4k per computer.

      Not sure why anyone thought privacy was guaranteed using a web service running in America. You’d have to have a highly classified reason and authorization to assume so. Otherwise you live in the land of litigation. Anything can be subpoenaed with discovery. Even if you were a business nuking audit trails and user logs on the daily, it’s probably not legal for you to do so. Because litigation.

      • I don't know why. I'm in Canada, but why do you trust any company to hold your data, unless that company goes out of their way to prevent the data access?
    • was literally on a phone call 30 minutes ago (it's 5:15pm EST), where we were talking about...

      ... Now look at what story showed up!

      You should be used to that by now if you're putting your data into these kinds of systems.

    • I am absolutely amazed by the modern attitude of "Naw, we'll outsource storage of everything we do to a third party that we probably can't trust" that applies to everything. In the late 1990s the ISPs and email community took away our ability to self host email, in the 2000s they made it harder and harder to host blogs, and suddenly by the early 2010s EVERYTHING, even your Office documents, now need to "go to the cloud" because it's "simpler".

      If people had pushed back at the time, all of this could have bee

      • Recently, my company was going to buy a large Monday license block. No one read the privacy policy, which was so comically abusive that even Epstein would have pushed back. Here's the policy: https://monday.com/l/privacy/p... [monday.com]

        It says they will hunt down any information they can find about you, regardless if you consent. They will sell your data, they will abuse privacy protections that exist at the federal level. They will use invasive trackers to bug your browser, and so on. It's actually one of the
  • by wakeboarder ( 2695839 ) on Wednesday June 04, 2025 @05:19PM (#65428098)

    Enron didn't like keeping files either. Paper trails are bad for companies that could get sued. The funny thing is these AI companies keep most of the prompts anyway to improve training. So it really isn't about costs of storage, it's about legal liability.

    It's a lot easier to say I don't have that file if someone wants to take you to court.

    • This isn't a paper trail. It's the logged chat history of its users. All of its users.
    • by dfghjk ( 711126 )

      "So it really isn't about costs of storage..."

      Who said it was about the cost of storage? And prompts and user logs are not the same thing. But making unsubstantiated claims does make you sound smart....

    • Enron also didn't have data subject to privacy laws such as CCPA and GDPR. Want to even speculate as to how many users OpenAI has? How many of those users have at some point entered PII into a GPT session, even if they didn't realize it at the time? How many users have entered PII into apps that use OpenAI services or models? There is a whole world... no, an entire GALAXY... of difference between Enron's accounting ledgers and literally everything that anyone has typed into a ChatGPT (or services that u

    • by allo ( 1728082 )

      The storage cost? Many people upload a profile picture that is larger than an archive of their chatlogs.

    • The funny thing is these AI companies keep most of the prompts anyway to improve training.

      No, the use of prompts for training data is subject to the contract. Business clients can definitely opt out of their logs being used for training.

  • This is amazing. They are claiming that keeping the logs is a privacy risk to customers?

    Is their security *that* bad???

    If so, the order should include taking all of their systems and backups *completely* offline!

    • This is amazing. They are claiming that keeping the logs is a privacy risk to customers?

      Is their security *that* bad???

      It *is* if they are keeping the chat logs.

      If the fucking NSA leaks like a faulty tap every time some Private Pyle accidentally develops a conscience, what makes you think OAI could keep your secrets?

      • Keeping logs is normal.
        Further, their chat history is part of their product.

        What's at question here, is the expiration of those logs and subsequent deletion, which the court has ruled must stop, so that the plaintiff can scour more customer information in discovery.
        • by mysidia ( 191772 )

          Keeping logs is normal.
          Further, their chat history is part of their product.

          The logs that are part of the product ought to have been encrypted with a zero knowledge solution; for example: The customer's login password (while the provider knows only its hash), or a chosen key known to the customer or stored on the client but not the server is necessary to decrypt the chat history.
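            A minimal sketch of that scheme, using only the Python standard library. Everything here is hypothetical illustration: the SHA-256 counter-mode keystream is a toy stand-in for a real AEAD cipher such as AES-GCM, and none of these names correspond to any actual OpenAI system:

            ```python
            import hashlib, os

            def derive_key(password: str, salt: bytes) -> bytes:
                # Client-side: derive an encryption key from the user's password.
                return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

            def stream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
                # Toy stream cipher: SHA-256 in counter mode. Illustration only;
                # a real system would use an AEAD cipher like AES-GCM.
                keystream = bytearray()
                counter = 0
                while len(keystream) < len(data):
                    block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big"))
                    keystream.extend(block.digest())
                    counter += 1
                return bytes(a ^ b for a, b in zip(data, keystream))

            # Client encrypts before upload; the server stores ciphertext plus only
            # a password *hash*, so it can authenticate users but never decrypt logs.
            password = "correct horse battery staple"
            salt, nonce = os.urandom(16), os.urandom(16)
            key = derive_key(password, salt)
            ciphertext = stream_xor(key, nonce, b"user: summarize my contract...")
            server_stores = {"pw_hash": hashlib.sha256(password.encode()).hexdigest(),
                             "salt": salt, "nonce": nonce, "log": ciphertext}

            # Only a client holding the password can reconstruct the key and decrypt:
            assert stream_xor(key, nonce, ciphertext) == b"user: summarize my contract..."
            ```

            The catch, as the replies below note, is that the provider's own servers must see the plaintext to generate completions in the first place, so this only protects logs at rest after the fact.
            
            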

          • The logs that are part of the product ought to have been encrypted with a zero knowledge solution; for example: The customer's login password (while the provider knows only its hash), or a chosen key known to the customer or stored on the client but not the server is necessary to decrypt the chat history.

            The court would then simply order them to collect the customer passwords as they were used to decrypt the logs.
            Remember- OpenAI's system must be able to read this history- because the GPUs doing the token embedding are theirs, not the customer's.

            Further, the logs would then be useless to OpenAI for its own purposes. These logs are theirs, not the customer's. That does not mean they don't have a responsibility to the customer's privacy outside of their agreed to terms in the EULA.

            • by mysidia ( 191772 )

              The court would then simply order them to collect the customer passwords as they were used to decrypt the logs.

              Anyway, interesting theory. But it seems to be way outside normal court civil procedure or parties' power to request such a thing for the discovery process. That's about as extreme as suggesting a Judge would order installing keyloggers -- and it just doesn't happen, as there is no law allowing this, and is so outrageous it would likely be thrown out immediately. You can't be subpoena'd for dat

              • No, it just means you can't blah-blah your way out of obeying court orders, and if you bend over backwards to attempt to avoid compliance, you're just demanding the court go further and make you bend even more.

                See also: Legal canaries, and why they don't work and if you set up one strong enough you'd have put yourself in the position of having to lie in order to comply with the order. And there's no defense, because the existence of the canary shows that you anticipated the situation.

                • by mysidia ( 191772 )

                  No, it just means you can't blah-blah your way out of obeying court orders

                  No, but you can make records unavailable to be produced during discovery by not having access to them in the first place. The federal rules and procedures for discovery in civil cases are set by law, and courts don't just ignore procedures and the law to issue random writs.

                  See also: Legal canaries, and why they don't work

                  You are confusing the civil system with the criminal system. Canaries were a response to special powers granted

    • As someone who deals in PCI compliance on the regular, yes, the goal is to retain as little as possible.
      If someone forces me to retain credit cards I'm not using, it increases my liability, and yes, presents a privacy risk.

      Another example, is for CALEA purposes, I have to retain certain mapping logs that I'd prefer not to keep. Those are absolutely a privacy risk for people.
      If someone who is not you sues me, would you want me openly divulging your pornhub use frequency to the court?

      Your disconnect her
      • by mysidia ( 191772 )

        PCI compliance on the regular, yes, the goal is to retain as little as possible. If someone forces me to retain credit cards I'm not using, it increases my liability

        PCI requirements would be different since they involve systems that read and process authentication creds which are for someone else (Your systems are not the relying party the creds are to be presented to. And you don't own creds - they belong to an end user and a bank, And your use of these numbers will be regulated by agreements with your o

        • PCI requirements would be different since they involve systems that read and process authentication creds which are for someone else (Your systems are not the relying party the creds are to be presented to. And you don't own creds - they belong to an end user and a bank, And your use of these numbers will be regulated by agreements with your own merchant providers.), thus the storage is only necessary for a short period. That is a fundamental security tenet systems in the middle of an authentication exchange have to make sure authentication materials are destroyed as soon as possible. That's so well established that retaining unneeded auth data amounts to gross negligence.

          This is completely incorrect. PCI governs security practices of storing actual customer billing information, including credit card numbers.
          Storage is necessary for as long as it's used.

          You are talking out of your ass. Be gone.

          • by mysidia ( 191772 )

            This is completely incorrect. PCI governs security practices of storing actual customer billing information

            No. You are completely incorrect and full of bullshit. PCI specifically governs transmission and storage systems for payment card processing data, period; any other systems and data are out of scope of PCI. That is all.

      • As someone who deals in PCI compliance on the regular

        Over the years that you've been blathering nonsense on this site you've been an expert on everything. I assume what you mean is that one of the offices you clean does PCI compliance.

        Are you really suggesting that you comingle customer credit card numbers in logs containing the users other interactions with the service? And at the same time, that you're responsible for PCI compliance? Are you a fucking idiot? (Don't answer that)

        • Over the years that you've been blathering nonsense on this site you've been an expert on everything. I assume what you mean is that one of the offices you clean does PCI compliance.

          I have ~15,000 monthly-recurring customers (not me personally, but in my capacity as Chief Engineer of an organization). You will find that most Chief Engineers are experts in quite a few things. That's why they're the Chief.

          So no, the PCI compliance comes in from having to take recurring payments from many thousands of people.

          Are you really suggesting that you comingle customer credit card numbers in logs containing the users other interactions with the service? And at the same time, that you're responsible for PCI compliance? Are you a fucking idiot? (Don't answer that)

          Of course not. The principle of least retention was where it applied. You're clearly not intelligent enough to take the blocks I've given you and build it into a coherent

  • by Rosco P. Coltrane ( 209368 ) on Wednesday June 04, 2025 @05:27PM (#65428122)

    I swear to God, if I ever meet someone in the flesh who tells me something or someone has been slammed, blasted, destroyed, torched, trashed or grilled, I'm gonna punch them in the face.

It's impossible to read any headline these days without those toddler-level English verbs being used and abused all the fucking time. It's really annoying!

  • by Mirnotoriety ( 10462951 ) on Wednesday June 04, 2025 @05:42PM (#65428154)
I thought OpenAI already preserved chat logs. Else how did I get an email from head office suggesting I was in violation of terms?
Indeed. And if you believe that they aren't using your prompts to tune/train on EVEN IF YOU PAY AND SELECT PRIVATE, you are a fool. I trust them even less than I trust Google not to scan my email to offer ads. This is why I canceled my Google Apps service and moved to Protonmail.

    • by mysidia ( 191772 )

      That only shows they have logs and they are not immediately destroyed.

Your result does not show they preserve logs (keep them for an extended period of time, or indefinitely, even after you have deleted past sessions in your account's web dashboard).
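The distinction between "has logs" and "preserves logs" is basically a soft-delete scheme with a purge window. A minimal sketch of that idea, using an assumed 30-day window and a hypothetical `ChatLog` class (not OpenAI's actual retention design):

```python
from datetime import datetime, timedelta, timezone

PURGE_AFTER = timedelta(days=30)  # assumed retention window after user deletion

class ChatLog:
    def __init__(self, text):
        self.text = text
        self.deleted_at = None  # set when the user "deletes" the chat

    def delete(self, now):
        # User-facing delete: the row survives, merely flagged for later purging.
        self.deleted_at = now

    def is_purgeable(self, now):
        # Only after the window elapses is the data actually destroyed.
        return self.deleted_at is not None and now - self.deleted_at >= PURGE_AFTER

now = datetime(2025, 6, 4, tzinfo=timezone.utc)
log = ChatLog("example chat")
log.delete(now)
print(log.is_purgeable(now))                       # False: still on disk right after "delete"
print(log.is_purgeable(now + timedelta(days=31)))  # True: now eligible for real destruction
```

So a terms-of-service email only proves the log existed inside some such window, not that it survives the purge.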

Depends on your contract with ChatGPT. Also, your post is a non sequitur. Did the email include a copy of your chat log? Was the system scan for your breach run retrospectively, or at the time you used the system? You can absolutely send people violation notices without preserving chat logs.

Of course, this is the court order that they actually comply with, not the ones that matter to real life.
Including every bit of private data they can get their hands on, and then they act all huffy and righteous when they're ordered to turn some of it over. I sure as hell hope the courts don't buy into that BS.
Is that even technically doable in the long run? Short answer is yes... but...

I'd have to imagine that the space, bandwidth, and processing considerations for doing this would be costly... especially if the chat logs contain files. Even if only text is copied... with backups and encryption... then good luck to the lawyers who ask to review those chat logs... how would you process that amount of data? (Skipping over the legal and contract stuff... just looking at this from a technical viewpoint.)
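A back-of-envelope estimate makes the scale concrete. Every figure below is an assumption for illustration (user count, chat size, replication factor), not OpenAI's real numbers:

```python
# Rough estimate of what "preserve everything" costs in raw storage.
# All figures are assumptions for illustration only.

users = 400_000_000          # assumed weekly active users
chats_per_user_week = 10     # assumed conversations per user per week
avg_chat_bytes = 8_000       # assumed ~8 KB of text per conversation
replication = 3              # assumed copies for durability/backups

weekly_bytes = users * chats_per_user_week * avg_chat_bytes * replication
yearly_tb = weekly_bytes * 52 / 1e12

print(f"~{weekly_bytes / 1e12:.0f} TB/week, ~{yearly_tb:,.0f} TB/year")
# → ~96 TB/week, ~4,992 TB/year
```

Text alone is merely petabyte-scale per year under these assumptions; attached files, indexes for legal review, and encryption overhead would multiply it further.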

    • I'd like to donate some space to this effort. OpenAI can go ahead and delete the terabytes of data they have scraped from my websites over and over and over again as if their stupid bots are caught in a loop.

  • by ndykman ( 659315 ) on Wednesday June 04, 2025 @08:44PM (#65428434)

If you are engaged in a suit, you have to be very careful about what you retain; you do not want to be found to have deleted anything that the court ordered to be retained. And that is usually a very, very broad set of material.

    This does not mean that everything is made public or is entered into evidence. Also, if things are entered into evidence, they may redact personal information to protect third parties.

    And there is little that OpenAI can do to stop this. It's a well established practice. At some point, if you are a big enough company, you have to use solutions to manage all this.
     

Even if the court hasn't ordered it, once you're sued you have an obligation to retain anything which would be relevant to the lawsuit. It's called a litigation hold, and I work in a field where they're common. That obligation supersedes company data retention policies and requests from other parties (including the plaintiff; if the plaintiff e.g. uses Outlook's "recall message" feature to try and remove a message they sent us, we're required to ignore it and retain both that message and the "recall message" request).

  • by ledow ( 319597 )

If you send data to a remote service... regardless of the guarantees given... you have to assume that the remote service has received, processed, and most likely stored your data in some fashion.

    Literally things like the data protection acts and GDPR just assume this to be the case. If you gave data to a third party - that data is still your responsibility. If they have the potential to access it, you have to assume that they are/could be accessing it. If you give them permission to process it, they are required

  • A U.S. judge initially asked OpenAI to preserve only chats that users had deleted, a narrowly targeted legal hold intended to balance evidentiary needs with user privacy. But when OpenAI said it couldn't isolate those chats without risking violations of European privacy law, the court issued a sweeping, all-user retention order instead.

    At the heart of the dispute is a deeper design flaw: OpenAI built its system without distinguishing data by “service jurisdiction”, whether a conversation was han

    • by DarkOx ( 621550 )

      The entire shield agreement and current system of jurisdiction around digital services needs to be tossed. It really is unworkable.

Addressing it at the design phase is beyond us: siloing operations within national borders and geofencing users into their correct region is impossible, because the rules of the road change often, and often enough in pretty profound ways.

      The international agreements need to be simplified to 'where the server is is the jurisdiction that applies'.

      • by dbu ( 256902 )

        I get the impulse, a lot of people are fed up with legal chaos. "Server = jurisdiction" feels clean, simple, and under your control. But that model’s been dead since at least GDPR. That would require every country to agree or surrender their data laws to foreign servers. That’s never going to happen. These days, you’re accountable not just for where the server is, but where the user is and who they are.

        If you offer a service used in the EU, that service falls under EU law.
        If that service i

        • by DarkOx ( 621550 )

See, I don't think it should work that way.

If I offer a service, I should be governed EXCLUSIVELY by the laws of the place I am offering that service. That would be where my server is, because that is where the activity is taking place.

If Germany or the EU has a problem with it, it should be their responsibility to restrict their citizens from using services like mine, where their laws are not followed, or that are outside their jurisdiction or whatever. More reasonably, they should just educate their public: you bette

          • by dbu ( 256902 )

            The food analogy doesn’t quite hold. If you serve food in China, of course you follow Chinese rules.
            But if you export that food to the U.S. and it violates FDA standards, you can’t just say: “It’s America’s job to block it, I’m not responsible.”

            That’s not how cross-border regulation works. Whether it’s baby formula, toys, or online services, if you reach into another market, you’re expected to comply with that market’s laws.

            The same principle
