Microsoft's OneDrive Begins Testing Face-Recognizing AI for Photos (for Some Preview Users) (microsoft.com)

I uploaded a photo on my phone to Microsoft's "OneDrive" file-hosting app — and there was a surprise waiting under Privacy and Permissions. "OneDrive uses AI to recognize faces in your photos..."

And...

"You can only turn off this setting 3 times a year."

[Screenshot of the OneDrive setting, hosted on Imgur]

If I moved the slidebar for that setting to the left (for "No"), it moved back to the right, and said "Something went wrong while updating this setting." (Apparently it's not one of those three times of the year.)

The feature is already rolling out to a limited number of users in a preview, a Microsoft publicist confirmed to Slashdot. (For the record, I don't remember signing up for this face-recognizing "preview".) But there's a link at the bottom of the screen for a "Microsoft Privacy Statement" that leads to a Microsoft support page, which says instead that "This feature is coming soon and is yet to be released." And in the next sentence it's been saying "Stay tuned for more updates" for almost two years...

A Microsoft publicist agreed to answer Slashdot's questions...



Slashdot: What's the reason OneDrive tells users this setting can only be turned off 3 times a year? (And are those any three times — or does that mean three specific days, like Christmas, New Year's Day, etc.)

[Microsoft's publicist chose not to answer this question.]

Slashdot: If I move the slidebar to the left (for "No"), it moves back to the right, and says "Something went wrong while updating this setting." So is it correct to say that there's no way for users to select "No" now?

Microsoft: We haven't heard about the experience you are having with toggling, but our Microsoft contacts would like to investigate why this is happening for you. Can you share what type of device you are using, so we can put you in touch with the right team?

Slashdot: Is this feature really still "coming soon"? Can you give me more specific details on when "soon" will be?

Microsoft: This feature is currently rolling out to limited users in a preview so we can learn and improve. We have nothing more to share at this time.

Slashdot: I want to confirm something about how this feature is "yet to be released." Does this mean that currently OneDrive is not (and has never) used AI to "recognize" faces in photos?

Microsoft: Privacy is built into all Microsoft OneDrive experiences. Microsoft OneDrive services adhere to the Microsoft Privacy Statement and follow Microsoft's compliance with General Data Protection Regulation and the Microsoft EU Data Boundary.

Slashdot: Some privacy advocates prefer "opt-in" features, but it looks like here OneDrive is planning a (limited) opt-out feature. What is the reasoning for going with opt-out rather than opt-in?

Microsoft: Microsoft OneDrive inherits privacy features and settings from Microsoft 365 and SharePoint, where applicable.


Slashdot also spoke to EFF security/privacy activist Thorin Klosowski, who expressed concerns. "Any feature related to privacy really should be opt-in and companies should provide clear documentation so its users can understand the risks and benefits to make that choice for themselves."

Microsoft's "three times a year" policy also seemed limiting to Klosowski. "People should also be able to change those settings at-will whenever possible because we all encounter circumstances where we need to re-evaluate and possibly change our privacy settings."
Comments:
  • Wow, just wow (Score:5, Insightful)

    by RitchCraft ( 6454710 ) on Saturday October 11, 2025 @10:43AM (#65718448)

    Loving Microsoft yet?

    • >"Loving Microsoft yet?"

      I don't use any of their products. And there are many good reasons for that.

  • by Chuck Hamlin ( 6194058 ) on Saturday October 11, 2025 @10:51AM (#65718462)
    I would only do an rclone encrypted volume. And keep separate copies.
  • by Cajun Hell ( 725246 ) on Saturday October 11, 2025 @11:03AM (#65718470) Homepage Journal

    What's the reason OneDrive tells users this setting can only be turned off 3 times a year?

    Because that's what their customers are demanding! Don't you hate when you're doing something, and you realize you've done it more than 3 times? Just yesterday I adjusted the mirror on my wife's SUV and thought "we keep undoing each other's mirror adjustments. Can't it just stop moving so that one of us permanently loses and one permanently wins? Why is this car letting us change it back'n'forth?"

    Microsoft fights for the users!

    • by edavid ( 1045092 )
      But they will never ask permission to the people on pictures. And THAT is the great https://www.youtube.com/shorts... [youtube.com].
      • My thoughts exactly: I have nothing to do with Microsoft, its products or services. How does it know that I do not want my face to be recognised? The only way that it can get it right is opt-in of registered users only -- that will never happen, just as AI companies will not respect copyright.

    • by AmiMoJo ( 196126 ) on Saturday October 11, 2025 @12:32PM (#65718586) Homepage Journal

      There might be a more benign reason for it. In GDPR countries, if you turn it off they will probably need to delete all the biometric data. If you then turn it back on again, it will have to regenerate all the biometric data and re-scan every photo. If people toggle it too often, it's going to consume a large amount of CPU time.

      You can confirm it by using an open source facial recognition tool, like the one built into Immich. Importing photos takes much, much longer if you have face recognition turned on.

      Of course a more sensible way to do it would be to allow the user to toggle it whenever they want, with the caveat that if they turn it back on, it might take a long time to start working, or might only apply to new photos after the initial back-catalogue freebie.

      Or they could just be being dicks.

      • by RegistrationIsDumb83 ( 6517138 ) on Saturday October 11, 2025 @01:50PM (#65718724)
        Plausible, but then they should have limited the number of times it can be turned on in that scenario.
      • I was thinking that perhaps in some jurisdiction, having the slider means the user has "consented" to their data being scanned, or may accidentally "consent" if they look at or touch it.

        • by AmiMoJo ( 196126 )

          Under GDPR it has to be explicit and clearly indicated, but yeah not everywhere is as good as that.

      • Of course a more sensible way to do it would be to allow the user to toggle it whenever they want, with the caveat that if they turn it back on, it might take a long time to start working, or might only apply to new photos after the initial back-catalogue freebie.

        Or they could just be being dicks.

        Of course, the most sensible way would be to say "hey, you keep dithering on this setting, it's clearly not something you confidently want, so we're going to go ahead and shut it off and if you really want it back, you're going to have to grow a year older and wiser first."

        • I also suspect there is a technical+legal reason for the bizarre limitation. A better way to handle all this: it should be OFF by default. You can turn it ON. If you toggle it OFF and ON a couple more times within X time (a year, whatever), then the button should be grayed out, stuck to OFF. There can be a link explaining why (you are limited to X toggles...) and maybe the date it can be toggled back on.

          Back when I was on MacOS (my last time ended in 2021, I think?) I recall iPhoto/Apple Photos coming up colle...
      • In which case I think it should be off by default and you should be able to enable it only 3 times a year, to reduce the CPU time consumed by someone enabling and disabling it multiple times.
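
For what it's worth, the policy this sub-thread converges on (off by default, a capped number of changes per rolling year, and a control that grays itself out with an explanation and a retry date once the cap is hit) is easy to express. Here is a minimal Python sketch of that idea; the class name, the 3-change cap, and the 365-day window are illustrative assumptions, and none of it reflects how OneDrive actually implements its limit.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional

WINDOW = timedelta(days=365)   # rolling window the cap applies to (assumption)
MAX_CHANGES = 3                # hypothetical cap, echoing the "3 times a year" wording


@dataclass
class FaceRecognitionToggle:
    """Hypothetical opt-in toggle with a capped number of changes per rolling year."""
    enabled: bool = False                            # off by default (opt-in)
    changes: list = field(default_factory=list)      # timestamps of past state changes

    def _recent(self, now: datetime) -> list:
        return [t for t in self.changes if now - t < WINDOW]

    def can_change(self, now: datetime) -> bool:
        return len(self._recent(now)) < MAX_CHANGES

    def next_allowed(self, now: datetime) -> Optional[datetime]:
        """Date the control un-grays, for the explanatory link suggested above."""
        recent = self._recent(now)
        if len(recent) < MAX_CHANGES:
            return None                              # no waiting needed
        return min(recent) + WINDOW                  # oldest change ages out of the window

    def set(self, value: bool, now: datetime) -> bool:
        if value == self.enabled:
            return True                              # no-op flip, costs nothing
        if not self.can_change(now):
            return False                             # UI would gray the control out here
        self.changes = self._recent(now) + [now]
        self.enabled = value
        return True
```

Only genuine state changes consume the budget in set(), and next_allowed() supplies the date for the "why is this grayed out" link the comment above suggests.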

  • by Big Hairy Gorilla ( 9839972 ) on Saturday October 11, 2025 @11:19AM (#65718488)
    .. and you will pay for the privilege.

    Will wait here for garbz to explain why that's a good thing.
  • Don't let them (Score:5, Insightful)

    by LainTouko ( 926420 ) on Saturday October 11, 2025 @11:22AM (#65718492)

    I suggest encrypting anything stored in cloud systems using some key which is either based on a thoroughly memorised passphrase or stored in several different drives in multiple physical locations.

    But note that this is not a solution for people with less technical knowledge than slashdotters, and these people deserve their privacy too. It is a solution to personal problems, but not a solution to this social problem.

    • I was using a cloud-to-network drive application with encrypted data on Microsoft Live, about 18 years ago. Way back then, Microsoft recognized the lack of data structure and asked if the data was corrupted. I clicked "No" and Microsoft never bothered me again. Nowadays, using the Live UI is faster than a middleware application, although it's slightly more work.
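
In the spirit of LainTouko's suggestion above (and the rclone encrypted-volume idea further up), here is a minimal sketch of encrypting files with a passphrase-derived key before they ever reach a synced folder. It assumes the third-party Python cryptography package; the 16-byte salt prefix, the iteration count, and the .enc naming are illustrative choices, not a vetted design.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    """Derive a Fernet key from a memorised passphrase via PBKDF2-HMAC-SHA256."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=1_200_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))


def encrypt_file(path: str, passphrase: str) -> str:
    """Write an encrypted sibling of `path`; only the .enc copy goes in the synced folder."""
    salt = os.urandom(16)                    # random per-file salt, stored with the ciphertext
    with open(path, "rb") as f:
        token = Fernet(key_from_passphrase(passphrase, salt)).encrypt(f.read())
    out = path + ".enc"
    with open(out, "wb") as f:
        f.write(salt + token)                # 16-byte salt prefix, then the Fernet token
    return out


def decrypt_file(enc_path: str, passphrase: str) -> bytes:
    """Recover the original bytes from an .enc file written by encrypt_file()."""
    with open(enc_path, "rb") as f:
        blob = f.read()
    salt, token = blob[:16], blob[16:]
    return Fernet(key_from_passphrase(passphrase, salt)).decrypt(token)
```

Usage would be as simple as calling encrypt_file("vacation.jpg", passphrase) and letting only the .enc output sync; the tradeoff is that server-side features (previews, search, and of course face grouping) stop working on the encrypted copies, which is rather the point.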
  • Facial Recognition (Score:5, Interesting)

    by OtisSnerd ( 600854 ) on Saturday October 11, 2025 @11:24AM (#65718494)
    This is kind of the reason I deleted all my photos I was sharing through OneDrive last year. Since they're going to do facial recognition, I uploaded ONE photo, one of an old IBM mainframe CPU card (S/370 9672 - G1, from 1995). Let them facial recognize that. I have plenty more personal retro computing hardware, should they want better images of me. Maybe some of those slop AI people would be fun to upload as well.
  • ...people choose it voluntarily and often pay for it
    If something is forced on people and is difficult or impossible to turn off, it's rarely good

  • Q&A (Score:4, Informative)

    by Calydor ( 739835 ) on Saturday October 11, 2025 @11:46AM (#65718538)

    It's not a Q&A if the guy getting the questions doesn't actually provide an answer to any of them. The answers are nothing-was-actually-said drivel.

    • The answers are nothing-was-actually-said drivel.

      Indeed, and exactly what thinking people expect from a PR flack for a corporation which produced and distributed literally the most insidious spyware of all time.

      • Asking questions is a skill. That's why some journalists are paid the big bucks (*)

        NOT asking questions is also a skill. That's why some OTHER journalists are paid the big bucks

    • The Q&A responses from the “Publicist” seem suspiciously like AI chatbot responses.
  • Amazon Photos has been doing this for many years.

    What's the big deal?

  • by jopet ( 538074 )

    just don't use Microsoft Crap

  • This clearly comes down from the top, "force AI into everything and show me uptake, or else ...", so the grunts just shovel it in there and force it.

    Meanwhile the market is clapping because the morons at the top lucked into the money printer that is the cloud transition. Also, 365/Azure are handled with some competence ... while Windows and the consumer applications are done by the teams with the least political power, which get forced to do these AI experiments and use crap like WinUI ("dogfooding" is clearly...

    • Or could it be that their current AI is built on illegal material and they are quietly trying to build a legal LLM? How does AI have pictures of public people without somehow, somewhere violating copyright?
  • Orwell foreshadowed our growing dystopia so thoroughly that he became eponymous: "Orwellian". And in our glaring stupidity, we had already subordinated the rights of the actual subject to the rights of the likeness/image taker. Your image in photos is not your own, for example. It belongs to the photographer, exclusively. Orwell understood that Big Brother isn't some esoteric government abstraction. Big Brother is us, by willful ignorance and negligence. None could be worse.
  • by Growlley ( 6732614 ) on Saturday October 11, 2025 @01:29PM (#65718696)
    and it will be turned back on with every forced update, which will happen 4 times a year!
  • by PPH ( 736903 ) on Saturday October 11, 2025 @01:54PM (#65718728)

    ... for the Microsoft publicist: Are you a 'bot?

    • by caseih ( 160668 )

      Sure sounded like one. The questions posed could have been worded more clearly with less room for wiggling. But the bot did an excellent job of avoiding all the questions.

  • by Rick Zeman ( 15628 ) on Saturday October 11, 2025 @01:58PM (#65718736)

    Anyone remember tub girl?

  • I'm a Windows user and have always turned OneDrive off.
    There are other very secure platforms for this that don't have the bias that a Microsoft operation will have.

  • by gweihir ( 88907 ) on Saturday October 11, 2025 @03:15PM (#65718834)

    Unless you want all your data analyzed, used, sold and lost.

  • by ebunga ( 95613 ) on Saturday October 11, 2025 @04:48PM (#65718944)

    You're going to do what Microsoft tells you to do. They take what they want, and you're just going to wake up sore and confused the next morning.

  • Wait, what? (Score:4, Interesting)

    by Charlotte ( 16886 ) on Saturday October 11, 2025 @05:14PM (#65718978)

    Slashdot can do images? When was that feature added?

  • by david.emery ( 127135 ) on Saturday October 11, 2025 @05:56PM (#65719034)

    What's the step beyond Enshittification? Microsoftification? Oracalization? Tech Bro-ification? Because we're clearly moving past Enshittification to something even worse.

  • They are not even hiding that they scan the content you put in there. Why are you surprised they now scan for even more purposes?

  • if the bot had just answered "Go fuck yourself" to every question.
  • If I moved the slidebar for that setting to the left (for "No"), it moved back to the right, and said

    "That's once."

  • But they do the recognition behind the scenes for everyone regardless of that setting. Prove me wrong.
  • This is the way of the internet now. Thanks, UK.gov and Imgur.

  • Kudos to EditorDavid and Slashdot for performing some actual, original journalism and showing some technical capabilities in Slashcode that are infrequently visible to the users.

    Implementation is a bit spotty, though, with the screenshot hosted on Imgur -- meaning UK 'dotters can't see it.

  • Of course there's no easy way to use a screenshot directory that isn't in OneDrive. That's great.

  • I hope Slashdot does more stuff like this.

  • As I commented on the "Amazon's Giant Ads Have Ruined the Echo Show" story posted here a few days ago, if you lie down with dogs you get up with fleas. And Microsoft is very much a dirty dog with many big, hungry fleas.

    Ditch that crap and try Proton Drive, or some other privacy-respecting service. Proton Drive isn't yet the most convenient service, but at least it doesn't bend you over and start thrusting at every possible opportunity.
