Sensitive Data Stored On Box.com Accounts Accessible Via Search Queries (threatpost.com)

msm1267 writes: Last week Box.com moved quickly and quietly to block search engines from indexing links to confidential data owned by its users. The move came after security researcher Markus Neis surfaced private data belonging to a number of Fortune 500 companies via Google, Bing, and other search engines. Box.com said it's a classic case of users accidentally oversharing. Neis isn't convinced, and says Box.com's so-called Collaboration links shouldn't have been indexed in the first place. Box.com has since blocked access to what security researchers describe as a treasure trove of confidential data and fodder for phishing scams.
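For context, Box shared links all live under a common path (app.box.com/s/...), which is what makes a scoped search query effective at pulling them out of an index. The exact queries used weren't published in the summary, but they would have been along these lines (the keyword is purely illustrative):

    site:app.box.com/s confidential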
Comments:
  • by Anonymous Coward on Tuesday January 03, 2017 @08:08PM (#53601411)

    Don't let someone else have custody of your data.

    People are so stupid.

    • by Anonymous Coward

      But they said it was in a box, not a cloud. I'm confused.

    • Don't let a bank have custody of your money. Keep it safe under your mattress. For all but security professionals, cloud storage is the most secure option available. Some super-rich people keep bars of gold in their homes, but for everybody else a bank account is the right place to store money.
    • by antdude ( 79039 )

      My former security workplace used box.com. :(

    • In Brazil there was some legislation (abrogated by the current administration, which is the product of a coup...) trying to discourage the use of "clouds" hosted in other countries (where Brazilian legal jurisdiction can't reach)...
  • I'd like to look through the data. For science, yeah, that's the ticket...

  • by Anonymous Coward

    I'm with box.com on this. If users overshare their links, why should it be box.com's or the search engines' responsibility to know that the information is confidential and prevent indexing? Why should the idiots who overshare get preferential treatment over people who want their information to be public and deliberately post links where search engines will index them, so the content can be found?

    • by stephanruby ( 542433 ) on Tuesday January 03, 2017 @09:52PM (#53601859)

      Box should just have served a robots.txt that disallows everything by default. It's not that hard [robotsgenerator.com].

      It's a given that users, whether they know it or not, are going to leak private URLs to search engines. The Alexa toolbar, the Google toolbar, Microsoft's browser, etc. all leak that kind of information. This is not a new problem. This is why the robots.txt file exists: not to inform hackers of the exact links they must not index, but to tell search engines that if they find themselves on a particular domain, or in a particular directory, they should not index any file or folder below that level.
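      As a sketch, the catch-all robots.txt described above is just two lines (note the syntax is "Disallow: /", not "/*"):

          User-agent: *
          Disallow: /

      Paths a user deliberately publishes could then be opened up selectively, e.g. via the Allow rule the big crawlers honor, while everything else stays out of compliant indexes by default.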

      • by Anonymous Coward

        While robots.txt lets companies keep URLs like these out of well-behaved search engines, it does nothing against rogue indexing services. Take Baidu, for example: I've seen them disregard robots.txt, which means URLs like these might very well leak and get indexed through any number of channels; browser add-ons, email scanning, URL shorteners, etc.

        Sharing using "secret links" is highly insecure and should not be possible if the information behind it is confidential.

      • Digest access authentication would have stopped it cold; a lot of spiders use robots.txt as a hint to where the good stuff might be.
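        For illustration, a front end like Apache's mod_auth_digest could enforce that on every shared-file path; the path, realm, and user file below are hypothetical, not anything Box actually uses:

            # Require digest auth for everything under /s/ (hypothetical path)
            <Location "/s/">
                AuthType Digest
                AuthName "shares"
                AuthDigestProvider file
                # Populate with: htdigest -c /etc/apache2/.htdigest shares alice
                AuthUserFile /etc/apache2/.htdigest
                Require valid-user
            </Location>

        Unlike robots.txt, this fails closed: a crawler (or anyone else) hitting a leaked URL without credentials gets a 401, not the file.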

  • Box.com said it's a classic case of users accidentally oversharing.

    Next time you feel trepidation about oversharing, remember, someone once said in a meeting, "Let's make a film with a tornado full of sharks."

  • by Anonymous Coward

    TFA was lacking in technical details, and I don't see any information about what Box has done to prevent these URLs from being indexed in the future. If they just tweaked their robots.txt or something, that isn't going to cut it. People who use Chrome will still be leaking these URLs directly to Google, which in turn will still index them. See here [google.com] (search for "Chrome sends" for just a taste of what the Chrome spyware transmits back to its mother ship).
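    For what it's worth, robots.txt only stops compliant crawlers from fetching pages; a URL that leaks through Chrome can still show up in the index as a bare link. The header-level fix is to send X-Robots-Tag on every response for these paths, which tells compliant engines to drop the page entirely (a sketch of the response header, not Box's actual configuration):

        HTTP/1.1 200 OK
        X-Robots-Tag: noindex, nofollow

    Of course that only helps against crawlers that honor it; anything genuinely confidential needs authentication, not indexing hints.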

  • by SethJohnson ( 112166 ) on Tuesday January 03, 2017 @08:53PM (#53601583) Homepage Journal
    In Atlassian's chat client HipChat, if a user transmits a file to another user, the file is stored on Amazon S3 (much as Box sounds like it's doing) and is accessible via an obfuscated URL. The files are then available to any unauthenticated GET request that stumbles upon the URL string via brute force.

    A clever attacker doesn't even need to use her own resources in the brute-force attack. A website can be constructed with millions of links pointing at candidate URLs; eventually Google and other indexers will spider them, and the ones that don't turn up 404 errors will be added to the web index.
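    A back-of-the-envelope check on how feasible that is: it depends entirely on the token's length and alphabet. A minimal sketch in Python, assuming alphanumeric tokens (the lengths are hypothetical, not HipChat's or Box's actual format):

        import math

        ALPHABET = 62  # [A-Za-z0-9]
        for length in (8, 16, 32):
            space = ALPHABET ** length           # candidate URLs to cover
            bits = length * math.log2(ALPHABET)  # entropy of one token
            print(f"{length}-char token: {space:.2e} URLs (~{bits:.0f} bits)")

    At 8 characters (~48 bits) a patient link farm has a fighting chance; at 32 characters (~190 bits) stumbling onto a live URL is not a realistic attack, so the real-world risk hinges on how short the obfuscated part actually is.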
  • And what do they do besides charge money for something you can do for free with a terminal, and do it poorly?
