Artists Are Deleting Instagram For New App Cara In Protest of Meta AI Scraping (fastcompany.com)

Some artists are jumping ship for the anti-AI portfolio app Cara after Meta began using Instagram content to train its AI models. Fast Company explains: The portfolio app bills itself as a platform that protects artists' images from being used to train AI, and allows AI content to be posted only if it's clearly labeled. Based on the number of new users the Cara app has garnered over the past few days, there seems to be a need. Between May 31 and June 2, Cara's user base tripled from fewer than 100,000 to more than 300,000 profiles, skyrocketing to the top of the app store. [...] Cara is a social networking app for creatives, in which users can post images of their artwork, memes, or just their own text-based musings. It shares similarities with major social platforms like X (formerly Twitter) and Instagram on a few fronts. Users can access Cara through a mobile app or on a browser. Both options are free to use. The UI itself is like an arts-centric combination of X and Instagram. In fact, some UI elements seem like they were pulled directly from other social media sites. (It's not the most innovative approach, but it is strategic: as a new app, any barriers to potential adoption need to be low.)

Cara doesn't train any AI models on its content, nor does it allow third parties to do so. According to Cara's FAQ page, the app aims to protect its users from AI scraping by automatically applying "NoAI" tags to all of its posts. The website says these tags "are intended to tell AI scrapers not to scrape from Cara." Ultimately, they appear to be HTML metadata tags that politely ask bad actors not to get up to any funny business, and it's pretty unlikely that they hold any actual legal weight. Cara admits as much, too, warning its users that the tags aren't a "fully comprehensive solution and won't completely prevent dedicated scrapers." With that in mind, Cara assesses the "NoAI" tagging system as "a necessary first step in building a space that is actually welcoming to artists -- one that respects them as creators and doesn't opt their work into unethical AI scraping without their consent."
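For the curious, a "NoAI" directive is typically expressed as a robots-style meta tag in a page's head. The sketch below shows how a well-behaved scraper could check for one; the tag names ("noai", "noimageai") follow the convention popularized by DeviantArt, and Cara's exact markup is an assumption here, not something confirmed by the article.

```python
# Minimal sketch of checking a page for a "NoAI" opt-out directive.
# Assumes the DeviantArt-style convention: <meta name="robots" content="noai, noimageai">.
from html.parser import HTMLParser


class NoAIMetaParser(HTMLParser):
    """Collects robots-style meta directives from a page."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "robots":
                for token in (a.get("content") or "").lower().split(","):
                    self.directives.add(token.strip())


def page_opts_out_of_ai(html: str) -> bool:
    """True if the page carries a noai/noimageai robots directive."""
    parser = NoAIMetaParser()
    parser.feed(html)
    return bool(parser.directives & {"noai", "noimageai"})


sample = '<html><head><meta name="robots" content="noai, noimageai"></head></html>'
print(page_opts_out_of_ai(sample))  # True
```

As the article notes, nothing forces a scraper to run a check like this; the tag only works against crawlers that choose to look for it.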

In December, Cara launched another tool called Cara Glaze to defend its artists' work against scrapers. (Users can only use it a select number of times.) Glaze, developed by the SAND Lab at the University of Chicago, makes it much more difficult for AI models to accurately understand and mimic an artist's personal style. The tool works by learning how AI bots perceive artwork, and then making a set of minimal changes that are invisible to the human eye but confusing to the AI model. The AI bot then has trouble "translating" the art style and generates warped recreations. In the future, Cara also plans to implement Nightshade, another University of Chicago tool that helps protect artwork against AI scrapers. Nightshade "poisons" AI training data by adding invisible pixels to artwork that can cause AI software to completely misunderstand the image. Beyond establishing shields against data mining, Cara also uses a third-party service to detect and moderate any AI artwork that's posted to the site. Non-human artwork is forbidden, unless it's been properly labeled by the poster.
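Glaze's actual optimization is far more involved, but the imperceptibility constraint at its core can be sketched in a few lines. The toy stand-in below just adds random noise bounded to +/-2 on a 0-255 pixel scale; the real tool instead optimizes the perturbation against a feature extractor so that models embed the image differently while humans see no change.

```python
# Toy illustration of the "invisible perturbation" idea behind Glaze/Nightshade.
# NOT the real algorithm: Glaze optimizes against a model; this only shows the
# tight bound that keeps the change imperceptible to humans.
import numpy as np


def cloak(image: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add per-pixel noise bounded by +/-epsilon (on a 0-255 scale)."""
    rng = np.random.default_rng(seed)
    delta = rng.uniform(-epsilon, epsilon, size=image.shape)
    return np.clip(image + delta, 0, 255)


art = np.full((64, 64, 3), 128.0)  # stand-in for an artwork
cloaked = cloak(art)

# The per-pixel change stays far below what a human eye notices on a 0-255 scale.
print(float(np.abs(cloaked - art).max()) <= 2.0)  # True
```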


Comments Filter:
  • by pixelpusher220 ( 529617 ) on Thursday June 06, 2024 @07:28PM (#64529133)
    Running from Meta is understandable. Running to ANOTHER privately held silo pretending it won't end up just as bad is just...ugh
    • It doesn't really matter...You don't have anything to worry about unless you're British, because AI only generates likenesses of British people. Just look at their teeth, it's a dead giveaway.

    • by Anonymous Coward
      But this one is different. They are asking POLITELY that you don't use their images for training AI.

      Because that is the only thing that was missing.
      • Re: (Score:3, Informative)

        by Rei ( 128717 )

        Most trainers are happy to honour opt-out requests. It's demands for opt-in that are problematic. It's akin to robots.txt. Nobody says you have to honour it, but most scrapers are perfectly happy to.

        That's not the silly part. The silly part is that they're still pushing this glaze / nightshade snake oil. Hey, how long have these tools been out? Does it look like they've ruined AI art yet? ;) New models just keep getting better and better, because all those tools do is ruin the quality of your work.
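The robots.txt comparison in the comment above is easy to make concrete: compliance is entirely voluntary, but Python's standard library makes honouring an opt-out trivial for any scraper that chooses to. The rules below are hypothetical, not any real site's robots.txt.

```python
# Honouring a robots.txt opt-out with the standard library.
# The rules and URLs here are hypothetical examples.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("MyScraper", "https://example.com/gallery"))    # True
print(rp.can_fetch("MyScraper", "https://example.com/private/x"))  # False
```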

    • by ShanghaiBill ( 739463 ) on Thursday June 06, 2024 @07:45PM (#64529169)

      Exactly.

      Artists need to find new ways to prevent people from looking at their creations.

      Perhaps locking them in a vault or burning them.

      • Exactly.

        Artists need to find new ways to prevent people from looking at their creations.

        Perhaps locking them in a vault or burning them.

        EXACTLY.

        Instead of expressing themselves in a physical viewable medium, perhaps they could simply store them protected inside their own brain. Why THAT would solve everything!

        Artist: "I have this wonderful portrait that I'm going to sell you, but it exists only inside my brain so you'll have to take my word on that while you still pay me for it. It's really great and worth it, though -- much better than owning a NFT."

        Elon: "Don't worry -- that's not [scitechdaily.com] a problem. [nextnature.net]"

    • You cannot 'ask' an AI how it generated that image or wrote that article. This is the very thing that distinguishes AI from expert systems, which can tell you what rules were invoked. So when an artist says "you stole my idea/style/whatever," the company can say "go prove it," and who knows, they may even get costs, because someone like the NYT went out of their way to 'salt' the learning side. More so if the training database is updated daily. It's just a pity that in most countries 'NoAI' flags are practically unenforceable. What CAN be done is t
    • If you check out Jingna Zhang's history and her pull, you'll see that it's in good hands.

      All that's left is adoption and a model for profitability.

      • It's still a walled silo that makes you the product; enshittification will come, it's just a matter of when.

        Open protocols so that if I don't like their service I can leave and keep using the protocol...like email or the Fediverse on ActivityPub protocol.

      I can follow people on Threads via Mastodon [joinmastodon.org] or PeerTube [joinpeertube.org] or PixelFed [pixelfed.org] or dozens of other services. I'm not on Threads and don't have an account, but via the protocol I can interact with them. Just like Gmail can email Yahoo or mypersonalemail.net.
  • by PubJeezy ( 10299395 ) on Thursday June 06, 2024 @08:50PM (#64529259)
    This isn't a plausible article. The implication is that they think all artists are morons. Or that they think social media accounts with the word "artists" in their bio are actually real people. Free social media has a game theory problem.

    If users aren't paying for the platform, then they're the product. It's not a new or clever thing to say but here they are, pretending we're too dumb to remember.

    Every marketing company builds a swarm of accounts on every new platform. They do it at a rate that exceeds organic users greatly. This means that all new platforms will be filled with spam BEFORE the user base can actually develop.

    I'm not saying Cara as a platform doesn't have anything to offer anyone, I'm saying it's 100% impossible that there's any chance it could ever have anything to offer the userbase. This article is just pay2win garbage. Yuck.
    • by znrt ( 2424692 )

      This isn't a plausible article.

      most of the few people i follow on instagram are artists. a few of them have already announced they're moving to cara, so for now it seems pretty plausible.

      just out of curiosity i read around a bit about cara. it seems they ... ahem, use ai to detect ai-generated art, and as you would expect false positives are already a big issue. i don't think this will really work, not just because of the mechanics but because it's the wrong approach all around, it's stupid to try and resist or lock up ai because it's already h

      • Sometimes scrambled eggs make the best still life paintings.

        Ensure your financial seatbelt is fastened. Confirm your Easter budget aligns with your financial runway. Verify that existing Easter treasures are stowed away. No unnecessary purchases allowed! Examine homemade Easter baskets and dyed eggs. DIY readiness: green light. Gather loved ones for egg hunts and potluck feasts. Confirm RSVPs. Stock up on discounted Easter items for future flights. Tune in to the frequency. Ready your heart for gridiron

  • Opportunities! (Score:4, Interesting)

    by Kernel Kurtz ( 182424 ) on Thursday June 06, 2024 @08:51PM (#64529265)

    The tool works by learning how AI bots perceive artwork

    They could use AI for this!

  • It's ironic that artists, who presumably eat better when more people see their work, are taking their ball and going home. Presumably only other similarly principled artists are going to be their audience now. I understand where their frustration comes from, and I understand how they are afraid of what is happening. It's just not a good idea for them, and I think they're fooling themselves to think this will help.
  • by Rosco P. Coltrane ( 209368 ) on Friday June 07, 2024 @03:09AM (#64529645)

    Cara will strike a deal with OpenAI to monetize "their" content.

    If you think they'll resist the lure of money - or the need, depending on how deep in debt they are - you're a very naive "content creator".

    Also, nothing prevents anybody from scraping their site for free. That's what all the other big data sumbitches have been doing for years without asking anybody's permission, and that's how AI happened in the first place.

  • by bleedingobvious ( 6265230 ) on Friday June 07, 2024 @03:53AM (#64529695)

    You have pretty much failed to a degree that would make an amoeba laugh at you.

    Maybe the threat of taking your "art" elsewhere isn't as dramatic as you imagine?

    • You have pretty much failed to a degree that would make an amoeba laugh at you.

      Maybe the threat of taking your "art" elsewhere isn't as dramatic as you imagine?

      If your final product is an image, and it's on the Intertoobz, it is free to the world.

      If you are an artist, and don't want it scraped, make something that is internet proof - real tangible objects. Hard to do internet AI scraping on a sculpture or wood product or even images that incorporate materials into them.

  • If you want to discourage scraping, clearly labeling all AI artwork is the opposite of what you should do. If an LLM ingests data generated by another LLM, it quickly becomes unstable and starts hallucinating. That's why pre-AI data is so precious. It's also why so many AI-"enabled" devices have come to market recently. It's actually a way to gather large amounts of human-generated data without having to worry about whether any of it is contaminated by AI data. If your art site clearly labels all AI artwork,

    • by Rei ( 128717 )

      If an LLM ingests data generated by another LLM, it quickly becomes unstable and starts hallucinating.

      Tell me you've never trained an AI model without telling me you've never trained an AI model.

      There are literally entire "Midjourney Style" models for Stable Diffusion tuned purely with Midjourney images.

      Whether something was made by an AI or not isn't what matters. Whether it's considered aesthetically desirable by humans is what matters. Garbage human content and garbage AI content both suck. Good huma

  • Social media sites are all scams in the first place, unless anyone really believes that narcissism driven profiteering is a net positive to society.
