
Google, Meta, and Others Will Have To Explain Their Algorithms Under New EU Legislation (theverge.com)

An anonymous reader quotes a report from The Verge: The EU has agreed on another ambitious piece of legislation to police the online world. Early Saturday morning, after hours of negotiations, the bloc agreed on the broad terms of the Digital Services Act, or DSA, which will force tech companies to take greater responsibility for content that appears on their platforms. New obligations include removing illegal content and goods more quickly, explaining to users and researchers how their algorithms work, and taking stricter action on the spread of misinformation. Companies face fines of up to 6 percent of their annual turnover for noncompliance.

"The DSA will upgrade the ground-rules for all online services in the EU," said European Commission President Ursula von der Leyen in a statement. "It gives practical effect to the principle that what is illegal offline, should be illegal online. The greater the size, the greater the responsibilities of online platforms." [...] Although the legislation only applies to EU citizens, the effect of these laws will certainly be felt in other parts of the world, too. Global tech companies may decide it is more cost-effective to implement a single strategy to police content and take the EU's comparatively stringent regulations as their benchmark. Lawmakers in the US keen to rein in Big Tech with their own regulations have already begun looking to the EU's rules for inspiration.

The final text of the DSA has yet to be released, but the European Parliament and European Commission have detailed a number of obligations it will contain [...]. Although the broad terms of the DSA have now been agreed upon by the member states of the EU, the legal language still needs to be finalized and the act officially voted into law. This last step is seen as a formality at this point, though. The rules will apply to all companies 15 months after the act is voted into law, or from January 1st, 2024, whichever is later.
"Large online platforms like Facebook will have to make the working of their recommender algorithms (used for sorting content on the News Feed or suggesting TV shows on Netflix) transparent to users," notes The Verge. "Users should also be offered a recommender system 'not based on profiling.' In the case of Instagram, for example, this would mean a chronological feed (as it introduced recently)."

The tech giants will also be prohibited from using "dark patterns" -- confusing or deceptive UIs designed to steer users into making certain choices. A detailed list of obligations contained in the DSA can be found in the article.
  • by Sebby ( 238625 ) on Monday April 25, 2022 @06:22PM (#62479176)

    "Metastasize"

    Other acceptable name: "Privacy Rapist"

  • Are they going to dump the entire state of their neural networks on a million-page printout, hand it to the EU and say, "Here, now you go figure it out"? Conventional algorithms, maybe; but deep-learning models and their internal state are, for all practical purposes, unexplainable in any reasonable amount of time.

    • Re: (Score:3, Insightful)

      by splutty ( 43475 )

      If you can't explain what your algorithm does, you probably should not be using it.

      • by djinn6 ( 1868030 )

        It's easy to explain what it does. It identifies the thing you are most likely to click on and puts it first, followed by the thing you're second most likely to click on, and so on.

        How does it do that? Well, that's worthy of several PhD theses.
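The "what it does" half of that really is simple enough to sketch. Here is a minimal, purely illustrative Python version, where the hypothetical `predict_click_probability` function stands in for the several-PhD-theses part (a real platform would use a large learned model, not a lookup table):

```python
# Minimal sketch of "rank by predicted click probability".
# predict_click_probability is a stand-in for whatever model the platform uses.

def predict_click_probability(user, item):
    # Hypothetical model: here just a precomputed score table per user.
    return user["scores"].get(item, 0.0)

def rank_feed(user, items):
    # Sort candidate items so the most-likely click comes first.
    return sorted(items, key=lambda item: predict_click_probability(user, item),
                  reverse=True)

user = {"scores": {"cat video": 0.9, "news story": 0.4, "ad": 0.1}}
print(rank_feed(user, ["news story", "ad", "cat video"]))
# -> ['cat video', 'news story', 'ad']
```

The sort itself is trivially explainable; all of the hard-to-explain behaviour lives inside the scoring function.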

      • "It's magic!" (from the '70s...)
      • Large components of these algorithms were not human-created. And even if they were 100% human-created, no one person could understand all aspects, given the level of complexity.

      • by AmiMoJo ( 196126 )

        There is actually a lot more here than just having to explain how algorithms work. Loads of good stuff, including...

        - No targeted ads based on gender, race, sexual orientation etc. No targeting at all for kids.

        - Dark patterns are banned, no more tricking the user into subscribing etc.

        - Take-downs and content removal must be explained to the user, and they must have an opportunity to appeal. This is actually incredibly powerful because the explanation and appeal process can form the basis of legal action, so

    • You might not be able to explain why it made a particular decision, but you can explain what decisions it is trying to make and why. For instance, if you have a neural network that is trying to identify images of horses, you could tell people how you initially teach it what is and isn't a horse, and how it continues to improve its horse identification once fielded. You probably couldn't tell people why it saw one particularly tall dog as a horse this one time.
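The teach-then-field loop described above can be sketched with a deliberately tiny stand-in for the real thing: a nearest-centroid "horse detector" over made-up (height, leg length) features. Everything here — the features, the numbers, the centroid method — is illustrative, not how any real image classifier works:

```python
# Toy sketch of the train-then-field loop: average labeled examples into
# centroids, then classify new points by which centroid is closer.

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def train(horses, not_horses):
    # "Teach it what is and isn't a horse" by averaging labeled examples.
    return {"horse": centroid(horses), "not_horse": centroid(not_horses)}

def classify(model, point):
    # At inference time we can only say which centroid is closer,
    # not *why* in any human-meaningful sense.
    def dist(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(model, key=lambda label: dist(model[label], point))

model = train(horses=[(1.6, 1.0), (1.7, 1.1)],
              not_horses=[(0.4, 0.2), (0.5, 0.3)])
print(classify(model, (1.5, 0.9)))
# -> horse  (a particularly tall dog lands nearer the horse centroid)
```

You can fully document `train` and `classify` and still have no sentence-sized answer to "why did *this* dog come out as a horse?" beyond "it was closer in feature space."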
    • Are they going to dump the entire state of their neural networks on a million-page printout?

      Do you realise how ridiculous your argument is? The algorithm stands for how neural nets are trained (what the parameters are, what the training/decay method is, how many layers, how many iterations or how much time is spent on training, and so on...): do you know anything about how neural networks work?

      • I do know how neural networks work, yes. The comment above was sarcastic because:
        a) I can easily imagine how the people who made this law may not be satisfied with a "we don't know exactly how it works and how it discriminates or not against groups X, Y, Z"
        b) There were cases in history when big companies tried to comply with disclosure laws by printing thousands of pages in order to paralyse the system.

      • by ranton ( 36917 )

        You are describing how a neural net is trained, not how it is used to rank / sort / suggest content. Crafting this legislation will be a challenge, but it will be a complete fail if it allows the description of the algorithms to be limited to describing how the algorithms were built.

      • The algorithm stands for how neural nets are trained (what's the parameters, what is the train/decay method/algorithm, how many layers, how many iterations or time spent on training, and so on...)

        That's a meta-algorithm, though. Nobody is interested in that -- legally, the important thing is how decisions regarding customers are made, not how a whole family of algorithms can be automatically trained on different data.

    • Another subthread addresses this technically. My question is whether a legal requirement to make an algorithm transparent to users (phrasing of the linked Verge article) would allow a company to use a learning algorithm like that.

      In short, can a requirement to have a transparent algorithm be met while still being unable to explain what the algorithm was doing and why when a particular challenge is brought?

      Not the training, the end results of Bob getting a series of "You should work here!" ads for pre
    • by fermion ( 181285 )
      It is very likely that they do not know how the algorithm works. These things tend to be developed incrementally and aren't well documented. It is like MS Office formats of the 1990s: it was often hard to convert between formats accurately. MS's official documentation was often bug-ridden. On the other hand, OpenOffice.org, which reverse-engineered the formats, always produced very accurate imports of all formats.
    • You're very optimistic. I would hate to have to explain something as simple as Bubble Sort to a politician, or worse a bureaucrat with an agenda and graft in their eyes.
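For reference, Bubble Sort really is about as simple as named algorithms get — repeatedly swap adjacent out-of-order elements until a full pass makes no swaps — which is what makes the comment's point sting:

```python
# Bubble Sort: each pass "bubbles" the largest remaining element to the end.

def bubble_sort(items):
    items = list(items)  # work on a copy
    n = len(items)
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        n -= 1  # the largest remaining element is now in place
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # -> [1, 2, 4, 5, 8]
```

If even this takes real effort to convey to a non-programmer, a production recommender system is in another universe entirely.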

      • Maybe it will be an eye-opening moment for them, making them realise that they have trespassed into an area they have no clue about and no ability to ever understand. They'll be sending their most hated peers to these meetings.

    • Well, even if they can explain it, I seriously doubt the other end can even understand it. Politicians are not really known for their vast technical expertise... (eye roll implied)
  • Large online platforms like Facebook will have to make the working of their recommender algorithms (used for sorting content on the News Feed or suggesting TV shows on Netflix) transparent to users, ...

    Will other media platforms, like newspapers and television news (and "News") channels, etc., have to explain their reasoning too? It seems like they should, in the spirit of this legislation. I'd be interested in the transparent rationale used for airing a segment / story.

    • Will other media platforms, like newspapers and television news (and "News") channels, etc, have to explain their reasoning too?

      It already occurs in a big part of the world (most of the EU, if not all of it, btw), friend: it's called "broadcast regulations" or something like that...

    • Will other media platforms, like newspapers and television news (and "News") channels, etc, have to explain their reasoning too?

      The "algorithm" used by newspapers etc. is already clear. They have an editorial line and voice their opinion in their columns, or you read or watch for a minute and make up your own mind.

      Social platforms claim to be just a neutral venue for people to communicate, but behind the scenes they inject suggestions that insidiously manipulate you. They may not deliberately promote left or right propaganda, but it affects you nevertheless, and you don't know why, how, or to what extent what you read is different than your neig

  • It's great nobody had done anything about this for decades. Suddenly Facebook is the problem.
  • Does this overturn all the trade secret protections that companies have?

    Perhaps it's like an auto-approved patent. You tell them how it works and your competitors can't use it.

    I'm sure the intellectual property lawyers are perusing the 2023 yacht brochures with glee.

  • A big part of this will be to get the businesses on the record about what the algorithms are and aren't responsible for. We have already had a lot of illegal discrimination incidents waved off as being the computer's fault, so this move should limit that.
    • Yes, I see that as a good thing.

      But I suspect those guys are sitting on such a large swath of data, metadata, and algorithm-processed metametadata, and algorithms processing the results of processes of other algorithms processing things that were processed before by algorithms (etc.), that those companies have actually no clue what they are doing, how they are getting the insights they offer, whether their models are still simply derived from reality, or whether they are actually shaping reality itself, lik

  • I bet that after an event such as an election in which someone uses adversarial attacks on the algorithms to affect the outcome, another law will be introduced saying the algorithms must be kept secret so as not to aid malicious actors in spreading disinformation and/or manipulating public opinion.

    Now imagine exposing the algorithm being required by law in one country and banned in another.
  • ... a 44-billion-dollar white elephant?

  • The core objective should be that huge or specialist businesses in certain countries are able to win market share against Amazon and the like. But that is never going to happen. Without quantitative numbers (like DB2's EXPLAIN), no advertiser will be able to predict how much money they need to spend to match or beat, say, Amazon. It is not so much the algorithm, but how much you pay for climbing into first place. And if they did, then the cheaper postage and return deals make market share gains - just as
  • Explaining how software works to someone who doesn't know how to program and has no particular interest in actually knowing how for themselves can often be like teaching fish to ride bicycles.

  • by Virtucon ( 127420 ) on Tuesday April 26, 2022 @09:30AM (#62480826)

    Europe already has laws blocking speech deemed "harmful", the DSA attempts to make anything illegal offline, illegal online. What this boils down to is an open assault on your ability to speak out. It will mean more censorship, not less. At a time when free speech is more necessary than ever, we find the democracies of the world taking the Chinese model of censoring what they don't like. That's very strange.

    I hope this garbage legislation gets flushed quickly because if Europe forces tech to censor everything it's just a matter of time before it comes to the US.

    • by whitroth ( 9367 )

      Let's see - so you think if something is illegal in the Real World, it should be allowed online?

      So, another wrong-wing pedophile, are you, accusing anyone to the left of Attila the Hun of sharing your hidden predilection?

  • I want fecesbook to dump the entire algorithm behind their "community standards," since the only thing users get to know is something vaguer than any corporate "mission statement" (which itself is mostly a euphemism for ROI! ROI! ROI!)
