Japanese Court Ruling Poised To Make Big Tech Open Up on Algorithms (ft.com)

Japanese legal experts have said an antitrust case related to a local restaurant website could change how large internet platforms such as Google, Facebook and Amazon operate in the country, forcing them to reveal the inner workings of their secret algorithms. From a report: Last month, a Tokyo court ruled in favour of Hanryumura, a Korean-style BBQ restaurant chain operator, in an antitrust case brought against Kakaku.com, operator of Tabelog, Japan's largest restaurant review platform. Hanryumura successfully argued that Kakaku.com had altered the way user scores were tallied in ways that hurt sales at its restaurant outlets. While Kakaku.com has been ordered to pay Hanryumura $284,000 in damages for "abuse of superior bargaining position," the internet company has appealed against the decision.

Japanese legal experts said the outcome may have far-reaching implications, as the court requested that Kakaku.com disclose part of its algorithms. While the restaurant group is constrained from publicly revealing what information was shown to it, the court's request set a rare precedent. Big Tech groups have long argued that their algorithms should be considered trade secrets in all circumstances. Courts and regulators across the world have begun to challenge that position, with many businesses complaining about the negative impact caused by even small changes to search and recommendation services.

  • by Anonymous Coward on Monday July 04, 2022 @11:56PM (#62673962)

    Pay us and we will improve your rating.
    Tabelog is applying all the same dirty tactics as Yelp.

    • by jrumney ( 197329 ) on Tuesday July 05, 2022 @02:04AM (#62674050)

      That isn't really what the case is about though - if it were that simple the chain would just pay them.

      Tabelog changed their algorithm some time back, in response to complaints from independent restaurant owners that it was favoring chain restaurants. The chain owner says that 55% of their restaurants' ratings dropped and none went up as a result of this change, and argues that it disadvantages them compared to independent restaurants. As a remedy they want to see how the algorithm works, so they can invest effort that independent restaurants cannot afford into gaming the ratings system, as they used to.

      • What they want and what they are getting in the order are pretty different.

        "The Algorithm" sounds nice and might satisfy a judge and PR, but do they really want a lawyer-eyes-only view of a few million lines of source code, machine learning formula, and the like, all under seal as confidential proprietary information from discovery? Plus a few thousand database lines where the restaurant shows up as machine learning weights and data statistics?

        What they want is the conclusion of the complex machine learni

        • by Klaxton ( 609696 )

          What makes you think it takes "a few million lines of source code" to tally up a review score? Or any machine learning at all?

    • What I don't understand is all the hidden reviews on Yelp. What makes them move a review to the hidden area?
  • by Barny ( 103770 ) on Tuesday July 05, 2022 @12:58AM (#62673994) Journal

    When the algorithm is just "we insert all the inputs into this AI we trained and it spits out the result", that law isn't going to be all that useful. Oh, I can see the non-technical types then pointing at the AI and saying "Well, show us what's in that!" But that will be even worse. They will just get reams and reams of neuron weights and nothing of actual substance.

    All this will do is push more companies to rely on an AI system, since it will be the ultimate defense to this kind of law.

    • Re:Yeah, but— (Score:5, Interesting)

      by iserlohn ( 49556 ) on Tuesday July 05, 2022 @01:21AM (#62674012) Homepage

      >Oh, I can see the non-technical types then pointing at the AI and saying "Well, show us what's in that!" But that will be even worse. They will just get reams and reams of neuron weights and nothing of actual substance.

      The way this will develop is that claimants and their lawyers will then ask for the training data set, together with everything else that makes up the model. Explainability (or the lack thereof) isn't a way out of legal requirements; if anything, it makes compliance more costly.

      • by AmiMoJo ( 196126 )

        I don't think they use AI anyway. It sounds like they made some change that rebalanced the way chain restaurants are rated compared to independents, which implies an algorithm that takes an IsChain flag as an input.

        It's probably something like chains being down-ranked more for negative reviews, because chains sell the same food everywhere and part of the attraction is you know what you are going to get. With independents there is a chance that an individual simply won't like that restaurant's food, not beca
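
        If the parent's guess is right, the mechanism needn't be anything exotic. Here is a minimal Python sketch of that kind of rule; the is_chain flag, the 3-star cut-off and the penalty weight are all made up for illustration, not anything Tabelog has disclosed:

        # Purely hypothetical -- not Tabelog's actual algorithm or weights.
        # Idea: the same reviews produce a lower aggregate score when an
        # is_chain flag makes negative reviews count more heavily.
        def aggregate_score(star_ratings, is_chain):
            NEGATIVE_CUTOFF = 3   # assumed: below 3 stars counts as a negative review
            CHAIN_PENALTY = 2.0   # assumed: negative reviews weigh double for chains
            total = 0.0
            weight_sum = 0.0
            for stars in star_ratings:
                weight = CHAIN_PENALTY if (is_chain and stars < NEGATIVE_CUTOFF) else 1.0
                total += stars * weight
                weight_sum += weight
            return round(total / weight_sum, 2) if weight_sum else 0.0

        reviews = [5, 4, 4, 2, 1]                        # identical set of reviews
        print(aggregate_score(reviews, is_chain=False))  # 3.2
        print(aggregate_score(reviews, is_chain=True))   # 2.71 -- same reviews, lower score

        A change like that would look much like what the chain describes: scores for chain outlets drop across the board while independents are untouched, with nothing in the public-facing score to show why.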

    • So the thing to ask then is 'How do you train your AI?'

    • All this will do is push more companies to rely on an AI system, since it will be the ultimate defense to this kind of law.

      This is true to an extent. But now restaurants will be able to ask "Why did my rating improve so much now that I've started paying your company $5,000 a month?" Before, they could just say "I'm sorry, but this is a trade secret."

      Now, they won't be able to make such an argument (at least, that's what I'm hoping for).

    • by Anonymous Coward

      Your defence might work in a retarded jurisdiction, but in most, handing over a load of "gobbledegook" is going to get you a Contempt of Court ruling. You won't ever have to hand over source code, or neuron weights - you'll have to hand over a description - something a human can read. The fact you can't hand over a precise one also won't help you - again, it's either a proper description or Contempt of Court. In fairness, if it's going to take you thousands of hours to prepare the description, the judge may

  • They will just game it. Basically, any metric someone comes up with is only useful until everyone knows it, and then it gets gamed. The worst is managers who have no idea how your job works: they use things like how many hours you worked or how many lines of code you wrote. The all-time best, though, is how many bugs you fix in your own code.
  • ... that nobody actually knows how the algorithms work. They've all been tweaked and added to and changed and rewritten and unwritten and integrated back into themselves so many times over so many years that nobody has a clue how they function.

"I got everybody to pay up front...then I blew up their planet." "Now why didn't I think of that?" -- Post Bros. Comics

Working...