AI Microsoft Technology

Microsoft Improves Its AI Translations With Z-Code (techcrunch.com) 14

Microsoft has announced an update to its translation services that, thanks to new machine learning techniques, promises significantly improved translations between a large number of language pairs. TechCrunch reports: Based on its Project Z-Code, which uses a "sparse Mixture of Experts" approach, these new models now often score between 3% and 15% better than the company's previous models during blind evaluations. Z-Code is part of Microsoft's wider XYZ-Code initiative that looks at combining models for text, vision and audio across multiple languages to create more powerful and helpful AI systems.

"Mixture of Experts" isn't a completely new technique, but it's especially useful in the context of translation. At its core, the system basically breaks down tasks into multiple subtasks and then delegates them to smaller, more specialized models called "experts." The model then decides which task to delegate to which expert, based on its own predictions. Greatly simplified, you can think of it as a model that includes multiple more specialized models.



Comments Filter:
  • by Anonymous Coward
    I'd wager Lord Dimwit Flathead the Excessive makes better AI than Microsoft does.
  • Can someone translate this to English?
    • Instead of running everything through a single neural network for translation, they are training intermediate neural nets to recognize specific language and send it to uniquely trained translation neural nets.

      If you try to train a single neural network to translate both "Thou art lovely as a flower." and "Damn, you a dime!", it's going to struggle, because the grammar and vocabulary are so far from interchangeable as to be nearly different languages. Embracing that difference means they can train two separate neural nets.

    • They're abandoning C# and rewriting everything in Inform.
    • I think the Russians have hacked Microsoft, as it's called 'Z' code.

  • Are they working on their translation services for when people leave messages on your phone? And yes, I'm calling it translation because, based on the Engrish words and phrases I'm getting in the emails, they have work to do getting a voicemail transcribed correctly.

  • It's bad becuz it's Micro$oft!!! LOL
  • After several hours of real-world use, everything will be getting funneled to the Nazi conspiracy-mongering AI with the foul mouth.

  • Until people give up on trying to make a single translation out of individual sentences, AI translation is going to continue to be junk. You need to be able to deal with ambiguity in the source language.

  • Mixture of experts (MoE) refers to a machine learning technique in which multiple expert networks (learners) divide a problem space into homogeneous regions. It differs from ensemble techniques in that typically only one or a few of the expert models are run, rather than combining results from all models (a toy sketch contrasting the two appears after the comments).

    An example from computer vision is combining one neural network model for human detection with another for pose estimation.

    I think this may be closer to how our brains work.

  • "At its core, the system basically breaks down tasks into multiple subtasks and then delegates them to smaller, more specialized models called "experts." The model then decides which task to delegate to which expert, based on its own predictions. "

    It's called a bureaucracy.

  • by Traf-O-Data-Hater ( 858971 ) on Wednesday March 23, 2022 @04:47PM (#62384069)
    Cool! When can we expect to see some new Infocom Adventures then?
  • Because it's the sign painted on the Russian tanks in Ukraine now?
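
To tie the two technical comments above together (routing different registers of language to separately trained nets, and running only one or a few experts rather than a full ensemble), here is a toy sketch in NumPy. The formal_expert and slang_expert functions and the gating rule are invented for illustration; they stand in for the separately trained translation nets the comments describe, not for any real system.

    # Contrast: an ensemble runs every model and averages; a sparse MoE gate
    # picks one expert per input and runs only that one.
    import numpy as np

    rng = np.random.default_rng(1)

    def formal_expert(x):   # toy stand-in for a "formal English" model
        return x + 1.0

    def slang_expert(x):    # toy stand-in for a "slang English" model
        return x - 1.0

    EXPERTS = [formal_expert, slang_expert]
    gate_w = rng.standard_normal((8, len(EXPERTS)))

    def ensemble(x):
        """Ensemble: every model runs, results are averaged."""
        return np.mean([f(x) for f in EXPERTS], axis=0)

    def sparse_moe(x):
        """Sparse MoE: the gate picks one expert per input; only that expert runs."""
        chosen = int(np.argmax(x @ gate_w))
        return EXPERTS[chosen](x)

    x = rng.standard_normal(8)
    print(ensemble(x)[:3], sparse_moe(x)[:3])

An ensemble pays for every model on every input; the sparse gate pays only for the expert it selects, which is what makes it practical to scale up the number of experts.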

"An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup." - H.L. Mencken
