AI Researcher Who Helped Write Landmark Paper Is Leaving Google (bloomberg.com)

An artificial intelligence researcher who co-authored one of Google's most influential papers in the field is leaving the company to launch a startup. From a report: Llion Jones, who helped write the pioneering AI paper "Attention Is All You Need," confirmed to Bloomberg that he will depart Google Japan later this month. He said he plans to start a company after taking time off. "It was not an easy decision leaving Google, it's been a fantastic decade with them but it's time to try something different," Jones wrote in a message to Bloomberg. "It also feels like good timing to build something new given the momentum and progress in AI."
Comments:
  • I am glad to see that a researcher who helped push the technology forward will be able to cash in on his hard work, which everyone else is already doing.

  • by danielcolchete ( 1088383 ) on Tuesday July 11, 2023 @05:05PM (#63678133)

    The "T" in "GPT" means "Transformer", which is the thing that that paper created.

    Before that paper it wasn't possible to parallelize large-model training across a large cluster, which made large models impractical.

    Most of the generative AI and large language models making strides out there today came from the parallelization that Transformers enabled. It really changed the world in a significant way.

    The funny thing is that they were only trying to improve language translation models. Well, at least according to the paper.
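
    For anyone curious, here's a toy NumPy sketch of the scaled dot-product attention at the heart of the Transformer. It's illustrative only (not the paper's reference code); the point is that every token is handled in one batched matrix multiply, with no sequential loop over positions the way an RNN has:

    import numpy as np

    def softmax(x, axis=-1):
        # numerically stable softmax
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        # Q, K, V: (seq_len, d_k) -- queries, keys, values for one head
        scores = Q @ K.T / np.sqrt(K.shape[-1])  # all-pairs similarity, one matmul
        return softmax(scores) @ V               # weighted sum of values

    rng = np.random.default_rng(0)
    x = rng.standard_normal((8, 16))    # toy "sequence": 8 tokens, d=16
    print(attention(x, x, x).shape)     # (8, 16) -- whole sequence at once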

    • by narcc ( 412956 )

      Before that paper it wasn't possible to parallelize large model training across a large cluster

      That's not accurate. Training CNNs, for example, can be parallelized in a number of different ways that predate the Attention paper.

      The Attention paper was important, but not for that reason.

    • Ehh... not quite. There have always been ways to parallelise that (it's kind of inherent to the field). What the Attention paper innovated was the idea of attention heads, which kind of do what it says on the package. Attention was indeed a pivotal paper, really the one that kickstarted the new era of AI, perhaps even as important as Hinton's work, but I don't think the parallelization was the key innovation there.
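
      To make "attention heads" concrete, here's a toy sketch (names and shapes are illustrative, not the paper's reference code). Multi-head attention just runs the same scaled dot-product attention several times on separate learned projections and concatenates the results, so each head can attend to something different:

      import numpy as np

      def softmax(x, axis=-1):
          e = np.exp(x - x.max(axis=axis, keepdims=True))
          return e / e.sum(axis=axis, keepdims=True)

      def multi_head_attention(x, n_heads, rng):
          seq_len, d_model = x.shape
          d_head = d_model // n_heads
          heads = []
          for _ in range(n_heads):
              # each head gets its own Q/K/V projections (random here for the
              # demo; in a real model these are learned weights)
              Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) for _ in range(3))
              Q, K, V = x @ Wq, x @ Wk, x @ Wv
              scores = Q @ K.T / np.sqrt(d_head)
              heads.append(softmax(scores) @ V)   # (seq_len, d_head)
          return np.concatenate(heads, axis=-1)   # (seq_len, d_model)

      rng = np.random.default_rng(0)
      out = multi_head_attention(rng.standard_normal((8, 16)), n_heads=4, rng=rng)
      print(out.shape)  # (8, 16)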

  • ...To better opportunities for the bloke. If he could solve the problem of AI paying attention to anything at all, who knows what he'll do next?
