However brilliant, they are still "employees": Google crosses out the email addresses of the Transformer paper's authors

Hayo News
August 2nd, 2023

As is well known, the underlying frameworks of most of today's popular AI models can be traced back to the Transformer architecture, and the paper "Attention Is All You Need", published by a Google research team in 2017, can fairly be called the "origin of everything" in the AI world.

Recently, however, the paper quietly changed on arXiv. With all eight authors, often dubbed the "Transformer Eight", having now left Google, the email addresses listed for them at the head of the paper have been crossed out.

In addition, a red note has been added above the title: "Provided proper attribution is provided, Google hereby grants permission to reproduce the tables and figures in this paper solely for use in journalistic or scholarly works."
