Attention Is All You Need

From Wikipedia, the free encyclopedia
[Figure: an illustration of the main components of the transformer model, from the paper]

"Attention Is All You Need"[1] is a 2017 landmark[2][3] research paper in machine learning authored by eight scientists working at Google. The paper introduced a new deep learning architecture known as the transformer, based on the attention mechanism proposed in 2014 by Bahdanau et al. It is considered a foundational paper in modern artificial intelligence, as the transformer approach has become the main architecture of large language models like those based on GPT.[4][5] At the time, the focus of the research was on improving Seq2seq techniques for machine translation, but the authors go further in the paper, foreseeing the technique's potential for other tasks like question answering and multimodal Generative AI.[1]

The paper's title is a reference to the song "All You Need Is Love" by the Beatles.[6]

An early design document was titled "Transformers: Iterative Self-Attention and Processing for Various Tasks", and included an illustration of six characters from the Transformers animated show. The team was named Team Transformer.[6]

As of 2024, the paper has been cited more than 100,000 times.[7]

For their 100M-parameter Transformer model, the authors suggested linearly scaling the learning rate up from 0 to its maximum value over the first part of training (about 2% of the total number of training steps), and using dropout, to stabilize training.
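The schedule above can be sketched with the formula given in the paper, lrate = d_model⁻⁰·⁵ · min(step⁻⁰·⁵, step · warmup_steps⁻¹·⁵): the rate rises linearly over the warmup steps, then decays with the inverse square root of the step number. (The function name is illustrative; the defaults d_model=512, warmup_steps=4000 are the paper's base-model values.)

```python
def transformer_lr(step, d_model=512, warmup_steps=4000):
    """Learning-rate schedule from the paper:
    lrate = d_model^-0.5 * min(step^-0.5, step * warmup_steps^-1.5).
    Linear warmup from ~0 to the peak at step == warmup_steps,
    then inverse-square-root decay. step must be >= 1."""
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
```

The peak rate occurs exactly at step == warmup_steps, where the two branches of the min() are equal.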

Authors

The authors of the paper are: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, and Illia Polosukhin. All eight authors were "equal contributors" to the paper; the listed order was randomized. A Wired article highlighted the group's diversity:[6]

Six of the eight authors were born outside the United States; the other two are children of two green-card-carrying Germans who were temporarily in California and a first-generation American whose family had fled persecution, respectively.

By 2023, all eight authors had left Google; seven founded their own AI start-ups, while Łukasz Kaiser joined OpenAI.[6][7]

References

  1. ^ a b Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan N; Kaiser, Łukasz; Polosukhin, Illia (2017). "Attention is All you Need" (PDF). Advances in Neural Information Processing Systems. 30. Curran Associates, Inc.
  2. ^ Love, Julia (10 July 2023). "AI Researcher Who Helped Write Landmark Paper Is Leaving Google". Bloomberg News. Retrieved 1 April 2024.
  3. ^ Goldman, Sharon (20 March 2024). "'Attention is All You Need' creators look beyond Transformers for AI at Nvidia GTC: 'The world needs something better'". VentureBeat. Retrieved 1 April 2024.
  4. ^ Toews, Rob (3 September 2023). "Transformers Revolutionized AI. What Will Replace Them?". Forbes. Archived from the original on 26 September 2023. Retrieved 3 December 2023.
  5. ^ Murgia, Madhumita (23 July 2023). "Transformers: the Google scientists who pioneered an AI revolution". Financial Times. Archived from the original on 28 December 2023. Retrieved 22 March 2024.
  6. ^ a b c d Levy, Steven. "8 Google Employees Invented Modern AI. Here's the Inside Story". Wired. ISSN 1059-1028. Retrieved 20 March 2024.
  7. ^ a b "Meet the $4 Billion AI Superstars That Google Lost". Bloomberg. 13 July 2023.