Noam Shazeer

Noam Shazeer is an American computer scientist and entrepreneur known for his contributions to the field of artificial intelligence and deep learning, particularly in the development of transformer models and natural language processing.

Career

In 2017, Shazeer was one of the lead authors of the seminal paper "Attention Is All You Need",[1] which introduced the transformer architecture.

In 2021, Shazeer co-founded Character.AI with Daniel De Freitas.[2]

In August 2024, it was reported that Shazeer would return to Google to co-lead the Gemini AI project. He would serve as a technical lead on Gemini alongside co-leads Jeff Dean and Oriol Vinyals, the company said in a memo to staff.[3]

References

  1. ^ Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan N.; Kaiser, Łukasz; Polosukhin, Illia (2017). "Attention Is All You Need". Advances in Neural Information Processing Systems. 30. arXiv:1706.03762.
  2. ^ "Google takes another startup out of the AI race". The Verge. 2024-08-02.{{cite web}}: CS1 maint: url-status (link)
  3. ^ "Noam Shazeer returns to Google to co-lead Gemini AI project". ctech. 2024-08-27. Archived from the original on 2024-08-29. Retrieved 2024-08-31.