RSVP on LinkedIn here.
Generative AI, specifically large language models, has taken the world by storm. But these models wouldn't exist today without a key paper published by Google Brain: "Attention Is All You Need." The 10-page paper, published in 2017, introduced the Transformer - a critical innovation in neural networks.
What's in store?
- A deep dive and interactive discussion of the paper. We recommend reading (or at least skimming) it first: https://arxiv.org/pdf/1706.03762.pdf
- A look at what came before the paper (neural networks, embeddings), and what came after (BERT, GPT-2, and ChatGPT).
- All software engineers welcome - no AI/ML background necessary!
Did you know?
- The paper's authors have gone on to found multiple billion-dollar AI companies, including Cohere and Character.ai.
- Since 2017, the paper has been cited over 86,000 times.
- The Transformer architecture is the T in "GPT".
Charlie Guo is a seasoned founder and author with over 15 years of software experience and more than a decade in Silicon Valley. He is an alum of both Stanford and Y Combinator, and currently publishes Artificial Ignorance, an AI-focused newsletter for engineers and founders.
- Charlie will give a brief background, then explain the key insights and context from the paper.
- We will open the group to some Q&A and a broader discussion.
- And we'll end with some discussion on which paper to read and review next.
The next paper reading group, on October 15, will cover deep reinforcement learning and will be led by Sai Bhavanasi.