Neural Search Talks — Zeta Alpha
Zeta Alpha
21 episodes
6 days ago
A monthly podcast where we discuss recent research and developments in the world of Neural Search, LLMs, RAG and Natural Language Processing with our co-hosts Jakub Zavrel (AI veteran and founder at Zeta Alpha) and Dinos Papakostas (AI Researcher at Zeta Alpha).
Technology
Transformer Memory as a Differentiable Search Index: memorizing thousands of random doc ids works!?
1 hour 1 minute 40 seconds
3 years ago

Andrew Yates and Sergi Castella discuss the paper "Transformer Memory as a Differentiable Search Index" by Yi Tay et al. at Google. This work proposes a new approach to document retrieval in which document ids are memorized by a transformer during training (the "indexing" phase); at retrieval time, a query is fed to the model, which autoregressively generates relevant doc ids for that query.

Paper: https://arxiv.org/abs/2202.06991
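
For a sense of the mechanics discussed in the episode, here is a minimal sketch of the DSI idea (ours, not the authors' code), assuming the HuggingFace transformers library: a seq2seq model is fine-tuned to map document text to a doc-id string, and retrieval is beam-search generation of doc ids from the query. The model name, document text, and doc id below are illustrative placeholders.

```python
# Minimal DSI sketch: "indexing" = training the model to emit a doc id
# given document text; retrieval = generating doc ids from a query.
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Indexing: one training step that memorizes (document -> doc id).
doc_text, doc_id = "neural retrieval with transformers ...", "4821"
inputs = tokenizer(doc_text, return_tensors="pt", truncation=True)
labels = tokenizer(doc_id, return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss
loss.backward()  # a real setup loops over the whole corpus with an optimizer

# Retrieval: autoregressively decode a ranked list of doc ids for a query.
query = tokenizer("how do transformers retrieve documents?",
                  return_tensors="pt")
with torch.no_grad():
    out = model.generate(**query, num_beams=10, num_return_sequences=10,
                         max_new_tokens=8)
ranked_doc_ids = [tokenizer.decode(i, skip_special_tokens=True) for i in out]
```

The episode's discussion of atomic vs. string vs. semantically structured doc ids maps onto how `doc_id` is tokenized and decoded in a sketch like this.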

Timestamps:

00:00 Intro: Transformer Memory as a Differentiable Search Index (DSI)
01:15 The gist of the paper, motivation
04:20 Related work: Autoregressive Entity Linking
07:38 What is an index? Conventional vs. "differentiable"
10:20 Indexing and retrieval definitions in the context of the DSI
12:40 Learning representations for documents
17:20 How to represent document ids: atomic, string, semantically relevant
22:00 Zero-shot vs. finetuned settings
24:10 Datasets and baselines
27:08 Finetuned results
36:40 Zero-shot results
43:50 Ablation results
47:15 Where could this model be used?
52:00 Is memory efficiency a fundamental problem of this approach?
55:14 What about semantically relevant doc ids?
60:30 Closing remarks


Contact: castella@zeta-alpha.com
