LlamaCast
Shahriar Shariati
49 episodes
4 months ago
Daily podcast about newly published articles in the LLM field.
Technology, News, Tech News, Science, Mathematics
Breaking the Memory Barrier
LlamaCast
15 minutes
1 year ago
🧠 Breaking the Memory Barrier: Near Infinite Batch Size Scaling for Contrastive Loss

This research paper introduces Inf-CL, a novel approach to contrastive learning that dramatically reduces GPU memory usage during training, allowing near-infinite batch sizes. The authors address the memory that grows quadratically with batch size in traditional implementations, which materialize the full similarity matrix between every pair of samples in the batch, by using a tile-based computation strategy that partitions the contrastive loss calculation into smaller, sequentially computed blocks. To further improve efficiency, they propose a multi-level tiling strategy that combines ring-based communication across GPUs with fused kernels at the CUDA-core level, minimizing I/O overhead. Experiments show that Inf-CL significantly outperforms previous methods, reaching unprecedented batch sizes while maintaining accuracy and comparable training speed. This breakthrough opens new possibilities for large-scale contrastive learning, paving the way for advances in areas such as self-supervised learning and dense text retrieval.

📎 Link to paper
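To make the tiling idea concrete, here is a minimal single-GPU PyTorch sketch (not the authors' code): it computes the image-to-text InfoNCE loss without ever materializing the full batch-by-batch similarity matrix, streaming over column tiles with an online log-sum-exp. The function name, tile size, and temperature value are illustrative assumptions; the real Inf-CL implementation additionally fuses these steps into CUDA kernels and distributes the column tiles across GPUs with ring-based communication.

import torch
import torch.nn.functional as F

def tiled_info_nce(img: torch.Tensor, txt: torch.Tensor,
                   tau: float = 0.07, tile: int = 1024) -> torch.Tensor:
    """Image-to-text InfoNCE, computed tile by tile instead of as a full B x B matrix.

    Illustrative sketch of the tile-based strategy described above, not the paper's code.
    """
    b = img.shape[0]
    img = F.normalize(img, dim=-1)
    txt = F.normalize(txt, dim=-1)

    # Positive (diagonal) logits: similarity of each image with its own caption.
    pos = (img * txt).sum(-1) / tau

    # Running maximum and sum for a numerically stable streaming log-sum-exp.
    row_max = torch.full((b,), float("-inf"), device=img.device)
    row_sum = torch.zeros(b, device=img.device)

    for start in range(0, b, tile):
        # Only a (b, tile) block of logits exists at any time, never the full (b, b).
        block = img @ txt[start:start + tile].T / tau
        new_max = torch.maximum(row_max, block.max(dim=1).values)
        # Rescale the running sum to the new maximum, then fold in this tile.
        row_sum = row_sum * torch.exp(row_max - new_max) \
                  + torch.exp(block - new_max[:, None]).sum(dim=1)
        row_max = new_max

    lse = row_max + row_sum.log()      # log-sum-exp over all b columns per row
    return (lse - pos).mean()          # -log softmax(sim)[i, i], averaged over rows

Peak logit memory drops from O(b²) to O(b · tile). Note the limits of the sketch: CLIP-style training also adds the symmetric text-to-image term, and the paper's fused kernels handle the backward pass with the same tiling via recomputation, which this plain autograd version does not reproduce.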
