![Gargi Ghosh Profile](https://pbs.twimg.com/profile_images/1577110447480004609/lL6WQcrd_x96.jpg)
Gargi Ghosh (@gargighosh)
Researcher at FAIR (Meta AI)
Bellevue, WA · Joined December 2009
682 Followers · 69 Following · 57 Statuses
We released new research - Byte Latent Transformer (BLT). BLT encodes bytes into dynamic patches using lightweight local models and processes them with a large latent transformer. Think of it as a transformer sandwich!
New from Meta FAIR — Byte Latent Transformer: Patches Scale Better Than Tokens introduces BLT, which, for the first time, matches tokenization-based LLM performance at scale with significant improvements in inference efficiency and robustness. Paper ➡️
11
85
671
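To make the "transformer sandwich" image concrete, here is a minimal toy sketch of the shape of such a model: a lightweight local encoder turns raw bytes into patch representations, a large latent transformer processes the patches, and a lightweight local decoder maps back to byte-level predictions. This is an illustrative assumption only, not the released BLT implementation; in particular it uses a fixed patch size in place of BLT's dynamic patching, and all module names and sizes below are made up.

```python
# Toy sketch of the byte -> patch -> latent transformer -> byte pipeline
# described in the tweet. Fixed-size patching stands in for BLT's dynamic
# (entropy-based) patching; sizes and names are illustrative assumptions.
import torch
import torch.nn as nn


class ToyByteLatentLM(nn.Module):
    def __init__(self, d_local=128, d_latent=512, patch_size=4, vocab=256):
        super().__init__()
        self.patch_size = patch_size                      # fixed stand-in for dynamic patching
        self.byte_embed = nn.Embedding(vocab, d_local)    # raw bytes -> local embeddings
        self.local_encoder = nn.TransformerEncoder(       # lightweight local model (bottom slice)
            nn.TransformerEncoderLayer(d_local, nhead=4, batch_first=True), num_layers=1)
        self.to_latent = nn.Linear(d_local * patch_size, d_latent)
        self.latent_transformer = nn.TransformerEncoder(  # large latent transformer (the filling)
            nn.TransformerEncoderLayer(d_latent, nhead=8, batch_first=True), num_layers=6)
        self.from_latent = nn.Linear(d_latent, d_local * patch_size)
        self.local_decoder = nn.TransformerEncoder(       # lightweight local model (top slice)
            nn.TransformerEncoderLayer(d_local, nhead=4, batch_first=True), num_layers=1)
        self.to_bytes = nn.Linear(d_local, vocab)

    def forward(self, byte_ids):
        # byte_ids: (batch, seq_len) with seq_len divisible by patch_size
        b, t = byte_ids.shape
        h = self.local_encoder(self.byte_embed(byte_ids))       # (b, t, d_local)
        patches = h.reshape(b, t // self.patch_size, -1)        # group consecutive bytes into patches
        z = self.latent_transformer(self.to_latent(patches))    # heavy compute happens over patches
        h = self.from_latent(z).reshape(b, t, -1)               # unpack patches back to byte positions
        return self.to_bytes(self.local_decoder(h))             # per-byte logits


logits = ToyByteLatentLM()(torch.randint(0, 256, (2, 16)))      # -> torch.Size([2, 16, 256])
```

The point of the structure is that the expensive latent transformer runs over far fewer positions (patches) than there are bytes, while the cheap local models handle the byte-level detail at the input and output.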
RT @ClementDelangue: Our science team has started working on fully reproducing and open-sourcing R1 including training data, training scrip…
0
548
0
RT @ykilcher: 🔥New Video🔥 I delve (ha!) into Byte Latent Transformer: Patches Scale Better Than Tokens where the authors do away with token…
0
110
0
RT @nrehiew_: Wrote about some of my favourite papers over the past year or so and some research directions that I am excited about in 2025…
0
80
0
RT @AIatMeta: New research from Meta FAIR — Meta Memory Layers at Scale. This work takes memory layers beyond proof-of-concept, proving the…
0
187
0
Joint work with @mingdachen @LukeZettlemoyer @scottyih, Alicia Sun, Yang Li, Karthik Padthe, @RulinShao
0
0
3