Christopher Potts
7 months
Despite having almost no money, academics have still developed (just since 2020) diffusion models, FlashAttention, prefix tuning, DPO, essentially every neural IR model, many of the methods for long contexts, and the majority of the important benchmarks, among many other things.