![Kevin Swersky Profile](https://pbs.twimg.com/profile_images/1438131715973820419/sCdVAs6L_x96.jpg)
Kevin Swersky
@kswersk
Followers
8K
Following
2K
Statuses
390
Research Scientist at DeepMind.
Toronto, Ontario
Joined June 2015
RT @priyankjaini: Do generative video models learn physical principles from watching videos? Very excited to introduce the Physics-IQ bench…
@wgrathwohl @mo_norouzi @rob_fergus Sad to see you go Will, but it was a privilege to work with you. Your wild and brilliant ideas, boundless enthusiasm, and unique style made everything so much fun!
RT @awiltschko: Well, we actually did it. We digitized scent. A fresh summer plum was the first fruit and scent to be fully digitized and r…
RT @HanieSedghi: 🆕🔥We show that LLMs *can* plan if instructed well! 🔥Instructing the model using ICL leads to a significant boost in planni…
RT @priyankjaini: We have a student researcher opportunity in our team @GoogleDeepMind in Toronto 🍁 If you’re excited about research on di…
RT @PaulVicol: Check out @clark_kev’s and my paper on fine-tuning diffusion models on differentiable rewards! We present DRaFT, which compu…
I’m really excited about this project! Backpropagation and its variants are extremely effective at fine-tuning diffusion models on downstream rewards.
@PaulVicol and I are excited to introduce DRaFT, a method that fine-tunes diffusion models on rewards (such as scores from human preference models) by backpropagating through diffusion sampling! With @kswersk and @fleet_dj. arXiv: (1/5)
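The core idea can be illustrated with a toy sketch (a hypothetical one-parameter "sampler" with a manually derived chain rule, not the paper's implementation): a reward on the final sample is backpropagated through every step of a deterministic sampling chain, and the model parameter is updated by gradient ascent.

```python
# Toy sketch of reward fine-tuning via backprop through sampling.
# Everything here (the linear "denoiser", the quadratic reward) is a
# hypothetical stand-in chosen so the gradients are easy to follow.

T = 10  # number of sampling steps

def sample(theta, x_T):
    """Each step shrinks x by a learned fraction theta: x_next = (1 - theta) * x."""
    xs = [x_T]
    x = x_T
    for _ in range(T):
        x = (1.0 - theta) * x
        xs.append(x)
    return x, xs  # final sample x_0 and the full trajectory

def reward(x0, target=0.5):
    """Stand-in for a preference score: prefer samples near `target`."""
    return -(x0 - target) ** 2

def reward_grad(theta, x_T, target=0.5):
    """Manual backprop of d(reward)/d(theta) through the sampling chain."""
    x0, xs = sample(theta, x_T)
    g_x = -2.0 * (x0 - target)        # d reward / d x_0
    g_theta = 0.0
    for x_in in reversed(xs[:-1]):    # walk the chain backwards
        g_theta += g_x * (-x_in)      # each step: d x_out / d theta = -x_in
        g_x *= (1.0 - theta)          # and d x_out / d x_in = 1 - theta
    return g_theta

# "Fine-tune" theta by gradient ascent on the reward
theta, x_T, lr = 0.1, 4.0, 0.01
for _ in range(200):
    theta += lr * reward_grad(theta, x_T)
```

In a real diffusion model the denoiser is a large network and the gradient is obtained by automatic differentiation through the sampler rather than a hand-derived chain rule; the structure of the computation is the same.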
@mo_norouzi @hojonathanho @wchan212 @Chitwan_Saharia Congrats to you and the team! Can’t wait to see what you make!
This is a really natural framework to improve Bayesian optimization when you have access to related optimization tasks. Joint work with @ziwphd, @GeorgeEDahl, Chansoo Lee, @zacharynado, @jmgilmer, @latentjasper, @ZoubinGhahrama1
Hyper Bayesian optimization (HyperBO) is a highly customizable interface that pre-trains a Gaussian process model and automatically defines model parameters, making Bayesian optimization easier to use while outperforming traditional methods. Learn more →
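The pre-training step can be sketched in miniature (a simplified illustration, not the HyperBO code): fit a Gaussian process hyperparameter by maximizing the summed log marginal likelihood over several related tasks, here with synthetic tasks and a grid search over the lengthscale.

```python
import numpy as np

def rbf(X1, X2, lengthscale):
    """Squared-exponential kernel on 1D inputs."""
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def log_marginal_likelihood(X, y, lengthscale, noise=1e-3):
    """GP log marginal likelihood of observations y at inputs X."""
    K = rbf(X, X, lengthscale) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(X) * np.log(2 * np.pi))

# Synthetic "related tasks": draws from a GP with a shared true lengthscale
rng = np.random.default_rng(0)
true_ls, tasks = 0.3, []
for _ in range(5):
    X = rng.uniform(0, 1, 20)
    L = np.linalg.cholesky(rbf(X, X, true_ls) + 1e-4 * np.eye(20))
    tasks.append((X, L @ rng.standard_normal(20)))

# "Pre-training": pick the lengthscale that maximizes the summed
# log marginal likelihood across all tasks (grid search for simplicity)
grid = [0.05, 0.1, 0.3, 1.0]
scores = [sum(log_marginal_likelihood(X, y, ls) for X, y in tasks)
          for ls in grid]
best_ls = grid[int(np.argmax(scores))]
```

The pre-trained lengthscale would then be fixed when running Bayesian optimization on a new task from the same family, which is the sense in which pre-training makes the GP surrogate "automatic" for the end user.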
RT @tingchenai: 📢Introducing Pix2Seq-D, a generalist framework casting panoptic segmentation as a discrete data generation task conditioned…
This was a really interesting project for me to learn about and apply neural fields, with some great collaborators!
📢📢📢 CUF – Continuous Upsampling Filters Neural fields beat classical CNNs in (regressive) super-res: AFAIK, a first for @neural_fields in 2D deep learning? Mostly their wins are in sparse, higher-dim signals ~NeRF
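The neural-field idea behind continuous upsampling is that a signal is represented as a function of continuous coordinates, so it can be queried at any resolution. As a minimal stand-in for the coordinate MLP such models use (not the CUF architecture), this sketch fits a random-Fourier-feature regression to coarse samples of a 1D signal and then "upsamples" by querying between them.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = lambda x: np.sin(2 * np.pi * 3 * x)  # ground-truth continuous signal

# Random Fourier features: phi(x)_j = cos(w_j * x + b_j) for random w, b.
# These play the role of the learned continuous representation.
W = rng.normal(0.0, 10.0, size=64)
B = rng.uniform(0.0, 2 * np.pi, size=64)
phi = lambda x: np.cos(np.outer(x, W) + B)

# Fit on a coarse grid: the "low-resolution" observations
x_lo = np.linspace(0, 1, 32)
coef, *_ = np.linalg.lstsq(phi(x_lo), signal(x_lo), rcond=None)

# Query the continuous representation on a 4x finer grid
x_hi = np.linspace(0, 1, 128)
pred = phi(x_hi) @ coef
```

Because the fitted function is defined everywhere, the upsampling factor is chosen at query time rather than baked into the architecture, which is the property the tweet highlights over fixed-stride CNN upsamplers.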