Jeremias Knoblauch Profile
Jeremias Knoblauch

@LauchLab

Followers: 2K
Following: 1K
Statuses: 396

Associate Professor & EPSRC Fellow @ UCL. Post-Bayesian seminar series sign-up @ https://t.co/a0MyAQOh16. Research mission @ https://t.co/kNIjvCrGne

London
Joined April 2019
Jeremias Knoblauch (@LauchLab) · 3 years
If you are looking for a PhD position at the intersection of Bayesian statistics and machine learning, consider getting in touch with me! I'm building a research group on generalised Bayesian methods with applications in ML at UCL over the coming years.
UCL Statistical Science (@stats_UCL) · 3 years
🚨🚨🚨 7 PhD Studentships at UCL in Statistical Science. More information 👇:
11 replies, 89 retweets, 296 likes
Jeremias Knoblauch (@LauchLab) · 2 months
RT @JulyanArbel: Why posteriors MUST be modified, and a simple technique for setting them up. Dr. French's Modified Posteriors (1937) @ISBA_e…
0 replies, 1 retweet, 0 likes
Jeremias Knoblauch (@LauchLab) · 2 months
RT @TechAtBloomberg: Congratulations to @UCL / @stats_UCL's @matialtamiranom on being one of the 2024-2025 @Bloomberg #DataScience Ph.D. Fe…
0 replies, 2 retweets, 0 likes
Jeremias Knoblauch (@LauchLab) · 2 months
RT @matialtamiranom: Super happy and grateful to be awarded the Bloomberg fellowship. Huge thanks to my amazing supervisors @LauchLab and @…
0 replies, 2 retweets, 0 likes
Jeremias Knoblauch (@LauchLab) · 2 months
@Chau9991 congrats! Looking forward to seeing your group grow :)
0 replies, 0 retweets, 1 like
Jeremias Knoblauch (@LauchLab) · 2 months
🥳🥳🥳Massive congratulations to my PhD student @matialtamiranom for winning one of the 3 Bloomberg Fellowships for his work on robust & efficient post-Bayesian methods! I cannot think of anyone more deserving!😊
1 reply, 2 retweets, 34 likes
Jeremias Knoblauch (@LauchLab) · 3 months
@sirbayes thanks for the plug, Kevin! :)
1 reply, 0 retweets, 2 likes
Jeremias Knoblauch (@LauchLab) · 3 months
📢 Post-Bayesian online seminar series coming!📢 To stay posted, sign up at https://t.co/a0MyAQOh16. We'll discuss cutting-edge methods for posteriors that no longer rely on Bayes' theorem (e.g., PAC-Bayes, generalised Bayes, martingale posteriors, ...). Pls circulate widely!
2 replies, 40 retweets, 188 likes
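[As a rough gloss of what these post-Bayesian methods share: generalised-Bayes posteriors of the kind the seminar covers typically take a Gibbs form, in which a loss function and a learning rate stand in for the log-likelihood. The symbols \ell and \lambda below are generic placeholders, not notation taken from the tweet:

\pi_\lambda(\theta \mid x_{1:n}) \propto \pi(\theta) \exp\!\Big( -\lambda \sum_{i=1}^{n} \ell(\theta, x_i) \Big)

Setting \ell(\theta, x) = -\log p(x \mid \theta) and \lambda = 1 recovers the standard Bayesian posterior, which is why these constructions are called generalisations of Bayes.]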
Jeremias Knoblauch (@LauchLab) · 3 months
@patrickshafto If I remember correctly, the ROI is absolutely ridiculous (something like a multiplier of 1200 for each £ invested in maths research...)
0 replies, 0 retweets, 1 like
Jeremias Knoblauch (@LauchLab) · 4 months
RT @ayushb05: Interested in reducing the computational cost of running simulations in simulation-based inference (SBI)? Check out our new…
0 replies, 12 retweets, 0 likes
Jeremias Knoblauch (@LauchLab) · 5 months
@TakuoMatsubara Would be great to have you over in London to talk about this; are you planning a trip down south any time soon? :)
1 reply, 0 retweets, 1 like
Jeremias Knoblauch (@LauchLab) · 5 months
@SirioLegramanti @DanieleDurante2 @PierreAlquier How beautiful! Looking forward to reading the final version of this herculean project :)
1 reply, 0 retweets, 2 likes
Jeremias Knoblauch (@LauchLab) · 5 months
@invictusqed Yep, thanks for spotting it; it actually is also consistent between seconds & hours. When I convert into decades or millennia, it jumps back to 2% because it converts back to years. I'm sure there are more interesting ways of breaking this!
1 reply, 0 retweets, 0 likes
Jeremias Knoblauch (@LauchLab) · 5 months
@miniapeur @OmarRivasplata @stats_UCL Yes; e.g. recent work led by @matialtamiranom. More broadly, you can use optimisation-centric generalisations in function space & go beyond standard nonparametric Bayes in a similar way.
1 reply, 0 retweets, 2 likes
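[For context, the optimisation-centric view mentioned in this reply casts the (generalised) posterior as the solution of a regularised optimisation problem over probability measures. One standard formulation, continuing the placeholder symbols \ell and \lambda from the gloss above, is

q^* = \operatorname{argmin}_{q \in \mathcal{P}(\Theta)} \Big\{ \lambda\, \mathbb{E}_{q(\theta)}\Big[ \sum_{i=1}^{n} \ell(\theta, x_i) \Big] + \mathrm{KL}(q \,\|\, \pi) \Big\},

whose minimiser is exactly the Gibbs posterior given earlier. Swapping the KL regulariser for another divergence, or optimising over measures on a function space rather than a parameter space, gives the kind of generalisation alluded to in the reply.]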
Jeremias Knoblauch (@LauchLab) · 5 months
@vernadec @ERC_Research @ml4science Hard to think of someone more deserving! :)
0 replies, 0 retweets, 1 like
Jeremias Knoblauch (@LauchLab) · 6 months
@andresmasegosa thanks for the link, Andres! I wasn't aware of your new paper yet, so it'll go right to the top of my reading stack.
0 replies, 0 retweets, 1 like
Jeremias Knoblauch (@LauchLab) · 6 months
@roydanroy I largely agree with you; we actually have the case of the temperature varying with n (but in the appendix as additional results rather than in the main text)---the findings aren't as straightforward to present because the contraction rates now depend on \tau
1 reply, 0 retweets, 0 likes
Jeremias Knoblauch (@LauchLab) · 6 months
@BlackHC … really are two complementary targets: e.g., if your model is misspecified, you might be better off using a plug-in estimator that comes from a robust loss than you would be using a tempered posterior (irrespective of which temperature you choose)
0 replies, 0 retweets, 1 like
Jeremias Knoblauch (@LauchLab) · 6 months
@BlackHC Instead, you'd hope there would be some kind of 'ideal' temperature for prediction. This turns out not to be the case. We think this is relevant because it re-emphasises that parameter uncertainty and predictive uncertainty in Bayesian methods …
0 replies, 0 retweets, 1 like
Jeremias Knoblauch (@LauchLab) · 6 months
@BlackHC I would not call it trivial, but I do agree that it's intuitive---and for the exact reason you named. The reason it's not trivial is that you probably wouldn't expect the effect of raising the temperature to be monotonic.
0 replies, 0 retweets, 1 like
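[For reference, the tempered posterior discussed in this thread is, in one common convention (the thread's \tau is the tempering parameter; the formula itself is not quoted from the tweets),

\pi_\tau(\theta \mid x_{1:n}) \propto \pi(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)^{\tau},

so that \tau = 1 recovers the standard posterior, \tau < 1 downweights the likelihood relative to the prior, and \tau > 1 sharpens it.]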