Muhammad Firmansyah Kasim Profile
Muhammad Firmansyah Kasim

@mfkasim

Followers: 1K
Following: 70
Statuses: 3K

Researcher at @UniofOxford | Lecturer in math at @TrinityOxford

Oxford
Joined April 2009
@mfkasim
Muhammad Firmansyah Kasim
4 years
Fully-differentiable Density Functional Theory written in @PyTorch. Pre-print paper here: I wonder what new applications this could bring.
1
2
22
@mfkasim
Muhammad Firmansyah Kasim
2 months
Really good work on parallelising RNNs!
@scott_linderman
Scott Linderman
2 months
Did you know that you can parallelize *nonlinear* RNNs over their sequence length!? Our @NeurIPSConf paper "Towards Scalable and Stable Parallelization of Nonlinear RNNs" introduces quasi-DEER and ELK to parallelize ever larger and richer dynamical systems! 🧵 [1/11]
0
0
10
@mfkasim
Muhammad Firmansyah Kasim
4 months
RT @Hind_Gaza: Situation in Northern Gaza
0
2K
0
@mfkasim
Muhammad Firmansyah Kasim
4 months
RT @allinwithchris: WATCH: @chrislhayes honors Shaban Al Dalu, a Palestinian teen who burned to death after an Israeli strike. "That young…
0
10K
0
@mfkasim
Muhammad Firmansyah Kasim
6 months
@crawlingladybug @AsahPolaPikir No wonder you've been sneezing so much, kak …
1
0
1
@mfkasim
Muhammad Firmansyah Kasim
7 months
@_fikri_auliya If you plot index vs time, it becomes a sawtooth signal with period 2*(n-1), so T can be reduced to T = T % [2*(n-1)]. After that, just split into cases depending on whether T <= n-1 or not.
0
0
2
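A minimal sketch of the index trick in the reply above, assuming the underlying question is the classic "bouncing index" that runs 0, 1, ..., n-1, n-2, ..., 1, 0, ... (the tweet being replied to is not shown, so the problem statement and the helper names below are my own illustration).

def bouncing_index(T, n):
    # Closed form from the reply: fold T into one sawtooth period of length 2*(n-1),
    # then decide whether we are on the rising half or the falling half.
    T = T % (2 * (n - 1))
    return T if T <= n - 1 else 2 * (n - 1) - T

def bouncing_index_naive(T, n):
    # Direct step-by-step simulation, used only to check the closed form.
    i, step = 0, 1
    for _ in range(T):
        i += step
        if i in (0, n - 1):
            step = -step
    return i

assert all(bouncing_index(t, 5) == bouncing_index_naive(t, 5) for t in range(100))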
@mfkasim
Muhammad Firmansyah Kasim
8 months
@uqyauthor No wonder my Twitter notifications suddenly blew up. Hahaha.. Thanks, qy!
1
0
9
@mfkasim
Muhammad Firmansyah Kasim
8 months
@imrenagi … and can survive on an Indonesian "salary" in Switzerland.
1
0
3
@mfkasim
Muhammad Firmansyah Kasim
1 year
@_fikri_auliya In tensor programming for deep learning, using a for-loop (whatever the loop variable) is makruh (discouraged)
0
0
1
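A minimal sketch (my own example, assuming PyTorch, which the author uses elsewhere) of the point in the reply above: replace a Python for-loop over tensor slices with one batched tensor operation.

import torch

x = torch.randn(1000, 64)
w = torch.randn(64, 64)

# the "makruh" way: a Python for-loop over the rows of x
out_loop = torch.stack([row @ w for row in x])

# the preferred way: one batched matrix multiply over the whole tensor
out_vec = x @ w

assert torch.allclose(out_loop, out_vec, atol=1e-5)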
@mfkasim
Muhammad Firmansyah Kasim
1 year
@francoisfleuret You might want to check our paper on how to evaluate nonlinear RNNs in parallel:
0
0
3
@mfkasim
Muhammad Firmansyah Kasim
1 year
@_fikri_auliya At home we do it all the time; we even want to coin a new word to keep it short: "GPT it"
0
0
0
@mfkasim
Muhammad Firmansyah Kasim
1 year
@sonnylazuardi If you're ever in Oxford, let's grab a coffee
0
0
1
@mfkasim
Muhammad Firmansyah Kasim
1 year
@ZikriIsmal @aria_ghora @__nggih @nmonarizqa @rizalzaf @imcimon It depends on the job description. We usually call it ML research engineer, because the work is close to research. The skillset is also different from SE: tensor programming, reading papers + implementing them, linear algebra
0
3
12
@mfkasim
Muhammad Firmansyah Kasim
1 year
@austerralia @imrenagi Afraid of getting hit with UU ITE (Indonesia's Electronic Information and Transactions Law) :p
0
0
1
@mfkasim
Muhammad Firmansyah Kasim
1 year
@imrenagi The seminar was organized by ITB students; they invited 2 alumni who both turned out to be from Oxford as well. In other words: "famous millennial kids". Hahaha
1
0
0
@mfkasim
Muhammad Firmansyah Kasim
1 year
Can we parallelize RNNs? We can! Now you can see how in our repo: This is the code implementation of the paper I shared last week:
0
1
8
@mfkasim
Muhammad Firmansyah Kasim
1 year
Our work's post on @Reddit is being featured on the @slashML Twitter!
@slashML
/MachineLearning
1 year
Parallelizing RNN over its sequence length
0
1
4
@mfkasim
Muhammad Firmansyah Kasim
1 year
@aria_ghora Non-convergence happens because Newton's method has no guarantee of converging, so if the initial guess is far off, it can even diverge. The paper is fairly heavy on the math, so if you have any questions, just ask.
1
0
0
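A minimal illustration (mine, not from the paper) of the non-convergence point in the reply above: plain Newton's method with a far-off initial guess can diverge. The classic textbook example is f(x) = arctan(x), whose only root is x = 0.

import math

def newton(f, fprime, x0, steps=10):
    # Plain Newton iteration x <- x - f(x)/f'(x), with no damping or line search.
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

def f(x):
    return math.atan(x)

def fprime(x):
    return 1.0 / (1.0 + x * x)

print(newton(f, fprime, 1.0))   # close initial guess: converges to ~0
print(newton(f, fprime, 1.5))   # far initial guess: the iterates blow up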
@mfkasim
Muhammad Firmansyah Kasim
1 year
Another drawback is scalability: our method scales poorly, with O(n^3) time complexity in the number of dimensions n. However, we can still achieve a speed-up using a GRU with 64 dimensions (which is quite a common value in experiments).
0
0
0