William Gilpin

@wgilpin0

Followers: 4,940
Following: 2,149
Media: 109
Statuses: 1,434

asst prof @UTAustin physics @OdenInstitute interested in chaos, fluids, & biophysics.

Austin, TX
Joined May 2016
Pinned Tweet
@wgilpin0
William Gilpin
8 months
Is chaos an analogue sampling scheme? I wrote a Perspective for @NatRevPhys discussing connections between classical work on information flow in nonlinear systems, and modern generative ML models
6
54
277
@wgilpin0
William Gilpin
10 months
Can machine learning predict chaos? My new paper performs a large-scale comparison of modern forecasting methods on a giant dataset of 135 chaotic systems. (1/N) @TexasScience @OdenInstitute @UTPhysics
52
584
3K
@wgilpin0
William Gilpin
3 years
Is chaos actually hard to predict? For NeurIPS this year I made a database of 131 known strange attractors, and trained state-of-the-art forecasting models on each one, to try to figure this out (1/N): Paper: Dataset + Code:
24
406
2K
@wgilpin0
William Gilpin
2 years
Excited to develop a new grad computational physics course at @TexasScience . The entire class is open-source, incl discussions, lectures, hw, & solutions. Anyone, anywhere can access materials, and even submit pull requests/issues for bugs, typos, etc
22
241
1K
@wgilpin0
William Gilpin
5 years
My new preprint explores using neural networks to learn “strange attractors” from time series. If we measure a complex system over time, can we infer additional measurement dimensions & discover underlying structure? (1/N)
8
229
864
@wgilpin0
William Gilpin
3 years
I’ll be joining the @UTAustin physics faculty in Fall 2022, affiliated with @OdenInstitute ! We’ll be working on computational & experimental chaos, fluids, and data-driven modeling. I’ll be looking for students, postdocs, & visitors soon. Thanks to all who made this possible!
33
20
324
@wgilpin0
William Gilpin
10 months
The final projects for my graduate computational physics course were really cool. A group of amazing undergrads implemented Anderson localization under the tight binding model on arbitrary graphs (1/4)
3
34
308
@wgilpin0
William Gilpin
1 year
The second year of my fully open-source graduate computational physics course is now underway @TexasScience @OdenInstitute We invite students anywhere in the world to participate. Mods/Issues/PRs on GitHub welcome! (vid: condensate forms in turbulence)
5
57
251
@wgilpin0
William Gilpin
10 months
I found that large domain-agnostic models (Transformers, LSTM, etc) can forecast chaos really far into the future (>10 Lyapunov times). With enough training history, they outperform physics methods (next-gen reservoir computers, neural ODE, etc).
Tweet media one
6
35
210
@wgilpin0
William Gilpin
3 years
I’m looking for grad students for several openings starting Fall 2022 or later! Excited to talk to students w/ diverse backgrounds interested in any combo of complex systems, machine learning, nonlinear & fluid dynamics, especially applied to biological questions (1/4)
1
62
196
@wgilpin0
William Gilpin
3 years
My new paper shows oscillators that try to synchronize, but can’t b/c of mutual repulsion and jamming. In intermediate regimes, synchronization and neighbor interactions compete to produce avalanches @PhysRevResearch Paper: Code:
4
40
193
@wgilpin0
William Gilpin
6 years
My new paper "Cryptographic hashing using chaotic hydrodynamics" appears today in PNAS. I found that keeping a secret is as easy as stirring your coffee: @PNASNews
8
59
175
@wgilpin0
William Gilpin
2 years
I’m hiring a postdoc at UT Austin physics. We work on computational nonlinear dynamics for fluids and biological systems. We have projects with various combos of pencil-and-paper theory, coding-heavy ML for time series, and even a few tabletop experiments (1/3)
Tweet media one
3
45
169
@wgilpin0
William Gilpin
3 years
I am very grateful and excited to be on the Forbes 30 under 30 list this year! I've been very lucky to have amazing mentors like Manu supporting my career
@PrakashLab
PrakashLab
3 years
The latest Forbes 30 under 30 list is out - fantastic to see so many upcoming young scientists' work highlighted. Also a proud moment to see @wgilpin0 amongst them, who is about to open the doors of his brand new lab at UT Austin soon.
1
3
45
13
4
124
@wgilpin0
William Gilpin
6 years
It is wonderful to visit somewhere with dark skies. Here is the center of the galaxy rising over Kauai (30 second exposure)
Tweet media one
2
24
118
@wgilpin0
William Gilpin
10 months
The observation that model scale + large amounts of data outperform domain-specific inductive biases is known as the “bitter lesson”, and it’s a recurring theme of recent ML work
Tweet media one
Tweet media two
4
14
118
@wgilpin0
William Gilpin
2 years
I’m looking for a postdoc with broad interests in any intersection of nonlinear dynamics, statistical learning, and soft matter physics. Lots of independence to pursue anything you find exciting!
@jobRxiv
jobRxiv
2 years
Postdoc positions in soft matter, machine learning, and dynamical systems at UT Austin with @wgilpin0 The University of Texas at Austin, USA #ScienceJobs
0
2
12
1
45
119
@wgilpin0
William Gilpin
5 years
The goal of this project is to create a general-purpose tool for exploratory analysis of time series—you never know what you might find! The code is all available , and I would be excited to chat more with anyone who wants to try it out! (N/N)
5
14
109
@wgilpin0
William Gilpin
4 years
My paper on learning strange attractors from time series has been accepted by #NeurIPS2020. It's been really fun revisiting classic works on chaos in the context of modern ML tools. I’m very grateful to the ML/AI community for considering my work.
@wgilpin0
William Gilpin
5 years
My new preprint explores using neural networks to learn “strange attractors” from time series. If we measure a complex system over time, can we infer additional measurement dimensions & discover underlying structure? (1/N)
8
229
864
1
13
109
@wgilpin0
William Gilpin
3 years
For each system, I trained several deep learning models (Transformer, NBEATS, LSTM, neural ODE), plus classical (ARIMA, Linear, FFT) & statistical (Prophet, Exponential Smooth). Deep learning forecasts chaos super well, even w/ noise. Here's epochwise-prediction during training:
3
11
103
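A minimal sketch of how such a forecasting comparison can be set up with the darts time-series library. This is only an illustration, not the paper's actual pipeline; the placeholder data, model choice, and hyperparameters are assumptions.

```python
import numpy as np
from darts import TimeSeries
from darts.models import NBEATSModel
from darts.metrics import smape

# placeholder data: a scalar observable standing in for a chaotic trajectory
x = np.sin(np.linspace(0, 100, 2000)) + 0.1 * np.random.randn(2000)
series = TimeSeries.from_values(x.astype(np.float32))
train, test = series[:1500], series[1500:]

# one deep-learning baseline; ARIMA, Prophet, etc. follow the same fit/predict API in darts
model = NBEATSModel(input_chunk_length=100, output_chunk_length=20, n_epochs=5)
model.fit(train)
forecast = model.predict(len(test))
print(smape(test, forecast))
```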
@wgilpin0
William Gilpin
7 years
The lake-dwelling protist Stentor uses vortices to trap edible particles. From … with @PrakashLab @Viveknprakash
3
63
103
@wgilpin0
William Gilpin
4 years
I wrote a perspective with Yitong Huang & Danny Forger about using machine learning to discover hidden dynamics in biological data! Can machine learning tell us something new about our data, or does it just re-package what we already know?
Tweet media one
0
35
98
@wgilpin0
William Gilpin
3 years
Here’s all 131 chaotic systems as an embedding, colored with unsupervised clustering. Includes familiar examples like Lorenz & Mackey-Glass, plus application-specific examples like ecosystems, biochemical networks, & fluid flows. Ranges/pts are meds/errs over many initial conds.
Tweet media one
2
8
96
@wgilpin0
William Gilpin
3 years
What does this mean for forecasting? Models that implicitly “lift” input dimensionality (time lags, nonlinear kernels, etc) do better—perhaps chaotic systems act more linear in higher dims? See recent work on Koopman and Perron-Frobenius propagators for dynamics
Tweet media one
3
10
95
@wgilpin0
William Gilpin
7 months
Our group successfully completed our first experimental project... installing a new office whiteboard!
Tweet media one
Tweet media two
4
0
87
@wgilpin0
William Gilpin
5 years
Excited to post our @NatRevPhys review on “The multiscale physics of cilia and flagella” We review amazing recent discoveries from the perspective of statistical physics and dynamical systems. It’s been a wonderful journey with @PrakashLab and Matt Bull
Tweet media one
0
28
84
@wgilpin0
William Gilpin
8 years
A swarm of sea monkeys chasing a blue laser beam (see paper by Wilhelmus and Dabiri )
3
45
79
@wgilpin0
William Gilpin
10 months
But if you know nothing a priori about your time series, and you have a *lot* of data, new ML models work really well if you tune them carefully. >10 Lyapunov time forecasts would have seemed crazy a few decades ago.
Tweet media one
3
8
78
@wgilpin0
William Gilpin
3 years
Not all systems are easy to forecast. I correlated forecast accuracy with fractal dimension, Lyapunov exponent, entropy, etc. The Lyapunov exponent correlates pretty well with predictability. Seems intuitive, but it’s crazy that a local quantity clearly affects sophisticated long-term forecast algos
Tweet media one
3
4
69
@wgilpin0
William Gilpin
7 years
Our new paper on flow visualization appears on the cover of today's @J_Exp_Biol ! Cheers @Viveknprakash @PrakashLab
Tweet media one
1
20
69
@wgilpin0
William Gilpin
7 years
My collection of seashells that exhibit cellular automaton patterns
Tweet media one
6
11
65
@wgilpin0
William Gilpin
10 months
The biggest takeaway is that the bias-variance tradeoff rules. If you're forecasting a system where you have strict constraints from domain knowledge (Hamiltonian, etc), put them in the model---you'll save yourself compute, need less data, & have less hyperparameter tuning
1
5
58
@wgilpin0
William Gilpin
10 months
However, if we restrict the training history, inductive biases of physics-based models win out. Here, we titrate the amount of training history, re-train each model, and compare forecast quality at 1 Lyapunov time
Tweet media one
1
6
56
@wgilpin0
William Gilpin
5 years
It's been a huge privilege to work with and learn from you @PrakashLab these past few years. I feel very lucky to be part of such an incredible and creative team. But while I may be gone, my hard drive collection will live on in lab forever...
@PrakashLab
PrakashLab
5 years
Congratulations Dr William Gilpin - @wgilpin0 what a spectacular PhD defense. A combination of beautiful science presented elegantly. And that patiria miniata cake is out of this world! I am so proud to have worked with you William.
Tweet media one
Tweet media two
Tweet media three
Tweet media four
2
4
78
2
4
49
@wgilpin0
William Gilpin
7 years
A water drop hits a sandy surface. From a beautiful paper by Zhao et al:
2
23
43
@wgilpin0
William Gilpin
10 months
Our course is fully open-source, and we welcome anyone using or modifying our materials. Pull requests are always welcome (4/4) Course website: Course GitHub:
0
4
40
@wgilpin0
William Gilpin
5 years
We can also try applying it to datasets where the underlying attractor is unknown, including patient electrocardiograms, neural data, and even temperature readings of “Old Faithful” geyser eruptions. There are some surprising (but interpretable) low-dimensional dynamics (3/N)
Tweet media one
1
5
40
@wgilpin0
William Gilpin
4 years
I’ll be chatting about my NeurIPS paper on using deep learning to reconstruct strange attractors from time series tonight at #NeurIPS2020 Poster Session 7 (9 PT / 12 ET). Paper here: Below I used the method to visualize eruptions of the Old Faithful geyser
2
6
40
@wgilpin0
William Gilpin
3 years
There are also applications to data-driven modelling: the paper has examples using all 131 chaotic systems to benchmark neural ODEs, as well as symbolic regression algorithms (N/N)
1
1
37
@wgilpin0
William Gilpin
7 years
A rock-paper-scissors game creates Turing patterns. From a simple model of biodiversity by Reichenbach et al:
2
15
39
@wgilpin0
William Gilpin
5 years
It's been an amazing experience working with so many fun, kind, and creative people these past five years, and learning from Manu's curiosity-driven approach to science. I'm looking forward to learning to ice climb this year.
@PrakashLab
PrakashLab
5 years
It’s always a bittersweet moment to see a lab member leave for new adventures - this time it’s @wgilpin0 who is starting as a Qbio Fellow at Harvard👏. Some warm hugs were exchanged, some old jokes reborn. Wish you the very best - incredibly proud of everything you have already done!
Tweet media one
Tweet media two
Tweet media three
Tweet media four
1
2
60
1
4
38
@wgilpin0
William Gilpin
5 years
My paper showing how to train convolutional neural networks to represent cellular automata appears today in @PhysRevE @APSphysics
1
10
38
@wgilpin0
William Gilpin
4 years
Thanks very much to @PrakashLab for a wonderful PhD experience, and to @ApsDbio for this amazing opportunity!
@PrakashLab
PrakashLab
4 years
Excited about William’s @wgilpin0 invited Award talk at APS March meeting tomorrow for “Outstanding Doctoral Thesis Research in Biological Physics (2020)” - I wish we could celebrate in person; alas we will just have to enjoy ciliary flows virtually.
Tweet media one
1
8
92
1
2
38
@wgilpin0
William Gilpin
2 years
We wrapped up the first semester of our open-source computational physics course @TexasScience ---the students' final projects are pretty cool: (1/N)
Tweet media one
3
5
37
@wgilpin0
William Gilpin
5 years
The root of this approach is a pair of powerful theorems by Whitney & Takens, which give conditions under which variables can be recovered from the time history of a system using time lags. These are the basis of popular “state space reconstructions” in ecology & earth science (4/N)
Tweet media one
Tweet media two
3
2
36
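For reference, a minimal sketch of the time-delay ("lag") embedding these theorems justify. The function name, the placeholder signal, and the lag/dimension values are illustrative, not taken from the paper.

```python
import numpy as np

def delay_embed(x, dim=3, tau=10):
    """Embed a scalar time series x into R^dim using time lags (Takens-style)."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# placeholder scalar measurement; in practice this is one observed coordinate of the system
x = np.sin(0.02 * np.arange(5000)) + 0.5 * np.sin(0.005 * np.arange(5000))
embedded = delay_embed(x, dim=3, tau=25)   # shape: (len(x) - 2*25, 3)
```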
@wgilpin0
William Gilpin
3 years
Other cool things you can do w/ the database: unlike traditional time series, DiffEqs are generative: they can be re-integrated at any granularity, length, or noise level. I used the dataset for transfer learning by pretraining a timescale-matched featurizer for a classifier
2
1
35
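A minimal illustration of the "generative" point above, using the Lorenz system and plain scipy rather than the dataset package itself: the same differential equation can be re-integrated at whatever sampling granularity, length, or noise level you like. Values below are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

y0, t_max = [1.0, 1.0, 1.0], 50.0
# the same system sampled coarsely or finely, just by changing t_eval
coarse = solve_ivp(lorenz, (0, t_max), y0, t_eval=np.arange(0, t_max, 0.05))
fine = solve_ivp(lorenz, (0, t_max), y0, t_eval=np.arange(0, t_max, 0.005))
noisy = fine.y + 0.1 * np.random.randn(*fine.y.shape)  # optional measurement noise
```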
@wgilpin0
William Gilpin
3 years
Here’s the full ranking across all the models. Distributions correspond to forecast errors across all 131 attractors. Left violin is coarse sampling granularity; Right is fine granularity
Tweet media one
1
2
34
@wgilpin0
William Gilpin
3 years
I’ll be at NeurIPS Poster E3 tomorrow 12/10, 8:30 - 10 am PT to discuss my paper on using hundreds of chaotic systems to benchmark forecasting & other time series problems. Turns out classical chaoticity & ML predictability have a subtle relationship
@wgilpin0
William Gilpin
3 years
For each system, I trained several deep learning models (Transformer, NBEATS, LSTM, neural ODE), plus classical (ARIMA, Linear, FFT) & statistical (Prophet, Exponential Smooth). Deep learning forecasts chaos super well, even w/ noise. Here's epochwise-prediction during training:
3
11
103
2
3
35
@wgilpin0
William Gilpin
10 months
Colors above are highlighted methods, and I’ll use the same colors for the ensuing figures. For all figures, error bars are across 135 low-dimensional chaotic systems, and points are single forecast methods (e.g. Transformer, neural ODE, etc)
1
2
34
@wgilpin0
William Gilpin
3 years
Multiscale models like NBEATS or TCN (causal dilated convolutions) seem to do better, perhaps because chaotic systems have continuous spectra. The results hold up across a range of noise levels and granularities (sampling rates)
Tweet media one
2
2
32
@wgilpin0
William Gilpin
10 months
Weirdly, the Lyapunov exponent doesn’t correlate empirically with how well different methods perform on different chaotic systems, especially over longer forecasting horizons. More on this in ~*~future work~*~
Tweet media one
1
4
34
@wgilpin0
William Gilpin
5 years
In the new method here, an autoencoder is trained on the measurement history, with a new latent-space loss function & regularizer based on an influential classical technique by Kennel, Brown, & Abarbanel for determining the embedding dimension by avoiding “false neighbors” (5/N)
Tweet media one
1
2
33
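A rough structural sketch of such a latent-space autoencoder in PyTorch. The paper's regularizer is built from the false-nearest-neighbors criterion; the simple L1 latent penalty below is only a placeholder standing in for it, and all layer sizes, the window length, and the batch are illustrative.

```python
import torch
import torch.nn as nn

class AttractorAutoencoder(nn.Module):
    """Map windows of a measured scalar series to a low-dimensional latent state and back."""
    def __init__(self, window=100, latent_dim=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(window, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, window))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = AttractorAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
batch = torch.randn(32, 100)  # placeholder batch of measurement windows
recon, latent = model(batch)
# reconstruction loss plus a latent regularizer; the paper uses a false-neighbors-based
# term here, which this simple sparsity penalty only approximates in spirit
loss = nn.functional.mse_loss(recon, batch) + 1e-2 * latent.abs().mean()
loss.backward()
optimizer.step()
```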
@wgilpin0
William Gilpin
5 years
We can test this hypothesis by training the network on single coordinates of several well-known strange attractors (Lorenz’s butterfly, Rössler’s Möbius strip, etc), and find that it can reconstruct the full systems pretty well (2/N)
Tweet media one
1
5
33
@wgilpin0
William Gilpin
2 years
All the videos above are homework problems: (1) Avalanche activity cascades in a sandpile automaton. (2) Vortex street formed by flow past a cylinder. (3) Turing patterns in the Gray-Scott model
1
2
31
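For the third example above, a minimal Gray-Scott reaction-diffusion sketch (not the course's homework solution; the parameters are typical Pearson-type values chosen for illustration):

```python
import numpy as np

def gray_scott_step(u, v, Du=0.16, Dv=0.08, F=0.035, k=0.065, dt=1.0):
    """One explicit Euler step of the Gray-Scott model on a periodic grid."""
    lap = lambda a: (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                     np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)
    uvv = u * v * v
    u_next = u + dt * (Du * lap(u) - uvv + F * (1 - u))
    v_next = v + dt * (Dv * lap(v) + uvv - (F + k) * v)
    return u_next, v_next

# a uniform state with a small perturbed square seeds the patterns
n = 128
u, v = np.ones((n, n)), np.zeros((n, n))
u[n//2-5:n//2+5, n//2-5:n//2+5] = 0.50
v[n//2-5:n//2+5, n//2-5:n//2+5] = 0.25
for _ in range(5000):
    u, v = gray_scott_step(u, v)
```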
@wgilpin0
William Gilpin
4 years
"The Strange New Science of Chaos" - A 1989 episode of PBS's Nova featuring interviews and elegant demos with Lorenz, Swinney, Shaw, Gollub, et al.
Tweet media one
2
5
28
@wgilpin0
William Gilpin
3 years
@oecodynamics @ricard_sole Here's a high-res version of this poster on GitHub. The raw code used to make the figure is in the top-level “demos” notebook
3
0
28
@wgilpin0
William Gilpin
3 years
Excerpts from “Chaos in the brickyard,” a 1963 letter to Science by Bernard Forscher about science as “brickmaking” as opposed to "building"
Tweet media one
Tweet media two
Tweet media three
1
4
25
@wgilpin0
William Gilpin
7 years
Chaotic mixing by two "blinking" vortices. From Hassan Aref's 1984 @JFluidMech paper that first described chaotic advection
0
22
25
@wgilpin0
William Gilpin
6 years
The intro to the 1960s Japanese TV show “Ultraman” looks just like G.I. Taylor’s classic laminar flow reversal experiment. I wonder if they did the experiment, or if they just reversed the footage? thanks @RealCarlosSagan for sending this.
0
8
21
@wgilpin0
William Gilpin
10 months
We can also compare training costs. Bigger models take longer to train, but do better overall. Each point is a forecast model, error bars are across chaotic systems, and the “tilt” shows the correlation within each method class (no Simpson’s paradox)
Tweet media one
1
3
23
@wgilpin0
William Gilpin
10 months
I also compared different metrics of forecast quality. They appear to mostly agree. fwiw, I prefer SMAPE.
Tweet media one
2
3
22
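For reference, the usual definition of SMAPE (conventions vary slightly; this is the common symmetric form, bounded between 0 and 200%):

```python
import numpy as np

def smape(y_true, y_pred):
    """Symmetric mean absolute percentage error, in percent (0 = perfect, 200 = worst)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs(y_pred - y_true) /
                           ((np.abs(y_true) + np.abs(y_pred)) / 2))

print(smape([1.0, 2.0, 3.0], [1.1, 1.9, 3.3]))  # ≈ 8.1
```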
@wgilpin0
William Gilpin
10 months
Interestingly, methods that are better at pointwise forecasts also appear to better reconstruct invariants (manifold dimension, Lyapunov exponents, etc). So probabilistic forecasting and raw accuracy go hand-in-hand.
Tweet media one
1
4
21
@wgilpin0
William Gilpin
4 years
An awesome blogpost about my recent preprint, and the broader field of chaos, by @zkajdan at RStudio
@posit_pbc
Posit PBC
4 years
Deep attractors: Where deep learning meets chaos - #rstats
0
20
72
0
8
20
@wgilpin0
William Gilpin
6 years
My preprint “Cellular automata as convolutional neural networks” appears today on . Here is a video showing the different stages of training with Conway’s Game of Life
0
7
19
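The core idea of treating a cellular automaton as a convolution can be sketched in a few lines. This is a toy illustration of the connection, not the paper's trained-network setup; grid size and seed are arbitrary.

```python
import numpy as np
from scipy.signal import convolve2d

KERNEL = np.array([[1, 1, 1],
                   [1, 0, 1],
                   [1, 1, 1]])

def life_step(grid):
    """One Game of Life update written as a 2D convolution over the neighbor kernel."""
    neighbors = convolve2d(grid, KERNEL, mode="same", boundary="wrap")
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

rng = np.random.default_rng(0)
grid = rng.integers(0, 2, size=(64, 64))
for _ in range(100):
    grid = life_step(grid)
```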
@wgilpin0
William Gilpin
5 years
Thirty years ago, Berkeley math Prof René De Vogelaere died unexpectedly, leaving behind an unfinished 800 page textbook on Finite Euclidean Geometry that he had spent 10 years writing. Today, we've digitized and made his book freely available on @arXiv
Tweet media one
1
9
20
@wgilpin0
William Gilpin
10 months
Another group used the adjoint method to map a full-state computational brain model onto a reduced-order Hodgkin-Huxley system (2/4)
Tweet media one
1
1
21
@wgilpin0
William Gilpin
7 years
I gave a talk at #APSMarch last week about how baby starfish make chaotic mixing patterns as they swim. We think that they create these beautiful patterns as part of their feeding process: This work is w/ @PrakashLab and @Viveknprakash
Tweet media one
1
5
21
@wgilpin0
William Gilpin
6 years
Day 1304 of life in Palo Alto: A Tesla with the license plate "AI HYPE" cuts me off in a roundabout.
1
0
17
@wgilpin0
William Gilpin
2 years
Our group's first annual pumpkin carving contest included a variety of surreal entries.
Tweet media one
0
1
18
@wgilpin0
William Gilpin
6 years
Why watch fireworks this week when you can instead watch a montage of plants exploding? c/o @smithsonian #july4th
1
4
18
@wgilpin0
William Gilpin
7 years
Our new preprint about higher-order effects in evolutionary dynamics is now online:
Tweet media one
0
6
18
@wgilpin0
William Gilpin
6 years
A really stunning new video from @MBARI_News showing thousands of brooding octopuses near Monterey Bay From:
0
4
17
@wgilpin0
William Gilpin
5 years
My new preprint discusses surprising (and solvable) avalanches that occur when oscillators synchronize. Oscillator cascades occur in everything from the human brain to the U.S. power grid, and I show that short-range repulsion is one possible cause
0
3
15
@wgilpin0
William Gilpin
8 years
this large raft of kayaks looks a lot like a bacterial swarm / random packing of hard ellipses (pic by Nancy Battaglia for @NatGeo )
Tweet media one
2
5
16
@wgilpin0
William Gilpin
7 years
One thing I miss about living in Oklahoma is the fossils: these snail shells were all found strewn among the gravel on the side of the highway. #tbt
Tweet media one
2
3
14
@wgilpin0
William Gilpin
7 years
Our new paper uses reaction-diffusion dynamics to study the spread of Neanderthals and modern humans in Europe:
Tweet media one
1
5
14
@wgilpin0
William Gilpin
10 months
A different group implemented a set of stochastic optimizers in JAX, and studied the configurations of various many-body systems (3/4)
Tweet media one
1
0
14
@wgilpin0
William Gilpin
3 years
@quentinferry1 hi, it's all matplotlib via plt.savefig. I stitch frames into movies with ffmpeg
1
0
12
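A minimal version of that workflow (the filenames, plotted signal, and ffmpeg flags here are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

# render each frame with matplotlib and save it as a numbered PNG
x = np.linspace(0, 2 * np.pi, 400)
for i in range(100):
    plt.plot(x, np.sin(x + 0.1 * i))
    plt.axis("off")
    plt.savefig(f"frame_{i:04d}.png", dpi=150, bbox_inches="tight")
    plt.close()

# then stitch the frames into a movie with ffmpeg, e.g.:
#   ffmpeg -framerate 30 -i frame_%04d.png -pix_fmt yuv420p movie.mp4
```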
@wgilpin0
William Gilpin
7 years
@debivort @aexbrown this is awesome! you might be interested in our work on baby starfish (fig. 4 of ), a system with low-dim behavior where behavioral principal components almost exactly correspond to eigenfunctions of the solution to the underlying fluid dynamics PDE
0
1
12
@wgilpin0
William Gilpin
3 years
@eigensteve @bingbrunton Thank you so much—reading your work back in grad school is what got me interested in this whole field in the first place! (I’m sure you hear that a lot)
1
0
12
@wgilpin0
William Gilpin
3 years
our physics PhD program doesn’t require a physics undergrad degree or the physics GRE. We’ve had students with undergrad degrees in CS, engineering, bioinformatics, etc., and they haven’t found the 4 core courses burdensome. Quals are research-based talks, not written exams 😅 (3/4)
1
1
11
@wgilpin0
William Gilpin
4 years
An amazing piece by @zkajdan that implements my recent chaos embedding technique () in R, and uses it to predict heartbeats, neurons, and even geyser eruptions!
@posit_pbc
Posit PBC
4 years
New on the RStudio AI blog: Time series prediction with FNN-LSTM - #rstats #rkeras
Tweet media one
0
5
20
1
1
13
@wgilpin0
William Gilpin
7 years
My new paper on chaotic phase transitions in complex ecosystems appears today in #PLOSCompBio :
0
5
13
@wgilpin0
William Gilpin
7 years
A worm is slowly pressurized to 15 psi inside a #microfluidic device. From my old paper #celegans
1
4
13
@wgilpin0
William Gilpin
5 years
@FlyingOctopus0 @jm_alexia I tried this with the ECG embedding below. I kept the points/state static and rotated the embedding. Thank you for the tip!
0
0
13
@wgilpin0
William Gilpin
5 years
Stunning scientific cartoons from physicist Robert Shaw’s 1984 book “The Dripping Faucet As A Model Chaotic System.” Drawn by his brother, Chris Shaw.
Tweet media one
Tweet media two
0
2
12
@wgilpin0
William Gilpin
10 months
@shoyer @TexasScience @OdenInstitute @UTPhysics That's a really good point, I didn't include any probabilistic models (partly because I didn't think I was knowledgeable enough to benchmark them well). You pointed out last time that I should think about manifold geometry in addition to raw pointwise accuracy. (1/N)
1
2
12
@wgilpin0
William Gilpin
7 years
A green sea slug that makes literal solar panels from the chloroplasts of algae it eats. From Rumpho et al:
Tweet media one
0
7
11
@wgilpin0
William Gilpin
6 years
An ironic warning on the first page of Internet Archive’s digitization of Oseen’s Hydrodynamics book
Tweet media one
2
0
11
@wgilpin0
William Gilpin
10 months
@shoyer @TexasScience @OdenInstitute @UTPhysics But I agree that over long periods it's easy to "cheat." My pet theory is that models basically learn to shadow the dominant unstable periodic orbits, since those capture the majority of the measure. It's hard to tell a long-period orbit apart from chaos
0
0
7
@wgilpin0
William Gilpin
7 years
Saw a fascinating talk today by @KakaniKatija at @StanfordEng about surprisingly complex and beautiful mucus houses built by giant deep sea larvaceans, who can collectively filter all of Monterey Bay in just 13 days! Papers here:
1
4
9
@wgilpin0
William Gilpin
7 years
Meteor shower and a single night climber on El Capitan in Yosemite National Park, as the Pleiades cluster rises in the background. @YosemiteNPS @apod
Tweet media one
1
3
9
@wgilpin0
William Gilpin
4 years
Elegant experiments and modelling by @guille_rochelle and @PrakashLab that demonstrate how disorder can help cilia in the lungs pump fluid
@NaturePhysics
Nature Physics
4 years
Multi-scale spatial heterogeneity enhances particle clearance in airway ciliary arrays
1
7
19
0
2
9
@wgilpin0
William Gilpin
7 years
On top of earning @TeamUSA a medal, @mirai_nagasu pulled off an incredible demonstration of conservation of angular momentum
0
2
9
@wgilpin0
William Gilpin
7 years
Beautiful video about the cytoskeleton by @sshekhr !
0
4
9