dave bakker

@davidrbakker

Followers
899
Following
3K
Statuses
2K

PocketLab co-founder, Stanford teacher, and host of Science is Cool

Franklin, TN
Joined April 2010
@davidrbakker
dave bakker
5 hours
Great to see this! Go @katyisd - we need more school-age kids to know about great CTE careers! @DrTempleGrandin
@katyisd
Katy ISD
10 hours
Celebrating Career & Technical Education Month! DYK our Animal Science program is giving students a head start on exciting careers in the animal industry? Check out this video to learn more about the program, which is under the Agriculture, Food, and Natural Resources pathway.
0
0
0
@davidrbakker
dave bakker
1 day
@AskPlayStation Just curious... Is PSN more reliable if your PlayStation is from an authorized retailer?
0
0
0
@davidrbakker
dave bakker
2 days
YES! So glad to see PBL embraced in @TNedu schools!!
@theTSIN
Tennessee STEM Innovation Network
2 days
Exciting kickoff to our STEM + PBL Integration series with @TNedu! Huge thanks to @TNTechSTEM for hosting and to all the educators embracing PBL & CTE to transform STEM learning. This is just the beginning—stay tuned for more workshops! @CandiCollier72 #STEMEducation #PBL #CTE
Tweet media one
Tweet media two
Tweet media three
Tweet media four
0
0
1
@davidrbakker
dave bakker
3 days
RT @ThePocketLab: PocketLab announcement! Our most requested PocketLab sensor is finally here and it's all about Chemistry! PocketLab Ody…
0
1
0
@davidrbakker
dave bakker
4 days
Yes, owned one, and yes, could program in BASIC
@TsuiAllen
Allen Tsui
4 days
Yes @MuseumCommodore I did. How about you @EyesShutTeacher ?
0
0
1
@davidrbakker
dave bakker
5 days
Thoughts?
@FixingEducation
Fixing Education
5 days
A speaker was asked to speak to a high school class via Zoom. This was his experience.
0
0
0
@davidrbakker
dave bakker
5 days
Still time to sign up!!
@davidrbakker
dave bakker
5 days
Pre-show run of show with Hannah from @InstantWild and Emma from @SciStarter for next week's Science is Cool webinar on Camera Traps!! 🦊🐅🦁🦧 @thepocketlab
Tweet media one
0
0
1
@davidrbakker
dave bakker
5 days
Pre-show run of show with Hannah from @InstantWild and Emma from @SciStarter for next week's Science is Cool webinar on Camera Traps!! 🦊🐅🦁🦧 @thepocketlab
Tweet media one
0
0
2
@davidrbakker
dave bakker
5 days
This is a must read
@WilliamBryk
Will Bryk
5 days
Deep thoughts on Deepseek and Deep Research

There was a lot of big AI news the past 2 weeks, but the actual biggest news wasn't what you'd think.

The biggest news was not the trillion dollar Nvidia drop. That was a market overreaction bc of a spicy story. It was not the cheap Deepseek training run. That was impressive engineering under constraints but overblown. It was not the 500 billion dollar Stargate cluster. That was in line with predictions for big lab compute spend in the coming years. And it was not OpenAI's Deep Research. That was an impressive release but an entirely predictable combination of o3 with a traditional search engine API.

So what was the biggest news? The biggest news… was that Reinforcement Learning for LLMs "just works". RL for LLMs is now easy to get working. We see RL for LLMs just working for Deepseek, given the speed they were able to replicate o1, and given the ease that other orgs had using the same RL algorithm on different training data. And we see RL for LLMs just working for OpenAI, with the speed they were able to get Deep Research working, only SOME WEEKS after o3 was trained.

Something new has been discovered about reality, a statistical law of the universe. It's hard for us to grasp the power of billions of weights melding toward some reward signal. We're touching up against fundamental properties of information systems. If we ever meet superintelligent aliens out there, they'd probably tell us that they too discovered something akin to RL for LLMs long ago.

All the other AI news stories the past two weeks will one day be minor details in the story. But RL for LLMs just working will power all the AI news going forward. It is this discovery that will usher in the next era of human history.

So what will this next era look like if it's powered by RL + LLMs? Will it be run by startups or big tech companies? Will it be open source or closed? Will it be deeply sought or deeply researched? Yes. All of the above.

I think the past two weeks suggest we're on track for a very diverse world, one where small players and big players, open and closed source, intelligent systems (Deepseek) and knowledge systems (Deep Research), each have big roles to play. That's because the amount of value that's coming is absolutely massive (trillions of dollars) and no single player or single system will take it all. When you transition an entire economy to a new foundation built on compute, there will be opportunity everywhere for everyone.

RL for LLMs just works, not just for OpenAI but for everyone. The Deepseek result complements what I've heard from people at the AI labs -- this new RL paradigm is no longer hard. It doesn't rely on some hard-to-replicate breakthrough like the transformer. It doesn't require some proprietary data mix like GPT-4, which took 2 years to replicate. It's an optimization function, one that requires a few thousand examples. The iteration cycles here are extremely fast. Deepseek replicated o1 in a couple months. OpenAI finetuned o3 for deep research in a couple weeks. All the big labs will have their o3-level models soon and their tool-using agents soon after. And the open source versions will follow.

Don't big labs have a massive compute advantage? Yes, because of the logarithmic test-time compute scaling law for RL + LLMs, you need exponentially more compute for linear gains in quality. The big companies will therefore own the frontier models. But Deepseek showed that startups and individuals will also have very good models of their own.

These can be trained on proprietary data mixes to make them better than the frontier models for many tasks. There will be a powerful open source ecosystem of RL data, resources, and tools. And when the cost of serving goes down to basically the cost of the underlying GPUs, you won't need to run their o5 on their compute when you can run your personalized r5 on your own compute.

Additionally, we've seen that startups and individuals benefit from the race to the bottom that the big players play with their APIs, even from their frontier models. If RL + LLMs levels the playing field even more among the big labs, this probably gets more true. There is a wide distribution of tasks at all positions on the latency/intelligence/skill specialization/privacy graph, and no player will satisfy them all. You don't need a Terence Tao o7 model to do your taxes.

Trillions of dollars of new value is going to be created. There will be, and already are, a new breed of AI-first companies whose advantages come from streamlined integrations, prolific partnerships, magical product sense, shipping speed, access to unique data, connections to the physical world, viral marketing, building where big companies won't or can't. The world will overflow with models of all different types and sizes. Compute will power all pockets of the economy.

This is the coolest time to be alive and to be building. Hectic and dangerous for our species for sure, but I'm optimistic. We'll get through it well if we act sensibly (a big assumption, yes). On the other side is abundance. (btw if you're worried about lack of meaning in a world of abundance, don't worry, there will be plenty of scarcity -- someone is gonna have more compute than you and you're gonna want it.)

I wrote a post a couple weeks ago that predicted that at minimum by end of 2025 we'll have PhD-level agents navigating the web doing complex tasks. Some called it hype. With Operator and Deep Research coming out some days later, we seem more than on track. These types of systems aren't accelerating people's work yet, but that's because they're bottlenecked on simple features that will come soon -- better integrations, longer context windows, connections to lots of data sources, and more training examples. We're only at the beginning.

The past two weeks in AI were wild, and they point to many more wild weeks to come.
0
0
0
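An aside on the scaling claim in the quoted thread above: the line "you need exponentially more compute for linear gains in quality" is just the inverse of a logarithmic quality-vs-compute relationship. A minimal numeric sketch of that arithmetic is below; the toy model quality ≈ a + b·log10(compute) and the constants in it are illustrative assumptions, not figures from the thread.

```python
# Toy scaling model (an assumption for illustration, not taken from the quoted thread):
#   quality = a + b * log10(compute)
a, b = 20.0, 10.0  # assumed baseline quality, and quality gained per 10x compute

def compute_needed(quality: float) -> float:
    """Invert the toy model: compute required to reach a target quality."""
    return 10 ** ((quality - a) / b)

# Equal linear steps in quality require exponentially growing compute.
for q in (30, 40, 50, 60):
    print(f"quality {q}: ~{compute_needed(q):,.0f} compute units")
# quality 30: ~10, quality 40: ~100, quality 50: ~1,000, quality 60: ~10,000
```

Under that assumed model, each fixed increment in quality multiplies the required compute by a constant factor, which is the sense in which linear gains cost exponentially more.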
@davidrbakker
dave bakker
6 days
... and at 84
@ringostarrmusic
#RingoStarr
6 days
Wow, another Grammy. Well, done everybody. I send you peace and love. That’s right the beat goes on. thanks, peace and love Ringo.😎✌️🌟❤️🎶🥦🍒☮️☮️
Tweet media one
0
0
1
@davidrbakker
dave bakker
7 days
One of the best travelogues ever - a must see:
@belafleckbanjo
Béla Fleck
7 days
20 years ago today, me (and engineer Dave Sinko, and my little brother/director Sascha Paladino) set off on a journey to Africa that would alter forever everything we thought we knew about music, life, and ourselves.

Throw Down Your Heart started as a documentary idea but quickly became a quest—one to understand the roots of the banjo and explore the deep, rich musical traditions that shaped it. I never imagined how deeply I personally would be touched by the generosity of the musicians I met, nor how profoundly their sounds would enhance the course of my own musical path.

The banjo's story is truly a reflection of our shared history. From the moment I stepped onto African soil, I felt the weight of that truth, and the powerful connection that music can forge across cultures and time.

I'm grateful to all the incredible musicians who welcomed me into their world, to the amazing team who helped turn this dream into reality and to every person who has supported this journey over the years. We are all connected in deeper ways than we sometimes realize, and music is one bridge that can bring us together.

Here's to 20 years of unforgettable moments and to the continued discovery of music's infinite power.
Tweet media one
Tweet media two
Tweet media three
Tweet media four
0
0
0
@davidrbakker
dave bakker
8 days
Great talk with Chrissy and Jay Kleberg about their PBS documentary 'Chasing the Tide' - stay tuned for the podcast drop this week! @jaykleberg
Tweet media one
0
0
2
@davidrbakker
dave bakker
8 days
Having spent a good deal of time in Japan, I can say Japanese people treat Western foreigners very well. Sometimes even the smallest gesture of respect means a great deal to them and is met with great appreciation. The dev team either never took the time to learn this, or worse, knew it and ignored it.
1
0
9
@davidrbakker
dave bakker
8 days
RT @VictorVescovo: I co-authored a paper published this week concerning the high extent of pollution in the deepest point of the Mediterran…
0
7
0
@davidrbakker
dave bakker
8 days
Thanks again @milesobrien for a clear "here's what we know, here's what we don't know" analysis of the situation. With the massive changes happening in media right now, hopefully we see a return to this type of reporting @CNN
@milesobrien
Miles O'Brien
9 days
What expert noticed in new videos of plane collision via @YouTube
0
0
0
@davidrbakker
dave bakker
9 days
Well, this is not good...
0
0
0
@davidrbakker
dave bakker
9 days
Go Tennessee!
@Missytesterman
Missy Testerman
9 days
TN NAEP scores are historic. For the 1st time, we are above the national average in all areas. TN students are in the top 25 for reading & math proficiency.
1st in the US in 8th gr math scores for Black students
13th in 4th gr math
21st in 8th gr reading
0
0
1
@davidrbakker
dave bakker
9 days
RT @bodnerK2: Congratulations Sally, well deserved! 🎉 @STEMcobb @cobbscience @GSTANews #GSTA2025
Tweet media one
0
3
0