Jeremy Berman Profile
Jeremy Berman

@jerber888

Followers: 2K · Following: 1K · Statuses: 80

research @ndeainc. co-founded https://t.co/aY50hNeJUD. yc w19.

NYC
Joined August 2017
@jerber888
Jeremy Berman
2 months
I just got first place on the public ARC-AGI benchmark using Claude Sonnet 3.5 and Evolutionary Test-time Compute
@arcprize
ARC Prize
2 months
2024 ARC-AGI-Pub SoTA! 👾
53.6% @jerber888
47.5% MARA(BARC) + MIT (@ellisk_kellis, @akyurekekin)
43.0% @RyanPGreenblatt
40
87
1K
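For context, one way to read "Evolutionary Test-time Compute" is as an evolutionary search over candidate solution programs at inference time: have the model propose Python transform functions for a task, score them against the task's training pairs, and ask it to revise the best candidates over several rounds. Below is a minimal sketch of that idea, assuming the Anthropic Python SDK; the model id, prompt wording, population sizes, and helper names are illustrative assumptions, not the actual implementation behind the result.

import json
import random
from anthropic import Anthropic

client = Anthropic()  # assumes ANTHROPIC_API_KEY is set in the environment
MODEL = "claude-3-5-sonnet-20241022"  # illustrative model id

def propose_solver(task, parent_code=None, feedback=None):
    """Ask the model for a Python transform(grid) -> grid function for one ARC task."""
    prompt = "Write a Python function `transform(grid)` that maps each input grid to its output grid.\n"
    prompt += "Training pairs:\n" + json.dumps(task["train"])
    if parent_code:
        prompt += "\n\nHere is a previous attempt and how it failed; improve it:\n"
        prompt += parent_code + "\n" + feedback
    msg = client.messages.create(model=MODEL, max_tokens=2000,
                                 messages=[{"role": "user", "content": prompt}])
    return msg.content[0].text  # assumes the reply is plain code with no fences

def fitness(code, task):
    """Fraction of training pairs the candidate reproduces exactly."""
    env = {}
    try:
        exec(code, env)
        hits = sum(env["transform"](p["input"]) == p["output"] for p in task["train"])
        return hits / len(task["train"])
    except Exception:
        return 0.0

def evolve(task, generations=3, population=8, survivors=3):
    """Generate candidates, keep the fittest, and ask the model to mutate them."""
    pool = [propose_solver(task) for _ in range(population)]
    for _ in range(generations):
        scored = sorted(pool, key=lambda c: fitness(c, task), reverse=True)[:survivors]
        if fitness(scored[0], task) == 1.0:
            break
        pool = scored + [propose_solver(task, parent_code=random.choice(scored),
                                        feedback="It does not match all training pairs.")
                         for _ in range(population - survivors)]
    return max(pool, key=lambda c: fitness(c, task))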
@jerber888
Jeremy Berman
2 days
Internet connected code execution is the only tool an LLM needs.
0
0
9
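A claim like this is easiest to picture as a single-tool agent loop: the model writes code, the code runs, and the output goes back into context until the model answers. Below is a minimal sketch, assuming the Anthropic tool-use API; the tool name, model id, and subprocess-based executor are illustrative assumptions, and a real deployment would sandbox the execution and give it network access.

import subprocess
import sys
from anthropic import Anthropic

client = Anthropic()  # assumes ANTHROPIC_API_KEY is set in the environment
MODEL = "claude-3-5-sonnet-20241022"  # illustrative model id

# The single tool: run Python in a subprocess and return its output.
TOOLS = [{
    "name": "run_python",
    "description": "Execute Python code and return stdout and stderr.",
    "input_schema": {"type": "object",
                     "properties": {"code": {"type": "string"}},
                     "required": ["code"]},
}]

def run_python(code: str) -> str:
    proc = subprocess.run([sys.executable, "-c", code],
                          capture_output=True, text=True, timeout=60)
    return proc.stdout + proc.stderr

def agent(question: str) -> str:
    """Loop: let the model call the code tool until it stops asking for it."""
    messages = [{"role": "user", "content": question}]
    while True:
        resp = client.messages.create(model=MODEL, max_tokens=2000,
                                      tools=TOOLS, messages=messages)
        messages.append({"role": "assistant", "content": resp.content})
        if resp.stop_reason != "tool_use":
            return "".join(b.text for b in resp.content if b.type == "text")
        results = [{"type": "tool_result", "tool_use_id": b.id,
                    "content": run_python(b.input["code"])}
                   for b in resp.content if b.type == "tool_use"]
        messages.append({"role": "user", "content": results})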
@jerber888
Jeremy Berman
2 days
Returning the literal reasoning tokens at inference is important because it’s useful to manipulate the reasoning. If the reasoning tokens are part of the context, we can edit / guide the reasoning in real-time to do interesting things. Having root access to the reasoning is valuable. This puts o3 at a disadvantage relative to DeepSeek R1.
2
0
10
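The "edit the reasoning in real-time" part is concrete with a model that returns its chain of thought, such as DeepSeek R1 through its OpenAI-compatible API, which exposes the reasoning in a reasoning_content field. Below is a minimal sketch assuming that field, and assuming that splicing an edited version of the reasoning back into the prompt steers the continuation; the question, the edit, and the re-prompting format are illustrative, not a method described in the tweet.

from openai import OpenAI

# DeepSeek serves R1 behind an OpenAI-compatible endpoint and returns the
# chain of thought in `reasoning_content` alongside the final answer.
client = OpenAI(base_url="https://api.deepseek.com", api_key="YOUR_KEY")

question = "How many prime numbers are there between 100 and 150?"
first = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user", "content": question}],
)
reasoning = first.choices[0].message.reasoning_content  # the literal reasoning tokens

# Because the reasoning is just text in our hands, we can edit it before re-prompting:
# here we truncate it and append a guiding instruction (an illustrative edit).
edited = reasoning[: len(reasoning) // 2] + "\nWait, I should double-check each candidate by trial division."

second = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{
        "role": "user",
        "content": question + "\n\nPartial reasoning so far, continue from it:\n" + edited,
    }],
)
print(second.choices[0].message.content)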
@jerber888
Jeremy Berman
2 days
@ElliotEvNo I’m sure we do have some preset parameters. We don’t start our RL loop from random weights
1
0
0
@jerber888
Jeremy Berman
2 days
RT @mmay3r: Genius is unlikely tokens that are right.
0
7
0
@jerber888
Jeremy Berman
13 days
RT @mikeknoop: just published my full @arcprize analysis of deepseek's r1-zero and r1. link below. key points: r1-zero is more important t…
0
136
0
@jerber888
Jeremy Berman
14 days
AGI will at least be able to drive a car, no?
3
0
9
@jerber888
Jeremy Berman
22 days
AIs have bad taste now because they are confined to the tastes within their training distribution, and the average taste on the internet is bad. This is temporary. We will figure out AGI: how to generalize materially outside of the training distribution — how to create new knowledge and new taste on the fly through exploration. Then, AIs will have extraordinary taste.
0
0
6
@jerber888
Jeremy Berman
25 days
“No man ever steps in the same river twice, for it's not the same river and he's not the same man.”
0
0
0
@jerber888
Jeremy Berman
27 days
read about our goals here:
0
0
10
@jerber888
Jeremy Berman
27 days
RT @ClementBonnet16: Excited for the launch of @ndeainc 🔥 We’re making the bet that merging structured reasoning with deep learning will c…
0
10
0
@jerber888
Jeremy Berman
27 days
RT @mikeknoop: AGI is the most important technology in the history of the world. That's why I'm going all in on a new adventure with @fchol…
0
96
0
@jerber888
Jeremy Berman
1 month
RT @michaelxbloch: I’m trying something new: a weekly roundup of ideas and articles that made me stop and think. Here’s what stood out to m…
0
1
0
@jerber888
Jeremy Berman
1 month
Facts and deduction are all we need to create knowledge. Intelligence is intuiting which facts to deduce from.
2
0
9
@jerber888
Jeremy Berman
2 months
@techczech The best thing to do is to have an LLM semantically chunk the training data. But that’s super expensive
0
0
0
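"Semantically chunk" here means letting an LLM choose the chunk boundaries instead of splitting every N tokens, which is also why it is expensive: at least one model call per document. Below is a minimal sketch of that idea, assuming the Anthropic Python SDK; the model id, prompt wording, and paragraph-level granularity are illustrative assumptions.

import json
from anthropic import Anthropic

client = Anthropic()  # assumes ANTHROPIC_API_KEY is set in the environment
MODEL = "claude-3-5-sonnet-20241022"  # illustrative model id

def semantic_chunks(document: str) -> list[str]:
    """Split a document at topic boundaries chosen by the LLM, not at a fixed token count."""
    paragraphs = [p for p in document.split("\n\n") if p.strip()]
    numbered = "\n".join(f"[{i}] {p}" for i, p in enumerate(paragraphs))
    prompt = (
        "Group these numbered paragraphs into coherent chunks for retrieval. "
        "Reply with a JSON list of lists of paragraph indices, nothing else.\n\n" + numbered
    )
    msg = client.messages.create(model=MODEL, max_tokens=1000,
                                 messages=[{"role": "user", "content": prompt}])
    groups = json.loads(msg.content[0].text)  # assumes the model returns clean JSON
    return ["\n\n".join(paragraphs[i] for i in group) for group in groups]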
@jerber888
Jeremy Berman
2 months
@FernandoTheRojo Looks 🔥
0
0
1
@jerber888
Jeremy Berman
2 months
@goldstein_aa Yeah similar idea!
0
0
0