Milton (@miltonllera)

Followers: 149 · Following: 6K · Statuses: 189

Code monkey @RoboEvoArtLab

Copenhagen, Denmark
Joined January 2019
Milton @miltonllera · 2 days
RT @risi1979: We're excited to announce the first Evolving Self-organisation workshop at @GeccoConf! Submission deadline: March 26, 2025…
Milton @miltonllera · 20 days
RT @Clement_MF_: Can we study gene regulatory networks from a behavioralist perspective, as "agents" navigating in transcriptional space?…
Milton @miltonllera · 21 days
RT @risi1979: Ever wish you could coordinate thousands of units in games such as StarCraft through natural language alone? 🗣️ We are exci…
Milton @miltonllera · 1 month
@OxfordDPAG @ruimcosta @NatureComms That's the wrong Rui Costa @somnirons ...
Milton @miltonllera · 1 month
RT @RichardMCNgo: Felix Hill was a kind and brilliant man (and my former teacher and colleague) who committed suicide after struggling with…
Milton @miltonllera · 1 month
RT @douwekiela: I’m really sad that my dear friend @FelixHill84 is no longer with us. He had many friends and colleagues all over the world…
Milton @miltonllera · 1 month
@tunguz Increasingly, people in academia don't know either...
Milton @miltonllera · 3 months
RT @jeffrey_bowers: Multiple studies show that ANNs can learn languages relying only on general learning algorithms when trained on a diet…
Milton @miltonllera · 3 months
@Plinz Damn, that's 0 for 3 on takes in one post. Poverty of stimulus isn't disproven by point 1. AI can do 3 (to a high degree), but can't answer (yet) how language appeared in the first place; the mind that learns language could/can also invent it. Nobody ever questioned 2.
Milton @miltonllera · 3 months
@m_tangemann @bethgelab Great, thanks! Unfortunately I had to submit a camera-ready version for a workshop already, but I'll make sure to use this citation instead of the arXiv one for the full article.
Milton @miltonllera · 4 months
@bethgelab Is there a citation for this that is not the preprint?
Milton @miltonllera · 4 months
@FelixHill84 I think one obvious reason why Transformers are better is that, while short dependencies are more common, long ones can be important. Those "edge" cases may be the difference between a good and a bad model. Transformers have a mechanism to overcome this; RNNs don't.
Milton @miltonllera · 4 months
RT @jeffrey_bowers: Reviewers & editors more likely to accept articles that use methods that produce (false) estimates of DNN-brain similar…
Milton @miltonllera · 4 months
@mpshanahan No? Seems quite surprising to me. Was there no better contribution to physics than an application of ideas from physics to a different field?
Milton @miltonllera · 4 months
@kenneth0stanley Data is inadequate. We don't just passively observe data; we interact with a world.
Milton @miltonllera · 6 months
RT @chrx_h: ... 2 days later, without much having happened, new type of organisms suddenly appear everywhere. Most of them are kind of asle…
Milton @miltonllera · 6 months
RT @fchollet: Easy -- being immersed in an ever-changing, adversarial ecosystem requires you to be able to adapt to novelty on the fly. Th…
Milton @miltonllera · 6 months
RT @risi1979: And it’s a wrap! Thank you everybody for a great #ALIFE2024 conference.
[image]
Milton @miltonllera · 7 months
RT @ALifeConf: How to make neural networks more flexible and adaptive by @JoachimWinther #ALife2024
[image]