Fangru Lin

@FangruLin99

Followers: 2,000
Following: 326
Media: 26
Statuses: 97

DPhil student in language modelling @UniofOxford, @turinginst; Clarendon Scholar; Prev. Research @MSFTResearch, Engineer @Microsoft; Computational Linguist

Oxford, England
Joined January 2024
Pinned Tweet
@FangruLin99
Fangru Lin
3 months
Just finished our poster session @icmlconf! Had a great time talking with everyone who stopped by!🤩🤩🤩
@FangruLin99
Fangru Lin
3 months
Thank you for sharing this video @florianhoenicke! It’s a great pleasure to know that people are interested in our work!
@florianhoenicke
Florian Hönicke
3 months
An easy trick to improve your LLM results without fine-tuning. Many people know "Few-Shot prompting" or "Chain of Thought prompting". A new (better) method was presented by @FangruLin99 at #ICML2024. It is called Plan Like a Graph (PLaG). The core idea is simple. #icml #icml24
@FangruLin99
Fangru Lin
6 months
I’m so glad that our paper has been accepted at #ICML2024! Again, many thanks to my fantastic co-authors, and see you in Vienna!🤩🤩🤩
@FangruLin99
Fangru Lin
9 months
Excited to share our paper with @iperboreo_ @vjhofmann @ellemichelley Anthony Cohn and Janet Pierrehumbert: ! We release *AsyncHow*, a benchmark for asynchronous planning. When prompted to *Plan Like a Graph*, GPT-4/3.5 get a consistent boost across all task complexities. 1/n
@FangruLin99
Fangru Lin
5 months
Finished my poster session @LrecColing! Many thanks to everyone who stopped by!
@FangruLin99
Fangru Lin
5 months
💥Our #ICML2024 camera-ready paper, Graph-enhanced Large Language Models in Asynchronous Planning, is available on arXiv: ! Our *off-the-shelf* method *Plan Like a Graph* gives GPT-3.5/4 a Pareto improvement on asynchronous planning tasks of all complexities!🧵
@FangruLin99
Fangru Lin
3 months
Going to the @icmlconf London meetup this Friday to present our paper . I also have an oral presentation session in London at 15:00-16:00. Please stop by our poster or oral session, and DM me for a random research chat in London or Vienna!😆😆
@FangruLin99
Fangru Lin
3 months
And the paper is here!
@FangruLin99
Fangru Lin
7 months
My master’s thesis, ‘Probing Large Language Models for Scalar Adjective Lexical Semantics and Scalar Diversity Pragmatics’, accepted at LREC-COLING 2024, is now online! Many thanks to my supervisors Janet Pierrehumbert and @Dr_Semantic for their generous help!
@FangruLin99
Fangru Lin
3 months
I will be presenting this poster tomorrow (Tuesday) at 1:30-3 pm, Hall C 4-9 #700, @icmlconf with @iperboreo_! Stop by and have a chat with us!🤩🤩🤩
@FangruLin99
Fangru Lin
3 months
Going to the @icmlconf London meetup this Friday to present our paper . I also have an oral presentation session in London at 15:00-16:00. Please stop by our poster or oral session, and DM me for a random research chat in London or Vienna!😆😆
@FangruLin99
Fangru Lin
9 months
Excited to share our paper with @iperboreo_ @vjhofmann @ellemichelley Anthony Cohn and Janet Pierrehumbert: ! We release *AsyncHow*, a benchmark for asynchronous planning. When prompted to *Plan Like a Graph*, GPT-4/3.5 get a consistent boost across all task complexities. 1/n
@FangruLin99
Fangru Lin
3 months
This is the paper for the poster: . For a quick tweet thread see here:
@FangruLin99
Fangru Lin
5 months
💥Our #ICML2024 camera-ready paper, Graph-enhanced Large Language Models in Asynchronous Planning, is available on arXiv: ! Our *off-the-shelf* method *Plan Like a Graph* gives GPT-3.5/4 a Pareto improvement on asynchronous planning tasks of all complexities!🧵
@FangruLin99
Fangru Lin
5 months
I’m at @LrecColing right now! Happy to chat at any point!🤩🤩
@FangruLin99
Fangru Lin
5 months
Presenting this poster @LrecColing at 15:50! Stop by if you are interested!🥳🥳
@FangruLin99
Fangru Lin
7 months
My master’s thesis, ‘Probing Large Language Models for Scalar Adjective Lexical Semantics and Scalar Diversity Pragmatics’, accepted at LREC-COLING 2024, is now online! Many thanks to my supervisors Janet Pierrehumbert and @Dr_Semantic for their generous help!
@FangruLin99
Fangru Lin
5 months
Will be at the @LrecColing main conference! DM me to chat about random NLP ideas!🥳🥳
@FangruLin99
Fangru Lin
3 months
I will be at ICML to present this poster! Happy to chat about anything related to LLMs/neuro-symbolic methods/agents in general! Our poster is at 1:30-3 pm Tuesday, Hall C 4-9 #700! Come check it out if you are interested in LLMs and planning!
@FangruLin99
Fangru Lin
6 months
I’m so glad that our paper has been accepted at #ICML2024! Again, many thanks to my fantastic co-authors, and see you in Vienna!🤩🤩🤩
@FangruLin99
Fangru Lin
5 months
What happens when you are trying to make yourself searchable on social media, but the conference site is not completely indoors and it is pouring in the transition areas 🥲 @LrecColing
@FangruLin99
Fangru Lin
3 months
Fantastic work! Many congrats Daniella!
@Daniella_yz
Daniella Ye
3 months
Beyond their use in assisting human evaluation (e.g. CriticGPT), can critiques directly enhance preference learning? During my @Cohere internship, we explored using synthetic critiques from large language models to improve reward models. 📑Preprint:
@FangruLin99
Fangru Lin
8 months
I will be at the @OxfordAI mini-conference as a panelist with @iperboreo_ @guohao_li @hunarbatra @AleksPPetrov @puyu1001 to discuss **Large Language Models and Artificial General Intelligence** on 25th Feb. Join us if you are at Oxford!
@FangruLin99
Fangru Lin
6 months
@icmlconf I’m so excited our paper has been accepted! See you in Vienna!
@FangruLin99
Fangru Lin
8 months
I finally had my master’s graduation ceremony at @UniofOxford! I’m honored to have been awarded a distinction in the MPhil in Linguistics, and I want to thank everyone who offered me invaluable support. It’s a great pleasure to stay at Oxford for a DPhil and work with amazing researchers!
@FangruLin99
Fangru Lin
2 months
Thank you for having me! Looking forward to sharing our paper this weekend!
@talks_cv
Computer Vision Talks
2 months
Join our global paper reading group on August 24 at 10 AM EST as we dive into "Plan Like a Graph (PLaG): Enhancing LLMs in Asynchronous Plan Reasoning" with @FangruLin99 from @UniofOxford. Don’t miss out! 🌐👩‍💻 Link: #AI #LLMs #MachineLearning
@FangruLin99
Fangru Lin
3 months
At the meetup now! Please find our pink poster, and come to our oral presentation for a chat!😆😆😆
@FangruLin99
Fangru Lin
3 months
Going to the @icmlconf London meetup this Friday to present our paper . I also have an oral presentation session in London at 15:00-16:00. Please stop by our poster or oral session, and DM me for a random research chat in London or Vienna!😆😆
@FangruLin99
Fangru Lin
5 months
We propose a method *Plan Like a Graph* (PLaG), which we find can be applied to a wide variety of open- and closed-source models to boost their performance. It’s so easy that you can apply it off the shelf.🧵
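A minimal sketch of what such a graph-augmented prompt could look like. The step names, durations, and constraints below are hypothetical, and this is only an illustration of the general idea, not the released PLaG implementation:

```python
# Rough sketch of graph-augmented prompting: describe the plan's steps, durations,
# and ordering constraints as an explicit graph in the prompt, so the model can
# reason over structure rather than prose alone. (Hypothetical example.)

steps = {                     # hypothetical breakfast task: step -> duration in minutes
    "boil water": 5,
    "brew coffee": 4,
    "toast bread": 3,
    "eat": 10,
}
constraints = [               # (A, B) means A must finish before B starts
    ("boil water", "brew coffee"),
    ("brew coffee", "eat"),
    ("toast bread", "eat"),
]

duration_text = "\n".join(f"{s}: {d} min" for s, d in steps.items())
graph_text = "\n".join(f"{a} -> {b}" for a, b in constraints)

prompt = (
    "Steps not connected by an edge may run in parallel.\n"
    f"Step durations:\n{duration_text}\n"
    f"Dependency graph (A -> B means A must finish before B starts):\n{graph_text}\n"
    "Question: what is the shortest total time needed to complete all steps?"
)
print(prompt)
```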
@FangruLin99
Fangru Lin
5 months
Why is asynchronous planning so difficult? We consider three key skills required for this task: time summation, time comparison, and constraint analysis. We find that constraint analysis is the key difficulty for our task.🧵
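To make those three skills concrete, here is a small self-contained sketch of what a correct solver has to compute for a toy dependency graph (the same hypothetical breakfast example as in the sketch above): the earliest overall finish time, obtained by summing durations along chains, comparing parallel branches, and respecting the precedence constraints.

```python
# Earliest finish time of a small dependency DAG (hypothetical example):
# chains need time summation, parallel branches need time comparison,
# and the edges encode the constraints the model must respect.

from functools import lru_cache

durations = {"boil water": 5, "brew coffee": 4, "toast bread": 3, "eat": 10}
predecessors = {
    "boil water": [],
    "brew coffee": ["boil water"],
    "toast bread": [],
    "eat": ["brew coffee", "toast bread"],
}

@lru_cache(maxsize=None)
def finish_time(step: str) -> int:
    # a step starts once all its predecessors have finished (constraint analysis),
    # i.e. at the max of their finish times (time comparison), then runs for its duration
    start = max((finish_time(p) for p in predecessors[step]), default=0)
    return start + durations[step]

optimal = max(finish_time(s) for s in durations)  # boil (5) + brew (4) + eat (10) = 19
print(optimal)  # 19 minutes
```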
@FangruLin99
Fangru Lin
5 months
More interesting details are in the paper! This is a fantastic collaboration with @iperboreo_ @vjhofmann @ellemichelley Anthony Cohn and Janet Pierrehumbert. Looking forward to presenting in Vienna!🥳🥳🥳
@FangruLin99
Fangru Lin
5 months
We compare our task with prototypical graph search across more diverse complexities. Performance shows a similar downward trend in both domains, which means the naturalistic task is likely to suffer from continued performance degradation on more complex instances, just like the prototypical one.🧵
@FangruLin99
Fangru Lin
6 months
Applied for a neurosymbolic language modelling social! Looking forward to meeting people there!🥰
@JungWooHa2
Jung-Woo Ha
6 months
Congrats to all the authors accepted at #ICML24 (@icmlconf)!! Now it's time to apply for ICML Socials! You can find more details: Socials application form: Deadline: 26 May 2024 (AoE). Please share this~
@FangruLin99
Fangru Lin
5 months
@ruochenz_ @LrecColing Terrific! See you at the conference site!🤩
@FangruLin99
Fangru Lin
5 months
One last interesting bit, especially for linguists: we were inspired by Discourse Representation Theory when designing PLaG. It’s so interesting that LLMs can be improved so much simply by representing natural language prompts in a more structured way!🧵
@FangruLin99
Fangru Lin
3 months
@Yoann_Buzenet @florianhoenicke Details, including graph examples, can be found in the paper! Our latency is comparable to that of popular baseline methods, while offering a significant performance boost (please see the appendix for details)!😊
@FangruLin99
Fangru Lin
4 months
@zouharvi @overleaf Wow you are a machine!
@FangruLin99
Fangru Lin
5 months
We find LLMs are not robust to trivially different prompts: e.g. varying graph types or changing expressions such as ‘Step 1 must precede step 2’ to ‘Step 2 must follow step 1’ results in different performance.🧵
@FangruLin99
Fangru Lin
3 months
@YouJiacheng @icmlconf It’s the default pink in Keynote! I’m not sure what the RGB is specifically…
@FangruLin99
Fangru Lin
6 months
@DengHokin We generally find that PLaG is helpful for all the models we assessed, and we think it’s super cool! Hope that answers your question!
@FangruLin99
Fangru Lin
3 months
@Lunens__ @florianhoenicke The poster explains the general idea at a very high level, but you can find details, including code, in the paper here: .
@FangruLin99
Fangru Lin
2 months
@paul_rottger Oh wow this is the coolest social I’ve ever heard of!🤩🤩🤩
@FangruLin99
Fangru Lin
5 months
We find that although our method can improve model performance, models still suffer from drastic performance degradation with increasing task complexity.🧵
@FangruLin99
Fangru Lin
8 months
I’m excited to announce that my master’s thesis has been accepted to @LrecColing: Probing Large Language Models for Scalar Adjective Lexical Semantics and Scalar Diversity Pragmatics. See a preview here; the revised paper will follow:
@FangruLin99
Fangru Lin
3 months
@DuyHMNguyen1 @icmlconf Thank you so much! I’m really proud of this work!😆
@FangruLin99
Fangru Lin
2 months
@shangbinfeng Oh that’s very interesting! We find graph prompting to be quite helpful in naturalistic complex planning tasks: , but I’m surprised that simple graph tuning does not improve LLM performance on related complex tasks.
@FangruLin99
Fangru Lin
3 months
@MichelIvan92347 @florianhoenicke Thank you for liking our work!
@FangruLin99
Fangru Lin
5 months
We automatically generated and released *AsyncHow*, a benchmark for naturalistic asynchronous plan reasoning. It has wide coverage of topics and diverse task complexities! Our generated data is of near-human quality!🧵
@FangruLin99
Fangru Lin
3 months
@vjhofmann @UniofOxford Many congrats, Valentin! Wishing you a great new journey!
@FangruLin99
Fangru Lin
9 months
We (i) automatically generate and open-source a high-quality dataset of 1.6k datapoints for asynchronous planning, which requires both efficient sequential and parallel scheduling. 2/n
@FangruLin99
Fangru Lin
3 months
@rohanpaul_ai @florianhoenicke Thanks! I hope you will like this work!
@FangruLin99
Fangru Lin
9 months
4. Despite the performance boost, we still find that LLMs tend to suffer from severe degradation with increasing task complexity, which highlights the limitations of using LLMs to simulate digital devices.
@FangruLin99
Fangru Lin
2 months
@vjhofmann Wow that’s amazing! Huge congrats Valentin!
@FangruLin99
Fangru Lin
9 months
Lastly, many, many thanks to my terrific collaborators, who have offered tremendous help and brilliant ideas for this project!
@FangruLin99
Fangru Lin
3 months
@zouharvi @icmlconf Thanks Vilém! Definitely inspired by your great purple poster!🥳
@FangruLin99
Fangru Lin
2 months
@compthink @florianhoenicke It’s similar to CoT in that both can be applied off the shelf, but it’s more of an abstraction technique: it casts a natural language problem into a structured representation.
@FangruLin99
Fangru Lin
9 months
2. We find that LLMs perform extremely poorly when they are not supplied with detailed task illustrations for efficient asynchronous planning. 3/n
@FangruLin99
Fangru Lin
6 months
@hunarbatra Thanks Hunar!
@FangruLin99
Fangru Lin
5 months
@ruochenz_ @LrecColing Going to Torino too! Let’s get a coffee there!🙌
@FangruLin99
Fangru Lin
6 months
@nicolayr_ Thanks so much!!!
@FangruLin99
Fangru Lin
3 months
@bonniesjli @GoogleDeepMind Super interested in chatting! 🙋‍♀️
@FangruLin99
Fangru Lin
5 months
@LAWeissweiler Sounds terribly interesting! I’ll definitely try to catch a lecture!🤩🤩
@FangruLin99
Fangru Lin
2 months
@paul_rottger @vjhofmann That’s amazing! Huge congrats Paul!
@FangruLin99
Fangru Lin
8 months
@hunarbatra @UniofOxford Thanks so much Hunar!🤩
@FangruLin99
Fangru Lin
3 months
@alfcnz @icmlconf Thank you! I’m glad I used a pink theme and spent a whole morning generating a cartoon that looks like me😆😆
@FangruLin99
Fangru Lin
6 months
@VBambini @DaniilAltshuler Thanks! This is very interesting!
@FangruLin99
Fangru Lin
3 months
@philipcortes @rohanpaul_ai @florianhoenicke Thanks for the comment! You can hand-craft several in-context-learning examples (we used 3) as PLaG with BaG, and in our experience it works like a charm!
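A rough sketch of how such a few-shot prompt could be assembled, taking BaG to mean asking the model to build the graph itself before solving; the exemplar text and helper function below are illustrative assumptions, not the paper's released prompts:

```python
# Hedged sketch of the few-shot setup mentioned above: a handful of hand-written
# exemplars in which the dependency graph is built first, then the plan is solved.
# All exemplar wording here is made up for illustration.

exemplars = [
    {
        "task": "Make tea: boil water (5 min), steep tea (3 min); boiling must precede steeping.",
        "answer": "Graph: boil water -> steep tea. Longest path: 5 + 3 = 8, so 8 minutes.",
    },
    # ...the reply above mentions hand-crafting three such examples in practice
]

def build_fewshot_prompt(new_task: str) -> str:
    # concatenate the exemplars, then append the new task with a graph-first instruction
    shots = "\n\n".join(f"Task: {ex['task']}\nAnswer: {ex['answer']}" for ex in exemplars)
    return (
        f"{shots}\n\nTask: {new_task}\n"
        "Answer (build the dependency graph first, then compute the shortest total time):"
    )

print(build_fewshot_prompt(
    "Make breakfast: boil water (5 min), brew coffee (4 min), toast bread (3 min), eat (10 min); "
    "boiling must precede brewing; brewing and toasting must precede eating."
))
```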
@FangruLin99
Fangru Lin
3 months
@parthasarathypd Thank you! I spent a whole morning generating the image to make sure it looks like me😆
@FangruLin99
Fangru Lin
5 months
@zouharvi That’s a terrific poster! Isabelle and I really loved it! Great job Vilém!
@FangruLin99
Fangru Lin
9 months
If you are a linguist, you might have heard of Discourse Representation Theory. The PLaG idea was initially inspired by DRT, and we are just very excited to see that it works so well and so neatly!
@FangruLin99
Fangru Lin
3 months
@mr_shitij @icmlconf Thank you! Here it is: