George Lan Profile
George Lan

@GeorgeLan7

Followers
551
Following
131
Statuses
61

An optimizer, A. Russell Chandler III Chair and Professor at Georgia Tech

Atlanta, GA
Joined September 2017
@GeorgeLan7
George Lan
12 days
International Conference on Stochastic Programming (ICSP) in Paris at the end of July. In addition to great plenary, semi-plenary, and regular talks, students might be interested in the tutorials. Honored to serve on the program committee.
@GeorgeLan7
George Lan
1 month
AI dominates: All formalizable, computable, and reproducible human wisdom will merge with AI. Researchers are: a) shifting to top AI subfields, b) applying AI in their work, c) building tools to enhance AI. Check out what took my time in 2024. Happy New Year!
@GeorgeLan7
George Lan
2 months
Parameter-free nonconvex and stochastic optimization: outcome of a pleasant collaboration with Tianjiao and Yangyang during Yangyang’s sabbatical leave at Georgia Tech!
@mathOCb
arXiv math.OC Optimization and Control
2 months
Guanghui Lan, Tianjiao Li, Yangyang Xu: Projected gradient methods for nonconvex and stochastic optimization: new complexities and auto-conditioned stepsizes
@GeorgeLan7
George Lan
4 months
@ShiqianMa @krizna_b Congratulations, Shiqian!
@GeorgeLan7
George Lan
9 months
@ShiqianMa Congratulations!
@GeorgeLan7
George Lan
1 year
RT @uwchancellor: Congratulations to two Badger professors, Pascale Carayon of @UWMadEngr and Stephen J. Wright of @uwcdis, who were recent…
@GeorgeLan7
George Lan
1 year
Francesco pointed out that Nesterov’s FGM requires the target accuracy epsilon as input, which makes the algorithm not truly parameter-free, at least for nonsmooth optimization. His very clever construction convinced me.
@GeorgeLan7
George Lan
1 year
@bremen79 Found a nice picture for them.
@GeorgeLan7
George Lan
1 year
RT @bremen79: Congratulations to Arkadi Nemirovski and Yurii Nesterov that have been awarded the World Laureates Association Prize, a new i…
@GeorgeLan7
George Lan
1 year
@NicLoizou Thanks a lot, Nicolas!
@GeorgeLan7
George Lan
1 year
@damekdavis Thanks a lot, Damek!
@GeorgeLan7
George Lan
1 year
@peter_richtarik Thanks a lot, Peter!
@GeorgeLan7
George Lan
1 year
@jasondeanlee Hopefully their students will read it😀
@GeorgeLan7
George Lan
1 year
@damekdavis @jasondeanlee Note that there are parameters other than smoothness or boundedness. Regularity conditions (strong convexity and lower curvature) are notoriously harder to handle, since they are defined over a global scope.
@GeorgeLan7
George Lan
1 year
@damekdavis @jasondeanlee @aaron_defazio I had never checked this. In 2009, at ISMP, I presented my AC-SA paper in d’Aspremont’s session and called AGD a universally optimal method, because it is optimal for smooth, nonsmooth, and stochastic problems. The name felt too grand, so I did not use it later😀
@GeorgeLan7
George Lan
1 year
@bremen79 @aaron_defazio @damekdavis I understand many people read this thread. I checked that there was nothing against anyone in my message to Francesco, so I posted it here. Some interesting stories can be found in my posts today.
@GeorgeLan7
George Lan
1 year
Something came to mind when replying to Francesco. In 12/2010 I developed a uniformly optimal bundle-level method, and in 04/2011 I extended it to weakly smooth and composite problems. Surprisingly, the first paper was rejected while the other received a major revision from Mathematical Programming.