![George Lan Profile](https://pbs.twimg.com/profile_images/1715562434012835840/dRTy57MY_x96.jpg)
George Lan
@GeorgeLan7
Followers: 551 · Following: 131 · Statuses: 61
An optimizer, A. Russell Chandler III Chair and Professor at Georgia Tech
Atlanta, GA
Joined September 2017
Parameter-free nonconvex and stochastic optimization: outcome of a pleasant collaboration with Tianjiao and Yangyang during Yangyang’s sabbatical leave at Georgia Tech!
Guanghui Lan, Tianjiao Li, Yangyang Xu: Projected gradient methods for nonconvex and stochastic optimization: new complexities and auto-conditioned stepsizes
0 replies · 3 retweets · 35 likes
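For readers unfamiliar with the family of methods the announced paper belongs to, here is a minimal, generic projected-gradient sketch on a toy box-constrained least-squares problem. This is only an illustration of the basic projected gradient step with a fixed 1/L stepsize; it is not the paper's auto-conditioned stepsize scheme, and all names below (`project_box`, `projected_gradient`) are made up for this sketch.

```python
# Generic projected gradient descent sketch (assumption: NOT the
# Lan-Li-Xu auto-conditioned method, just the textbook baseline).
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def projected_gradient(grad, x0, stepsize, lo, hi, iters=200):
    """Iterate x <- Proj(x - stepsize * grad(x)) a fixed number of times."""
    x = x0.copy()
    for _ in range(iters):
        x = project_box(x - stepsize * grad(x), lo, hi)
    return x

# Toy problem: min ||Ax - b||^2 subject to x in [0, 1]^5.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda x: 2 * A.T @ (A @ x - b)
L = 2 * np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x0 = np.full(5, 0.5)
x_star = projected_gradient(grad, x0, 1.0 / L, 0.0, 1.0)
```

With the 1/L stepsize the objective is non-increasing along the iterates; the point of "parameter-free" methods like the one announced above is precisely to avoid needing L in advance.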
RT @uwchancellor: Congratulations to two Badger professors, Pascale Carayon of @UWMadEngr and Stephen J. Wright of @uwcdis, who were recent…
0 replies · 5 retweets · 0 likes
RT @bremen79: Congratulations to Arkadi Nemirovski and Yurii Nesterov that have been awarded the World Laureates Association Prize, a new i…
0 replies · 15 retweets · 0 likes
@damekdavis @jasondeanlee Note that there are parameters other than smoothness or boundedness. Regularity conditions such as strong convexity and lower curvature are notoriously harder, since they are defined over a global scope.
1 reply · 0 retweets · 4 likes
@damekdavis @jasondeanlee @aaron_defazio I had never checked this. At ISMP 2009, I presented my AC-SA paper in d'Aspremont's session and called AGD a universally optimal method, because it is optimal for smooth, nonsmooth, and stochastic problems. I felt the name was too grand and did not use it later😀
0 replies · 0 retweets · 4 likes
@bremen79 @aaron_defazio @damekdavis I understand many people read this thread. I checked that there was nothing against anyone in my message sent to Francesco, so I posted it here. Some interesting stories can be found in my posts today.
Something came to my mind when replying to Francesco. In 12/2010 I developed a uniformly optimal bundle-level method, and in 04/2011 I extended it to weakly smooth and composite problems. Surprisingly, the first one was rejected while the other received a major revision from Mathematical Programming.
0 replies · 0 retweets · 7 likes