Noam Aigerman Profile
Noam Aigerman

@AigermanNoam

Followers
502
Following
260
Statuses
27

Assistant Professor @ University of Montreal

Montreal
Joined March 2021
@AigermanNoam
Noam Aigerman
2 months
RT @RanaHanocka: 3DL is recruiting PhD students to start in Fall 2025! A thread👇highlighting some of our recent works, and what we are exc…
0
13
0
@AigermanNoam
Noam Aigerman
4 months
I'll be recruiting PhD and MSc students through Mila - consider applying if you want to work at the intersection of machine learning and 3D geometry!
@Mila_Quebec
Mila - Institut québécois d'IA
4 months
Mila's annual supervision request process opens on October 15 to receive MSc and PhD applications for Fall 2025 admission! Join our community! More information here
2
40
104
@AigermanNoam
Noam Aigerman
7 months
@sellan_s I think the implicit function is indeed a function: the (local) function that is represented by the zero level set (e.g., in the case of a surface, the *function* from a 2D patch to 3D). That is the "implicit function" that the "implicit function theorem" refers to, methinks.
0
0
0
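A short worked instance of the point above (my addition; the unit sphere is the standard textbook example, not one from the thread):

```latex
% Unit sphere: F(x,y,z) = x^2 + y^2 + z^2 - 1 = 0.
% Wherever \partial F / \partial z = 2z \neq 0, the implicit function
% theorem yields a local function g from a 2D patch to the surface --
% the "implicit function" parameterizing the zero level set:
\[
F(x,y,z) = x^2 + y^2 + z^2 - 1, \qquad
\frac{\partial F}{\partial z} = 2z \neq 0
\;\Rightarrow\;
z = g(x,y) = \sqrt{1 - x^2 - y^2}
\quad \text{(upper hemisphere)}.
\]
```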
@AigermanNoam
Noam Aigerman
7 months
RT @sellan_s: This is as good time to announce that I am *HIRING* students and postdocs to join my new lab at Columbia in 2025 (see link be…
0
112
0
@AigermanNoam
Noam Aigerman
1 year
@amirvaxman_dgp For one, I can say that I've never been able to use the LBFGS solver in Pytorch - it always stagnates on problems that I know would have worked with Matlab's LBFGS solver
1
0
3
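For context on the complaint above, a minimal sketch of `torch.optim.LBFGS` usage (the problem and all names are mine): unlike most PyTorch optimizers, it requires a closure that re-evaluates the loss, and `line_search_fn="strong_wolfe"` is the setting that most often helps when the default stagnates.

```python
import torch

# Toy least-squares fit: y = 3x + 0.5.
x = torch.linspace(-1.0, 1.0, 50).unsqueeze(1)
y = 3.0 * x + 0.5
w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

# LBFGS runs its iterations inside a single .step(closure) call.
opt = torch.optim.LBFGS([w, b], max_iter=100, line_search_fn="strong_wolfe")

def closure():
    opt.zero_grad()
    loss = ((w * x + b - y) ** 2).mean()
    loss.backward()
    return loss

opt.step(closure)  # on this quadratic problem, converges to w≈3, b≈0.5
```

On a convex quadratic like this it behaves; the stagnation in the tweet shows up on harder nonconvex geometry-processing energies.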
@AigermanNoam
Noam Aigerman
1 year
@sellan_s Let him upload an md5 checksum, we'll see what he says about lack of action after that.
1
0
7
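The joke above is a commit-and-reveal scheme: publish a hash of your prediction now, reveal the text later, and anyone can verify you didn't change it. A minimal sketch (function names and strings are mine; in practice SHA-256 would be preferred over md5, which is broken against deliberate collisions):

```python
import hashlib

def commit(prediction: str) -> str:
    """Publish this digest now; it binds you without revealing anything."""
    return hashlib.md5(prediction.encode()).hexdigest()

def verify(prediction: str, digest: str) -> bool:
    """Later, reveal the prediction; anyone can recompute and check."""
    return commit(prediction) == digest

d = commit("the deadline slips by two weeks")
verify("the deadline slips by two weeks", d)  # True
verify("everything ships on time", d)         # False
```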
@AigermanNoam
Noam Aigerman
1 year
@PrathebaSelva @Mila_Quebec Currently no, just graduate students.
1
0
0
@AigermanNoam
Noam Aigerman
2 years
RT @allthatissolids: Excited to be presenting DA-Wand: Distortion Aware Selection Using Neural Mesh Parameterization at #CVPR2023! Come fi…
0
5
0
@AigermanNoam
Noam Aigerman
2 years
RT @WillMGao: Excited to share TextDeformer, our SIGGRAPH2023 work on text-driven mesh manipulation! (1/7) https://…
0
6
0
@AigermanNoam
Noam Aigerman
2 years
RT @thibaultgroueix: 📣📣Code release for *Neural Jacobian Fields*.
0
2
0
@AigermanNoam
Noam Aigerman
3 years
@amirvaxman_dgp No guarantees about the prediction in the triangle's (irrelevant) normal direction. It's a null-space for the loss, so intuitively, the network probably chooses its prediction in the normal direction s.t. it facilitates making a correct prediction in the tangent space.
1
0
1
@AigermanNoam
Noam Aigerman
3 years
@amirvaxman_dgp The loss is defined on the restriction, and there it is well-defined. The *gradient* is also well-defined; it's simply zero in each triangle's normal direction (due to back-propagation through the restriction operator, which discards the normal direction).
1
0
0
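The two replies above can be checked on a toy case (my construction, not code from the paper): define the loss only on the restriction of a 3D prediction to a triangle's tangent plane, backprop, and the resulting gradient has zero component along the discarded normal.

```python
import torch

n = torch.tensor([0.0, 0.0, 1.0])            # triangle normal
target = torch.tensor([1.0, 2.0, 0.0])       # target lies in the tangent plane
pred = torch.tensor([0.3, -0.7, 5.0], requires_grad=True)

# Restriction operator: drop the normal component of the prediction.
restrict = pred - (pred @ n) * n
loss = ((restrict - target) ** 2).sum()
loss.backward()

pred.grad @ n  # ≈ 0: the normal direction is a null-space of the loss
```

Analytically: d(restrict)/d(pred) = I - n nᵀ, so the chain rule projects the gradient onto the tangent plane, exactly as described.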
@AigermanNoam
Noam Aigerman
3 years
@esx2ve Yes, any genus, out of the box! What's more, your dataset isn't required to have a consistent genus. (Fig. 9 shows how a network trained solely on genus-0 meshes still makes valid predictions for higher-genus meshes.)
0
0
1
@AigermanNoam
Noam Aigerman
3 years
@coreqode We learn to emulate SLIM's locally-injective maps. Neither we nor SLIM have guarantees for global bijectivity, but this could be achieved by training with maps generated via a globally-bijective method (e.g., SLIM + ensuring a non-overlapping boundary)
1
0
1
@AigermanNoam
Noam Aigerman
3 years
@ajayj_ Likewise, was fascinated to talk about your last two papers.
0
0
0
@AigermanNoam
Noam Aigerman
3 years
Joint work with my excellent co-authors, @thibaultgroueix, @KunalMGupta, Vova Kim, Siddhartha Chaudhuri, and Jun Saito @dukecyto at @AdobeResearch. Code and models to be released soon! (10/10)
0
1
5