Welcome

I am a researcher in optimization and machine learning. Since February 2022, I have been a postdoc at the University of Tübingen, working on the TRINOM-DS project led by Peter Ochs and Jalal Fadili.

Before that, I wrote my PhD thesis (available here) at the IRIT laboratory in Toulouse, supervised by Cédric Févotte, Edouard Pauwels, and Jérôme Bolte. I was a member of the FACTORY project.

I graduated from INSA Toulouse, where I obtained a master's degree in "Applied Mathematics and Modeling". I wrote my master's thesis under the supervision of Pierre Weiss and Frédéric de Gournay.

My research focuses on designing and analyzing optimization algorithms for machine learning, with a particular emphasis on second-order methods and deep learning.

Teaching

From 2018 to 2021, I gave tutorials and practical sessions to master's students at INP-ENSEEIHT.

Recent work

Near-optimal closed-loop method via Lyapunov damping for convex optimization

S. Maier, C. Castera, P. Ochs (2023)
Preprint

Accelerated gradient dynamics on Riemannian manifolds: Faster rate and trajectory convergence

T. Natu, C. Castera, J. Fadili, P. Ochs (2023)
Preprint

Continuous Newton-like methods featuring inertia and variable mass

C. Castera, H. Attouch, J. Fadili, P. Ochs (2023)
SIAM Journal on Optimization (SIOPT)

Inertial Newton algorithms avoiding strict saddle points

C. Castera (2021)
Journal of Optimization Theory and Applications (JOTA)

Second-order step-size tuning of SGD for non-convex optimization

C. Castera, J. Bolte, C. Févotte, E. Pauwels (2022)
Neural Processing Letters

An inertial Newton algorithm for deep learning

C. Castera, J. Bolte, C. Févotte, E. Pauwels (2021)
Journal of Machine Learning Research (JMLR)

Contact

[firstname].[lastname]@protonmail.com