Assistant Professor of Industrial and Systems Engineering
- 2017, Doctoral Degree, IEOR, University of California - Berkeley
- 2014, Master's Degree, IEOR, University of California - Berkeley
- 2012, Bachelor's Degree, Mathematics, Universidad de los Andes
- 2011, Bachelor's Degree, Computer Science, Universidad de los Andes
Andrés Gómez received his B.S. in Computer Science and B.S. in Mathematics from the Universidad de los Andes (Colombia) in 2011 and 2012, respectively. He then obtained his M.S. and Ph.D. in Industrial Engineering and Operations Research from the University of California, Berkeley in 2014 and 2017, respectively. From 2017 to 2019, Dr. Gómez was an Assistant Professor in the Department of Industrial Engineering at the University of Pittsburgh; since 2019, he has been an Assistant Professor in the Department of Industrial and Systems Engineering at the University of Southern California. Dr. Gómez's research focuses on developing new theory and tools for challenging optimization problems arising in finance, machine learning and statistics.
The past two decades have witnessed an explosion in the use of optimization to tackle problems arising in data analysis, finance, statistics and, more recently, machine learning. These new application domains demand faster, more scalable and more precise algorithms, yet classical optimization tools and techniques have struggled to meet such requirements. In particular, there has been enormous progress in solving convex optimization problems; however, most inference problems are naturally non-convex once priors or interpretability conditions are imposed. Decision-makers thus face a dichotomy: either construct a (crude) convex approximation of the problem, which can be solved efficiently but may yield sub-optimal or even poor solutions, or tackle the non-convex problem directly to obtain optimal solutions, at the expense of large or even prohibitive computation times.
Dr. Gómez's goal is to bridge the gap between these two extremes. His research focuses on systematically constructing strong or ideal convex relaxations of difficult problems. Such relaxations can then be used both to obtain high-quality solutions quickly and to solve the problems to optimality efficiently, resulting in the best of both worlds. His research draws on ideas from the following disciplines:
- Discrete optimization (combinatorial, submodularity)
- Mixed-integer optimization (branch-and-bound, lifting, disjunctive programming)
- Convex optimization (quadratic, conic)
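The dichotomy between exact non-convex optimization and convex approximation can be illustrated with a toy sparse-regression example (a generic sketch, not drawn from Dr. Gómez's publications): the cardinality-constrained least-squares problem is solved exactly by enumerating supports, which is exponential in the number of features, while its standard convex L1 (lasso) relaxation is solved quickly by proximal gradient descent. All data, parameter values, and function names below are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 30, 8, 2
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[[1, 5]] = [2.0, -3.0]          # planted sparse signal
b = A @ x_true + 0.1 * rng.standard_normal(n)

def best_subset(A, b, k):
    """Exact solution of min ||Ax - b||^2 s.t. ||x||_0 <= k
    by enumerating all size-k supports (exponential in p)."""
    p = A.shape[1]
    best_S, best_x, best_obj = None, None, np.inf
    for S in itertools.combinations(range(p), k):
        S = list(S)
        xs, *_ = np.linalg.lstsq(A[:, S], b, rcond=None)
        obj = np.sum((A[:, S] @ xs - b) ** 2)
        if obj < best_obj:
            best_obj, best_S, best_x = obj, S, xs
    x = np.zeros(p)
    x[best_S] = best_x
    return x

def lasso_ista(A, b, lam, iters=5000):
    """Convex relaxation: min 0.5||Ax - b||^2 + lam*||x||_1,
    solved by ISTA (proximal gradient with step 1/L)."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    for _ in range(iters):
        z = x - A.T @ (A @ x - b) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

x_exact = best_subset(A, b, k)        # exact but exponential-time
x_l1 = lasso_ista(A, b, lam=5.0)      # fast but biased/approximate
print("exact support:", np.nonzero(x_exact)[0])
print("lasso support:", np.nonzero(np.abs(x_l1) > 1e-3)[0])
```

Strong convex relaxations of the kind described above aim to close this gap: a tighter relaxation makes the fast convex solve return solutions much closer to the exact one, and also speeds up exact branch-and-bound methods that use the relaxation for bounding.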