The theory of optimization answers foundational questions in machine learning and leads to new algorithms for practical applications. In this talk, I will introduce two quantum algorithms that we recently developed, one for convex optimization and one for nonconvex optimization. Both achieve polynomial quantum speedups over the best-known classical algorithms. Our quantum algorithms build on two techniques. First, we replace the classical perturbations in gradient descent methods with the simulation of quantum wave equations, which yields the polynomial speedup in the dimension $n$ for escaping from saddle points. Second, we show how to use a quantum gradient computation algorithm due to Jordan to replace classical gradient queries with quantum evaluation queries at the same complexity. Finally, we present numerical experiments that support our quantum speedup.
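As background for the first technique, the classical baseline being replaced is perturbed gradient descent: when the gradient is near zero (a possible saddle point), the iterate is kicked by a small random perturbation so that it can escape along a negative-curvature direction. The sketch below illustrates this classical idea only, not the quantum algorithm itself; the test function, step size, and noise radius are illustrative choices, not values from the talk.

```python
import numpy as np

def grad(p):
    """Gradient of f(x, y) = (x^2 - 1)^2 + y^2.

    f has a saddle point at (0, 0) and minima at (+/-1, 0),
    so plain gradient descent started at the origin is stuck there.
    """
    x, y = p
    return np.array([4.0 * x * (x**2 - 1.0), 2.0 * y])

def perturbed_gd(p0, eta=0.05, noise=1e-2, steps=500, seed=0):
    """Gradient descent with random perturbations near stationary points."""
    rng = np.random.default_rng(seed)
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        g = grad(p)
        if np.linalg.norm(g) < 1e-3:
            # Near a stationary point: add a small uniform perturbation
            # so the iterate can fall off the saddle.
            p = p + rng.uniform(-noise, noise, size=2)
        else:
            p = p - eta * g
    return p

# Started exactly at the saddle (0, 0), the perturbed iterates
# escape and settle near one of the minima (+/-1, 0).
p = perturbed_gd([0.0, 0.0])
```

The quantum algorithm described in the abstract replaces this random kick with the simulation of a quantum wave equation, which is the source of the polynomial speedup in $n$.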