

Optimization in Machine Learning

Dept. of Computer Science (February 11, 2015)

SEMINAR SERIES : Distinguished lecture

MAJOR SPEAKER : Boley, Daniel
LENGTH : 78 min.
ACCESS : Open to all
SUMMARY : Many problems in machine learning today can be cast as minimizing a convex loss function subject to inequality constraints. As a result, the success of machine learning depends on convex optimization methods that scale to problem sizes approaching that of the World Wide Web. Problems in this class include basis pursuit, compressed sensing, graph reconstruction via precision matrix estimation, and matrix completion under rank constraints. One of the most popular optimization methods is the Alternating Direction Method of Multipliers (ADMM). It scales extremely well, but its convergence rate can be erratic. In this talk I will introduce the problem and the algorithm with some applications, and show how linear algebra can explain the erratic behavior.
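As context for the abstract above, here is a minimal sketch of ADMM applied to one of the problems it mentions, the lasso (basis pursuit denoising). This is an illustration of the method's general structure, not material from the talk; the variable split, penalty parameter `rho`, and all numerical values are assumptions for the example.

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    """ADMM for min 0.5*||Ax - b||^2 + lam*||z||_1  subject to  x = z."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Factor (A^T A + rho*I) once; this is what makes ADMM cheap per iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: quadratic subproblem solved with the cached Cholesky factor.
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal step on the l1 term (induces sparsity).
        z = soft_threshold(x + u, lam / rho)
        # Dual ascent step on the consensus constraint x = z.
        u = u + x - z
    return z

# Small demo: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7]] = [3.0, -2.0]
b = A @ x_true
x_hat = admm_lasso(A, b, lam=0.5)
```

The per-iteration cost after the one-time factorization is two triangular solves and a thresholding pass, which is why the method scales well; the erratic convergence the talk analyzes shows up in how fast `x`, `z`, and `u` settle as `rho` varies.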


This video is presented here with the permission of the speakers. Any downloading, storage, reproduction, or redistribution is strictly prohibited without the prior permission of the respective speakers. Go to Full Disclaimer.

  For enquiries, please contact Digital and Multimedia Services Section

© 2009-2023 All rights reserved