Author: DMIR    Posted: 2016-06-21
Time: June 22, 2016 (Wednesday), 10:00-11:30
Venue: Room 725, Engineering Building No. 1, GDUT University Town Campus
Title: Nonconvex Rank Approximations and Applications
Speaker: Dr. Zhao Kang, Southern Illinois University
Zhao Kang is currently a PhD candidate in the Department of Computer Science, Southern Illinois University Carbondale. Prior to that, he obtained his M.S. degree in theoretical physics from Sichuan University. His main research areas are data mining, machine learning, and recommender systems, and he has published papers at AAAI, SIGKDD, ICDM, CIKM, SDM, and other venues. More information about him can be found at https://sites.google.com/site/zhaokanghomepage/.
High-dimensional data usually have intrinsic low-dimensional representations, which are better suited for subsequent analysis or processing. Finding such low-dimensional representations is therefore an essential step in many machine learning and data mining tasks, and low rank is one of the most widely used forms of low-dimensional structure. Since the general rank minimization problem is NP-hard, one popular heuristic is to use the nuclear norm (the sum of singular values) as a convex surrogate for the rank of a matrix (the number of nonzero singular values). However, the nuclear norm penalizes not only the rank but also the magnitude of the singular values, so it may be a poor approximation to the rank function in practical problems. In this talk, I will introduce several nonconvex functions that approximate the rank function more closely, together with an optimization framework for the resulting nonconvex problems. The effectiveness of this approach is examined on several important applications, e.g., subspace clustering, robust principal component analysis, and recommender systems.
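The gap between the rank and the nuclear norm described above is easy to see numerically. The sketch below (an illustration only; the talk's specific nonconvex surrogates may differ) computes the rank, the nuclear norm, and one commonly used nonconvex rank surrogate, a log-based penalty sum(log(1 + s_i / gamma)), from the singular values of a low-rank matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Construct an exactly rank-2 matrix as a product of two thin factors.
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))

# Singular values of A (only the first two are nonzero, up to roundoff).
s = np.linalg.svd(A, compute_uv=False)

rank = int(np.sum(s > 1e-10))       # rank = number of nonzero singular values
nuclear_norm = float(s.sum())       # nuclear norm = sum of singular values

# One example nonconvex surrogate (log-determinant style penalty);
# gamma is a hypothetical smoothing parameter chosen for illustration.
gamma = 1.0
log_surrogate = float(np.sum(np.log1p(s / gamma)))

print(rank)           # 2: only two nonzero singular values
print(nuclear_norm)   # grows with the magnitude of the singular values
print(log_surrogate)  # grows much more slowly as singular values scale up
```

Scaling A by a constant leaves the rank unchanged but scales the nuclear norm by that same constant, while the log surrogate grows only logarithmically, which is the sense in which such nonconvex functions track the rank more faithfully.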