ICB 2013 Short Course
Sparse Representation and Low-Rank Representation for Biometrics
-- Theory, Algorithms, and Applications
Description:
The recent vibrant study of sparse representation and compressive sensing has led to numerous groundbreaking results in pattern recognition and computer vision. In this tutorial, we will present a series of three talks providing a high-level overview of the theory, algorithms, and broad applications of these techniques to pattern recognition and biometrics. We will also point out ready-to-use MATLAB toolboxes that participants can use to acquire hands-on experience with these topics.
Online Source Code and References:
Session 1: Introduction and Sparse Representation Theory.
This session introduces the basic concepts of
sparse representation. The emphasis will be
on how to model and recover low-dimensional structures in
high-dimensional signals, and how to verify that the models are
appropriate. We will illustrate this process through examples drawn
from a face recognition application. We will gently introduce the
foundational theoretical results, and show how theory
informs the modeling process.
Session 2: Low-Rank Representation and Applications.
This session extends sparse representation techniques to the estimation of low-rank matrices. We will show how tools and ideas from convex optimization give simple, robust algorithms for recovering low-rank matrices from incomplete, corrupted, and noisy observations. Participants will learn how to identify problems for which these tools may be appropriate, and how to apply them effectively to practical problems such as robust batch image alignment and the detection of symmetric structures in images. We will illustrate the power and potential of these tools in a wide range of computer vision applications, including face and text recognition, texture repairing, video panorama, and holistic reconstruction of urban scenes.
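A canonical instance of the low-rank recovery problem in this session is Principal Component Pursuit (Robust PCA), which decomposes an observed matrix into a low-rank part plus a sparse corruption. The sketch below is an illustrative ADMM iteration in Python/NumPy, with conventional parameter choices (lam = 1/sqrt(max(m, n)), a fixed penalty mu); it is a simplification for exposition, not the course's toolbox code.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the proximal operator of tau * ||.||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(X, tau):
    """Elementwise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca(M, lam=None, mu=1.0, n_iter=300):
    """Principal Component Pursuit via a basic ADMM iteration:
       min ||L||_* + lam * ||S||_1   s.t.   L + S = M."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))   # standard weight from the PCP literature
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                 # dual variable for the constraint L + S = M
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        Y = Y + mu * (M - L - S)
    return L, S

# Demo: a rank-5 matrix with 5% of its entries grossly corrupted.
rng = np.random.default_rng(1)
L0 = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))
S0 = np.zeros((50, 50))
mask = rng.random((50, 50)) < 0.05
S0[mask] = 10 * rng.standard_normal(mask.sum())
L_hat, S_hat = rpca(L0 + S0)
```

Despite the gross corruption, the convex program separates the two components, which is the property exploited in applications such as robust batch image alignment.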
Session 3: Sparse Optimization and Numerical Implementation.
This session discusses the acceleration of sparse and low-rank representation algorithms. Classical convex optimization relies on interior-point methods, which are still too expensive in high-dimensional spaces. We present modern solutions in sparse optimization that improve the speed of convex solvers when the objective function contains nonsmooth sparse and low-rank relaxations.
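One representative first-order method of the kind this session covers is the accelerated proximal gradient algorithm (FISTA), which improves the O(1/k) convergence rate of plain shrinkage-thresholding to O(1/k^2) by adding a momentum step. The following Python/NumPy sketch is illustrative only; the problem sizes and the regularization weight are arbitrary demo choices.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, y, lam=0.1, n_iter=500):
    """FISTA for min_x 0.5 * ||A x - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    z = x.copy()                          # extrapolated (momentum) point
    t = 1.0
    for _ in range(n_iter):
        x_new = soft_threshold(z - A.T @ (A @ z - y) / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + (t - 1.0) / t_new * (x_new - x)   # Nesterov-style momentum
        x, t = x_new, t_new
    return x

# Demo: recover a 2-sparse signal from 30 random measurements.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 80))
x_true = np.zeros(80)
x_true[[5, 40]] = [2.0, -1.0]
y = A @ x_true
x_hat = fista(A, y, lam=0.02, n_iter=500)
```

Each iteration costs only matrix-vector products and an elementwise shrinkage, which is why such first-order methods scale to the high-dimensional problems where interior-point methods become impractical.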
Speaker Bio:
Allen Y. Yang is a Research
Scientist in the Department of EECS at UC Berkeley. He also serves
as the CTO of Atheer Inc., an IT startup in Mountain View, CA.
His primary research areas include pattern analysis of geometric and
statistical models in very high-dimensional data spaces and
applications in motion segmentation, image segmentation, face
recognition, and signal processing in heterogeneous sensor networks. He
has published three books/chapters, 11 journal papers and more than 30
conference papers. He is also the inventor of four US patents/applications. He
received his BEng degree in Computer Science from the University of
Science and Technology of China (USTC) in 2001. From the University of
Illinois at Urbana-Champaign (UIUC), he received two MS degrees in
Electrical Engineering and Mathematics in 2003 and 2005, respectively,
and a PhD in Electrical and Computer Engineering in 2006. Among the
awards he received are a Best Bachelor's Thesis Award from USTC in
2001, a Henry Ford II Scholar Award from UIUC in 2003, a Best Paper
Award from the International Society of Information Fusion and a Best
Student Paper Award from the Asian Conference on Computer Vision in 2009.