Our current interests are #learning, #network, #computation, #TopicModeling, and related topics.
Representation learning: A review and new perspectives
- IEEE transactions on pattern …, 2013 - ieeexplore.ieee.org
Among the various ways of learning representations, this paper focuses on those that are formed by the composition of multiple nonlinear transformations, with the goal of yielding more abstract, and ultimately more useful, representations.
A toolbox for representational similarity analysis
A Walther, L Su… - PLoS Comput …, 2014 - journals.plos.org
Neuronal population codes are increasingly being investigated with multivariate pattern-information analyses. A key challenge is to use measured brain-activity patterns to test computational models of brain information processing.
Nonlinear component analysis as a kernel eigenvalue problem
- Neural computation, 1998 - MIT Press
__Deep Boltzmann Machines.__
- AISTATS, 2009 - jmlr.org
We present a new learning algorithm for Boltzmann machines that contain many layers of hidden variables. Data-dependent expectations are estimated using a variational approximation that tends to focus on a single mode, and data-independent expectations …
Sparse kernel principal component analysis
- Advances in neural information processing …, 2001 - pdfs.semanticscholar.org
Kernel principal component analysis (PCA) is an elegant nonlinear generalisation of the popular linear data analysis method, where a kernel function implicitly defines a nonlinear transformation into a feature space wherein standard PCA is performed.
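The idea described above, a kernel implicitly maps the data into a feature space where ordinary PCA is run, can be sketched in a few lines of NumPy. The RBF kernel, the `gamma` value, and the function name below are our illustrative choices, not anything prescribed by the papers:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA sketch: eigendecompose the double-centred Gram matrix.

    The RBF kernel and the `gamma` value are illustrative choices,
    not prescribed by the paper.
    """
    # RBF Gram matrix from pairwise squared Euclidean distances
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

    # Centre in feature space: Kc = (I - 1/n) K (I - 1/n)
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Leading eigenpairs (eigh returns eigenvalues in ascending order)
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    lambdas = np.maximum(eigvals[idx], 1e-12)  # guard tiny/negative values
    alphas = eigvecs[:, idx]

    # Projection of the training points onto the principal directions
    return Kc @ (alphas / np.sqrt(lambdas))
```

Sparsity (the subject of the Tipping paper) would additionally restrict how many training points carry nonzero expansion coefficients; the dense version above is just the baseline technique.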
Learning eigenfunctions links spectral embedding and kernel PCA
__N Le Roux__, JF Paiement… - Neural …, 2004 - MIT Press
In this letter, we show a direct relation between spectral embedding methods and kernel principal components analysis, and how both are special cases of a more general learning problem: learning the principal eigenfunctions of an operator defined from a kernel.
Kernel methods for deep learning
- Advances in neural information processing systems, 2009 - papers.nips.cc
We introduce a new family of positive-definite kernel functions that mimic the computation in large, multilayer neural nets. These kernel functions can be used in shallow architectures, such as support vector machines (SVMs).
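To give a flavor of these kernels (Cho and Saul's arc-cosine family), the degree-0 member has a closed form that depends only on the angle between the inputs and mimics an infinite hidden layer of step-function units. A minimal sketch, with a function name of our own choosing:

```python
import numpy as np

def arccos_kernel_deg0(x, y):
    """Degree-0 arc-cosine kernel: k(x, y) = 1 - theta / pi,
    where theta is the angle between x and y. Mimics an infinite
    hidden layer of step-function (Heaviside) units."""
    cos_theta = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip for numerical safety
    return 1.0 - theta / np.pi
```

Identical inputs give k = 1, orthogonal inputs give k = 1/2, and opposite inputs give k = 0; the paper builds deep kernels by composing such maps layer by layer, which is not shown here.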
Unsupervised feature learning and deep learning: A review and new perspectives
… - CoRR, abs/1206.5538, 2012 - pdfs.semanticscholar.org
The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide more or less the different explanatory factors of variation behind the data.
Kernel analysis of deep networks
, KR Müller - Journal of Machine Learning …, 2011 - jmlr.org
When training deep networks it is common knowledge that an efficient and well-generalizing representation of the problem is formed. In this paper we aim to elucidate what makes the emerging representation successful, analyzing its layer-wise evolution.
Scaling learning algorithms towards AI
- Large-scale kernel machines, 2007 - iro.umontreal.ca
Although a number of deep architectures have been available for some time, training such architectures is still largely perceived as a difficult challenge.
Deep Restricted Kernel Machines using Conjugate Feature Duality
- 2016 - esat.kuleuven.be
The aim of this paper is to propose a theory of Deep Restricted Kernel Machines offering new foundations for deep learning with kernel machines.
Keywords: deep learning, kernel methods, least squares support vector machines, kernel PCA, restricted Boltzmann machines, duality.
A novel supervised approach to learning efficient kernel descriptors for high accuracy object recognition
B Xie, Y Liu, H Zhang… - Neurocomputing, 2016 - Elsevier
In addition to the methods for learning low-level image representations described above, methods for learning higher-level feature representations have been proposed. For example, deep belief nets (DBNs) and convolutional neural networks can be used to …
This semester, our reading group has been doing a Kaggle challenge together.
This semester, we are going to experiment with a new format for our reading group. We will watch a video presentation and read a related paper every week.
Nov 20th, 2014 [Led by Yuting Ma]
Mauro Maggioni (Duke) on Geometric Methods for Statistical Learning and High-Dimensional Data, given at the Colloquium UAM-ICMAT, Madrid, in September 2014. Here are the links for the talk and the related (but not exact) slides:
Geometric estimation of intrinsically low-dimensional probability measures in high dimensions
Related paper: Multi-scale Geometric Methods for estimating intrinsic dimension
December 4th, 2014: we will continue discussing the video presentations from Nov 20th.
December 11th, 2014: "The pursuit of low-dimensional structures in high-dimensional data", MMDS 2012, Yi Ma, UIUC, Shanghai Tech University, MSR Asia
Jan 30th, 2014 [Lu Meng] Robust Principal Component Analysis (Candès et al., 2010)
Feb 6th, 2014 [Yuting Ma] Optimal weighted nearest neighbour classifiers (AOS, Samworth, 2012)
Feb 13th, 2014. Cancelled due to snow.
Feb 20th, 2014 [Rachel Fan] SKAT (Harvard, Xihong Lin's group; focus on motivation and implementation)
Lee, Seunggeun, et al. (2012). Optimal Unified Approach for Rare-Variant Association Testing with Application to Small-Sample Case-Control Whole-Exome Sequencing Studies. The American Journal of Human Genetics.
Lee, S., Wu, M. C. and Lin, X. (2012). Optimal tests for rare variant effects in sequencing association studies, 13(4), 762-775.
Wu, M. C., Lee, S., Cai, T., Li, Y., Boehnke, M. and Lin, X. (2011). Rare Variant Association Testing for Sequencing Data Using the Sequence Kernel Association Test (SKAT). American Journal of Human Genetics, 89(1), 82-93.
Wu, M. C., Kraft, P., Epstein, M. P., Taylor, D. M., Chanock, S. J., Hunter, D. J. and Lin, X. (2010). Powerful SNP Set Analysis for Case-Control Genome-Wide Association Studies. American Journal of Human Genetics, 86, 929-942.
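The core of the SKAT papers listed above is a variance-component score statistic, Q = (y - mu)' K (y - mu), with a weighted linear kernel K = G W W' G' over the genotype matrix. A minimal sketch of the statistic itself, assuming a fitted null mean `mu_hat` and per-variant weights `w` are supplied (function and argument names are ours; computing p-values from the null distribution, a mixture of chi-squares, is not shown):

```python
import numpy as np

def skat_statistic(y, mu_hat, G, w):
    """SKAT score statistic Q = r' K r with K = G W W' G',
    where r = y - mu_hat are null-model residuals, G is the
    n x p genotype matrix, and W = diag(w) holds variant weights
    (e.g. Beta-density weights on minor allele frequency).
    Only the statistic is computed here."""
    r = y - mu_hat          # residuals under the null model
    s = (G * w).T @ r       # s = W G' r (w scales the p columns of G)
    return float(s @ s)     # Q = r' G W^2 G' r = ||s||^2
```

Up-weighting rare variants via `w` is what lets the test pool weak signals across a region instead of testing each variant separately.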
Feb 27th, 2014 [Ran He] Graph limits and ERGM modeling
March 6th, 2014 [Tony Tong] On topic modeling.
March 13th, 2014 [Cancelled]
March 27th, 2014 [Ran He] Mixed membership stochastic blockmodels
April 3rd, 2014 [Yang Kang] Stochastic variational inference
April 10th, 2014 [Swupnil Sahai]
How Many People Do You Know?: Efficiently Estimating Personal Network Size.
April 17th, 2014 [Yuting]
On Deep learning
April 24th, 2014 [Shuaiwen]
Adaptive Subgradient Methods for Online Learning and Stochastic Optimization
To be resumed in Fall 2014
Roderick J. Little (2013) In Praise of Simplicity not Mathematistry! Ten Simple Powerful Ideas for the Statistical Scientist, Journal of the American Statistical Association, 108:502, 359-369
A comparative study of social network models: Network evolution models and nodal attribute models.