2.160 Identification, Estimation, and Learning
Provides a broad theoretical basis for system identification, estimation, and learning. Topics include least squares estimation and its convergence properties, the Kalman filter and extended Kalman filter, noise dynamics and system representation, function approximation theory, neural nets, radial basis functions, wavelets, Volterra expansions, informative data sets, persistent excitation, asymptotic variance, central limit theorems, model structure selection, system order estimation, maximum likelihood, unbiased estimates, the Cramér-Rao lower bound, Kullback-Leibler information distance, Akaike's information criterion, experiment design, and model validation.
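As a taste of the first topic in the list, least squares parameter estimation can be sketched in a few lines of NumPy. The model, data, and parameter values below are invented purely for illustration and are not course material:

```python
import numpy as np

# Linear model y = X @ theta + noise, with theta unknown.
rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])
X = rng.normal(size=(100, 2))                     # regressor matrix
y = X @ theta_true + 0.1 * rng.normal(size=100)   # noisy measurements

# Least squares estimate: theta_hat minimizes ||y - X @ theta||^2.
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta_hat)
```

With enough informative (persistently exciting) data, the estimate converges to the true parameters, which is exactly the kind of convergence property the course analyzes.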
This class has 2.151 as a prerequisite.
This is a graduate-level class that counts for a total of 12 credits.
© Copyright 2015 Yasyf Mohamedali