2.160 Identification, Estimation, and Learning
Provides a broad theoretical basis for system identification, estimation, and learning. Topics include least squares estimation and its convergence properties, the Kalman filter and extended Kalman filter, noise dynamics and system representation, function approximation theory, neural networks, radial basis functions, wavelets, Volterra expansions, informative data sets, persistent excitation, asymptotic variance, central limit theorems, model structure selection, system order estimation, maximum likelihood, unbiased estimates, the Cramér-Rao lower bound, the Kullback-Leibler information distance, Akaike's information criterion, experiment design, and model validation.
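As a small illustration of the first topic above, least squares estimation fits the parameters of a linear model from noisy data. The sketch below uses an assumed toy model (not material from the course itself) and NumPy's standard solver:

```python
import numpy as np

# Toy linear model y = X @ theta + noise; the "true" parameters and
# data sizes here are illustrative assumptions, not course content.
rng = np.random.default_rng(0)

theta_true = np.array([2.0, -1.0])                 # parameters to recover
X = rng.normal(size=(100, 2))                      # regressor matrix
y = X @ theta_true + 0.01 * rng.normal(size=100)   # noisy observations

# Least squares estimate, equivalent to solving the normal equations
# (X^T X) theta_hat = X^T y.
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta_hat)  # close to [2.0, -1.0]
```

With enough informative data (the "persistent excitation" condition listed above), the estimate converges toward the true parameters as the sample count grows.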
This class has 2.151 as a prerequisite.
2.160 will be offered this semester (Fall 2019). It is taught by H. Asada.
Lectures meet from 1:00 PM to 2:30 PM on Mondays and Wednesdays in room 1-371.
This class counts for a total of 12 credits.
You can find more information on MIT OpenCourseWare at the Identification, Estimation, and Learning site.
© Copyright 2015 Yasyf Mohamedali