2.160 Identification, Estimation, and Learning
Provides a broad theoretical basis for system identification, estimation, and learning. Topics include least squares estimation and its convergence properties, the Kalman filter and extended Kalman filter, noise dynamics and system representation, function approximation theory, neural networks, radial basis functions, wavelets, Volterra expansions, informative data sets, persistent excitation, asymptotic variance, central limit theorems, model structure selection, system order estimation, maximum likelihood, unbiased estimates, the Cramér-Rao lower bound, the Kullback-Leibler information distance, Akaike's information criterion, experiment design, and model validation.
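As a flavor of the estimation topics listed above, the sketch below implements a minimal scalar Kalman filter that estimates a constant from noisy measurements. This is an illustrative example only, not course material; the function name and parameter values are invented for the demonstration.

```python
# Minimal scalar Kalman filter: estimate a constant x from noisy
# measurements z_k = x + v_k, where v_k has variance r.
# All names and numbers here are illustrative.

def kalman_constant(measurements, x0=0.0, p0=1.0, r=1.0):
    """Estimate a constant from noisy scalar measurements.

    x0: initial state estimate
    p0: initial estimate variance
    r:  measurement noise variance
    Returns the final estimate and its variance.
    """
    x, p = x0, p0
    for z in measurements:
        # Prediction step is trivial: the state is constant and
        # there is no process noise, so x and p carry over.
        # Update step:
        k = p / (p + r)       # Kalman gain
        x = x + k * (z - x)   # correct the estimate toward z
        p = (1.0 - k) * p     # variance shrinks with each update
    return x, p

# Example: measurements scattered around a true value of 2.0.
est, var = kalman_constant([2.1, 1.9, 2.05, 1.95], x0=0.0, p0=100.0, r=0.1)
```

With a large initial variance `p0`, the filter weights early measurements heavily, and the estimate converges toward the sample mean as the variance `p` contracts.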
This class has 2.151 as a prerequisite.
Lectures meet on Mondays and Wednesdays from 1:00 to 2:30 PM in Room 5-233.
The class is worth a total of 12 credits.
More information is available at http://www.google.com/search?&q=MIT+%2B+2.160&btnG=Google+Search&inurl=https or on the 2.160 Stellar site.
© 2015 Yasyf Mohamedali