TSRT14 Sensor fusion
Course Information VT2, 2013
Goal: After the course, the student should have the ability to describe the most important methods and algorithms for sensor fusion, and be able to apply these to sensor network, navigation and target tracking applications. More specifically, after the course the student should have the ability to:
- Understand the fundamental principles in estimation and detection theory.
- Implement algorithms for parameter estimation in linear and non-linear models.
- Implement algorithms for detection and estimation of the position of a target in a sensor network.
- Apply the Kalman filter to linear state space models with a multitude of sensors.
- Apply non-linear filters (extended Kalman filter, unscented Kalman filter, particle filter) to non-linear or non-Gaussian state space models.
- Implement basic algorithms for simultaneous localization and mapping (SLAM).
- Describe and model the most common sensors used in sensor fusion applications.
- Implement the most common motion models in target tracking and navigation applications.
- Understand the interplay of the above in a few concrete real applications.
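To give a concrete flavor of the filtering outcomes above, here is a minimal sketch of the Kalman filter time and measurement updates, written in Python/NumPy purely for illustration (the course itself works in Matlab, and the model, noise levels and measurements below are made up):

```python
import numpy as np

# Toy 1D constant-velocity model: state x = [position, velocity]
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition matrix
H = np.array([[1.0, 0.0]])               # only position is measured
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.5]])                    # measurement noise covariance

def kalman_step(x, P, y):
    # Time update (prediction)
    x = F @ x
    P = F @ P @ F.T + Q
    # Measurement update (correction)
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (y - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
for y in [0.9, 2.1, 2.9, 4.2]:           # made-up noisy position measurements
    x, P = kalman_step(x, P, np.array([y]))
```

The same two-step predict/correct structure carries over to the nonlinear filters in the course (EKF, UKF, particle filter), where the linear updates are replaced by approximations.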
The course includes two laboratory exercises.
The lab contains one data collection part in our lab (RT3, Laboteket, which is located in house B, entrance 27, corridor C) and one data processing part where algorithms will be developed and applied to the data.
The participants will be examined via a lab report, which will be peer-reviewed by other students attending the course. Each lab group will review one report. The report is due on Thursday May 2, 2013, at 23:59; the review report is due on Thursday May 9, at 23:59; and the resubmission of the lab report (if a resubmission is required) is due on Monday May 20, 2013, at 23:59. The report may be written in English (preferably) or Swedish. Each lab and review report should be submitted by e-mail (pdf attachment) to the course assistant. The first version of the lab report should also be sent to Urkund (e-mail: nikwa61.liu_at_analys.urkund.se).
Orientation estimation using smartphone sensors. In this lab an orientation filter will be implemented using measurements from the gyroscope, accelerometer and magnetometer in a smartphone. The lab is compatible with any Android phone containing these sensors (as most modern smartphones do). The students can either use their own phones or a phone provided by the course. Matlab files are provided, as well as the Sensor Fusion Android app, which is needed to stream sensor data from the phone to Matlab.
The lab will consist of a 4 hour lab session in our computer rooms. The participants will be examined during the session and no written report will be required.
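As a rough preview of what the orientation lab involves, the sketch below fuses gyroscope and accelerometer information for a single tilt angle with a complementary filter. This is a deliberately simplified stand-in (one angle instead of a full quaternion orientation, Python instead of Matlab, and made-up sensor samples), not the lab's actual filter:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse the gyro-integrated angle (smooth but drifting) with the
    accelerometer tilt estimate (noisy but drift-free), in radians."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Made-up samples at 100 Hz: phone held still at a 0.5 rad tilt
angle = 0.0
for _ in range(500):
    gyro_rate = 0.0        # rad/s, no rotation
    accel_angle = 0.5      # rad, e.g. from atan2 of accelerometer axes
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.01)
```

The weight alpha trades off gyro smoothness against accelerometer drift correction; the lab's filter plays the same game in three dimensions, with the magnetometer fixing the heading.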
The course assistant, Niklas Wahlström (nikwa_at_isy.liu.se, 282803), is responsible for the lab schedule.
Information during the course will be sent to the email list firstname.lastname@example.org. Information about the email list can be found on "Studentsidan".
The toolbox that will be used during the course can be downloaded here
Literature
Statistical Sensor Fusion. Studentlitteratur, 2012, Second Edition.
Statistical Sensor Fusion - Exercises. LiU.
Statistical Sensor Fusion - Laborations. Available from homepage.
Statistical Sensor Fusion - Matlab Toolbox Manual. Available here
Examination
Written examination with Matlab.
Preliminary lecture plan
Lectures
Slides will be linked from the lecture number in advance.
Lecture 1 (slides): Course overview. Estimation theory for linear models.
Lecture 2 (slides): Estimation theory for nonlinear models with sensor network applications.
Lecture 3 (slides): Detection theory with sensor network applications.
Lecture 4 (slides): Nonlinear filter theory. The Kalman filter. Filter banks.
Lecture 5 (slides): Kalman filter approximation for nonlinear models (EKF, UKF).
Lecture 6 (slides): The point-mass filter and the particle filter.
Lecture 7 (slides): Particle filter theory. The marginalized particle filter.
Lecture 8 (slides): Simultaneous localization and mapping (SLAM).
Lecture 9 (slides): Modeling and motion models.
Lecture 10 (slides): Sensors and sensor-near signal processing.
Preliminary exercise plan
Estimation: 2.1, 2.4, 3.1, 3.2, 3.6b, 2.3, 2.5
Sensor networks: 4.10, 4.2, 4.3, 16.1
Computer-based estimation and detection: 2.10, 3.10, 3.7, 4.7, 4.8, 4.9
Optimal filtering: 6.1, 6.2, 7.1, 7.2, 7.3
Approximative filtering: 8.1, 8.2, 8.4, 9.1, 9.2, 9.3
Computer-based filtering: 8.6, 9.5, 7.10, (16.3)
Computer-based SLAM: 11.1, 11.3
Modeling and motion models: 12.1, 12.3, 13.1, 13.2, 14.2, 12.2
Responsible for information: Fredrik Gustafsson
Last updated: 2013-10-23