Keynote Speakers

Keynote Speaker 1:
Prof Ts. Dr. Massila Kamalrudin

Keynote Speaker 2:

Keynote Speaker 3:
Dr. Daniel Watzenig


Daniel Watzenig was born in Austria. He received his doctoral degree in electrical engineering from Graz University of Technology, Austria, in 2006, and in 2009 the venia docendi for Electrical Measurement and Signal Processing. Since 2008 he has been Divisional Director of the Automotive Electronics Department at the Virtual Vehicle Research Center Graz. In 2017 he was appointed Full Professor of Autonomous Driving at the Institute of Automation and Control, Graz University of Technology, Austria. His research interests focus on the sensing and control of automated vehicles, sensor fusion, and uncertainty estimation. He is the author or co-author of over 180 peer-reviewed papers, book chapters, patents, and articles, and Editor-in-Chief of the SAE International Journal of Connected and Automated Vehicles (SAE JCAV, launched in 2018). Since 2019 he has been an invited guest lecturer at Stanford University, USA, teaching multi-sensor perception, data fusion, and software for autonomous systems (Principles of Robot Autonomy I). He is the founder of the Autonomous Racing Graz team, one of currently six teams in the global Roborace race series.


• A basic introduction to the sense-plan-act challenges of autonomous vehicles
• An introduction to the most common state-of-the-art sensors used in autonomous driving (radar, camera, lidar, GPS, odometry, vehicle-2-x), covering their benefits and disadvantages along with mathematical models of these sensors
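To give a flavour of the mathematical sensor models mentioned above, the following is a minimal illustrative sketch (an assumption for this page, not material from the keynote): a GPS-like position sensor modelled as the true position plus zero-mean Gaussian noise.

```python
import numpy as np

def gps_measurement(true_pos, rng, sigma=2.0):
    """Simulated GPS reading: the true 2-D position plus zero-mean
    Gaussian noise with standard deviation sigma (in metres)."""
    return true_pos + rng.normal(0.0, sigma, size=len(true_pos))

rng = np.random.default_rng(42)
true_pos = np.array([10.0, 5.0])

# Repeated noisy readings scatter around the true position;
# their mean converges toward (10, 5) as more samples are taken.
readings = np.array([gps_measurement(true_pos, rng) for _ in range(500)])
print(readings.mean(axis=0))
```

A radar or lidar model would follow the same pattern but with noise on range and bearing rather than directly on Cartesian position.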

Autonomous driving is seen as one of the pivotal technologies that will considerably shape our society, influence future transportation modes and quality of life, and alter the face of mobility as we experience it today. Many benefits are expected, ranging from fewer accidents, optimized traffic, improved comfort, and social inclusion to lower emissions and better road utilization through the efficient integration of private and public transport.

Autonomous driving is, however, a highly complex sensing and control problem. State-of-the-art vehicles carry many different compositions of sensors, including radar, cameras, and lidar. Each sensor provides specific information about the environment at a certain level of detail, with its own inherent uncertainty and accuracy. Sensors are the key to perceiving the outside world in an autonomous driving system, and their combined performance directly determines the safety of such vehicles. The ability of a single, isolated sensor to provide accurate, reliable data about its environment is extremely limited, as the environment is usually not well defined. Beyond the sensors needed for perception, the control system also needs a basic measure of the vehicle's position in space and its surroundings. Real-time capable sensor processing techniques used to integrate this information have to manage the propagation of sensor inaccuracies, fuse information to reduce the uncertainties, and, ultimately, offer levels of confidence in the produced representations that can then be used for safe navigation decisions and actions.
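The core fusion idea described above, that combining sensors of differing accuracy yields an estimate more certain than any single sensor alone, can be sketched with inverse-variance weighting of two Gaussian estimates (an illustrative textbook example, not the speaker's implementation):

```python
def fuse(mu_a, var_a, mu_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity
    by inverse-variance weighting (the static scalar Kalman update).
    The fused variance is always smaller than either input variance."""
    w_a = var_b / (var_a + var_b)        # more weight to the less uncertain sensor
    mu = w_a * mu_a + (1.0 - w_a) * mu_b
    var = (var_a * var_b) / (var_a + var_b)
    return mu, var

# Hypothetical example: a coarse radar-like range estimate (12 m, variance 4)
# fused with a finer lidar-like estimate (10 m, variance 1).
mu, var = fuse(12.0, 4.0, 10.0, 1.0)
print(mu, var)  # ≈ 10.4, 0.8: pulled toward the lidar, with reduced uncertainty
```

This is the simplest case of the uncertainty propagation and confidence estimation the abstract refers to; full perception stacks extend the same principle to multivariate state with Kalman or Bayesian filters.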