Automated Driving: Design and Verify Perception Systems
Giuseppe Ridinò
2015 The MathWorks, Inc.
Some common questions from automated driving engineers:
- How can I visualize vehicle data?
- How can I detect objects in images?
- How can I fuse multiple detections?
Examples of automated driving sensors:
- Camera
- Radar-based object detector
- Vision-based object detector
- Lidar
- Lane detector
- Inertial measurement unit
Examples of automated driving sensor data:
- Camera: 640 x 480 x 3 image
- Radar-based object detector: SensorID = 2; Timestamp = 1461634696407521; NumDetections = 23; each detection carries TrackID, TrackStatus, Position, Velocity, and Amplitude (e.g. Detections(1): TrackID: 0, TrackStatus: 6, Position: [56.07 17.73 0.34], Velocity: [-8.50 2.86 0], Amplitude: 3)
- Vision-based object detector: SensorID = 1; Timestamp = 1461634696379742; NumDetections = 6; each detection carries TrackID, TrackStatus, Position, Velocity, Classification, and Size
- Lane detector: left and right boundaries, each with IsValid, Confidence, BoundaryType, Offset, HeadingAngle, and Curvature fields (e.g. Left: IsValid: 1, Offset: 1.68, HeadingAngle: 0.002, Curvature: 0.0000228)
- Lidar: 47197 x 3 point cloud
- Inertial measurement unit: Timestamp: 1461634696379742, Velocity: 9.2795, YawRate: 0.0040
Visualize sensor data
Visualize differences in sensor detections
Explore logged vehicle data

Load video data and corresponding mono-camera parameters:
>> video = VideoReader('01_city_c2s_fcw_10s.mp4')
>> load('fcwdemomonocamerasensor.mat', 'sensor')

Load detection sensor data and corresponding parameters:
>> load('01_city_c2s_fcw_10s_sensor.mat', 'vision','lane','radar')
>> load('sensorconfigurationdata.mat', 'sensorparams')

Load lidar point cloud data:
>> load('01_city_c2s_fcw_10s_lidar.mat', 'LidarPointCloud')
Learn more about visualizing vehicle data by exploring examples in the Automated Driving System Toolbox:
- Plot object detections in vehicle coordinates (vision and radar detections, lane detections, detector coverage areas)
- Transform between vehicle and image coordinates
- Plot lidar point cloud
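The vehicle-to-image transform mentioned above can be illustrated with an idealized flat-ground pinhole model. The toolbox's calibrated monocamera sensor model handles pitch and lens distortion; this sketch assumes zero pitch, and the intrinsics (focal, cu, cv, cam_height) are made-up illustrative values, not the demo's calibration:

```python
def vehicle_to_image(x, y, focal=800.0, cu=320.0, cv=240.0, cam_height=1.5):
    """Project a point on the road surface, given in vehicle coordinates
    (x metres ahead, y metres to the left), into image pixels using an
    idealized zero-pitch pinhole camera on flat ground.
    Intrinsics here are assumed values, not calibrated parameters."""
    assert x > 0, "point must be in front of the camera"
    u = cu - focal * y / x           # leftward y moves left in the image
    v = cv + focal * cam_height / x  # ground points appear below the horizon
    return u, v
```

For example, a ground point 15 m straight ahead lands on the image centre column, 80 pixels below the principal point; as x grows, v approaches the horizon row cv, which is why distant road points bunch up near the horizon.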
How can I detect objects in images?
An object detector returns, for each object, a classification and a bounding box (Left, Bottom, Width, Height).
Train object detectors based on ground truth
Workflow: images + ground truth → train detector → object detector (classification plus bounding box: Left, Bottom, Width, Height)
Train object detectors based on ground truth
Design object detectors with the Computer Vision System Toolbox:
- Machine learning: Aggregate Channel Feature (trainACFObjectDetector), Cascade (trainCascadeObjectDetector)
- Deep learning: R-CNN (Regions with Convolutional Neural Networks, trainRCNNObjectDetector), Fast R-CNN (trainFastRCNNObjectDetector), Faster R-CNN (trainFasterRCNNObjectDetector)
Specify ground truth to train detector
How can I create ground truth? (images → ground truth → train detector → object detector)
Specify ground truth to train detector
Video → Ground Truth Labeler App → ground truth → train detector → object detector
Automate labeling based on a manually labeled frame with a point tracker
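The automation idea above, propagating a manually drawn label through later frames using tracked feature points, can be sketched as follows. A real point tracker (e.g. KLT) would supply the point correspondences; here they are given as plain coordinate lists, and the median-displacement rule is an illustrative simplification:

```python
import statistics

def propagate_box(box, pts_prev, pts_next):
    """Shift a labeled [left, top, width, height] box into the next frame
    using the median displacement of tracked feature points.
    A simplified stand-in for point-tracker-based label automation."""
    dx = statistics.median(n[0] - p[0] for n, p in zip(pts_next, pts_prev))
    dy = statistics.median(n[1] - p[1] for n, p in zip(pts_next, pts_prev))
    left, top, w, h = box
    return (left + dx, top + dy, w, h)
```

The median makes the shift robust to a few badly tracked points, which is why trackers of this style tolerate partial occlusion of the labeled object.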
Ground truth labeling to train detectors: video → Ground Truth Labeler App → ground truth → train detector → object detector
Ground truth labeling to evaluate detectors: video → object detector → detections, which are evaluated against ground truth from the Ground Truth Labeler App
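Evaluating detections against ground truth usually reduces to measuring box overlap. A minimal intersection-over-union helper (a common evaluation metric in general, not a specific toolbox API) might look like:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two [left, top, width, height] boxes.
    Returns 1.0 for identical boxes, 0.0 for disjoint ones."""
    ax1, ay1, aw, ah = box_a
    bx1, by1, bw, bh = box_b
    # overlap rectangle, clamped to zero when the boxes are disjoint
    iw = max(0.0, min(ax1 + aw, bx1 + bw) - max(ax1, bx1))
    ih = max(0.0, min(ay1 + ah, by1 + bh) - max(ay1, by1))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0
```

A detection is then typically counted as a true positive when its IoU with some ground-truth box exceeds a threshold such as 0.5.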
Customize Ground Truth Labeler App
- Add custom image reader with groundTruthDataSource
- Add custom automation algorithm with driving.automation.AutomationAlgorithm
- Add connection to other tools with driving.connector.Connector
Learn more about detecting objects in images by exploring examples in the Automated Driving System Toolbox:
- Label detections with the Ground Truth Labeler App
- Add an automation algorithm for lane tracking
- Extend connectivity of the Ground Truth Labeler App
- Train object detectors using deep learning and machine learning techniques
- Explore a pre-trained pedestrian detector
- Explore a lane detector using coordinate transforms for the monocamera sensor model
Example of radar and vision detections of a vehicle: can we fuse detections to better track the vehicle?
Fuse detections with multi-object tracker
Synthesize scenario to test tracker
Test tracker against synthesized data
Track multiple object detections
Multi-Object Tracker: object detections (time, measurement, measurement noise) → Track Manager + Tracking Filter → tracks (time, state, state covariance, track ID, age, is confirmed, is coasted)
- Track Manager: assigns detections to tracks, creates new tracks, updates existing tracks, removes old tracks
- Tracking Filter: predicts and updates the state of each track; supports linear, extended, and unscented Kalman filters
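The track-manager duties listed above (assign, create, update, remove) can be sketched as a greedy nearest-neighbour loop. The gate and max_coast thresholds and the dict-based track record are illustrative simplifications; the toolbox tracker additionally runs a Kalman prediction before assignment and applies confirmation logic:

```python
import math

def update_tracks(tracks, detections, gate=5.0, max_coast=3):
    """One step of a simplified multi-object track manager:
    assign each detection to the nearest track within a distance gate,
    spawn new tracks for unassigned detections, and drop tracks
    that have coasted (gone unmatched) too many steps."""
    unassigned = list(detections)
    for trk in tracks:
        if unassigned:
            best = min(unassigned, key=lambda d: math.dist(d, trk["pos"]))
            if math.dist(best, trk["pos"]) < gate:
                trk["pos"] = best      # update existing track
                trk["age"] += 1
                trk["coast"] = 0
                unassigned.remove(best)
                continue
        trk["coast"] += 1              # no detection matched this step
    for det in unassigned:
        tracks.append({"pos": det, "age": 1, "coast": 0})  # create new track
    return [t for t in tracks if t["coast"] <= max_coast]  # remove old tracks
```

Production trackers replace the greedy loop with a global assignment (e.g. Hungarian algorithm) and gate in measurement space using the innovation covariance.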
Examples of Kalman filter (KF) initialization functions
Motion model          | Linear KF (trackingKF) | Extended KF (trackingEKF) | Unscented KF (trackingUKF)
Constant velocity     | initcvkf               | initcvekf                 | initcvukf
Constant acceleration | initcakf               | initcaekf                 | initcaukf
Constant turn         | not applicable         | initctekf                 | initctukf
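For the constant-velocity row of the table, one predict/update cycle of a linear Kalman filter can be written out by hand. The 1-D state, time step, and noise values below are assumed for illustration; this is a sketch of the internals, not the toolbox filter objects:

```python
def cv_kalman_step(x, P, z, dt=0.1, q=0.01, r=1.0):
    """One predict/update cycle of a 1-D constant-velocity Kalman filter.
    State x = [position, velocity]; P is its 2x2 covariance (list of lists);
    z is a position-only measurement; q and r are assumed process and
    measurement noise variances."""
    # --- predict with the constant-velocity motion model ---
    px = x[0] + dt * x[1]
    vx = x[1]
    p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q
    # --- update with the position measurement ---
    s = p00 + r                  # innovation covariance
    k0, k1 = p00 / s, p10 / s    # Kalman gain
    y = z - px                   # innovation
    x_new = [px + k0 * y, vx + k1 * y]
    P_new = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x_new, P_new
```

Extended and unscented variants (the initcvekf/initcvukf column) replace the linear predict and measurement steps with nonlinear models, which is what makes them applicable to range/azimuth radar measurements and constant-turn motion.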
Fuse and track multiple detections from different sensors
Radar detections (time, position, velocity) and vision detections (time, position, velocity) → Object Packer → object detections (time, measurement, measurement noise) → Multi-Object Tracker (Track Manager + Kalman Filter) → tracks (time, state, state covariance, track ID, age, is confirmed, is coasted)
The object packer is typically unique to the application and sensors: it maps sensor readings into the measurement matrix and specifies the measurement noise for each sensor.
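The object-packer stage described above can be sketched as a small adapter. The field names and per-sensor noise values here are assumptions, since in practice this mapping is unique to each application and sensor suite:

```python
def pack_detection(sensor, raw):
    """Map a sensor-specific reading into a common detection record
    (time, measurement vector, measurement noise) so that one tracker
    can consume radar and vision detections alike.
    Noise variances per sensor are assumed illustrative values."""
    noise = {"radar": 0.5, "vision": 1.5}[sensor]  # assumed variances
    return {
        "time": raw["t"],
        "measurement": list(raw["position"]) + list(raw["velocity"]),
        "noise": noise,
        "source": sensor,
    }
```

Giving each sensor its own measurement noise is what lets the Kalman filter weight a precise radar range reading more heavily than a noisier vision estimate when both report the same object.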
Explore demo to learn more about fusing detections
The Forward Collision Warning Using Sensor Fusion product demo illustrates:
- Packing sensor data into object detections
- Initializing the Kalman filter
- Configuring the multi-object tracker
Virtual scenario generation
- Specify driving scenario and roads
- Add ego vehicle
- Add target vehicle and pedestrian actor
- Play scenario with chase plot
- Create bird's-eye plot to view sensor detections
- Play scenario with sensor models
Simulate effects of vision detection sensor
- Range effects: range measurement accuracy degrades with distance to the object, while angle measurement accuracy stays consistent throughout the coverage area; large range measurement errors may be introduced for detected objects
- Occlusion effects: partially or completely occluded objects are not detected
- Road elevation effects: objects in the coverage area may not be detected because they appear above the horizon line
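The range effect described above (accuracy degrading with distance) can be modelled as measurement noise whose standard deviation grows with range. The linear growth model and its coefficients below are assumptions for illustration, not the toolbox's actual sensor model:

```python
import random

def simulate_range_reading(true_range, base_sigma=0.25, growth=0.02, rng=None):
    """Simulate a vision-sensor range measurement whose accuracy degrades
    with distance: noise standard deviation grows linearly with range.
    base_sigma and growth are assumed coefficients.
    Returns (noisy_reading, sigma_used)."""
    rng = rng or random.Random(0)          # seeded for reproducibility
    sigma = base_sigma + growth * true_range
    return rng.gauss(true_range, sigma), sigma
```

Under these assumed coefficients, a target at 10 m is measured with about 0.45 m standard deviation while one at 80 m sees about 1.85 m, which is the kind of error growth a tracker's measurement-noise settings must account for.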
Learn more about sensor fusion by exploring examples in the Automated Driving System Toolbox:
- Design a multi-object tracker based on logged vehicle data
- Generate C/C++ code from an algorithm which includes a multi-object tracker
- Synthesize a driving scenario to test the multi-object tracker
The Automated Driving System Toolbox helps you:
- Visualize vehicle data: plot sensor detections, plot coverage areas, transform between image and vehicle coordinates
- Detect objects in images: train deep learning networks, label ground truth, connect to other tools
- Fuse multiple detections: design multi-object tracker, generate C/C++ code, synthesize driving scenarios