Get Started with Automated Driving Toolbox
Automated Driving Toolbox™ provides algorithms and tools for designing, simulating, and testing ADAS and autonomous driving systems. You can design and test vision and lidar perception systems, as well as sensor fusion, path planning, and vehicle controllers. Visualization tools include a bird's-eye-view plot and scope for sensor coverage, detections, and tracks, and displays for video, lidar, and maps. The toolbox lets you import and work with HERE HD Live Map data and ASAM OpenDRIVE® road networks.
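As a minimal sketch of the road-network import workflow, the following assumes an ASAM OpenDRIVE file named intersection.xodr is on the MATLAB path; the file name is hypothetical and not from a shipped example.

    % Create an empty driving scenario and import an ASAM OpenDRIVE road network.
    scenario = drivingScenario;
    roadNetwork(scenario,'OpenDRIVE','intersection.xodr')   % hypothetical file name

    % Visualize the imported roads.
    plot(scenario)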
Using the Ground Truth Labeler app, you can automate the labeling of ground truth to train and evaluate perception algorithms. For hardware-in-the-loop (HIL) testing and desktop simulation of perception, sensor fusion, path planning, and control logic, you can generate and simulate driving scenarios. You can simulate camera, radar, and lidar sensor output in a photorealistic 3D environment, and sensor detections of objects and lane boundaries in a 2.5-D simulation environment.
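A programmatic scenario with synthetic detections can be sketched as follows; the road geometry, speeds, and sensor parameters are illustrative values chosen for this sketch, not values from a shipped example.

    % Cuboid driving scenario with an ego vehicle, one lead vehicle, and a
    % vision sensor that reports object detections in ego coordinates.
    scenario = drivingScenario('SampleTime',0.1);
    road(scenario,[0 0; 100 0]);                 % straight 100 m road

    egoVehicle = vehicle(scenario,'ClassID',1);
    trajectory(egoVehicle,[1 0; 99 0],15);       % ego travels at 15 m/s

    leadVehicle = vehicle(scenario,'ClassID',1);
    trajectory(leadVehicle,[20 0; 99 0],20);     % faster lead vehicle

    sensor = visionDetectionGenerator('SensorIndex',1, ...
        'SensorLocation',[3.4 0],'MaxRange',100);

    while advance(scenario)
        poses = targetPoses(egoVehicle);         % other actors, in ego coordinates
        [dets,numDets,isValid] = sensor(poses,scenario.SimulationTime);
        if isValid
            % dets is a cell array of objectDetection objects; pass them to a
            % tracker or log them for analysis.
        end
    end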
Automated Driving Toolbox provides reference application examples for common ADAS and automated driving features, including forward collision warning, autonomous emergency braking, adaptive cruise control, lane keeping assist, and parking valet. The toolbox supports C/C++ code generation for rapid prototyping and HIL testing, with support for sensor fusion, tracking, path planning, and vehicle controller algorithms.
Tutorials
- Get Started with Ground Truth Labelling: Interactively label multiple lidar and video signals simultaneously.
- Programmatically create ground truth driving scenarios for synthetic sensor data and tracking algorithms.
- Use the Driving Scenario Designer app to create a driving scenario and generate sensor detections and point cloud data from the scenario.
- Learn the basics of configuring and simulating scenes, vehicles, and sensors in a virtual environment rendered using the Unreal Engine® from Epic Games®.
- Simulate RoadRunner scenarios with MATLAB® and Simulink®.
- Construct a monocular camera sensor simulation capable of lane boundary and vehicle detections.
- Train a Deep Learning Vehicle Detector: Train a vision-based vehicle detector using deep learning.
- Perform automatic detection and motion-based tracking of moving objects in a video by using a multi-object tracker (see the tracking sketch after this list).
- Build a Map from Lidar Data Using SLAM: Process lidar data to build a map and estimate a vehicle trajectory using simultaneous localization and mapping (see the registration sketch after this list).
- Develop a visual simultaneous localization and mapping (SLAM) algorithm using image data from the Unreal Engine simulation environment.
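As referenced in the tracking item above, a minimal tracking loop might look like the following; it assumes the detections come from a detector such as the vision sensor in the scenario sketch earlier, and the assignment threshold is an illustrative value.

    % Track objects across frames with a multi-object tracker and a
    % constant-velocity Kalman filter.
    tracker = multiObjectTracker( ...
        'FilterInitializationFcn',@initcvkf, ...
        'AssignmentThreshold',30);

    time = 0;
    dt = 0.1;
    for k = 1:100
        detections = {};   % cell array of objectDetection objects for this frame
                           % (how they are produced is omitted in this sketch)
        confirmedTracks = updateTracks(tracker,detections,time);
        time = time + dt;
    end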
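The lidar map-building item above can be sketched with point cloud registration. Here, scans is an assumed cell array of pointCloud objects (for example, recorded lidar frames), the grid sizes are illustrative, and a recent MATLAB release with rigidtform3d is assumed.

    % Estimate scan-to-scan motion with NDT registration and accumulate the
    % aligned scans into a single map point cloud.
    gridStep = 1;                          % NDT voxel size, in meters
    absPose  = rigidtform3d;               % ego pose, starts at the origin
    map      = scans{1};

    for k = 2:numel(scans)
        relPose = pcregisterndt(scans{k},scans{k-1},gridStep);
        absPose = rigidtform3d(absPose.A * relPose.A);   % compose absolute pose
        map     = pcmerge(map,pctransform(scans{k},absPose),0.5);
    end
    pcshow(map)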
Ground Truth Labeling
Driving Scenario Design
Detection and Tracking
Localization and Mapping
About Automated Driving
Understand coordinate systems for automated driving.
Videos
- Create virtual driving scenarios and import scenarios into the app.
- Generate synthetic sensor detections and export them to MATLAB.
- Build a map from lidar data using SLAM.
- Simulate and test an adaptive cruise control application for automated driving.