
Visualize Sensor Coverage, Detections, and Tracks

Configure and use a bird's-eye plot to display sensor coverage, detections, and tracking results around the ego vehicle.

Overview

Displaying data recorded in vehicle coordinates on a 2-dimensional map around the ego vehicle is an important part of analyzing sensor coverage, detections, and tracking results. Use birdsEyePlot to display a snapshot of this information for a certain time, or to stream data and efficiently update the display.

This example reads pre-recorded sensor data and tracking results. It includes the following:

  • Lane information

  • Vision objects

  • Radar objects

  • Positions, velocities, covariance matrices, and labels of the tracks

  • Most important object (MIO)

The above information was recorded at a high rate of 20 updates per second, except for vision detections, which were recorded at 10 updates per second.

A sensor configuration file defines the position and coverage areas of a vision sensor and a radar sensor with two coverage modes. These coverage areas are displayed on the bird's-eye plot.

Note that the birdsEyePlot object sets up a very specific vehicle coordinate system, where the x-axis points forward from the vehicle, the y-axis points to the left of the vehicle, and the z-axis points up from the ground. The origin of the coordinate system is typically defined as the center of the rear axle, and the positions of the sensors are defined relative to the origin. For more details, see the documentation on vehicle coordinate systems.
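As a quick check of this convention, the following minimal sketch (with hypothetical values and a throwaway plot, not part of the recorded example) plots a single point 10 meters ahead of the rear axle and 2 meters to its left; detection positions are given as [x y] pairs in vehicle coordinates.

% Minimal sketch, hypothetical values: a point at [10 2] is 10 m ahead of
% the rear axle and 2 m to its left in vehicle coordinates.
bepCheck = birdsEyePlot('XLimits',[0 20],'YLimits',[-10 10]);
checkPlotter = detectionPlotter(bepCheck,'DisplayName','coordinate check');
plotDetection(checkPlotter, [10 2]);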

Defining Scene Limits and Sensor Coverage

Configuring a bird's-eye plot takes two steps. In the first step, the bird's-eye plot is created, which sets up the coordinate system described above, with the x-axis directed up on the display and the y-axis directed to the left. It is possible to define the axes limits in each direction. In this forward-looking example, we define the scene up to 90 meters in front of the ego vehicle and 35 meters on each side.

% Create a bird's-eye plot and limit its axes
bep = birdsEyePlot('XLimits', [0 90], 'YLimits', [-35 35]);

In the second step, the bird's-eye plotters are created. The bird's-eye plot offers the following plotters, each configured for plotting a specific data type:

  • coverageAreaPlotter - Plot sensor coverage areas

  • detectionPlotter - Plot object detections

  • trackPlotter - Plot tracks, track uncertainties, and history trails

  • laneBoundaryPlotter - Plot lane boundaries

  • pathPlotter - Plot object trajectories

% Create a coverageAreaPlotter for a vision sensor and two radar modes
cap(1) = coverageAreaPlotter(bep,'FaceColor','blue','EdgeColor','blue');
cap(2) = coverageAreaPlotter(bep,'FaceColor','red','EdgeColor','red');
cap(3) = coverageAreaPlotter(bep,'FaceColor','red','EdgeColor','red');

Load the sensor configuration data. The sensor configuration includes:

  • The position of the sensors relative to the axes origin (x,y), in meters

  • The sensor range, in meters

  • The sensor yaw angle relative to the x-axis, in degrees

  • The sensor field of view (FOV), in degrees

load('SensorConfigurationData.mat');
% Use the sensor configuration to plot the sensor coverage areas. The vision
% sensor uses the shaded blue coverage area and the radar modes are shaded in
% red.
for i = 1:3
    plotCoverageArea(cap(i), [sensorParams(i).X, sensorParams(i).Y],...
        sensorParams(i).Range, sensorParams(i).YawAngle, sensorParams(i).FoV);
end
% Add title
title('Bird''s-Eye Plot')

The display above shows the coverage of the vision sensor and the two radar sensor modes.

The vision sensor is positioned 3.30 meters in front of the origin (rear axle) at the center of the car, with a range of 150 meters and a FOV of 38 degrees.

The radar is positioned 3.38 meters in front of the origin at the center of the car. The radar long-range mode has a range of 174 meters and a FOV of 20 degrees, while the medium-range mode has a range of 60 meters and a FOV of 90 degrees. Note that the coverage areas are truncated at 90 meters in front of the ego vehicle and 35 meters on each side.
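For illustration only, the same coverage areas could be plotted directly from the values quoted above instead of from the loaded sensorParams struct; the forward-facing yaw angle of 0 degrees is an assumption here, not read from the configuration file.

% Sketch only, using the quoted values; a forward-facing yaw of 0 degrees is assumed
plotCoverageArea(cap(1), [3.30 0], 150, 0, 38); % vision sensor
plotCoverageArea(cap(2), [3.38 0], 174, 0, 20); % radar long-range mode
plotCoverageArea(cap(3), [3.38 0],  60, 0, 90); % radar medium-range mode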

This example shows a forward-looking scenario; however, you can define a coverage area in any direction around the ego vehicle. For example, a sensor that covers from the rear of the vehicle backwards would be oriented with a yaw angle of 180 degrees.
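As a sketch with hypothetical sensor values (not part of this example's configuration), the following plots such a rear-facing coverage area on its own bird's-eye plot; the axes limits include negative x so the rear area is visible.

% Minimal sketch, hypothetical sensor: rear-facing sensor mounted 1 m behind
% the origin, 30 m range, 120-degree FOV, yaw angle of 180 degrees
bepRear = birdsEyePlot('XLimits',[-50 50],'YLimits',[-35 35]);
rearCap = coverageAreaPlotter(bepRear,'FaceColor','green','EdgeColor','green');
plotCoverageArea(rearCap, [-1 0], 30, 180, 120);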

The next few lines read the recorded data in preparation for the next steps.

% Load recorded data from a file
load('BirdsEyePlotExampleData.mat', 'dataToDisplay');
% Skip to the 125th time step, where there are 5 vision detections and
% multiple radar objects and tracks
timeStep = 125;
% Extract the various data from the recorded file for that time step
[visionObjectsPos, radarObjectsPos, laneBoundaries, trackPositions, ...
    trackVelocities, trackCovariances, trackLabels, mioLabel, mioPosition, ...
    mioVelocity] = readDataFrame(dataToDisplay(timeStep));

Plotting Detections

Next, create plotters to display the recorded vision and radar detections:

% Create a vision detection plotter and put it in a struct for future use
bepPlotters.Vision = detectionPlotter(bep, 'DisplayName','vision detection', ...
    'MarkerEdgeColor','blue', 'Marker','^');
% Combine all radar detections into one entry and store it for later update
bepPlotters.Radar = detectionPlotter(bep, 'DisplayName','radar detection', ...
    'MarkerEdgeColor','red');
% Call the vision detections plotter
plotDetection(bepPlotters.Vision, visionObjectsPos);
% Repeat the above for radar detections
plotDetection(bepPlotters.Radar, radarObjectsPos);

Plotting Tracks and Most-Important Objects

When adding the tracks to the bird's-eye plot, we provide position, velocity, and position covariance information. The plotter takes care of displaying the track history trail, but since this is a single frame, there will be no history.

% Create a track plotter that shows the last 10 track updates
bepPlotters.Track = trackPlotter(bep, 'DisplayName','tracked object', ...
    'HistoryDepth',10);
% Create a track plotter to plot the most important object
bepPlotters.MIO = trackPlotter(bep, 'DisplayName','most important object', ...
    'MarkerFaceColor','black');
% Call the track plotter to plot all the tracks
plotTrack(bepPlotters.Track, trackPositions, trackVelocities, trackCovariances, trackLabels);
% Repeat for the most important object (MIO)
plotTrack(bepPlotters.MIO, mioPosition, mioVelocity, mioLabel);
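To make the expected input shapes explicit, here is a minimal sketch with hypothetical values for a single track: positions are [x y] rows, velocities are [vx vy] rows, covariances are 2-by-2(-by-N) position covariance matrices, and labels are a cell array of character vectors.

% Minimal sketch, hypothetical values: one track 20 m ahead and 2 m to the
% right, moving forward at 10 m/s, with a 1 m^2 position variance per axis
plotTrack(bepPlotters.Track, [20 -2], [10 0], eye(2), {'T1'});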

Plotting the Lane Boundaries

Plotting lane boundaries can utilize the laneBoundaryPlotter object. To use it, we saved the lane boundaries as parabolicLaneBoundary objects, and call the plotter with them.

% Create a plotter for lane boundaries
bepPlotters.LaneBoundary = laneBoundaryPlotter(bep, ...
    'DisplayName','lane boundaries', 'Color',[.9 .9 0]);
% Call the lane boundaries plotter
plotLaneBoundary(bepPlotters.LaneBoundary, laneBoundaries);
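If you need to construct lane boundaries yourself rather than load them from a recording, a parabolicLaneBoundary is defined by its [A B C] parabola parameters, giving the lateral offset y = A*x^2 + B*x + C; the coefficients below are hypothetical and shown only as a sketch.

% Sketch only, hypothetical coefficients: two nearly straight boundaries
% about 1.8 m to each side of the ego vehicle
lbLeft  = parabolicLaneBoundary([0.0005 0.01  1.8]);
lbRight = parabolicLaneBoundary([0.0005 0.01 -1.8]);
plotLaneBoundary(bepPlotters.LaneBoundary, [lbLeft lbRight]);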

Displaying a Scenario from a Recording File

The recording file contains time-dependent sensor detections, tracking information, and lane boundaries. The next code shows how to play back the recording and display the results on the bird's-eye plot that was configured above.

Note: vision detections were provided every other frame. In such cases, it is beneficial to show the lack of new sensor detections. To do that, simply pass an empty array to the appropriate plotter to delete the previous detections from the display.
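For example, clearing the previously plotted vision detections on a frame with no new vision data looks like this (a one-line illustration of the note above):

% Pass an empty array to remove the previously drawn vision detections
plotDetection(bepPlotters.Vision, []);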

% Rewind to the beginning of the recording file
timeStep = 0;
numSteps = numel(dataToDisplay); % Number of steps in the scenario
% Loop through the scenario as long as the bird's-eye plot is open
while timeStep < numSteps && isvalid(bep.Parent)
    % Promote the time step
    timeStep = timeStep + 1;
    % Capture the current time for a realistic display rate
    tic;
    % Read the data for that time step
    [visionObjectsPos, radarObjectsPos, laneBoundaries, trackPositions, ...
        trackVelocities, trackCovariances, trackLabels, mioLabel, mioPosition, ...
        mioVelocity] = readDataFrame(dataToDisplay(timeStep));
    % Plot detections
    plotDetection(bepPlotters.Vision, visionObjectsPos);
    plotDetection(bepPlotters.Radar, radarObjectsPos);
    % Plot tracks and MIO
    plotTrack(bepPlotters.Track, trackPositions, trackVelocities, trackCovariances, trackLabels);
    plotTrack(bepPlotters.MIO, mioPosition, mioVelocity, mioLabel);
    % Plot lane boundaries
    plotLaneBoundary(bepPlotters.LaneBoundary, laneBoundaries);
    % The recorded data was obtained at a rate of 20 frames per second.
    % Pause for 50 milliseconds for a more realistic display rate. You
    % would not need this when you process data and form tracks in this
    % loop.
    pause(0.05 - toc)
end

Summary

This example demonstrated how to configure and use a bird's-eye plot object and some of the various plotters associated with it.

Try using the track and most-important-object plotters, or using the bird's-eye plot with a different recording file.

Supporting Functions

readDataFrame - Extracts the separate fields from the data provided in dataFrame

function [visionObjectsPos, radarObjectsPos, laneBoundaries, trackPositions, ...
    trackVelocities, trackCovariances, trackLabels, mioLabel, mioPosition, ...
    mioVelocity] = readDataFrame(dataFrame)
    visionObjectsPos    = dataFrame.visionObjectsPos;
    radarObjectsPos     = dataFrame.radarObjectsPos;
    laneBoundaries      = dataFrame.laneBoundaries;
    trackPositions      = dataFrame.trackPositions;
    trackVelocities     = dataFrame.trackVelocities;
    trackCovariances    = dataFrame.trackCovariances;
    trackLabels         = dataFrame.trackLabels;
    mioLabel            = dataFrame.mioLabel;
    mioPosition         = dataFrame.mioPosition;
    mioVelocity         = dataFrame.mioVelocity;
end
