Model Radar Sensor Detections
This example shows how to model and simulate the output of an automotive radar sensor for different driving scenarios. Generating synthetic radar detections is important for testing and validating tracking and sensor fusion algorithms in corner cases or when sensor hardware is unavailable. This example analyzes the differences between radar measurements and the vehicle ground truth position and velocity for a forward collision warning (FCW) scenario, a passing vehicle scenario, and a scenario with closely spaced targets. It also compares signal-to-noise ratio (SNR) values between pedestrian and vehicle targets at various ranges.
In this example, you generate radar detections programmatically. You can also generate detections by using the Driving Scenario Designer app. For an example, see .
Introduction
Vehicles that contain advanced driver assistance system (ADAS) features or are designed to be fully autonomous typically rely on multiple types of sensors. These sensors include sonar, radar, lidar, and vision. A robust solution includes a sensor fusion algorithm to combine the strengths across the various types of sensors included in the system. For more information about sensor fusion of synthetic detections from a multisensor ADAS system, see .
When using synthetic detections for testing and validating tracking and sensor fusion algorithms, it is important to understand how the generated detections model the sensor's unique performance characteristics. Each kind of automotive sensor provides a specific set of strengths and weaknesses which contribute to the fused solution. This example presents some important performance characteristics of automotive radars and shows how the radar performance is modeled by using synthetic detections.
Radar Sensor Model
This example uses drivingRadarDataGenerator to generate synthetic radar detections. drivingRadarDataGenerator models the following performance characteristics of automotive radar:
Strengths
Good range and range-rate accuracy over long detection ranges
Long detection range for vehicles
Weaknesses
Poor position and velocity accuracy along the cross-range dimension
Shorter detection range for pedestrians and other nonmetallic objects
Close-range detection clusters pose a challenge to tracking algorithms
Inability to resolve closely spaced targets at long ranges
FCW Driving Scenario
Create a forward collision warning (FCW) test scenario, which is used to illustrate how to measure a target's position with a typical long-range automotive radar. The scenario consists of a moving ego vehicle and a stationary target vehicle placed 150 meters down the road. The ego vehicle has an initial speed of 50 kph before applying its brakes to achieve a constant deceleration of 3 m/s^2. The vehicle then comes to a complete stop 1 meter before the target vehicle's rear bumper.
addpath(fullfile(matlabroot,'toolbox','shared','tracking','fusionlib'));
rng default;
initialDist = 150;  % m
initialSpeed = 50;  % kph
brakeAccel = 3;     % m/s^2
finalDist = 1;      % m
[scenario, egoCar] = helperCreateSensorDemoScenario('FCW', initialDist, initialSpeed, brakeAccel, finalDist);
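As a quick sanity check on these parameters (a hypothetical calculation, not part of the shipped example), constant deceleration from 50 kph at 3 m/s^2 implies a braking distance of roughly 32 meters, so the ego vehicle cruises for most of the 150-meter approach before braking:

% Sanity check (not part of the shipped example): braking distance and
% time implied by a constant 3 m/s^2 deceleration from 50 kph.
v0 = initialSpeed/3.6;             % Initial speed converted from kph to m/s
brakeDist = v0^2/(2*brakeAccel)    % Braking distance, ~32.2 m
brakeTime = v0/brakeAccel          % Braking time, ~4.6 s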
Forward-Facing Long-Range Radar
Create a forward-facing long-range radar sensor mounted on the ego vehicle's front bumper, 20 cm above the ground. The sensor generates raw detections at 10 Hz (every 0.1 seconds) and has an azimuthal field of view of 20 degrees and an azimuth resolution of 4 degrees. Its maximum range is 150 m and its range resolution is 2.5 m. The Profiles property, set here from actorProfiles(scenario), specifies the physical dimensions and radar cross-section (RCS) patterns of the vehicles seen by the radar in the simulation. As an alternative to raw detections, the drivingRadarDataGenerator can output clustered detections or track updates, as specified with the TargetReportFormat property.
radarSensor = drivingRadarDataGenerator( ...
    'SensorIndex', 1, ...
    'TargetReportFormat', 'Detections', ...
    'UpdateRate', 10, ...
    'MountingLocation', [egoCar.Wheelbase + egoCar.FrontOverhang 0 0.2], ...
    'FieldOfView', [20 5], ...
    'RangeLimits', [0 150], ...
    'AzimuthResolution', 4, ...
    'RangeResolution', 2.5, ...
    'Profiles', actorProfiles(scenario))
radarSensor = 
  drivingRadarDataGenerator with properties:

           SensorIndex: 1
            UpdateRate: 10
      MountingLocation: [3.7000 0 0.2000]
        MountingAngles: [0 0 0]
           FieldOfView: [20 5]
           RangeLimits: [0 150]
       RangeRateLimits: [-100 100]
  DetectionProbability: 0.9000
        FalseAlarmRate: 1.0000e-06

  Use get to show all properties
Simulation of Radar Detections
Simulate the radar measuring the position of the target vehicle by advancing the simulation time of the scenario. The radar sensor generates detections from the true target pose (position, velocity, and orientation) expressed in the ego vehicle's coordinate frame.
The radar is configured to generate detections at 0.1-second intervals, which is consistent with the update rate of typical automotive radars. However, to accurately model the motion of the vehicles, the scenario simulation advances every 0.01 seconds. The sensor returns a logical flag, isValidTime, that is true when the radar reaches its required update interval, indicating that this simulation time step generates detections.
% Create display for FCW scenario
[bep, figScene] = helperCreateSensorDemoDisplay(scenario, egoCar, radarSensor);

metrics = struct;                 % Initialize struct to collect scenario metrics

while advance(scenario)           % Update vehicle positions
    gTruth = targetPoses(egoCar); % Get target positions in ego vehicle coordinates

    % Generate time-stamped radar detections
    time = scenario.SimulationTime;
    [dets, ~, isValidTime] = radarSensor(gTruth, time);

    if isValidTime
        % Update bird's-eye plot with detections and road boundaries
        helperUpdateSensorDemoDisplay(bep, egoCar, radarSensor, dets);

        % Collect radar detections and ground truth for offline analysis
        metrics = helperCollectScenarioMetrics(metrics, gTruth, dets);
    end

    % Take a snapshot for the published example
    helperPublishSnapshot(figScene, time >= 9.1);
end
Position Measurements
Over the duration of the FCW test, the target vehicle's distance from the ego vehicle spans a wide range of values. By comparing the radar's measured longitudinal and lateral positions of the target vehicle to the vehicle's ground truth position, you can observe the accuracy of the radar's measured positions.
Use helperPlotSensorDemoDetections to plot the longitudinal and lateral position errors as the difference between the measured position reported by the radar and the target vehicle's ground truth. The ground truth reference for the target vehicle is the point on the ground directly below the center of the target vehicle's rear axle, which is 1 meter in front of the car's rear bumper.
helperPlotSensorDemoDetections(metrics, 'position', 'reverse range', [-6 6]);

% Show rear overhang of target vehicle
tgtCar = scenario.Actors(2);
rearOverhang = tgtCar.RearOverhang;

subplot(1,2,1);
hold on;
plot(-rearOverhang*[1 1], ylim, 'k');
hold off;
legend('Error', '2\sigma noise', 'Rear overhang');
Longitudinal Position Measurements
For a forward-facing radar configuration, the radar's range measurements correspond to the longitudinal position of the target vehicle.
The longitudinal position errors in the preceding plot on the left show a -1 meter bias between the longitude measured by the radar and the target's ground truth position. This bias indicates that the radar consistently measures the target to be closer than the position reported by the ground truth. Instead of approximating the target as a single point in space, the radar models the physical dimensions of the vehicle's body. Detections are generated along the vehicle's rear side according to the radar's resolution in azimuth, range, and (when enabled) elevation. This -1 meter offset is then explained by the target vehicle's rear overhang, which defines the distance between the vehicle's rear side and its rear axle, where the ground truth reference is located.
The radar is modeled with a range resolution of 2.5 meters. However, the measurement noise is reported to be as small as 0.25 meter at the closest point and grows slightly to 0.41 meter at the farthest tested range. The realized sensor accuracy is much smaller than the radar's range resolution. Because the radar models the SNR dependence of the range errors using the Cramer-Rao lower bound, targets with a large radar cross-section (RCS) or targets that are close to the sensor will have better range accuracy than smaller or more distant targets.
This SNR dependence of the radar's measurement noise is modeled for each of the radar's measured dimensions: azimuth, elevation, range, and range rate.
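For intuition, a common rule-of-thumb approximation ties the range standard deviation to the range resolution and the SNR. This is an illustrative estimate, not necessarily the model's exact internal computation:

% Rule-of-thumb approximation of SNR-dependent range accuracy; an
% illustrative estimate, not the model's exact internal computation.
rangeRes = 2.5;                     % m, configured range resolution
snr = 10.^([30 10]/10);             % example SNRs of 30 dB and 10 dB
rangeSigma = rangeRes./sqrt(2*snr)  % ~0.06 m at 30 dB, ~0.56 m at 10 dB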
Lateral Position Measurements
For a forward-facing radar configuration, the dimension orthogonal to the radar's range measurements (commonly referred to as the sensor's cross-range dimension) corresponds to the lateral position of the target vehicle.
The lateral position errors from the FCW test in the preceding plot on the right show a strong dependence on the target's ground truth range. The radar reports lateral position accuracies as small as 0.03 meters at close ranges and up to 2.6 meters when the target is far from the radar.
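This growth with range follows from geometry: a fixed angular uncertainty subtends a larger cross-range arc at longer ranges. A minimal sketch, assuming a representative 20 dB SNR and the same rule-of-thumb angular accuracy approximation used above; in the scenario, SNR itself falls with range, so the far-range errors grow even faster than this linear trend:

% Cross-range error grows linearly with range for a fixed angular error.
% The 20 dB SNR is an assumed representative value, for illustration only.
azSigma = deg2rad(4)/sqrt(2*10^(20/10));  % approximate azimuth sigma, rad
lateralSigma = [10 75 150]*azSigma        % ~0.05 m, ~0.37 m, ~0.74 m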
Additionally, multiple detections appear when the target is at ranges less than 30 meters. As previously mentioned, the target vehicle is not modeled as a single point in space; the radar model compares the vehicle's dimensions with the radar's resolution. In this scenario, the radar views the rear side of the target vehicle. When the vehicle's rear side spans more than one of the radar's azimuth resolution cells, the radar generates detections from each resolution cell that the target occupies.
Compute the azimuth spanned by the target vehicle in the FCW test when it is at 30 meters ground truth range from the ego vehicle.
% Range from radar to target vehicle's rear side
radarRange = 30 - (radarSensor.MountingLocation(1) + tgtCar.RearOverhang);

% Azimuth spanned by vehicle's rear side at 30 meters ground truth range
width = tgtCar.Width;
azSpan = rad2deg(width/radarRange)
azSpan =

    4.0764
At a ground truth range of 30 meters, the vehicle's rear side begins to span an azimuth greater than the radar's azimuth resolution of 4 degrees. Because the azimuth spanned by the target's rear side exceeds the sensor's resolution, three resolved points along the vehicle's rear side are generated: one from the center of the rear side, one from the left edge of the rear side, and one from the right edge.
Velocity Measurements
Create a driving scenario with two target vehicles (a lead car and a passing car) to illustrate the accuracy of a radar's longitudinal and lateral velocity measurements. The lead car is placed 40 meters in front of the ego vehicle and travels at the same speed. The passing car starts in the left lane alongside the ego vehicle, passes the ego vehicle, and merges into the right lane just behind the lead car. This merging maneuver generates longitudinal and lateral velocity components, enabling you to compare the sensor's accuracy along these two dimensions.
Because the lead car is directly in front of the radar, it has a purely longitudinal velocity component. The passing car has a velocity profile with both longitudinal and lateral velocity components. These components change as the car passes the ego vehicle and moves into the right lane behind the lead car. Comparing the radar's measured longitudinal and lateral velocities of the target vehicles to their ground truth velocities illustrates the radar's ability to observe both of these velocity components.
% Create passing scenario
leadDist = 40;    % m
speed = 50;       % kph
passSpeed = 70;   % kph
[scenario, egoCar] = helperCreateSensorDemoScenario('Passing', leadDist, speed, passSpeed);
Configuration of Radar Velocity Measurements
A radar generates velocity measurements by observing the Doppler frequency shift on the signal energy returned from each target. The rate at which the target's range is changing relative to the radar is derived directly from these Doppler frequencies. Take the radar sensor used in the previous section and configure it to generate range-rate measurements. These measurements have a resolution of 0.5 m/s, which is a typical resolution for an automotive radar.
% Configure radar for range-rate measurements
release(radarSensor);
radarSensor.HasRangeRate = true;
radarSensor.RangeRateResolution = 0.5; % m/s

% Use actor profiles for the passing car scenario
radarSensor.Profiles = actorProfiles(scenario);
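For background, the Doppler relationship that underlies these measurements depends on the radar wavelength. A minimal sketch, assuming a 77 GHz carrier; the sensor model abstracts the waveform, so the carrier frequency here is an assumption for illustration only:

% Doppler shift corresponding to a given range rate, assuming a 77 GHz
% carrier. The sensor model does not expose a carrier frequency; this
% value is assumed for illustration only.
c = 299792458;              % speed of light, m/s
lambda = c/77e9;            % wavelength, about 3.9 mm
rangeRate = -5;             % closing target at 5 m/s
fd = -2*rangeRate/lambda    % Doppler shift, about 2.6 kHz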
Use helperRunSensorDemoScenario to simulate the motion of the ego and target vehicles. This function also collects the simulated metrics, as was previously done for the FCW driving scenario.
snapTime = 6; % simulation time to take snapshot for publishing
metrics = helperRunSensorDemoScenario(scenario, egoCar, radarSensor, snapTime);
Use helperPlotSensorDemoDetections to plot the radar's longitudinal and lateral velocity errors as the difference between the measured velocity reported by the radar and the target vehicle's ground truth.
helperPlotSensorDemoDetections(metrics, 'velocity', 'time', [-25 25]);
subplot(1,2,1);
legend('Lead car error', 'Lead car 2\sigma noise', ...
    'Pass car error', 'Pass car 2\sigma noise', 'Location', 'northwest');
Longitudinal Velocity Measurements
For a forward-facing radar, longitudinal velocity is closely aligned to the radar's range-rate measurements. The preceding plot on the left shows the radar's longitudinal velocity errors for the passing vehicle scenario. Because the radar can accurately measure longitudinal velocity from the Doppler frequency shift observed in the signal energy received from both cars, the velocity errors for both vehicles (shown as points) are small. However, when the passing car enters the radar's field of view at 3 seconds, the passing car's measurement noise (shown using solid yellow lines) is initially large. The noise then decreases until the car merges into the right lane behind the lead car at 7 seconds. As the car passes the ego vehicle, the longitudinal velocity of the passing car includes both radial and nonradial components. The radar inflates its reported longitudinal velocity noise to indicate its inability to observe the passing car's nonradial velocity components as it passes the ego vehicle.
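The radial component the radar observes is the projection of the target velocity onto the line of sight. A small sketch of this decomposition, using a hypothetical target state in ego coordinates:

% Decompose a target velocity into the radial component a radar measures
% and the nonradial component it cannot observe. The position and
% velocity values are hypothetical, for illustration only.
pos = [20 5];                        % target position [x y], m
vel = [2 -3];                        % target velocity [vx vy], m/s
u = pos/norm(pos);                   % unit vector along the line of sight
vRadial = dot(vel, u);               % measured (radial) component
vNonradial = norm(vel - vRadial*u)   % unobserved (nonradial) component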
Lateral Velocity Measurements
For a forward-facing radar, the measured lateral velocity corresponds to a target's nonradial velocity component. The preceding plot on the right shows the passing car's lateral velocity measurement errors, which display as yellow points. The radar's inability to measure lateral velocity produces a large error during the passing car's lane change maneuver between 5 and 7 seconds. However, the radar reports a large lateral velocity noise (shown as solid lines) to indicate that it is unable to observe velocity along the lateral dimension.
Pedestrian and Vehicle Detection
a radar "sees" not only an object's physical dimensions (length, width, and height) but also is sensitive to an object's electrical size. an object's electrical size is referred to as its radar cross-section (rcs) and is commonly given in units of decibel square meters (dbsm). an object's rcs defines how effectively it reflects the electromagnetic energy received from the radar back to the sensor. an object's rcs value depends on many properties, including the object's size, shape, and the kind of materials it contains. an object's rcs also depends on the transmit frequency of the radar. this value can be large for vehicles and other metallic objects. for typical automotive radar frequencies near 77 ghz, a car has a nominal rcs of approximately 10 square meters (10 dbsm). however, nonmetallic objects typically have much smaller values. -8 dbsm is a reasonable rcs to associate with a pedestrian. this value corresponds to an effective electrical size of only 0.16 square meters. in an adas or autonomous driving system, a radar must be able to generate detections on both of these objects.
FCW Driving Scenario with a Pedestrian and a Vehicle
Revisit the FCW scenario from earlier by adding a pedestrian standing on the sidewalk beside the stopped vehicle. Over the duration of the FCW test, the distance from the radar to the target vehicle and pedestrian spans a wide range of values. Comparing the radar's measured signal-to-noise ratio (SNR) reported for the test vehicle and pedestrian detections across the tested ranges demonstrates how the radar's detection performance changes with both detection range and object type.
% Create FCW test scenario
initialDist = 150;   % m
finalDist = 1;       % m
initialSpeed = 50;   % kph
brakeAccel = 3;      % m/s^2
withPedestrian = true;
[scenario, egoCar] = helperCreateSensorDemoScenario('FCW', initialDist, initialSpeed, brakeAccel, finalDist, withPedestrian);
Configuration of Radar Detection Performance
A radar's detection performance is usually specified by the probability of detecting a reference target that has an RCS of 0 dBsm at a specific range. Create a long-range radar that detects a target with an RCS of 0 dBsm at a range of 100 meters, with a detection probability of 90%.
% Configure radar's long-range detection performance
release(radarSensor);
radarSensor.ReferenceRange = 100;   % m
radarSensor.ReferenceRCS = 0;       % dBsm
radarSensor.DetectionProbability = 0.9;

% Use actor profiles for the FCW pedestrian scenario
radarSensor.Profiles = actorProfiles(scenario);
Run the scenario to collect radar detections and ground truth data. Store them for offline analysis.
snapTime = 8; % simulation time to take snapshot for publishing
metrics = helperRunSensorDemoScenario(scenario, egoCar, radarSensor, snapTime);
Plot the SNR of detections for both the target vehicle and the pedestrian.
helperPlotSensorDemoDetections(metrics, 'snr', 'range', [0 160]);
legend('Vehicle', 'Pedestrian');
This plot shows the effect of an object's RCS on the radar's ability to "see" it. Detections corresponding to the stationary test vehicle are shown in red. Detections from the pedestrian are shown in yellow.
The test vehicle is detected out to the farthest range included in this test, but detection of the pedestrian becomes less consistent near 70 meters. This difference between the detection range of the two objects occurs because the test vehicle has a much larger RCS (10 dBsm) than the pedestrian (-8 dBsm), which enables the radar to detect the vehicle at longer ranges than the pedestrian.
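These detection ranges are consistent with the radar range equation, in which received SNR falls off as the fourth power of range. A rough back-of-the-envelope check against the configured 0 dBsm, 100-meter reference; this is an approximation and does not reproduce the model's detection statistics exactly:

% Back-of-the-envelope detection ranges from the R^4 dependence of SNR,
% scaled from the 0 dBsm at 100 m reference. An approximation only, not
% an exact reproduction of the model's detection statistics.
refRange = 100;                        % m, configured reference range
rcsdBsm = [10 -8];                     % vehicle and pedestrian RCS
detRange = refRange*10.^(rcsdBsm/40)   % ~178 m and ~63 m, respectively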
The test vehicle is also detected at the closest range included in this test, but the radar stops generating detections on the pedestrian near 20 meters. In this scenario, the target vehicle is placed directly in front of the radar, but the pedestrian is offset from the radar's line of sight. Near 20 meters, the pedestrian is no longer inside the radar's field of view and cannot be detected by the radar.
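This close-range cutoff is a field-of-view geometry effect: a laterally offset target leaves the beam once the range drops below roughly the offset divided by the tangent of the half field of view. A rough geometric illustration, with the lateral offset inferred from the observed 20-meter cutoff rather than read from the scenario definition:

% Rough field-of-view geometry. The lateral offset is inferred from the
% observed ~20 m cutoff rather than taken from the scenario definition.
halfFOV = 10;                         % deg, half of the 20 deg azimuth FOV
lateralOffset = 20*tand(halfFOV)      % ~3.5 m implied offset
% Widening the FOV to 90 deg (45 deg half-angle) shrinks the blind range:
minRange = lateralOffset/tand(45)     % ~3.5 m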
Revisit this scenario with a mid-range automotive radar to illustrate how the radar's detection performance is affected. Model a mid-range radar that detects an object with an RCS of 0 dBsm at a reference range of 50 meters, with a detection probability of 90%.
% Configure radar for a mid-range detection requirement
release(radarSensor);
radarSensor.ReferenceRange = 50;   % m
radarSensor.ReferenceRCS = 0;      % dBsm
radarSensor.DetectionProbability = 0.9;
Additionally, to improve the detection of objects at close ranges that are offset from the radar's line of sight, the mid-range radar's azimuthal field of view is increased to 90 degrees. The radar's azimuth resolution is set to 10 degrees to search this larger coverage area more quickly.
% Increase radar's field of view in azimuth and elevation to 90 and 10 degrees, respectively
radarSensor.FieldOfView = [90 10];

% Increase radar's azimuth resolution
radarSensor.AzimuthResolution = 10;
Run the FCW test using the mid-range radar and collect the SNR for the detections from the target vehicle and pedestrian. Plot the SNR.
% Run simulation and collect detections and ground truth for offline analysis
metrics = helperRunSensorDemoScenario(scenario, egoCar, radarSensor);

% Plot SNR for vehicle and pedestrian detections
helperPlotSensorDemoDetections(metrics, 'snr', 'range', [0 160]);
legend('Vehicle', 'Pedestrian');
For the mid-range radar, the detections of both the vehicle and the pedestrian are limited to shorter ranges. Whereas the long-range radar detected the vehicle out to the full test range, the mid-range radar's vehicle detections become unreliable beyond 95 meters. Likewise, the pedestrian is detected reliably only out to 35 meters. However, the mid-range radar's extended azimuthal field of view enables detections on the pedestrian down to a 10-meter ground truth range from the sensor, a significant improvement in coverage over the long-range radar.
Detection of Closely Spaced Targets
When multiple targets occupy a radar's resolution cell, the group of closely spaced targets is reported as a single detection. The reported location is the centroid of the locations of the contributing targets. This merging of multiple targets into a single detection is common at long ranges, because the area covered by the radar's azimuth resolution grows with increasing distance from the sensor.
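You can estimate where this merging begins from the azimuth resolution alone: two targets merge once their angular separation falls below one resolution cell. A rough sketch using the values from the scenario that follows:

% Approximate range beyond which two targets with a 1.8 m lateral spacing
% fall inside a single 4 deg azimuth resolution cell and merge.
separation = 1.8;                     % m, lateral spacing of the targets
azRes = 4;                            % deg, long-range azimuth resolution
mergeRange = separation/tand(azRes)   % ~25.7 m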
Create a scenario with two motorcycles traveling side-by-side in front of the ego vehicle. This scenario shows how the radar merges closely spaced targets. The motorcycles are 1.8 meters apart and are traveling 10 kph faster than the ego vehicle.
Over the duration of the scenario, the distance between the motorcycles and the ego vehicle increases. When the motorcycles are close to the radar, they occupy different radar resolution cells. By the end of the scenario, after the distance between the radar and the motorcycles has increased, both motorcycles occupy the same radar resolution cells and are merged. The radar's longitudinal and lateral position errors show when this transition occurs during the scenario.
duration = 8;            % s
speedEgo = 50;           % kph
speedMotorcycles = 60;   % kph
distMotorcycles = 25;    % m
[scenario, egoCar] = helperCreateSensorDemoScenario('Side-by-Side', duration, speedEgo, speedMotorcycles, distMotorcycles);

% Create forward-facing long-range automotive radar sensor mounted on ego vehicle's front bumper
radarSensor = drivingRadarDataGenerator( ...
    'SensorIndex', 1, ...
    'TargetReportFormat', 'Detections', ...
    'MountingLocation', [egoCar.Wheelbase + egoCar.FrontOverhang 0 0.2], ...
    'Profiles', actorProfiles(scenario));

% Run simulation and collect detections and ground truth for offline analysis
snapTime = 5.6; % simulation time to take snapshot for publishing
metrics = helperRunSensorDemoScenario(scenario, egoCar, radarSensor, snapTime);
Plot the radar's longitudinal and lateral position errors. By analyzing the position errors reported for each motorcycle, you can identify the range where the radar can no longer distinguish the two motorcycles as unique objects.
helperPlotSensorDemoDetections(metrics, 'position', 'range', [-3 3], true);
subplot(1,2,2);
legend('Left error', 'Right error', 'Merged error');
Detections are generated from the rear and along the inner side of each motorcycle. The red errors are from the left motorcycle, the yellow errors are from the right motorcycle, and the purple points show the detections that are merged between the two motorcycles. The motorcycles are separated by a distance of 1.8 meters. Each motorcycle is modeled to have a width of 0.6 meters and a length of 2.2 meters. The inner sides of the motorcycles are only 1.2 meters apart.
Inner Side Detections
Detections are generated from points along the inner side of each motorcycle. The detections start at the closest edge and are sampled in range according to the radar's range resolution of 2.5 meters and the motorcycle's position relative to the radar. Depending on where the range-cell boundary falls, a detection occurs either at the middle or the far edge of the motorcycle's inner side. A detection from the motorcycle's closest edge is also generated. This movement through the radar's range resolution cell boundaries creates the three bands of longitudinal position errors seen in the preceding plot on the left. The total longitudinal extent covered by these three bands is 2.2 meters, which corresponds to the length of the motorcycles.
Because the inner sides of the motorcycles are separated by only 1.2 meters, these sampled points all fall within a common azimuthal resolution cell and are merged by the radar. The centroid of these merged points lies midway between the two motorcycles. The centroiding of the merged detections produces a lateral bias with a magnitude of 0.9 meters, corresponding to half of the distance between the motorcycles. In the lateral position error plot on the right, all of the merged detections (shown in purple) have this bias.
Rear Side Detections
Detections generated from the rear side of each motorcycle are further apart (1.8 m) than the sampled points along the inner sides (1.2 m).
At the beginning of the scenario, the motorcycles are at a ground truth range of 25 meters from the ego vehicle. At this close range, detections from the rear sides lie in different azimuthal resolution cells and the radar does not merge them. These distinct rear-side detections are shown as red points (left motorcycle) and yellow points (right motorcycle) in the preceding longitudinal and lateral position error plots. For these unmerged detections, the longitudinal position errors from the rear sides are offset by the rear overhang of the motorcycles (0.37 m). The lateral position errors from the rear sides do not exhibit any bias. This result is consistent with the position errors observed in the FCW scenario.
Summary
This example demonstrated how to model the output of automotive radars using synthetic detections. In particular, it presented how the drivingRadarDataGenerator model:
Provides accurate longitudinal position and velocity measurements over long ranges, but has limited lateral accuracy at long ranges
Generates multiple detections from a single target at close ranges, but merges detections from multiple closely spaced targets into a single detection at long ranges
Sees vehicles and other targets with large radar cross-sections over long ranges, but has limited detection performance for nonmetallic objects such as pedestrians