
Forward Collision Warning Using Sensor Fusion

This example shows how to perform forward collision warning by fusing data from vision and radar sensors to track objects in front of the vehicle.

Overview

Forward collision warning (FCW) is an important feature in driver assistance and automated driving systems, where the goal is to provide correct, timely, and reliable warnings to the driver before an impending collision with the vehicle in front. To achieve the goal, vehicles are equipped with forward-facing vision and radar sensors. Sensor fusion is required to increase the probability of accurate warnings and to minimize the probability of false warnings.

For the purposes of this example, a test car (the ego vehicle) was equipped with various sensors and their outputs were recorded. The sensors used for this example were:

  1. Vision sensor, which provided lists of observed objects with their classification and information about lane boundaries. The object lists were reported 10 times per second. Lane boundaries were reported 20 times per second.

  2. Radar sensor with medium and long range modes, which provided lists of unclassified observed objects. The object lists were reported 20 times per second.

  3. IMU, which reported the speed and turn rate of the ego vehicle 20 times per second.

  4. Video camera, which recorded a video clip of the scene in front of the car. Note: This video is not used by the tracker and only serves to display the tracking results on video for verification.

The process of providing a forward collision warning comprises the following steps:

  1. Obtain the data from the sensors.

  2. Fuse the sensor data to get a list of tracks, i.e., estimated positions and velocities of the objects in front of the car.

  3. Issue warnings based on the tracks and FCW criteria. The FCW criteria are based on the Euro NCAP AEB test procedure and take into account the relative distance and relative speed to the object in front of the car.

For more information about tracking multiple objects, see Multiple Object Tracking.

The visualization in this example is done using monoCamera and birdsEyePlot. For brevity, the functions that create and update the display were moved to helper functions outside of this example. For more information on how to use these displays, see Annotate Video Using Detections in Vehicle Coordinates and Visualize Sensor Coverage, Detections, and Tracks.

This example is a script, with the main body shown here and helper routines in the form of local functions in the sections that follow. For more details about local functions, see Local Functions in the MATLAB documentation.

% Set up the display
[videoReader, videoDisplayHandle, bepPlotters, sensor] = helperCreateFCWDemoDisplay('01_city_c2s_fcw_10s.mp4', 'SensorConfigurationData.mat');
% Read the recorded detections file
[visionObjects, radarObjects, inertialMeasurementUnit, laneReports, ...
    timeStep, numSteps] = readSensorRecordingsFile('01_city_c2s_fcw_10s_sensor.mat');
% An initial ego lane is calculated. If the recorded lane information is
% invalid, define the lane boundaries as straight lines half a lane
% distance on each side of the car
laneWidth = 3.6; % meters
egoLane = struct('left', [0 0 laneWidth/2], 'right', [0 0 -laneWidth/2]);
% Prepare some time variables
time = 0;           % Time since the beginning of the recording
currentStep = 0;    % Current timestep
snapTime = 9.3;     % The time to capture a snapshot of the display
% Initialize the tracker
[tracker, positionSelector, velocitySelector] = setupTracker();
while currentStep < numSteps && ishghandle(videoDisplayHandle)
    % Update scenario counters
    currentStep = currentStep + 1;
    time = time + timeStep;
    % Process the sensor detections as objectDetection inputs to the tracker
    [detections, laneBoundaries, egoLane] = processDetections(...
        visionObjects(currentStep), radarObjects(currentStep), ...
        inertialMeasurementUnit(currentStep), laneReports(currentStep), ...
        egoLane, time);
    % Using the list of objectDetections, return the tracks, updated to time
    confirmedTracks = updateTracks(tracker, detections, time);
    % Find the most important object and calculate the forward collision
    % warning
    mostImportantObject = findMostImportantObject(confirmedTracks, egoLane, positionSelector, velocitySelector);
    % Update video and birds-eye plot displays
    frame = readFrame(videoReader);     % Read video frame
    helperUpdateFCWDemoDisplay(frame, videoDisplayHandle, bepPlotters, ...
        laneBoundaries, sensor, confirmedTracks, mostImportantObject, positionSelector, ...
        velocitySelector, visionObjects(currentStep), radarObjects(currentStep));
    % Capture a snapshot
    if time >= snapTime && time < snapTime + timeStep
        snapnow;
    end
end

Create the Multi-Object Tracker

The multiObjectTracker tracks the objects around the ego vehicle based on the object lists reported by the vision and radar sensors. By fusing information from both sensors, the probability of a false collision warning is reduced.

The setupTracker function returns the multiObjectTracker. When creating a multiObjectTracker, consider the following:

  1. FilterInitializationFcn: The likely motion and measurement models. In this case, the objects are expected to have a constant acceleration motion. Although you can configure a linear Kalman filter for this model, initConstantAccelerationFilter configures an extended Kalman filter. See the 'Define a Kalman Filter' section.

  2. AssignmentThreshold: How far detections can fall from tracks. The default value for this parameter is 30. If there are detections that are not assigned to tracks, but should be, increase this value. If there are detections that get assigned to tracks that are too far, decrease this value. This example uses 35.

  3. DeletionThreshold: When a track is confirmed, it should not be deleted on the first update that no detection is assigned to it. Instead, it should be coasted (predicted) until it is clear that the track is not getting any sensor information to update it. The logic is that if the track is missed P out of Q times, it should be deleted. The default value for this parameter is 5-out-of-5. In this case, the tracker is called 20 times a second and there are two sensors, so there is no need to modify the default.

  4. ConfirmationThreshold: The parameters for confirming a track. A new track is initialized with every unassigned detection. Some of these detections might be false, so all the tracks are initialized as 'Tentative'. To confirm a track, it has to be detected at least M times in N tracker updates. The choice of M and N depends on the visibility of the objects. This example uses the default of 2 detections out of 3 updates.

The outputs of setupTracker are:

  • tracker - the multiObjectTracker that is configured for this case.

  • positionSelector - a matrix that specifies which elements of the state vector are the position: position = positionSelector * state

  • velocitySelector - a matrix that specifies which elements of the state vector are the velocity: velocity = velocitySelector * state

    function [tracker, positionSelector, velocitySelector] = setupTracker()
        tracker = multiObjectTracker(...
            'FilterInitializationFcn', @initConstantAccelerationFilter, ...
            'AssignmentThreshold', 35, 'ConfirmationThreshold', [2 3], ...
            'DeletionThreshold', 5);
        % The State vector is:
        %   In constant velocity:     State = [x;vx;y;vy]
        %   In constant acceleration: State = [x;vx;ax;y;vy;ay]
        % Define which part of the state is the position. For example:
        %   In constant velocity:     [x;y] = [1 0 0 0; 0 0 1 0] * State
        %   In constant acceleration: [x;y] = [1 0 0 0 0 0; 0 0 0 1 0 0] * State
        positionSelector = [1 0 0 0 0 0; 0 0 0 1 0 0];
        % Define which part of the state is the velocity. For example:
        %   In constant velocity:     [vx;vy] = [0 1 0 0; 0 0 0 1] * State
        %   In constant acceleration: [vx;vy] = [0 1 0 0 0 0; 0 0 0 0 1 0] * State
        velocitySelector = [0 1 0 0 0 0; 0 0 0 0 1 0];
    end

Define a Kalman Filter

The multiObjectTracker defined in the previous section uses the filter initialization function defined in this section to create a Kalman filter (linear, extended, or unscented). This filter is then used for tracking each object around the ego vehicle.

function filter = initConstantAccelerationFilter(detection)
% This function shows how to configure a constant acceleration filter. The
% input is an objectDetection and the output is a tracking filter.
% For clarity, this function shows how to configure a trackingKF,
% trackingEKF, or trackingUKF for constant acceleration.
%
% Steps for creating a filter:
%   1. Define the motion model and state
%   2. Define the process noise
%   3. Define the measurement model
%   4. Initialize the state vector based on the measurement
%   5. Initialize the state covariance based on the measurement noise
%   6. Create the correct filter
    % Step 1: Define the motion model and state
    % This example uses a constant acceleration model, so:
    STF = @constacc;     % State-transition function, for EKF and UKF
    STFJ = @constaccjac; % State-transition function Jacobian, only for EKF
    % The motion model implies that the state is [x;vx;ax;y;vy;ay]
    % You can also use constvel and constveljac to set up a constant
    % velocity model, constturn and constturnjac to set up a constant turn
    % rate model, or write your own models.
    % Step 2: Define the process noise
    dt = 0.05; % Known timestep size
    sigma = 1; % Magnitude of the unknown acceleration change rate
    % The process noise along one dimension
    Q1d = [dt^4/4, dt^3/2, dt^2/2; dt^3/2, dt^2, dt; dt^2/2, dt, 1] * sigma^2;
    Q = blkdiag(Q1d, Q1d); % 2-D process noise
    % Step 3: Define the measurement model
    MF = @fcwmeas;       % Measurement function, for EKF and UKF
    MJF = @fcwmeasjac;   % Measurement Jacobian function, only for EKF
    % Step 4: Initialize a state vector based on the measurement
    % The sensors measure [x;vx;y;vy] and the constant acceleration model's
    % state is [x;vx;ax;y;vy;ay], so the third and sixth elements of the
    % state vector are initialized to zero.
    state = [detection.Measurement(1); detection.Measurement(2); 0; detection.Measurement(3); detection.Measurement(4); 0];
    % Step 5: Initialize the state covariance based on the measurement
    % noise. The parts of the state that are not directly measured are
    % assigned a large measurement noise value to account for that.
    L = 100; % A large number relative to the measurement noise
    stateCov = blkdiag(detection.MeasurementNoise(1:2,1:2), L, detection.MeasurementNoise(3:4,3:4), L);
    % Step 6: Create the correct filter.
    % Use 'KF' for trackingKF, 'EKF' for trackingEKF, or 'UKF' for trackingUKF
    FilterType = 'EKF';
    % Creating the filter:
    switch FilterType
        case 'EKF'
            filter = trackingEKF(STF, MF, state,...
                'StateCovariance', stateCov, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'StateTransitionJacobianFcn', STFJ, ...
                'MeasurementJacobianFcn', MJF, ...
                'ProcessNoise', Q ...
                );
        case 'UKF'
            filter = trackingUKF(STF, MF, state, ...
                'StateCovariance', stateCov, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'Alpha', 1e-1, ...
                'ProcessNoise', Q ...
                );
        case 'KF' % The ConstantAcceleration model is linear and KF can be used
            % Define the measurement model: measurement = H * state
            % In this case:
            %   measurement = [x;vx;y;vy] = H * [x;vx;ax;y;vy;ay]
            % So, H = [1 0 0 0 0 0; 0 1 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 1 0]
            %
            % Note that ProcessNoise is automatically calculated by the
            % ConstantAcceleration motion model
            H = [1 0 0 0 0 0; 0 1 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 1 0];
            filter = trackingKF('MotionModel', '2D Constant Acceleration', ...
                'MeasurementModel', H, 'State', state, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'StateCovariance', stateCov);
    end
end

Process and Format the Detections

The recorded information must be processed and formatted before it can be used by the tracker. This has the following steps:

  1. Filtering out unnecessary radar clutter detections. The radar reports many objects that correspond to fixed objects, which include: guard rails, the road median, traffic signs, etc. If these detections are used in the tracking, they create false tracks of fixed objects at the edges of the road and therefore must be removed before calling the tracker. Radar objects are considered nonclutter if they are either stationary in front of the car or moving in its vicinity.

  2. Formatting the detections as input to the tracker, i.e., an array of objectDetection elements, as illustrated in the sketch below. See the processVideo and processRadar supporting functions at the end of this example.
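For illustration, a single formatted radar detection looks like the following minimal sketch. The values here are made up; the actual conversion from recorded objects is done by the processRadar and processVideo functions at the end of this example.

% One formatted detection with illustrative values. The measurement vector
% is [x;vx;y;vy] and 'SensorIndex' 2 identifies the radar sensor, matching
% the conventions used by processRadar below.
detection = objectDetection(0.05, [10; -5; 0.5; 0], ...
    'SensorIndex', 2, 'MeasurementNoise', diag([2,2,2,100]), ...
    'MeasurementParameters', {2});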

    function [detections,laneBoundaries, egoLane] = processDetections...
            (visionFrame, radarFrame, IMUFrame, laneFrame, egoLane, time)
        % Inputs:
        %   visionFrame - objects reported by the vision sensor for this time frame
        %   radarFrame  - objects reported by the radar sensor for this time frame
        %   IMUFrame    - inertial measurement unit data for this time frame
        %   laneFrame   - lane reports for this time frame
        %   egoLane     - the estimated ego lane
        %   time        - the time corresponding to the time frame
        % Remove clutter radar objects
        [laneBoundaries, egoLane] = processLanes(laneFrame, egoLane);
        realRadarObjects = findNonClutterRadarObjects(radarFrame.object,...
            radarFrame.numObjects, IMUFrame.velocity, laneBoundaries);
        % Return an empty list if no objects are reported
        % Counting the total number of objects
        detections = {};
        if (visionFrame.numObjects + numel(realRadarObjects)) == 0
            return;
        end
        % Process the remaining radar objects
        detections = processRadar(detections, realRadarObjects, time);
        % Process video objects
        detections = processVideo(detections, visionFrame, time);
    end

Update the Tracker

To update the tracker, call the updateTracks method with the following inputs:

  1. tracker - the multiObjectTracker that was configured earlier. See the 'Create the Multi-Object Tracker' section.

  2. detections - a list of objectDetection objects that was created by processDetections.

  3. time - the current scenario time.

The output from the tracker is a struct array of tracks.
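For example, a single update step looks like the following minimal sketch, using the names defined in this example. The fields inspected here, TrackID and State, are part of the track structs returned by the tracker.

% One tracker update step: pass the formatted detections and the current time
confirmedTracks = updateTracks(tracker, detections, time);
% Inspect the first confirmed track, if any. For the constant acceleration
% model used in this example, the state is [x;vx;ax;y;vy;ay].
if ~isempty(confirmedTracks)
    disp(confirmedTracks(1).TrackID)
    disp(confirmedTracks(1).State')
end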

Find the Most Important Object and Issue a Forward Collision Warning

The most important object (MIO) is defined as the track that is in the ego lane and is closest in front of the car, i.e., with the smallest positive x value. To lower the probability of false alarms, only confirmed tracks are considered.

Once the MIO is found, the relative speed between the car and the MIO is calculated. The relative distance and relative speed determine the forward collision warning. There are 3 cases of FCW:

  1. Safe (green): There is no car in the ego lane (no MIO), the MIO is moving away from the car, or the distance to the MIO remains constant.

  2. Caution (yellow): The MIO is moving closer to the car, but is still at a distance above the FCW distance, $d_{FCW}$. The FCW distance is calculated using the Euro NCAP AEB Test Protocol. Note that this distance varies with the relative speed between the MIO and the car, and is greater when the closing speed is higher.

  3. Warn (red): The MIO is moving closer to the car, and its distance is less than the FCW distance, $d_{FCW}$.

The Euro NCAP AEB Test Protocol defines the following distance calculation:

$$d_{FCW} = 1.2\, v_{rel} + \frac{v_{rel}^2}{2 a_{max}}$$

where:

$d_{FCW}$ is the forward collision warning distance.

$v_{rel}$ is the relative velocity between the two vehicles.

$a_{max}$ is the maximum deceleration, defined to be 40% of the gravity acceleration.
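As a quick sanity check of the formula, at a closing speed of 10 m/s the warning distance evaluates to about 24.8 meters, using the same constants that appear in the code below:

% Euro NCAP AEB warning distance at a closing speed of 10 m/s
vRel = 10;                   % relative speed, in m/s
maxDeceleration = 0.4 * 9.8; % 40% of the gravity acceleration, in m/s^2
dFCW = 1.2*vRel + vRel^2/(2*maxDeceleration) % about 24.76 meters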

    function mostImportantObject = findMostImportantObject(confirmedTracks,egoLane,positionSelector,velocitySelector)
        % Initialize outputs and parameters
        MIO = [];               % By default, there is no MIO
        trackID = [];           % By default, there is no trackID associated with an MIO
        FCW = 3;                % By default, if there is no MIO, then FCW is 'safe'
        threatColor = 'green';  % By default, the threat color is green
        maxX = 1000;  % Far enough forward so that no track is expected to exceed this distance
        gAccel = 9.8; % Constant gravity acceleration, in m/s^2
        maxDeceleration = 0.4 * gAccel; % Euro NCAP AEB definition
        delayTime = 1.2; % Delay time for a driver before starting to brake, in seconds
        positions = getTrackPositions(confirmedTracks, positionSelector);
        velocities = getTrackVelocities(confirmedTracks, velocitySelector);
        for i = 1:numel(confirmedTracks)
            x = positions(i,1);
            y = positions(i,2);
            relSpeed = velocities(i,1); % The relative speed between the cars, along the lane
            if x < maxX && x > 0 % No point checking otherwise
                yLeftLane  = polyval(egoLane.left,  x);
                yRightLane = polyval(egoLane.right, x);
                if (yRightLane <= y) && (y <= yLeftLane)
                    maxX = x;
                    trackID = i;
                    MIO = confirmedTracks(i).TrackID;
                    if relSpeed < 0 % Relative speed indicates object is getting closer
                        % Calculate expected braking distance according to
                        % Euro NCAP AEB Test Protocol
                        d = abs(relSpeed) * delayTime + relSpeed^2 / 2 / maxDeceleration;
                        if x <= d % 'warn'
                            FCW = 1;
                            threatColor = 'red';
                        else % 'caution'
                            FCW = 2;
                            threatColor = 'yellow';
                        end
                    end
                end
            end
        end
        mostImportantObject = struct('ObjectID', MIO, 'TrackIndex', trackID, 'Warning', FCW, 'ThreatColor', threatColor);
    end

Summary

This example showed how to create a forward collision warning system for a vehicle equipped with vision, radar, and IMU sensors. It used objectDetection objects to pass the sensor reports to the multiObjectTracker object that fused them and tracked objects in front of the ego vehicle.

Try using different parameters for the tracker to see how they affect the tracking quality. Try modifying the tracking filter to use trackingKF or trackingUKF, or to define a different motion model, e.g., constant velocity or constant turn. Finally, you can try to define your own motion model.
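For example, a constant velocity variant could use a filter initialization function along the lines of the following sketch. This is a hypothetical illustration, not part of the example: initConstantVelocityFilter is a name introduced here, and the sketch assumes the same detection format as above. Because the sensors measure [x;vx;y;vy] and the constant velocity state has the same layout, the state can be initialized directly from the measurement, and the fcwmeas and fcwmeasjac functions below already handle this smaller state.

% A hypothetical constant velocity alternative to initConstantAccelerationFilter
function filter = initConstantVelocityFilter(detection)
    state = detection.Measurement(1:4); % [x;vx;y;vy], same layout as the state
    filter = trackingEKF(@constvel, @fcwmeas, state, ...
        'StateTransitionJacobianFcn', @constveljac, ...
        'MeasurementJacobianFcn', @fcwmeasjac, ...
        'StateCovariance', detection.MeasurementNoise(1:4,1:4), ...
        'MeasurementNoise', detection.MeasurementNoise(1:4,1:4));
end

To use it, pass @initConstantVelocityFilter as the 'FilterInitializationFcn' when constructing the multiObjectTracker in setupTracker.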

Supporting Functions

readSensorRecordingsFile reads the recorded sensor data from a file

function [visionObjects, radarObjects, inertialMeasurementUnit, laneReports, ...
    timeStep, numSteps] = readSensorRecordingsFile(sensorRecordingFileName)
% Read Sensor Recordings
% The |readSensorRecordingsFile| function reads the recorded sensor data file.
% The recorded data is a single structure that is divided into the
% following substructures:
%
% # |inertialMeasurementUnit|, a struct array with fields: timeStamp,
%   velocity, and yawRate. Each element of the array corresponds to a
%   different timestep.
% # |laneReports|, a struct array with fields: left and right. Each element
%   of the array corresponds to a different timestep.
%   Both left and right are structures with fields: isValid, confidence,
%   boundaryType, offset, headingAngle, and curvature.
% # |radarObjects|, a struct array with fields: timeStamp (see below),
%   numObjects (integer) and object (struct). Each element of the array
%   corresponds to a different timestep.
%   |object| is a struct array, where each element is a separate object,
%   with the fields: id, status, position(x;y;z), velocity(vx,vy,vz),
%   amplitude, and rangeMode.
%   Note: z is always constant and vz=0.
% # |visionObjects|, a struct array with fields: timeStamp (see below),
%   numObjects (integer) and object (struct). Each element of the array
%   corresponds to a different timestep.
%   |object| is a struct array, where each element is a separate object,
%   with the fields: id, classification, position (x;y;z),
%   velocity(vx;vy;vz), size(dx;dy;dz). Note: z=vy=vz=dx=dz=0
%
% The timeStamp for recorded vision and radar objects is a uint64 variable
% holding microseconds since the Unix epoch. Timestamps are recorded about
% 50 milliseconds apart. There is a complete synchronization between the
% recordings of vision and radar detections, therefore the timestamps are
% not used in further calculations.
A = load(sensorRecordingFileName);
visionObjects = A.vision;
radarObjects = A.radar;
laneReports = A.lane;
inertialMeasurementUnit = A.inertialMeasurementUnit;
timeStep = 0.05;                 % Data is provided every 50 milliseconds
numSteps = numel(visionObjects); % Number of recorded timesteps
end

processLanes converts sensor-reported lanes to parabolicLaneBoundary lanes and maintains a persistent ego lane estimate

function [laneBoundaries, egoLane] = processLanes(laneReports, egoLane)
% Lane boundaries are updated based on the laneReports from the recordings.
% Since some laneReports contain invalid (isValid = false) reports or
% impossible parameter values (-1e9), these lane reports are ignored and
% the previous lane boundary is used.
leftLane    = laneReports.left;
rightLane   = laneReports.right;
% Check the validity of the reported left lane
cond = (leftLane.isValid && leftLane.confidence) && ...
    ~(leftLane.headingAngle == -1e9 || leftLane.curvature == -1e9);
if cond
    egoLane.left = cast([leftLane.curvature, leftLane.headingAngle, leftLane.offset], 'double');
end
% Update the left lane boundary parameters or use the previous ones
leftParams  = egoLane.left;
leftBoundaries = parabolicLaneBoundary(leftParams);
leftBoundaries.Strength = 1;
% Check the validity of the reported right lane
cond = (rightLane.isValid && rightLane.confidence) && ...
    ~(rightLane.headingAngle == -1e9 || rightLane.curvature == -1e9);
if cond
    egoLane.right  = cast([rightLane.curvature, rightLane.headingAngle, rightLane.offset], 'double');
end
% Update the right lane boundary parameters or use the previous ones
rightParams = egoLane.right;
rightBoundaries = parabolicLaneBoundary(rightParams);
rightBoundaries.Strength = 1;
laneBoundaries = [leftBoundaries, rightBoundaries];
end

findNonClutterRadarObjects removes radar objects that are considered part of the clutter

function realRadarObjects = findNonClutterRadarObjects(radarObject, numRadarObjects, egoSpeed, laneBoundaries)
% The radar objects include many objects that belong to the clutter.
% Clutter is defined as a stationary object that is not in front of the
% car. The following types of objects pass as nonclutter:
%
% # Any object in front of the car
% # Any moving object in the area of interest around the car, including
%   objects that move at a lateral speed around the car
    % Allocate memory
    normVs = zeros(numRadarObjects, 1);
    inLane = zeros(numRadarObjects, 1);
    inZone = zeros(numRadarObjects, 1);
    % Parameters
    LaneWidth = 3.6;            % What is considered in front of the car
    ZoneWidth = 1.7*LaneWidth;  % A wider area of interest
    minV = 1;                   % Any object that moves slower than minV is considered stationary
    for j = 1:numRadarObjects
        [vx, vy] = calculateGroundSpeed(radarObject(j).velocity(1),radarObject(j).velocity(2),egoSpeed);
        normVs(j) = norm([vx,vy]);
        laneBoundariesAtObject = computeBoundaryModel(laneBoundaries, radarObject(j).position(1));
        laneCenter = mean(laneBoundariesAtObject);
        inLane(j) = (abs(radarObject(j).position(2) - laneCenter) <= LaneWidth/2);
        inZone(j) = (abs(radarObject(j).position(2) - laneCenter) <= max(abs(vy)*2, ZoneWidth));
    end
    realRadarObjectsIdx = union(...
        intersect(find(normVs > minV), find(inZone == 1)), ...
        find(inLane == 1));
    realRadarObjects = radarObject(realRadarObjectsIdx);
end

calculateGroundSpeed calculates the true ground speed of a radar-reported object from the relative speed and the ego vehicle speed

function [Vx,Vy] = calculateGroundSpeed(Vxi,Vyi,egoSpeed)
% Inputs
%   (Vxi,Vyi) : relative object speed
%   egoSpeed  : ego vehicle speed
% Outputs
%   [Vx,Vy]   : ground object speed
    Vx = Vxi + egoSpeed;    % Calculate longitudinal ground speed
    theta = atan2(Vyi,Vxi); % Calculate heading angle
    Vy = Vx * tan(theta);   % Calculate lateral ground speed
end

processVideo converts reported vision objects to a list of objectDetection objects

function postProcessedDetections = processVideo(postProcessedDetections, visionFrame, t)
% Process the video objects into objectDetection objects
numRadarObjects = numel(postProcessedDetections);
numVisionObjects = visionFrame.numObjects;
if numVisionObjects
    classToUse = class(visionFrame.object(1).position);
    visionMeasCov = cast(diag([2,2,2,100]), classToUse);
    % Process Vision Objects:
    for i=1:numVisionObjects
        object = visionFrame.object(i);
        postProcessedDetections{numRadarObjects+i} = objectDetection(t,...
            [object.position(1); object.velocity(1); object.position(2); 0], ...
            'SensorIndex', 1, 'MeasurementNoise', visionMeasCov, ...
            'MeasurementParameters', {1}, ...
            'ObjectClassID', object.classification, ...
            'ObjectAttributes', {object.id, object.size});
    end
end
end

processRadar converts reported radar objects to a list of objectDetection objects

function postProcessedDetections = processRadar(postProcessedDetections, realRadarObjects, t)
% Process the radar objects into objectDetection objects
numRadarObjects = numel(realRadarObjects);
if numRadarObjects
    classToUse = class(realRadarObjects(1).position);
    radarMeasCov = cast(diag([2,2,2,100]), classToUse);
    % Process Radar Objects:
    for i=1:numRadarObjects
        object = realRadarObjects(i);
        postProcessedDetections{i} = objectDetection(t, ...
            [object.position(1); object.velocity(1); object.position(2); object.velocity(2)], ...
            'SensorIndex', 2, 'MeasurementNoise', radarMeasCov, ...
            'MeasurementParameters', {2}, ...
            'ObjectAttributes', {object.id, object.status, object.amplitude, object.rangeMode});
    end
end
end

fcwmeas is the measurement function used in this forward collision warning example

function measurement = fcwmeas(state, sensorID)
% The example measurements depend on the sensor type, which is reported by
% the MeasurementParameters property of the objectDetection. The following
% two sensorID values are used:
%   sensorID=1: video objects, the measurement is [x;vx;y].
%   sensorID=2: radar objects, the measurement is [x;vx;y;vy].
% The state is:
%   Constant velocity       state = [x;vx;y;vy]
%   Constant turn           state = [x;vx;y;vy;omega]
%   Constant acceleration   state = [x;vx;ax;y;vy;ay]
    if numel(state) < 6 % Constant turn or constant velocity
        switch sensorID
            case 1 % video
                measurement = [state(1:3); 0];
            case 2 % radar
                measurement = state(1:4);
        end
    else % Constant acceleration
        switch sensorID
            case 1 % video
                measurement = [state(1:2); state(4); 0];
            case 2 % radar
                measurement = [state(1:2); state(4:5)];
        end
    end
end

fcwmeasjac is the Jacobian of the measurement function used in this forward collision warning example

function jacobian = fcwmeasjac(state, sensorID)
% The example measurements depend on the sensor type, which is reported by
% the MeasurementParameters property of the objectDetection. We choose
% sensorID=1 for video objects and sensorID=2 for radar objects. The
% following two sensorID values are used:
%   sensorID=1: video objects, the measurement is [x;vx;y].
%   sensorID=2: radar objects, the measurement is [x;vx;y;vy].
% The state is:
%   Constant velocity       state = [x;vx;y;vy]
%   Constant turn           state = [x;vx;y;vy;omega]
%   Constant acceleration   state = [x;vx;ax;y;vy;ay]
    numStates = numel(state);
    jacobian = zeros(4, numStates, 'like', state);
    if numel(state) < 6 % Constant turn or constant velocity
        switch sensorID
            case 1 % video
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,3) = 1;
            case 2 % radar
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,3) = 1;
                jacobian(4,4) = 1;
        end
    else % Constant acceleration
        switch sensorID
            case 1 % video
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,4) = 1;
            case 2 % radar
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,4) = 1;
                jacobian(4,5) = 1;
        end
    end
end
