
Automate Testing for Highway Lane Following Controller

This example shows how to automate testing of a lane following controller and the generated code for this component by using Simulink® Test™. In this example, you:

  • Assess the behavior of a lane following controller on different test scenarios with different test requirements.

  • Automate testing of the lane following controller and the generated code for the reference model.

This example uses the lane following controller presented in the Generate Code for Highway Lane Following Controller example.

Introduction

The lane following controller is a fundamental component in highway lane following applications. The lane following controller generates the steering angle and acceleration control commands for an ego vehicle by using lane and vehicle information along with a set speed. For more information about how to design a lane following controller and configure the model for C code generation, see the Generate Code for Highway Lane Following Controller example.

This example shows how to automate testing of the lane following controller against multiple scenarios by using Simulink Test. The scenarios are based on system-level requirements. The example also shows how you can verify the generated code by using software-in-the-loop (SIL) simulation. In this example, you:

  1. Review requirements — The requirements describe the system-level test conditions. Use simulation test scenarios to represent these conditions.

  2. Review test bench model — The model contains controls, vehicle dynamics, and metrics to assess functionality. The metric assessments integrate the test bench model with Simulink Test for automated testing.

  3. Disable runtime visualizations — Disable runtime visualizations to reduce the execution time for automated testing.

  4. Automate testing — Configure the Test Manager to simulate each test scenario, assess success criteria, and report the results. Explore the results dynamically in the Test Manager and export them to a PDF for external reviewers.

  5. Automate testing with generated code — Configure the decision logic and controls components to generate C code. Run automated testing on the generated code to verify its behavior.

  6. Automate testing in parallel — Reduce the overall execution time for running the tests by using parallel computing on a multicore computer.

In this example, you enable system-level simulation through integration with the Unreal Engine® from Epic Games®. The 3D simulation environment requires a Windows® 64-bit platform.

if ~ispc
    error(['3D simulation is supported only on Microsoft',char(174),' Windows',char(174),'.'])
end

Review Requirements

To explore the requirements, open a working copy of the project example files. MATLAB® copies the files to an example folder so that you can edit them.

addpath(fullfile(matlabroot,"toolbox","driving","drivingdemos"))
helperDrivingProjectSetup("HLFController.zip",workDir=pwd)

Requirements Toolbox™ enables you to author, analyze, and manage requirements within Simulink. This example contains 12 test scenarios, with high-level testing requirements defined for each scenario. Open the requirement set.

open("HighwayLaneFollowingControllerTestRequirements.slreqx")

Alternatively, you can open the file from the Requirements tab of the Requirements Manager app in Simulink.

Each row in this file specifies the testing requirements for the lane following controller component in textual and graphical formats. The scenarios with the scenario_LF_ prefix enable you to test the lane following controller algorithm without obstruction by other vehicles. The scenarios with the scenario_ACC_ prefix enable you to test adaptive cruise control (ACC) behavior with other vehicles on the road. The scenarios with the scenario_LFACC_ prefix enable you to test both lane following and ACC behavior with other vehicles on the road.

  • scenario_LF_01_Straight_RightLane — Straight road scenario with the ego vehicle in the right lane.

  • scenario_LF_02_Straight_LeftLane — Straight road scenario with the ego vehicle in the left lane.

  • scenario_LF_03_Curve_LeftLane — Curved road scenario with the ego vehicle in the left lane.

  • scenario_LF_04_Curve_RightLane — Curved road scenario with the ego vehicle in the right lane.

  • scenario_ACC_01_Straight_TargetDiscriminationTest — Straight road scenario with two target vehicles, one in the ego lane and one in the adjacent lane. This scenario tests the ability of the ego vehicle to identify the lead vehicle when another target vehicle travels adjacent to the lead vehicle at the same speed.

  • scenario_ACC_02_Straight_StopnGo — Straight road scenario with a decelerating lead vehicle in the ego lane.

  • scenario_LFACC_01_Curve_DecelTarget — Curved road scenario with a decelerating lead vehicle in the ego lane.

  • scenario_LFACC_02_Curve_AutoRetarget — Curved road scenario with changing lead vehicles in the ego lane. This scenario tests the ability of the ego vehicle to retarget to a new lead vehicle while driving along a curve.

  • scenario_LFACC_03_Curve_StopnGo — Curved road scenario with a lead vehicle slowing down in the ego lane.

  • scenario_LFACC_04_Curve_CutInOut — Curved road scenario with a fast-moving car in the adjacent lane that cuts into the ego lane, and then cuts out of the ego lane.

  • scenario_LFACC_05_Curve_CutInOut_TooClose — Curved road scenario with a fast-moving car in the adjacent lane that aggressively cuts into the ego lane and then cuts out of the ego lane.

  • scenario_LFACC_06_Straight_StopandGoLeadCar — Straight road scenario with a broken-down vehicle in the ego lane.

Review Test Bench Model

Open the test bench model.

open_system("highwaylanefollowingcontrollertestbench")

The test bench model contains these subsystems:

  • Simulation 3D Scenario — Specifies the road, vehicles, and vision detection generator used for simulation.

  • Lane Following Decision Logic — Specifies the lateral and longitudinal decision logic, and provides the lane center information and most important object (MIO) related information to the controller.

  • Lane Following Controller — Specifies the path-following controller that generates control commands to steer the ego vehicle.

  • Vehicle Dynamics — Specifies the dynamic model for the ego vehicle.

  • Metrics Assessment — Assesses system-level behavior.

The Simulation 3D Scenario, Lane Following Decision Logic, Lane Following Controller, Vehicle Dynamics, and Metrics Assessment subsystems are based on the subsystems used in the Highway Lane Following example.

In this example, the focus is on automating the simulation runs for this test bench model by using Simulink Test for the different test scenarios. The Metrics Assessment subsystem enables integration of system-level metric evaluations with Simulink Test. This subsystem uses Simulink verification blocks for this integration. Open the Metrics Assessment subsystem.

open_system("HighwayLaneFollowingControllerTestBench/Metrics Assessment")

This example uses four metrics to assess the lane following system.

  • Verify Lateral Deviation — This block verifies that the lateral deviation from the center line of the lane is within the prescribed thresholds for the corresponding scenario. You define the thresholds when you author the test scenario.

  • Verify In Lane — This block verifies that the ego vehicle is following one of the lanes on the road throughout the simulation.

  • Verify Time Gap — This block verifies that the time gap between the ego vehicle and the lead vehicle is more than 0.8 seconds. The time gap between the two vehicles is the ratio of the calculated headway distance to the ego vehicle velocity, as illustrated in the sketch after this list.

  • Verify No Collision — This block verifies that the ego vehicle does not collide with the lead vehicle at any point during the simulation.
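The time gap check follows directly from this definition. This minimal sketch illustrates the computation with hypothetical values; the variable names are illustrative and are not part of the test bench.

headwayDistance = 20;  % distance from the ego vehicle to the lead vehicle (m), example value
egoVelocity = 22;      % ego vehicle speed (m/s), example value
timeGap = headwayDistance/egoVelocity;                  % time gap (s)
assert(timeGap > 0.8,"Time gap requirement violated")   % threshold checked by the Verify Time Gap block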

Disable Runtime Visualizations

The system-level test bench model opens an Unreal Engine simulation window for visualizing the scenario. This window is not required when the tests are automated.

Configure the Simulation 3D Scene Configuration block to run the Unreal Engine in headless mode, in which the 3D simulation window is disabled.

blk = "highwaylanefollowingcontrollertestbench/simulation 3d scenario/simulation 3d scene configuration";
set_param(blk,enablewindow="off");

Automate Testing

The Test Manager is configured to automate the testing of the lane following controller component. Open the HighwayLaneFollowingControllerMetricAssessments.mldatx test file in the Test Manager.

sltestmgr
sltest.testmanager.load("HighwayLaneFollowingControllerMetricAssessments.mldatx");

Observe the populated test cases previously authored in this file. These tests are configured to run the model.

Each test case uses the post-load callback to run the setup script with the appropriate inputs. After the simulation of each test case, the Test Manager runs the script from the cleanup callback to generate the results plots.
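You can also confirm the contents of the test file programmatically. This minimal sketch (the variable names are illustrative and not part of the shipped example) lists the test cases that the Test Manager will run.

testFile = sltest.testmanager.load("HighwayLaneFollowingControllerMetricAssessments.mldatx");
suites = getTestSuites(testFile);   % test suites defined in the file
cases = getTestCases(suites(1));    % test cases in the first suite
disp({cases.Name}')                 % display the scenario test case names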

Run and Explore Results for Single Test Scenario

Turn off the update messages about model predictive control objects.

mpcverbosity("off");

Test the system-level model with the scenario_LFACC_03_Curve_StopnGo test scenario from Simulink Test.

testFile = sltest.testmanager.load("HighwayLaneFollowingControllerMetricAssessments.mldatx");
testSuite = getTestSuiteByName(testFile,"Test Scenarios");
testCase = getTestCaseByName(testSuite,"scenario_LFACC_03_Curve_StopnGo");
resultObj = run(testCase);

Generate a report after the simulation.

sltest.testmanager.report(resultObj,"Report.pdf", ...
    Title="Highway Lane Following Controller", ...
    IncludeMATLABFigures=true, ...
    IncludeErrorMessages=true, ...
    IncludeTestResults=false, ...
    LaunchReport=true);

Examine Report.pdf. Observe that the Test Environment section shows the platform on which the test is run and the MATLAB version used for testing. The Summary section shows the outcome of the test and the duration of the simulation in seconds. The Results section shows pass or fail results based on the assessment criteria. This section also shows the plots logged from the cleanup callback commands.

If you have a license for Simulink Coverage™, you can get coverage results in the generated Report.pdf by enabling the coverage settings in the test file from the Test Manager. For more information, see the coverage settings documentation for Simulink Test. You can use coverage data to find gaps in testing, missing requirements, or unintended functionality.

Run and Explore Results for All Test Scenarios

Simulate the system for all the tests by using the run(testFile) command. Alternatively, you can run the tests by clicking Play in the Test Manager app.

When the test simulations are complete, you can view the test results on the Results and Artifacts tab of the Test Manager. For each test case, the verification blocks in the model are associated with the Test Manager. This association enables you to visualize the overall pass or fail results.

You can find the generated report in the current working directory. This report contains a detailed summary of the pass or fail status and plots for each test case.
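For reference, you can also run the full test file and export a consolidated report from the command line. This is a minimal sketch; the report file name and title are illustrative.

resultSet = run(testFile);   % run every test case in the test file
sltest.testmanager.report(resultSet,"AllScenariosReport.pdf", ...
    Title="Highway Lane Following Controller - All Scenarios", ...
    IncludeMATLABFigures=true,IncludeErrorMessages=true,LaunchReport=false);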

Verify Test Status in Requirements Editor

Open the Requirements Editor and select Display. Then, select Verification Status to see a verification status summary for each requirement. The green and red bars indicate the pass or fail status, respectively, of the simulation results for each test.
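If you prefer to work from the command line, this minimal sketch (which assumes Requirements Toolbox is installed) loads the requirement set and opens the Requirements Editor.

reqSet = slreq.load("HighwayLaneFollowingControllerTestRequirements.slreqx");  % load the requirement set
slreq.editor   % open the Requirements Editor to review the verification status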

Automate Testing with Generated Code

The HighwayLaneFollowingControllerTestBench model enables you to verify the generated code by performing equivalence testing of the Lane Following Decision Logic and Lane Following Controller components in open loop. To perform equivalence testing of these components, use back-to-back testing. Back-to-back tests compare the results of normal simulations with the generated code results from software-in-the-loop (SIL), processor-in-the-loop (PIL), or hardware-in-the-loop (HIL) simulations. For more information, see the Simulink Test documentation on back-to-back testing. This example focuses on verifying the Lane Following Controller component.

Use these steps to create and run an equivalence test for the Lane Following Controller.

1. Select a test scenario and run the setup script.

helperSLHighwayLaneFollowingControllerSetup(scenarioFcnName="scenario_LFACC_03_Curve_StopnGo");

2. Create a test suite object.

testSuite = getTestSuiteByName(testFile,"LaneFollowingControllerEquivalenceTest");
if isempty(testSuite)
    testSuite = sltest.testmanager.TestSuite(testFile,"LaneFollowingControllerEquivalenceTest");
end

3. Create an equivalence test for the component.

testCase = sltest.testmanager.createTestForComponent("TestFile",testSuite, ...
    "Component","HighwayLaneFollowingControllerTestBench/Lane Following Controller", ...
    TestType="Equivalence",Simulation1Mode="Normal", ...
    Simulation2Mode="Software-in-the-Loop (SIL)",UseComponentInputs=false, ...
    HarnessOptions={"LogOutputs",true});

The previous step creates a test harness by default. Find and open the test harness.

harnessList = sltest.harness.find("HighwayLaneFollowingControllerTestBench/Lane Following Controller");
sltest.harness.open("HighwayLaneFollowingControllerTestBench/Lane Following Controller",harnessList(end).name);

4. Set the tolerance for the equivalence test.

Capture the equivalence criteria.

eq = captureEquivalenceCriteria(testCase);

Set the equivalence criteria tolerance for the output signals.

sc = getSignalCriteria(eq);
for i = 1:size(sc,2)
    if strcmp(sc(i).Name,"steering_angle") || strcmp(sc(i).Name,"ego_acceleration")
        sc(i).AbsTol = sqrt(eps("double"));
    else
        sc(i).Enabled = false;
    end
end

5. Run the equivalence test simulation.

run(testCase);

6. View the test results after the simulation completes. Open the Results and Artifacts tab of the Test Manager or enter this command.

sltest.testmanager.view;

The tab shows pass or fail results based on the assessment criteria. You can use this same process to create equivalence tests for other test scenarios.
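For example, this minimal sketch repeats the same workflow for another scenario from the requirement set; the scenario name is one of those listed in the Review Requirements section, and the variable name testCase2 is illustrative.

helperSLHighwayLaneFollowingControllerSetup(scenarioFcnName="scenario_LFACC_04_Curve_CutInOut");
testCase2 = sltest.testmanager.createTestForComponent("TestFile",testSuite, ...
    "Component","HighwayLaneFollowingControllerTestBench/Lane Following Controller", ...
    TestType="Equivalence",Simulation1Mode="Normal", ...
    Simulation2Mode="Software-in-the-Loop (SIL)",UseComponentInputs=false, ...
    HarnessOptions={"LogOutputs",true});
run(testCase2);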

This process has shown you how to create and run an equivalence test programmatically. You can also do this graphically by following the steps in the corresponding Simulink Test example.

The HighwayLaneFollowingControllerTestBench model also enables integrated testing of the Lane Following Decision Logic and Lane Following Controller components with the vehicle dynamics in closed loop. Regression testing of these components through SIL verification enables you to identify issues at the system level. This workflow lets you verify that the generated code produces results that match the system-level requirements throughout the simulation.

Set the Lane Following Decision Logic component to run in software-in-the-loop mode.

model = "highwaylanefollowingcontrollertestbench/lane following decision logic";
set_param(model,simulationmode="software-in-the-loop")

Set the Lane Following Controller component to run in software-in-the-loop mode.

model = "highwaylanefollowingcontrollertestbench/lane following controller";
set_param(model,simulationmode="software-in-the-loop")

Use the run(testFile) command to simulate the system for all test scenarios. After the tests are complete, review the plots and results in the generated report. If you have a license for Simulink Coverage, you can also get code coverage analysis for the generated code in the report by enabling the coverage settings in the test file.

You can visualize the coverage results for individual test cases, as well as the aggregated coverage results.

Reenable the MPC update messages.

mpcverbosity("on");

Automate Testing in Parallel

If you have a Parallel Computing Toolbox™ license, you can configure the Test Manager to execute tests in parallel using a parallel pool. To run the tests in parallel, save the models after disabling the runtime visualizations by using save_system("HighwayLaneFollowingControllerTestBench"). The Test Manager uses the default Parallel Computing Toolbox cluster and executes the tests only on the local machine. Running the tests in parallel speeds up execution and decreases the time required to get the test results. For more information on how to configure tests to run in parallel from the Test Manager, see the Simulink Test documentation.
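This minimal sketch shows the preparation steps, assuming the default local cluster.

save_system("HighwayLaneFollowingControllerTestBench")   % save the model with runtime visualizations disabled
if isempty(gcp("nocreate"))
    parpool;   % start the default local pool that the Test Manager uses
end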
