
Automate Testing for Highway Lane Following

This example shows how to assess the functionality of a lane-following application by defining scenarios based on requirements and by automating testing of the components and the code generated for those components. The components include lane-detection, sensor fusion, decision logic, and controls. This example builds on the Highway Lane Following example.

Introduction

A highway lane-following system steers a vehicle to travel within a marked lane. It also maintains a set velocity or a safe distance from a preceding vehicle in the same lane. The system typically includes lane detection, sensor fusion, decision logic, and controls components. System-level simulation is a common technique for assessing the functionality of the integrated components. Simulations are configured to test scenarios based on system requirements. Automatically running these simulations enables regression testing to verify system-level functionality.

The Highway Lane Following example showed how to simulate a system-level model for lane following. This example shows how to automate testing of that model against multiple scenarios using Simulink Test™. The scenarios are based on system-level requirements. In this example, you:

  1. Review requirements: The requirements describe system-level test conditions. Simulation test scenarios are created to represent these conditions.

  2. Review the test bench model: Review the system-level lane-following test bench model, which contains metric assessments. These metric assessments integrate the test bench model with Simulink Test for automated testing.

  3. Disable runtime visualizations: Runtime visualizations are disabled to reduce execution time for the automated testing.

  4. Automate testing: A test manager is configured to simulate each test scenario, assess success criteria, and report results. The results are explored dynamically in the test manager and exported to a PDF for external reviewers.

  5. Automate testing with generated code: The lane detection, sensor fusion, decision logic, and controls components are configured to generate C code. The automated testing is run on the generated code to verify expected behavior.

  6. Automate testing in parallel: Overall execution time for running the tests is reduced using parallel computing on a multicore computer.

Testing the system-level model requires a photorealistic simulation environment. In this example, you enable system-level simulation through integration with the Unreal Engine from Epic Games®. The 3D simulation environment requires a Windows® 64-bit platform.

if ~ispc
    error("the 3d simulation environment requires a windows 64-bit platform");
end

To ensure reproducibility of the simulation results, set the random number generator seed.

rng(0);

Review Requirements

Requirements Toolbox™ lets you author, analyze, and manage requirements within Simulink. This example contains ten test scenarios, with high-level testing requirements defined for each scenario. Open the requirement set.

To explore the test requirements and test bench model, open a working copy of the project example files. MATLAB copies the files to an example folder so that you can edit them. The TestAutomation folder contains the files that enable the automated testing.

addpath(fullfile(matlabroot, 'toolbox', 'driving', 'drivingdemos'));
helperDrivingProjectSetup('HighwayLaneFollowing.zip', 'workDir', pwd);
open('HighwayLaneFollowingTestRequirements.slreqx')

Alternatively, you can open the file from the Requirements tab of the Requirements Manager app in Simulink.

Each row in this file specifies the requirements in textual and graphical formats for testing the lane-following system for a test scenario. The scenarios with the scenario_LF_ prefix enable you to test lane-detection and lane-following algorithms without obstruction by other vehicles. The scenarios with the scenario_LFACC_ prefix enable you to test lane-detection, lane-following, and ACC behavior with other vehicles on the road.

  1. scenario_LF_01_Straight_RightLane — Straight road scenario with the ego vehicle in the right lane.

  2. scenario_LF_02_Straight_LeftLane — Straight road scenario with the ego vehicle in the left lane.

  3. scenario_LF_03_Curve_LeftLane — Curved road scenario with the ego vehicle in the left lane.

  4. scenario_LF_04_Curve_RightLane — Curved road scenario with the ego vehicle in the right lane.

  5. scenario_LFACC_01_Curve_DecelTarget — Curved road scenario with a decelerating lead vehicle in the ego lane.

  6. scenario_LFACC_02_Curve_AutoRetarget — Curved road scenario with changing lead vehicles in the ego lane. This scenario tests the ability of the ego vehicle to retarget to a new lead vehicle while driving along a curve.

  7. scenario_LFACC_03_Curve_StopnGo — Curved road scenario with a lead vehicle slowing down in the ego lane.

  8. scenario_LFACC_04_Curve_CutInOut — Curved road scenario in which a fast-moving vehicle in the adjacent lane cuts into the ego lane and then cuts back out.

  9. scenario_LFACC_05_Curve_CutInOut_TooClose — Curved road scenario in which a fast-moving vehicle in the adjacent lane cuts into the ego lane and cuts back out aggressively close to the ego vehicle.

  10. scenario_LFACC_06_Straight_StopandGoLeadCar — Straight road scenario with a lead vehicle that breaks down in the ego lane.

These requirements are implemented as test scenarios with the same names as the scenarios used in the HighwayLaneFollowingTestBench model.

Review Test Bench Model

This example reuses the HighwayLaneFollowingTestBench model from the Highway Lane Following example. Open the test bench model.

open_system("highwaylanefollowingtestbench");

This test bench model contains Simulation 3D Scenario, Lane Marker Detector, Vehicle Detector, Forward Vehicle Sensor Fusion, Lane Following Decision Logic, Lane Following Controller, and Vehicle Dynamics components.

This test bench model is configured using the helperSLHighwayLaneFollowingSetup script. This setup script takes scenarioName as input, where scenarioName can be any one of the previously described test scenarios. To run the setup script, use this code:

scenarioName = "scenario_LFACC_03_Curve_StopnGo";
helperSLHighwayLaneFollowingSetup("scenarioFcnName",scenarioName);

You can now simulate the model and visualize the results. For more details on the analysis of the simulation results and the design of individual components in the test bench model, see the Highway Lane Following example.
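For a quick check outside of the Test Manager, you can also run a single simulation of the configured model directly. This is a minimal sketch; the simOut variable name is illustrative.

% Simulate the test bench once for the scenario configured by the setup script.
simOut = sim("HighwayLaneFollowingTestBench");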

This example focuses on automating the simulation runs for this test bench model using Simulink Test for the different test scenarios. The Metrics Assessment subsystem enables integration of system-level metric evaluations with Simulink Test. Open the Metrics Assessment subsystem.

open_system("highwaylanefollowingtestbench/metrics assessment");

Using this example, you can evaluate the system-level behavior using four system-level metrics. Additionally, you can compute component-level metrics to analyze individual components and their impact on overall system performance.

System-Level Metrics

  • Verify Lateral Deviation — This block verifies that the lateral deviation from the center line of the lane is within prescribed thresholds for the corresponding scenario. Define the thresholds when you author the test scenario.

  • Verify In Lane — This block verifies that the ego vehicle is following one of the lanes on the road throughout the simulation.

  • Verify Time Gap — This block verifies that the time gap between the ego vehicle and the lead vehicle is more than 0.8 seconds. The time gap between the two vehicles is defined as the ratio of the calculated headway distance to the ego vehicle velocity (see the numeric sketch after this list).

  • Verify No Collision — This block verifies that the ego vehicle does not collide with the lead vehicle at any point during the simulation.
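To make the time gap criterion concrete, this short numeric sketch evaluates the ratio described above. All values are hypothetical and are not taken from the test bench.

% Illustrative time-gap check (hypothetical values).
headwayDistance = 24;                   % distance to the lead vehicle, in meters
egoVelocity = 20;                       % ego vehicle speed, in meters per second
timeGap = headwayDistance/egoVelocity;  % 1.2 seconds
assert(timeGap > 0.8,"Time gap must be greater than 0.8 seconds");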

Component-Level Metrics

  • Lane Metrics — This block verifies that the distances between the detected lane boundaries and the ground truth data are within the thresholds specified in a test scenario.

  • Vehicle Detector Metrics — This block computes and logs true positives, false negatives, and false positives for the detections.

  • Sensor Fusion & Tracking Metrics — This subsystem computes the generalized optimal subpattern assignment (GOSPA) metric, localization error, missed target error, and false track error. For more information on these metrics, see the Forward Vehicle Sensor Fusion example.

Disable Runtime Visualizations

The system-level test bench model visualizes intermediate outputs during the simulation for the analysis of the different components in the model. These visualizations are not required when the tests are automated. You can reduce execution time for the automated testing by disabling them.

Disable runtime visualizations for the Lane Marker Detector subsystem.

load_system('LaneMarkerDetector');
blk = 'LaneMarkerDetector/Lane Marker Detector';
set_param(blk,'EnableDisplays','off');

Disable runtime visualizations for the Vehicle Detector subsystem.

load_system('VisionVehicleDetector');
blk = 'VisionVehicleDetector/Pack Detections/Pack Vehicle Detections';
set_param(blk,'EnableDisplay','off');

Configure the Simulation 3D Scene Configuration block to run the Unreal Engine in headless mode, where the 3D simulation window is disabled.

blk = ['HighwayLaneFollowingTestBench/Simulation 3D Scenario/', ...
      'Simulation 3D Scene Configuration'];
set_param(blk,'EnableWindow','off');

Automate Testing

The Test Manager is configured to automate the testing of the lane-following application. Open the HighwayLaneFollowingTestAssessments.mldatx test file in the Test Manager.

sltestmgr;
testFile = sltest.testmanager.load('HighwayLaneFollowingTestAssessments.mldatx');

Observe the populated test cases that were authored previously in this file. Each test case is linked to the corresponding requirement in the Requirements Editor for traceability. Each test case uses the post-load callback to run the setup script with appropriate inputs and to configure the output video file name. After simulating the test case, the Test Manager invokes helperTMTestCasePostProcessing from the cleanup callback to assess the performance of the overall system and of the individual components by generating the plots explained in the Highway Lane Following example.
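As a rough sketch, the post-load callback for one of these test cases typically calls the setup script with the scenario that the test case covers. The scenario name here is only an example.

% Example post-load callback body (scenario name is illustrative).
helperSLHighwayLaneFollowingSetup("scenarioFcnName","scenario_LF_01_Straight_RightLane");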

After simulation of the test case, Simulink Test also invokes these functions from the custom criteria callback to get additional metrics for the lane marker detector and vehicle detector components:

  • helperVerifyPrecisionAndSensitivity — Verifies that the precision and sensitivity metrics of the lane marker detector component are within the predefined threshold limits.

  • helperVerifyPrecisionAndMissRate — Verifies that the precision and miss rate metrics of the vehicle detector component are within the predefined threshold limits.

Run and explore results for a single test scenario:

To reduce command-window output, turn off the MPC update messages.

mpcverbosity('off');

To test the system-level model with the scenario_LFACC_03_Curve_StopnGo test scenario from Simulink Test, use this code:

testSuite = getTestSuiteByName(testFile,'Test Scenarios');
testCase = getTestCaseByName(testSuite,'scenario_LFACC_03_Curve_StopnGo');
resultObj = run(testCase);

To generate a report after the simulation, use this code:

sltest.testmanager.report(resultObj,'Report.pdf',...
    'Title','Highway Lane Following',...
    'IncludeMATLABFigures',true,...
    'IncludeErrorMessages',true,...
    'IncludeTestResults',0,'LaunchReport',true);

Examine the report Report.pdf. Observe that the Test Environment section shows the platform on which the test is run and the MATLAB® version used for testing. The Summary section shows the outcome of the test and the duration of the simulation in seconds. The Results section shows pass/fail results based on the assessment criteria. This section also shows the plots logged from the helperGenerateFilesForLaneFollowingReport function.

Run and explore results for all test scenarios:

You can simulate the system for all the tests by using sltest.testmanager.run. Alternatively, you can simulate the system by clicking Play in the Test Manager app.
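For example, this sketch runs every test case in the loaded test file and then exports a consolidated report. The report file name is illustrative, and running all scenarios can take a long time.

% Run all test cases in the loaded test file.
resultSet = sltest.testmanager.run;

% Export a consolidated PDF report for the full result set (file name is illustrative).
sltest.testmanager.report(resultSet,'AllTestScenariosReport.pdf',...
    'Title','Highway Lane Following',...
    'IncludeMATLABFigures',true,...
    'IncludeErrorMessages',true,...
    'IncludeTestResults',0,'LaunchReport',false);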

After completion of the test simulations, you can view the results for all the tests in the Results and Artifacts tab of the Test Manager. For each test case, the blocks in the model are associated with the Test Manager to visualize overall pass/fail results.

You can find the generated report in the current working directory. This report contains a detailed summary of pass/fail statuses and plots for each test case.

Verify test status in Requirements Editor:

Open the Requirements Editor and select Display. Then, select Verification Status to see a verification status summary for each requirement. Green and red bars indicate the pass/fail status of simulation results for each test.

Automate Testing with Generated Code

The HighwayLaneFollowingTestBench model enables integrated testing of the Lane Marker Detector, Vehicle Detector, Forward Vehicle Sensor Fusion, Lane Following Decision Logic, and Lane Following Controller components. It is often helpful to perform regression testing of these components through software-in-the-loop (SIL) verification. If you have Simulink Coder™ and Embedded Coder™ licenses, then you can generate code for these components. This workflow lets you verify that the generated code produces results that match the system-level requirements throughout simulation.

Set the Lane Marker Detector to run in software-in-the-loop mode.

model = 'HighwayLaneFollowingTestBench/Lane Marker Detector';
set_param(model,'SimulationMode','Software-in-the-loop');

Set the Vehicle Detector to run in software-in-the-loop mode.

model = 'HighwayLaneFollowingTestBench/Vehicle Detector';
set_param(model,'SimulationMode','Software-in-the-loop');

Set the Forward Vehicle Sensor Fusion to run in software-in-the-loop mode.

model = 'HighwayLaneFollowingTestBench/Forward Vehicle Sensor Fusion';
set_param(model,'SimulationMode','Software-in-the-loop');

Set the Lane Following Decision Logic to run in software-in-the-loop mode.

model = 'HighwayLaneFollowingTestBench/Lane Following Decision Logic';
set_param(model,'SimulationMode','Software-in-the-loop');

Set the Lane Following Controller to run in software-in-the-loop mode.

model = 'HighwayLaneFollowingTestBench/Lane Following Controller';
set_param(model,'SimulationMode','Software-in-the-loop');
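Equivalently, you can switch all five components to software-in-the-loop mode in a loop. This compact form is a sketch that assumes the block paths shown above.

% Set all five component variants to software-in-the-loop (SIL) mode.
components = {'Lane Marker Detector','Vehicle Detector', ...
    'Forward Vehicle Sensor Fusion','Lane Following Decision Logic', ...
    'Lane Following Controller'};
for k = 1:numel(components)
    set_param(['HighwayLaneFollowingTestBench/' components{k}], ...
        'SimulationMode','Software-in-the-loop');
end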

Now, run sltest.testmanager.run to simulate the system for all the test scenarios. After the completion of the tests, review the plots and results in the generated report.

Enable the MPC update messages again.

mpcverbosity('on');

Automate Testing in Parallel

If you have a Parallel Computing Toolbox™ license, then you can configure the Test Manager to execute tests in parallel using a parallel pool. To run the tests in parallel, save the models after disabling the runtime visualizations, as shown in the sketch below. The Test Manager uses the default Parallel Computing Toolbox cluster and executes tests only on the local machine. Running tests in parallel can speed up execution and decrease the amount of time it takes to get test results. For more information on how to configure tests in parallel from the Test Manager, see the Simulink Test documentation.
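These are the save commands referenced above, collected for convenience.

% Save the models so that parallel workers pick up the disabled visualizations.
save_system('LaneMarkerDetector');
save_system('VisionVehicleDetector');
save_system('HighwayLaneFollowingTestBench');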
