

inspectTrainingResult

Plot training information from a previous training session

Since R2021a

Description

By default, the train function shows the training progress and results in the Episode Manager during training. If you configure training to not show the Episode Manager, or if you close the Episode Manager after training, you can view the training results using the inspectTrainingResult function, which opens the Episode Manager. You can also use inspectTrainingResult to view the training results for agents saved during training.


inspectTrainingResult(trainResults) opens the Episode Manager and plots the training results from a previous training session.


inspectTrainingResult(agentResults) opens the Episode Manager and plots the training results from a previously saved agent structure.
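For context, the following is a minimal sketch of the typical workflow for the first syntax. It assumes you have already created an agent and an environment; agent, env, opts, and stats are placeholder names, not variables defined by the toolbox.

% agent and env are placeholders for an agent and environment you have
% already created.
opts = rlTrainingOptions;
opts.Plots = "none";            % suppress the Episode Manager during training
stats = train(agent,env,opts);

% Later, reopen the Episode Manager to review the completed session.
inspectTrainingResult(stats)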

Examples

For this example, assume that you have trained the agent in the Train Reinforcement Learning Agent in MDP Environment example and subsequently closed the Episode Manager.

Load the training information returned by the train function.

load MDPTrainingStats trainingStats

Reopen the Episode Manager for this training session.

inspectTrainingResult(trainingStats)

For this example, load the environment and agent for the Train Reinforcement Learning Agent in MDP Environment example.

load MDPAgentAndEnvironment

Specify options for training the agent. Configure the SaveAgentCriteria and SaveAgentValue options to save all agents after episode 30.

trainOpts = rlTrainingOptions;
trainOpts.MaxStepsPerEpisode = 50;
trainOpts.MaxEpisodes = 50;
trainOpts.Plots = "none";
trainOpts.SaveAgentCriteria = "EpisodeCount";
trainOpts.SaveAgentValue = 30;

Train the agent. During training, once the episode count reaches 30, a copy of the agent is saved in the savedAgents folder at the end of each episode.

rng("default") % for reproducibility
trainingstats = train(qagent,env,trainopts);

Load the training results for one of the saved agents. This command loads both the agent and a structure that contains the corresponding training results.

load savedAgents/Agent50

View the training results from the saved agent result structure.

inspectTrainingResult(savedAgentResult)

The Episode Manager shows the training progress up to the episode in which the agent was saved.

Input Arguments

trainResults — Training episode data, specified as a structure or structure array returned by the train function.
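Besides passing this structure to inspectTrainingResult, you can also work with its contents directly. The sketch below assumes the structure exposes episode-level fields such as EpisodeIndex and EpisodeReward; field names can vary by release, so confirm them with fieldnames before relying on them.

% Hypothetical field names; confirm with fieldnames(trainResults).
disp(fieldnames(trainResults))
plot(trainResults.EpisodeIndex,trainResults.EpisodeReward)
xlabel("Episode")
ylabel("Episode Reward")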

agentResults — Saved agent results, specified as a structure previously saved by the train function. The train function saves agents when you specify the SaveAgentCriteria and SaveAgentValue options in the rlTrainingOptions object used during training.

When you load a saved agent, the agent and its training results are added to the MATLAB® workspace as saved_agent and savedAgentResultStruct, respectively. To plot the training data for this agent, use the following command.

inspectTrainingResult(savedAgentResultStruct)

For multi-agent training, savedAgentResultStruct contains structure fields with training results for all the trained agents.
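As a hedged sketch only: assuming each field of savedAgentResultStruct holds the training results for one agent, you could loop over the fields and inspect each result in turn. The exact layout of the structure may differ from this assumption.

% Assumes each field holds one agent's training results (layout may differ).
agentNames = fieldnames(savedAgentResultStruct);
for k = 1:numel(agentNames)
    inspectTrainingResult(savedAgentResultStruct.(agentNames{k}))
end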

Version History

Introduced in R2021a
