
plot

Plot results of local interpretable model-agnostic explanations (LIME)

Since R2020b

Description


f = plot(results) visualizes the LIME results in the lime object results. The function returns the Figure object f. Use f to query or modify properties of the figure after it is created, as in the sketch after the following list.

  • The figure contains a horizontal bar graph that shows the coefficient values of a linear simple model or the predictor importance values of a decision tree simple model, depending on the simple model in results (SimpleModel property of results).

  • The figure displays two predictions for the query point, computed using the machine learning model and the simple model, respectively. These values correspond to the BlackboxFitted property and the SimpleModelFitted property of results.
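
For example, here is a minimal sketch of working with the returned Figure object; the specific property values shown (the figure name and font size) are illustrative choices, not requirements:

f = plot(results);            % create the figure and keep its handle
f.Name = 'LIME Explanation';  % modify a property of the figure
ax = f.CurrentAxes;           % query the axes that contain the bar graph
ax.FontSize = 12;             % adjust the appearance of the axes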

Examples

Train a classification model and create a lime object that uses a decision tree simple model. When you create a lime object, specify a query point and the number of important predictors so that the software generates samples of a synthetic data set and fits a simple model for the query point with important predictors. Then display the estimated predictor importance in the simple model by using the object function plot.

Load the CreditRating_Historical data set. The data set contains customer IDs and their financial ratios, industry labels, and credit ratings.

tbl = readtable('CreditRating_Historical.dat');

Display the first three rows of the table.

head(tbl,3)

     ID      WC_TA    RE_TA    EBIT_TA    MVE_BVTD    S_TA     Industry    Rating
    _____    _____    _____    _______    ________    _____    ________    ______
    62394    0.013    0.104     0.036      0.447      0.142       3        {'BB'}
    48608    0.232    0.335     0.062      1.969      0.281       8        {'A' }
    42444    0.311    0.367     0.074      1.935      0.366       1        {'A' }

Create a table of predictor variables by removing the columns of customer IDs and ratings from tbl.

tblX = removevars(tbl,["ID","Rating"]);

Train a blackbox model of credit ratings by using the fitcecoc function.

blackbox = fitcecoc(tblX,tbl.Rating,'CategoricalPredictors','Industry');

Create a lime object that explains the prediction for the last observation using a decision tree simple model. Specify 'NumImportantPredictors' as six to find at most six important predictors. If you specify the 'QueryPoint' and 'NumImportantPredictors' values when you create a lime object, then the software generates samples of a synthetic data set and fits a simple interpretable model to the synthetic data set.

queryPoint = tblX(end,:)

queryPoint=1×6 table
    WC_TA    RE_TA    EBIT_TA    MVE_BVTD    S_TA    Industry
    _____    _____    _______    ________    ____    ________
    0.239    0.463     0.065      2.924      0.34       2    

rng('default') % For reproducibility
results = lime(blackbox,'QueryPoint',queryPoint,'NumImportantPredictors',6, ...
    'SimpleModelType','tree')
results = 
  lime with properties:

             BlackboxModel: [1x1 ClassificationECOC]
              DataLocality: 'global'
     CategoricalPredictors: 6
                      Type: 'classification'
                         X: [3932x6 table]
                QueryPoint: [1x6 table]
    NumImportantPredictors: 6
          NumSyntheticData: 5000
             SyntheticData: [5000x6 table]
                    Fitted: {5000x1 cell}
               SimpleModel: [1x1 ClassificationTree]
       ImportantPredictors: [2x1 double]
            BlackboxFitted: {'AA'}
         SimpleModelFitted: {'AA'}

Plot the lime object results by using the object function plot. To display an existing underscore in any predictor name, change the TickLabelInterpreter value of the axes to 'none'.

f = plot(results);
f.CurrentAxes.TickLabelInterpreter = 'none';

Figure: horizontal bar graph titled "LIME with Decision Tree Model"; x-axis: Predictor Importance; y-axis: Predictor.

The plot displays two predictions for the query point, which correspond to the BlackboxFitted property and the SimpleModelFitted property of results.
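
To confirm these two values without reading the plot, you can display the corresponding properties directly (a quick check; for this query point both are {'AA'}):

[results.BlackboxFitted results.SimpleModelFitted]    % 1x2 cell array, {'AA'} {'AA'}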

The horizontal bar graph shows the sorted predictor importance values. LIME finds the financial ratio variables MVE_BVTD and RE_TA as important predictors for the query point.
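
You can also recover the selected predictor names programmatically by indexing the predictor variable names with the ImportantPredictors property; a minimal sketch:

idx = results.ImportantPredictors;          % indices of the important predictors
results.X.Properties.VariableNames(idx)     % names such as MVE_BVTD and RE_TA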

You can read the bar lengths by using data tips or Bar object properties. For example, you can find Bar objects by using the findobj function and add labels to the ends of the bars by using the text function.

b = findobj(f,'Type','bar');
text(b.YEndPoints+0.001,b.XEndPoints,string(b.YData))

Figure: horizontal bar graph titled "LIME with Decision Tree Model" with value labels at the ends of the bars; x-axis: Predictor Importance; y-axis: Predictor.

Alternatively, you can display the predictor importance values in a table with the predictor variable names.

imp = b.YData;
flipud(array2table(imp', ...
    'RowNames',f.CurrentAxes.YTickLabel,'VariableNames',{'Predictor Importance'}))
ans=2×1 table
                Predictor Importance
                ____________________
    MVE_BVTD          0.088412      
    RE_TA            0.0018061      

Train a regression model and create a lime object that uses a linear simple model. When you create a lime object, if you do not specify a query point and the number of important predictors, then the software generates samples of a synthetic data set but does not fit a simple model. Use the object function fit to fit a simple model for a query point. Then display the coefficients of the fitted linear simple model by using the object function plot.

Load the carbig data set, which contains measurements of cars made in the 1970s and early 1980s.

load carbig

Create a table containing the predictor variables Acceleration, Cylinders, and so on, as well as the response variable MPG.

tbl = table(Acceleration,Cylinders,Displacement,Horsepower,Model_Year,Weight,MPG);

Removing missing values in a training set can help reduce memory consumption and speed up training for the fitrkernel function. Remove missing values in tbl.

tbl = rmmissing(tbl);

Create a table of predictor variables by removing the response variable from tbl.

tblX = removevars(tbl,'MPG');

Train a blackbox model of MPG by using the fitrkernel function.

rng('default') % For reproducibility
mdl = fitrkernel(tblX,tbl.MPG,'CategoricalPredictors',[2 5]);

Create a lime object. Specify a predictor data set because mdl does not contain predictor data.

results = lime(mdl,tblX)

results = 
  lime with properties:

             BlackboxModel: [1x1 RegressionKernel]
              DataLocality: 'global'
     CategoricalPredictors: [2 5]
                      Type: 'regression'
                         X: [392x6 table]
                QueryPoint: []
    NumImportantPredictors: []
          NumSyntheticData: 5000
             SyntheticData: [5000x6 table]
                    Fitted: [5000x1 double]
               SimpleModel: []
       ImportantPredictors: []
            BlackboxFitted: []
         SimpleModelFitted: []

results contains the generated synthetic data set. The SimpleModel property is empty ([]).
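
One quick way to check whether a simple model has been fitted yet is to test the SimpleModel property (a minimal sketch):

isempty(results.SimpleModel)    % returns logical 1 (true) until you call fit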

Fit a linear simple model for the first observation in tblX. Specify the number of important predictors to find as 3.

queryPoint = tblX(1,:)

queryPoint=1×6 table
    Acceleration    Cylinders    Displacement    Horsepower    Model_Year    Weight
    ____________    _________    ____________    __________    __________    ______
         12             8            307            130            70         3504 

results = fit(results,queryPoint,3);

Plot the lime object results by using the object function plot. To display an existing underscore in any predictor name, change the TickLabelInterpreter value of the axes to 'none'.

f = plot(results);
f.CurrentAxes.TickLabelInterpreter = 'none';

Figure: horizontal bar graph titled "LIME with Linear Model"; x-axis: Coefficient; y-axis: Predictor.

The plot displays two predictions for the query point, which correspond to the BlackboxFitted property and the SimpleModelFitted property of results.

The horizontal bar graph shows the coefficient values of the simple model, sorted by their absolute values. LIME finds Horsepower, Model_Year, and Cylinders as important predictors for the query point.

Model_Year and Cylinders are categorical predictors that have multiple categories. For a linear simple model, the software creates one fewer dummy variable than the number of categories for each categorical predictor. The bar graph displays only the most important dummy variable. You can check the coefficients of the other dummy variables using the SimpleModel property of results. Display the sorted coefficient values, including all categorical dummy variables.

[~,I] = sort(abs(results.SimpleModel.Beta),'descend');
table(results.SimpleModel.ExpandedPredictorNames(I)',results.SimpleModel.Beta(I), ...
    'VariableNames',{'Expanded Predictor Name','Coefficient'})
ans=17×2 table
     Expanded Predictor Name      Coefficient
    __________________________    ___________
    {'Cylinders (5 vs. 8)'   }        0.18008
    {'Model_Year (74 vs. 70)'}      -0.082499
    {'Model_Year (80 vs. 70)'}      -0.052277
    {'Model_Year (81 vs. 70)'}       0.035987
    {'Model_Year (82 vs. 70)'}      -0.026442
    {'Model_Year (71 vs. 70)'}       0.014736
    {'Model_Year (76 vs. 70)'}       0.014723
    {'Model_Year (75 vs. 70)'}       0.013979
    {'Model_Year (77 vs. 70)'}       0.012762
    {'Model_Year (78 vs. 70)'}      0.0089647
    {'Cylinders (6 vs. 8)'   }      -0.006972
    {'Model_Year (79 vs. 70)'}     -0.0058682
    {'Model_Year (72 vs. 70)'}       0.005654
    {'Cylinders (3 vs. 8)'   }     -0.0023194
    {'Horsepower'            }    -0.00021074
    {'Cylinders (4 vs. 8)'   }     0.00014773
      ⋮

Input Arguments

results — LIME results
lime object

LIME results, specified as a lime object. The SimpleModel property of results must contain a fitted simple model.
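
If results was created without a query point, its SimpleModel property is empty and does not yet satisfy this requirement. A defensive sketch (queryPoint and numPredictors are hypothetical values you supply):

% Fit a simple model first if none has been fitted yet.
% queryPoint and numPredictors are placeholders for your own values.
if isempty(results.SimpleModel)
    results = fit(results,queryPoint,numPredictors);
end
f = plot(results);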

References

[1] Ribeiro, Marco Tulio, Sameer Singh, and Carlos Guestrin. "'Why Should I Trust You?': Explaining the Predictions of Any Classifier." In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 1135–1144. San Francisco, California: ACM, 2016.

Version History

Introduced in R2020b
