
Word-By-Word Text Generation Using Deep Learning

This example shows how to train a deep learning LSTM network to generate text word-by-word.

To train a deep learning network for word-by-word text generation, train a sequence-to-sequence LSTM network to predict the next word in a sequence of words. To train the network to predict the next word, specify the responses to be the input sequences shifted by one time step.
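For example, the following sketch (using hypothetical words, not the actual encoding constructed later in this example) illustrates how shifting a sequence by one time step pairs each input word with the next word as its target:

% A minimal sketch with hypothetical words: the response sequence is the
% input sequence shifted by one time step, so at every time step the
% target is the next word.
words = ["startOfText" "down" "went" "alice" "endOfText"];
X = words(1:end-1)   % predictors: startOfText, down, went, alice
T = words(2:end)     % responses:  down, went, alice, endOfText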

This example reads text from a website. It reads and parses the HTML code to extract the relevant text, then uses a custom mini-batch datastore documentGenerationDatastore to input the documents to the network as mini-batches of sequence data. The datastore converts documents to sequences of numeric word indices. The deep learning network is an LSTM network that contains a word embedding layer.

A mini-batch datastore is an implementation of a datastore with support for reading data in batches. You can use a mini-batch datastore as a source of training, validation, test, and prediction data sets for deep learning applications. Use mini-batch datastores to read out-of-memory data or to perform specific preprocessing operations when reading batches of data.

You can adapt the custom mini-batch datastore specified by documentGenerationDatastore.m to your data by customizing the functions. This file is attached to this example as a supporting file. To access this file, open the example as a live script. For an example showing how to create your own custom mini-batch datastore, see Develop Custom Mini-Batch Datastore (Deep Learning Toolbox).
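As a rough guide, a custom mini-batch datastore follows the pattern sketched below. This is a minimal, hypothetical in-memory class (exampleMiniBatchDatastore) that illustrates the matlab.io.Datastore and matlab.io.datastore.MiniBatchable interface; it is not the contents of the attached documentGenerationDatastore.m file.

classdef exampleMiniBatchDatastore < matlab.io.Datastore & ...
        matlab.io.datastore.MiniBatchable
    % Sketch of the mini-batch datastore pattern: store in-memory
    % predictor and response sequences and serve them as tables of
    % mini-batches. (Hypothetical class for illustration only.)

    properties
        MiniBatchSize        % Number of observations returned per read
    end

    properties (SetAccess = protected)
        NumObservations      % Total number of observations
    end

    properties (Access = private)
        Predictors
        Responses
        CurrentIndex
    end

    methods
        function ds = exampleMiniBatchDatastore(X,T)
            % Store the data and initialize the read position.
            ds.Predictors = X;
            ds.Responses = T;
            ds.NumObservations = numel(X);
            ds.MiniBatchSize = 32;
            ds.CurrentIndex = 1;
        end

        function tf = hasdata(ds)
            % Return true while unread observations remain.
            tf = ds.CurrentIndex <= ds.NumObservations;
        end

        function [data,info] = read(ds)
            % Return one mini-batch as a table with predictors in the
            % first column and responses in the last column.
            idx = ds.CurrentIndex : ...
                min(ds.CurrentIndex + ds.MiniBatchSize - 1,ds.NumObservations);
            data = table(ds.Predictors(idx)',ds.Responses(idx)', ...
                'VariableNames',{'predictors','responses'});
            info = struct;
            ds.CurrentIndex = idx(end) + 1;
        end

        function reset(ds)
            % Rewind the datastore to the first observation.
            ds.CurrentIndex = 1;
        end

        function frac = progress(ds)
            % Return the fraction of observations read so far.
            frac = (ds.CurrentIndex - 1)/ds.NumObservations;
        end
    end
end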

Load Training Data

Load the training data. Read the HTML code for "Alice's Adventures in Wonderland" by Lewis Carroll from Project Gutenberg.

url = "https://www.gutenberg.org/files/11/11-h/11-h.htm";
code = webread(url);

Parse HTML Code

The HTML code contains the relevant text inside <p> (paragraph) elements. Extract the relevant text by parsing the HTML code using htmlTree and then finding all the elements with element name "p".

tree = htmlTree(code);
selector = "p";
subtrees = findElement(tree,selector);

Extract the text data from the HTML subtrees using extractHTMLText and view the first 10 paragraphs.

textData = extractHTMLText(subtrees);
textData(1:10)
ans = 10×1 string
    "Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, “and what is the use of a book,” thought Alice “without pictures or conversations?”"
    "So she was considering in her own mind (as well as she could, for the hot day made her feel very sleepy and stupid), whether the pleasure of making a daisy-chain would be worth the trouble of getting up and picking the daisies, when suddenly a White Rabbit with pink eyes ran close by her."
    "There was nothing so very remarkable in that; nor did Alice think it so very much out of the way to hear the Rabbit say to itself, “Oh dear! Oh dear! I shall be late!” (when she thought it over afterwards, it occurred to her that she ought to have wondered at this, but at the time it all seemed quite natural); but when the Rabbit actually took a watch out of its waistcoat-pocket, and looked at it, and then hurried on, Alice started to her feet, for it flashed across her mind that she had never before seen a rabbit with either a waistcoat-pocket, or a watch to take out of it, and burning with curiosity, she ran across the field after it, and fortunately was just in time to see it pop down a large rabbit-hole under the hedge."
    "In another moment down went Alice after it, never once considering how in the world she was to get out again."
    "The rabbit-hole went straight on like a tunnel for some way, and then dipped suddenly down, so suddenly that Alice had not a moment to think about stopping herself before she found herself falling down a very deep well."
    "Either the well was very deep, or she fell very slowly, for she had plenty of time as she went down to look about her and to wonder what was going to happen next. First, she tried to look down and make out what she was coming to, but it was too dark to see anything; then she looked at the sides of the well, and noticed that they were filled with cupboards and book-shelves; here and there she saw maps and pictures hung upon pegs. She took down a jar from one of the shelves as she passed; it was labelled “ORANGE MARMALADE”, but to her great disappointment it was empty: she did not like to drop the jar for fear of killing somebody underneath, so managed to put it into one of the cupboards as she fell past it."
    "“Well!” thought Alice to herself, “after such a fall as this, I shall think nothing of tumbling down stairs! How brave they’ll all think me at home! Why, I wouldn’t say anything about it, even if I fell off the top of the house!” (Which was very likely true.)"
    "Down, down, down. Would the fall never come to an end? “I wonder how many miles I’ve fallen by this time?” she said aloud. “I must be getting somewhere near the centre of the earth. Let me see: that would be four thousand miles down, I think-” (for, you see, Alice had learnt several things of this sort in her lessons in the schoolroom, and though this was not a very good opportunity for showing off her knowledge, as there was no one to listen to her, still it was good practice to say it over) “-yes, that’s about the right distance-but then I wonder what Latitude or Longitude I’ve got to?” (Alice had no idea what Latitude was, or Longitude either, but thought they were nice grand words to say.)"
    "Presently she began again. “I wonder if I shall fall right through the earth! How funny it’ll seem to come out among the people that walk with their heads downward! The Antipathies, I think-” (she was rather glad there was no one listening, this time, as it didn’t sound at all the right word) “-but I shall have to ask them what the name of the country is, you know. Please, Ma’am, is this New Zealand or Australia?” (and she tried to curtsey as she spoke-fancy curtseying as you’re falling through the air! Do you think you could manage it?) “And what an ignorant little girl she’ll think me for asking! No, it’ll never do to ask: perhaps I shall see it written up somewhere.”"
    "Down, down, down. There was nothing else to do, so Alice soon began talking again. “Dinah’ll miss me very much to-night, I should think!” (Dinah was the cat.) “I hope they’ll remember her saucer of milk at tea-time. Dinah my dear! I wish you were down here with me! There are no mice in the air, I’m afraid, but you might catch a bat, and that’s very like a mouse, you know. But do cats eat bats, I wonder?” And here Alice began to get rather sleepy, and went on saying to herself, in a dreamy sort of way, “Do cats eat bats? Do cats eat bats?” and sometimes, “Do bats eat cats?” for, you see, as she couldn’t answer either question, it didn’t much matter which way she put it. She felt that she was dozing off, and had just begun to dream that she was walking hand in hand with Dinah, and saying to her very earnestly, “Now, Dinah, tell me the truth: did you ever eat a bat?” when suddenly, thump! thump! down she came upon a heap of sticks and dry leaves, and the fall was over."

Remove the empty paragraphs and view the first 10 remaining paragraphs.

textData(textData == "") = [];
textData(1:10)
ans = 10×1 string
    "Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, “and what is the use of a book,” thought Alice “without pictures or conversations?”"
    "So she was considering in her own mind (as well as she could, for the hot day made her feel very sleepy and stupid), whether the pleasure of making a daisy-chain would be worth the trouble of getting up and picking the daisies, when suddenly a White Rabbit with pink eyes ran close by her."
    "There was nothing so very remarkable in that; nor did Alice think it so very much out of the way to hear the Rabbit say to itself, “Oh dear! Oh dear! I shall be late!” (when she thought it over afterwards, it occurred to her that she ought to have wondered at this, but at the time it all seemed quite natural); but when the Rabbit actually took a watch out of its waistcoat-pocket, and looked at it, and then hurried on, Alice started to her feet, for it flashed across her mind that she had never before seen a rabbit with either a waistcoat-pocket, or a watch to take out of it, and burning with curiosity, she ran across the field after it, and fortunately was just in time to see it pop down a large rabbit-hole under the hedge."
    "In another moment down went Alice after it, never once considering how in the world she was to get out again."
    "The rabbit-hole went straight on like a tunnel for some way, and then dipped suddenly down, so suddenly that Alice had not a moment to think about stopping herself before she found herself falling down a very deep well."
    "Either the well was very deep, or she fell very slowly, for she had plenty of time as she went down to look about her and to wonder what was going to happen next. First, she tried to look down and make out what she was coming to, but it was too dark to see anything; then she looked at the sides of the well, and noticed that they were filled with cupboards and book-shelves; here and there she saw maps and pictures hung upon pegs. She took down a jar from one of the shelves as she passed; it was labelled “ORANGE MARMALADE”, but to her great disappointment it was empty: she did not like to drop the jar for fear of killing somebody underneath, so managed to put it into one of the cupboards as she fell past it."
    "“Well!” thought Alice to herself, “after such a fall as this, I shall think nothing of tumbling down stairs! How brave they’ll all think me at home! Why, I wouldn’t say anything about it, even if I fell off the top of the house!” (Which was very likely true.)"
    "Down, down, down. Would the fall never come to an end? “I wonder how many miles I’ve fallen by this time?” she said aloud. “I must be getting somewhere near the centre of the earth. Let me see: that would be four thousand miles down, I think-” (for, you see, Alice had learnt several things of this sort in her lessons in the schoolroom, and though this was not a very good opportunity for showing off her knowledge, as there was no one to listen to her, still it was good practice to say it over) “-yes, that’s about the right distance-but then I wonder what Latitude or Longitude I’ve got to?” (Alice had no idea what Latitude was, or Longitude either, but thought they were nice grand words to say.)"
    "Presently she began again. “I wonder if I shall fall right through the earth! How funny it’ll seem to come out among the people that walk with their heads downward! The Antipathies, I think-” (she was rather glad there was no one listening, this time, as it didn’t sound at all the right word) “-but I shall have to ask them what the name of the country is, you know. Please, Ma’am, is this New Zealand or Australia?” (and she tried to curtsey as she spoke-fancy curtseying as you’re falling through the air! Do you think you could manage it?) “And what an ignorant little girl she’ll think me for asking! No, it’ll never do to ask: perhaps I shall see it written up somewhere.”"
    "Down, down, down. There was nothing else to do, so Alice soon began talking again. “Dinah’ll miss me very much to-night, I should think!” (Dinah was the cat.) “I hope they’ll remember her saucer of milk at tea-time. Dinah my dear! I wish you were down here with me! There are no mice in the air, I’m afraid, but you might catch a bat, and that’s very like a mouse, you know. But do cats eat bats, I wonder?” And here Alice began to get rather sleepy, and went on saying to herself, in a dreamy sort of way, “Do cats eat bats? Do cats eat bats?” and sometimes, “Do bats eat cats?” for, you see, as she couldn’t answer either question, it didn’t much matter which way she put it. She felt that she was dozing off, and had just begun to dream that she was walking hand in hand with Dinah, and saying to her very earnestly, “Now, Dinah, tell me the truth: did you ever eat a bat?” when suddenly, thump! thump! down she came upon a heap of sticks and dry leaves, and the fall was over."

Visualize the text data in a word cloud.

figure
wordcloud(textData);
title("Alice's Adventures in Wonderland")

Prepare Data for Training

Create a datastore that contains the data for training using documentGenerationDatastore. For the predictors, this datastore converts the documents into sequences of word indices using a word encoding. The first word index for each document corresponds to a "start of text" token. The "start of text" token is given by the string "startOfText". For the responses, the datastore returns categorical sequences of the words shifted by one.

Tokenize the text data using tokenizedDocument.

documents = tokenizedDocument(textData);

Create a document generation datastore using the tokenized documents.

ds = documentGenerationDatastore(documents);

To reduce the amount of padding added to the sequences, sort the documents in the datastore by sequence length.

ds = sort(ds);
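Before training, you can sanity-check the datastore by previewing one mini-batch. This sketch assumes the datastore follows the mini-batch datastore pattern, where read returns a table of predictors and responses; reset the datastore afterward so that training starts from the first observation.

data = read(ds);   % One mini-batch: a table of predictors and responses
head(data)
reset(ds)          % Rewind so training reads the data from the beginning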

Create and Train LSTM Network

Define the LSTM network architecture. To input sequence data into the network, include a sequence input layer and set the input size to 1. Next, include a word embedding layer of dimension 100 and the same number of words as the word encoding. Next, include an LSTM layer with 100 hidden units, followed by a dropout layer with dropout probability 0.2 to help reduce overfitting. Finally, add a fully connected layer with the same size as the number of classes, a softmax layer, and a classification layer. The number of classes is the number of words in the vocabulary plus one extra class for the "end of text" token.

inputSize = 1;
embeddingDimension = 100;
numWords = numel(ds.Encoding.Vocabulary);
numClasses = numWords + 1;
layers = [ 
    sequenceInputLayer(inputSize)
    wordEmbeddingLayer(embeddingDimension,numWords)
    lstmLayer(100)
    dropoutLayer(0.2)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

Specify the training options. Specify the solver to be 'adam'. Train for 300 epochs with learning rate 0.01. Set the mini-batch size to 32. To keep the data sorted by sequence length, set the 'Shuffle' option to 'never'. To monitor the training progress, set the 'Plots' option to 'training-progress'. To suppress verbose output, set 'Verbose' to false.

options = trainingOptions('adam', ...
    'MaxEpochs',300, ...
    'InitialLearnRate',0.01, ...
    'MiniBatchSize',32, ...
    'Shuffle','never', ...
    'Plots','training-progress', ...
    'Verbose',false);

Train the network using trainNetwork.

net = trainNetwork(ds,layers,options);

Generate New Text

Generate the first word of the text by sampling a word from a probability distribution according to the first words of the text in the training data. Generate the remaining words by using the trained LSTM network to predict the next time step using the current sequence of generated text. Keep generating words one-by-one until the network predicts the "end of text" word.

To make the first prediction using the network, input the index that represents the "start of text" token. Find the index by using the word2ind function with the word encoding used by the document datastore.

enc = ds.Encoding;
wordIndex = word2ind(enc,"startOfText")
wordIndex = 1

For the remaining predictions, sample the next word according to the prediction scores of the network. The prediction scores represent the probability distribution of the next word. Sample the words from the vocabulary given by the class names of the output layer of the network.

vocabulary = string(net.Layers(end).Classes);

Make predictions word by word using predictAndUpdateState. For each prediction, input the index of the previous word. Stop predicting when the network predicts the "end of text" word or when the generated text is 500 characters long. For large collections of data, long sequences, or large networks, predictions on the GPU are usually faster to compute than predictions on the CPU. Otherwise, predictions on the CPU are usually faster to compute. For single time step predictions, use the CPU. To use the CPU for prediction, set the 'ExecutionEnvironment' option of predictAndUpdateState to 'cpu'.

generatedtext = "";
maxlength = 500;
while strlength(generatedtext) < maxlength
    % predict the next word scores.
    [net,wordscores] = predictandupdatestate(net,wordindex,'executionenvironment','cpu');
    
    % sample the next word.
    newword = datasample(vocabulary,1,'weights',wordscores);
    
    % stop predicting at the end of text.
    if newword == "endoftext"
        break
    end
    
    % add the word to the generated text.
    generatedtext = generatedtext   " "   newword;
    
    % find the word index for the next input.
    wordindex = word2ind(enc,newword);
end

The generation process introduces whitespace characters between each prediction, which means that some punctuation characters appear with unnecessary spaces before and after. Reconstruct the generated text by removing the spaces before and after the appropriate punctuation characters.

Remove the spaces that appear before the specified punctuation characters.

punctuationcharacters = ["." "," "’" ")" ":" "?" "!"];
generatedtext = replace(generatedtext," "   punctuationcharacters,punctuationcharacters);

Remove the spaces that appear after the specified punctuation characters.

punctuationcharacters = ["(" "‘"];
generatedtext = replace(generatedtext,punctuationcharacters   " ",punctuationcharacters)
generatedtext = 
" “ just about as much right, ” said the duchess, “ and that’s all the least, ” said the hatter. “ fetch me to my witness at the shepherd heart of him."

To generate multiple pieces of text, reset the network state between generations using resetState.

net = resetState(net);
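For example, a sketch of generating several independent samples by resetting the state before each generation. Here, generateText is a hypothetical helper, not a function defined in this example, that wraps the word-by-word sampling loop and the detokenization steps shown above and returns the generated string.

% Sketch: generate three independent pieces of text, resetting the
% network state between generations. generateText is a hypothetical
% helper that wraps the sampling loop and detokenization shown above.
for i = 1:3
    net = resetState(net);
    generatedText = generateText(net,enc,vocabulary)
end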
