Long Short-Term Memory (LSTM)

A long short-term memory (LSTM) network is a type of recurrent neural network (RNN). LSTMs are predominantly used to learn, process, and classify sequential data because these networks can learn long-term dependencies between time steps of data. Common LSTM applications include sentiment analysis, language modeling, speech recognition, and video analysis.

LSTM Applications and Examples

The examples below use MATLAB® to apply LSTMs in specific applications. Beginners can get started with LSTM networks through a simple sequence classification example; see the sketch after the list below.

Radar Target Classification

Classify radar returns using a long short-term memory (LSTM) recurrent neural network in MATLAB.

Keyword Spotting

Wake up a system when a user speaks a predefined keyword.

Text Generation

Train a deep learning LSTM network to generate text word by word.

Classifying ECG Signals

Categorize ECG signals, which record the electrical activity of a person's heart over time, as normal or AFib.

Water Distribution System Scheduling

Generate an optimal pump scheduling policy for a water distribution system using reinforcement learning (RL).

Video Classification

Classify video by combining a pretrained image classification model and an LSTM network.
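
As a minimal sketch of such a starting example (assuming Deep Learning Toolbox™ is installed; the data here is synthetic and the layer sizes are arbitrary choices for illustration, not taken from any shipped example), the following MATLAB code trains an LSTM network to classify variable-length sequences:

% Minimal LSTM sequence classification sketch.
% Synthetic data for illustration only; sizes are arbitrary.
numObservations = 100;              % number of training sequences
inputSize       = 3;                % features per time step
numHiddenUnits  = 50;               % LSTM hidden units
numClasses      = 2;                % number of target classes

XTrain = cell(numObservations,1);   % each cell: inputSize-by-seqLength matrix
for i = 1:numObservations
    XTrain{i} = randn(inputSize, randi([20 40]));  % variable-length sequence
end
YTrain = categorical(randi(numClasses, numObservations, 1));

layers = [ ...
    sequenceInputLayer(inputSize)
    lstmLayer(numHiddenUnits, 'OutputMode', 'last')  % keep only the final time step
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', 'MaxEpochs', 5, 'Verbose', false);
net = trainNetwork(XTrain, YTrain, layers, options);

The 'OutputMode','last' setting makes the LSTM layer return only its final time step, which is the usual choice when classifying a whole sequence with a single label.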

Technical Features of RNNs and LSTMs

LSTM networks are a specialized form of the RNN architecture. This section highlights the differences between the two architectures and the advantages of LSTMs.

Basic structure of a recurrent neural network (RNN).

In practice, simple RNNs are limited in their capacity to learn longer-term dependencies. RNNs are commonly trained through backpropagation, during which they may experience either a "vanishing" or "exploding" gradient problem. These problems cause the gradients used to update the network weights to become either very small or very large, limiting the network's effectiveness in applications that require it to learn long-term relationships.
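
To see where these problems come from, consider backpropagation through time: the gradient of the loss with respect to an early hidden state is a product of per-step Jacobians (a standard derivation, sketched here in generic notation):

\[
\frac{\partial \mathcal{L}}{\partial h_k} \;=\; \frac{\partial \mathcal{L}}{\partial h_T} \prod_{t=k+1}^{T} \frac{\partial h_t}{\partial h_{t-1}}
\]

If the norms of these Jacobians are consistently below 1, the product shrinks exponentially with the time gap T − k (vanishing gradient); if they are consistently above 1, it grows exponentially (exploding gradient).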

To overcome this issue, LSTM networks use additional gates to control what information in the hidden cell is exported as output and carried to the next hidden state. These additional gates allow the network to learn long-term relationships in the data more effectively. Lower sensitivity to the time gap makes LSTM networks better suited than simple RNNs for analyzing sequential data.

In addition to the hidden state in traditional RNNs, the architecture of an LSTM block typically includes a memory cell, an input gate, an output gate, and a forget gate, as shown below.

Compared with an RNN, the long short-term memory (LSTM) architecture has more gates to control information flow.

The weights and biases of the input gate control the extent to which a new value flows into the cell. Similarly, the weights and biases of the forget gate and output gate control the extent to which a value remains in the cell and the extent to which the value in the cell is used to compute the output activation of the LSTM block, respectively.
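
In one common formulation (notation varies across references; \(\sigma\) is the logistic sigmoid and \(\odot\) is the elementwise product), the LSTM block computes:

\[
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)}\\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate cell value)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(memory cell update)}\\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden state and output)}
\end{aligned}
\]

Because the memory cell \(c_t\) is updated additively, with the forget gate scaling the previous cell value directly, gradients can pass through many time steps without the repeated squashing that causes vanishing gradients in simple RNNs.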

For more details on LSTM networks, see Deep Learning Toolbox™.


Software Reference

See also: MATLAB for Deep Learning, Machine Learning, MATLAB for Data Science, GPU Computing, Artificial Intelligence

Watch this series of MATLAB Tech Talks to explore key deep learning concepts. Learn to identify when to use deep learning, discover which approaches are suitable for your application, and explore some of the challenges you might encounter.