Paper: Recurrent Neural Network for Text Classification with Multi-Task Learning
1 Introduction
Drawback of DNNs: they usually need a large-scale corpus due to their large number of parameters, and it is hard to train a network that generalizes well with limited data.
The first model uses just one shared layer for all the tasks (sketched after this list).
The second model uses different layers for different tasks, but each layer can read information from other layers.
The third model not only assigns one specific layer for each task, but also builds a shared layer for all the tasks.
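As a concrete illustration of the first (fully shared) scheme, here is a minimal PyTorch sketch. This is my reconstruction, not the paper's code: the layer sizes, task names, and class counts are made-up assumptions. The key point is that one LSTM layer is updated by every task, while each task keeps its own output classifier.

```python
import torch
import torch.nn as nn

class SharedLSTMMultiTask(nn.Module):
    """Sketch of Model I: a single LSTM layer shared by all tasks,
    with one task-specific classifier ("head") per task.
    Hyperparameters and task names below are illustrative assumptions."""

    def __init__(self, vocab_size, num_classes_per_task, embed_dim=128, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # The shared layer: every task's text passes through this same LSTM.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Task-specific output layers (hypothetical task names / class counts).
        self.heads = nn.ModuleDict({
            task: nn.Linear(hidden_dim, n) for task, n in num_classes_per_task.items()
        })

    def forward(self, token_ids, task):
        emb = self.embed(token_ids)               # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(emb)              # final hidden state: (1, batch, hidden_dim)
        return self.heads[task](h_n.squeeze(0))   # class logits for the chosen task


# Usage: each batch comes from one task; the shared LSTM is updated by all tasks.
model = SharedLSTMMultiTask(vocab_size=10000,
                            num_classes_per_task={"sentiment": 2, "topic": 5})
logits = model(torch.randint(0, 10000, (4, 20)), task="sentiment")
```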
2 Recurrent Neural Network for Specific-Task Text Classification
2.1 Recurrent Neural Network
Drawback: the vanishing gradient problem.
LSTM: designed for learning long-term dependencies.
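For reference, a standard LSTM formulation (one common variant; the paper's exact notation may differ slightly). The input, forget, and output gates control what the cell state keeps, which is what lets gradients flow across long spans:

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i)        && \text{input gate} \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f)        && \text{forget gate} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o)        && \text{output gate} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{candidate cell state} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t  && \text{cell update} \\
h_t &= o_t \odot \tanh(c_t)                        && \text{hidden output}
\end{aligned}
```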
Paper: A Bi-LSTM-RNN Model for Relation Classification Using Low-Cost Sequence Features
1. Introduction
① Performs bi-directional recurrent computation along all the tokens of the sentences that the relation spans.
② The sequence of token representations generated in the previous step is divided into five parts, according to the order in which the tokens occur in these sentences (see Figure 1).
③ Standard pooling functions are applied over the token representations of each part, yielding five representations corresponding to the five parts.
④ These are concatenated and fed into a softmax layer for relation classification (a sketch follows this list).
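A minimal PyTorch sketch of steps ①–④. This is my reconstruction from the description, not the authors' code: the segment boundary conventions and the choice of max pooling are assumptions (the paper's "standard pooling functions" may combine max, mean, etc.).

```python
import torch
import torch.nn as nn

class BiLSTMRelation(nn.Module):
    """Sketch of the Bi-LSTM-RNN pipeline: bi-directional recurrence (step 1),
    five-way segmentation around the two entities (step 2), per-segment
    pooling (step 3), and a classifier over the concatenation (step 4)."""

    def __init__(self, vocab_size, num_relations, embed_dim=100, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # 5 pooled segments, each of size 2 * hidden_dim (forward + backward states).
        self.classifier = nn.Linear(5 * 2 * hidden_dim, num_relations)

    def forward(self, token_ids, e1_span, e2_span):
        # token_ids: (1, seq_len); e1_span / e2_span: (start, end) indices, e1 before e2.
        h, _ = self.bilstm(self.embed(token_ids))   # (1, seq_len, 2*hidden_dim)
        h = h.squeeze(0)
        (s1, e1), (s2, e2) = e1_span, e2_span
        # Five parts: before e1, e1 itself, between the entities, e2 itself, after e2.
        parts = [h[:s1], h[s1:e1], h[e1:s2], h[s2:e2], h[e2:]]
        # Max pooling over each part (empty parts fall back to zeros).
        pooled = [p.max(dim=0).values if p.size(0) > 0 else h.new_zeros(h.size(1))
                  for p in parts]
        # Relation logits; softmax is applied inside the cross-entropy loss.
        return self.classifier(torch.cat(pooled))
```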
LSTMs are used to attenuate the gradient-vanishing problem when the two target entities are distant in the text.
2. Related Work
3. Our Bi-LSTM-RNN Model
3.1. Long Short Term Memory (LSTM)
3.2. Bi-LSTM-RNN
Additional notes: