Problem Analysis
The problem is this: a user says a sentence that carries some emotion, and our model must produce several sentences that respond to the user from different emotional angles. First we need to recognize the emotion of the user's utterance; second, we can return an appropriate response. For example, if one says "What a lovely day!", we can reply "Haha, so happy today!". Since the problem is somewhat unfamiliar to us, we will survey related papers and do some analysis, and we will cite the papers we read at the end.
1 Related work
1.1 Emotional Intelligence
In interactions between humans and artificial agents, the capability to detect signs of human emotions and to suitably react to them can enrich communication.
1.2 Large-scale Sequence-to-sequence Based Conversation Generation
Much work has been done to improve the quality of generated content. Here we build on such work to generate responses that are both relevant in content and coherent in emotion.
1.3 Memory-based Networks
We can adopt a dynamic memory to model the change of an internal emotion state, and a static memory to store a dictionary of emotion words in our model.
2 High-Level System Design
We want to train several independent networks to handle questions in different emotional categories, such as happiness and anger.
2.1 Introduction of the Framework Diagram
As shown in the overall framework of the model, the user's question, e.g. "What a lovely day!", is encoded by the encoder into a sequence of hidden vectors H. At each decoding step, the attention mechanism combines H with the decoder state s, selectively weighting the different parts of H, to produce a context vector c. The designated emotion category, here "Happiness", is indexed to obtain its embedding vector, the initial emotion state memory vector, and the corresponding emotion word list. The decoder takes the context vector c, the emotion category embedding vector, and the initial emotion state memory vector as input and, through a recurrent neural network, produces the generation probability o of the next word; the emotion word list is then used to re-weight emotion words against non-emotion words. Finally, the generation probability of each word is obtained and the next word is sampled, yielding the output "Haha, so happy today!".
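The attention step in this walkthrough can be sketched as follows. This is a minimal illustration with toy dimensions and simple dot-product scoring; the function name and the use of numpy are our own choices, not from any paper, and a real model would use a trained RNN encoder/decoder.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(H, s):
    """Compute context vector c from encoder hidden states H (T x d)
    and decoder state s (d,), using dot-product alignment scores."""
    scores = H @ s             # (T,) alignment score per source position
    alpha = softmax(scores)    # attention weights over source positions
    c = alpha @ H              # (d,) weighted sum of encoder states
    return c, alpha

# Toy example: 3 source positions, hidden size 4
H = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 0., 1., 0.]])
s = np.array([2., 0., 0., 0.])  # decoder state aligned with the first position
c, alpha = attention_context(H, s)
```

The attention weights sum to one, so c is always a convex combination of the encoder states; here most of the weight falls on the first source position.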
2.2 Emotion Category Embedding
We initialize one vector for each emotion category, and then learn the emotion category representations through training.
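A minimal sketch of such an embedding table, assuming a hypothetical emotion inventory (the category names and dimensions below are illustrative, not fixed by the proposal):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical emotion inventory; "happiness" and "anger" follow the
# examples above, the rest are placeholders.
EMOTIONS = ["happiness", "anger", "sadness", "disgust", "like", "other"]
EMB_DIM = 8

# One vector per emotion category, randomly initialized; during training
# these rows would be updated by backpropagation with the rest of the model.
emotion_embeddings = rng.normal(size=(len(EMOTIONS), EMB_DIM))

def emotion_vector(category):
    """Look up the learned embedding for an emotion category."""
    return emotion_embeddings[EMOTIONS.index(category)]

v = emotion_vector("happiness")
```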
2.3 Internal Memory
We design an internal memory module to model the emotion dynamics during decoding.
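One simplified sketch of such a module, under our own assumptions: a read gate injects part of the stored emotion into the decoder at each step, and a write gate erases part of the memory, so the emotion state decays as the response is generated. The gating scheme and weight shapes here are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def internal_memory_step(M_emo, s, Wr, Ww):
    """One decoding step of a gated internal emotion memory (sketch).
    A read gate decides how much emotion to feed into the decoder, and a
    write gate erases part of the memory so the emotion state decays."""
    g_read = sigmoid(Wr @ s)     # element-wise read gate from decoder state
    read = g_read * M_emo        # emotion information passed to the decoder
    g_write = sigmoid(Ww @ s)    # gates lie in (0, 1) ...
    M_next = g_write * M_emo     # ... so the memory shrinks every step
    return read, M_next

rng = np.random.default_rng(1)
d = 4
M = np.ones(d)                   # initial emotion state memory vector
s = rng.normal(size=d)           # a fixed toy decoder state
Wr, Ww = rng.normal(size=(d, d)), rng.normal(size=(d, d))

norms = [np.linalg.norm(M)]
for _ in range(5):
    _, M = internal_memory_step(M, s, Wr, Ww)
    norms.append(np.linalg.norm(M))
```

Because both gates are strictly between 0 and 1, the memory norm decreases monotonically, which matches the intuition that the emotion should be "used up" by the end of the response.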
2.4 External Memory
We use an external memory module to model emotion expressions explicitly by assigning different generation probabilities to emotion words and generic words.
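A sketch of this re-weighting, assuming disjoint generic and emotion vocabularies and a learned type selector gamma (the probability that the next word is an emotion word); the function and variable names are our own.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def external_memory_output(logits_generic, logits_emotion, gamma):
    """Combine generic-word and emotion-word distributions. gamma in (0, 1)
    is the probability mass assigned to emotion words; the two scaled
    distributions are concatenated into one distribution over the full
    (disjoint) vocabulary."""
    p_generic = (1.0 - gamma) * softmax(logits_generic)
    p_emotion = gamma * softmax(logits_emotion)
    return np.concatenate([p_generic, p_emotion])

# Toy vocabularies: 4 generic words plus 2 emotion words (e.g. "haha", "happy")
p = external_memory_output(np.array([0.1, 0.2, 0.3, 0.4]),
                           np.array([1.0, 2.0]),
                           gamma=0.7)
```

With gamma = 0.7, emotion words jointly receive 70% of the probability mass, so an emotional reply like "Haha, so happy today!" becomes much more likely to be sampled.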
2.5 Loss Function
We will derive the loss function later, after analyzing all the factors that contribute to it, and then propose methods to lower the loss.
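The core of such an objective is per-word cross-entropy; a natural addition, given the internal memory above, is a regularizer pushing the final emotion memory toward zero so the emotion is fully expressed. The sketch below shows that core under our assumptions; the exact terms and weighting are still to be decided.

```python
import numpy as np

def cross_entropy_loss(probs_seq, target_ids, M_final=None, reg=1.0):
    """Negative log-likelihood of the target words, averaged over the
    sequence. Optionally adds reg * ||M_final||^2, a regularizer that
    encourages the internal emotion memory to decay to zero by the end
    of decoding."""
    nll = -np.mean([np.log(p[t]) for p, t in zip(probs_seq, target_ids)])
    if M_final is not None:
        nll += reg * float(M_final @ M_final)
    return nll

# Toy 2-step example over a 3-word vocabulary; the memory ended at zero,
# so only the cross-entropy term contributes.
probs = [np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.8, 0.1])]
loss = cross_entropy_loss(probs, [0, 1], M_final=np.zeros(4))
```

Lowering this loss means both predicting the reference words more confidently and, via the regularizer, spending the stored emotion over the course of the response.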
Expected Goal
We aim to train a model that produces appropriate responses to users' emotional questions, to derive a loss function, and to find ways to lower that loss.
Schedule and Personnel Allocation
Team leader: 胡欽濤
Team members: 吳行, 劉章杰, 湯敏芳, 許彥夫
Personnel:
吳行, 胡欽濤, 許彥夫 for coding; 劉章杰, 湯敏芳 for documentation
Schedule:
Weeks 5-7: literature survey, model feasibility analysis and selection
Weeks 8-9: dataset collection and labeling
Weeks 9-11: model training and hyperparameter tuning
Weeks 11-12: result analysis and report writing
Appendix
Affect-LM: A Neural Language Model for Customizable Affective Text Generation
Affective Neural Response Generation
Emotional Chatting Machine: Emotional Conversation Generation with Internal and External Memory