National Science Library, Chinese Academy of Sciences, Beijing 100190, China
Department of Library, Information and Archives Management, School of Economics and Management, University of Chinese Academy of Sciences, Beijing 100190, China
[Objective] This paper generates contrastive sentences from two related paragraphs, aiming to lay the groundwork for a model that creates contrastive paragraphs. [Methods] We automatically generated contrastive sentences from contrastive text sequences, designing a Seq2seq-based deep learning model that combines contrast features with character vectors to represent texts. Both the encoder and the decoder use a BiLSTM structure with an attention mechanism. [Results] We evaluated the proposed model on manually annotated search lists and scientific papers, using BLEU as the evaluation metric. The model scored 12.1, which is 6.5 points higher than the BiLSTM + Attention benchmark. [Limitations] Because manual labeling is laborious, the dataset used in our experiments was small. [Conclusions] The proposed model can serve as the basis for a new model that generates contrastive paragraphs.
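The abstract reports BLEU as the evaluation metric. As a minimal sketch of how sentence-level BLEU is computed (uniform 4-gram weights, brevity penalty, and add-epsilon smoothing are assumptions here; the paper's exact BLEU configuration is not stated), tokens may be characters, which suits Chinese text:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count all contiguous n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU on a 0-100 scale.

    candidate, reference: lists of tokens (e.g. characters for Chinese).
    Uses clipped n-gram precision, uniform weights up to max_n, a brevity
    penalty, and a small epsilon to avoid log(0) when an order has no match.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum((cand & ref).values())           # clipped matches
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # smoothed precision
    c, r = len(candidate), len(reference)
    bp = 1.0 if c > r else math.exp(1 - r / max(c, 1))  # brevity penalty
    return 100 * bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

An identical candidate and reference score 100; any n-gram mismatch or length shortfall lowers the score, which is why BLEU rewards fluent, reference-like generations.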