Int J Performability Eng, 2017, Vol. 13, Issue (5): 775-782. doi: 10.23940/ijpe.17.05.p20.775782

• Original Articles •

An Attention-Based Syntax-Tree and Tree-LSTM Model for Sentence Summarization

Wenfeng Liua, b, Peiyu Liua, c, *, Yuzhen Yangb, Yaling Gaod, and Jing Yia, e   

  a. School of Information Science and Engineering, Shandong Normal University, Jinan, 250014, China
  b. Department of Computer and Information Engineering, Heze University, Heze, 274015, China
  c. Shandong Provincial Key Laboratory for Distributed Computer Software Novel Technology, Jinan, 250014, China
  d. Ruipu Peony Industrial Technology Development Co., Ltd, 274015, China
  e. School of Computer Science and Technology, Shandong Jianzhu University, Jinan, 250101, China

Abstract: Generative summarization is of great importance in understanding large-scale textual data. In this work, we propose an attention-based Tree-LSTM model for sentence summarization, which uses attention over syntactic structure as auxiliary information. Block-alignment aligns the input and output syntax blocks, while inter-alignment aligns the words within each pair of aligned blocks. Block-alignment helps prevent structural deviations on long sentences, and inter-alignment increases the flexibility of generation within blocks. The model can be trained end-to-end and handles input sentences of arbitrary length. Compared with several strong baselines, our model achieves state-of-the-art results on the DUC-2004 shared task.
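The paper's exact model is not given here, but the Tree-LSTM it builds on is well established. The sketch below shows a standard child-sum Tree-LSTM cell (Tai et al., 2015), which composes a node's state from an arbitrary number of children rather than a single predecessor; all weight names, dimensions, and the NumPy framing are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ChildSumTreeLSTMCell:
    """Child-sum Tree-LSTM cell (Tai et al., 2015); a sketch, not the paper's exact variant."""

    def __init__(self, x_dim, h_dim, rng):
        s = 0.1
        # one input/forget/output/update transform each, for the word vector and child states
        self.W = {g: rng.uniform(-s, s, (h_dim, x_dim)) for g in "ifou"}
        self.U = {g: rng.uniform(-s, s, (h_dim, h_dim)) for g in "ifou"}
        self.b = {g: np.zeros(h_dim) for g in "ifou"}

    def __call__(self, x, children):
        """x: word vector at this node; children: list of (h, c) pairs from child nodes."""
        h_sum = sum((h for h, _ in children), np.zeros_like(self.b["i"]))
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_sum + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_sum + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_sum + self.b["u"])
        c = i * u
        # a separate forget gate per child, conditioned on that child's hidden state
        for h_k, c_k in children:
            f_k = sigmoid(self.W["f"] @ x + self.U["f"] @ h_k + self.b["f"])
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c

rng = np.random.default_rng(0)
cell = ChildSumTreeLSTMCell(x_dim=4, h_dim=3, rng=rng)
leaf1 = cell(rng.normal(size=4), [])        # leaves have no children
leaf2 = cell(rng.normal(size=4), [])
root_h, root_c = cell(rng.normal(size=4), [leaf1, leaf2])
print(root_h.shape)  # (3,)
```

Because each internal node consumes its children's `(h, c)` states, a parse tree can be encoded bottom-up by calling the cell once per node; the abstract's block- and inter-alignment attention would then operate over these per-node hidden states.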


Submitted on January 29, 2017; Revised on April 12, 2017; Accepted on June 23, 2017
References: 21