TY - JOUR
T1 - Sentiment Lossless Summarization
AU - Li, Xiaodong
AU - Wu, Pangjing
AU - Zou, Chenxin
AU - Xie, Haoran
AU - Wang, Fu Lee
N1 - Publisher Copyright:
© 2021 Elsevier B.V.
PY - 2021/9/5
Y1 - 2021/9/5
N2 - The aim of automatic text summarization (ATS) is to extract representative texts from documents while keeping the major points of the extracted texts consistent with the original documents. However, most existing studies ignore the loss of sentiment information during summarization, which produces summaries that fail to preserve the sentiment of the source. To address this sentiment loss issue, we introduce a sentiment compensation mechanism into document summarization and propose a graph-based extractive summarization approach named Sentiment Lossless Summarization (SLS). SLS first creates a graph representation of a document to obtain the importance score (i.e., literal indicator) of each sentence. Second, sentiment dictionaries are leveraged to analyze sentence sentiment. Third, in each summarization iteration, the sentences with the lowest scores are removed, and the sentiment compensation weights of the remaining sentences are updated. With the help of sentiment compensation during summarization, sentiment consistency between candidate summaries and the original documents is maintained. Intrinsic evaluations conducted on the DUC2001, DUC2002, DUC2004, and Multi-News datasets demonstrate that our approach outperforms baselines and state-of-the-art summarization methods in terms of Recall-Oriented Understudy for Gisting Evaluation (ROUGE) scores. Additionally, to further evaluate the sentiment retention of SLS, extrinsic evaluations are introduced: summary quality in terms of sentiment loss is evaluated by measuring the prediction accuracy for the sentiment polarities of movie (IMDb dataset) and product (Amazon dataset) review summaries. The experimental results demonstrate that our approach improves prediction accuracy by up to 6% over the baseline.
AB - The aim of automatic text summarization (ATS) is to extract representative texts from documents while keeping the major points of the extracted texts consistent with the original documents. However, most existing studies ignore the loss of sentiment information during summarization, which produces summaries that fail to preserve the sentiment of the source. To address this sentiment loss issue, we introduce a sentiment compensation mechanism into document summarization and propose a graph-based extractive summarization approach named Sentiment Lossless Summarization (SLS). SLS first creates a graph representation of a document to obtain the importance score (i.e., literal indicator) of each sentence. Second, sentiment dictionaries are leveraged to analyze sentence sentiment. Third, in each summarization iteration, the sentences with the lowest scores are removed, and the sentiment compensation weights of the remaining sentences are updated. With the help of sentiment compensation during summarization, sentiment consistency between candidate summaries and the original documents is maintained. Intrinsic evaluations conducted on the DUC2001, DUC2002, DUC2004, and Multi-News datasets demonstrate that our approach outperforms baselines and state-of-the-art summarization methods in terms of Recall-Oriented Understudy for Gisting Evaluation (ROUGE) scores. Additionally, to further evaluate the sentiment retention of SLS, extrinsic evaluations are introduced: summary quality in terms of sentiment loss is evaluated by measuring the prediction accuracy for the sentiment polarities of movie (IMDb dataset) and product (Amazon dataset) review summaries. The experimental results demonstrate that our approach improves prediction accuracy by up to 6% over the baseline.
KW - Extractive summarization
KW - Graph-based summarization
KW - Sentiment analysis
UR - http://www.scopus.com/inward/record.url?scp=85107124900&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2021.107170
DO - 10.1016/j.knosys.2021.107170
M3 - Article
AN - SCOPUS:85107124900
SN - 0950-7051
VL - 227
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 107170
ER -