Ke, Yuwei
(2023)
AMR Beam Search: generating faithful summaries with semantic graph representations.
[Master's degree thesis], Università di Bologna, Degree Programme in
Artificial Intelligence [LM-DM270], Full-text document not available
The full text is not available at the author's request.
(Contact the author)
Abstract
In recent years, notable strides have been made in abstractive summarization. However, a persistent challenge remains: many existing summarization models produce summaries that lack semantic consistency with the source document. These models and evaluation methods tend to operate at the word level, which does not necessarily yield faithful results. This thesis introduces a novel decoding strategy named AMR Beam Search (ABS), which leverages Abstract Meaning Representation (AMR). ABS integrates textual and structured semantic knowledge during decoding, promoting factual consistency between the generated summary and the source document. It accomplishes this by reranking hypotheses during decoding: beams completed through saliency-enhanced greedy decoding are assigned a factuality score computed from both the corresponding source documents and their AMRs. Experiments conducted on two datasets, XSum and CNN/DM, demonstrate the effectiveness of the proposed ABS method. Across multiple faithfulness evaluation metrics, ABS outperforms the baseline method by an average margin of 0.77%, underscoring its effectiveness in achieving more faithful abstractive summarization.
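The reranking idea described in the abstract can be illustrated with a minimal sketch, assuming a scoring function that compares a candidate summary against the source document and its AMR graph. The names, signature, and weighting below are hypothetical placeholders, not the thesis's actual implementation.

```python
# Hypothetical sketch of reranking completed beams with a factuality term
# computed against the source text and its AMR graph. All names and the
# weighting scheme are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Beam:
    tokens: List[str]       # generated summary tokens so far
    log_prob: float         # cumulative model log-probability
    finished: bool = False  # whether the beam has emitted an end-of-sequence token


def rerank_finished_beams(
    beams: List[Beam],
    source_text: str,
    source_amr: str,
    factuality_score: Callable[[str, str, str], float],  # (summary, source, AMR) -> score
    alpha: float = 0.5,  # assumed weight of the factuality term
) -> List[Beam]:
    """Order hypotheses by model score plus a weighted factuality score."""
    def combined(beam: Beam) -> float:
        summary = " ".join(beam.tokens)
        # Only completed hypotheses receive the factuality bonus.
        fact = factuality_score(summary, source_text, source_amr) if beam.finished else 0.0
        return beam.log_prob + alpha * fact

    return sorted(beams, key=combined, reverse=True)
```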
Document type
Thesis
(Master's degree)
Thesis author
Ke, Yuwei
Thesis supervisor
Thesis co-supervisor
School
Degree programme
Degree programme regulations
DM270
Keywords
Natural Language Processing, Natural Language Generation, Abstractive Summarization, Abstract Meaning Representation, Decoding Strategies
Thesis defense date
21 October 2023
URI