Title: COREFERENCE RESOLUTION USING LATENT TREES WITH CONTEXTUAL EMBEDDING
Author: LEONARDO BARBOSA DE OLIVEIRA
Collaborator(s): SERGIO COLCHER - Advisor
Cataloged: 19/JAN/2021
Language(s): PORTUGUESE - BRAZIL
Type: TEXT
Subtype: THESIS
Notes: All data contained in the documents are the sole responsibility of the authors. The data used in the descriptions of the documents are in conformity with the systems of the administration of PUC-Rio.
Reference(s):
[pt] https://www.maxwell.vrac.puc-rio.br/projetosEspeciais/ETDs/consultas/conteudo.php?strSecao=resultado&nrSeq=51292&idi=1
[en] https://www.maxwell.vrac.puc-rio.br/projetosEspeciais/ETDs/consultas/conteudo.php?strSecao=resultado&nrSeq=51292&idi=2
DOI: https://doi.org/10.17771/PUCRio.acad.51292
Abstract:
The coreference resolution task consists of identifying and grouping spans of
text that refer to the same real-world entity. Although it had been addressed
in earlier shared tasks, the 2012 CoNLL shared task is a milestone due to the
improved quality of its dataset, its metrics, and the solutions presented. In
that edition, the winning model used a structured perceptron to optimize an
antecedent latent tree, achieving 63.4 on the official metric for the English
test dataset. In the following years, the metrics and dataset presented at
that conference became the benchmark for the coreference task. With new
machine learning techniques, more elaborate solutions were presented. The
use of shallow neural networks achieved 68.8; adding contextual representations
raised the state of the art to 73.0; deep neural networks improved the baseline
to 76.9; and the current state of the art, which combines many of these
techniques, stands at 79.6. This work presents an analysis of how the word
embedding mechanisms Bag of Words, GloVe, BERT, and SpanBERT, used with
antecedent latent trees, compare to the original 2012 model. The best model
found, by a very large margin, used SpanBERT, achieving 61.3 on the CoNLL
2012 metric on the test dataset. With these results, we show that it is
possible to use advanced techniques in simpler structures and still achieve
competitive results in the coreference task. In addition, we improved the
performance of an open-source coreference framework so that it can handle
solutions that demand more memory and processing.
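As a toy illustration of the output the task described above produces (a hypothetical example, not taken from the thesis), coreference resolution groups mention spans into clusters, one cluster per real-world entity:

```python
# Toy illustration of coreference resolution output (hypothetical example).
# Mentions are (start, end) token spans; a coreference cluster groups all
# spans that refer to the same real-world entity.

sentence = ["Maria", "said", "she", "would", "defend", "her", "thesis", "."]

# Gold clusters for this sentence: "Maria", "she", and "her" corefer.
clusters = [
    [(0, 1), (2, 3), (5, 6)],  # the entity "Maria"
]

def mention_text(tokens, span):
    """Return the surface text of a (start, end) token span."""
    start, end = span
    return " ".join(tokens[start:end])

# Resolve each cluster to its surface forms.
resolved = [[mention_text(sentence, s) for s in cluster] for cluster in clusters]
print(resolved)  # [['Maria', 'she', 'her']]
```

A system is scored by comparing its predicted clusters against gold clusters such as these, using the CoNLL 2012 metrics mentioned above.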