dor_id: 4143029

506.#.#.a: Public

590.#.#.d: Articles submitted to the journal "Journal of Applied Research and Technology" are evaluated through a peer-review process

510.0.#.a: Scopus; Directory of Open Access Journals (DOAJ); Sistema Regional de Información en Línea para Revistas Científicas de América Latina, el Caribe, España y Portugal (Latindex); Índice de Revistas Latinoamericanas en Ciencias (Periódica); Red de Revistas Científicas de América Latina y el Caribe, España y Portugal (Redalyc); Consejo Nacional de Ciencia y Tecnología (CONACyT); Google Scholar Citations

561.#.#.u: https://www.icat.unam.mx/

650.#.4.x: Engineering

336.#.#.b: article

336.#.#.3: Research Article

336.#.#.a: Article

351.#.#.6: https://jart.icat.unam.mx/index.php/jart

351.#.#.b: Journal of Applied Research and Technology

351.#.#.a: Articles

harvesting_group: RevistasUNAM

270.1.#.p: Revistas UNAM. Dirección General de Publicaciones y Fomento Editorial, UNAM, at revistas@unam.mx

590.#.#.c: Open Journal Systems (OJS)

270.#.#.d: MX

270.1.#.d: México

590.#.#.b: Aggregator

883.#.#.u: https://revistas.unam.mx/catalogo/

883.#.#.a: Revistas UNAM

590.#.#.a: Coordinación de Difusión Cultural

883.#.#.1: https://www.publicaciones.unam.mx/

883.#.#.q: Dirección General de Publicaciones y Fomento Editorial

850.#.#.a: Universidad Nacional Autónoma de México

856.4.0.u: https://jart.icat.unam.mx/index.php/jart/article/view/1446/968

100.1.#.a: Tripathy, Saroj Anand; Ashok, Sharmila

524.#.#.a: Tripathy, Saroj Anand, et al. (2023). Abstractive method-based Text Summarization using Bidirectional Long Short-Term Memory and Pointer Generator Mode. Journal of Applied Research and Technology; Vol. 21, No. 1, 2023; 73-86. Retrieved from https://repositorio.unam.mx/contenidos/4143029

245.1.0.a: Abstractive method-based Text Summarization using Bidirectional Long Short-Term Memory and Pointer Generator Mode

502.#.#.c: Universidad Nacional Autónoma de México

561.1.#.a: Instituto de Ciencias Aplicadas y Tecnología, UNAM

264.#.0.c: 2023

264.#.1.c: 2023-02-27

506.1.#.a: The economic rights of this work belong to the publishing institutions. Its use is governed by a Creative Commons BY-NC-SA 4.0 International license, https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode.es; for any other use, contact the repository's legal officer at gabriel.ascanio@icat.unam.mx

884.#.#.k: https://jart.icat.unam.mx/index.php/jart/article/view/1446

001.#.#.#: 074.oai:ojs2.localhost:article/1446

041.#.7.h: eng

520.3.#.a: With the rise of the Internet, we now have a vast amount of information at our disposal. We are swamped by many sources: news, social media, and office emails, to name a few. This paper addresses the problem of reading through such extensive information by summarizing it with a text summarizer based on abstractive summarization using deep learning models, namely bidirectional Long Short-Term Memory (LSTM) networks and the Pointer-Generator model. The LSTM model (a modification of the Recurrent Neural Network) is trained and tested on the Amazon Fine Food Reviews dataset using a decoder with Bahdanau attention and ConceptNet Numberbatch embeddings, which are similar to, and perform better than, GloVe. The Pointer-Generator model is trained and tested on the CNN/Daily Mail dataset and uses both decoder and attention inputs. Because of two major problems with the LSTM model, namely the network's inability to copy facts and the repetition of words, the second method, the Pointer-Generator model, is used. This paper aims to provide an analysis of both models, giving a better understanding of how they work in order to enable the creation of a strong text summarizer. The main purpose is to provide reliable summaries of datasets or uploaded files, depending on the user's choice. Unnecessary sentences are rejected in order to retain the most important ones.
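The copy mechanism mentioned in the abstract can be illustrated with a short sketch. This is a minimal NumPy example of how a pointer-generator blends the vocabulary (generation) distribution with the attention (copy) distribution at one decoding step; the function name, arguments, and numbers are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pointer_generator_step(p_vocab, attention, src_ids, p_gen, vocab_size):
    """Combine generation and copy distributions for one decoding step.

    p_vocab   : (vocab_size,) softmax over the fixed vocabulary
    attention : (src_len,) attention weights over the source tokens
    src_ids   : (src_len,) vocabulary ids of the source tokens; in-source
                out-of-vocabulary words get ids >= vocab_size (extended vocab)
    p_gen     : scalar in [0, 1], probability of generating vs. copying
    """
    extended_size = max(vocab_size, int(src_ids.max()) + 1)
    final = np.zeros(extended_size)
    final[:vocab_size] = p_gen * p_vocab                   # generate from vocab
    np.add.at(final, src_ids, (1.0 - p_gen) * attention)   # copy from source
    return final

# Example: vocab of 3 tokens, source of 2 tokens (one OOV with extended id 3)
dist = pointer_generator_step(
    np.array([0.5, 0.3, 0.2]),   # generator softmax
    np.array([0.6, 0.4]),        # attention weights
    np.array([1, 3]),            # source token ids (3 = in-source OOV)
    p_gen=0.7, vocab_size=3)
# dist -> [0.35, 0.39, 0.14, 0.12]; still sums to 1
```

Because both input distributions sum to one, the blended distribution also sums to one, and source-only words (id 3 above) can still be emitted, which is what lets the model copy facts the plain LSTM cannot.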

773.1.#.t: Journal of Applied Research and Technology; Vol. 21, No. 1 (2023); 73-86

773.1.#.o: https://jart.icat.unam.mx/index.php/jart

022.#.#.a: Electronic ISSN: 2448-6736; ISSN: 1665-6423

310.#.#.a: Bimonthly

300.#.#.a: Pages: 73-86

264.#.1.b: Instituto de Ciencias Aplicadas y Tecnología, UNAM

doi: https://doi.org/10.22201/icat.24486736e.2023.21.1.1446

harvesting_date: 2023-11-08 13:10:00.0

856.#.0.q: application/pdf

file_creation_date: 2023-02-24 05:10:02.0

file_modification_date: 2023-02-24 05:10:02.0

file_creator: Yolanda G.G.

file_name: 764ae93b2ce2ec8e9daeeb713b7df7e9165f4da856d27c66fdf76c321f92ccdf.pdf

file_pages_number: 14

file_format_version: application/pdf; version=1.7

file_size: 920700

last_modified: 2024-03-19 14:00:00

license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode.es

license_type: by-nc-sa
