DSpace Repository

Positional encodings for light curve transformers: an evaluation of their impact in the pretraining and classification task.

Show simple item record

dc.contributor.advisor Cabrera Vives, Guillermo es
dc.contributor.author Moreno Cartagena, Daniel Andrés es
dc.date.accessioned 2024-03-27T11:55:34Z
dc.date.available 2024-03-27T11:55:34Z
dc.date.issued 2024
dc.identifier.uri http://repositorio.udec.cl/jspui/handle/11594/11940
dc.description Thesis submitted to qualify for the degree of Master of Science in Computer Science. es
dc.description.abstract The vast volume of astronomical data generated nightly by observatories such as the Vera C. Rubin Observatory presents significant challenges for the classification and analysis of light curves. These curves, characterized by distinct distributions across bands, irregular sampling, and varied cadences, require sophisticated models capable of generalizing across diverse astronomical surveys. In this work, we conducted empirical experiments to assess the transferability of a light curve transformer to datasets with different cadences and magnitude distributions, using various positional encodings. We proposed a new approach that incorporates temporal information directly into the output of the last attention layer. Additionally, we modified the common fine-tuning procedure to assess the adaptability of the light curve transformer in contexts where the cadence differs markedly from that of the pretraining dataset. Our results indicate that trainable positional encodings yield significant improvements in transformer performance and training time. Our proposed positional encoding, applied within the attention mechanism, trains faster than a transformer with the traditional non-trainable positional encoding, while still achieving competitive results when transferred to other datasets. Our approach to adapting the model to a dataset with a very different cadence shows that, for the reconstruction of astronomical time series, both training time and memory requirements can be reduced. This approach adapts the model to the survey's cadence without retraining the entire model, indicating a promising direction for future research in astronomical data analysis. es
dc.language.iso en es
dc.publisher Universidad de Concepción es
dc.subject Data sets es
dc.subject Light curves es
dc.subject Data encoding (Computer science) es
dc.title Positional encodings for light curve transformers: an evaluation of their impact in the pretraining and classification task. es
dc.type Thesis es
dc.description.facultad Facultad de Ingeniería. es
dc.description.departamento Departamento de Ingeniería Informática y Ciencias de la Computación es
dc.description.campus Concepción. es
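The abstract contrasts fixed (non-trainable) and trainable positional encodings for irregularly sampled light curves. As a minimal illustrative sketch only (this is not the thesis implementation; the function name, dimensions, and example times are assumptions), a sinusoidal positional encoding can be evaluated at arbitrary, continuous observation times; a trainable variant would replace the fixed frequency schedule with learned parameters:

```python
import numpy as np

def time_positional_encoding(times, d_model):
    """Sinusoidal positional encoding evaluated at continuous,
    irregularly spaced observation times (times: shape [n_obs]).

    In a trainable variant, the frequencies below would be learned
    parameters rather than the fixed transformer schedule.
    """
    i = np.arange(d_model // 2)
    freqs = 1.0 / (10000.0 ** (2 * i / d_model))
    angles = np.outer(times, freqs)          # [n_obs, d_model // 2]
    pe = np.empty((len(times), d_model))
    pe[:, 0::2] = np.sin(angles)             # even dims: sine
    pe[:, 1::2] = np.cos(angles)             # odd dims: cosine
    return pe

# Hypothetical irregular cadence (observation times in days)
times = np.array([0.0, 1.3, 1.9, 7.2, 30.5])
pe = time_positional_encoding(times, d_model=8)
print(pe.shape)  # (5, 8)
```

Because the encoding is a function of the observation time itself rather than the integer index, it accommodates the irregular sampling and varied cadences the abstract describes.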

