Positional encodings for light curve transformers: an evaluation of their impact in the pretraining and classification task.

dc.contributor.advisor: Cabrera Vives, Guillermo
dc.contributor.author: Moreno Cartagena, Daniel Andrés
dc.date.accessioned: 2024-03-27T11:55:34Z
dc.date.accessioned: 2024-08-28T20:05:34Z
dc.date.available: 2024-03-27T11:55:34Z
dc.date.available: 2024-08-28T20:05:34Z
dc.date.issued: 2024
dc.description: Thesis presented to qualify for the degree of Master in Computer Science (Magíster en Ciencias de la Computación)
dc.description.abstract: The vast volume of astronomical data generated nightly by observatories such as the Vera C. Rubin Observatory presents significant challenges for the classification and analysis of light curves. These curves, characterized by their distinct distributions across bands, irregular sampling, and varied cadences, require sophisticated models capable of generalizing across diverse astronomical surveys. In this work, we conducted empirical experiments to assess the transferability of a light curve transformer to datasets with different cadences and magnitude distributions, using various positional encodings. We proposed a new approach that incorporates temporal information directly into the output of the last attention layer. Additionally, we modified the common fine-tuning approach to assess the adaptability of the light curve transformer in contexts where the cadence differs markedly from that of the pretraining dataset. Our results indicate that trainable positional encodings lead to significant improvements in transformer performance and training time. Our proposed positional encoding, applied to the attention mechanism, trains faster than a transformer with the traditional non-trainable positional encoding, while still achieving competitive results when transferred to other datasets. Our approach to adapting the model to a dataset with a very different cadence demonstrates that, for the reconstruction of astronomical time series, both training time and memory requirements can be reduced. This approach adapts the model to the survey's cadence without retraining the entire model, indicating a promising direction for future research in astronomical data analysis.
dc.description.campus: Concepción
dc.description.departamento: Departamento de Ingeniería Informática y Ciencias de la Computación
dc.description.facultad: Facultad de Ingeniería
dc.identifier.doi: https://doi.org/10.29393/TMUdeC-74MD1PE74
dc.identifier.uri: https://repositorio.udec.cl/handle/11594/11940
dc.language.iso: en
dc.publisher: Universidad de Concepción
dc.rights: CC BY-NC-ND 4.0 DEED Attribution-NonCommercial-NoDerivs 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Data sets
dc.subject: Light curves
dc.subject: Data encoding (Computer science)
dc.title: Positional encodings for light curve transformers: an evaluation of their impact in the pretraining and classification task.
dc.type: Thesis

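The abstract above describes two technical ideas: a trainable positional encoding injected at the output of the last attention layer, and a cadence-adaptation scheme that trains only part of the model. Below is a minimal PyTorch sketch of how such a scheme could look. The module name TemporalAttentionEncoding, the Fourier-style time embedding, the parameter names, and the stand-in encoder are all illustrative assumptions, not the architecture from the thesis.

import torch
import torch.nn as nn

class TemporalAttentionEncoding(nn.Module):
    """Hypothetical trainable temporal encoding added to the output of the
    final attention layer; names and design are illustrative only."""

    def __init__(self, d_model: int, n_frequencies: int = 16):
        super().__init__()
        # Learnable frequencies map irregular observation times to a
        # Fourier-style embedding instead of fixed sinusoidal positions.
        self.frequencies = nn.Parameter(torch.randn(n_frequencies))
        self.projection = nn.Linear(2 * n_frequencies, d_model)

    def forward(self, attn_output: torch.Tensor, times: torch.Tensor) -> torch.Tensor:
        # attn_output: (batch, seq_len, d_model) from the last attention layer
        # times:       (batch, seq_len) observation times of the light curve
        angles = times.unsqueeze(-1) * self.frequencies
        embedding = torch.cat([angles.sin(), angles.cos()], dim=-1)
        return attn_output + self.projection(embedding)

# Stand-in for a pretrained light curve transformer (illustrative only).
d_model = 64
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
temporal_pe = TemporalAttentionEncoding(d_model)

# Cadence adaptation without retraining the whole model: freeze the
# pretrained backbone and train only the temporal encoding.
for p in backbone.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(temporal_pe.parameters(), lr=1e-3)

flux_embeddings = torch.randn(8, 50, d_model)   # embedded light curve points
obs_times = torch.rand(8, 50).cumsum(dim=-1)    # irregular observation times
output = temporal_pe(backbone(flux_embeddings), obs_times)

Freezing the pretrained backbone and optimizing only the temporal encoding is what would cut training time and memory when moving to a survey with a different cadence, mirroring the adaptation strategy sketched in the abstract.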
Files

Original bundle
Name: moreno_c_d_2024_MAG.pdf
Size: 2.68 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.71 KB
Format: Plain Text

Collections