CUCAI 2026

Using Self-Supervised Learning for EEG with Heterogeneous Electrode Support via Graph Neural Networks and Task-Switch Contrastive Learning

Leopold Ehrlich

CUCAI 2026 Proceedings - 2026

Published 2026/03/07

Abstract

This paper presents a general-purpose foundation model for electroencephalography (EEG) data using self-supervised learning. Drawing on neuroscience research, we employ a suite of techniques to identify and label cognitively distinct segments of brain activity across large, unlabeled EEG datasets. These pseudo-labels drive a contrastive learning objective used to train a graph neural network encoder that, unlike existing approaches, handles arbitrary electrode configurations without discarding any channels. The resulting foundation model can serve as a starting point for downstream tasks, reducing the amount of labeled data required to build accurate, task-specific EEG models. Evaluated on a motor imagery benchmark, our approach produces statistically significant improvements over the baseline while allowing for flexible electrode configurations. This has significant implications for few-shot clinical modeling and adaptive brain-computer interfaces. Code is available at https://github.com/LeopoldEhrlich/EEG-SSL
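The two core ideas in the abstract, a graph encoder whose weights are independent of the number of electrodes, and a contrastive objective driven by pseudo-labels, can be illustrated with a minimal sketch. This is not the paper's implementation (see the linked repository for that); the mean-aggregation message passing, channel pooling, and supervised-contrastive loss below are illustrative stand-ins, and all names (`gnn_encode`, `supcon_loss`) are hypothetical.

```python
import numpy as np

def gnn_encode(x, adj, w, steps=2):
    """Toy message-passing encoder over an electrode graph.

    x:   (n_channels, d) per-electrode features
    adj: (n_channels, n_channels) binary adjacency between electrodes
    w:   (d, d) shared weights, independent of n_channels, so montages
         with different electrode counts reuse the same parameters
    """
    deg = adj.sum(axis=1, keepdims=True) + 1e-8  # avoid divide-by-zero
    h = x
    for _ in range(steps):
        h = np.tanh((adj @ h) / deg @ w)  # aggregate neighbours, transform
    return h.mean(axis=0)                 # pool channels -> fixed-size embedding

def supcon_loss(z, labels, tau=0.5):
    """Supervised contrastive loss: embeddings sharing a pseudo-label
    are pulled together, all others pushed apart."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)        # exclude self-similarity
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    labels = np.asarray(labels)
    pos = (labels[:, None] == labels[None, :]) & ~np.eye(len(z), dtype=bool)
    return -logp[pos].mean()              # average over positive pairs

rng = np.random.default_rng(0)
d = 8
w = rng.standard_normal((d, d)) * 0.1
# A 32-channel and a 19-channel montage share the same encoder weights:
x32 = rng.standard_normal((32, d)); a32 = (rng.random((32, 32)) < 0.2).astype(float)
x19 = rng.standard_normal((19, d)); a19 = (rng.random((19, 19)) < 0.2).astype(float)
z = np.stack([gnn_encode(x32, a32, w),
              gnn_encode(x19, a19, w),
              gnn_encode(x32 + 0.01, a32, w)])  # near-duplicate of the first segment
loss = supcon_loss(z, labels=[0, 1, 0])         # pseudo-labels from segmentation
```

The key property this sketch demonstrates is that both montages produce a fixed-size embedding from one set of weights, which is what lets a single foundation model accept heterogeneous electrode configurations.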