Brain-to-Text '25: How NOT to Use GPT-2 for Neural Decoding | Team 66 NKUST
Author: 蔣定偉
Uploaded: 2025-12-11
Views: 19
Description:
Can GPT-2 rerank brain signals? We tried—and failed.
In this 4-minute video, Team 66 (NKUST) shares our negative-result experiment from Kaggle Brain-to-Text '25: why large-language-model reranking hallucinates sentences like "I am a bird of prey," and how a simple Transformer + CTC pipeline finally cut WER to 0.47.
📊 Dataset: 45 sessions, 10k+ sentences, 30× rare-phoneme imbalance.
🔧 Pipeline: 1–70 Hz band-pass → z-score → Transformer encoder → greedy/beam CTC.
⚠️ Lesson: preprocessing beats model hype—at least for neural decoding.
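The pipeline above can be sketched in a few lines. This is a minimal illustration, not the team's actual code (see the repo link below): the filter order, sampling rate, and blank index are assumptions, and the Transformer encoder is omitted.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def preprocess(neural, fs=500):
    """1-70 Hz band-pass, then per-channel z-score.

    neural: array of shape (time, channels). fs is assumed; 4th-order
    Butterworth is a common default, not necessarily what Team 66 used.
    """
    sos = butter(4, [1, 70], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, neural, axis=0)
    mu = filtered.mean(axis=0, keepdims=True)
    sd = filtered.std(axis=0, keepdims=True)
    return (filtered - mu) / (sd + 1e-8)

def ctc_greedy_decode(logits, blank=0):
    """Standard greedy CTC collapse: take argmax per frame, merge
    repeats, drop blanks. logits: (time, vocab)."""
    best = logits.argmax(axis=-1)
    out, prev = [], blank
    for t in best:
        if t != prev and t != blank:
            out.append(int(t))
        prev = t
    return out

# Dummy run: random "neural" signal through preprocessing,
# then a hand-built frame sequence [1,1,blank,1,2,2] through the decoder.
x = np.random.randn(1000, 8)
z = preprocess(x)
frames = np.eye(3)[[1, 1, 0, 1, 2, 2]]
print(z.shape, ctc_greedy_decode(frames))  # (1000, 8) [1, 1, 2]
```

Beam-search CTC replaces the per-frame argmax with a pruned search over label prefixes; the greedy version above is the baseline the description mentions.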
Paper draft & code: link in the first comment.
#BrainToText #BCI #NeuralDecoding #Kaggle #NegativeResults #NKUST