Posts by Collection

publications

In‑context Time Series Predictor

Published in ICLR 2025 (Poster)

Reformulates time series forecasting (TSF) as in‑context learning by constructing tokens of (lookback, future) task pairs, enabling Transformers to adapt predictors from context without parameter updates.

Download Paper
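The token construction described above can be sketched as follows. This is a hypothetical helper (the function name, shapes, and non-overlapping window layout are our illustrative assumptions, not the paper's implementation): earlier (lookback, future) pairs from the history serve as in-context examples, and the final lookback window becomes the query token whose future slots are left empty.

```python
import numpy as np

def build_icl_tokens(series, lookback, horizon, n_examples):
    """Slice a univariate series into (lookback, future) task-pair tokens.

    Illustrative sketch: each token concatenates a lookback window with
    its following future window; the last token is the query, whose
    future part is zero-filled for the model to predict.
    """
    total = lookback + horizon
    base = len(series) - lookback  # start of the query's lookback window
    tokens = []
    # n_examples complete (lookback, future) pairs taken from the history
    for i in range(n_examples):
        s = base - (n_examples - i) * total
        x = series[s:s + lookback]
        y = series[s + lookback:s + total]
        tokens.append(np.concatenate([x, y]))
    # query token: most recent lookback window, future left as zeros
    tokens.append(np.concatenate([series[-lookback:], np.zeros(horizon)]))
    return np.stack(tokens)  # shape: (n_examples + 1, lookback + horizon)
```

A Transformer fed this token sequence can, in principle, infer the lookback-to-future mapping from the example pairs and apply it to the query without any weight updates.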

Free Energy Mixer

Published in ICLR 2026

Introduces Free Energy Mixer (FEM), which interprets (q, k) attention scores as a prior and performs a log-sum-exp free-energy readout to reweight values at the channel level. This enables a smooth transition from mean aggregation to selective channel-wise retrieval without increasing asymptotic complexity.

Download Paper
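The free-energy readout described above can be sketched in NumPy under our own assumptions (function name, shapes, and the inverse-temperature parameter beta are illustrative, not the paper's implementation): the attention prior p comes from the (q, k) scores, and each value channel is read out through its own log-sum-exp, so small beta recovers the ordinary attention mean while large beta approaches a per-channel soft maximum.

```python
import numpy as np

def free_energy_readout(q, K, V, beta):
    """Channel-wise log-sum-exp (free-energy) readout over values.

    Illustrative sketch: p is a softmax prior from (q, k) scores;
    each channel d returns (1/beta) * log sum_i p_i * exp(beta * V[i, d]).
    beta -> 0 recovers the attention-weighted mean p @ V;
    beta -> inf approaches the per-channel maximum of V.
    """
    scores = K @ q / np.sqrt(q.shape[-1])   # (n,) attention logits
    p = np.exp(scores - scores.max())
    p /= p.sum()                            # prior over positions
    # stable per-channel log-sum-exp with log-prior folded in
    z = beta * V + np.log(p)[:, None]       # (n, d)
    m = z.max(axis=0)
    return (np.log(np.exp(z - m).sum(axis=0)) + m) / beta  # (d,)
```

Note the per-query cost stays linear in sequence length, like standard attention readout, since the log-sum-exp is a single reduction over positions.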
