This work proposes QR-Distill, a distillation framework that (i) filters chains of thought for correctness and LLM-judged quality, (ii) conditionally routes the remaining reasoning paths to students based on each student's current state, and (iii) enables mutual collaboration among students via feature-level peer teaching. Across multiple reasoning datasets, QR-Distill outperforms both single-path and multi-path distillation baselines.
This survey systematically maps how LLMs support disaster management across mitigation, preparedness, response, and recovery. It introduces a unified taxonomy linking scenarios and tasks (classification, estimation, extraction, generation) to different model families, consolidates public datasets, and highlights open challenges—dataset construction, efficient deployment, robust generation, and unified evaluation—to guide future research and practice.
In this paper, we study the under-explored problem of inductive forecasting with limited training data, which requires models to generalize spatial-temporal dependencies learned on nodes with available training temporal data to nodes without any. To address this problem, we propose ST-FiT, which achieves superior performance without additional fine-tuning.
While significant progress has been made in understanding brain activity through functional connectivity (FC) graphs, challenges remain in effectively capturing and interpreting the complex, long-range dependencies and multiple pathways inherent in these graphs. In this work, we introduce BrainMAP, a novel framework that extracts multiple long-range activation pathways through adaptive sequentialization and pathway aggregation.
We propose a bot-detection model named BIC. BIC interacts and exchanges information across the text and graph modalities via a text-graph interaction module. BIC also contains a semantic consistency module that uses attention weights to derive inconsistency signals from tweets, enabling it to identify advanced bots.