Document Type
Article
Journal/Book Title/Conference
13th National Conference on Earthquake Engineering, Portland, OR
Publisher
Earthquake Engineering Research Institute
Location
Portland, OR
Publication Date
3-4-2026
Journal Article Version
Version of Record
First Page
1
Last Page
5
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Abstract
This paper presents a knowledge distillation (KD)-based approach to reduce the computational cost and improve the generalizability of deep learning models for quality assessment of seismic waveforms. Using two different waveform datasets, teacher models from two model families, time-series-based and image-based, are distilled into lightweight student models. Student models from both families are trained on a development dataset and evaluated both on the held-out testing portion of that dataset and on an entirely separate external testing dataset. Results indicate that KD can yield student models with substantially reduced size (5%-10% of the teacher model's size) and latency while preserving, and in some cases improving upon, the predictive performance of the teacher models, particularly on unseen data in the external testing set.
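For context, knowledge distillation is commonly formulated as in Hinton et al. (2015): the student is trained on a weighted combination of the usual hard-label loss and a divergence between the temperature-softened output distributions of teacher and student. The Python sketch below illustrates only that generic objective; the temperature T, weight alpha, and function name are illustrative assumptions, not hyperparameters or code from this paper.

# Minimal sketch of a generic knowledge distillation loss
# (Hinton et al., 2015). Illustrative only: T, alpha, and the
# function name are assumptions, not taken from this paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Hard-label term: ordinary cross-entropy against ground truth.
    hard = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between the temperature-softened
    # teacher and student distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1.0 - alpha) * soft

Because the soft-label term transfers the teacher's full output distribution rather than a single label, a much smaller student can often recover most of the teacher's behavior, which is the premise of the size and latency reductions reported above.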
Recommended Citation
Namin A., Jalaeifar F., Kottke A., Zaker Esteghamati M. When the Student Surpasses the Teacher: Better Generalization in AI-based Seismic Waveform Quality Assessment Using Knowledge Distillation. Proceedings of the 13th National Conference on Earthquake Engineering, Earthquake Engineering Research Institute, Portland, OR. 2026.
Comments
This article will be re-released by EERI in July 2026 as part of the full conference proceedings. Published with permission.