Call for Papers: Interpretability and Explainability in GeoAI
Special issue spotlights interpretable and explainable GeoAI
Guest editors Fan Zhang (Peking University), Ziqi Li (Florida State University), Xiao Huang (Emory University), and Xiaoxiang Zhu (Technical University of Munich) invite submissions to the ISPRS Journal of Photogrammetry and Remote Sensing special issue on Interpretability and Explainability in GeoAI for Geospatial Science and Earth Observations.
The issue seeks work that advances transparent, trustworthy GeoAI, either by embedding spatial mechanisms directly into model design or by applying explainable AI (XAI) techniques that reveal how complex models behave across geospatial domains.
Focus areas
- Intrinsically interpretable GeoAI architectures and physics-informed learning
- Spatially explicit XAI methods that reveal localized processes or causal drivers
- Evaluation frameworks, benchmarks, and geovisual analytics for explainable GeoAI
- Ethical, fair, and uncertainty-aware approaches for GeoAI deployment
- Scalable workflows that extend XAI to large imagery, sensor, and foundation models
Submission details
- Manuscript deadline: 31 May 2026
- Submit via Editorial Manager under article type VSI: XGeoAI: https://www.editorialmanager.com/photo/default.aspx
- Author guidelines: see the ISPRS Journal of Photogrammetry and Remote Sensing guide for authors
- Inquiries: Dr. Fan Zhang (fanzhanggis@pku.edu.cn)
Researchers working on GeoAI interpretability and explainability across urban, environmental, and Earth observation applications are encouraged to submit original work or significantly extended versions of prior studies.