Guanzhou Chen



Disentangling the hourly dynamics of mixed urban function: A multimodal fusion perspective using dynamic graphs

Posted: 2025-03-18


Impact factor: 14.8

DOI: 10.1016/j.inffus.2024.102832

Journal: Information Fusion

Abstract: Traditional studies of urban functions often rely on static classifications, failing to capture the inherently dynamic nature of urban environments. This paper introduces the Spatio-temporal Graph for Dynamic Urban Functions (STG4DUF), a novel framework that combines multimodal data fusion and self-supervised learning to uncover dynamic urban functionalities without ground truth labels. The framework features a dual-branch encoder and dynamic graph architecture that integrates diverse urban data sources: street view imagery, building vector data, Points of Interest (POI), and hourly mobile phone-based human trajectory data. Through a self-supervised learning approach combining dynamic graph neural networks with Spatio-Temporal Fuzzy C-Means (STFCM), STG4DUF extracts parcel-level functional patterns and their temporal dynamics. Using Shenzhen as a case study, we validate the framework through static proxy tasks and demonstrate its effectiveness in capturing multi-scale urban dynamics. Our analysis, based on pyramid functional-semantic interpretation, uncovers intricate functional topics related to human activity, livability, social services, and industrial development, along with their temporal transitions and mixing patterns. These insights provide valuable guidance for evidence-based smart city planning and policy-making.
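The soft-clustering step named in the abstract (Spatio-Temporal Fuzzy C-Means over fused parcel embeddings) can be pictured with a minimal sketch. The snippet below implements plain Fuzzy C-Means in NumPy as an illustrative stand-in only; the paper's spatio-temporal constraints and dynamic-graph encoder are not reproduced, and all array shapes, parameter names, and the toy data are assumptions made for this example.

```python
# Illustrative sketch only: plain Fuzzy C-Means over parcel embeddings,
# to show the kind of soft functional assignment STFCM produces.
# Shapes and parameters are hypothetical, not taken from the paper.
import numpy as np


def fuzzy_c_means(X, n_clusters=8, m=2.0, n_iter=50, eps=1e-9, seed=0):
    """Soft-cluster embeddings X of shape (n_parcels, dim).

    Returns (memberships, centers); membership rows sum to 1.
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    # Random soft assignment to start, normalised per parcel.
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # Cluster centers as membership-weighted means.
        W = U ** m                                         # (n, k)
        centers = (W.T @ X) / (W.sum(axis=0)[:, None] + eps)
        # Distance from every parcel to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
        # Standard FCM membership update, then row-normalise.
        U = 1.0 / (d ** (2.0 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return U, centers


if __name__ == "__main__":
    # Toy input: one hour's fused parcel embeddings (hypothetical).
    X = np.random.default_rng(1).normal(size=(500, 32))
    memberships, _ = fuzzy_c_means(X, n_clusters=8)
    print(memberships.shape, memberships.sum(axis=1)[:3])
```

In this reading, each parcel's membership vector is its hour-specific functional mixture, and repeating the clustering per hour would expose the temporal transitions the abstract describes.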

Co-authors: Guanzhou Chen, Wei Tu, Xiaole Shen, Tianhong Zhao, Jiashi Chen, Qingquan Li

Paper type: Journal article

Document type: J

Volume: 117

Translated work: No

Publication date: 2025-03-01

Indexed by: SCI

Journal link: https://www-sciencedirect-com-s.vpn.whu.edu.cn/science/article/abs/pii/S1566253524006109