Disentangling the hourly dynamics of mixed urban function: A multimodal fusion perspective using dynamic graphs
Impact Factor:14.8
DOI number:10.1016/j.inffus.2024.102832
Journal:Information Fusion
Abstract:Traditional studies of urban functions often rely on static classifications, failing to capture the inherently dynamic nature of urban environments. This paper introduces the Spatio-temporal Graph for Dynamic Urban Functions (STG4DUF), a novel framework that combines multimodal data fusion and self-supervised learning to uncover dynamic urban functionalities without ground truth labels. The framework features a dual-branch encoder and dynamic graph architecture that integrates diverse urban data sources: street view imagery, building vector data, Points of Interest (POI), and hourly mobile phone-based human trajectory data. Through a self-supervised learning approach combining dynamic graph neural networks with Spatio-Temporal Fuzzy C-Means (STFCM), STG4DUF extracts parcel-level functional patterns and their temporal dynamics. Using Shenzhen as a case study, we validate the framework through static proxy tasks and demonstrate its effectiveness in capturing multi-scale urban dynamics. Our analysis, based on pyramid functional-semantic interpretation, uncovers intricate functional topics related to human activity, livability, social services, and industrial development, along with their temporal transitions and mixing patterns. These insights provide valuable guidance for evidence-based smart city planning and policy-making.
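Background note: the abstract's clustering stage builds on Spatio-Temporal Fuzzy C-Means (STFCM), an extension of standard fuzzy c-means to soft, hourly functional memberships. The sketch below shows only the standard fuzzy c-means baseline in Python; the parcel-by-hour feature layout, cluster count, and function names are illustrative assumptions and do not reproduce the paper's STFCM or dynamic graph components.

```python
# Minimal fuzzy c-means sketch (baseline for the paper's STFCM component).
# Feature construction and parameters are illustrative assumptions only.
import numpy as np

def fuzzy_c_means(X, n_clusters=6, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Soft-cluster rows of X with fuzzifier m > 1; returns centers and memberships."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random membership matrix whose rows sum to 1.
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        Um = U ** m
        # Cluster centers: membership-weighted means of the samples.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances from every sample to every center.
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-10)  # guard against division by zero
        # Membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        inv = dist ** (-2.0 / (m - 1))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

if __name__ == "__main__":
    # Hypothetical parcel features: 500 parcels x 24 hourly activity values.
    X = np.random.default_rng(1).random((500, 24))
    centers, U = fuzzy_c_means(X, n_clusters=6)
    print(U.shape)  # (500, 6): soft functional memberships per parcel
```

The soft memberships are what allow a parcel to belong partly to several functional topics at once, which is the sense in which the paper studies mixed urban function rather than a single static label per parcel.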
Co-authors:Guanzhou Chen, Wei Tu, Xiaole Shen, Tianhong Zhao, Jiashi Chen, Qingquan Li
Indexed by:Journal paper
Document Type:J
Volume:117
Translation:no
Date of Publication:2025-03-01
Included Journals:SCI
Link to published article:https://www-sciencedirect-com-s.vpn.whu.edu.cn/science/article/abs/pii/S1566253524006109