A Multiple Subspaces-Based Model: Interpreting Urban Functional Regions with Big Geospatial Data

Bibliographic Details
Main Authors: Jiawei Zhu, Chao Tao, Xin Lin, Jian Peng, Haozhe Huang, Li Chen, Qiongjie Wang
Format: Article
Language: English
Published: MDPI AG 2021-02-01
Series: ISPRS International Journal of Geo-Information
Online Access: https://www.mdpi.com/2220-9964/10/2/66
Description
Summary: Analyzing the urban spatial structure of a city is a core topic in urban geographical information science that can assist urban planning, site selection, location recommendation, and other applications. Among previous studies, comprehending the functionality of places is a central topic and corresponds to understanding how people use places. With the help of big geospatial data, which contain abundant information about human mobility and activity, we propose a novel multiple subspaces-based model to interpret urban functional regions. The model rests on the assumption that the temporal activity patterns of places lie in a high-dimensional space and can be represented by a union of low-dimensional subspaces. These subspaces are obtained by finding sparse representations with the data science method known as sparse subspace clustering (SSC). The paper details how to use this method to detect functional regions. With these subspaces, we can detect the functionality of urban regions in a designated study area and further explore the characteristics of functional regions. We conducted experiments on real data from Shanghai. The experimental results, and the fact that the proposed model outperforms the single subspace-based method, demonstrate its efficacy and feasibility.
ISSN: 2220-9964
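
As a rough illustration of the SSC step described in the summary, the sketch below clusters synthetic temporal activity profiles: each profile is expressed as a sparse combination of the other profiles via a Lasso solver, and spectral clustering is then applied to the resulting affinity matrix. This is a minimal sketch of the standard SSC recipe, not the authors' implementation; the function name `ssc_functional_regions`, the hyperparameters (`alpha`, `n_clusters`), and the synthetic Poisson counts are all illustrative assumptions.

```python
# Minimal sparse subspace clustering (SSC) sketch for temporal activity profiles.
# Assumption: each row of `activity` is one region's hourly activity vector
# (e.g., aggregated trip or check-in counts); data and settings are synthetic.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def ssc_functional_regions(activity, n_clusters=4, alpha=0.05):
    """Cluster regions by sparse self-representation + spectral clustering."""
    n = activity.shape[0]
    C = np.zeros((n, n))                        # sparse self-representation coefficients
    for i in range(n):
        others = np.delete(np.arange(n), i)     # exclude region i from its own dictionary
        D = activity[others].T                  # dictionary: columns are other regions' profiles
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
        lasso.fit(D, activity[i])               # x_i ≈ D @ c_i with sparse c_i
        C[i, others] = lasso.coef_
    W = np.abs(C) + np.abs(C).T                 # symmetric, non-negative affinity matrix
    return SpectralClustering(n_clusters=n_clusters,
                              affinity="precomputed",
                              random_state=0).fit_predict(W)

# Toy usage: 60 regions x 24 hourly bins of synthetic activity counts.
rng = np.random.default_rng(0)
activity = rng.poisson(5.0, size=(60, 24)).astype(float)
labels = ssc_functional_regions(activity, n_clusters=4)
print(labels[:10])   # cluster label per region; labels would index functional region types
```

In this sketch, the learned subspaces are implicit in the sparse coefficient matrix C: points lying in the same low-dimensional subspace tend to represent one another, so the affinity built from C separates the subspaces, which the paper interprets as urban functional regions.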