
Mapping and classification of urban green spaces with object based image analysis and LiDAR data fusion


Title: Mapping and classification of urban green spaces with object based image analysis and LiDAR data fusion
Author(s): Männistö, Sameli
Contributor: University of Helsinki, Faculty of Biological and Environmental Sciences
Discipline: Environmental Ecology
Language: English
Acceptance year: 2020
Abstract: As a result of urbanization and climate change, cities are facing various ecological and social challenges. For instance, flooding, pollution, the urban heat island effect, decreased biodiversity, and the mental stress of city dwellers are well-recognized challenges of urban spaces. Urban green spaces are increasingly important in mitigating the adverse effects of climate change, such as flooding due to precipitation extremes, while also providing various other ecosystem services. To ensure sustainable land use and the provision of ecosystem services, it is essential to develop methods for effective urban green space mapping. As a result, there is a growing demand for micro-scale land cover maps of urban areas. Emerging technologies, such as object-based image analysis (OBIA) and light detection and ranging (LiDAR), offer promising possibilities for efficient mapping of green spaces in the urban environment. The aim of this thesis was to develop a semi-automatic method for urban green space mapping and classification. The other major task was to study the added benefits of LiDAR technology. Three research sites of varying degrees of urbanization were chosen from the city of Helsinki: the city core in Itä-Pasila, an apartment area with blocks of flats in Pihlajamäki, and a small-house residential area in Veräjämäki. The classification process was executed with the image analysis program Definiens Developer. The main input data for classification were LiDAR data and very high resolution (VHR) aerial images. In the classification process, the normalized difference vegetation index (NDVI) was used to detect live vegetation; assignment to different classes was based on height information derived from the LiDAR data. Finally, an accuracy assessment was performed on the classified images to determine how well the classification process accomplished the task. The accuracy was assessed by comparing the classified images to the reference images of each catchment.
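The two-step rule described above, NDVI to flag live vegetation and LiDAR-derived height to assign vegetation classes, can be sketched in Python. The abstract gives no threshold values, so every number below is an illustrative assumption rather than a value from the thesis:

```python
import numpy as np

def classify_green_space(nir, red, height, ndvi_threshold=0.2):
    """Flag live vegetation with NDVI, then split it into height
    classes using LiDAR-derived canopy height.

    NDVI = (NIR - Red) / (NIR + Red). All thresholds here are
    illustrative assumptions, not values from the thesis.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    height = np.asarray(height, dtype=float)

    denom = nir + red
    # Guard against division by zero on dark pixels.
    ndvi = np.where(denom > 0, (nir - red) / np.maximum(denom, 1e-9), 0.0)

    classes = np.zeros(ndvi.shape, dtype=np.uint8)  # 0 = non-vegetation
    veg = ndvi > ndvi_threshold
    classes[veg & (height < 0.5)] = 1                     # grass / low vegetation
    classes[veg & (height >= 0.5) & (height < 2.0)] = 2   # shrubs
    classes[veg & (height >= 2.0) & (height < 10.0)] = 3  # low trees
    classes[veg & (height >= 10.0)] = 4                   # tall trees
    return ndvi, classes
```

In the thesis workflow this rule-based step was performed inside Definiens Developer on image objects rather than on raw pixels; the sketch operates on pixel arrays only to make the classification logic explicit.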
The results demonstrate the potential of OBIA for extracting urban green spaces. The downtown area of high land use intensity (Itä-Pasila) had the smallest green space coverage (31%), consisting mostly of urban parks and trees planted along the streets. The small-house area of low land use intensity (Veräjämäki) had the highest proportion of green spaces (65%), consisting of forests and gardens. In the area of intermediate land use intensity with blocks of flats (Pihlajamäki), a little under half of the coverage is green space. The highest accuracy in detecting green spaces was reached in the low land use intensity area (92%), followed by the high and intermediate land use intensity areas with 82% and 78%, respectively. The most common problem for classification was shaded areas, which reflect only limited spectral information, making calculation of the NDVI impossible. I found object-based image analysis together with LiDAR data fusion to provide a good means for urban green space mapping and classification. The presented method allowed quick data acquisition with good overall accuracy, while avoiding the problems previously associated with more traditional pixel-based methods. The addition of LiDAR data made it possible to extract vegetation height and use it in the classification process to divide vegetation into four different classes.
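The accuracy assessment, comparing each classified image to a reference image of the same catchment, can be summarized with a confusion matrix and an overall accuracy figure. A minimal sketch follows; the function name and the integer class encoding are assumptions, not part of the thesis:

```python
import numpy as np

def accuracy_report(classified, reference, n_classes):
    """Confusion matrix (rows: reference, cols: classified) and
    overall accuracy for two label images of the same shape."""
    classified = np.asarray(classified).ravel()
    reference = np.asarray(reference).ravel()
    cm = np.zeros((n_classes, n_classes), dtype=int)
    # Unbuffered in-place add so repeated (row, col) pairs all count.
    np.add.at(cm, (reference, classified), 1)
    overall = cm.trace() / cm.sum()
    return cm, overall
```

Overall accuracy here is the diagonal sum divided by the total pixel count, which is the usual definition behind percentages such as the 92%, 82%, and 78% reported above.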
Keyword(s): urban green space; remote sensing; object-based image analysis; OBIA; LiDAR; ecosystem services