A Review of Location Encoding for GeoAI: Methods and Applications

11/07/2021
by Gengchen Mai, et al.

A common need for artificial intelligence models in the broader geosciences is to represent and encode various types of spatial data, such as points (e.g., points of interest), polylines (e.g., trajectories), polygons (e.g., administrative regions), graphs (e.g., transportation networks), or rasters (e.g., remote sensing images), in a hidden embedding space so that they can be readily incorporated into deep learning models. One fundamental step is to encode a single point location into an embedding space such that the embedding is learning-friendly for downstream machine learning models such as support vector machines and neural networks. We call this process location encoding. However, there is no systematic review of the concept of location encoding, its potential applications, and the key challenges that need to be addressed. This paper aims to fill this gap. We first provide a formal definition of location encoding and discuss its necessity for GeoAI research from a machine learning perspective. Next, we provide a comprehensive survey and discussion of the current landscape of location encoding research. We classify location encoding models into different categories based on their inputs and encoding methods, and compare them according to whether they are parametric, multi-scale, distance preserving, and direction aware. We show that existing location encoding models can be unified under a shared formulation framework. We also discuss the application of location encoding to different types of spatial data. Finally, we point out several challenges in location encoding research that need to be addressed in the future.
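To illustrate the idea of encoding a single point location into a learning-friendly embedding, below is a minimal sketch of a multi-scale sinusoidal location encoder in the spirit of the grid-cell-style encoders this survey covers (e.g., Space2Vec). The scale schedule, function name, and parameters are illustrative assumptions for this sketch, not the authors' exact formulation.

```python
# Minimal sketch of a multi-scale sinusoidal location encoder.
# Assumed, illustrative design: geometrically spaced scales and sin/cos
# features per coordinate; not the paper's specific model.
import numpy as np

def location_encode(coords, num_scales=16, min_radius=1.0, max_radius=10_000.0):
    """Encode 2D point locations of shape (n, 2) into multi-scale features.

    coords: array of (x, y) or projected (lon, lat) values.
    Returns an array of shape (n, num_scales * 4): sin/cos of each coordinate
    at geometrically spaced scales, ready for a downstream neural network.
    """
    coords = np.asarray(coords, dtype=np.float64)
    # Geometrically spaced wavelengths from min_radius to max_radius.
    scales = min_radius * (max_radius / min_radius) ** (
        np.arange(num_scales) / max(num_scales - 1, 1)
    )
    feats = []
    for s in scales:
        feats.append(np.sin(coords / s))  # (n, 2)
        feats.append(np.cos(coords / s))  # (n, 2)
    return np.concatenate(feats, axis=1)  # (n, num_scales * 4)

# Example: encode two points of interest given in projected meters.
if __name__ == "__main__":
    pts = np.array([[1200.0, 3400.0], [56000.0, 78000.0]])
    emb = location_encode(pts)
    print(emb.shape)  # (2, 64)
```

Such multi-scale sinusoidal features are one way to make nearby locations have similar embeddings while still distinguishing points at coarser scales; parametric alternatives (e.g., learned feed-forward encoders) are compared in the survey itself.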
