# C Spatial notebook

## C.1 Transforming lat/lon data to the MP14 coordinate system

library(tidyverse)
library(sp)
library(sf)

We load the planning areas shapefile.

planning_areas <- st_read(here::here("data/MP14_PLNG_AREA_NO_SEA_PL.shp")) %>%
  filter(!(OBJECTID == 49 | OBJECTID == 18))
## Reading layer 'MP14_PLNG_AREA_NO_SEA_PL' from data source '/Users/barnabe/Documents/Research/Teaching/Computation Urban Design/lecturenotes/data/MP14_PLNG_AREA_NO_SEA_PL.shp' using driver 'ESRI Shapefile'
## Simple feature collection with 55 features and 12 fields
## geometry type:  MULTIPOLYGON
## dimension:      XY
## bbox:           xmin: 2667.538 ymin: 15748.72 xmax: 56396.44 ymax: 50256.33
## projected CRS:  SVY21

Note the projection the data uses: when you load the shapefile, the points are already expressed in this coordinate system. st_crs returns a CRS object (“Coordinate Reference System”).

st_crs(planning_areas)
## Coordinate Reference System:
##   User input: SVY21
##   wkt:
## PROJCRS["SVY21",
##     BASEGEOGCRS["SVY21[WGS84]",
##         DATUM["World Geodetic System 1984",
##             ELLIPSOID["WGS 84",6378137,298.257223563,
##                 LENGTHUNIT["metre",1]],
##             ID["EPSG",6326]],
##         PRIMEM["Greenwich",0,
##             ANGLEUNIT["Degree",0.0174532925199433]]],
##     CONVERSION["unnamed",
##         METHOD["Transverse Mercator",
##             ID["EPSG",9807]],
##         PARAMETER["Latitude of natural origin",1.36666666666667,
##             ANGLEUNIT["Degree",0.0174532925199433],
##             ID["EPSG",8801]],
##         PARAMETER["Longitude of natural origin",103.833333333333,
##             ANGLEUNIT["Degree",0.0174532925199433],
##             ID["EPSG",8802]],
##         PARAMETER["Scale factor at natural origin",1,
##             SCALEUNIT["unity",1],
##             ID["EPSG",8805]],
##         PARAMETER["False easting",28001.642,
##             LENGTHUNIT["metre",1],
##             ID["EPSG",8806]],
##         PARAMETER["False northing",38744.572,
##             LENGTHUNIT["metre",1],
##             ID["EPSG",8807]]],
##     CS[Cartesian,2],
##         AXIS["(E)",east,
##             ORDER[1],
##             LENGTHUNIT["metre",1,
##                 ID["EPSG",9001]]],
##         AXIS["(N)",north,
##             ORDER[2],
##             LENGTHUNIT["metre",1,
##                 ID["EPSG",9001]]]]

We create a dataframe with a single point expressed as lat/lon.

pts <- tibble(ID = c(1), lat = c(1.3143394), lon = c(103.7041647))

Set its coordinate system to use lon/lat (longitude comes first in sp/sf objects).

coordinates(pts) <- c("lon", "lat")

Note that pts is now a SpatialPointsDataFrame object (i.e., an “sp” object). We will do the following steps:

• Cast pts to an sf object first (st_as_sf).
• Set its coordinate system to the one that we implicitly used with lat/lon. That coordinate system is called “WGS84” (st_set_crs).
• Transform the point to the coordinate system used by the Singapore shapefile (st_transform).

pts_transform <- pts %>%
  st_as_sf() %>%
  st_set_crs("+proj=longlat +datum=WGS84") %>% # projection we start from
  st_transform(planning_areas %>% st_crs()) # projection we want to get to

Open pts_transform and notice that the coordinates of the point have changed: they are now comparable with the coordinates used in planning_areas. Note also that pts can hold an arbitrary number of lat/lon points; the code to transform it to an sf object in the coordinate system of planning_areas stays the same.
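As a sketch of that last point, the same pipeline handles several points at once. Here we skip the sp step and build the sf object directly with st_as_sf (EPSG 4326 is the code for WGS84; the extra lat/lon values below are made up for illustration, and planning_areas is the object loaded above):

```r
library(tidyverse)
library(sf)

# Several points at once; IDs 2 and 3 use invented coordinates
pts_many <- tibble(
  ID  = c(1, 2, 3),
  lat = c(1.3143394, 1.30, 1.35),
  lon = c(103.7041647, 103.80, 103.90)
)

pts_many_transform <- pts_many %>%
  st_as_sf(coords = c("lon", "lat"), crs = 4326) %>% # lon first, WGS84
  st_transform(planning_areas %>% st_crs())          # target CRS from the shapefile
```

Going through st_as_sf with an explicit crs argument avoids the intermediate sp object entirely.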

## C.2 OpenStreetMap (OSM) tips

Let’s assume that we have a list of addresses we want to know the location of:

adrs <- tibble(nb = c("85", "260"), street = c("Jalan Sultan", "Jalan Sultan"))

We can build a request to OSM using their Overpass API query builder in R, returning an sf object.

library(osmdata)
osm_sf <- getbb("Singapore") %>%
  opq() %>%
  add_osm_feature("addr:housenumber", adrs$nb[1]) %>%
  add_osm_feature("addr:street", adrs$street[1]) %>%
  osmdata_sf()

We only really care about the polygons and, for each polygon, its house number, street name, and geometry.

osm_sf_sub <- osm_sf$osm_polygons[, c("addr.housenumber", "addr.street")]

Sometimes though, our query returns nothing (or more precisely, returns an sf object with no polygons).

osm_sf_null <- getbb("Singapore") %>%
  opq() %>%
  add_osm_feature("addr:housenumber", adrs$nb[2]) %>%
  add_osm_feature("addr:street", adrs$street[2]) %>%
  osmdata_sf()
nrow(osm_sf_null$osm_polygons)
## [1] 0

Our goal is to run many queries to find the (lat/lon) locations of a list of addresses. We need to check whether the query returned something, otherwise subsetting the columns fails. This is the function we will use (you will see why we use dat %>% pluck(1) later).

query_osm <- function(dat) {
  q <- getbb("Singapore") %>%
    opq() %>%
    add_osm_feature("addr:housenumber", dat %>% pluck(1)) %>%
    add_osm_feature("addr:street", dat %>% pluck(2)) %>%
    osmdata_sf()
  if (nrow(q$osm_polygons) == 0) {
    return(NULL)
  } else {
    return(q$osm_polygons[, c("addr.housenumber", "addr.street")])
  }
}

Now we’ll use purrr to run the request many times with map. map needs a list of addresses, each address being a list itself. Fortunately, purrr is well suited to such manipulations and provides a neat transpose function to go from the adrs tibble to a list of lists. This is why we use pluck in query_osm, by the way.

adrs_list <- transpose(
  list(adrs$nb, adrs$street)
)
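To see why query_osm reads its arguments with pluck, it helps to inspect what transpose produces: a list with one element per address, each element itself a two-element list holding the house number and the street. A small self-contained sketch (plain vectors standing in for the tibble columns):

```r
library(purrr)

adrs_nb     <- c("85", "260")
adrs_street <- c("Jalan Sultan", "Jalan Sultan")

# transpose turns two parallel vectors into one list per address
adrs_list <- transpose(list(adrs_nb, adrs_street))

first <- adrs_list[[1]]
first %>% pluck(1) # "85"
first %>% pluck(2) # "Jalan Sultan"
```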

Now that we have a list, we can call map.

queries <- map(adrs_list, query_osm)

Open the queries object. You will observe that it is a list of two elements: one query that returned something (for 85 Jalan Sultan) and one that didn’t (for 260 Jalan Sultan). sf objects are collections: you can accumulate many geometries together in the same sf object. For instance, to union two different sf objects, you can use rbind, as you would for dataframes (or bind_rows if you want the dplyr equivalent). Once again, purrr is helpful: we can use its reduce function to bind all the objects our queries returned.

queries_sf <- queries %>%
  reduce(rbind)
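This reduction also handles the NULL results from empty queries, because binding a NULL leaves the other argument untouched: rbind(NULL, df) is simply df. A minimal sketch with plain dataframes standing in for query results (the addresses are invented):

```r
library(purrr)

# Stand-ins for query results: two successes and one empty query (NULL)
res <- list(
  data.frame(addr.housenumber = "85", addr.street = "Jalan Sultan"),
  NULL,
  data.frame(addr.housenumber = "12", addr.street = "Some Road")
)

combined <- res %>% reduce(rbind)
nrow(combined) # 2
```

The NULL entries silently drop out, so no extra filtering step is needed before the reduce.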

This works for an arbitrarily long list of addresses, but note that OSM might rate-limit you, and this can take some time. My advice is to split your queries and send them progressively; e.g., if you have a big list of addresses, first run

adrs_list <- transpose(
  list(adrs$nb[1:100], adrs$street[1:100])
)
queries <- map(adrs_list, query_osm)
queries_sf <- queries %>%
  reduce(rbind)

Then run

adrs_list <- transpose(
  list(adrs$nb[101:200], adrs$street[101:200])
)
queries <- map(adrs_list, query_osm)
queries_sf <- queries_sf %>%
  rbind(queries %>% reduce(rbind))