1 Introduction

During times of disaster, such as the devastating flood event in Germany’s Ahr valley in July 2021, isochrones can support emergency responders, humanitarian organizations, and disaster management teams. Openrouteservice, an open-source routing engine utilizing OpenStreetMap data, offers a flexible and powerful solution for generating isochrones and addressing logistical challenges in such critical situations.

This training aims to familiarize participants with the capabilities of openrouteservice’s isochrone generation within R and demonstrate how it can be leveraged to enhance disaster response planning. Through a series of code examples, we will explore the key features and functionalities of openrouteservice’s isochrone API. You will learn how easy it is to interact with openrouteservice from R and be able to integrate isochrone analysis directly into your own workflow.

By utilizing openrouteservice’s isochrone capabilities, participants will gain the ability to analyze and visualize travel time polygons, highlighting areas reachable within specific time thresholds. This information is invaluable for identifying high-priority locations, optimizing evacuation routes, and strategically positioning resources during emergencies. Through hands-on exercises, you will learn how to customize isochrone parameters, such as travel modes, durations, and destination points, to tailor the analysis to your specific needs.

1.1 Setup

Before we delve into the details of using openrouteservice, let’s ensure that we have all the necessary prerequisites in place. The following code chunk installs the required packages. The openrouteservice R library is currently not published on CRAN, but it can be conveniently installed with the install_github function of the devtools package. The same applies to the exactextractr library, which provides us with a set of methods for zonal statistics.

install.packages(c("devtools","dplyr","sf","terra",
                   "geojsonsf","units","tmap","ggplot2"))
devtools::install_github("GIScience/openrouteservice-r")
devtools::install_github("isciences/exactextract")

Additional libraries we will need for this training:

# main libraries
library(dplyr) # manipulating tabular data
library(sf) # geospatial classes and data structures - the Swiss army knife of GIS in R
library(openrouteservice) # interface to openrouteservice API
library(terra) # handle all sorts of raster data
library(exactextractr) # sophisticated, fast extraction from raster to vector; more: https://github.com/isciences/exactextract

# utils
library(geojsonsf) # sf classes to/from geojsons
library(units) # add units to numerics

# visualization
library(tmap) # visualize spatial data
library(ggplot2) # visualize (non-) spatial data

In order to use the public openrouteservice API, you need to register an account and create an API key.

Once you have successfully created your personal API key, copy it into the chunk below.

your_api_key <-
  "YOUR_API_KEY"

1.2 Context

During the Ahr valley flood event, many roads and bridges were damaged or rendered impassable, posing significant challenges for navigation and transportation. You may have already done the other exercise on disaster-aware routing in the Ahr valley during the flood disaster. There we used the avoid areas option of openrouteservice’s directions endpoint to calculate routes that do not intersect with given areas, in this case bridges and roads destroyed by the flood. Openrouteservice can combine avoid areas with isochrones as well.

What is an isochrone again?

An isochrone is a geographical representation of areas that can be reached within a specified travel time or distance from a given location.

1.3 Single isochrone and avoid areas

Let’s try it out. In the following code chunk we define a location in the city of Bad Neuenahr-Ahrweiler. We then pass the coordinates, accompanied by some options, to the isochrone service API. In contrast to the directions API, we only need one coordinate; it serves as the center from which the isochrone is calculated. We can add further options to customize our request. Many options are the same as for directions: there is a profile, i.e. the mode of transport (motorized, on foot, or by bicycle), or the preference for the shortest or fastest route. Specific to isochrones are, for example, range_type, range and interval. range_type specifies whether the range is measured in time or distance, range specifies the total range, and with interval several partial isochrones can be calculated in the same request. For this example, we use time as the unit and a total range of 20 minutes divided into intervals of 5 minutes.

1.4 Regular isochrone from a location in Bad Neuenahr-Ahrweiler

bad_neuenahr <- c(7.119166, 50.548979)
bad_neuenahr_sf <-
  st_as_sf(
    data.frame(longitude = bad_neuenahr[1], latitude = bad_neuenahr[2]),
    coords = c("longitude", "latitude"),
    crs = 4326
  )

isochrone <-
  ors_isochrones(
    bad_neuenahr,
    profile = "driving-car",
    range = 1200,
    interval = 300,
    output = "sf",
    api_key = your_api_key,
    range_type = "time"
  )
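For reference, the range_type parameter also supports distance-based isochrones. The following is just a sketch and is not needed for the rest of the training; with range_type = "distance", range and interval are interpreted as meters, here assuming a total range of 5 km in 1 km steps:

# sketch only: a distance-based isochrone, range and interval in meters
isochrone_dist <-
  ors_isochrones(
    bad_neuenahr,
    profile = "driving-car",
    range = 5000,      # total range of 5 km
    interval = 1000,   # partial isochrones every 1 km
    output = "sf",
    api_key = your_api_key,
    range_type = "distance"
  )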

Great, you requested your first isochrone. Before we visualize it, let’s request another one with avoid areas and compare both in one go.

The following code is familiar to you if you have already completed the directions exercise. We use data from the Copernicus Emergency Management Service, which provides information on destroyed or damaged roads and bridges in the Ahr valley.

We convert it to polygons, ready to be consumed by the openrouteservice API as avoid areas.

affected_roads_bridges <- st_read("data/affected_roads.gpkg", quiet=T)

affected_roads_bridges <- affected_roads_bridges |>
  st_transform(25832) |>
  st_buffer(2) |>
  st_transform(4326) |>
  st_union() |>
  st_as_sf() |>
  st_make_valid()


# pass the damaged infrastructure as avoid_polygons,
# converted from sf to the GeoJSON structure the API expects
options <- list(avoid_polygons =
                  affected_roads_bridges |> sf_geojson() |> rjson::fromJSON())

Then we are good to go to request a second isochrone, this time with avoid areas. At the end of the code chunk we combine both data frames to make visualization easier.

isochrone_avoid <-
  ors_isochrones(
    bad_neuenahr,
    profile = "driving-car",
    range = 1200,
    interval = 300,
    output = "sf",
    api_key = your_api_key,
    options = options,
    range_type = "time"
  )

isochrone$type <- "regular"
isochrone_avoid$type <- "avoid"

both_isochrones <- rbind(isochrone, isochrone_avoid)
both_isochrones <- both_isochrones |> arrange(desc(value))

The next chunk visualizes both isochrones. Can you spot the difference?

Look south of the center of the isochrones. According to the disaster-aware isochrone, this area is not reachable within 20 minutes. We also see some hints about the infrastructure: both isochrones extend quite far from south-east to north-west, due to the Autobahn A61.

# visualize both isochrones side by side, faceted by type

tmap_mode("view")
## tmap mode set to interactive viewing
tm_basemap("OpenStreetMap") +
  tm_shape(both_isochrones) +
  tm_polygons(
    "value",
    title = "Traveltime in seconds",
    palette = "magma",
    alpha = 1,
    lwd = NA,
    breaks = c(0, 200, 400, 600, 800, 1000, 1200)
  ) +
tm_facets("type") +
  tm_shape(bad_neuenahr_sf) +
  tm_symbols(size = 0.5)

2 Multiple isochrones - local physician catchments

In the next part of the exercise, we will utilize the openrouteservice isochrone API to generate catchment areas with a travel time of 30 minutes for each of the 20 physician locations in the Ahr valley. This analysis will involve creating regular isochrones as well as isochrones that avoid roads damaged by the flood disaster. To gain further insights, we will integrate population data and examine any changes in catchment areas resulting from the impact of the flood disaster. This comprehensive approach will provide valuable information for assessing healthcare accessibility and the potential challenges faced during the disaster.

For this purpose we need the following additional datasets:

physicians <- st_read("data/physicians.gpkg")
## Reading layer `amenity_hospital_healthcare' from data source
##   `/home/mreinmuth/giscience/FOSS4G/openrouteservice-workshop-r/data/physicians.gpkg'
##   using driver `GPKG'
## Simple feature collection with 20 features and 4 fields
## Geometry type: POINT
## Dimension:     XY
## Bounding box:  xmin: 6.579343 ymin: 50.43688 xmax: 7.280351 ymax: 50.57988
## Geodetic CRS:  WGS 84
population <- rast("data/wpop_ahrvalley.tif")
municipalities <- st_read("data/affected_municipalities.gpkg")
## Reading layer `municipalities' from data source
##   `/home/mreinmuth/giscience/FOSS4G/openrouteservice-workshop-r/data/affected_municipalities.gpkg'
##   using driver `GPKG'
## Simple feature collection with 20 features and 3 fields
## Geometry type: MULTIPOLYGON
## Dimension:     XY
## Bounding box:  xmin: 6.584629 ymin: 50.3361 xmax: 7.284653 ymax: 50.58244
## Geodetic CRS:  WGS 84

2.1 Get the isochrones

Similar to the directions exercise, we conduct multiple openrouteservice requests in a loop. We loop over all physicians, as they serve as the centers for our isochrone analysis. We request a regular isochrone and one with avoid polygons. Then we add some additional attributes from our physician dataset to the response isochrones and save them in an sf data frame. This may take a little while; you may grab a coffee in the meantime.

for (a in 1:nrow(physicians)) {
  tryCatch({
    # request a regular isochrone around location of physician
    isochrone_regular <- ors_isochrones(
      physicians[a,] |> st_coordinates() |> c(),
      api_key = your_api_key,
      output = "sf",
      profile = "driving-car",
      range = 1800,
      interval = 300,
      range_type = "time"
    )
    # request a disaster-aware isochrone that avoids the flood-damaged areas
    isochrone_avoid <- ors_isochrones(
      physicians[a, ] |> st_coordinates() |> c(),
      api_key = your_api_key,
      output = "sf",
      profile = "driving-car",
      range = 1800,
      interval = 300,
      options = options,
      range_type = "time"
    )

    # add attribute to differentiate the results later
    isochrone_regular$type <- "regular"
    isochrone_avoid$type <- "avoid"
    isochrone <- rbind(isochrone_regular, isochrone_avoid)

    # add id to relate place and isochrone later on
    isochrone$id <- a
    physicians[a,]$id <- a
    # also add the name of the respective physician to the isochrone
    isochrone$destination <- physicians[a,]$name

    # check for the first iteration to init the result sf data frame
    if (a == 1) {
      isochrones <- isochrone
      # if not just append to the result sf data frame
    } else {
      isochrones <- rbind(isochrones, isochrone)
    }
  },
  error = function(e)
  {
    # print info on physicians for which the isochrone request failed
    print(
      glue::glue(
        "Error on center: {physicians[a,]$name}; iteration: {a}/{nrow(physicians)}"
      )
    )

  })
}

Well done. Let’s look at the result on a map.

isochrone_union <-
  isochrones |> st_make_valid() |>  group_by(value, type) |>
  summarize(geometry = st_union(geometry)) |> st_make_valid()
## `summarise()` has grouped output by 'value'. You can override using the
## `.groups` argument.
tm_basemap("OpenStreetMap") +
  tm_shape(isochrone_union |>  arrange(desc(value))) +
  tm_polygons(
    "value",
    title = "Traveltime in seconds",
    palette = "magma",
    alpha = 1,
    lwd = NA,
    breaks = c(0, 300, 600, 900, 1200, 1500, 1800)
  ) +
  tm_facets("type") +
  tm_shape(physicians) +
  tm_symbols(size = 0.5)

Any pattern you spot right away?

2.2 Combine isochrones with population counts by municipality boundaries

The next step in our analysis involves overlaying and intersecting the isochrone intervals obtained from openrouteservice with municipality boundaries. This allows us to examine the impact of the flood damage on the catchments of physicians in the Ahr Valley area. By performing this intersection, we can identify the areas within each municipality that fall within specific travel time intervals, indicating the accessibility of physicians before and after the flood.

Once we have the intersected shapes, we aim to enrich them with population information from the WorldPop dataset. This additional data will provide insights into the population residing within each isochrone interval and municipality, helping us assess the potential impact of the flood on healthcare accessibility for different population groups.

To achieve this, we will utilize the exactextractr package, which offers zonal statistics methods. These methods enable us to calculate summary statistics within each zone or polygon, in this case the intersected shapes. By applying them, we can derive population-related information such as total population, population density, or other relevant demographic indicators within each zone.
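As a minimal illustration of the interface (a sketch, run on the data loaded above): exact_extract() takes a raster, a vector layer and a summary operation, and returns one value per polygon.

# sketch: total WorldPop population per municipality, one value per polygon
pop_per_mun <- exact_extract(population, municipalities, "sum", progress = FALSE)
head(pop_per_mun)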

# Iterate over each municipality
for (m in 1:nrow(municipalities)) {
  mun <- municipalities[m, ] # select single municipality

  # intersect isochrone with municipality boundary
  mun_iso <-
    isochrones |> st_make_valid() |>
    filter(st_intersects(geometry, mun, sparse = F)) |>
    st_intersection(mun) |>
    group_by(value, type) |>
    # union/dissolve by interval to create continuous areas for 300, 600 .. sec
    summarize(geometry = st_union(geometry)) |>
    ungroup() |>
    filter(
      st_geometry_type(geometry) == "POLYGON" |
        st_geometry_type(geometry) == "MULTIPOLYGON"
    )

  # calculate the total pop according to worldpop within the municipality
  mun_total_pop <-
    population |> exact_extract(mun, "sum", progress = T)
  mun_total_area <- mun |>
    st_transform(25832) |>
    st_area() |>
    set_units("km^2")

  # add attributes from municipality to isochrone
  mun_iso$name <- mun$name
  mun_iso$id <- mun$id

  # execute the zonal statistic -
  # extract pop on the municipality isochrone intervals
  mun_iso$population_cum <- population |>
    exact_extract(mun_iso, "sum", progress = F)

  # add area as well
  mun_iso$area_cum <- mun_iso |>
    st_transform(25832) |>
    st_area() |>
    set_units("km^2")

  # right now we have cumulative population counts only.
  # here we use window functions to
  # calculate absolute values per interval
  mun_iso <- mun_iso |>
    group_by(type) |>
    arrange(desc(value), .by_group = TRUE) |>
    mutate(
      area = area_cum - lead(area_cum, default = set_units(0, "km^2")),
      pop = population_cum - lead(population_cum, default = 0),
      area_rel = area / mun_total_area,
      pop_rel = pop / mun_total_pop
    ) |>
    ungroup()

  # check for first iteration to init, else append
  if (m == 1) {
    mun_isos <- mun_iso
    # if not just append to the result sf data frame
  } else {
    mun_isos <- rbind(mun_isos, mun_iso)
  }

}

3 Analysis and evaluation

By combining the isochrone intervals, municipality boundaries, and population data, we can gain a comprehensive understanding of the changes to physician catchments and healthcare accessibility after the flood. This information is crucial for identifying areas where population needs may have increased due to infrastructure damage, and aids in planning targeted relief efforts and healthcare resource allocation. But a table is hard to extract insights from, therefore we create a bar plot comparing the number of people within each isochrone interval for every municipality.

mun_sort <- mun_isos |> filter(value == 1200 & type == "regular")
mun_sort <- mun_sort |> arrange(pop_rel)
mun_isos$name <- factor(mun_isos$name, levels = c(mun_sort$name, "Münsch"))


mun_isos |> #filter(value < 9999) |>
  mutate(value = factor(value, levels = c(seq(300, 1800, 300)))) |>
  ggplot(aes(x = name, y = pop_rel * 100, fill = type)) +
  geom_bar(stat = "identity", position = position_dodge()) +
  scale_fill_grey() +
  theme_classic() +
  theme(axis.text.x = element_text(
    angle = 45,
    vjust = 1,
    hjust = 1
  )) +
  labs(x = "Affected Communities",
       y = "Relative population reached ") +
  facet_wrap(vars(value), nrow = 4)

What do you see in the plot? What may be the reason some communities do not have any bar for either regular or disaster aware isochrone?

4 Add-on: potential service

How many physicians potentially serve an area?

In addition to determining catchments using isochrones, we now have an exciting opportunity to explore the potential spatial availability of services within the Ahr Valley area. This use case of isochrones allows us to gain insights into the distribution of physicians and their accessibility across different travel time intervals.

In this next section, we will take a closer look at the spatial distribution of healthcare services by sampling hexagons within our study area. These hexagons will serve as spatial units for our analysis. By intersecting these hexagons with the previously obtained isochrones, we will be able to quantify the number of physicians within reach of each hexagon.

This analysis not only provides a quantitative measure of service availability in different parts of the Ahr Valley but also allows us to differentiate the isochrones based on travel time. By observing patterns across the different travel time intervals, we can gain valuable insights into how the accessibility of physicians varies in relation to the affected areas.

By exploring these patterns, we can identify areas with a higher concentration of physicians and areas that may require additional healthcare resources. This information can be crucial for post-disaster planning, resource allocation, and identifying potential gaps in service provision.

First we sample the hexagons and select those which intersect with the municipalities.

hexagons <-
  st_sf(geometry = st_make_grid(municipalities, square = F, n = c(50, 50))) |>
  filter(st_intersects(geometry, municipalities |> st_union(), sparse =
                         F))

Great. Next we count overlapping isochrones by interval. To showcase this, we use the first four intervals only.

# Create hexagon copies for each interval of interest. This is necessary, as we want a long table in the end.
hexagons300 <- hexagons
hexagons600 <- hexagons
hexagons900 <- hexagons
hexagons1200 <- hexagons

# then do the overlap and count for every copy
hexagons300$value <- 300
hexagons300$count <- lengths(st_intersects(
  hexagons,
  isochrones |> filter(value == 300 & type == "regular") |>
    st_make_valid()
))


hexagons600$value <- 600
hexagons600$count <- lengths(st_intersects(
  hexagons,
  isochrones |> filter(value == 600 & type == "regular") |>
    st_make_valid()
))


hexagons900$value <- 900
hexagons900$count <- lengths(st_intersects(
  hexagons,
  isochrones |> filter(value == 900 & type == "regular") |>
    st_make_valid()
))

hexagons1200$value <- 1200
hexagons1200$count <- lengths(st_intersects(
  hexagons,
  isochrones |> filter(value == 1200 & type == "regular") |>
    st_make_valid()
))

# combine to one long sf dataframe
hexagons_all <- rbind(hexagons300, hexagons600, hexagons900, hexagons1200)
tm_basemap("OpenStreetMap") +
  tm_shape(hexagons_all) +
  tm_polygons("count", palette = "magma", lwd = .1) +
  tm_facets("value")

Which pattern can you observe when you go from 5 min up to 20 min? Which regions might be over- / undersupplied?

Now it’s your turn! Can you repeat the workflow with the foot-walking profile?
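As a starting point, only the profile argument needs to change; a sketch for the single-isochrone example from section 1.4 (keep in mind that ranges on foot are much smaller, so you might also want to adjust the range):

# sketch: the same request on foot
isochrone_walk <-
  ors_isochrones(
    bad_neuenahr,
    profile = "foot-walking",
    range = 1200,
    interval = 300,
    output = "sf",
    api_key = your_api_key,
    range_type = "time"
  )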