Origin and development of a Snowflake Map

Reproducible code demonstrating the evolution of a recent data viz of CONUS snow cover
The result
It's been a snowy winter, so let's make a snow cover map! This blog outlines the process of how I made a snowflake hex map of the contiguous U.S. The final product shows whiter snowflakes where snow cover was higher, which was generally in the northern states and along the main mountain ranges. I used a few interesting tricks and packages (e.g., ggimage, magick) to make this map and overlay the snowflake shape. Follow the steps below to see how I made it.

Final map released on Twitter.
The concept
This is a creation from the latest data visualization "Idea Blitz" with the USGS VizLab, where we experiment with new ideas, methods, or data.
My goal for the Idea Blitz was to create a map that showed average snow cover for the United States in a new and interesting way. My idea was that it would be cool to show a map of snow cover using snowflake shapes. Hexbin maps are fairly popular right now (example), and I imagined a spin on the classic choropleth hexmap that showed mean snow coverage with a hexagonal snowflake-shaped mask over each state.

Concept sketch of the map.
The data
On ScienceBase I found a publicly-available dataset that had exactly what I was looking for: "Contiguous U.S. annual snow persistence and trends from 2001-2020." (Thanks John Hammond!). These rasters provide the Snow Cover Index (SCI) from 2001 through 2020 for the contiguous United States using MODIS/Terra Snow Cover 8-day L3 Global 500m Grid data. The SCI values are the fraction of time that snow is present on the ground from January 1 through July 3 for each year in the time series.
One note: at this point, I'm limited by this dataset and the fact that it doesn't include Alaska or Hawaii in its extent. My goal is to use the MODIS data directly in a future rendition so that I can include all of the U.S.
The data and code are open and reproducible so that you can try it out for yourself! Tweet @USGS_DataSci with your spin-off ideas, we'd love to see what you come up with!
Step 1: Download the snow cover index data
To begin, download the geotiff raster files for SCI from 2001 through 2020 using the sbtools package. After reading in the data files as a raster stack, calculate the mean across the 20 years for each raster cell, and plot it to see what the data look like.
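If you want to confirm what the ScienceBase item contains before running the download loop, sbtools can list the item's files. This is an optional side check, not part of the original workflow; it assumes the same ScienceBase item ID used in the download code below.
library(sbtools)
# List the files attached to the ScienceBase item (same ID as in the download step)
sci_item_files <- sbtools::item_list_files(sb_id = "5f63790982ce38aaa23a3930")
# Peek at the file names to confirm the per-year SCI geotiffs are there
head(sci_item_files$fname)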
library(sbtools) # used to download ScienceBase data
library(tidyverse) # used throughout
library(terra)
# Set up your global input folder name
input_folder_name <- "static/2023_snowtiles_demo"
# Download the snow cover index (SCI) rasters
for(yy in 2001:2020){
  # if files already exist, skip the download
  file_in <- sprintf("%s/MOD10A2_SCI_%s.tif", input_folder_name, yy)
  if(!file.exists(file_in)){ # if files do not exist, download
    sbtools::item_file_download(sb_id = "5f63790982ce38aaa23a3930",
                                names = sprintf("MOD10A2_SCI_%s.tif", yy),
                                destinations = file_in,
                                overwrite_file = F)
  }
}
# Read in the SCI geotiff files and convert to a raster stack
sci_files <- list.files(sprintf("%s/", input_folder_name),
                        pattern = "MOD10A2_SCI",
                        full.names = T)
sci_stack <- terra::rast(sci_files)
# Calculate the 20-year mean snow cover index (SCI) for each raster cell
sci_20yr_mean <- mean(sci_stack)
# Plot the 20-year mean snow cover index data as-is
terra::plot(sci_20yr_mean)

Map of 20-year-mean Snow Cover Index as downloaded from ScienceBase.
Step 2: Summarize the SCI values by state
My original concept was to make a state-level hexbin map. I started by spatially averaging the snow cover data to each state's boundary. To do this, import the CONUS state boundaries and re-project them to match the raster's projection. Then, extract the 20-year mean SCI values from the raster to the state polygon boundaries and summarize by state. Then let's plot the results as a choropleth to see what it looks like. I'm using a dark plot background to make the snowier whites show up with high contrast against the deep blue background.
library(spData)
library(sf)
library(scico)
# Download US State boundaries as an sf object
states_shp <- spData::us_states
# Reproject the sf object to match the projection of the raster
states_proj <- states_shp |> sf::st_transform(crs(sci_20yr_mean))
# Clip the raster to the state boundaries to speed up processing
sci_stack_clip <- terra::crop(x = sci_20yr_mean, y = vect(states_proj), mask = TRUE)
# Extract the SCI values to each state
extract_SCI_states <- terra::extract(x = sci_stack_clip, vect(states_proj))
# Calculate mean SCI by state
SCI_by_state <- as.data.frame(extract_SCI_states) |>
  group_by(ID) |>
  summarise(mean_20yr = mean(mean, na.rm = T))
# Left-join calculated 20-year SCI means to the US States sf object
SCI_state_level <- states_proj |>
  mutate(ID = row_number()) |>
  left_join(SCI_by_state, by = "ID")
# Set up a theme for all maps to use, rather than the default
theme_set(theme_void() +
            theme(plot.background = element_rect(fill = "#0A1927"),
                  legend.text = element_text(color = "#ffffff"),
                  legend.title = element_text(color = "#ffffff")))
# #0A1927 comes from
# scico::scico(n = 1, palette = "oslo", direction = 1, begin = 0.1, end = 0.1)
# Plot choropleth
ggplot() +
  geom_sf(data = SCI_state_level, aes(fill = mean_20yr)) +
  scico::scale_fill_scico(palette = "oslo", direction = 1, begin = 0.25)

Map of 20-year-mean Snow Cover Index spatially summarized by state.
Step 3: Make a state-level hexmap choropleth
To remind you, my goal was to use a snowflake shape over the choropleth, so I wanted the states to be hexagonal in shape to mirror the shape of a snowflake. To convert the states into hexagons, download a pre-made dataset of the hexagons in a geojson file format. Import this as an sf object, ready for plotting, and join it with the mean SCI data to create the choropleth.
library(geojsonsf)
library(broom)
library(rgeos)
# Download the hexagon boundaries in geojson format here: https://team.carto.com/u/andrew/tables/andrew.us_states_hexgrid/public/map.
# Load this file.
spdf_hex <- geojsonsf::geojson_sf(sprintf("%s/us_states_hexgrid.geojson",
                                          input_folder_name)) |>
  # drop the verbose ids
  mutate(NAME = gsub(" \\(United States\\)", "", google_name)) |>
  # filter out Alaska and Hawaii
  filter(! NAME %in% c("Alaska", "Hawaii"))
# Left-join the mean SCI data
spdf_hex_sci <- spdf_hex |>
  left_join(as.data.frame(SCI_state_level) |> select(NAME, mean_20yr), by = "NAME")
# Now plot this state-level map just as described before:
ggplot() +
  geom_sf(data = spdf_hex_sci, aes(fill = mean_20yr)) +
  scale_fill_scico(palette = "oslo", direction = 1, begin = 0.25)

Map of 20-year-mean Snow Cover Index spatially summarized by state, with each state shaped like a hexagon.
Step 4: Make a snowflake state-level hexmap choropleth
Finally, the first draft of my hexagon choropleth is ready for refinement, including the addition of the snowflake mask that I created in Illustrator. This png has a transparent background and will allow the color of the state to show through.

Vectorized snowflake mask, ready for all of your mapping uses!
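Before overlaying it, it can be worth double-checking that the png really does have a transparent background, since an opaque white background would hide the state colors underneath. A minimal check with magick (an optional aside, not part of the original workflow) might look like this, reusing the snowMask.png path from the code below:
library(magick)
# Read the snowflake mask and inspect its metadata
snowflake_png <- magick::image_read(sprintf("%s/snowMask.png", input_folder_name))
# 'matte' should be TRUE, indicating the png carries an alpha (transparency) channel
magick::image_info(snowflake_png)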
To make this rendition, use the packages ggimage and cowplot to overlay the png and create a 16×9 composition, respectively. There are two tricks to using the ggimage overlay approach here. First, note that it's necessary to use the st_coordinates() function to extract the coordinates in a form that ggimage can read. Second, the aspect ratio and size arguments of the geom_image function (asp and size, respectively) have to be toggled until they match the map's hexagon shape and size. Lastly, I used a free typeface from Google Fonts that felt like it had the winter holiday style I was going for.
library(cowplot)
library(showtext)
library(sysfonts)
library(ggimage)
# First make the main choropleth with snowflakes laid on top
hexmap <- ggplot() +
  # main hexmap
  geom_sf(data = spdf_hex_sci,
          aes(fill = mean_20yr),
          color = "#25497C") +
  # snowflake overlay with centroids in "x" and "y" format for ggimage to read
  ggimage::geom_image(aes(image = sprintf("%s/snowMask.png", input_folder_name),
                          x = st_coordinates(st_centroid(spdf_hex_sci))[,1],
                          y = st_coordinates(st_centroid(spdf_hex_sci))[,2]),
                      asp = 1.60, size = 0.094) +
  scale_fill_scico(palette = "oslo", direction = 1, begin = 0.20, end = 1) +
  theme(legend.position = "none")
# Load some custom fonts and set some custom settings
font_legend <- 'Pirata One'
sysfonts::font_add_google('Pirata One')
showtext::showtext_opts(dpi = 300, regular.wt = 200, bold.wt = 700)
showtext::showtext_auto(enable = TRUE)
# For now, just using simple white for the font color
text_color <- "#ffffff"
# background filled with dark blue, extracted from scico but darker than used in the map's color ramp
canvas <- grid::rectGrob(
  x = 0, y = 0,
  width = 16, height = 9,
  gp = grid::gpar(fill = "#0A1927", alpha = 1, col = "#0A1927")
  # #0A1927 comes from
  # scico::scico(n = 1, palette = "oslo", direction = 1, begin = 0.1, end = 0.1)
)
# Make the composition using cowplot
ggdraw(ylim = c(0,1),
       xlim = c(0,1)) +
  # a background
  draw_grob(canvas,
            x = 0, y = 1,
            height = 9, width = 16,
            hjust = 0, vjust = 1) +
  # the national hex map
  draw_plot(hexmap,
            x = 0.01,
            y = 0.01,
            height = 0.95) +
  # map title
  draw_label("'Tis the season to be snowy!",
             x = 0.05,
             y = 0.9,
             hjust = 0,
             vjust = 1,
             fontfamily = font_legend,
             color = text_color,
             size = 26)

Map of 20-year-mean Snow Cover Index spatially summarized by state, with each state shaped like a hexagon. This was the first draft shared back to the VizLab team.
It was really cool to see my first idea come to life in this map!
Step 5: Refine the snowflake map with a finer resolution hexmap
The image above was shared out with my working group, and although we all thought it was neat, it also doesn't really show enough variety of snowpack in the contiguous U.S. – each state has areas of higher and lower snowfall that are not represented at this coarse scale. To address this, we chose a more customized hexagon grid with finer resolution than state-level.
Using a built-in feature of the sf package, st_make_grid(), create a hexagonal spatial grid over the contiguous United States, using the state boundaries as the extent. Map that over the boundaries to get an idea of how small these new hexagons are in relation to the state boundaries.
# make hex tessellation of CONUS
columns <- 70
rows <- 70
hex_grid <- states_proj %>%
  # using the projected state boundaries, make a hexagon grid that is 70 by 70 across the US
  sf::st_make_grid(n = c(columns, rows),
                   what = "polygons",
                   # if square = TRUE, then square. Otherwise hexagonal
                   square = FALSE) %>%
  sf::st_as_sf() %>%
  mutate(geometry = x) %>%
  mutate(hex = as.character(seq.int(nrow(.))))
# Map with states on top to see if the hexagons all lined up correctly
ggplot(hex_grid) +
  geom_sf(fill = "NA", color = "#999999") +
  geom_sf(data = states_proj |> st_as_sf(), color = "cyan", fill = "NA")

Map that shows the size and shape of the hexagonal grid in relation to the state boundaries.
Next, follow the same steps as above to extract the 20-year mean SCI values to each hexagon and recreate the map, this time with a much finer resolution snowflake.
# Extract values to the hexagon grid from the masked raster
extract_SCI_hex <- terra::extract(x = sci_stack_clip, vect(hex_grid))
# Calculate the 20-year SCI means for each hexagon
SCI_by_hex <- as.data.frame(extract_SCI_hex) |>
  group_by(ID) |>
  summarise(mean_20yr = mean(mean, na.rm = T))
# Left-join the 20-year SCI means to the hexagon grid sf object
SCI_hex_grid <- hex_grid |>
  mutate(ID = row_number()) |>
  left_join(SCI_by_hex, by = "ID") |>
  filter(! is.na(mean_20yr)) # delete hexagons outside the US boundaries
# Map the mean SCI values to see if the joins and calculations worked
SCI_hex_grid %>%
  ungroup() %>%
  ggplot() +
  geom_sf(aes(fill = mean_20yr),
          color = "black",
          size = 0.2) +
  scale_fill_scico(palette = "oslo", direction = 1, begin = 0.20, end = 1)

This map version shows much more detail in where snow is across the contiguous United States. Just need to add on the snowflake mask!
Next, add in the snowflake mask as a png. Again, it's necessary to use the st_coordinates(st_centroid()) functions to extract the centroid coordinates in a form that the ggimage package can read, and the aspect ratio and size of the image need to be tweaked again to match the new, smaller hexagons.
SCI_hex_grid |>
  ungroup() %>%
  ggplot() +
  geom_sf(aes(fill = mean_20yr),
          color = "black",
          size = 0.2) +
  scale_fill_scico(palette = "oslo", direction = 1, begin = 0.20, end = 1) +
  ggimage::geom_image(aes(image = sprintf("%s/snowMask.png", input_folder_name),
                          x = st_coordinates(st_centroid(SCI_hex_grid))[,1],
                          y = st_coordinates(st_centroid(SCI_hex_grid))[,2]),
                      asp = 1.60, size = 0.015) +
  theme_void() +
  theme(plot.background = element_rect(fill = "#0A1927"),
        legend.text = element_text(color = "#ffffff"),
        legend.title = element_text(color = "#ffffff"))

This snowflake map turned out better than I had hoped. The next step is to clean up the map composition and add in text and legends.
Step 6: Finalize the map and customize the layout
The last thing to do is make a custom legend, add in the USGS logo, and clean everything up for sharing out. A cool trick I'm going to use for the custom legend is to take the snowflake mask and use the magick package to colorize it so that it matches the colors used in the scico palette. Functions are used here to systematically produce a custom legend with staggered snowflakes and text labels.
I made the final image for Twitter in a 16×9 format with the ggsave function.
Twitter layout
library(magick)
library(purrr)
# Use the built-in scico function to extract 9 values of the colors used in the scico color scale
colors <- scico::scico(n = 9, palette = "oslo", direction = 1, begin = 0.20, end = 1)
# Direct magick to the image file of the snowflake mask
snowflake <- magick::image_read(sprintf("%s/snowMask.png", input_folder_name))
# Assign the text color to the nearly-white value of the color ramp
text_color = colors[8]
# Load in the USGS logo and colorize it using the color ramp
usgs_logo <- magick::image_read(sprintf("%s/usgs_logo.png", input_folder_name)) %>%
  magick::image_colorize(100, colors[7])
# Define some baseline values that we will use to systematically build the custom legend
legend_X = 0.405 # baseline value for the legend hexagons - every other one will stagger off this
legend_X_evenNudge = 0.01 # amount to shift the evenly numbered legend pngs
legend_Y = 0.15 # baseline for the bottom hex, each builds up off this distance
legend_Y_nudge = 0.035 # amount to bump the png up for each legend level
# First function here creates a staggered placement for each of the 9 snowflakes in the legend
hex_list <- purrr::map(1:9, function(x){
  # determine if odd or even number for X-positioning
  nudge_x <- function(x){
    if((x %% 2) == 0){ # if even
      return(legend_X_evenNudge)
    } else {
      return(0)
    }
  }
  # Draw image based on baseline and nudge positioning
  draw_image(snowflake |>
               magick::image_colorize(opacity = 100, color = colors[x]),
             height = 0.04,
             y = legend_Y + legend_Y_nudge*(x-1),
             x = legend_X + nudge_x(x))
})
# Second function here similarly places the labels alongside each legend snowflake
legend_text_list <- purrr::map(1:9, function(x){
  # determine if odd or even number for X-positioning
  nudge_x <- function(x){
    if((x %% 2) == 0){ # if even
      return(0.04)
    } else {
      return(0)
    }
  }
  # Draw text based on baseline and nudge positioning
  draw_text(sprintf("%s%%", (x-1)*10),
            y = legend_Y + legend_Y_nudge*(x-1),
            x = 0.93 - nudge_x(x),
            color = text_color,
            vjust = -1)
})
# Define the main plot
main_plot <- SCI_hex_grid |>
  ungroup() %>%
  ggplot() +
  geom_sf(aes(fill = mean_20yr),
          color = "black",
          size = 0.2) +
  scale_fill_scico(palette = "oslo", direction = 1, begin = 0.20, end = 1) +
  ggimage::geom_image(aes(image = sprintf("%s/snowMask.png", input_folder_name),
                          x = st_coordinates(st_centroid(SCI_hex_grid))[,1],
                          y = st_coordinates(st_centroid(SCI_hex_grid))[,2]),
                      asp = 1.60, size = 0.015) + # ggimage package
  theme(legend.position = "none")
ggdraw(ylim = c(0,1),
       xlim = c(0,1)) +
  # a background
  draw_grob(canvas,
            x = 0, y = 1,
            height = 9, width = 16,
            hjust = 0, vjust = 1) +
  # national hex map
  draw_plot(main_plot,
            x = 0.01, y = 0.01,
            height = 1) +
  # explainer text
  draw_label("Snow Cover Index is the fraction of time snow was\non the ground from January 1 to July 3, 2001 to 2020.\nData from: doi.org/10.5066/P9U7U5FP",
             fontfamily = sysfonts::font_add_google("Source Sans Pro"),
             x = 0.04, y = 0.115,
             size = 14,
             hjust = 0, vjust = 1,
             color = colors[6]) +
  # Title
  draw_label("'Tis the season\nto be snowy!",
             x = 0.04, y = 0.285,
             hjust = 0, vjust = 1,
             lineheight = 0.75,
             fontfamily = font_legend,
             color = text_color,
             size = 55) +
  # Legend title
  draw_label("Snow Cover Index",
             fontfamily = font_legend,
             x = 0.86, y = 0.50,
             size = 18,
             hjust = 0, vjust = 1,
             color = text_color) +
  # Legend snowflakes
  hex_list +
  # Legend labels
  legend_text_list +
  # Add logo
  draw_image(usgs_logo,
             x = 0.86, y = 0.05,
             width = 0.1,
             hjust = 0, vjust = 0,
             halign = 0, valign = 0)
# Save the final image in Twitter's 16 by 9 format
ggsave(sprintf("%s/snowtilesTwitter.png", input_folder_name),
       width = 16, height = 9, dpi = 300)

Final map released on Twitter.
And with that, you have all of the code and the snowflake mask to make these maps or your own version of a snow map!
Special thanks to Cee, Hayley, Elmera, Anthony, Nicole, and Mandie for their help with this viz.