Visualizing Meshes¶
As part of the EarthSim project, extensive support for visualizing triangular meshes was added to HoloViews and Datashader; see the websites for those projects for details of how to use this functionality. Here, we will show how to use the small utilities provided in earthsim itself, which allow you to read 3DM and mesh2d files containing mesh data, and we'll then visualize the resulting structures.
import numpy as np
import holoviews as hv
import geoviews as gv
import cartopy.crs as ccrs
from holoviews import opts
from holoviews.operation.datashader import datashade, rasterize
import datashader as ds
from colorcet import cm_n
from earthsim.io import read_3dm_mesh, read_mesh2d
datashade.precompute = True
hv.extension('bokeh')
size = dict(width=800, height=600)
opts.defaults(
opts.Image(**size), opts.RGB(**size), opts.VectorField(**size))
Static meshes¶
Static trimeshes can be loaded using the read_3dm_mesh utility, which returns the simplices and vertices of the mesh. Here we will load a mesh of the Chesapeake and Delaware Bays.
Before we declare the TriMesh, we will project the coordinates of the vertices to a Mercator projection for plotting, so that GeoViews does not have to re-project the coordinates each time the plot is displayed. We also declare the 'z' dimension representing the value at each vertex (water depth in this case).
fpath = '../data/Chesapeake_and_Delaware_Bays.3dm'
tris, verts = read_3dm_mesh(fpath)
points = gv.operation.project_points(gv.Points(verts, vdims=['z']))
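If you want to check what was loaded before plotting, both structures are tabular and can be inspected directly (a quick sanity check; the exact columns may vary with the earthsim version):
# Quick look at the loaded mesh: `verts` holds one row per vertex (with
# x, y, and z columns, as used below), and `tris` lists the vertex indices
# making up each triangle.
print(len(tris), 'triangles,', len(verts), 'vertices')
verts.head()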
Now that we have the simplices and vertices, we can construct the TriMesh. Since the TriMesh is too large to display on its own in Bokeh (which provides all data to the web browser by default), we apply the datashade operation to render the data into an image before providing it to the browser. To begin with, we will visualize the underlying mesh by datashading just the trimesh.edgepaths:
tiles = gv.WMTS('https://maps.wikimedia.org/osm-intl/{Z}/{X}/{Y}@2x.png')
trimesh = gv.TriMesh((tris, points))
chesapeake_mesh = tiles * datashade(trimesh.edgepaths)
chesapeake_mesh
Here the plot may look continuous in most regions, but if you have a live, running version of this plot, you can zoom in to see that it's a wireframe with high resolution (many grid points and vertices) in the inland areas and low resolution in the open water.
Since the mesh also has associated 'z' values, we can render the value at each pixel that falls inside a triangle, interpolating from the values at the vertices when there are multiple pixels per triangle, and aggregating across vertex values (averaging in this case) when there are multiple vertices per pixel:
tiles = gv.WMTS('https://maps.wikimedia.org/osm-intl/{Z}/{X}/{Y}@2x.png')
trimesh = gv.TriMesh((tris, points))
chesapeake_interp = tiles * rasterize(trimesh, aggregator=ds.mean('z')).opts(colorbar=True, cmap='Viridis')
chesapeake_interp
If you have a live running Python process (not a static web page), you can zoom in to the above plot and eventually make out the original triangles.
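Note also that the aggregator used by rasterize is pluggable: any Datashader reduction will work, so we could instead summarize each pixel with, say, the minimum vertex value, which (if, as in the San Diego mesh below, 'z' holds negative depths) highlights the deepest point covered by each pixel:
# Same mesh rendered with a different per-pixel reduction: ds.min('z')
# selects the lowest vertex value wherever several vertices share a pixel.
chesapeake_deepest = tiles * rasterize(trimesh, aggregator=ds.min('z')).opts(colorbar=True, cmap='Viridis')
chesapeake_deepest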
Meshes over time¶
When running environmental simulations using a tool like AdH, the mesh is also often accompanied by additional data files representing depths, velocity, and error values over time. We can load in these values and use them to visualize changes in each variable as a simulation evolves.
First, we again load a static mesh and project it from its native UTM Zone 11 coordinate system to the Mercator projection.
fpath = '../data/SanDiego_Mesh/SanDiego.3dm'
tris, verts = read_3dm_mesh(fpath, skiprows=2)
points = gv.operation.project_points(gv.Points(verts, vdims=['z'], crs=ccrs.UTM(11)))
trimesh = gv.TriMesh((tris, points))
san_diego = tiles * rasterize(trimesh, aggregator='mean').opts(colorbar=True, cmap='Viridis', clim=(-75, 0))
san_diego
Next we load a mesh2d .dat file containing hydrodynamic error values ('HYDRO_ERROR') computed over the course of a simulation. The read_mesh2d utility returns a dictionary of dataframes indexed by time.
fpath = '../data/SanDiego_Mesh/SanDiego_err_hydro.dat'
dfs = read_mesh2d(fpath)
dfs[0].head()
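The keys of this dictionary are the simulation output times, so listing them shows the temporal extent covered by the file:
# The dict is keyed by simulation time; sorting the keys shows the range
# of timepoints available for the animations below.
times = sorted(dfs.keys())
print(len(times), 'timepoints, from', times[0], 'to', times[-1])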
To explore this data we will wrap it in a DynamicMap that returns a TriMesh for each timepoint. First we declare some points containing the positions of the vertices, and project those (once at the start, rather than for each subsequent plot). Now we can make use of the add_dimension method on those points to add the additional 'HYDRO_ERROR' variable we loaded from the .dat file. Finally, we declare a DynamicMap indexed by time, with the keys of the dictionary as the values.
points = gv.operation.project_points(gv.Points((verts.x, verts.y), crs=ccrs.UTM(11)))
def time_mesh(time):
    # Attach this timepoint's 'HYDRO_ERROR' values to the projected points
    depth_points = points.add_dimension('HYDRO_ERROR', 0, dfs[time].values[:, 0], vdim=True)
    return gv.TriMesh((tris, depth_points), crs=ccrs.GOOGLE_MERCATOR)
meshes = hv.DynamicMap(time_mesh, kdims='Time').redim.values(Time=sorted(dfs.keys()))
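Because a DynamicMap evaluates its callback lazily, it can be handy to call the callback directly for a single timepoint as a quick sanity check before animating:
# Evaluate the callback by hand for the first timepoint; this should return
# a single TriMesh carrying that timepoint's 'HYDRO_ERROR' values.
time_mesh(sorted(dfs.keys())[0])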
Now we can datashade the data, interpolating the 'HYDRO_ERROR' values, and again plot it over a map. We'll also activate the scrubber option for HoloMaps, so that we get a set of controls that makes it simple to explore the dataset over time.
error_meshes = datashade(meshes, aggregator='mean', cmap=cm_n['fire'])
error_plot = tiles * error_meshes
hv.output(error_plot, holomap='scrubber', fps=4)
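If you prefer stepping through timepoints with a slider rather than the player-style scrubber, HoloViews also supports a 'widgets' holomap mode:
# Alternative to the scrubber: render the Time dimension as a slider widget
hv.output(error_plot, holomap='widgets')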
In addition to visualizing an interpolated mesh, we can also visualize vector field data. The SanDiego_ovl file contains overland velocity vectors, which we can again load using the read_mesh2d utility. The GeoViews VectorField element expects the data to be expressed as angle and magnitude values, so we convert the x/y velocity components accordingly and then declare the VectorField.
fpath = '../data/SanDiego_Mesh/SanDiego_ovl.dat'
velocity_dfs = read_mesh2d(fpath)
def time_field(time):
    # Look up the x and y velocity components for this timepoint
    vx = velocity_dfs[time].values[:, 0]
    vy = velocity_dfs[time].values[:, 1]
    xs, ys = (points.dimension_values(i) for i in range(2))
    # Convert the components to the angle/magnitude form VectorField expects,
    # silencing warnings for vertices where the velocity is zero
    with np.errstate(divide='ignore', invalid='ignore'):
        angle = np.arctan2(vy, vx)
        mag = np.sqrt(vx**2+vy**2)
    return gv.VectorField((xs, ys, angle, mag), vdims=['Angle', 'Magnitude'],
                          crs=ccrs.GOOGLE_MERCATOR)
vectors = hv.DynamicMap(time_field, kdims='Time').redim.values(Time=sorted(velocity_dfs.keys()))
tiles * vectors.opts(color='Magnitude', magnitude='Magnitude', scale=0.005, rescale_lengths=False)
Here you will probably need to zoom in to see the structure, but you should be able to make out changes in the flow pattern over time.
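If the arrows are too dense to read at wider zoom levels, one ad-hoc remedy (a sketch, not an earthsim utility) is to subsample the vertices before building each element:
# Illustrative thinning of the vector field: keep every `stride`-th vertex.
# The stride of 10 is arbitrary; adjust it to taste.
def thinned_field(time, stride=10):
    field = time_field(time)
    xs, ys, angle, mag = (field.dimension_values(i)[::stride] for i in range(4))
    return gv.VectorField((xs, ys, angle, mag), vdims=['Angle', 'Magnitude'],
                          crs=ccrs.GOOGLE_MERCATOR)
tiles * hv.DynamicMap(thinned_field, kdims='Time').redim.values(Time=sorted(velocity_dfs.keys()))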
If you have data in other formats, you should be able to read the source code for the read_mesh2d and read_3dm_mesh utilities and adapt them to build the same data structures from whatever data you start with.
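The key is simply to produce a table of triangle vertex indices plus a table of vertex positions and values. As an illustration, here is a minimal sketch for a hypothetical whitespace-delimited format with one 'T i j k' line per triangle and one 'V x y z' line per vertex (both the format and the read_custom_mesh helper are invented for this example):
import pandas as pd

def read_custom_mesh(fpath):
    # Hypothetical format: 'T' lines give three 1-based vertex indices,
    # 'V' lines give x, y, z coordinates for one vertex.
    tri_rows, vert_rows = [], []
    with open(fpath) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == 'T':
                tri_rows.append([int(p) - 1 for p in parts[1:4]])
            elif parts[0] == 'V':
                vert_rows.append([float(p) for p in parts[1:4]])
    tris = pd.DataFrame(tri_rows, columns=['v0', 'v1', 'v2'])
    verts = pd.DataFrame(vert_rows, columns=['x', 'y', 'z'])
    return tris, verts
The resulting tris and verts can then be passed to gv.TriMesh (optionally via gv.Points, to attach a 'z' value dimension) exactly as in the examples above.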