
Zooplankton biomass

A copepod, a type of zooplankton. Art credit: Kristen Krumhardt


Overview

Zooplankton are tiny oceanic animals that occupy the next trophic level above phytoplankton in the marine food web. Here we evaluate modeled zooplankton biomass and compare it to observational data.

  1. General setup

  2. Subsetting

  3. Processing - long-term mean

  4. Mapping zooplankton biomass at the surface

  5. Comparing mesozooplankton biomass to observations

  6. Making monthly climatology maps to compare to observations

Prerequisites

Concepts | Importance | Notes
Matplotlib | Necessary |
Intro to Cartopy | Necessary |
Dask Cookbook | Helpful |
Intro to Xarray | Helpful |
  • Time to learn: 30 min


Imports

import xarray as xr
import glob
import numpy as np
import matplotlib.pyplot as plt
import cartopy
import cartopy.crs as ccrs
import pop_tools
from dask.distributed import LocalCluster
import s3fs

from module import adjust_pop_grid

General setup (see intro notebooks for explanations)

Connect to cluster

cluster = LocalCluster()
client = cluster.get_client()

Bring in POP grid utilities

ds_grid = pop_tools.get_grid('POP_gx1v7')
lons = ds_grid.TLONG
lats = ds_grid.TLAT
depths = ds_grid.z_t * 0.01  # convert depth from cm to m

Load the data

jetstream_url = 'https://js2.jetstream-cloud.org:8001/'

s3 = s3fs.S3FileSystem(anon=True, client_kwargs=dict(endpoint_url=jetstream_url))

# Generate a list of all files in CESM folder
s3path = 's3://pythia/ocean-bgc/cesm/g.e22.GOMIPECOIAF_JRA-1p4-2018.TL319_g17.4p2z.002branch/ocn/proc/tseries/month_1/*'
remote_files = s3.glob(s3path)
s3.invalidate_cache()

# Open all files from folder
fileset = [s3.open(file) for file in remote_files]

# Open with xarray
ds = xr.open_mfdataset(fileset, data_vars="minimal", coords='minimal', compat="override", parallel=True,
                       drop_variables=["transport_components", "transport_regions", 'moc_components'], decode_times=True)

ds

Subsetting

variables = ['mesozooC', 'microzooC']
keep_vars = ['z_t', 'z_t_150m', 'dz', 'time_bound', 'time', 'TAREA', 'TLAT', 'TLONG'] + variables
ds = ds.drop_vars([v for v in ds.variables if v not in keep_vars])
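
If you want to confirm that only the zooplankton variables and supporting metadata remain, an optional quick check (not part of the original workflow) is:

# Optional check: list the data variables left after subsetting
print(sorted(ds.data_vars))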

Processing - long-term mean

Pull in the function we defined in the nutrients notebook...

def year_mean(ds):
    """
    Source: https://ncar.github.io/esds/posts/2021/yearly-averages-xarray/
    """
    
    # Make a DataArray with the number of days in each month, size = len(time)
    month_length = ds.time.dt.days_in_month

    # Calculate the weights by grouping by 'time.year'
    weights = (
        month_length.groupby("time.year") / month_length.groupby("time.year").sum()
    )

    # Test that the sum of the weights for each year is 1.0
    np.testing.assert_allclose(weights.groupby("time.year").sum().values, np.ones((len(ds.groupby("time.year")), )))

    # Calculate the weighted average
    return (ds * weights).groupby("time.year").sum(dim="time")

# Take the long-term mean of our data set, processing years and months separately

ds_annual = year_mean(ds).mean("year")
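
As a quick, optional sanity check on the weighting logic (not part of the original workflow), the sketch below applies year_mean to a tiny synthetic monthly series of ones; because the day-in-month weights sum to one within each year, each weighted annual mean should come out as exactly 1.

# Illustrative check of year_mean on a synthetic two-year monthly series of ones;
# the day-in-month weights sum to one per year, so each annual mean should be 1.0.
toy_time = xr.cftime_range("0001-01-01", periods=24, freq="MS", calendar="noleap")
toy = xr.DataArray(np.ones(24), coords={"time": toy_time}, dims="time", name="toy")
print(year_mean(toy).values)  # expect [1., 1.]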

Plot mesozooplankton and microzooplankton biomass at the surface

fig = plt.figure(figsize=(8,5))

ax = fig.add_subplot(2,1,1, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('microzooC at surface', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats,  ds_annual.microzooC.isel(z_t_150m=0))
pc=ax.pcolormesh(lon, lat, field, cmap='Blues',vmin=0,vmax=2,transform=ccrs.PlateCarree())
cbar1 = fig.colorbar(pc, ax=ax,extend='max',label='microzooC (mmol m$^{-3}$)')
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)


ax = fig.add_subplot(2,1,2, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('mesozooC at surface', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats,  ds_annual.mesozooC.isel(z_t_150m=0))
pc=ax.pcolormesh(lon, lat, field, cmap='Oranges',vmin=0,vmax=4,transform=ccrs.PlateCarree())
cbar1 = fig.colorbar(pc, ax=ax,extend='max',label='mesozooC (mmol m$^{-3}$)')
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)

Compare mesozooplankton biomass to COPEPOD database

We use data compiled through the COPEPOD project (Moriarty & O’Brien, 2013). This data has been pre-processed, but the raw data is available on the COPEPOD website.

Read in COPEPOD data

copepod_obs_path = 's3://pythia/ocean-bgc/obs/copepod-2012__cmass-m00-qtr.zarr'

copepod_obs = s3fs.S3Map(root=copepod_obs_path, s3=s3)

ds_copepod = xr.open_dataset(copepod_obs, engine="zarr")

# convert from mass of carbon to moles of carbon by dividing by
# the molar mass of carbon (12.011 g/mol)
ds_copepod['copepod_C'] = ds_copepod.copepod_C / 12.011
ds_copepod
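
Before mapping, it can be reassuring to confirm the two fields are now of comparable magnitude. The quick check below is illustrative only and not area-weighted; the COPEPOD climatology represents roughly the top 200 m while the model field spans the top 150 m, so only rough agreement is expected.

# Rough, unweighted magnitude check (illustrative only): both fields should now be
# in the same units (mmol C m-3, matching the colorbar used in the maps below).
obs_mean = float(ds_copepod.copepod_C.mean())
model_mean = float(ds_annual.mesozooC.mean(dim='z_t_150m').mean())
print(f"COPEPOD mean: {obs_mean:.2f}, CESM mesozooC mean: {model_mean:.2f}")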

Plot

fig = plt.figure(figsize=(12,3))

ax = fig.add_subplot(1,2,1, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('COPEPOD dataset', fontsize=12)
pc=ax.pcolormesh(ds_copepod.lon, ds_copepod.lat, ds_copepod.copepod_C, cmap='Reds',vmin=0,vmax=2,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)

ax = fig.add_subplot(1,2,2, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title(r'CESM ${\it Mesozooplankton}$ biomass', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, ds_annual.mesozooC.mean(dim='z_t_150m'))
pc=ax.pcolormesh(lon, lat, field, cmap='Reds',vmin=0,vmax=2,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)

fig.subplots_adjust(right=0.8)
cbar_ax = fig.add_axes([0.85, 0.15, 0.02, 0.7])
fig.colorbar(pc, cax=cbar_ax,extend='max', label='top 150m/200m mean (mmol m$^{-3}$)');

Making monthly climatology maps to compare to observations

Compare to observation-based GLMM (Generalized Linear Mixed Model) of global mesozooplankton biomass climatology

This dataset is from Heneghan et al. (2020); it includes the COPEPOD data we used above as well as additional observations, with some pre-processing applied.

mesozoo_obs_path = 'data/obsglmm_zmeso_vint_200m_monthly_climatology.nc'

ds_copepod_clim = xr.open_dataset(mesozoo_obs_path)
ds_copepod_clim.zmeso200.attrs['units'] = 'mgC m-2'

months = ['Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec']

Make our CESM data into a monthly climatology

mon_ds = ds.groupby('time.month').mean('time')

# depth-integrate over the top 150 m: each z_t_150m layer is 10 m thick, so summing
# (concentration * 10 m) over the 15 layers gives mmol C/m2
mon_ds['mesozooC_zint'] = (mon_ds.mesozooC * 10.).sum(dim='z_t_150m')
# convert from mmol C/m2 to mg C/m2 using the molar mass of carbon (12.011 mg/mmol)
mon_ds['mesozooC_zint'] = mon_ds['mesozooC_zint'] * 12.011
mon_ds['mesozooC_zint'].attrs['units'] = 'mgC m-2'
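
The factor of 10 above is the layer thickness in meters. As an optional sanity check (a sketch, assuming the dz variable was retained in the dataset and is in centimeters, the standard POP convention), you can verify that the top 15 layers spanning the z_t_150m coordinate are each 10 m thick:

# Optional check: the top 15 model layers should each be 10 m (1000 cm) thick,
# matching the hard-coded factor of 10 used in the depth integration above.
layer_thickness_m = ds.dz.isel(z_t=slice(0, 15)) * 0.01  # cm -> m
print(layer_thickness_m.values)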

Plot

fig = plt.figure(figsize=(5,18))

for row in np.arange(1,13):
    
    ts=row-1
    
    plot = row*2 - 1
    ax = fig.add_subplot(12,2,plot, projection=ccrs.Robinson(central_longitude=305.0))
    ax.set_title(months[ts]+' obs', fontsize=12)
    pc=ax.pcolormesh(ds_copepod_clim.Lon, ds_copepod_clim.Lat, ds_copepod_clim.zmeso200.isel(month=ts), 
                     cmap='Reds',vmin=0,vmax=4000,transform=ccrs.PlateCarree())
    land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
    ax.add_feature(land)
    
    plot = row*2
    ax = fig.add_subplot(12,2,plot, projection=ccrs.Robinson(central_longitude=305.0))
    ax.set_title(months[ts]+' CESM', fontsize=12)
    tmp = mon_ds.mesozooC_zint.isel(month=ts)
    lon, lat, field = adjust_pop_grid(lons, lats,  tmp)
    pc=ax.pcolormesh(lon, lat, field, cmap='Reds',vmin=0,vmax=4000,transform=ccrs.PlateCarree())
    land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
    ax.add_feature(land)

cbar_ax = fig.add_axes([0.92, 0.15, 0.03, 0.7])
fig.colorbar(pc, cax=cbar_ax,extend='max', label='Depth-integrated copepod biomass (mg m$^{-2}$)');

And close the Dask cluster we spun up at the beginning.

cluster.close()

Summary

You’ve learned how to evaluate zooplankton biomass modeled by CESM-MARBL and compare it to observations.

References
  1. Moriarty, R., & O’Brien, T. D. (2013). Distribution of mesozooplankton biomass in the global ocean. Earth System Science Data, 5(1), 45–55. 10.5194/essd-5-45-2013
  2. Heneghan, R. F., Everett, J. D., Sykes, P., Batten, S. D., Edwards, M., Takahashi, K., Suthers, I. M., Blanchard, J. L., & Richardson, A. J. (2020). A functional size-spectrum model of the global marine ecosystem that resolves zooplankton composition. Ecological Modelling, 435, 109265. 10.1016/j.ecolmodel.2020.109265
  3. Petrik, C. M., Luo, J. Y., Heneghan, R. F., Everett, J. D., Harrison, C. S., & Richardson, A. J. (2022). Assessment and Constraint of Mesozooplankton in CMIP6 Earth System Models. Global Biogeochemical Cycles, 36(11). 10.1029/2022gb007367