
A centric diatom, another type of phytoplankton. Art credit: Kristen Krumhardt
Overview
In previous notebooks, we explored the distribution of different nutrients in the ocean. Here we examine how the growth of phytoplankton communities is limited by these nutrient distributions.
General setup
Subsetting
Processing - long-term mean
Mapping nutrient limitation at the surface
Mapping biomass-weighted nutrient limitation in the top 100m
Making monthly climatology maps
Prerequisites

| Concepts | Importance | Notes |
| --- | --- | --- |
| Matplotlib | Necessary | |
| Intro to Cartopy | Necessary | |
| Dask Cookbook | Helpful | |
| Intro to Xarray | Helpful | |
Time to learn: 30 min
Imports
import xarray as xr
import glob
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.colors
import cartopy
import cartopy.crs as ccrs
import pop_tools
from dask.distributed import LocalCluster
import s3fs
from module import adjust_pop_grid
General setup (see intro notebooks for explanations)
Connect to cluster
cluster = LocalCluster()
client = cluster.get_client()
Bring in POP grid utilities
ds_grid = pop_tools.get_grid('POP_gx1v7')
lons = ds_grid.TLONG
lats = ds_grid.TLAT
depths = ds_grid.z_t * 0.01  # convert cell-center depths from cm to m
Load the data
jetstream_url = 'https://js2.jetstream-cloud.org:8001/'
s3 = s3fs.S3FileSystem(anon=True, client_kwargs=dict(endpoint_url=jetstream_url))
# Generate a list of all files in CESM folder
s3path = 's3://pythia/ocean-bgc/cesm/g.e22.GOMIPECOIAF_JRA-1p4-2018.TL319_g17.4p2z.002branch/ocn/proc/tseries/month_1/*'
remote_files = s3.glob(s3path)
s3.invalidate_cache()
# Open all files from folder
fileset = [s3.open(file) for file in remote_files]
# Open with xarray
ds = xr.open_mfdataset(fileset, data_vars="minimal", coords='minimal', compat="override", parallel=True,
drop_variables=["transport_components", "transport_regions", 'moc_components'], decode_times=True)
ds
Subsetting
variables =['sp_Fe_lim_Cweight_avg_100m','sp_P_lim_Cweight_avg_100m','sp_N_lim_Cweight_avg_100m',
'diat_Fe_lim_Cweight_avg_100m', 'diat_P_lim_Cweight_avg_100m','diat_N_lim_Cweight_avg_100m',
'diat_SiO3_lim_Cweight_avg_100m','diaz_P_lim_Cweight_avg_100m',
'diaz_Fe_lim_Cweight_avg_100m','cocco_Fe_lim_Cweight_avg_100m','cocco_C_lim_Cweight_avg_100m','cocco_N_lim_Cweight_avg_100m',
'cocco_P_lim_Cweight_avg_100m','sp_Fe_lim_surf','sp_P_lim_surf','sp_N_lim_surf',
'diat_Fe_lim_surf', 'diat_P_lim_surf','diat_N_lim_surf','diat_SiO3_lim_surf',
'diaz_P_lim_surf','cocco_Fe_lim_surf','cocco_C_lim_surf','cocco_N_lim_surf',
'cocco_P_lim_surf','diaz_Fe_lim_surf']
keep_vars=['z_t','z_t_150m','dz','time_bound', 'time', 'TAREA','TLAT','TLONG'] + variables
ds = ds.drop_vars([v for v in ds.variables if v not in keep_vars])
Processing - long-term mean
Pull in the function we defined in the nutrients notebook...
def year_mean(ds):
    """
    Properly convert monthly data to annual means, taking into account month lengths.
    Source: https://ncar.github.io/esds/posts/2021/yearly-averages-xarray/
    """
    # Make a DataArray with the number of days in each month, size = len(time)
    month_length = ds.time.dt.days_in_month

    # Calculate the weights by grouping by 'time.year'
    weights = (
        month_length.groupby("time.year") / month_length.groupby("time.year").sum()
    )

    # Test that the sum of the weights for each year is 1.0
    np.testing.assert_allclose(weights.groupby("time.year").sum().values, np.ones((len(ds.groupby("time.year")),)))

    # Calculate the weighted average
    return (ds * weights).groupby("time.year").sum(dim="time")
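In equation form, for each year $y$ the function computes a day-weighted mean of the monthly values $x_m$, where $d_m$ is the number of days in month $m$:

$$\bar{x}_y = \sum_{m \in y} w_m \, x_m, \qquad w_m = \frac{d_m}{\sum_{m' \in y} d_{m'}}$$

By construction the weights within each year sum to 1, which is exactly what the assert_allclose check verifies.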
Take the long-term mean of our data set. We process monthly to annual with our custom function, then use xarray's built-in .mean() to reduce the annual data to a single long-term mean; at that point every year carries equal weight, so no further weighting is needed.
ds_year = year_mean(ds).mean("year")
Mapping nutrient limitation at the surface
Phytoplankton need a specific ratio of nutrients to grow and produce organic matter. In general, this is known as the Redfield ratio, first proposed by Redfield et al., 1963, and is approximately 106 C : 16 N : 1 P. Micronutrients like silicate and iron are also needed, in amounts that vary with plankton type. To learn more about nutrient limitation, see Sarmiento and Gruber Chapter 4: Organic Matter Production. We use a numerical notation to record which nutrient is most limiting in each area for each phytoplankton functional type, as specified below:
0 = PO4
1 = Fe
2 = NO3 (not used for diazotrophs, which fix their own nitrogen)
3 = Si (only for diatoms)
3 = C (only for coccolithophores)
To turn this information into a single array, we concatenate the limitation terms along a new nutrient dimension for each phytoplankton functional type, in the index order above. The limitation terms are dimensionless scalars between 0 and 1, where smaller means more limiting, so taking argmin along the nutrient dimension returns the index of the most limiting nutrient.
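Before applying this to the model fields, here is a minimal sketch of the argmin logic for a single grid cell, using made-up scalar values (the numbers are hypothetical; only xarray is assumed):

import xarray as xr

# Hypothetical limitation terms for one grid cell (smaller = more limiting)
p_lim = xr.DataArray(0.8)   # index 0 = PO4
fe_lim = xr.DataArray(0.2)  # index 1 = Fe
n_lim = xr.DataArray(0.5)   # index 2 = NO3

# Stack along a new 'nutrient' dimension, then find the smallest term
lim = xr.concat((p_lim, fe_lim, n_lim), dim='nutrient')
print(lim.argmin(dim='nutrient').item())  # 1 -> iron-limited

The same logic applies elementwise to the full 2-D fields below.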
# Concatenate the surface limitation terms in the index order defined above
limarray_sp = xr.concat((ds_year.sp_P_lim_surf, ds_year.sp_Fe_lim_surf, ds_year.sp_N_lim_surf), dim='nutrient')
limarray_diat = xr.concat((ds_year.diat_P_lim_surf, ds_year.diat_Fe_lim_surf, ds_year.diat_N_lim_surf, ds_year.diat_SiO3_lim_surf), dim='nutrient')
limarray_diaz = xr.concat((ds_year.diaz_P_lim_surf, ds_year.diaz_Fe_lim_surf), dim='nutrient')
limarray_cocco = xr.concat((ds_year.cocco_P_lim_surf, ds_year.cocco_Fe_lim_surf, ds_year.cocco_N_lim_surf, ds_year.cocco_C_lim_surf), dim='nutrient')

# The smallest limitation term along 'nutrient' is the most limiting nutrient
most_lim_sp = limarray_sp.argmin(dim='nutrient', skipna=False).squeeze()
most_lim_diat = limarray_diat.argmin(dim='nutrient', skipna=False).squeeze()
most_lim_diaz = limarray_diaz.argmin(dim='nutrient', skipna=False).squeeze()
most_lim_cocco = limarray_cocco.argmin(dim='nutrient', skipna=False).squeeze()

# Land points are NaN in the model output; use them to mask the maps
mask = np.isnan(ds_year.sp_N_lim_surf.squeeze())
fig = plt.figure(figsize=(8,13))
colorbar_specs = {'ticks' : np.arange(0,4,1)}
ax = fig.add_subplot(4,1,2, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Diat surface nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_diat.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
ax = fig.add_subplot(4,1,1, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('SP surface nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_sp.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
ax = fig.add_subplot(4,1,3, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Cocco surface nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_cocco.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
ax = fig.add_subplot(4,1,4, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Diaz surface nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_diaz.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
fig.subplots_adjust(right=0.8)
cbar_ax = fig.add_axes([0.85, 0.15, 0.05, 0.7])
cbar = fig.colorbar(pc, cax=cbar_ax,**colorbar_specs)
cbar.ax.set_yticklabels(['P lim', 'Fe lim', 'N lim','SiO3/C lim']);
Mapping biomass-weighted nutrient limitation in the top 100m
We repeat the same procedure, this time using the limitation terms weighted by carbon biomass and averaged over the top 100 m (the Cweight_avg_100m variables).
limarray_sp=xr.concat((ds_year.sp_P_lim_Cweight_avg_100m, ds_year.sp_Fe_lim_Cweight_avg_100m,ds_year.sp_N_lim_Cweight_avg_100m),dim='nutrient')
limarray_diat=xr.concat((ds_year.diat_P_lim_Cweight_avg_100m, ds_year.diat_Fe_lim_Cweight_avg_100m, ds_year.diat_N_lim_Cweight_avg_100m, ds_year.diat_SiO3_lim_Cweight_avg_100m),dim='nutrient')
limarray_diaz=xr.concat((ds_year.diaz_P_lim_Cweight_avg_100m, ds_year.diaz_Fe_lim_Cweight_avg_100m),dim='nutrient')
limarray_cocco=xr.concat((ds_year.cocco_P_lim_Cweight_avg_100m, ds_year.cocco_Fe_lim_Cweight_avg_100m, ds_year.cocco_N_lim_Cweight_avg_100m, ds_year.cocco_C_lim_Cweight_avg_100m),dim='nutrient')
most_lim_sp=limarray_sp.argmin(dim='nutrient', skipna=False).squeeze()
most_lim_diat=limarray_diat.argmin(dim='nutrient', skipna=False).squeeze()
most_lim_diaz=limarray_diaz.argmin(dim='nutrient', skipna=False).squeeze()
most_lim_cocco=limarray_cocco.argmin(dim='nutrient', skipna=False).squeeze()
mask = np.isnan(ds_year.sp_N_lim_surf.squeeze())
fig = plt.figure(figsize=(8,13))
colorbar_specs = {'ticks' : np.arange(0,4,1)}
ax = fig.add_subplot(4,1,2, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Diat biomass-weighted nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_diat.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
ax = fig.add_subplot(4,1,1, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('SP biomass-weighted nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_sp.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
ax = fig.add_subplot(4,1,3, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Cocco biomass-weighted nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_cocco.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
ax = fig.add_subplot(4,1,4, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Diaz biomass-weighted nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_diaz.where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
fig.subplots_adjust(right=0.8)
cbar_ax = fig.add_axes([0.85, 0.15, 0.05, 0.7])
cbar = fig.colorbar(pc, cax=cbar_ax,**colorbar_specs)
cbar.ax.set_yticklabels(['P lim', 'Fe lim', 'N lim','SiO3/C lim']);
Making monthly climatology maps
Make a monthly climatology dataset
A monthly climatology is a dataset in which the data from each calendar month is averaged across years. For our dataset, the groups are the average of all Januaries from 2010 to 2019, all Februaries from 2010 to 2019, and so on. This would usually be computed over a longer period, such as 30 or more years, but we use our shorter dataset so that it processes faster. This technique is helpful for examining seasonal phenomena while averaging out year-to-year variability.
mon_ds = ds.copy()
mon_ds = mon_ds.groupby('time.month').mean('time')
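To see what groupby('time.month') produces, here is a minimal sketch with synthetic data (the two-year span and values are made up for illustration; pandas is assumed to be available alongside xarray):

import numpy as np
import pandas as pd
import xarray as xr

# 24 months of synthetic data spanning two years
times = pd.date_range('2010-01-01', periods=24, freq='MS')
da = xr.DataArray(np.arange(24.0), coords={'time': times}, dims='time')

clim = da.groupby('time.month').mean('time')
print(clim.sizes)                # a 'month' dimension of length 12
print(clim.sel(month=1).item())  # 6.0 = mean of the two Januaries (0 and 12)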
limarray_sp=xr.concat((mon_ds.sp_P_lim_surf, mon_ds.sp_Fe_lim_surf,mon_ds.sp_N_lim_surf),dim='nutrient')
limarray_diat=xr.concat((mon_ds.diat_P_lim_surf, mon_ds.diat_Fe_lim_surf, mon_ds.diat_N_lim_surf, mon_ds.diat_SiO3_lim_surf),dim='nutrient')
limarray_diaz=xr.concat((mon_ds.diaz_P_lim_surf, mon_ds.diaz_Fe_lim_surf),dim='nutrient')
limarray_cocco=xr.concat((mon_ds.cocco_P_lim_surf, mon_ds.cocco_Fe_lim_surf, mon_ds.cocco_N_lim_surf, mon_ds.cocco_C_lim_surf),dim='nutrient')
most_lim_sp=limarray_sp.argmin(dim='nutrient', skipna=False)
most_lim_diat=limarray_diat.argmin(dim='nutrient', skipna=False)
most_lim_diaz=limarray_diaz.argmin(dim='nutrient', skipna=False)
most_lim_cocco=limarray_cocco.argmin(dim='nutrient', skipna=False)
mask = np.isnan(ds_year.sp_N_lim_surf.squeeze())
fig = plt.figure(figsize=(12,23))
month_list = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
for row in np.arange(1,13):
    ts = row - 1  # 0-based month index into the climatology
    # first column: diatoms
    plot = row*3 - 2
ax = fig.add_subplot(12,3,plot, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Diat surface nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_diat.isel(month=ts).where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
ax.set_ylabel(month_list[ts])
ax.set_yticks([]) # necessary to get ylabel to show up
colorbar_specs = {'ticks' : np.arange(0,4,1)}
    # second column: small phytoplankton
    plot = row*3 - 1
ax = fig.add_subplot(12,3,plot, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('SP surface nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_sp.isel(month=ts).where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
colorbar_specs = {'ticks' : np.arange(0,4,1)}
    # third column: coccolithophores
    plot = row*3
ax = fig.add_subplot(12,3,plot, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('Cocco surface nutrient limitation', fontsize=12)
lon, lat, field = adjust_pop_grid(lons, lats, most_lim_cocco.isel(month=ts).where(~mask))
pc=ax.pcolormesh(lon, lat, field, cmap=matplotlib.colormaps['viridis'].resampled(4),vmin=-0.5,vmax=3.5,transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
colorbar_specs = {'ticks' : np.arange(0,4,1)}
fig.subplots_adjust(right=0.8)
cbar_ax = fig.add_axes([0.85, 0.15, 0.05, 0.7])
cbar = fig.colorbar(pc, cax=cbar_ax,**colorbar_specs)
cbar.ax.set_yticklabels(['P lim', 'Fe lim', 'N lim','SiO3/C lim']);
And close the Dask cluster we spun up at the beginning.
cluster.close()
Summary
You've learned how to identify the most limiting nutrient for each phytoplankton functional type, both at the surface and weighted by biomass over the top 100 m, and how to map its seasonal patterns with a monthly climatology.
Resources and references
Sarmiento, J. L., & Gruber, N. (2013). Chapter 4: Organic Matter Production (see "Limiting Nutrient" in Section 4.2). In Ocean Biogeochemical Dynamics (pp. 102–172). Princeton University Press. https://doi.org/10.2307/j.ctt3fgxqx.7