Overview¶
The carbon cycle is a key part of ocean biogeochemistry and, more broadly, Earth’s climate system. Here we learn how to make maps of some key marine carbon cycle variables modeled by CESM.
- General setup
- Subsetting
- Processing data
- Making maps
Prerequisites¶
| Concepts | Importance | Notes |
| --- | --- | --- |
| Matplotlib | Necessary | |
| Intro to Cartopy | Necessary | |
| Dask Cookbook | Helpful | |
| Intro to Xarray | Helpful | |
- Time to learn: 15 min
Imports¶
import xarray as xr
import glob
import numpy as np
import matplotlib.pyplot as plt
import cartopy
import cartopy.crs as ccrs
import pop_tools
from dask.distributed import LocalCluster
import dask
import distributed
import s3fs
from module import adjust_pop_grid
General setup (see intro notebooks for explanations)¶
Connect to cluster¶
cluster = LocalCluster()
client = cluster.get_client()
Bring in POP grid utilities¶
ds_grid = pop_tools.get_grid('POP_gx1v7')
lons = ds_grid.TLONG
lats = ds_grid.TLAT
depths = ds_grid.z_t * 0.01  # convert depth from cm to m
ds_grid
Load the data¶
jetstream_url = 'https://js2.jetstream-cloud.org:8001/'
s3 = s3fs.S3FileSystem(anon=True, client_kwargs=dict(endpoint_url=jetstream_url))
# Generate a list of all files in CESM folder
s3path = 's3://pythia/ocean-bgc/cesm/g.e22.GOMIPECOIAF_JRA-1p4-2018.TL319_g17.4p2z.002branch/ocn/proc/tseries/month_1/*'
remote_files = s3.glob(s3path)
s3.invalidate_cache()
# Open all files from folder
fileset = [s3.open(file) for file in remote_files]
# Open with xarray
ds = xr.open_mfdataset(fileset, data_vars="minimal", coords='minimal', compat="override", parallel=True,
drop_variables=["transport_components", "transport_regions", 'moc_components'], decode_times=True)
ds
Subsetting¶
variables = ['FG_CO2', 'photoC_TOT_zint', 'POC_FLUX_100m']
keep_vars = ['z_t', 'z_t_150m', 'dz', 'time_bound', 'time', 'TAREA', 'TLAT', 'TLONG'] + variables
ds = ds.drop_vars([v for v in ds.variables if v not in keep_vars])
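The drop/keep pattern above can be sketched on a tiny toy dataset (the variable names here are hypothetical, not from the CESM output):

```python
import numpy as np
import xarray as xr

# toy dataset: one variable to keep, one to drop
toy = xr.Dataset({
    "keep_me": ("x", np.arange(3.0)),
    "drop_me": ("x", np.zeros(3)),
})
keep = ["keep_me"]

# drop everything not in the keep list, exactly as above
toy = toy.drop_vars([v for v in toy.variables if v not in keep])
```

This keeps the dataset small before any computation is triggered, which matters when the full file set holds many variables we never use.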
Processing - means in time and space¶
Pull in the function we defined in the nutrients notebook...
def year_mean(ds):
    """
    Properly convert monthly data to annual means, taking into account month lengths.
    Source: https://ncar.github.io/esds/posts/2021/yearly-averages-xarray/
    """

    # Make a DataArray with the number of days in each month, size = len(time)
    month_length = ds.time.dt.days_in_month

    # Calculate the weights by grouping by 'time.year'
    weights = (
        month_length.groupby("time.year") / month_length.groupby("time.year").sum()
    )

    # Test that the weights for each year sum to 1.0
    np.testing.assert_allclose(weights.groupby("time.year").sum().values, np.ones(len(ds.groupby("time.year"))))

    # Calculate the weighted average
    return (ds * weights).groupby("time.year").sum(dim="time")
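As a quick check of the weighting logic (on synthetic data, not the CESM output): a field that is constant in time should have an annual mean equal to that constant, regardless of month lengths.

```python
import numpy as np
import pandas as pd
import xarray as xr

def year_mean(ds):
    # same weighting as above: each month weighted by its length within the year
    month_length = ds.time.dt.days_in_month
    weights = month_length.groupby("time.year") / month_length.groupby("time.year").sum()
    return (ds * weights).groupby("time.year").sum(dim="time")

# two years of synthetic monthly data, constant value 5.0
time = pd.date_range("2010-01-01", periods=24, freq="MS")
da = xr.DataArray(np.full(24, 5.0), coords={"time": time}, dims="time")

annual = year_mean(da)  # one value per year
```

A plain `.mean("time")` would give the same answer here only because the field is constant; for a field with a seasonal cycle, the month-length weighting changes the result slightly.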
We also define a new function to take global mean in space.
def global_mean(ds, ds_grid, compute_vars, normalize=True, include_ms=False):
    """
    Compute the global mean on a POP dataset.
    Return computed quantity in conventional units.
    """
    # note TAREA is in cm^2, which affects units
    if include_ms:  # marginal seas!
        surface_mask = ds_grid.TAREA.where(ds_grid.KMT > 0).fillna(0.)
    else:
        surface_mask = ds_grid.TAREA.where(ds_grid.REGION_MASK > 0).fillna(0.)

    masked_area = {
        v: surface_mask.where(ds[v].notnull()).fillna(0.)
        for v in compute_vars
    }

    with xr.set_options(keep_attrs=True):
        dso = xr.Dataset({
            v: (ds[v] * masked_area[v]).sum(['nlat', 'nlon'])
            for v in compute_vars
        })
        if normalize:
            dso = xr.Dataset({
                v: dso[v] / masked_area[v].sum(['nlat', 'nlon'])
                for v in compute_vars
            })
    return dso
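The masking logic can be illustrated on a tiny hypothetical grid: the area of cells where the field is NaN (e.g. land) is zeroed out, so those cells contribute to neither the area-weighted integral (`normalize=False`) nor the area-weighted mean (`normalize=True`).

```python
import numpy as np
import xarray as xr

# hypothetical 2x2 grid: cell areas, and a field with one missing (land) cell
area = xr.DataArray([[1.0, 2.0], [3.0, 4.0]], dims=("nlat", "nlon"))
field = xr.DataArray([[10.0, 10.0], [np.nan, 10.0]], dims=("nlat", "nlon"))

# zero the area wherever the field is missing, as global_mean does
masked_area = area.where(field.notnull()).fillna(0.0)

integral = (field * masked_area).sum(["nlat", "nlon"])  # normalize=False
mean = integral / masked_area.sum(["nlat", "nlon"])     # normalize=True
```

Here the integral is 10·1 + 10·2 + 10·4 = 70 (the 3.0 cell drops out), and the mean is 70 / 7 = 10, matching the constant field value as expected.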
Take the long-term mean of our dataset. We process monthly to annual with our custom function, then use xarray’s built-in .mean() to go from annual data to a single mean over time; no further weighting is needed there, since each year is the same length.
ds = year_mean(ds).mean("year")
Do some global integrals, to check if our values look reasonable¶
ds_glb = global_mean(ds, ds_grid, variables, normalize=False).compute()
# convert from nmol C/s to Pg C/yr
nmols_to_PgCyr = 1e-9 * 12. * 1e-15 * 365. * 86400.
for v in variables:
ds_glb[v] = ds_glb[v] * nmols_to_PgCyr
ds_glb[v].attrs['units'] = 'Pg C yr$^{-1}$'
ds_glb
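As a sanity check on the conversion factor above (arithmetic only, not from the notebook), it can be built up one unit conversion at a time:

```python
# nmol C/s -> Pg C/yr, one unit conversion per factor
nmol_to_mol = 1e-9           # nmol -> mol
mol_C_to_g = 12.0            # molar mass of carbon, g/mol
g_to_Pg = 1e-15              # g -> Pg
s_to_yr = 365.0 * 86400.0    # seconds in a 365-day year

nmols_to_PgCyr = nmol_to_mol * mol_C_to_g * g_to_Pg * s_to_yr
```

Multiplying these out gives roughly 3.78e-16, the single factor used in the loop above.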
We can compare these values to some observationally derived estimates. Each is calculated in a different way from combinations of data and models, so please reference each linked paper for a detailed discussion. Takahashi et al., 2002 estimate the global air-sea CO$_2$ flux to be 2.2 (+22% or −19%) Pg C yr$^{-1}$. Our value (shown above as FG_CO2) is 2.779 Pg C yr$^{-1}$. This is outside those bounds, but still on the same order of magnitude; these values are also calculated over different time periods, so we don’t expect an exact match. photoC_TOT_zint represents global vertically integrated NPP; Behrenfeld and Falkowski, 1997 estimate this value to be 43.5 Pg C yr$^{-1}$. Our value is 53.26 Pg C yr$^{-1}$, within 22% of the observationally derived value. POC_FLUX_100m represents the particulate organic carbon flux at 100 m depth. DeVries and Weber, 2017 calculate this flux integrated over the entire euphotic zone to be 9.1 ± 0.2 Pg C yr$^{-1}$. Since the depth ranges are different, this isn’t an exact comparison, but the orders of magnitude are similar. This first-pass analysis tells us that CESM is on the right track for these values.
Make some maps¶
First, convert the fluxes from mmol/m$^3$ cm/s to mmol/m$^2$/day.
for var in variables:
    ds[var] = ds[var] * 0.01 * 86400.  # 0.01 m/cm * 86400 s/day
Then, make a few maps of key carbon-related variables.
fig = plt.figure(figsize=(8, 12))

ax = fig.add_subplot(3, 1, 1, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('a) Air-sea CO$_2$ flux', fontsize=12, loc='left')
lon, lat, field = adjust_pop_grid(lons, lats, ds.FG_CO2)
pc = ax.pcolormesh(lon, lat, field, cmap='bwr', vmin=-5, vmax=5, transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
cbar1 = fig.colorbar(pc, ax=ax, extend='both', label='mmol m$^{-2}$ d$^{-1}$')

ax = fig.add_subplot(3, 1, 2, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('b) NPP', fontsize=12, loc='left')
lon, lat, field = adjust_pop_grid(lons, lats, ds.photoC_TOT_zint)
pc = ax.pcolormesh(lon, lat, field, cmap='Greens', vmin=0, vmax=100, transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
cbar1 = fig.colorbar(pc, ax=ax, extend='max', label='mmol m$^{-2}$ d$^{-1}$')

ax = fig.add_subplot(3, 1, 3, projection=ccrs.Robinson(central_longitude=305.0))
ax.set_title('c) POC flux at 100m', fontsize=12, loc='left')
lon, lat, field = adjust_pop_grid(lons, lats, ds.POC_FLUX_100m)
pc = ax.pcolormesh(lon, lat, field, cmap='Oranges', vmin=0, vmax=10, transform=ccrs.PlateCarree())
land = cartopy.feature.NaturalEarthFeature('physical', 'land', scale='110m', edgecolor='k', facecolor='white', linewidth=0.5)
ax.add_feature(land)
cbar1 = fig.colorbar(pc, ax=ax, extend='max', label='mmol m$^{-2}$ d$^{-1}$');
And close the Dask cluster we spun up at the beginning.
cluster.close()
Summary¶
You’ve learned how to make maps of some key quantities related to oceanic carbon.
Resources and references¶
- Takahashi, T., Sutherland, S. C., Sweeney, C., Poisson, A., Metzl, N., Tilbrook, B., Bates, N., Wanninkhof, R., Feely, R. A., Sabine, C., Olafsson, J., & Nojiri, Y. (2002). Global sea–air CO2 flux based on climatological surface ocean pCO2, and seasonal biological and temperature effects. Deep Sea Research Part II: Topical Studies in Oceanography, 49(9–10), 1601–1622. 10.1016/s0967-0645(02)00003-6
- Behrenfeld, M. J., & Falkowski, P. G. (1997). Photosynthetic rates derived from satellite‐based chlorophyll concentration. Limnology and Oceanography, 42(1), 1–20. 10.4319/lo.1997.42.1.0001
- DeVries, T., & Weber, T. (2017). The export and fate of organic matter in the ocean: New constraints from combining satellite and oceanographic tracer observations. Global Biogeochemical Cycles, 31(3), 535–555. 10.1002/2016gb005551
- Sarmiento, J. L., & Gruber, N. (2013). In Ocean Biogeochemical Dynamics (pp. 318–358). Princeton University Press. 10.2307/j.ctt3fgxqx.12