
Gulf Stream Currents

Overview

An example that uses ipyleaflet to reproduce the style of visualization used in the New York Times article In the Atlantic Ocean, Subtle Shifts Hint at Dramatic Dangers (March 2, 2021).

  1. Open an Intake catalog referencing Sea Surface Height data
  2. Make a geographic map of the data using ipyleaflet

Prerequisites

Concepts       Importance    Notes
Xarray         Helpful
Dask           Helpful
Intake         Helpful
ipyleaflet     Helpful
  • Time to learn: 15 minutes

Imports


from ipyleaflet import Map, TileLayer, basemaps
from ipyleaflet.velocity import Velocity
from intake import open_catalog

Load Data

The Copernicus Monitoring Environment Marine Service (CMEMS) is a large repository of ocean products including in-situ observations, satellite-based remote sensing data, and numerical model output.

We want to look at altimeter satellite data to show the Sea Level Anomalies (SLA) for the global ocean. The particular data product is called Global Ocean Gridded L4 Sea Surface Heights and Derived Variables Reprocessed (1993-Ongoing) (SEALEVEL_GLO_PHY_L4_MY_008_047).

This dataset is available as an analysis-ready dataset on the Pangeo Cloud Data Catalog.

cat = open_catalog("https://raw.githubusercontent.com/pangeo-data/pangeo-datastore/master/intake-catalogs/ocean.yaml")
cat["sea_surface_height"]
<intake_xarray.xzarr.ZarrSource at 0x7f71b0eb4ad0>
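
The catalog entry above is an Intake source backed by a Zarr store. If you are curious what else the ocean catalog offers, listing its entries is a quick, optional check:

# List the names of all datasets available in this catalog (optional)
print(list(cat))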

This dataset is marked “requester pays”, which means we have to do an additional step if we are not already on a Pangeo Hub running on the Google Cloud Platform.

Working with requester pays data

Several of the datasets within the Pangeo cloud data catalog are contained in requester pays storage buckets. This means that a user requesting data must provide their own billing project (created and authenticated through Google Cloud Platform) to be billed for the charges associated with accessing a dataset. To set up a GCP billing project and use it for authentication in applications (a short verification sketch follows these steps):

  • Create a project on GCP; if this is the first time using GCP, a prompt will appear to choose a Google account to link to all GCP-related activities.

  • Create a Cloud Billing account associated with the project and enable billing for the project through this account.

  • Using Google Cloud IAM, add the Service Usage Consumer role to your account, which enables it to make billed requests on behalf of the project. Then, from the command line, install the Google Cloud SDK; this can be done using conda:

    conda install -c conda-forge google-cloud-sdk

  • Initialize the gcloud command line interface, logging into the account used to create the aforementioned project and selecting it as the default project; this will allow the project to be used for requester pays access through the command line:

    gcloud init

  • Finally, use gcloud to establish application default credentials; this will allow the project to be used for requester pays access through applications:

    gcloud auth application-default login
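
With credentials in place, a minimal way to confirm that billed requests succeed is to list the bucket with gcsfs. This is only a sketch: "my-billing-project" is a placeholder for your own GCP billing project.

import gcsfs

# Placeholder project name -- substitute your own GCP billing project
fs = gcsfs.GCSFileSystem(project="my-billing-project", requester_pays=True)

# If authentication and billing are configured correctly, this listing succeeds
print(fs.ls("pangeo-cmems-duacs")[:5])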

ds  = cat["sea_surface_height"].to_dask()
ds
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[3], line 1
----> 1 ds  = cat["sea_surface_height"].to_dask()
      2 ds

[traceback abridged: the read passes through intake, xarray, zarr, and gcsfs before failing]

ValueError: Bad Request: https://storage.googleapis.com/download/storage/v1/b/pangeo-cmems-duacs/o/zarr.json?alt=media
User project specified in the request is invalid.
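
The error above is what happens when no valid billing project is supplied for the requester pays bucket; once the credentials described earlier are in place, the catalog call succeeds. An equivalent path, shown only as a sketch, is to open the Zarr store directly with xarray and pass the billing project explicitly (the bucket name comes from the error message above; "my-billing-project" is a placeholder):

import xarray as xr

# Placeholder project name -- substitute your own GCP billing project
ds = xr.open_zarr(
    "gs://pangeo-cmems-duacs",
    consolidated=True,
    storage_options={"requester_pays": True, "project": "my-billing-project"},
)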

Make a Map

center = [35, -50]   # roughly the Gulf Stream region of the North Atlantic
zoom = 4
m = Map(center=center, zoom=zoom, interpolation='nearest', basemap=basemaps.Gaode.Satellite)

display_options = {
    'velocityType': 'Global Wind',
    'displayPosition': 'bottomleft',
    'displayEmptyString': 'No wind data'
}

# Build a Velocity layer from the most recent time step, using the geostrophic
# velocity components (ugos, vgos) as the zonal and meridional speeds
wind = Velocity(
    data=ds.isel(time=-1),
    zonal_speed='ugos', meridional_speed='vgos',
    latitude_dimension='latitude', longitude_dimension='longitude',
    velocity_scale=0.2, max_velocity=1,
    display_options=display_options
)

m.add_layer(wind)

m
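
Displaying m renders an interactive map with animated flow lines tracing the surface currents. If you want to share the result outside a live notebook session, one option is to embed the widget state in a standalone HTML file; this is a sketch and the filename is arbitrary:

from ipywidgets.embed import embed_minimal_html

# Write the map and its layers to a self-contained HTML page
embed_minimal_html('gulf_stream_map.html', views=[m], title='Gulf Stream Currents')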

Summary

In this example we loaded sea level data from an analysis-ready, cloud-based dataset and visualized the surface currents using the ipyleaflet mapping library.

Resources and references