Polar ocean temperatures and sea ice thickness
Description & purpose: This notebook demonstrates how to find and plot EOCIS-related data held on the Earth Observation Data Hub (EODH). It is assumed that the notebook is running within the Notebook Service on the Hub.
Setup
The first thing to do is to ensure that the most recent version of pyeodh is installed on your system. It is good practice to run the following cell if you have not installed pyeodh or have not used it in a while. The cell also installs the plotting and spatial libraries required.
# Run this cell if pyeodh is not installed, or needs updating
%pip install --upgrade pyeodh
%pip install folium shapely xarray fsspec rioxarray cartopy pyproj
Imports
import pyeodh
import os
import subprocess
import cartopy.crs as ccrs
import folium
import fsspec
import geopandas as gpd
import matplotlib.pyplot as plt
import shapely as sh
import xarray as xr
from pyproj import Transformer
from shapely.geometry import Point
Set geographic locations
# Point of interest - ice/land (Nuuk, Greenland)
ipnt = sh.Point(-51.7216, 64.1835)
# Point of interest - ocean (Arctic Ocean, north of Greenland)
olon, olat = [-54.22884, 83.11055]
opnt = sh.Point(olon, olat)
# Create a folium map centred between the points
m = folium.Map(location=[(ipnt.y + opnt.y)/2, (ipnt.x + opnt.x)/2], zoom_start=3)
# Add points to the map
for pt, name in [(ipnt, "Nuuk"), (opnt, "Arctic Ocean")]:
    folium.Marker([pt.y, pt.x], popup=name).add_to(m)
m
Exploring the Resource Catalogue
Now we are ready to investigate the Resource Catalogue.
We need to create an instance of the Client, which is our entrypoint to EODH APIs. From there we can start to search the collections held within the platform. First we'll look at the paths to the data.
# Connect to the Hub
client = pyeodh.Client(base_url="https://eodatahub.org.uk").get_catalog_service()
for c in client.get_catalogs():
    print(c._pystac_object.self_href.removeprefix(
        "https://eodatahub.org.uk/api/catalogue/stac/catalogs/"))
commercial/catalogs/airbus
user/catalogs/npl/catalogs/processing-results/catalogs/qa-workflow/catalogs/airbus_phr_qa
user/catalogs/tjellicoetpzuk/catalogs/processing-results/catalogs/snuggs/catalogs/catalog
public/catalogs/ceda-stac-catalogue
commercial
user/catalogs/npl
commercial/catalogs/opencosmos
commercial/catalogs/planet
user/catalogs/npl/catalogs/processing-results/catalogs/qa-workflow/catalogs/planet_psscene_qa
user/catalogs/npl/catalogs/processing-results
Now we'll look explicitly for the EOCIS climate data.
for index, collect in enumerate(
        [c for c in client.get_collections() if "eocis" in c.id],
        start=1):
    print(f"{index} -- {collect.id}")
1 -- eocis-sst-cdrv3
2 -- eocis-soil-moisture-africa
3 -- eocis-lst-s3b-night
4 -- eocis-lst-s3b-day
5 -- eocis-lst-s3a-night
6 -- eocis-lst-s3a-day
7 -- eocis-chuk-land-vegetation-lai
8 -- eocis-chuk-land-vegetation-fapar
9 -- eocis-chuk-geospatial-landcover
10 -- eocis-chuk-geospatial-landclass
11 -- eocis-chuk-geospatial-elevation
12 -- eocis-chuk-geospatial-builtarea
13 -- eocis-chuk-geospatial
14 -- eocis-arctic-sea-ice-thickness-monthly
15 -- eocis-aerosol-slstr-monthly-s3b
16 -- eocis-aerosol-slstr-monthly-s3a
17 -- eocis-aerosol-slstr-daily-s3b
18 -- eocis-aerosol-slstr-daily-s3a
Sea Ice
Note: For additional details see this link.
We need to connect to the Sea Ice collection and get some simple information about it.
datasets = client.get_catalog(
    "public/catalogs/ceda-stac-catalogue"
).get_collection("eocis-arctic-sea-ice-thickness-monthly")
print("id: ", datasets.id)
print("title: ", datasets.title)
print("description: ", datasets.description)
print("")
print(
    "DATASET TEMPORAL EXTENT: ",
    [str(d) for d in datasets.extent.temporal.intervals[0]],
)
id:  eocis-arctic-sea-ice-thickness-monthly
title:  EOCIS Arctic Monthly Gridded Sea Ice Thickness Product from CryoSat-2
description:  The sea ice products provide a 5x5 km grid of Arctic sea ice thickness (for the whole Arctic region and of 17 sub-regions), delivered as NetCDF files. EOCIS sea ice thickness NetCDF products are generated monthly by the Centre for Polar Observation and Modelling (CPOM) from radar altimetry measurements taken from the ESA CryoSAT-2 satellite during the winter months (Oct-Apr). Sea ice thickness is only reliably measured from satellite radar altimetry during the winter months. During summer, melt ponds can form on the sea ice floes making it difficult for the satellite to differentiate between floes and leads, and hence calculate sea ice freeboard (and subsequently thickness). Measurement during summer months using radar altimetry is an area of active research (Landy et al, 2022) but is not yet operationally processed.

DATASET TEMPORAL EXTENT:  ['2010-11-15 00:00:00+00:00', '2024-11-15 00:00:00+00:00']
It would be useful to understand more about the items held within the catalogue. The attributes of a catalogue are mapped to a series of properties; for instance, we can find out properties such as id, title and description.
# Access one of the items and see what properties are available.
# get_items() returns an iterator, so convert it to a list first.
items = list(datasets.get_items())
print("Number of items available: ", len(items))
print("-----")
item = items[1]  # second item in the list
print(f"\n{item.id}")
for k, v in item.properties.items():
    print(f"  {k}: {v}")
Number of items available: 198
-----
0a2cdd11-d4e1-4806-ab4f-1a66849cf30a
title: EOCIS Arctic Monthly Gridded Sea Ice Thickness 2024-11-15
description: The sea ice products provide a 5x5 km grid of Arctic sea ice thickness (for the whole Arctic region and of 17 sub-regions), delivered as NetCDF files. EOCIS sea ice thickness NetCDF products are generated monthly by the Centre for Polar Observation and Modelling (CPOM) from radar altimetry measurements taken from the ESA CryoSAT-2 satellite during the winter months (Oct-Apr). Sea ice thickness is only reliably measured from satellite radar altimetry during the winter months. During summer, melt ponds can form on the sea ice floes making it difficult for the satellite to differentiate between floes and leads, and hence calculate sea ice freeboard (and subsequently thickness). Measurement during summer months using radar altimetry is an area of active research (Landy et al, 2022) but is not yet operationally processed.
datetime: 2024-11-15T00:00:00Z
created: 2025-10-09T10:39:36.790911Z
updated: 2026-03-12T02:33:14.131303Z
license: cc-by-4.0
platform: CryoSat-2
instruments: ['SIRAL']
renders: {'sea_ice_thickness_stdev': {'reference': True, 'assets': ['reference_file'], 'rescale': [0, 2], 'variable': 'sea_ice_thickness_stdev', 'colormap_name': 'viridis', 'title': 'Standard deviation of Sea Ice Thickness'}, 'sea_ice_thickness': {'reference': True, 'assets': ['reference_file'], 'rescale': [0, 3.5], 'variable': 'sea_ice_thickness', 'colormap_name': 'blues_r', 'title': 'Sea Ice Thickness'}}
sci:doi: https://dx.doi.org/10.5285/4c246cb11aa04651a0182b2d329a84f9
sci:publications: []
sci:citation: Ridout, A.; Palmer, B.; Muir, A.; Swiggs, A.; McMillan, M.; Maddalena, J. (2025): EOCIS: Arctic Sea Ice Thickness Grids, v1.00. NERC EDS Centre for Environmental Data Analysis, 20 March 2025. doi:10.5285/4c246cb11aa04651a0182b2d329a84f9.
cube:variables: {'n_thickness_measurements': {'description': 'number of contributing thickness measurements', 'type': 'auxiliary', 'dimensions': ['time', 'yc', 'xc']}, 'sea_ice_thickness_stdev': {'unit': 'm', 'description': 'standard deviation of sea ice thickness, defined as the standard deviation of the thickness in the grid cell', 'type': 'data', 'dimensions': ['time', 'yc', 'xc']}, 'sea_ice_thickness': {'unit': 'm', 'description': 'sea ice thickness', 'type': 'data', 'dimensions': ['time', 'yc', 'xc']}}
project: UK Earth Observation Climate Information Service (EOCIS)
cube:dimensions: {'yc': {'extent': [-5350000, 5849999], 'unit': 'm', 'reference_system': 3413, 'type': 'spatial', 'axis': 'yc'}, 'xc': {'extent': [-3850000, 3749999], 'unit': 'm', 'reference_system': 3413, 'type': 'spatial', 'axis': 'xc'}, 'time': {'values': ['2024-11-15T00:00:00Z'], 'step': 'P1M', 'type': 'temporal'}}
resolution: 5 km
institution: EOCIS UK
We can then get the item id, date and title for all items. It is recommended to look at only the first few, as the number of items could be large.
# Warning: without the limit to 10 items this will take a long time for large catalogues
for item in items[:10]:
    print(item.id, "---", item.properties["datetime"], "---", item.properties["title"])
ea60feff-3a21-49d1-b79f-9179c052cfc0 --- 2024-11-15T00:00:00Z --- EOCIS Sea-Surface Temperatures V3 2024-11-15
0a2cdd11-d4e1-4806-ab4f-1a66849cf30a --- 2024-11-15T00:00:00Z --- EOCIS Arctic Monthly Gridded Sea Ice Thickness 2024-11-15
f2a5764b-5f1e-4011-a8c2-d1eb75995d3b --- 2024-10-15T00:00:00Z --- EOCIS Sea-Surface Temperatures V3 2024-10-15
c2a3829e-ea0c-4379-be8c-da4f7e0e6c07 --- 2024-10-15T00:00:00Z --- EOCIS Arctic Monthly Gridded Sea Ice Thickness 2024-10-15
f476f3a5-14ca-4055-9b4d-d735946338ff --- 2024-04-15T00:00:00Z --- EOCIS Arctic Monthly Gridded Sea Ice Thickness 2024-04-15
d3ee4814-f522-4bf9-99d6-45bff8018cec --- 2024-04-15T00:00:00Z --- EOCIS Sea-Surface Temperatures V3 2024-04-15
aa5d9ec6-d7f3-4a23-b55b-ee6751bb7170 --- 2024-03-15T00:00:00Z --- EOCIS Arctic Monthly Gridded Sea Ice Thickness 2024-03-15
113583bf-a0d3-499d-9252-ea6ec096abe7 --- 2024-03-15T00:00:00Z --- EOCIS Sea-Surface Temperatures V3 2024-03-15
d9caf30d-ca73-4673-a079-17342019c598 --- 2024-02-14T00:00:00Z --- EOCIS Arctic Monthly Gridded Sea Ice Thickness 2024-02-14
8d23432c-9f46-4ae9-9854-27b880d675f3 --- 2024-02-14T00:00:00Z --- EOCIS Sea-Surface Temperatures V3 2024-02-14
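If you are iterating straight from the catalogue rather than from the items list, a standard-library sketch with itertools.islice caps the number of items consumed without materialising the whole collection. The generator below is a hypothetical stand-in for a large, lazily paginated item iterator such as the one returned by get_items().

```python
from itertools import islice

def preview(iterable, n=10):
    """Take at most n elements from any iterable, leaving the rest unread."""
    return list(islice(iterable, n))

# Hypothetical stand-in for a large, lazily paginated STAC item iterator.
item_ids = (f"item-{i:04d}" for i in range(1_000_000))
print(preview(item_ids, n=3))  # only three values are ever generated
```

Because islice stops pulling from the iterator after n elements, this stays fast no matter how large the catalogue is.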
The next thing to do is find out what assets are held within each item. The asset is the actual data that we are interested in. We will look at the asset information for the first item returned in the list.
for item in items[:1]:  # process only the first item
    print(f"Item ID: {item.id}")
    print(f"Item name: {item.properties['title']}")
    print("Assets:")
    if not item.assets:
        print("  No assets available.")
    else:
        for asset_key, asset in item.assets.items():
            # Convert each asset to a dict for readable output
            print(f"  - {asset_key}: {asset.to_dict()}")
            print("-" * 40)  # separator for readability
Item ID: ea60feff-3a21-49d1-b79f-9179c052cfc0
Item name: EOCIS Sea-Surface Temperatures V3 2024-11-15
Assets:
- reference_file: {'href': 'https://gws-access.jasmin.ac.uk/public/nceo_uor/eocis-stac/eocis-arctic-sea-ice-thickness-monthly-aux/items/2024/11/EOCIS-SEAICE-L3C-SITHICK-CS2-5KM-202411-fv1.0-kerchunk.json', 'type': 'application/zstd', 'cloud_format': 'kerchunk', 'checksum_type': None, 'open_kwargs': {'open_mapper_kwargs': {}, 'open_zarr_kwargs': {}, 'open_xarray_kwargs': {}}, 'size': None, 'checksum': None, 'roles': ['reference', 'data']}
----------------------------------------
- EOCIS-SEAICE-L3C-SITHICK-CS2-5KM-202411-fv1.0: {'href': 'https://dap.ceda.ac.uk/neodc/eocis/data/global_and_regional/arctic_sea_ice/arctic_sea_ice_thickness_grids/L3C/monthly/v1.0/EOCIS-SEAICE-L3C-SITHICK-CS2-5KM-202411-fv1.0.nc', 'type': 'application/netcdf', 'roles': ['data']}
----------------------------------------
# We can also search by properties, such as date range
datasearch = client.search(
    collections=["eocis-arctic-sea-ice-thickness-monthly"],
    catalog_paths=["public/catalogs/ceda-stac-catalogue"],
    start_datetime="2024-01-01",
    end_datetime="2024-12-30",
)
# Print each item id and associated date
for i, myitem in enumerate(datasearch, start=1):
    print(f"{i}: {myitem.id}: {myitem.properties}")
1: ea60feff-3a21-49d1-b79f-9179c052cfc0: {'datetime': '2024-11-15T00:00:00Z'}
2: 0a2cdd11-d4e1-4806-ab4f-1a66849cf30a: {'datetime': '2024-11-15T00:00:00Z'}
3: f2a5764b-5f1e-4011-a8c2-d1eb75995d3b: {'datetime': '2024-10-15T00:00:00Z'}
4: c2a3829e-ea0c-4379-be8c-da4f7e0e6c07: {'datetime': '2024-10-15T00:00:00Z'}
5: f476f3a5-14ca-4055-9b4d-d735946338ff: {'datetime': '2024-04-15T00:00:00Z'}
6: d3ee4814-f522-4bf9-99d6-45bff8018cec: {'datetime': '2024-04-15T00:00:00Z'}
7: aa5d9ec6-d7f3-4a23-b55b-ee6751bb7170: {'datetime': '2024-03-15T00:00:00Z'}
8: 113583bf-a0d3-499d-9252-ea6ec096abe7: {'datetime': '2024-03-15T00:00:00Z'}
9: d9caf30d-ca73-4673-a079-17342019c598: {'datetime': '2024-02-14T00:00:00Z'}
10: 8d23432c-9f46-4ae9-9854-27b880d675f3: {'datetime': '2024-02-14T00:00:00Z'}
11: f003fdb3-3635-4b8f-80d8-894592432fcf: {'datetime': '2024-01-15T00:00:00Z'}
12: 9cf039bd-a1c0-46a1-96f2-c412a7c82e46: {'datetime': '2024-01-15T00:00:00Z'}
# Or get the details for just the first item in the search output
item1 = next(iter(datasearch))
print(item1.id)
print(item1.properties["datetime"])
print(item1.assets["reference_file"])
ea60feff-3a21-49d1-b79f-9179c052cfc0
2024-11-15T00:00:00Z
<Asset href=https://gws-access.jasmin.ac.uk/public/nceo_uor/eocis-stac/eocis-arctic-sea-ice-thickness-monthly-aux/items/2024/11/EOCIS-SEAICE-L3C-SITHICK-CS2-5KM-202411-fv1.0-kerchunk.json>
Now we are ready to load some data and visualise it. To do this we will need to use xarray.
Note
The data are provided in NetCDF (Network Common Data Form) format, a widely used standard for storing, sharing and analysing scientific data. NetCDF is designed to handle large, multi-dimensional datasets, such as temperature, precipitation and wind speed, across space and time. The CEDA STAC library also contains reference files in Kerchunk format. Kerchunk is a lightweight reference format that maps the chunks of data inside archival formats such as NetCDF so that they can be loaded lazily, accessing only the chunks required for a given request. Rather than converting the source NetCDF to a different format, a Kerchunk file contains pointers to the individual chunks of data and uses the fsspec library to assemble those chunks into the required array in a tool such as xarray.
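To make the mechanism concrete: a Kerchunk (version 0) reference file is just JSON that maps Zarr-style keys either to inline metadata or to [url, offset, length] pointers into the untouched source file. The entries below are a hand-written, hypothetical sketch (example URL, made-up offsets), not taken from a real EOCIS reference file.

```python
# Hypothetical Kerchunk reference mapping; a real file has one entry per
# chunk of every variable in the source NetCDF.
refs = {
    "version": 1,
    "refs": {
        ".zgroup": '{"zarr_format": 2}',  # inline Zarr metadata
        # Pointer into the remote file: [URL, byte offset, byte length].
        "sea_ice_thickness/0.0.0": [
            "https://example.org/EOCIS-SEAICE-example.nc", 20480, 512000
        ],
    },
}

# fsspec's "reference" filesystem interprets this mapping, so xarray can
# request a slice and fetch only the byte ranges that slice touches.
url, offset, length = refs["refs"]["sea_ice_thickness/0.0.0"]
print(f"chunk lives at bytes {offset}..{offset + length} of {url}")
```

This is why the notebook can open a remote multi-megabyte NetCDF and read only a small region of it.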
The following cells check whether the remote service is up and fall back to a local download if any errors occur.
# --- Identify assets ---
ker_href = None
nc_href = None
for k, a in item.assets.items():
    if ".nc" in a.href:
        nc_href = a.href
    elif k == "reference_file":
        ker_href = a.href
print("Kerchunk href:", ker_href)
print("NetCDF href:", nc_href)
Kerchunk href: https://gws-access.jasmin.ac.uk/public/nceo_uor/eocis-stac/eocis-arctic-sea-ice-thickness-monthly-aux/items/2024/11/EOCIS-SEAICE-L3C-SITHICK-CS2-5KM-202411-fv1.0-kerchunk.json
NetCDF href: https://dap.ceda.ac.uk/neodc/eocis/data/global_and_regional/arctic_sea_ice/arctic_sea_ice_thickness_grids/L3C/monthly/v1.0/EOCIS-SEAICE-L3C-SITHICK-CS2-5KM-202411-fv1.0.nc
ds = None
# --- Try kerchunk first: open the reference file via fsspec + zarr ---
if ker_href is not None:
    try:
        fs = fsspec.filesystem("reference", fo=ker_href)
        mapper = fs.get_mapper("")
        ds = xr.open_dataset(mapper, engine="zarr", consolidated=False)
    except Exception as e:
        print("Kerchunk failed")
        print("Error:", e)
        ds = None
# --- Fallback: download NetCDF with wget ---
if ds is None:
    if nc_href is None:
        raise ValueError("No NetCDF asset available for fallback")
    print("Falling back to NetCDF download via wget...")
    local_file = os.path.basename(nc_href)
    # Download if not already present
    if not os.path.exists(local_file):
        subprocess.run(["wget", "-O", local_file, nc_href], check=True)
        print(f"Downloaded: {local_file}")
    else:
        print(f"File already exists: {local_file}")
    # Open locally
    ds = xr.open_dataset(local_file)
# --- Inspect dataset ---
print("\n=== DATASET SUMMARY ===")
print(ds)
print("\n=== VARIABLES ===")
print(list(ds.data_vars))
print("\n=== COORDINATES ===")
print(list(ds.coords))
print("\n=== GLOBAL ATTRIBUTES ===")
for k, v in ds.attrs.items():
    print(f"{k}: {v}")
Kerchunk failed
Falling back to NetCDF download via wget...
File already exists: EOCIS-SEAICE-L3C-SITHICK-CS2-5KM-202411-fv1.0.nc
=== DATASET SUMMARY ===
<xarray.Dataset> Size: 68MB
Dimensions: (xc: 1520, yc: 2240, time: 1, nv: 2)
Coordinates:
* xc (xc) float32 6kB -3.85e+06 -3.845e+06 ... 3.75e+06
* yc (yc) float32 9kB -5.35e+06 -5.345e+06 ... 5.85e+06
lat (yc, xc) float32 14MB ...
lon (yc, xc) float32 14MB ...
* time (time) datetime64[ns] 8B 2024-11-15
Dimensions without coordinates: nv
Data variables:
sea_ice_thickness (time, yc, xc) float32 14MB ...
sea_ice_thickness_stdev (time, yc, xc) float32 14MB ...
n_thickness_measurements (time, yc, xc) int32 14MB ...
time_bnds (time, nv) datetime64[ns] 16B ...
polar_stereographic int8 1B ...
Attributes: (12/32)
title: Arctic Monthly Gridded Sea Ice Thickness Produ...
Conventions: CF-1.10
platform: CryoSat-2
sensor: SIRAL
institution: Centre for Polar Observation and Modelling (CP...
keywords: Sea Ice, Thickness, Radar Altimeters
... ...
standard_name_vocabulary: CF
conventions: CF-1.10
processing_level: Level-3C
license: Creative Commons Attribution 4.0 International...
creator_name: Andy Ridout, Alan Muir (CPOM, University Colle...
date_created: 20250114T095946Z
=== VARIABLES ===
['sea_ice_thickness', 'sea_ice_thickness_stdev', 'n_thickness_measurements', 'time_bnds', 'polar_stereographic']
=== COORDINATES ===
['xc', 'yc', 'lat', 'lon', 'time']
=== GLOBAL ATTRIBUTES ===
title: Arctic Monthly Gridded Sea Ice Thickness Product from CryoSat-2
Conventions: CF-1.10
platform: CryoSat-2
sensor: SIRAL
institution: Centre for Polar Observation and Modelling (CPOM), U.K
keywords: Sea Ice, Thickness, Radar Altimeters
project: EOCIS
creator_url: http://www.cpom.org.uk
spatial_resolution: 5 km
geospatial_lat_min: 67.1128
geospatial_lat_max: 88.2284
geospatial_lon_min: 0.0002
geospatial_lon_max: 359.9999
geospatial_vertical_min: 0.0
geospatial_vertical_max: 0.0
time_coverage_start: 20241101T000000Z
time_coverage_end: 20241130T235959Z
time_coverage_duration: P1M
time_coverage_resolution: P1M
ndays: 30
start_day: 1
start_month: 11
start_year: 2024
end_day: 30
end_month: 11
end_year: 2024
standard_name_vocabulary: CF
conventions: CF-1.10
processing_level: Level-3C
license: Creative Commons Attribution 4.0 International (CC BY 4.0)
creator_name: Andy Ridout, Alan Muir (CPOM, University College London)
date_created: 20250114T095946Z
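Several of these attributes (time_coverage_start, date_created and friends) use a compact ISO form such as 20241101T000000Z, which the standard library parses directly. A small helper, as a sketch:

```python
from datetime import datetime, timezone

def parse_compact_iso(s: str) -> datetime:
    """Parse compact UTC timestamps like '20241101T000000Z'."""
    return datetime.strptime(s, "%Y%m%dT%H%M%SZ").replace(tzinfo=timezone.utc)

# Values taken from the global attributes printed above.
start = parse_compact_iso("20241101T000000Z")
end = parse_compact_iso("20241130T235959Z")
print(start.isoformat())   # 2024-11-01T00:00:00+00:00
print((end - start).days)  # 29
```

Attaching the UTC tzinfo explicitly avoids naive/aware comparison errors when these values are later compared against STAC datetimes.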
Now we need to set the projection, reproject the grid and plot it.
# set the projection
ds3413 = ds.rio.write_crs("EPSG:3413", inplace=False)
sit = ds3413["sea_ice_thickness"].squeeze()
proj = ccrs.Stereographic(central_latitude=90, central_longitude=-45) # NSIDC default
plt.figure(figsize=(10,10))
ax = plt.axes(projection=proj)
# Transform from EPSG:3413 to map projection
sit.plot(
    ax=ax,
    transform=ccrs.epsg(3413),
    cmap="viridis",
    cbar_kwargs={"label": "Sea Ice Thickness (m)"},
)
ax.coastlines()
ax.set_title("Sea Ice Thickness (EPSG:3413)")
plt.show()
# Transform lon/lat for ocean point to EPSG:3413
to_3413 = Transformer.from_crs("EPSG:4326", "EPSG:3413", always_xy=True)
x0, y0 = to_3413.transform(olon, olat)
# 50 km buffer in metres
aoi_buffer = Point(x0, y0).buffer(50_000) # 50 km radius
# Build GeoDataFrame in EPSG:3413
gdf = gpd.GeoDataFrame(
    {"geometry": [aoi_buffer]},
    crs="EPSG:3413",
)
# Clip to buffer polygon
sit_clip = sit.rio.clip(gdf.geometry, gdf.crs)
plt.figure(figsize=(8,8))
ax = plt.axes(projection=proj)
sit_clip.plot(
    ax=ax,
    transform=ccrs.epsg(3413),
    cmap="viridis",
    cbar_kwargs={"label": "Sea Ice Thickness (m)"},
)
# Draw AOI point
ax.plot(x0, y0, "ro", markersize=6, transform=ccrs.epsg(3413))
ax.coastlines()
ax.set_title("Sea Ice Thickness – 50 km AOI Buffer (EPSG:3413)")
plt.show()
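With the clipped grid in hand, a typical next step is a summary statistic over the AOI. On the real data this is simply sit_clip.mean(skipna=True), which ignores NaN-masked cells; the same skip-the-NaNs rule is shown below on a tiny synthetic grid (hypothetical values, not EOCIS data).

```python
import math

# Synthetic 2x2 stand-in for the clipped thickness grid, in metres;
# NaN marks cells outside the ice mask (open water or land).
sit_vals = [[1.2, math.nan],
            [0.8, 1.0]]
valid = [v for row in sit_vals for v in row if not math.isnan(v)]
mean_m = sum(valid) / len(valid)  # average over valid cells only
print(f"Mean sea ice thickness in AOI: {mean_m:.2f} m")  # 1.00 m
```

Averaging over valid cells only matters here: treating NaN as zero would bias the AOI mean low wherever the buffer overlaps open water.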
Sea Surface Temperature
We can do something similar for other datasets. Following the data discovery steps above, we will search for data from 21 June 2024 and then extract the asset URL from the returned item.
sstsearch = client.search(
    collections=["eocis-sst-cdrv3"],
    catalog_paths=["public/catalogs/ceda-stac-catalogue"],
    start_datetime="2024-06-21",
    end_datetime="2024-06-21",
)
# Get the details for just the first item in the search output
item1 = next(iter(sstsearch))
# --- Identify assets ---
ker_href = None
nc_href = None
for k, a in item1.assets.items():
    if ".nc" in a.href:
        nc_href = a.href
    elif k == "reference_file":
        ker_href = a.href
print("Kerchunk href:", ker_href)
print("NetCDF href:", nc_href)
Kerchunk href: https://gws-access.jasmin.ac.uk/public/nceo_uor/eocis-stac/sst-cdrv3-aux/items/2024/06/20240621120000-ESACCI-L4_GHRSST-SSTdepth-OSTIA-GLOB_ICDR3.0-v02.0-fv01.0-kerchunk.json
NetCDF href: https://dap.ceda.ac.uk/neodc/eocis/data/global_and_regional/sea_surface_temperature/CDR_v3/Analysis/L4/v3.0.1/2024/06/21/20240621120000-ESACCI-L4_GHRSST-SSTdepth-OSTIA-GLOB_ICDR3.0-v02.0-fv01.0.nc
The data can also be searched for in the [Resource Catalogue](https://eodatahub.org.uk/static-apps/sg-rc-ui/prod/index.html#/).
Select the EOCIS Sea-Surface Temperatures V3 catalogue and then filter on the date range required. The filter should return a single item.
Click on the View Metadata icon and then click on the Assets tab. Expand the reference_file dropdown and copy the URL provided into a code variable called nhref. This has been completed for you in the cell below, and commented out so that it doesn't overwrite the output from the previous cell.
dsst = None
# --- Try kerchunk first: open the reference file via fsspec + zarr ---
if ker_href is not None:
    try:
        fs = fsspec.filesystem("reference", fo=ker_href)
        mapper = fs.get_mapper("")
        dsst = xr.open_dataset(mapper, engine="zarr", consolidated=False)
    except Exception as e:
        print("Kerchunk failed")
        print("Error:", e)
        dsst = None
# --- Fallback: download NetCDF with wget ---
if dsst is None:
    if nc_href is None:
        raise ValueError("No NetCDF asset available for fallback")
    print("Falling back to NetCDF download via wget...")
    local_file = os.path.basename(nc_href)
    # Download if not already present
    if not os.path.exists(local_file):
        subprocess.run(["wget", "-O", local_file, nc_href], check=True)
        print(f"Downloaded: {local_file}")
    else:
        print(f"File already exists: {local_file}")
    # Open locally
    dsst = xr.open_dataset(local_file)
# --- Inspect dataset ---
print("\n=== DATASET SUMMARY ===")
print(dsst)
print("\n=== VARIABLES ===")
print(list(dsst.data_vars))
print("\n=== COORDINATES ===")
print(list(dsst.coords))
print("\n=== GLOBAL ATTRIBUTES ===")
for k, v in dsst.attrs.items():
    print(f"{k}: {v}")
=== DATASET SUMMARY ===
<xarray.Dataset> Size: 415MB
Dimensions: (time: 1, bnds: 2, lat: 3600, lon: 7200)
Coordinates:
* time (time) datetime64[ns] 8B 2024-06-21T12:00:00
* lat (lat) float32 14kB -89.97 -89.93 ... 89.93 89.97
* lon (lon) float32 29kB -180.0 -179.9 ... 179.9 180.0
Dimensions without coordinates: bnds
Data variables:
time_bnds (time, bnds) datetime64[ns] 16B ...
lat_bnds (lat, bnds) float32 29kB ...
lon_bnds (lon, bnds) float32 58kB ...
analysed_sst (time, lat, lon) float32 104MB ...
analysed_sst_uncertainty (time, lat, lon) float32 104MB ...
sea_ice_fraction (time, lat, lon) float32 104MB ...
mask (time, lat, lon) float32 104MB ...
Attributes: (12/66)
Conventions: CF-1.5, Unidata Observation Dataset v1.0
title: ESA SST CCI Analysis v3.0
summary: European Space Agency Sea Surface Tempera...
references: Embury, O. et al. Satellite-based time-se...
institution: ESACCI
history: Created using OSTIA reanalysis system ICD...
... ...
processing_level: L4
cdm_data_type: grid
product_specification_version: SST_CCI-PSD-UKMO-201-Issue-2
key_variables: analysed_sst,sea_ice_fraction
contact: https://climate.esa.int/en/projects/sea-s...
creation_date: 2024-07-04T15:56:27Z
=== VARIABLES ===
['time_bnds', 'lat_bnds', 'lon_bnds', 'analysed_sst', 'analysed_sst_uncertainty', 'sea_ice_fraction', 'mask']
=== COORDINATES ===
['time', 'lat', 'lon']
=== GLOBAL ATTRIBUTES ===
Conventions: CF-1.5, Unidata Observation Dataset v1.0
title: ESA SST CCI Analysis v3.0
summary: European Space Agency Sea Surface Temperature Climate Change Initiative: Analysis product version 3.0
references: Embury, O. et al. Satellite-based time-series of sea-surface temperature since 1980 for climate applications. Scientific Data (2024). https://doi.org/10.1038/s41597-024-03147-w
institution: ESACCI
history: Created using OSTIA reanalysis system ICDR3.0
comment: These data were produced by the Met Office as part of the MCAS project. WARNING Some applications are unable to properly handle signed byte values. If values are encountered > 127, please subtract 256 from this reported value
license: Creative Commons Attribution 4.0 https://creativecommons.org/licenses/by/4.0/
Users of these data should cite the dataset along with the dataset paper: Embury, O. et al. Satellite-based time-series of sea-surface temperature since 1980 for climate applications. Scientific Data (2024). https://doi.org/10.1038/s41597-024-03147-w
id: OSTIA-ESACCI-L4-GLOB_ICDR-v3.0
naming_authority: org.ghrsst
product_version: 3.0.1
uuid: ef4779ea-3d9a-42b5-8423-9da05b7b7d7d
tracking_id: ef4779ea-3d9a-42b5-8423-9da05b7b7d7d
gds_version_id: 2.0
netcdf_version_id: 4.3.2
date_created: 20240704T155627Z
file_quality_level: 3
spatial_resolution: 0.05 degree
start_time: 20240621T000000Z
time_coverage_start: 20240621T000000Z
stop_time: 20240622T000000Z
time_coverage_end: 20240622T000000Z
time_coverage_duration: P1D
time_coverage_resolution: P1D
source: AVHRRMTB-UKEOCIS-L3U-ICDR-v3.0, SLSTRA-UKEOCIS-L3U-ICDR-v3.0, SLSTRB-UKEOCIS-L3U-ICDR-v3.0, EUMETSAT_OSI-SAF-ICE-OSI-430-a
platform: MetOpB, Sentinel-3A, Sentinel-3B
sensor: AVHRR, SLSTR
Metadata_Conventions: Unidata Dataset Discovery v1.0
metadata_link: https://doi.org/10.5285/4a9654136a7148e39b7feb56f8bb02d2
doi: 10.5285/4a9654136a7148e39b7feb56f8bb02d2
keywords: Oceans > Ocean Temperature > Sea Surface Temperature
keywords_vocabulary: NASA Global Change Master Directory (GCMD) Science Keywords
standard_name_vocabulary: NetCDF Climate and Forecast (CF) Metadata Convention
geospatial_lat_min: -90.0
geospatial_lat_max: 90.0
geospatial_lat_units: degrees_north
geospatial_lat_resolution: 0.05000000074505806
geospatial_lon_min: -180.0
geospatial_lon_max: 180.0
geospatial_lon_units: degrees_east
geospatial_lon_resolution: 0.05000000074505806
northernmost_latitude: 90.0
southernmost_latitude: -90.0
easternmost_longitude: 180.00001525878906
westernmost_longitude: -180.0
geospatial_vertical_min: -0.20000000298023224
geospatial_vertical_max: -0.20000000298023224
acknowledgment: The European Space Agency (ESA) funded the research and development of software to generate these data (grant reference ESA/AO/1-9322/18/I-NB), in addition to funding the production of the Climate Data Record (CDR) for 1980 to 2021. The Copernicus Climate Change Service (C3S) funded the development of the Interim-CDR (ICDR) extension and production of ICDR during 2022. From 2023 onwards the production of the ICDR is funded by the UK Natural Environment Research Council (NERC grant reference number NE/X019071/1, Earth Observation Climate Information Service) and the UK Marine and Climate Advisory Service (UKMCAS), benefitting from the Earth Observation Investment Package of the Department of Science, Innovation and Technology.
creator_name: Met Office
creator_url: https://www.metoffice.gov.uk
creator_email: ml-ostia@metoffice.gov.uk
creator_type: institution
creator_institution: Met Office
project: UK Marine and Climate Advisory Service
contributor_name: JASMIN
contributor_role: This work used JASMIN, the UK's collaborative data analysis environment (https://jasmin.ac.uk)
publisher_name: NERC EDS Centre for Environmental Data Analysis
publisher_url: https://www.ceda.ac.uk
publisher_email: support@ceda.ac.uk
publisher_type: institution
processing_level: L4
cdm_data_type: grid
product_specification_version: SST_CCI-PSD-UKMO-201-Issue-2
key_variables: analysed_sst,sea_ice_fraction
contact: https://climate.esa.int/en/projects/sea-surface-temperature
creation_date: 2024-07-04T15:56:27Z
# Extract SST variable
sst = dsst["analysed_sst"].squeeze() # already in EPSG:4326
proj = ccrs.PlateCarree() # map projection = EPSG:4326
plt.figure(figsize=(10,10))
ax = plt.axes(projection=proj)
sst.plot(
    ax=ax,
    transform=ccrs.PlateCarree(),  # must match the data CRS (EPSG:4326)
    cmap="viridis",
    cbar_kwargs={"label": "Sea Surface Temperature (K)"},  # analysed_sst is stored in kelvin
)
ax.coastlines()
ax.set_title("Sea Surface Temperature")
plt.show()
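GHRSST-style L4 products conventionally store analysed_sst in kelvin (check dsst["analysed_sst"].attrs["units"] to confirm for this file). If you prefer the colourbar in degrees Celsius, subtract 273.15 before plotting; a minimal sketch:

```python
KELVIN_OFFSET = 273.15  # 0 °C in kelvin

def kelvin_to_celsius(k: float) -> float:
    """Convert a temperature from kelvin to degrees Celsius."""
    return k - KELVIN_OFFSET

# A few illustrative values (near freezing seawater, temperate, tropical).
for k in (271.35, 288.15, 303.15):
    print(f"{k} K = {kelvin_to_celsius(k):.2f} °C")
```

With xarray this is simply sst_c = sst - 273.15, which preserves coordinates, attributes handling and lazy loading, after which the colourbar label "(°C)" becomes correct.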
Author(s): Alastair Graham
Date created: 2026-01-25
Date last modified: 2026-03-03
Licence: This notebook is licensed under Creative Commons Attribution-ShareAlike 4.0 International. The code is released using the BSD-2-Clause license.
Copyright © - All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.