Handling Timeseries Data¶
Description & purpose: This notebook demonstrates how to plot a timeseries using data held on the Earth Observation Data Hub (EODH). It plots data for a point in Greater London. It is assumed that the notebook is being run within the Notebook Service on the Hub.
The first thing to do is ensure that the most recent version of pyeodh is installed on your system. It is good practice to run the following cell if you have not installed pyeodh or have not used it in a while. The cell will also install the plotting and spatial libraries required.
# Run this cell if pyeodh is not installed, or needs updating
%pip install --upgrade pyeodh
%pip install hvplot shapely ipywidgets rasterio pyproj
Exploring the Resource Catalogue¶
Now we are ready to investigate the Resource Catalogue. First off, we need to import the packages we'll use.
# Import the Python API Client
import pyeodh
# Import all other packages
import math
import shapely
import rasterio
import pandas as pd
import hvplot.pandas
import matplotlib.pyplot as plt
from pyproj import Transformer
import urllib.request
from PIL import Image
from io import BytesIO
from IPython.display import display
from ipywidgets import interact
Next we need to create an instance of the Client, which is our entrypoint to EODH APIs. From there we can start to search the collections held within the platform.
# Connect to the Hub
# base_url can be changed to optionally specify a different server, but the default is the production server
client = pyeodh.Client(
    base_url="https://eodatahub.org.uk"
).get_catalog_service()
# Print a list of the collections held in the Resource Catalogue (their id and description).
# As the Resource Catalogue fills and development continues, the number of collections and the richness of their descriptions will increase
for index, collect in enumerate(client.get_collections(), start=1):
    print(f"{index} -- {collect.id}")
    print(f"{collect.description}")
1 -- ukcp
Regional climate model projections produced as part of the UK Climate Projection 2018 (UKCP18) project. The data produced by the Met Office Hadley Centre provides information on changes in climate for the UK until 2080, downscaled to a high resolution (12km), helping to inform adaptation to a changing climate. The projections cover Europe and a 100 year period, 1981-2080, for a high emissions scenario, RCP8.5. Each projection provides an example of climate variability in a changing climate, which is consistent across climate variables at different times and spatial locations. This dataset contains 12km data for the United Kingdom, the Isle of Man and the Channel Islands provided on the Ordnance Survey's British National Grid.
2 -- sentinel2_ard
These data have been created by the Department for Environment, Food and Rural Affairs (Defra) and Joint Nature Conservation Committee (JNCC) in order to cost-effectively provide high quality, Analysis Ready Data (ARD) for a wide range of applications. The dataset contains modified Copernicus Sentinel-2 (Level 1C data processed into a surface reflectance product using ARCSI software (Level 2)).
3 -- sentinel1_ard
These data have been created by the Department for Environment, Food and Rural Affairs (Defra) and Joint Nature Conservation Committee (JNCC) in order to cost-effectively provide high quality, Analysis Ready Data (ARD) for a wide range of applications. The dataset contains modified Copernicus Sentinel-1 IW GRDH products processed using the ESA SNAP toolbox.
4 -- sentinel1
This dataset contains level 1 Interferometric Wide swath (IW) Single Look Complex (SLC) C-band Synthetic Aperture Radar (SAR) data from the European Space Agency (ESA) Sentinel 1 series satellites. Sentinel 1 satellites provide continuous all-weather, day and night imaging radar data. The IW mode is the main operational mode. The IW mode supports single (HH or VV) and dual (HH+HV or VV+VH) polarisation.
5 -- qa_radiometric
Collection for radiometric QA checks
6 -- qa_radiometric
Collection for radiometric QA checks
7 -- qa_radiometric
Collection for radiometric QA checks
8 -- qa_documentation
Collection for radiometric QA checks
9 -- qa_documentation
Collection for radiometric QA checks
10 -- qa_documentation
Collection for radiometric QA checks
11 -- land_cover
As part of the ESA Land Cover Climate Change Initiative (CCI) project a new set of Global Land Cover Maps have been produced. These maps are available at 300m spatial resolution for each year between 1992 and 2015. Each pixel value corresponds to the classification of a land cover class defined based on the UN Land Cover Classification System (LCCS). The reliability of the classifications made are documented by the four quality flags (decribed further in the Product User Guide) that accompany these maps. Data are provided in both NetCDF and GeoTiff format.
12 -- eocis-sst-cdrv3
This dataset provides daily-mean sea surface temperatures (SST), presented on global 0.05° latitude-longitude grid, spanning 1980 to present. This is a Level 4 product, with gaps between available daily observations filled by statistical means. The SST CCI Analysis product contains estimates of daily mean SST and sea ice concentration. Each SST value has an associated uncertainty estimate. The dataset has been produced as part of the version 3 Climate Data Record (CDR) produced by the European Space Agency (ESA) Climate Change Initiative Sea Surface Temperature project (ESA SST_cci). The CDR accurately maps the surface temperature of the global oceans over the period 1980 to 2021 using observations from many satellites, with a high degree of independence from in situ measurements. The data provide independently quantified SSTs to a quality suitable for climate research. Data from 2022 onwards are provided as an Interim Climate Data Record (ICDR) and will be updated daily at one month behind present. The Copernicus Climate Change Service (C3S) funded the development of the ICDR extension and production of the ICDR during 2022. From 2023 onwards the production of the ICDR is funded by the UK Earth Observation Climate Information Service (EOCIS) and Marine and Climate Advisory Service (MCAS).
13 -- eocis-soil-moisture-africa
This dataset contains Africa-wide soil moisture and related water balance variables produced within the Earth Observation Climate Information Service (EOCIS) project.
The product is derived using the JULES land surface model, forced with TAMSAT satellite rainfall estimates and other meteorological variables from the NCEP reanalysis, and tuned to NASA SMAP satellite soil moisture observations. Data are provided at the daily scale, from 1983-01-01 to present (latency around 7 days) at 0.25 degrees by 0.25 degrees spatial resolution.
14 -- eocis-lst-s3b-night
This collection contains datasets of level L3C global land surface temperature from the SLSTR sensor on board Sentinel 3B observed daily during nighttime. The collection is available from 2018-11-17.
15 -- eocis-lst-s3b-day
This collection contains datasets of level L3C global land surface temperature from the SLSTR sensor on board Sentinel 3B observed daily during daytime. The collection is available from 2018-11-17.
16 -- eocis-lst-s3a-night
This collection contains datasets of level L3C global land surface temperature from the SLSTR sensor on board Sentinel 3A observed daily during nighttime. The collection is available from 2016-05-01.
17 -- eocis-lst-s3a-day
This collection contains datasets of level L3C global land surface temperature from the SLSTR sensor on board Sentinel 3A observed daily during daytime. The collection is available from 2016-05-01.
18 -- eocis-chuk-land-vegetation-lai
Leaf area index (LAI) are produced from April 2018 to December 2024. These data were produced by the LEAF toolbox (see documentation) from data acquired from the Multispectral Imager (MSI) on Sentinel-2. Gap-filled data are available for 100m spatial resolution at a 15-day temporal resolution. The data must be divided by a scaling factor of 20.
19 -- eocis-chuk-land-vegetation-fapar
The fraction of absorbed photosynthetically active radiation (fAPAR) are produced from April 2018 to December 2024. These data were produced by the LEAF toolbox (see documentation) from data acquired from the Multispectral Imager (MSI) on Sentinel-2. Gap-filled data are available for 100m spatial resolution at a 15-day temporal resolution. The data must be divided by a scaling factor of 200.
20 -- eocis-chuk-geospatial-landcover
Land cover data on the CHUK grid derived from the UKCEH 25m land cover data
21 -- eocis-chuk-geospatial-elevation
Elevation data on the CHUK grid derived from the Copernicus Global 30m DEM data
22 -- eocis-chuk-geospatial-builtarea
Urban and suburban area fractions, derived from the UKCEH 10m land cover data
23 -- eocis-chuk-geospatial
EOCIS Geospatial Information
24 -- eocis-arctic-sea-ice-thickness-monthly
The sea ice products provide a 5x5 km grid of Arctic sea ice thickness (for the whole Arctic region and of 17 sub-regions), delivered as NetCDF files. EOCIS sea ice thickness NetCDF products are generated monthly by the Centre for Polar Observation and Modelling (CPOM) from radar altimetry measurements taken from the ESA CryoSAT-2 satellite during the winter months (Oct-Apr). Sea ice thickness is only reliably measured from satellite radar altimetry during the winter months. During summer, melt ponds can form on the sea ice floes making it difficult for the satellite to differentiate between floes and leads, and hence calculate sea ice freeboard (and subsequently thickness). Measurement during summer months using radar altimetry is an area of active research (Landy et al, 2022) but is not yet operationally processed.
25 -- cordex
The vision of the CORDEX program is to enhance and coordinate the science and application of regional climate downscaling through global partnerships. The program is sponsored by the World Climate Research Program (WCRP) to organise an internationally coordinated framework to produce improved regional climate change projections for all land regions world-wide. The CORDEX-results will serve as input for climate change impact and adaptation studies. The CORDEX regional downscaling domains include: South America, North America, Africa, Europe, East Asia, Central Asia, West Asia, Australasia, Antarctica, Arctic. CORDEX data are also available from the Earth System Grid Federation (ESGF).
26 -- col_d9a70328-35a4-11f0-987a-56fda0cd4f2f
description
27 -- col_b450a80a-361b-11f0-b489-56fda0cd4f2f
description
28 -- col_b2f5128e-3585-11f0-9dd8-56fda0cd4f2f
description
29 -- col_b24c51c0-35ae-11f0-9d79-56fda0cd4f2f
description
30 -- col_a57fd9e4-3617-11f0-b5ae-56fda0cd4f2f
description
31 -- col_9e19775a-35ae-11f0-afca-56fda0cd4f2f
description
32 -- col_98c2dbc0-35ae-11f0-8b31-56fda0cd4f2f
description
33 -- col_94cebf98-35ae-11f0-804c-56fda0cd4f2f
description
34 -- col_8dc36848-35ae-11f0-83cf-56fda0cd4f2f
description
35 -- col_89a7ee64-35ae-11f0-95e4-56fda0cd4f2f
description
36 -- col_7d6a51aa-35ae-11f0-b19f-56fda0cd4f2f
description
37 -- col_7515fa40-35ae-11f0-bfbc-56fda0cd4f2f
description
38 -- col_6f001b68-35ae-11f0-9b1b-56fda0cd4f2f
description
39 -- col_63f132f2-35ae-11f0-a3ab-56fda0cd4f2f
description
40 -- col_5d9e0ca4-35ae-11f0-939e-56fda0cd4f2f
description
41 -- col_5d702cbe-3624-11f0-8158-56fda0cd4f2f
description
42 -- col_5c8c1646-3615-11f0-88b3-56fda0cd4f2f
description
43 -- col_2d9f3edc-35a5-11f0-b30b-56fda0cd4f2f
description
44 -- col_286e4d36-35a5-11f0-bbdd-56fda0cd4f2f
description
45 -- cmip6
The WCRP Coupled Model Intercomparison Project, Phase 6 (CMIP6), is a global climate model intercomparison project, coordinated by PCMDI (Program For Climate Model Diagnosis and Intercomparison) on behalf of the World Climate Research Program (WCRP) providing input for the Intergovernmental Panel on Climate Change (IPCC) 6th Assessment Report (AR6).
The CMIP6 archive is managed via the Earth System Grid Federation, a globally distributed archive, with various portals delivering advanced faceted search capabilities provided from a number of participating organisations. Full details are available from the CMIP6 pages (see linked documentation on this record).
CEDA provides access to key CMIP6 simulations including those generated by the UK CMIP6 collaboration between the Met Office and NERC. Replicas of international data sets are provided to aid local access and use.
46 -- airbus_spot_data
SPOT is designed to efficiently cover huge areas in record time, making it a perfect choice for cartography and monitoring applications. The SPOT satellites build upon a legacy of data that began 30 years ago with the launch of the original SPOT 1 satellite in 1986. This legacy means that today, users have access to a vast archive of high-resolution imagery collected over the past decade, spanning billions of km² and offering unique historical insights. Every day SPOT acquires 3 million km² of imagery with 1.5 m panchromatic and 6 m multispectral bands.
47 -- airbus_sar_data
The TerraSAR-X and TanDEM-X satellites carry high frequency X-band Synthetic Aperture Radar (SAR) sensors in order to acquire datasets ranging from very high-resolution imagery to wide area coverage. Using spaceborne SAR we can provide accurate measurements, unmatched geometric accuracy and provide highly precise information of any point on Earth. Our imagery ranges from very high resolution to wide-area coverage. The satellites' proven data quality and resolution of up to 25 cm continues to be among the very best in the spaceborne commercial radar market. The constellation's capacity empowers both data-hungry services and time-critical missions, enabling a broad array of applications, including maritime monitoring, change detection, and interferometry.
48 -- airbus_pneo_data
Pléiades Neo is Airbus' industry-leading very high-resolution optical constellation of two identical satellites phased at 180° from each other. The constellation provides continuity for the Pléiades mission, with enhanced performance in terms of accuracy, reactivity and frequency. Pléiades Neo provides users with 30 cm native resolution panchromatic imagery, and six 1.2 m multispectral bands, offering the best of very high-resolution optical imagery for an unprecedented level of geospatial services.
49 -- airbus_phr_data
The Pléiades constellation is Airbus' main very high resolution programme, expanding and consolidating on the SPOT legacy with the identical Pléiades 1A and Pléiades 1B launched in 2011 and 2012. These satellites operate as a true constellation, combining a twice-daily revisit capability with an ingenious range of resolutions. Pléiades collects imagery with a 50 cm panchromatic and four 2 m multispectral bands across a 20 km swath, making them an ideal source of data for a wide range of environmental, civil, and military needs.
50 -- SkySatCollect
SkySat, operated by Planet, is a high resolution constellation of 15 satellites, able to image revisit any location on Earth up to 10x daily (a daily collection capacity of 4,000 km2/day). SkySat produces 4-band (blue, green, red, NIR) and panchromatic imagery and is sampled at 50 centimeters per pixel when orthorectified. Skysat Collect provides coregistered, stacked, mosaiced scenes.
51 -- PSScene
PlanetScope, operated by Planet, is a constellation of approximately 130 satellites, able to image the entire land surface of the Earth every day (a daily collection capacity of 200 million km²/day). PlanetScope images are approximately 3 meters per pixel resolution. The PlanetScope satellite constellation consists of multiple launches ("flocks") of Dove satellites. PlanetScope Scenes are either Four band (RGBNIR), or Eight band (RGBNIR, red edge, green I, yellow and coastal blue), depending on which generation of satellite the product was acquired by. On-orbit capacity is constantly improving in capability and quantity, with technology improvements deployed at a rapid pace.
The attributes of a catalogue are mapped to a series of properties. For instance, in the following cell we print the id, title and description.
# The next thing to do is find some open data
# Let's use the CEDA Sentinel 2 ARD
s2ard_cat = client.get_catalog("public/catalogs/ceda-stac-catalogue").get_collection('sentinel2_ard')
print("id: ", s2ard_cat.id)
print("title: ", s2ard_cat.title)
print("description: ", s2ard_cat.description)
id: sentinel2_ard title: Sentinel 2 ARD description: These data have been created by the Department for Environment, Food and Rural Affairs (Defra) and Joint Nature Conservation Committee (JNCC) in order to cost-effectively provide high quality, Analysis Ready Data (ARD) for a wide range of applications. The dataset contains modified Copernicus Sentinel-2 (Level 1C data processed into a surface reflectance product using ARCSI software (Level 2)).
The Hub API endpoints are wrapped in methods inside pyeodh and are structured into classes, following the same logic as the underlying APIs. This means that, for example, to fetch a collection item we first need to get the collection from the Resource Catalogue. The following cell provides a code example to do this.
s2ard = client.get_catalog("public/catalogs/ceda-stac-catalogue").get_collection('sentinel2_ard')
s2ard
<pyeodh.resource_catalog.Collection at 0x7f72a64f3830>
Some API responses are paginated (e.g. collection items), and you can simply iterate over them.
items = s2ard.get_items()
# Warning: without the limit to 10 items this will take a long time for large catalogues such as s2ard
for item in items[:10]:
    print(item.id)
neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn590lonw0063_T29VPF_ORB023_20251130142106_utm29n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn590lonw0055_T30VUL_ORB023_20251130142106_utm30n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn590lonw0038_T30VVL_ORB023_20251130142106_utm30n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn581lonw0064_T29VPE_ORB023_20251130142106_utm29n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn581lonw0055_T30VUK_ORB023_20251130142106_utm30n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn581lonw0038_T30VVK_ORB023_20251130142106_utm30n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn572lonw0064_T29VPD_ORB023_20251130142106_utm29n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn572lonw0054_T30VUJ_ORB023_20251130142106_utm30n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn563lonw0065_T29VPC_ORB023_20251130142106_utm29n_osgb neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn563lonw0053_T30VUH_ORB023_20251130142106_utm30n_osgb
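The lazy, page-by-page iteration that makes the slice above cheap can be pictured in plain Python: a generator requests one page at a time and yields items, so callers never fetch more pages than they consume. The `fetch_page` function below is a hypothetical stand-in for one API request, not part of pyeodh.

```python
from typing import Iterator, List

def fetch_page(page: int, page_size: int) -> List[str]:
    # Hypothetical stand-in for a single API request; here we fake a
    # catalogue of 23 item ids served in pages of `page_size`
    all_ids = [f"item-{n:03d}" for n in range(23)]
    start = page * page_size
    return all_ids[start:start + page_size]

def iter_items(page_size: int = 10) -> Iterator[str]:
    """Yield items one page at a time until a short (final) page is seen."""
    page = 0
    while True:
        batch = fetch_page(page, page_size)
        yield from batch
        if len(batch) < page_size:  # last page reached
            return
        page += 1

# Callers can take a small slice lazily, analogous to items[:10] above
first_ten = [item for _, item in zip(range(10), iter_items())]
print(first_ten[0], "...", first_ten[-1])  # item-000 ... item-009
```

Because the generator stops requesting pages as soon as iteration stops, slicing the first ten items touches only the first page.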
Now we want to access the first few items and see what they are called and how much cloud there is.
for i, item in enumerate(items[:10]):
    print(i, " ", item.id)
    print("Cloud cover: ", item.properties['eo:cloud_cover'])
0 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn590lonw0063_T29VPF_ORB023_20251130142106_utm29n_osgb Cloud cover: 42.67 1 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn590lonw0055_T30VUL_ORB023_20251130142106_utm30n_osgb Cloud cover: 55.48 2 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn590lonw0038_T30VVL_ORB023_20251130142106_utm30n_osgb Cloud cover: 31.54 3 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn581lonw0064_T29VPE_ORB023_20251130142106_utm29n_osgb Cloud cover: 26.12 4 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn581lonw0055_T30VUK_ORB023_20251130142106_utm30n_osgb Cloud cover: 16.69 5 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn581lonw0038_T30VVK_ORB023_20251130142106_utm30n_osgb Cloud cover: 13.24 6 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn572lonw0064_T29VPD_ORB023_20251130142106_utm29n_osgb Cloud cover: 50.0 7 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn572lonw0054_T30VUJ_ORB023_20251130142106_utm30n_osgb Cloud cover: 15.11 8 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn563lonw0065_T29VPC_ORB023_20251130142106_utm29n_osgb Cloud cover: 78.08 9 neodc.sentinel_ard.data.sentinel_2.2025.11.30.S2A_20251130_latn563lonw0053_T30VUH_ORB023_20251130142106_utm30n_osgb Cloud cover: 24.78
To find specific imagery for a given date range we can set up a search with a query. That query needs to have a start and end date.
If we wanted a specific location, we could add in an intersects parameter.
items = client.search(
    collections=['sentinel2_ard'],
    catalog_paths=["public/catalogs/ceda-stac-catalogue"],
    query=[
        'start_datetime>=2024-04-01',
        'end_datetime<=2024-09-30',
    ],
)
# We can then count the number of items returned by the search
total_items = sum(1 for _ in items)
print(f"Total items: {total_items}")
Total items: 3412
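Once a search has returned items, a common next step for timeseries work is to bucket acquisitions by calendar month. A minimal sketch using only the standard library, with hypothetical ISO datetime strings standing in for each item's `start_datetime` property:

```python
from collections import Counter
from datetime import datetime

# Hypothetical acquisition datetimes standing in for item properties
acquisitions = [
    "2024-04-03T11:06:21Z",
    "2024-04-18T11:06:19Z",
    "2024-05-08T11:06:21Z",
    "2024-09-19T14:40:23Z",
]

# Count scenes per calendar month
per_month = Counter(
    datetime.fromisoformat(ts.replace("Z", "+00:00")).strftime("%Y-%m")
    for ts in acquisitions
)
for month, n in sorted(per_month.items()):
    print(month, n)
# 2024-04 2
# 2024-05 1
# 2024-09 1
```

The same grouping applied to real search results gives a quick view of how evenly the scenes cover the requested date range.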
For this example we will use a location that relates to the WWT London Wetlands Centre: https://www.wwt.org.uk/wetland-centres/london
# Define the WWT site
lon, lat = (-0.2346089199955088, 51.478446832015834)
wwt_pnt = shapely.Point(lon, lat)
items = client.search(
    collections=['sentinel2_ard'],
    catalog_paths=["public/catalogs/ceda-stac-catalogue"],
    intersects=wwt_pnt,
    query=[
        'start_datetime>=2024-04-01',
        'end_datetime<=2024-09-30',
        'eo:cloud_cover<=30',
    ],
)
# We can then count the number of items returned by the search
total_items = sum(1 for _ in items)
print(f"Total items: {total_items}")
Total items: 13
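The `eo:cloud_cover<=30` term in the query behaves like a simple threshold applied to each item's properties on the server. A minimal client-side sketch of the same filter, using hypothetical property dictionaries:

```python
# Hypothetical items, each with STAC-style properties
items_props = [
    {"id": "scene-a", "eo:cloud_cover": 42.67},
    {"id": "scene-b", "eo:cloud_cover": 16.69},
    {"id": "scene-c", "eo:cloud_cover": 78.08},
    {"id": "scene-d", "eo:cloud_cover": 24.78},
]

# Keep scenes at or below a 30% cloud-cover threshold,
# mirroring the 'eo:cloud_cover<=30' query term
clear = [p["id"] for p in items_props if p["eo:cloud_cover"] <= 30]
print(clear)  # ['scene-b', 'scene-d']
```

Pushing the filter into the search query is preferable in practice, as it avoids transferring metadata for scenes that would be discarded anyway.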
A useful thing to do now is find the asset information for one of those items. We shall use the first item in the list.
for item in items[:1]:  # Process only the first item
    print(f"Item ID: {item.id}")
    print("Assets:")
    if not item.assets:
        print("  No assets available.")
    else:
        for asset_key, asset in item.assets.items():
            print(f"  - {asset_key}: {asset.to_dict()}")  # Convert asset to dict for readable output
    print("-" * 40)  # Separator for readability
Item ID: neodc.sentinel_ard.data.sentinel_2.2024.09.19.S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb
Assets:
- cloud: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_clouds.tif', 'type': 'image/tiff; application=geotiff', 'size': 2116177, 'location': 'on_disk', 'roles': ['data']}
----------------------------------------
- cloud_probability: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_clouds_prob.tif', 'type': 'image/tiff; application=geotiff', 'size': 31613463, 'location': 'on_disk', 'roles': ['data']}
----------------------------------------
- cog: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_vmsk_sharp_rad_srefdem_stdsref.tif', 'type': 'image/tiff; application=geotiff', 'size': 571656423, 'location': 'on_disk', 'eo:bands': [{'eo:central_wavelength': 496.6, 'eo:common_name': 'blue', 'description': 'Blue', 'eo: full_width_half_max': 0.07, 'name': 'B02'}, {'eo:central_wavelength': 560, 'eo:common_name': 'green', 'description': 'Green', 'eo: full_width_half_max': 0.04, 'name': 'B03'}, {'eo:central_wavelength': 664.5, 'eo:common_name': 'red', 'description': 'Red', 'eo: full_width_half_max': 0.03, 'name': 'B04'}, {'eo:central_wavelength': 703.9, 'eo:common_name': 'rededge', 'description': 'Visible and Near Infrared', 'eo: full_width_half_max': 0.02, 'name': 'B05'}, {'eo:central_wavelength': 740.2, 'eo:common_name': 'rededge', 'description': 'Visible and Near Infrared', 'eo: full_width_half_max': 0.02, 'name': 'B06'}, {'eo:central_wavelength': 782.5, 'eo:common_name': 'rededge', 'description': 'Visible and Near Infrared', 'eo: full_width_half_max': 0.02, 'name': 'B07'}, {'eo:central_wavelength': 835.1, 'eo:common_name': 'nir', 'description': 'Visible and Near Infrared', 'eo: full_width_half_max': 0.11, 'name': 'B08'}, {'eo:central_wavelength': 864.8, 'eo:common_name': 'nir08', 'description': 'Visible and Near Infrared', 'eo: full_width_half_max': 0.02, 'name': 'B08a'}, {'eo:central_wavelength': 1613.7, 'eo:common_name': 'swir16', 'description': 'Short Wave Infrared', 'eo: full_width_half_max': 0.09, 'name': 'B11'}, {'eo:central_wavelength': 2202.4, 'eo:common_name': 'swir22', 'description': 'Short Wave Infrared', 'eo: full_width_half_max': 0.18, 'name': 'B12'}], 'roles': ['data']}
----------------------------------------
- metadata: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_vmsk_sharp_rad_srefdem_stdsref_meta.xml', 'type': 'application/xml', 'size': 18453, 'location': 'on_disk', 'roles': ['metadata']}
----------------------------------------
- saturated_pixels: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_sat.tif', 'type': 'image/tiff; application=geotiff', 'size': 1774631, 'location': 'on_disk', 'roles': ['data']}
----------------------------------------
- thumbnail: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_vmsk_sharp_rad_srefdem_stdsref_thumbnail.jpg', 'type': 'image/jpeg', 'size': 40710, 'location': 'on_disk', 'roles': ['thumbnail']}
----------------------------------------
- topographic_shadow: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_toposhad.tif', 'type': 'image/tiff; application=geotiff', 'size': 215956, 'location': 'on_disk', 'roles': ['data']}
----------------------------------------
- valid_pixels: {'href': 'https://dap.ceda.ac.uk/neodc/sentinel_ard/data/sentinel_2/2024/09/19/S2A_20240919_latn518lonw0008_T30UXC_ORB094_20240919144023_utm30n_osgb_valid.tif', 'type': 'image/tiff; application=geotiff', 'size': 319023, 'location': 'on_disk', 'roles': ['data']}
----------------------------------------
We can see that this returns a lot of information. We can extract the thumbnail URL and use that to plot the image overview.
tn_url = None
for item in items[:2]:  # Iterate the first two items; the thumbnail kept is the second item's
    if not item.assets:
        print("  No assets available.")
    else:
        for asset_key, asset in item.assets.items():
            if asset_key == "thumbnail":
                tn_url = asset.href  # Directly access the href attribute

# Open the remote URL, read the data and display the thumbnail
with urllib.request.urlopen(tn_url) as url:
    img = Image.open(BytesIO(url.read()))
    display(img)
# Code to show all thumbnails
thumbnail_urls = []
for item in items:
    if not item.assets:
        continue
    if "thumbnail" in item.assets:
        thumbnail_urls.append(item.assets["thumbnail"].href)  # Collect links to the thumbnails

print(f"Number of assets with thumbnails: {len(thumbnail_urls)}")

# Ask for start and end thumbnails
start = int(input("Start image: "))
end = int(input("End image: "))

# Clamp the range
start = max(0, start)
end = min(len(thumbnail_urls), end)
selected_urls = thumbnail_urls[start:end]
print(f"Showing thumbnails {start} to {end-1}")

images = []
for url in selected_urls:
    with urllib.request.urlopen(url) as u:
        img = Image.open(BytesIO(u.read()))  # Fetch each thumbnail
        images.append(img)

# Determine grid size (fixed number of columns)
cols = 5
rows = math.ceil(len(images) / cols)
Number of assets with thumbnails: 13
Start image: 1 End image: 6
Showing thumbnails 1 to 5
# Plot thumbnail matrix
fig = plt.figure(figsize=(15, 3 * rows))
for i, img in enumerate(images):
    ax = fig.add_subplot(rows, cols, i + 1)
    ax.imshow(img)
    ax.axis("off")
plt.tight_layout()
plt.show()
Timeseries¶
# EPSG:4326 → EPSG:27700
transformer = Transformer.from_crs("EPSG:4326", "EPSG:27700", always_xy=True)
xc, yc = transformer.transform(lon, lat)

dates, ndwi_vals = [], []

# Extract NDWI for each scene
for item in items:
    asset = item.assets["cog"]
    href = asset.href

    # Extract band ordering from the eo:bands list
    bands = asset.extra_fields["eo:bands"]
    band_index = {b["name"]: i + 1 for i, b in enumerate(bands)}

    # We need B03 (green) and B08 (NIR)
    g_idx = band_index["B03"]
    n_idx = band_index["B08"]

    # Sample the COG at the point of interest
    try:
        with rasterio.open(href) as src:
            green = list(src.sample([(xc, yc)], indexes=g_idx))[0][0]
            nir = list(src.sample([(xc, yc)], indexes=n_idx))[0][0]
    except Exception:
        # Skip scenes whose COG cannot be opened or sampled
        continue

    # Skip invalid values
    if green <= 0 or nir <= 0:
        continue

    # NDWI = (green - nir) / (green + nir); a constant reflectance
    # scaling factor cancels in the ratio, so no rescaling is needed
    ndwi = (green - nir) / (green + nir)

    # Store result
    dates.append(item.datetime)
    ndwi_vals.append(float(ndwi))
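The code above computes NDWI as a normalised difference of the green and NIR bands. Because NDWI is a ratio, dividing both bands by a constant reflectance scaling factor leaves the result unchanged, which can be checked directly (the digital numbers below are hypothetical values for illustration):

```python
def ndwi(green: float, nir: float) -> float:
    """Normalised Difference Water Index: (green - nir) / (green + nir)."""
    return (green - nir) / (green + nir)

# Example digital numbers (hypothetical, for illustration only)
green_dn, nir_dn = 800.0, 2400.0

raw = ndwi(green_dn, nir_dn)
scaled = ndwi(green_dn / 100, nir_dn / 100)  # the rescaling cancels out
print(round(raw, 6), round(scaled, 6))  # -0.5 -0.5
```

This is why applying (or omitting) a linear scale factor to both bands before the division makes no difference to the resulting timeseries.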
Finally we plot the data using hvplot which allows the user to interact with the chart.
df = pd.DataFrame({
    "date": dates,
    "ndwi": ndwi_vals
})

# hvplot's bokeh line glyph does not accept a marker option,
# so we overlay a scatter to mark the individual observations
df.hvplot.line(
    x="date",
    y="ndwi",
    title="NDWI Timeseries at WWT London",
    ylabel="NDWI",
    xlabel="Date",
    grid=True,
    width=700,
    height=350,
) * df.hvplot.scatter(x="date", y="ndwi")
Author(s): Alastair Graham, Dusan Figala
Date created: 2025-12-09
Date last modified: 2025-12-10
Licence: This notebook is licensed under Creative Commons Attribution-ShareAlike 4.0 International. The code is released using the BSD-2-Clause license.
Copyright © - All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.