
This documentation covers the integration of the GHGSat API into the datalab and the process for importing emissions, non-detections, and site data into the Aershed platform.

Sites

The GHGSat API provides a limited number of the operator's sites around the world. Most of these already exist in Aershed, but some are new. We need to stay up to date with this data for two reasons:

  • We use the GHGSat site identifier to correctly match observations to site non-detections
  • If we do not already have the same infrastructure in Aershed then we need to import it to avoid missing a matched emission event

GHGSat provides a unique id and type for each site. We rename these to ghgsat_id and ghgsat_site_type and then perform an infrastructure update against the Aershed database. GHGSat sites are point locations, and in most cases these points fall inside existing Aershed sites, so we simply add the two fields as extra data.
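The matching step above can be sketched as follows. Only the field names ghgsat_id and ghgsat_site_type come from this document; the function names, sample data, and the pure-Python ray-casting containment test are illustrative (a real pipeline would more likely use a geometry library or a spatial database query).

```python
def rename_ghgsat_fields(site: dict) -> dict:
    """Rename GHGSat's generic id/type fields to the Aershed column names."""
    out = dict(site)
    out["ghgsat_id"] = out.pop("id")
    out["ghgsat_site_type"] = out.pop("type")
    return out


def point_in_polygon(x: float, y: float, polygon: list) -> bool:
    """Ray-casting test: does the GHGSat point fall inside an Aershed polygon?

    `polygon` is a list of (x, y) vertices; edges wrap around to the start.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray from the point
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


# Illustrative usage with made-up data
site = rename_ghgsat_fields({"id": "GS-001", "type": "well_pad", "lon": 1.0, "lat": 1.0})
square = [(0, 0), (2, 0), (2, 2), (0, 2)]
print(site["ghgsat_id"], point_in_polygon(site["lon"], site["lat"], square))  # → GS-001 True
```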

In some cases there are multiple GHGSat sites inside an existing polygon. In that case we concatenate the ghgsat_id values (and the types, if they differ) into semicolon-separated lists.
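A minimal sketch of that merge rule, assuming the fields have already been renamed as described above (the helper name and sample data are illustrative):

```python
def merge_ghgsat_sites(sites: list) -> dict:
    """Collapse several GHGSat point sites inside one Aershed polygon
    into a single pair of semicolon-separated extra-data fields."""
    ids = [s["ghgsat_id"] for s in sites]
    # Keep each distinct type once, preserving first-seen order
    types = list(dict.fromkeys(s["ghgsat_site_type"] for s in sites))
    return {
        "ghgsat_id": ";".join(ids),
        "ghgsat_site_type": ";".join(types),
    }


merged = merge_ghgsat_sites([
    {"ghgsat_id": "GS-001", "ghgsat_site_type": "well_pad"},
    {"ghgsat_id": "GS-002", "ghgsat_site_type": "well_pad"},
    {"ghgsat_id": "GS-003", "ghgsat_site_type": "compressor"},
])
print(merged)
# → {'ghgsat_id': 'GS-001;GS-002;GS-003', 'ghgsat_site_type': 'well_pad;compressor'}
```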

To create an infrastructure update, use the configuration in config/ghgsat/sites.yaml. In this file you can optionally specify a geographic "zone". The process downloads all available sites and performs an infrastructure update against the chosen environment.
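As a hypothetical sketch, config/ghgsat/sites.yaml might look like the following. Only the optional "zone" key is described in this document; the key name shown for it and its example value are guesses at the file's actual schema.

```yaml
# Hypothetical sketch of config/ghgsat/sites.yaml.
# Only the optional geographic "zone" is mentioned in the docs;
# omit it to download all available sites.
zone: north_america
```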

Emissions

Global Emissions

Non-detections

Plume image processing

The plume images that come from the emissions and global emissions products can differ, and even within the global emissions product the images vary, as each provider can use a different format. For this reason, for GHGSat we provide all images to the Aershed platform in a clean, consistent state and handle the processing here in the datalab tools.

To process a sample set of images using the GHGSat image processing script, use the example configuration in config/ghgsat/test-images.yaml. Create a folder in your environment of choice: for example, if your PROCESSING_DIR is the data folder in the root of the datalab tools, make a directory at data/local/plume_images/raw and place the raw plume images there.
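For example, assuming PROCESSING_DIR is the data folder and the environment is named local, the directory can be prepared like this (the copy source path is a placeholder for wherever you saved the downloaded images):

```shell
# Create the raw plume-image directory the script expects
# (path from the docs; "local" is the environment name)
mkdir -p data/local/plume_images/raw

# Copy your downloaded raw plume images into it -- the source
# path below is a placeholder, not a real location:
# cp /path/to/downloaded/plumes/*.png data/local/plume_images/raw/
```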

Then run the program with the following command:

./scripts/run.sh config/ghgsat/test-images.yaml --env local