PMW 2A products#
First, let’s import the packages required in this tutorial.
[1]:
import datetime
import gpm
Let’s have a look at the available PMW products:
[2]:
gpm.available_products(product_categories="PMW", product_levels="2A")
[2]:
['2A-AMSR2-GCOMW1',
'2A-AMSR2-GCOMW1-CLIM',
'2A-AMSRE-AQUA',
'2A-AMSRE-AQUA-CLIM',
'2A-AMSUB-NOAA15',
'2A-AMSUB-NOAA15-CLIM',
'2A-AMSUB-NOAA16',
'2A-AMSUB-NOAA16-CLIM',
'2A-AMSUB-NOAA17',
'2A-AMSUB-NOAA17-CLIM',
'2A-ATMS-NOAA20',
'2A-ATMS-NOAA20-CLIM',
'2A-ATMS-NOAA21',
'2A-ATMS-NOAA21-CLIM',
'2A-ATMS-NPP',
'2A-ATMS-NPP-CLIM',
'2A-GMI',
'2A-GMI-CLIM',
'2A-MHS-METOPA',
'2A-MHS-METOPA-CLIM',
'2A-MHS-METOPB',
'2A-MHS-METOPB-CLIM',
'2A-MHS-METOPC',
'2A-MHS-METOPC-CLIM',
'2A-MHS-NOAA18',
'2A-MHS-NOAA18-CLIM',
'2A-MHS-NOAA19',
'2A-MHS-NOAA19-CLIM',
'2A-SAPHIR-MT1',
'2A-SAPHIR-MT1-CLIM',
'2A-SSMI-F08',
'2A-SSMI-F08-CLIM',
'2A-SSMI-F10',
'2A-SSMI-F10-CLIM',
'2A-SSMI-F11',
'2A-SSMI-F11-CLIM',
'2A-SSMI-F13',
'2A-SSMI-F13-CLIM',
'2A-SSMI-F14',
'2A-SSMI-F14-CLIM',
'2A-SSMI-F15',
'2A-SSMI-F15-CLIM',
'2A-SSMIS-F16',
'2A-SSMIS-F16-CLIM',
'2A-SSMIS-F17',
'2A-SSMIS-F17-CLIM',
'2A-SSMIS-F18',
'2A-SSMIS-F18-CLIM',
'2A-SSMIS-F19',
'2A-SSMIS-F19-CLIM',
'2A-TMI',
'2A-TMI-CLIM']
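Since `available_products` returns a plain Python list, you can narrow it down with standard list comprehensions. A minimal sketch, using a hard-coded subset of the list above for illustration:

```python
# Subset of the product list above, hard-coded for illustration
products = ["2A-GMI", "2A-MHS-METOPB", "2A-MHS-NOAA18", "2A-SSMIS-F17", "2A-TMI"]

# Keep only the MHS products
mhs_products = [p for p in products if "MHS" in p]
print(mhs_products)  # ['2A-MHS-METOPB', '2A-MHS-NOAA18']
```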
1. Data Download#
Now let’s download a 2A PMW product over a one-day time period.
[3]:
# Specify the time period you are interested in
start_time = datetime.datetime.strptime("2020-08-01 12:00:00", "%Y-%m-%d %H:%M:%S")
end_time = datetime.datetime.strptime("2020-08-02 12:00:00", "%Y-%m-%d %H:%M:%S")
# Specify the product and product type
product = "2A-MHS-METOPB" # "2A-GMI", "2A-SSMIS-F17", ...
product_type = "RS"
# Specify the version
version = 5 # 7
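Note that `strptime` is only one way to build these timestamps; the `datetime` constructor or `fromisoformat` produce equivalent objects:

```python
import datetime

# Three equivalent constructions of the same start_time
t1 = datetime.datetime.strptime("2020-08-01 12:00:00", "%Y-%m-%d %H:%M:%S")
t2 = datetime.datetime(2020, 8, 1, 12, 0, 0)
t3 = datetime.datetime.fromisoformat("2020-08-01 12:00:00")
assert t1 == t2 == t3
```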
[4]:
# Download the data
gpm.download(
product=product,
product_type=product_type,
version=version,
start_time=start_time,
end_time=end_time,
force_download=False,
verbose=True,
progress_bar=True,
check_integrity=False,
)
/home/ghiggi/Python_Packages/gpm/gpm/io/download.py:524: GPMDownloadWarning: 'No data found on PPS on date 2020-08-01 for product 2A-MHS-METOPB'
warnings.warn(msg, GPMDownloadWarning)
No files are available for download !
/home/ghiggi/Python_Packages/gpm/gpm/io/download.py:524: GPMDownloadWarning: 'No data found on PPS on date 2020-08-02 for product 2A-MHS-METOPB'
warnings.warn(msg, GPMDownloadWarning)
Once the data are downloaded to disk, let’s load the 2A product and look at the dataset structure.
2. Data Loading#
[5]:
# Load the dataset
# - If scan_mode is not specified, one is automatically selected!
ds = gpm.open_dataset(
product=product,
product_type=product_type,
version=version,
start_time=start_time,
end_time=end_time,
)
ds
/home/ghiggi/Python_Packages/gpm/gpm/dataset/conventions.py:42: GPM_Warning: 'The dataset start at 2020-08-01 12:00:01, although the specified start_time is 2020-08-01 12:00:00; and the dataset end_time 2020-08-02 11:59:58 occurs before the specified end_time 2020-08-02 12:00:00.'
warnings.warn(msg, GPM_Warning)
[5]:
<xarray.Dataset> Dimensions: (cross_track: 90, along_track: 32399, nspecies: 5) Coordinates: lon (cross_track, along_track) float32 ... lat (cross_track, along_track) float32 ... time (along_track) datetime64[ns] 2020-08-01T12:00... gpm_id (along_track) <U10 ... gpm_granule_id (along_track) int64 ... gpm_cross_track_id (cross_track) int64 ... gpm_along_track_id (along_track) int64 ... crsWGS84 int64 0 Dimensions without coordinates: cross_track, along_track, nspecies Data variables: (12/27) pixelStatus (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> qualityFlag (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> L1CqualityFlag (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> surfaceTypeIndex (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> totalColumnWaterVaporIndex (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> CAPE (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> ... ... profileScale (cross_track, along_track, nspecies) float32 dask.array<chunksize=(90, 237, 5), meta=np.ndarray> SCorientation (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> SClatitude (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> SClongitude (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> SCaltitude (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> FractionalGranuleNumber (along_track) float64 dask.array<chunksize=(237,), meta=np.ndarray> Attributes: (12/19) FileName: 2A.METOPB.MHS.GPROF2017v2.20200801-S102909-E121030.04... EphemerisFileName: AttitudeFileName: MissingData: 0 DOI: 10.5067/GPM/MHS/METOPB/GPROF/2A/05 DOIauthority: http://dx.doi/org/ ... ... MetadataVersion: cr Satellite: METOPB Sensor: MHS ScanMode: S1 history: Created by ghiggi/gpm_api software on 2023-07-20 10:4... gpm_api_product: 2A-MHS-METOPB
If you want to load another scan_mode, first have a look at the available ones:
[6]:
gpm.available_scan_modes(product=product, version=version)
[6]:
['S1']
and then specify the scan_mode argument in open_dataset:
[7]:
ds = gpm.open_dataset(
product=product,
product_type=product_type,
version=version,
start_time=start_time,
end_time=end_time,
scan_mode="S1",
)
ds
/home/ghiggi/Python_Packages/gpm/gpm/dataset/conventions.py:42: GPM_Warning: 'The dataset start at 2020-08-01 12:00:01, although the specified start_time is 2020-08-01 12:00:00; and the dataset end_time 2020-08-02 11:59:58 occurs before the specified end_time 2020-08-02 12:00:00.'
warnings.warn(msg, GPM_Warning)
[7]:
<xarray.Dataset> Dimensions: (cross_track: 90, along_track: 32399, nspecies: 5) Coordinates: lon (cross_track, along_track) float32 ... lat (cross_track, along_track) float32 ... time (along_track) datetime64[ns] 2020-08-01T12:00... gpm_id (along_track) <U10 ... gpm_granule_id (along_track) int64 ... gpm_cross_track_id (cross_track) int64 ... gpm_along_track_id (along_track) int64 ... crsWGS84 int64 0 Dimensions without coordinates: cross_track, along_track, nspecies Data variables: (12/27) pixelStatus (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> qualityFlag (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> L1CqualityFlag (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> surfaceTypeIndex (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> totalColumnWaterVaporIndex (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> CAPE (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> ... ... profileScale (cross_track, along_track, nspecies) float32 dask.array<chunksize=(90, 237, 5), meta=np.ndarray> SCorientation (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> SClatitude (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> SClongitude (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> SCaltitude (along_track) float32 dask.array<chunksize=(237,), meta=np.ndarray> FractionalGranuleNumber (along_track) float64 dask.array<chunksize=(237,), meta=np.ndarray> Attributes: (12/19) FileName: 2A.METOPB.MHS.GPROF2017v2.20200801-S102909-E121030.04... EphemerisFileName: AttitudeFileName: MissingData: 0 DOI: 10.5067/GPM/MHS/METOPB/GPROF/2A/05 DOIauthority: http://dx.doi/org/ ... ... MetadataVersion: cr Satellite: METOPB Sensor: MHS ScanMode: S1 history: Created by ghiggi/gpm_api software on 2023-07-20 10:4... gpm_api_product: 2A-MHS-METOPB
You can list variables, coordinates and dimensions with the following methods:
[8]:
# Available variables
variables = list(ds.data_vars)
print("Available variables: ", variables)
# Available coordinates
coords = list(ds.coords)
print("Available coordinates: ", coords)
# Available dimensions
dims = list(ds.dims)
print("Available dimensions: ", dims)
Available variables: ['pixelStatus', 'qualityFlag', 'L1CqualityFlag', 'surfaceTypeIndex', 'totalColumnWaterVaporIndex', 'CAPE', 'temp2mIndex', 'sunGlintAngle', 'probabilityOfPrecip', 'spare2', 'surfacePrecipitation', 'frozenPrecipitation', 'convectivePrecipitation', 'rainWaterPath', 'cloudWaterPath', 'iceWaterPath', 'mostLikelyPrecipitation', 'precip1stTertial', 'precip2ndTertial', 'profileTemp2mIndex', 'profileNumber', 'profileScale', 'SCorientation', 'SClatitude', 'SClongitude', 'SCaltitude', 'FractionalGranuleNumber']
Available coordinates: ['lon', 'lat', 'time', 'gpm_id', 'gpm_granule_id', 'gpm_cross_track_id', 'gpm_along_track_id', 'crsWGS84']
Available dimensions: ['cross_track', 'along_track', 'nspecies']
As you can see, every variable has a prefix indicating the group of the original HDF file in which the variable is stored. You can remove the prefix when opening the dataset by specifying prefix_group=False. You can also load only a subset of variables, by specifying the variables argument.
[9]:
# List some variables of interest
variables = [
"surfacePrecipitation",
"rainWaterPath",
"iceWaterPath",
"cloudWaterPath",
]
# Load the dataset
ds = gpm.open_dataset(
product=product,
product_type=product_type,
version=version,
start_time=start_time,
end_time=end_time,
variables=variables,
prefix_group=False,
)
ds
/home/ghiggi/Python_Packages/gpm/gpm/dataset/conventions.py:42: GPM_Warning: 'The dataset start at 2020-08-01 12:00:01, although the specified start_time is 2020-08-01 12:00:00; and the dataset end_time 2020-08-02 11:59:58 occurs before the specified end_time 2020-08-02 12:00:00.'
warnings.warn(msg, GPM_Warning)
[9]:
<xarray.Dataset> Dimensions: (cross_track: 90, along_track: 32399) Coordinates: lon (cross_track, along_track) float32 ... lat (cross_track, along_track) float32 ... time (along_track) datetime64[ns] 2020-08-01T12:00:01 ..... gpm_id (along_track) <U10 ... gpm_granule_id (along_track) int64 ... gpm_cross_track_id (cross_track) int64 ... gpm_along_track_id (along_track) int64 ... crsWGS84 int64 0 Dimensions without coordinates: cross_track, along_track Data variables: surfacePrecipitation (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> rainWaterPath (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> iceWaterPath (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> cloudWaterPath (cross_track, along_track) float32 dask.array<chunksize=(90, 237), meta=np.ndarray> Attributes: (12/19) FileName: 2A.METOPB.MHS.GPROF2017v2.20200801-S102909-E121030.04... EphemerisFileName: AttitudeFileName: MissingData: 0 DOI: 10.5067/GPM/MHS/METOPB/GPROF/2A/05 DOIauthority: http://dx.doi/org/ ... ... MetadataVersion: cr Satellite: METOPB Sensor: MHS ScanMode: S1 history: Created by ghiggi/gpm_api software on 2023-07-20 10:4... gpm_api_product: 2A-MHS-METOPB
To select the DataArray corresponding to a single variable:
[10]:
variable = "surfacePrecipitation"
da = ds[variable]
da
[10]:
<xarray.DataArray 'surfacePrecipitation' (cross_track: 90, along_track: 32399)> dask.array<getitem, shape=(90, 32399), dtype=float32, chunksize=(90, 2281), chunktype=numpy.ndarray> Coordinates: lon (cross_track, along_track) float32 ... lat (cross_track, along_track) float32 ... time (along_track) datetime64[ns] 2020-08-01T12:00:01 ... ... gpm_id (along_track) <U10 ... gpm_granule_id (along_track) int64 ... gpm_cross_track_id (cross_track) int64 ... gpm_along_track_id (along_track) int64 ... crsWGS84 int64 0 Dimensions without coordinates: cross_track, along_track Attributes: units: mm/hr gpm_api_product: 2A-MHS-METOPB grid_mapping: crsWGS84 coordinates: lat lon
To extract the underlying numerical array from the DataArray, use:
[11]:
print("Data type of numerical array: ", type(da.data))
da.data
Data type of numerical array: <class 'dask.array.core.Array'>
[11]:
If the numerical array data type is dask.Array, it means that the data are not yet loaded into RAM. To bring the data into memory, you need to call the compute method, either on the xarray object or on the underlying numerical array.
[12]:
# Option 1
da_opt1 = da.compute()
print("Data type of numerical array: ", type(da_opt1.data))
da_opt1.data
Data type of numerical array: <class 'numpy.ndarray'>
[12]:
array([[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
...,
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan]], dtype=float32)
[13]:
# Option 2
print("Data type of numerical array: ", type(da.data.compute()))
da.data.compute()
Data type of numerical array: <class 'numpy.ndarray'>
[13]:
array([[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
...,
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan]], dtype=float32)
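The lazy-loading behaviour shown above is standard xarray/dask and can be reproduced with a small synthetic array. A minimal sketch, assuming dask is installed (as it is whenever gpm loads data lazily):

```python
import numpy as np
import xarray as xr

# A small synthetic swath-like DataArray, chunked along-track
da = xr.DataArray(
    np.arange(12.0).reshape(3, 4),
    dims=("cross_track", "along_track"),
).chunk({"along_track": 2})

print(type(da.data))  # a dask array: nothing is in memory yet
# Reductions also stay lazy until .compute() is called
mean = float(da.mean().compute())
print(mean)  # 5.5
```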
3. Dataset Manipulations#
Now, let’s first have a look at the methods provided by GPM-API:
[14]:
variable = "surfacePrecipitation"
da = ds[variable]
print("xr.Dataset gpm methods:", dir(ds.gpm))
print("")
print("xr.DataArray gpm methods:", dir(da.gpm))
xr.Dataset gpm methods: ['__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', '_obj', 'crop', 'crop_by_continent', 'crop_by_country', 'end_time', 'extent', 'frequency_variables', 'get_crop_slices_by_continent', 'get_crop_slices_by_country', 'get_crop_slices_by_extent', 'get_slices_contiguous_granules', 'get_slices_contiguous_scans', 'get_slices_regular', 'get_slices_regular_time', 'get_slices_valid_geolocation', 'has_contiguous_scans', 'has_missing_granules', 'has_regular_time', 'has_valid_geolocation', 'is_grid', 'is_orbit', 'is_regular', 'is_spatial_2d', 'is_spatial_3d', 'plot_image', 'plot_map', 'plot_map_mesh', 'plot_map_mesh_centroids', 'plot_swath_lines', 'plot_transect_line', 'pyresample_area', 'spatial_2d_variables', 'spatial_3d_variables', 'start_time', 'subset_by_time', 'subset_by_time_slice', 'title', 'variables']
xr.DataArray gpm methods: ['__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', '_obj', 'crop', 'crop_by_continent', 'crop_by_country', 'end_time', 'extent', 'frequency_variables', 'get_crop_slices_by_continent', 'get_crop_slices_by_country', 'get_crop_slices_by_extent', 'get_slices_contiguous_granules', 'get_slices_contiguous_scans', 'get_slices_regular', 'get_slices_regular_time', 'get_slices_valid_geolocation', 'get_slices_var_between', 'get_slices_var_equals', 'has_contiguous_scans', 'has_missing_granules', 'has_regular_time', 'has_valid_geolocation', 'is_grid', 'is_orbit', 'is_regular', 'is_spatial_2d', 'is_spatial_3d', 'plot_image', 'plot_map', 'plot_map_mesh', 'plot_map_mesh_centroids', 'plot_swath_lines', 'plot_transect_line', 'pyresample_area', 'spatial_2d_variables', 'spatial_3d_variables', 'start_time', 'subset_by_time', 'subset_by_time_slice', 'title', 'variables']
GPM products are either ORBIT-based (i.e. PMW and RADAR) or GRID-based (i.e. IMERG). You can check which type of data you loaded with the is_grid and is_orbit methods.
[15]:
print("Is GPM ORBIT data?: ", ds.gpm.is_orbit)
print("Is GPM GRID data?: ", ds.gpm.is_grid)
Is GPM ORBIT data?: True
Is GPM GRID data?: False
To check whether the loaded GPM PMW product has contiguous scans, you can use:
[16]:
print(ds.gpm.has_contiguous_scans)
print(ds.gpm.is_regular)
True
True
In case there are non-contiguous scans, you can obtain the along-track slices over which the dataset is regular:
[17]:
list_slices = ds.gpm.get_slices_contiguous_scans()
print(list_slices)
[slice(0, 32399, None)]
You can then select a regular portion of the dataset with:
[18]:
slc = list_slices[0]
print(slc)
slice(0, 32399, None)
[19]:
ds_regular = ds.isel(along_track=slc)
ds_regular.gpm.is_regular
[19]:
True
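When such methods return several slices, a common pattern is to keep the longest regular portion before subsetting with ds.isel. This is plain Python (the two slices below are hypothetical, for illustration only):

```python
# Hypothetical output of get_slices_contiguous_scans with two regular chunks
list_slices = [slice(0, 120), slice(130, 5000)]

# Keep the longest slice; you would then subset with ds.isel(along_track=longest)
longest = max(list_slices, key=lambda s: s.stop - s.start)
print(longest)  # slice(130, 5000, None)
```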
To instead check whether the xr.Dataset has just the 2D spatial dimensions, you can use:
[20]:
ds.gpm.is_spatial_2d
[20]:
True
4. Product Visualization#
GPM-API provides two ways of displaying the data:

- The plot_map method plots the data in a geographic projection using the Cartopy pcolormesh method.
- The plot_image method plots the data as an image using the Matplotlib imshow method.
Let’s start by plotting the PMW scan in geographic space:
[21]:
da = ds[variable].isel(along_track=slice(0,8000))
da.gpm.plot_map()
[21]:
<cartopy.mpl.geocollection.GeoQuadMesh at 0x7f0c22889b70>
and now as an image, in “swath” view:
[22]:
da.gpm.plot_image()
[22]:
<matplotlib.image.AxesImage at 0x7f0c221a8610>
To facilitate the creation of a figure title, GPM-API also provides a title method:
[23]:
# Title for a single-timestep dataset
print(ds[variable].gpm.title(add_timestep=True))
print(ds[variable].gpm.title(add_timestep=False))
2A-MHS-METOPB SurfacePrecipitation (2020-08-01 23:59)
2A-MHS-METOPB SurfacePrecipitation
To instead zoom in on a specific region of a plot_map figure, you can use the axes method set_extent. Note that rendering the image with this approach can be quite slow, because plot_map first plots all the data and then restricts the figure extent to the area of interest. For a more efficient approach, see section 5. Dataset Cropping.
[24]:
from gpm.utils.geospatial import get_country_extent
title = ds.gpm.title(add_timestep=False)
extent = get_country_extent("United States")
print("Extent: ", extent)
da = ds[variable].isel(along_track=slice(0, 8000))
p = da.gpm.plot_map()
_ = p.axes.set_extent(extent)
_ = p.axes.set_title(label=title)
Extent: (-171.99111060299998, -66.76465999999999, 18.71619, 71.5577635769)
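The extent tuples follow the (xmin, xmax, ymin, ymax) convention. If you want some margin around a country, a small helper (hypothetical, not part of GPM-API) can pad the extent before passing it to set_extent:

```python
def pad_extent(extent, pad=2.0):
    """Expand an (xmin, xmax, ymin, ymax) extent by `pad` degrees on each side."""
    xmin, xmax, ymin, ymax = extent
    return (xmin - pad, xmax + pad, ymin - pad, ymax + pad)

print(pad_extent((-172, -67, 19, 72), pad=1.0))  # (-173.0, -66.0, 18.0, 73.0)
```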
You can also customize the geographic projection by specifying the desired Cartopy projection. The available projections are listed in the Cartopy documentation.
[25]:
import cartopy.crs as ccrs
import matplotlib.pyplot as plt
from gpm.visualization.plot import plot_cartopy_background
# Define some figure options
dpi = 100
figsize = (12, 10)
# Example of polar Cartopy projection
crs_proj = ccrs.Orthographic(180, -90)
# Subset the data for fast rendering
da = ds[variable].isel(along_track=slice(0, 8000))
# Create the map
fig, ax = plt.subplots(subplot_kw={"projection": crs_proj}, figsize=figsize, dpi=dpi)
plot_cartopy_background(ax)
da.gpm.plot_map(ax=ax)
ax.set_global()
It is possible to further customize these figures in multiple ways, for example by specifying your own colormap:
[26]:
da.gpm.plot_map(cmap="Spectral", vmin=0.1, vmax=100)
[26]:
<cartopy.mpl.geocollection.GeoQuadMesh at 0x7f0c21adf070>
5. Dataset Cropping#
GPM-API provides methods to easily subset orbits spatially by extent, country or continent. Note, however, that an area can be crossed by multiple orbits; in other words, multiple along-track slices of the orbit can intersect the area of interest. The methods get_crop_slices_by_extent, get_crop_slices_by_country and get_crop_slices_by_continent retrieve the orbit portions intersecting the area of interest.
[27]:
# Subset the data for fast rendering
da = ds[variable].isel(along_track=slice(0, 8000))
# Crop by extent
extent = (-172, -67, 19, 72) # (xmin, xmax, ymin, ymax)
list_isel_dict = da.gpm.get_crop_slices_by_extent(extent)
print(list_isel_dict)
for isel_dict in list_isel_dict:
da_subset = da.isel(isel_dict)
slice_title = da_subset.gpm.title(add_timestep=True)
p = da_subset.gpm.plot_map()
p.axes.set_extent(extent)
p.axes.set_title(label=slice_title)
[{'along_track': slice(3770, 4117, None)}, {'along_track': slice(6029, 6398, None)}]
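The along-track slices returned above can be understood with a small self-contained sketch: for each along-track position, check whether any cross-track pixel falls inside the extent, then turn the runs of matching positions into slices. This is a simplified reimplementation for illustration only, not the actual GPM-API code:

```python
import numpy as np

def crop_slices(lon, lat, extent):
    """Along-track slices where any cross-track pixel lies inside (xmin, xmax, ymin, ymax)."""
    xmin, xmax, ymin, ymax = extent
    inside = ((lon >= xmin) & (lon <= xmax) & (lat >= ymin) & (lat <= ymax)).any(axis=0)
    # Locate the edges of each run of True values in the boolean signal
    edges = np.flatnonzero(np.diff(np.concatenate(([0], inside.astype(int), [0]))))
    return [slice(int(start), int(stop)) for start, stop in zip(edges[::2], edges[1::2])]

# Toy 2 x 6 swath: along-track columns 1-2 and 4 fall inside the extent
lon = np.array([[0, 10, 12, 30, 11, 40], [0, 11, 13, 31, 12, 41]], dtype=float)
lat = np.array([[0, 5, 6, 50, 5, 60], [0, 6, 7, 51, 6, 61]], dtype=float)
print(crop_slices(lon, lat, extent=(9, 15, 4, 8)))  # [slice(1, 3, None), slice(4, 5, None)]
```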
[28]:
# Crop by country
# - Option 1
list_isel_dict = da.gpm.get_crop_slices_by_country("United States")
print(list_isel_dict)
# - Option 2
from gpm.utils.geospatial import get_country_extent
extent = get_country_extent("United States")
list_isel_dict = da.gpm.get_crop_slices_by_extent(extent)
print(list_isel_dict)
# - Plot the swath crossing the country
for isel_dict in list_isel_dict:
da_subset = da.isel(isel_dict)
slice_title = da_subset.gpm.title(add_timestep=True)
p = da_subset.gpm.plot_map()
p.axes.set_extent(extent)
p.axes.set_title(label=slice_title)
[{'along_track': slice(3772, 4119, None)}, {'along_track': slice(6030, 6400, None)}]
[{'along_track': slice(3772, 4119, None)}, {'along_track': slice(6030, 6400, None)}]
[29]:
# Crop by continent
# - Option 1
list_isel_dict = da.gpm.get_crop_slices_by_continent("South America")
print(list_isel_dict)
# - Option 2
from gpm.utils.geospatial import get_continent_extent
extent = get_continent_extent("South America")
list_isel_dict = da.gpm.get_crop_slices_by_extent(extent)
print(list_isel_dict)
# - Plot the swath crossing the continent
for isel_dict in list_isel_dict:
da_subset = da.isel(isel_dict)
slice_title = da_subset.gpm.title(add_timestep=True)
p = da_subset.gpm.plot_map()
p.axes.set_extent(extent)
p.axes.set_title(label=slice_title)
[{'along_track': slice(0, 87, None)}, {'along_track': slice(1836, 2340, None)}, {'along_track': slice(4116, 4602, None)}]
[{'along_track': slice(0, 87, None)}, {'along_track': slice(1836, 2340, None)}, {'along_track': slice(4116, 4602, None)}]