Jupyter notebook: data loading gives error when updating dataset version #182

Open
yoniv1 opened this issue Feb 6, 2025 · 11 comments
Labels: bug (Something isn't working)

yoniv1 commented Feb 6, 2025

What happened?

Hi @malmans2,

When updating the glacier extent dataset version from 6.0 to 7.0, I get an error while loading the data. Could you please take a look?

The dataset:
https://cds.climate.copernicus.eu/datasets/insitu-glaciers-extent?tab=download
The notebook:
https://github.com/ecmwf-projects/c3s2-eqc-quality-assessment/blob/98e722072e4c67c82b48fa34db9e48cf7b3f796b/Satellite_ECVs/Cryosphere/satellite_insitu-glaciers-extent_consistency-assessment_q02.ipynb

Thanks

The original code, which works for version 6.0:

print("Downloading and handling glacier extent data from the CDS, this may take a couple of minutes...")

# Glacier extent data (raster)
request_extent_gridded = (
    "insitu-glaciers-extent",
    {
        "variable":"glacier_area",
        "product_type":"gridded",
        "format":"zip",
        "version":"6_0",
    },
)

# Get glacier extent gridded data
df2 = download.download_and_transform(*request_extent_gridded)
print("Download glacier extent data (raster) completed.")

print("Now for the vector data...")
# Glacier extent data (vector)
request_extent = (
    "insitu-glaciers-extent",
    {
    "variable": ["glacier_area"],
    "product_type": ["vector"],
    "version": "rgi_6_0"
    },
)

df = download.download_and_transform(*request_extent).to_pandas()
gdf = gpd.GeoDataFrame(
    df,
    geometry=gpd.points_from_xy(df["CENLON"], df["CENLAT"]),
    crs="EPSG:4326",
)

print("Downloading and handling glacier extent data completed.")

The error when changing to version 7.0:

2025-02-06 11:24:54,929 INFO [2024-09-26T00:00:00] Watch our [Forum](https://forum.ecmwf.int/) for Announcements, news and other discussed topics.
2025-02-06 11:24:54,930 WARNING [2024-06-16T00:00:00] CDS API syntax is changed and some keys or parameter names may have also changed. To avoid requests failing, please use the "Show API request code" tool on the dataset Download Form to check you are using the correct syntax for your API request.
2025-02-06 11:24:54,984 INFO [2024-09-26T00:00:00] Watch our [Forum](https://forum.ecmwf.int/) for Announcements, news and other discussed topics.
2025-02-06 11:24:54,985 WARNING [2024-06-16T00:00:00] CDS API syntax is changed and some keys or parameter names may have also changed. To avoid requests failing, please use the "Show API request code" tool on the dataset Download Form to check you are using the correct syntax for your API request.
  0%|          | 0/1 [00:00<?, ?it/s]
---------------------------------------------------------------------------
HTTPError                                 Traceback (most recent call last)
Cell In[17], line 15
      4 request_extent_gridded = (
      5     "insitu-glaciers-extent",
      6     {
   (...)
     11     },
     12 )
     14 # Get glacier extent gridded data
---> 15 df2 = download.download_and_transform(*request_extent_gridded)
     16 print("Download glacier extent data (raster) completed.")
     18 print("Now for the vector data...")

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:625, in download_and_transform(collection_id, requests, chunks, split_all, transform_func, transform_func_kwargs, transform_chunks, n_jobs, invalidate_cache, cached_open_mfdataset_kwargs, quiet, **open_mfdataset_kwargs)
    621         cacholote.delete(
    622             func.func, *func.args, request_list=request_list, **func.keywords
    623         )
    624     with _set_env(tqdm_disable=quiet):
--> 625         ds = func(request_list=request_list)
    627 ds.attrs.pop("coordinates", None)  # Previously added to guarantee roundtrip
    628 return ds

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:436, in _download_and_transform_requests(collection_id, request_list, transform_func, transform_func_kwargs, **open_mfdataset_kwargs)
    429 def _download_and_transform_requests(
    430     collection_id: str,
    431     request_list: list[dict[str, Any]],
   (...)
    434     **open_mfdataset_kwargs: Any,
    435 ) -> xr.Dataset:
--> 436     sources = get_sources(collection_id, request_list)
    437     preprocess = functools.partial(
    438         _preprocess,
    439         collection_id=collection_id,
    440         preprocess=open_mfdataset_kwargs.pop("preprocess", None),
    441     )
    443     grib_ext = (".grib", ".grb", ".grb1", ".grb2")

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:349, in get_sources(collection_id, request_list)
    347 disable = os.getenv("TQDM_DISABLE", "False") == "True"
    348 for request in tqdm.tqdm(request_list, disable=disable):
--> 349     sources.update(retrieve(collection_id, request))
    350 return list(sources)

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:339, in retrieve(collection_id, request)
    334 def retrieve(collection_id: str, request: dict[str, Any]) -> list[str]:
    335     with cacholote.config.set(
    336         return_cache_entry=False,
    337         io_delete_original=True,
    338     ):
--> 339         return [file.path for file in _cached_retrieve(collection_id, request)]

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/cacholote/cache.py:102, in cacheable.<locals>.wrapper(*args, **kwargs)
     99                 warnings.warn(str(ex), UserWarning)
    100                 clean._delete_cache_entries(session, cache_entry)
--> 102 result = func(*args, **kwargs)
    103 cache_entry = database.CacheEntry(
    104     key=hexdigest,
    105     expiration=settings.expiration,
    106     tag=settings.tag,
    107 )
    108 try:

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:324, in _cached_retrieve(collection_id, request)
    322 if NOCACHE:
    323     request = request | {"nocache": datetime.datetime.now().isoformat()}
--> 324 ds = earthkit.data.from_source("cds", collection_id, request, prompt=False)
    325 if isinstance(ds, ShapeFileReader) and hasattr(ds._parent, "_path_and_parts"):
    326     # Do not unzip vector data
    327     sources = [ds._parent._path_and_parts]

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/sources/__init__.py:150, in from_source(name, lazily, *args, **kwargs)
    147     return from_source_lazily(name, *args, **kwargs)
    149 prev = None
--> 150 src = get_source(name, *args, **kwargs)
    151 while src is not prev:
    152     prev = src

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/sources/__init__.py:131, in SourceMaker.__call__(self, name, *args, **kwargs)
    128     klass = find_plugin(os.path.dirname(__file__), name, loader)
    129     self.SOURCES[name] = klass
--> 131 source = klass(*args, **kwargs)
    133 if getattr(source, "name", None) is None:
    134     source.name = name

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/core/__init__.py:22, in MetaBase.__call__(cls, *args, **kwargs)
     20 obj = cls.__new__(cls, *args, **kwargs)
     21 args, kwargs = cls.patch(obj, *args, **kwargs)
---> 22 obj.__init__(*args, **kwargs)
     23 return obj

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/sources/cds.py:126, in CdsRetriever.__init__(self, dataset, prompt, *args, **kwargs)
    123 nthreads = min(self.settings("number-of-download-threads"), len(self.requests))
    125 if nthreads < 2:
--> 126     self.path = [self._retrieve(dataset, r) for r in self.requests]
    127 else:
    128     with SoftThreadPool(nthreads=nthreads) as pool:

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/sources/cds.py:126, in <listcomp>(.0)
    123 nthreads = min(self.settings("number-of-download-threads"), len(self.requests))
    125 if nthreads < 2:
--> 126     self.path = [self._retrieve(dataset, r) for r in self.requests]
    127 else:
    128     with SoftThreadPool(nthreads=nthreads) as pool:

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/sources/cds.py:140, in CdsRetriever._retrieve(self, dataset, request)
    137     self.source_filename = cds_result.location.split("/")[-1]
    138     cds_result.download(target=target)
--> 140 return_object = self.cache_file(
    141     retrieve,
    142     (dataset, request),
    143     extension=EXTENSIONS.get(request.get("format"), ".cache"),
    144 )
    145 return return_object

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/sources/__init__.py:68, in Source.cache_file(self, create, args, **kwargs)
     65 if owner is None:
     66     owner = re.sub(r"(?!^)([A-Z]+)", r"-\1", self.__class__.__name__).lower()
---> 68 return cache_file(owner, create, args, **kwargs)

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/core/caching.py:1056, in cache_file(owner, create, args, hash_extra, extension, force, replace)
   1054 with FileLock(lock):
   1055     if not os.path.exists(path):  # Check again, another thread/process may have created the file
-> 1056         owner_data = create(path + ".tmp", args)
   1057         os.rename(path + ".tmp", path)
   1058 try:

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/earthkit/data/sources/cds.py:136, in CdsRetriever._retrieve.<locals>.retrieve(target, args)
    135 def retrieve(target, args):
--> 136     cds_result = self.client().retrieve(args[0], args[1])
    137     self.source_filename = cds_result.location.split("/")[-1]
    138     cds_result.download(target=target)

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/datapi/legacy_api_client.py:169, in LegacyApiClient.retrieve(self, name, request, target)
    167 submitted: Remote | Results
    168 if self.wait_until_complete:
--> 169     submitted = self.client.submit_and_wait_on_results(
    170         collection_id=name,
    171         **request,
    172     )
    173 else:
    174     submitted = self.client.submit(
    175         collection_id=name,
    176         **request,
    177     )

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/datapi/api_client.py:458, in ApiClient.submit_and_wait_on_results(self, collection_id, **request)
    442 def submit_and_wait_on_results(
    443     self, collection_id: str, **request: Any
    444 ) -> datapi.Results:
    445     """Submit a request and wait for the results to be ready.
    446 
    447     Parameters
   (...)
    456     datapi.Results
    457     """
--> 458     return self._retrieve_api.submit(collection_id, **request).make_results()

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/datapi/processing.py:727, in Processing.submit(self, collection_id, **request)
    726 def submit(self, collection_id: str, **request: Any) -> Remote:
--> 727     return self.get_process(collection_id).submit(**request)

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/datapi/processing.py:319, in Process.submit(self, **request)
    307 def submit(self, **request: Any) -> datapi.Remote:
    308     """Submit a request.
    309 
    310     Parameters
   (...)
    317     datapi.Remote
    318     """
--> 319     job = Job.from_request(
    320         "post",
    321         f"{self.url}/execution",
    322         json={"inputs": request},
    323         **self._request_kwargs,
    324     )
    325     return job.make_remote()

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/datapi/processing.py:177, in ApiResponse.from_request(cls, method, url, headers, session, retry_options, request_options, download_options, sleep_max, cleanup, log_callback, log_messages, **kwargs)
    172 response = robust_request(
    173     method, url, headers=headers, **request_options, **kwargs
    174 )
    175 log(logging.DEBUG, f"REPLY {response.text}", callback=log_callback)
--> 177 cads_raise_for_status(response)
    179 self = cls(
    180     response,
    181     headers=headers,
   (...)
    188     log_callback=log_callback,
    189 )
    190 if log_messages:

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/datapi/processing.py:100, in cads_raise_for_status(response)
     93     else:
     94         message = "\n".join(
     95             [
     96                 f"{response.status_code} Client Error: {response.reason} for url: {response.url}",
     97                 error_json_to_message(error_json),
     98             ]
     99         )
--> 100         raise requests.HTTPError(message, response=response)
    101 response.raise_for_status()

HTTPError: 400 Client Error: Bad Request for url: https://cds.climate.copernicus.eu/api/retrieve/v1/processes/insitu-glaciers-extent/execution
invalid request
Request has not produced a valid combination of values, please check your selection.
{'product_type': 'gridded', 'variable': 'glacier_area', 'version': '7_0'}

Anything else we need to know?

No response

yoniv1 added the bug label on Feb 6, 2025

malmans2 commented Feb 6, 2025

Hi @yoniv1,

If you look at the dataset's download form on the CDS, you'll see that a breaking change was introduced in the accepted values.

You should use this request:

request_extent_gridded = (
    "insitu-glaciers-extent",
    {
        "variable": "glacier_area",
        "product_type": "gridded",
        "version": "rgi_7_0",
    },
)
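
The vector request presumably needs the same change to the version value, along these lines (I haven't tested this one):

request_extent = (
    "insitu-glaciers-extent",
    {
        "variable": ["glacier_area"],
        "product_type": ["vector"],
        "version": "rgi_7_0",
    },
)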


yoniv1 commented Feb 6, 2025

Hi @malmans2,

Thanks, it works for the gridded version, but for the vector data I get the following error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[19], line 28
     18 # Glacier extent data (vector)
     19 request_extent = (
     20     "insitu-glaciers-extent",
     21     {
   (...)
     25     },
     26 )
---> 28 df = download.download_and_transform(*request_extent).to_pandas()
     29 gdf = gpd.GeoDataFrame(
     30     df,
     31     geometry=gpd.points_from_xy(df["CENLON"], df["CENLAT"]),
     32     crs="EPSG:4326",
     33 )
     35 print("Downloading and handling glacier extent data (vector) completed.")

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:625, in download_and_transform(collection_id, requests, chunks, split_all, transform_func, transform_func_kwargs, transform_chunks, n_jobs, invalidate_cache, cached_open_mfdataset_kwargs, quiet, **open_mfdataset_kwargs)
    621         cacholote.delete(
    622             func.func, *func.args, request_list=request_list, **func.keywords
    623         )
    624     with _set_env(tqdm_disable=quiet):
--> 625         ds = func(request_list=request_list)
    627 ds.attrs.pop("coordinates", None)  # Previously added to guarantee roundtrip
    628 return ds

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:436, in _download_and_transform_requests(collection_id, request_list, transform_func, transform_func_kwargs, **open_mfdataset_kwargs)
    429 def _download_and_transform_requests(
    430     collection_id: str,
    431     request_list: list[dict[str, Any]],
   (...)
    434     **open_mfdataset_kwargs: Any,
    435 ) -> xr.Dataset:
--> 436     sources = get_sources(collection_id, request_list)
    437     preprocess = functools.partial(
    438         _preprocess,
    439         collection_id=collection_id,
    440         preprocess=open_mfdataset_kwargs.pop("preprocess", None),
    441     )
    443     grib_ext = (".grib", ".grb", ".grb1", ".grb2")

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:349, in get_sources(collection_id, request_list)
    347 disable = os.getenv("TQDM_DISABLE", "False") == "True"
    348 for request in tqdm.tqdm(request_list, disable=disable):
--> 349     sources.update(retrieve(collection_id, request))
    350 return list(sources)

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:339, in retrieve(collection_id, request)
    334 def retrieve(collection_id: str, request: dict[str, Any]) -> list[str]:
    335     with cacholote.config.set(
    336         return_cache_entry=False,
    337         io_delete_original=True,
    338     ):
--> 339         return [file.path for file in _cached_retrieve(collection_id, request)]

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/cacholote/cache.py:102, in cacheable.<locals>.wrapper(*args, **kwargs)
     99                 warnings.warn(str(ex), UserWarning)
    100                 clean._delete_cache_entries(session, cache_entry)
--> 102 result = func(*args, **kwargs)
    103 cache_entry = database.CacheEntry(
    104     key=hexdigest,
    105     expiration=settings.expiration,
    106     tag=settings.tag,
    107 )
    108 try:

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:331, in _cached_retrieve(collection_id, request)
    329     sources = ds.sources if hasattr(ds, "sources") else [ds]
    330 fs = fsspec.filesystem("file")
--> 331 return [fs.open(path) for path in get_paths(sources)]

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:314, in get_paths(sources)
    312 for source in sources:
    313     indexes = getattr(source, "_indexes", [source])
--> 314     paths.extend([index.path for index in indexes])
    315 return paths

File /data/common/miniforge3/envs/wp5/lib/python3.11/site-packages/c3s_eqc_automatic_quality_control/download.py:314, in <listcomp>(.0)
    312 for source in sources:
    313     indexes = getattr(source, "_indexes", [source])
--> 314     paths.extend([index.path for index in indexes])
    315 return paths

AttributeError: 'EmptySource' object has no attribute 'path'


malmans2 commented Feb 6, 2025

Looks like a bug in ECMWF's earthkit-data. This is the relevant issue: ecmwf/earthkit-data#299
We might have to re-open that issue. I'll run a couple of tests and re-open it if needed.


malmans2 commented Feb 6, 2025

Hi @yoniv1,

Unfortunately, something is not right with this dataset. earthkit.data processes v6 correctly, but it raises an error with v7. There are two potential issues:

  • The data produced in v7 is not being distributed correctly.
  • There is a bug in earthkit.data, as it should be able to open this dataset.

I will open an issue in earthkit.data so the developers can look into this problem. I suggest you contact the reviewer to inform them of these findings. v6 and v7 are distributed differently, which may itself be unintended.

Here is an MRE:

import earthkit.data

dataset = "insitu-glaciers-extent"
request = {
    "variable": "glacier_area",
    "product_type": "vector",
}

for version in ("rgi_6_0", "rgi_7_0"):
    ds = earthkit.data.from_source("cds", dataset, request | {"version": version})
    try:
        df = ds.to_pandas()
    except Exception as exc:
        print(f"{version = }: {exc!s}")
        raise
    else:
        print(f"{version = }: OK!")


malmans2 commented Feb 6, 2025

Here is the earthkit.data issue: ecmwf/earthkit-data#607


yoniv1 commented Feb 6, 2025

Thanks, Mattia! Something to follow up on, then.


yoniv1 commented Feb 6, 2025

By the way, when downloading the data manually, it indeed looks like the .zip file contains another .zip file with the .shp data inside, so there are two zip files to unzip.

Edit: ah, but I see this was also the case for version 6.0.


malmans2 commented Feb 6, 2025

Yes, v6 is also a zip within a zip. Something else is different in v7 that is not compatible with earthkit; I'm not sure whether the issue is in earthkit or in the actual data.
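
In the meantime, if you need the v7 vector data before this is fixed, a manual workaround along these lines should work, bypassing earthkit entirely (a rough sketch using the plain cdsapi client; the extraction paths and file layout are illustrative, not tested):

import glob
import zipfile

import cdsapi
import geopandas as gpd

# Download the outer zip directly with the CDS API client
client = cdsapi.Client()
client.retrieve(
    "insitu-glaciers-extent",
    {"variable": "glacier_area", "product_type": "vector", "version": "rgi_7_0"},
    "glaciers_v7.zip",
)

# The archive is a zip of zips: extract the outer one, then any inner ones
with zipfile.ZipFile("glaciers_v7.zip") as outer:
    outer.extractall("glaciers_v7")
for inner_path in glob.glob("glaciers_v7/*.zip"):
    with zipfile.ZipFile(inner_path) as inner:
        inner.extractall("glaciers_v7")

# Read the extracted shapefile(s) with geopandas
shp_paths = glob.glob("glaciers_v7/**/*.shp", recursive=True)
gdf = gpd.read_file(shp_paths[0])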


malmans2 commented Feb 12, 2025

Hi @yoniv1,

The earthkit-data developer explained very well the issues with v7: ecmwf/earthkit-data#607 (comment)

I suggest sharing this information with the reviewer, so they can decide how to proceed.


yoniv1 commented Feb 12, 2025

Hi @malmans2,

Thanks for the follow-up and the info. Will someone from the CDS side be contacted to fix the file-naming issue?

Yoni

@malmans2

Isn't the reviewer of your task a CDS technical officer? Anyway, B-Open only provides technical support. You should get in touch with CNR if you're not sure how to proceed.
