
Hest download fails randomly due to consistency check failure

#7
by adiv5 - opened

While downloading the data using the instructions mentioned here, I randomly get this error.

OSError: Consistency check failed: file should be of size 493976088 but has size 334638656 (MEND157.h5).
We are sorry for the inconvenience. Please retry with `force_download=True`.
If the issue persists, please let us know by opening an issue on https://github.com/huggingface/huggingface_hub.

Currently I'm re-running the download script whenever it fails, but is there a solution to this error? Does restarting the download corrupt the data for which the error was observed (here it's MEND157.h5), or does it skip that file and start downloading the next sample?

AI for Pathology Image Analysis Lab @ HMS / BWH org

This can happen if your connection is unstable. You can fix this using:

import time

import datasets

path = './hest_data/'

def load_dataset_until_success():
    # Keep retrying until the full download passes the consistency checks
    while True:
        try:
            dataset = datasets.load_dataset(
                'MahmoodLab/hest',
                cache_dir=path,
                patterns='*'
            )
            return dataset
        except Exception as e:
            print(f"Error occurred: {e}. Retrying...")
            time.sleep(1)

# Call the function
dataset = load_dataset_until_success()

Then, to make sure everything is downloaded, you can use:

import pandas as pd

from hest import iter_hest

# Iterate over every sample listed in the metadata to verify it loads
df = pd.read_csv('./assets/HEST_v1_1_0.csv')
ids = df['id'].values.tolist()
for st in iter_hest('./hest_data', id_list=ids):
    print(st)
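
If a single file repeatedly fails the consistency check, you can also force a re-download of just that file with huggingface_hub. Here is a minimal sketch; the in-repo path below is an assumption, so use the path reported in the error or download logs:

from huggingface_hub import hf_hub_download

# Force re-fetch one corrupted file instead of restarting the whole download.
# The filename/subfolder is an assumption; adjust to the path shown in the logs.
hf_hub_download(
    repo_id='MahmoodLab/hest',
    repo_type='dataset',
    filename='patches/MEND157.h5',
    local_dir='./hest_data',
    force_download=True,
)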

Hi, I am getting this error:

BadZipFile Traceback (most recent call last)
Cell In[11], line 8
5 ids_to_query = ['TENX95', 'TENX99'] # list of ids to query
7 list_patterns = [f"*{id}[_.]**" for id in ids_to_query]
----> 8 dataset = datasets.load_dataset(
9 'MahmoodLab/hest',
10 cache_dir=local_dir,
11 patterns=list_patterns
12 )

Any help? Thanks

+1. Any ideas on how to solve this?

AI for Pathology Image Analysis Lab @ HMS / BWH org
edited Mar 16

It looks like one of the cell segmentation ZIP files wasn't fully downloaded.

Please try deleting the .zip files in your local_dir folder and restart the download.
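
If you're unsure which archive is incomplete, here is a minimal stdlib-only sketch to locate and delete any invalid .zip files under your download directory (the directory name is an assumption, adjust it to your local_dir):

import os
import zipfile

local_dir = './hest_data'  # adjust to your actual download directory

# Walk the download directory and remove any archive that is not a valid zip,
# so the next load_dataset call re-downloads it from scratch.
for root, _, files in os.walk(local_dir):
    for name in files:
        if name.endswith('.zip'):
            path = os.path.join(root, name)
            if not zipfile.is_zipfile(path):
                print(f'Removing corrupted archive: {path}')
                os.remove(path)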

To rule out any environment-related issues, here’s my minimal Python 3.9 environment:

huggingface-hub==0.29.3
datasets==3.4.0

Hey Paul, thanks for getting back to this. I am not sure I understand your reply. There are no zip files in the local dir. I still get a BadZipFile: File is not a zip file error. Here is the full error message. Please let me know if I am doing anything wrong.


BadZipFile Traceback (most recent call last)
Cell In[9], line 8
5 ids_to_query = ['TENX96', 'TENX99'] # list of ids to query
7 list_patterns = [f"*{id}[_.]**" for id in ids_to_query]
----> 8 dataset = datasets.load_dataset(
9 'MahmoodLab/hest',
10 cache_dir=local_dir,
11 patterns=list_patterns
12 )

File /opt/homebrew/Caskroom/miniconda/base/envs/general/lib/python3.11/site-packages/datasets/load.py:2061, in load_dataset(path, name, data_dir, data_files, split, cache_dir, features, download_config, download_mode, verification_mode, keep_in_memory, save_infos, revision, token, streaming, num_proc, storage_options, trust_remote_code, **config_kwargs)
2056 verification_mode = VerificationMode(
2057 (verification_mode or VerificationMode.BASIC_CHECKS) if not save_infos else VerificationMode.ALL_CHECKS
2058 )
2060 # Create a dataset builder
-> 2061 builder_instance = load_dataset_builder(
2062 path=path,
2063 name=name,
2064 data_dir=data_dir,
2065 data_files=data_files,
2066 cache_dir=cache_dir,
2067 features=features,
2068 download_config=download_config,
2069 download_mode=download_mode,
...
-> 1369 raise BadZipFile("File is not a zip file")
1370 if not endrec:
1371 raise BadZipFile("File is not a zip file")

BadZipFile: File is not a zip file

AI for Pathology Image Analysis Lab @ HMS / BWH org

Can you try to remove the hest_data folder entirely and restart the download, please? One of the files is corrupted.

Yes, I did that, but the error was not resolved. At the stage where the error message appears, no local folder has been created yet (e.g., no local hest_data folder). The download breaks after resolving the data files (screenshot attached).


AI for Pathology Image Analysis Lab @ HMS / BWH org
edited Mar 18

I suspect that Hugging Face also caches files elsewhere. Could you try removing all folders related to hest in your HF cache directory:

~/.cache/huggingface on Linux
C:\Users\{USER}\.cache\huggingface on Windows
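
If you prefer to do this programmatically, here is a sketch using huggingface_hub's cache utilities. Note this only covers the hub cache under ~/.cache/huggingface/hub; a separate datasets cache folder, if present, still needs to be removed by hand:

from huggingface_hub import scan_cache_dir

# Scan the default HF hub cache and collect every revision of hest-related repos
cache_info = scan_cache_dir()
hest_hashes = [
    rev.commit_hash
    for repo in cache_info.repos
    if 'hest' in repo.repo_id.lower()
    for rev in repo.revisions
]

if hest_hashes:
    delete_strategy = cache_info.delete_revisions(*hest_hashes)
    print(f'Will free {delete_strategy.expected_freed_size_str}')
    delete_strategy.execute()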

Hey Paul, I deleted the entire cache but I am still experiencing the zip file error BadZipFile: File is not a zip file. Are you sure this is not caused by a corrupted file in the repo? Can you download the repo on your end?

AI for Pathology Image Analysis Lab @ HMS / BWH org

Hi @Tommvie , I'm 99% sure that there are no corrupted files in the repo. I've successfully downloaded the dataset from scratch on macOS, Windows, and Linux today without any issues.

@ravimk23 may I ask you if you managed to download the dataset in the end?

Hey @pauldoucet, you were right. The problem is always in front of the computer. Sorry!

Hi @pauldoucet, yes, I am able to download all the data. Thanks!

AI for Pathology Image Analysis Lab @ HMS / BWH org

No worries! Glad you could solve the issue in the end

pauldoucet changed discussion status to closed