HEST download fails randomly due to consistency check failure
While downloading the data using the instructions mentioned here, I randomly get this error:
OSError: Consistency check failed: file should be of size 493976088 but has size 334638656 (MEND157.h5).
We are sorry for the inconvenience. Please retry with `force_download=True`.
If the issue persists, please let us know by opening an issue on https://github.com/huggingface/huggingface_hub.
Currently I'm re-running the download script whenever it fails, but is there a solution to this error? Does restarting the download corrupt the file for which the error was observed (here, MEND157.h5), or does it skip it and start downloading the next sample?
This can happen if your connection is unstable. You can work around it with:
import time
import datasets

path = './hest_data/'

def load_dataset_until_success():
    # Keep retrying until the download completes without a consistency error
    while True:
        try:
            dataset = datasets.load_dataset(
                'MahmoodLab/hest',
                cache_dir=path,
                patterns='*'
            )
            return dataset
        except Exception as e:
            print(f"Error occurred: {e}. Retrying...")
            time.sleep(1)

# Call the function
dataset = load_dataset_until_success()
Then, to make sure everything is downloaded, you can use:
import pandas as pd
from hest import iter_hest

df = pd.read_csv('./assets/HEST_v1_1_0.csv')
ids = df['id'].values.tolist()

# Iterate over every sample to confirm each one loads correctly
for st in iter_hest('./hest_data', id_list=ids):
    print(st)
Hi, I am getting this error:
BadZipFile Traceback (most recent call last)
Cell In[11], line 8
5 ids_to_query = ['TENX95', 'TENX99'] # list of ids to query
7 list_patterns = [f"*{id}[_.]**" for id in ids_to_query]
----> 8 dataset = datasets.load_dataset(
9 'MahmoodLab/hest',
10 cache_dir=local_dir,
11 patterns=list_patterns
12 )
any help? Thanks
+1. Any ideas how to solve this?
It looks like one of the cell segmentation ZIP files wasn't fully downloaded.
Please try deleting the .zip files in your local_dir folder and restarting the download.
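If it helps, here is a small sketch of that cleanup (`./hest_data` is an assumption; use whatever `cache_dir` you passed to `load_dataset`):

```python
from pathlib import Path

local_dir = Path('./hest_data')  # adjust to the cache_dir you used

# Remove every ZIP archive so the next download run re-fetches them
for zip_path in local_dir.rglob('*.zip'):
    print(f"Deleting {zip_path}")
    zip_path.unlink()
```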
To rule out any environment-related issues, here’s my minimal Python 3.9 environment:
huggingface-hub==0.29.3
datasets==3.4.0
Hey Paul, thanks for getting back to this. I am not sure I understand your reply. There are no zip files in the local dir, and I still get a BadZipFile: File is not a zip file error. Here is the full error message. Please let me know if I am doing anything wrong.
BadZipFile Traceback (most recent call last)
Cell In[9], line 8
5 ids_to_query = ['TENX96', 'TENX99'] # list of ids to query
7 list_patterns = [f"*{id}[_.]**" for id in ids_to_query]
----> 8 dataset = datasets.load_dataset(
9 'MahmoodLab/hest',
10 cache_dir=local_dir,
11 patterns=list_patterns
12 )
File /opt/homebrew/Caskroom/miniconda/base/envs/general/lib/python3.11/site-packages/datasets/load.py:2061, in load_dataset(path, name, data_dir, data_files, split, cache_dir, features, download_config, download_mode, verification_mode, keep_in_memory, save_infos, revision, token, streaming, num_proc, storage_options, trust_remote_code, **config_kwargs)
2056 verification_mode = VerificationMode(
2057 (verification_mode or VerificationMode.BASIC_CHECKS) if not save_infos else VerificationMode.ALL_CHECKS
2058 )
2060 # Create a dataset builder
-> 2061 builder_instance = load_dataset_builder(
2062 path=path,
2063 name=name,
2064 data_dir=data_dir,
2065 data_files=data_files,
2066 cache_dir=cache_dir,
2067 features=features,
2068 download_config=download_config,
2069 download_mode=download_mode,
...
-> 1369 raise BadZipFile("File is not a zip file")
1370 if not endrec:
1371 raise BadZipFile("File is not a zip file")
BadZipFile: File is not a zip file
Can you try to remove the hest_data folder entirely and restart the download, please? One of the files is corrupted.
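For reference, a minimal sketch of wiping the folder (assuming `./hest_data` is where you downloaded):

```python
import shutil
from pathlib import Path

local_dir = Path('./hest_data')  # adjust to your download location

# Delete the whole folder so every file is re-downloaded from scratch
if local_dir.exists():
    shutil.rmtree(local_dir)
```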
I suspect that Hugging Face also caches files elsewhere. Could you try removing all folders related to hest in your HF cache directory: ~/.cache/huggingface on Linux, or C:\Users\{USER}\.cache\huggingface on Windows?
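A sketch of that cleanup (the default cache path is shown; the `HF_HOME` / `HF_HUB_CACHE` environment variables can move it elsewhere, and the `*hest*` pattern is just a guess at the cached folder names):

```python
import shutil
from pathlib import Path

# Default Hugging Face cache; HF_HOME / HF_HUB_CACHE env vars can override this
cache_dir = Path.home() / '.cache' / 'huggingface'

# Collect matches first, then delete, so we don't walk into removed folders
hest_dirs = [p for p in cache_dir.glob('**/*hest*') if p.is_dir()]
for entry in hest_dirs:
    print(f"Removing {entry}")
    shutil.rmtree(entry, ignore_errors=True)
```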
Hey Paul, I deleted the entire cache but I am still experiencing the zip file error (BadZipFile: File is not a zip file). Are you sure this is not caused by a corrupted file in the repo? Can you download the repo on your end?
No worries! Glad you could solve the issue in the end