Best way to use this dataset
I've written training code that uses multiple GPUs. I first tried fetching the images by URL, but I realized this is a bottleneck for multi-GPU training. So I wrote a script that downloads all the images and adds them to a new column, but it estimates it would take 300 hours to download the whole thing. I'd like to know the best way to use this kind of dataset; I don't want to spend almost two weeks downloading the images if possible.
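For context, the sequential URL fetching described above can usually be sped up considerably with concurrent requests. A minimal sketch of that pattern, where `fetch` is a hypothetical callable (in practice something like an HTTP GET returning the image bytes) and the URL column name would depend on the dataset's actual schema:

```python
# Hedged sketch: parallel downloads with a thread pool. The `fetch`
# callable is an assumption -- plug in your own HTTP helper (e.g. one
# built on requests with a timeout and retry logic).
from concurrent.futures import ThreadPoolExecutor


def download_all(urls, fetch, max_workers=32):
    """Apply `fetch` to many URLs concurrently.

    Returns results in the same order as `urls`. I/O-bound work like
    downloading benefits from threads even under the GIL.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, urls))
```

Even so, at this dataset's scale a parallel downloader only shrinks the constant factor; streaming or preprocessed parquets (see the reply below in the thread) may be a better fit.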
Thanks.
@frutiemax Thanks for reaching out! You could try using the parquets made available here: https://huggingface.co/datasets/Spawning/pd12m-full. These have been preprocessed to include the images in the parquets already. That would cut out some of your effort, but you would still need to download the tar files with the images.
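Another option worth trying is 🤗 Datasets' streaming mode, so training can start without downloading everything up front. A minimal sketch, assuming the repo id from the reply above and that the parquets can be streamed directly; the column names and exact repo layout are assumptions, so check the dataset card first:

```python
def batched(iterable, batch_size):
    """Group any iterable of examples into fixed-size batches."""
    batch = []
    for example in iterable:
        batch.append(example)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final partial batch
        yield batch


def stream_pd12m(batch_size=32):
    """Hedged sketch: stream the preprocessed parquets during training.

    Streaming avoids the up-front 300-hour download; with multiple GPUs,
    each rank could read its own shard of the iterable dataset.
    """
    from datasets import load_dataset

    ds = load_dataset("Spawning/pd12m-full", split="train", streaming=True)
    yield from batched(ds, batch_size)
```

Whether streaming keeps the GPUs fed depends on network bandwidth, so it's worth benchmarking a few hundred batches before committing to it for a full run.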