I am trying to use the CelebA dataset in a deep learning project. I have the zipped folder from Kaggle. I wanted to unzip the images and then split them into train, test, and validation sets, but I then found that this is not feasible on my rather underpowered machine.
So, to avoid wasting time, I wanted to load CelebA via the TensorFlow Datasets (tfds) approach instead. Unfortunately, the dataset cannot be accessed, and the following error is raised:
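For context, the manual split I had in mind can be sketched roughly like this (the 80/10/10 ratios and the synthetic filenames are just placeholders for illustration, not my actual setup):

```python
import random

def split_files(filenames, train_frac=0.8, val_frac=0.1, seed=0):
    """Shuffle a list of image filenames deterministically and
    split it into train / validation / test lists."""
    files = list(filenames)
    random.Random(seed).shuffle(files)  # seeded shuffle for reproducibility
    n = len(files)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    train = files[:n_train]
    val = files[n_train:n_train + n_val]
    test = files[n_train + n_val:]  # remainder goes to the test set
    return train, val, test

# Illustrative stand-in for the unpacked CelebA image filenames.
train, val, test = split_files([f"{i:06d}.jpg" for i in range(1, 101)])
print(len(train), len(val), len(test))  # 80 10 10
```

With ~200k CelebA images this also means physically copying files around, which is what my machine could not handle.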
(Code first:)
ds = tfds.load('celeb_a', split='train', download=True)
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-69-d7b9371eb674> in <module>
----> 1 ds = tfds.load('celeb_a', split='train', download=True)

c:\users\aman\appdata\local\programs\python\python38\lib\site-packages\tensorflow_datasets\core\load.py in load(name, split, data_dir, batch_size, shuffle_files, download, as_supervised, decoders, read_config, with_info, builder_kwargs, download_and_prepare_kwargs, as_dataset_kwargs, try_gcs)
    344   if download:
    345     download_and_prepare_kwargs = download_and_prepare_kwargs or {}
--> 346     dbuilder.download_and_prepare(**download_and_prepare_kwargs)
    347
    348   if as_dataset_kwargs is None:

c:\users\aman\appdata\local\programs\python\python38\lib\site-packages\tensorflow_datasets\core\dataset_builder.py in download_and_prepare(self, download_dir, download_config)
    383       self.info.read_from_directory(self._data_dir)
    384     else:
--> 385 …

Tags: image, python-3.x, deep-learning, tensorflow-datasets, tensorflow2.0