W0227 13:02:02.246692 380692 warnings.py:109] D:\anaconda3\envs\deblur-gan\lib\site-packages\albumentations\imgaug\transforms.py:222: FutureWarning: IAASharpen is deprecated. Please use Sharpen instead
warnings.warn("IAASharpen is deprecated. Please use Sharpen instead", FutureWarning)
W0227 13:02:02.247689 380692 warnings.py:109] D:\anaconda3\envs\deblur-gan\lib\site-packages\albumentations\imgaug\transforms.py:165: FutureWarning: This augmentation is deprecated. Please use Emboss instead
warnings.warn("This augmentation is deprecated. Please use Emboss instead", FutureWarning)
W0227 13:02:02.248687 380692 warnings.py:109] D:\anaconda3\envs\deblur-gan\lib\site-packages\albumentations\augmentations\transforms.py:688: FutureWarning: This class has been deprecated. Please use CoarseDropout
warnings.warn(
W0227 13:02:02.249684 380692 warnings.py:109] D:\anaconda3\envs\deblur-gan\lib\site-packages\albumentations\augmentations\transforms.py:913: FutureWarning: This class has been deprecated. Please use ImageCompression
warnings.warn(
I0227 13:02:02.259657 380692 dataset.py:28] Subsampling buckets from 0 to 100, total buckets number is 100
I0227 13:02:02.261652 380692 dataset.py:71] Dataset has been created with 1243 samples
I0227 13:02:02.280601 380692 dataset.py:28] Subsampling buckets from 0 to 100, total buckets number is 100
I0227 13:02:02.283593 380692 dataset.py:71] Dataset has been created with 1243 samples
W0227 13:02:02.285588 380692 warnings.py:109] D:\anaconda3\envs\deblur-gan\lib\site-packages\torchvision\models\_utils.py:208: UserWarning: The parameter 'pretrained' is deprecated since 0.13 and will be removed in 0.15, please use 'weights' instead.
warnings.warn(
W0227 13:02:02.286585 380692 warnings.py:109] D:\anaconda3\envs\deblur-gan\lib\site-packages\torchvision\models\_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and will be removed in 0.15. The current behavior is equivalent to passing `weights=VGG19_Weights.IMAGENET1K_V1`. You can also use `weights=VGG19_Weights.DEFAULT` to get the most up-to-date weights.
warnings.warn(msg)
I0227 13:02:03.317861 380692 helpers.py:183] Loading pretrained weights from url (https://github.com/huawei-noah/CV-backbones/releases/download/ghostnet_pth/ghostnet_1x.pth)
Epoch 0, lr 0.0001:   0%|          | 0/2000 [00:01<?, ?it/s]

(spawned DataLoader worker process)
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "D:\anaconda3\envs\deblur-gan\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "D:\anaconda3\envs\deblur-gan\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input

(main process)
Traceback (most recent call last):
  File "d:\deblur\Ghost-DeblurGAN\train.py", line 187, in <module>
    trainer.train()
  File "d:\deblur\Ghost-DeblurGAN\train.py", line 40, in train
    self._run_epoch(epoch)
  File "d:\deblur\Ghost-DeblurGAN\train.py", line 65, in _run_epoch
    for data in tq:
  File "D:\anaconda3\envs\deblur-gan\lib\site-packages\tqdm\std.py", line 1185, in __iter__
    for obj in iterable:
  File "D:\anaconda3\envs\deblur-gan\lib\site-packages\torch\utils\data\dataloader.py", line 444, in __iter__
    return self._get_iterator()
  File "D:\anaconda3\envs\deblur-gan\lib\site-packages\torch\utils\data\dataloader.py", line 390, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "D:\anaconda3\envs\deblur-gan\lib\site-packages\torch\utils\data\dataloader.py", line 1077, in __init__
    w.start()
  File "D:\anaconda3\envs\deblur-gan\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "D:\anaconda3\envs\deblur-gan\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "D:\anaconda3\envs\deblur-gan\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "D:\anaconda3\envs\deblur-gan\lib\multiprocessing\popen_spawn_win32.py", line 93, in __init__
    reduction.dump(process_obj, to_child)
  File "D:\anaconda3\envs\deblur-gan\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'get_corrupt_function.<locals>.process'
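The `AttributeError` is the root cause; the worker's `EOFError` is just fallout. On Windows, DataLoader workers are started with `spawn`, so the dataset (including its corruption callable) must be picklable, and a closure defined inside `get_corrupt_function` is not. A minimal sketch of the usual fix, replacing the closure with a module-level callable class (`CorruptFunction` and its config dict are illustrative assumptions, not the repo's actual names):

```python
import pickle

class CorruptFunction:
    """Picklable replacement for a 'def process(...)' closure:
    everything the closure captured becomes instance state."""

    def __init__(self, config):
        self.config = config  # parameters the old closure closed over

    def __call__(self, image):
        # placeholder for the real corruption logic driven by self.config
        return image

corrupt = CorruptFunction({"name": "gauss_blur"})
# Unlike a local closure, this instance survives the pickling round-trip
# that spawn-based DataLoader workers require:
restored = pickle.loads(pickle.dumps(corrupt))
```

A quicker (slower-training) workaround is setting the DataLoader's `num_workers` to 0, which keeps everything in the main process and avoids pickling entirely.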