Joh*_*ton · deep-learning, pytorch, dataloader
I have two data loaders, and I want to merge them without redefining the datasets — in my case, train_dataset and val_dataset.
train_loader = DataLoader(train_dataset, batch_size = 512, drop_last=True,shuffle=True)
val_loader = DataLoader(val_dataset, batch_size = 512, drop_last=False)
Desired result:
train_loader = train_loader + val_loader
Data loaders are iterables, so you can implement a function that returns an iterator yielding the contents of the data loaders, one loader after the other. Given multiple iterators itrs, it loops over each iterator in turn, yielding one batch at a time. A possible implementation is quite simple:
def itr_merge(*itrs):
for itr in itrs:
for v in itr:
yield v
Here is a usage example:
>>> dl1 = DataLoader(TensorDataset(torch.zeros(5, 1)), batch_size=2, drop_last=True)
>>> dl2 = DataLoader(TensorDataset(torch.ones(10, 1)), batch_size=2)
>>> for x in itr_merge(dl1, dl2):
...     print(x)
[tensor([[0.], [0.]])]
[tensor([[0.], [0.]])]
[tensor([[1.], [1.]])]
[tensor([[1.], [1.]])]
[tensor([[1.], [1.]])]
[tensor([[1.], [1.]])]
[tensor([[1.], [1.]])]
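Note that itr_merge is essentially what the standard library's itertools.chain already does, so you can get the same behavior without a helper function. A minimal sketch, reusing the two example loaders from above:

```python
# Chaining DataLoaders with itertools.chain — equivalent to itr_merge above.
from itertools import chain

import torch
from torch.utils.data import DataLoader, TensorDataset

# Same loaders as the example: dl1 yields 2 batches (drop_last=True drops the
# odd 5th sample), dl2 yields 5 batches of ones.
dl1 = DataLoader(TensorDataset(torch.zeros(5, 1)), batch_size=2, drop_last=True)
dl2 = DataLoader(TensorDataset(torch.ones(10, 1)), batch_size=2)

# chain() exhausts dl1 first, then dl2 — 7 batches total.
batches = list(chain(dl1, dl2))
```

Like any generator-based merge, the chained object is single-use and has no len(); rebuild it (or call chain again) for each epoch.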
There is also a ConcatDataset available, documented at https://pytorch.org/docs/stable/_modules/torch/utils/data/dataset.html#ConcatDataset. You can concatenate the datasets before passing them to a DataLoader:
import torch
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader
dsa = TensorDataset(torch.rand(100, 3), torch.rand(100, 1) )
dsb = TensorDataset(torch.rand(150, 3), torch.rand(150, 1) )
dsab_cat = ConcatDataset([dsa, dsb])
dsab_cat_loader = DataLoader(dsab_cat)
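Unlike the generator approach, the concatenated dataset is a regular map-style dataset, so the resulting loader has a length and can shuffle across both original datasets. A minimal sketch reusing the dsa/dsb datasets above with the question's loader settings (batch_size=512, shuffle=True), which here are assumptions carried over from the question:

```python
import torch
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

# Same datasets as above: 100 + 150 samples of 3 features each.
dsa = TensorDataset(torch.rand(100, 3), torch.rand(100, 1))
dsb = TensorDataset(torch.rand(150, 3), torch.rand(150, 1))
dsab_cat = ConcatDataset([dsa, dsb])  # behaves like one 250-sample dataset

# Shuffling now mixes samples from both datasets in every batch.
loader = DataLoader(dsab_cat, batch_size=512, shuffle=True)

# batch_size exceeds the dataset size, so everything arrives in one batch.
xb, yb = next(iter(loader))
```

This is usually preferable to merging at the loader level when you want cross-dataset shuffling, since each DataLoader can only shuffle within its own dataset.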