Hi there,

I was following along with your video guide for this project, but with my own dataset. When I started training on my data, I ran into this error:

RuntimeError: stack expects each tensor to be equal size

Our code is essentially identical, and the structure of my data appears to match yours as well. I'm not sure what's causing this issue or how to resolve it.
Full traceback:
```
RuntimeError                              Traceback (most recent call last)
<timed exec> in <module>

<ipython-input-26-8ba1e19dd195> in train_epoch(model, data_loader, loss_fn, optimizer, device, scheduler, n_examples)
      4     correct_predictions = 0
      5
----> 6     for i in data_loader:
      7         input_ids = i['input_ids'].to(device)
      8         attention_mask = i['attention_mask'].to(device)

~\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py in __next__(self)
    361
    362     def __next__(self):
--> 363         data = self._next_data()
    364         self._num_yielded += 1
    365         if self._dataset_kind == _DatasetKind.Iterable and \

~\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py in _next_data(self)
    401     def _next_data(self):
    402         index = self._next_index()  # may raise StopIteration
--> 403         data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
    404         if self._pin_memory:
    405             data = _utils.pin_memory.pin_memory(data)

~\Anaconda3\lib\site-packages\torch\utils\data\_utils\fetch.py in fetch(self, possibly_batched_index)
     45         else:
     46             data = self.dataset[possibly_batched_index]
---> 47         return self.collate_fn(data)

~\Anaconda3\lib\site-packages\torch\utils\data\_utils\collate.py in default_collate(batch)
     72         return batch
     73     elif isinstance(elem, container_abcs.Mapping):
---> 74         return {key: default_collate([d[key] for d in batch]) for key in elem}
     75     elif isinstance(elem, tuple) and hasattr(elem, '_fields'):  # namedtuple
     76         return elem_type(*(default_collate(samples) for samples in zip(*batch)))

~\Anaconda3\lib\site-packages\torch\utils\data\_utils\collate.py in <dictcomp>(.0)
     72         return batch
     73     elif isinstance(elem, container_abcs.Mapping):
---> 74         return {key: default_collate([d[key] for d in batch]) for key in elem}
     75     elif isinstance(elem, tuple) and hasattr(elem, '_fields'):  # namedtuple
     76         return elem_type(*(default_collate(samples) for samples in zip(*batch)))

~\Anaconda3\lib\site-packages\torch\utils\data\_utils\collate.py in default_collate(batch)
     53             storage = elem.storage()._new_shared(numel)
     54             out = elem.new(storage)
---> 55             return torch.stack(batch, 0, out=out)
     56     elif elem_type.__module__ == 'numpy' and elem_type.__name__ != 'str_' \
     57             and elem_type.__name__ != 'string_':

RuntimeError: stack expects each tensor to be equal size, but got [160] at entry 0 and [161] at entry 5
```
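In case it helps narrow this down: the `[160]` vs `[161]` in the message suggests that not every example is tokenized to the same fixed length, so the default collate function cannot stack them into one batch tensor. If the dataset's `__getitem__` calls a tokenizer, one thing to check is whether it pads and truncates every example to the same `max_length`. Another option is a custom `collate_fn` that pads each batch to its longest sequence. Here is a minimal sketch of the latter; the toy dataset and names are illustrative, not taken from the tutorial code:

```python
import torch
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader, Dataset

class ToyTokenized(Dataset):
    """Toy dataset that mimics the failure: examples of unequal
    token length (e.g. 160 vs 161), which default_collate cannot stack."""
    def __init__(self, lengths):
        self.lengths = lengths

    def __len__(self):
        return len(self.lengths)

    def __getitem__(self, idx):
        n = self.lengths[idx]
        return {
            'input_ids': torch.ones(n, dtype=torch.long),
            'attention_mask': torch.ones(n, dtype=torch.long),
        }

def pad_collate(batch):
    # Pad every sequence in the batch to the longest one, then stack,
    # instead of letting default_collate call torch.stack on ragged tensors.
    return {
        key: pad_sequence([d[key] for d in batch],
                          batch_first=True, padding_value=0)
        for key in batch[0]
    }

loader = DataLoader(ToyTokenized([160, 161]), batch_size=2,
                    collate_fn=pad_collate)
batch = next(iter(loader))
print(batch['input_ids'].shape)  # torch.Size([2, 161])
```

With the default `collate_fn`, the same two examples reproduce the `stack expects each tensor to be equal size` error; with per-batch padding they load fine.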