Real-time collaboration for Jupyter Notebooks, Linux Terminals, LaTeX, VS Code, R IDE, and more,
all in one place. Commercial Alternative to JupyterHub.
Path: blob/main/course/vi/chapter8/section4.ipynb
Debugging the training pipeline
Install the Transformers, Datasets, and Evaluate libraries to run this notebook.
In [ ]:
In [ ]:
'ValueError: You have to specify either input_ids or inputs_embeds'
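This error typically means the `Trainer` received batches with no tokenized inputs at all — for example, because the raw text dataset was passed instead of the tokenized one. A minimal pure-Python sketch of an early sanity check (the column names are assumptions based on a typical tokenized GLUE/MNLI dataset, not code from this notebook):

```python
# Columns the model actually consumes; anything text-only means
# tokenization was skipped somewhere upstream.
REQUIRED_COLUMNS = {"input_ids", "attention_mask", "label"}

def check_dataset_columns(example):
    """Raise early, with a clear message, if tokenization was skipped."""
    missing = REQUIRED_COLUMNS - set(example.keys())
    if missing:
        raise ValueError(
            f"Dataset is missing {sorted(missing)} - did you pass the raw "
            "dataset instead of the tokenized one?"
        )
    return True

# A raw MNLI example (text only) fails the check:
raw = {"premise": "...", "hypothesis": "...", "label": 1, "idx": 0}
try:
    check_dataset_columns(raw)
except ValueError as err:
    print(err)

# A tokenized example passes:
tokenized = {"input_ids": [101, 2054, 102], "attention_mask": [1, 1, 1], "label": 1}
print(check_dataset_columns(tokenized))  # True
```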
In [ ]:
{'hypothesis': 'Product and geography are what make cream skimming work. ',
'idx': 0,
'label': 1,
'premise': 'Conceptually cream skimming has two basic dimensions - product and geography.'}
In [ ]:
'ValueError: expected sequence of length 43 at dim 1 (got 37)'
In [ ]:
'[CLS] conceptually cream skimming has two basic dimensions - product and geography. [SEP] product and geography are what make cream skimming work. [SEP]'
In [ ]:
dict_keys(['attention_mask', 'hypothesis', 'idx', 'input_ids', 'label', 'premise'])
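Note that the tokenized example still carries the original string columns (`hypothesis`, `premise`) plus `idx`; those cannot be turned into tensors, so they are usually dropped before batching. A pure-Python illustration of that filtering step (the example values are stand-ins, not the notebook's exact data):

```python
example = {
    "attention_mask": [1, 1, 1],
    "hypothesis": "Product and geography ...",
    "idx": 0,
    "input_ids": [101, 2054, 102],
    "label": 1,
    "premise": "Conceptually cream skimming ...",
}

# Keep only the columns the model consumes; string columns would make
# tensor conversion fail in the collator.
MODEL_COLUMNS = {"input_ids", "attention_mask", "label"}
clean = {k: v for k, v in example.items() if k in MODEL_COLUMNS}
print(sorted(clean.keys()))  # ['attention_mask', 'input_ids', 'label']
```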
In [ ]:
transformers.models.distilbert.modeling_distilbert.DistilBertForSequenceClassification
In [ ]:
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
In [ ]:
True
In [ ]:
1
In [ ]:
['entailment', 'neutral', 'contradiction']
In [ ]:
~/git/transformers/src/transformers/data/data_collator.py in torch_default_data_collator(features)
105 batch[k] = torch.stack([f[k] for f in features])
106 else:
--> 107 batch[k] = torch.tensor([f[k] for f in features])
108
109 return batch
ValueError: expected sequence of length 45 at dim 1 (got 76)
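The traceback shows the default collator trying to stack sequences of different lengths (45 vs. 76) into one tensor, which only works if every sequence in the batch has been padded to the same length — in Transformers that is the job of a padding collator such as `DataCollatorWithPadding`. A minimal pure-Python sketch of both behaviours:

```python
def naive_collate(features, key="input_ids"):
    """Mimics the default collator: stacking fails on ragged sequences."""
    lengths = {len(f[key]) for f in features}
    if len(lengths) > 1:
        raise ValueError(
            f"cannot stack sequences of different lengths: {sorted(lengths)}"
        )
    return [f[key] for f in features]

def padding_collate(features, key="input_ids", pad_id=0):
    """Pad every sequence to the longest one in the batch."""
    max_len = max(len(f[key]) for f in features)
    return [f[key] + [pad_id] * (max_len - len(f[key])) for f in features]

batch = [{"input_ids": [101, 7592, 102]},
         {"input_ids": [101, 2054, 2003, 2009, 102]}]
try:
    naive_collate(batch)
except ValueError as err:
    print(err)

padded = padding_collate(batch)
print([len(row) for row in padded])  # [5, 5]
```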
In [ ]:
<function transformers.data.data_collator.default_data_collator(features: List[InputDataClass], return_tensors='pt') -> Dict[str, Any]>
In [ ]:
RuntimeError: CUDA error: CUBLAS_STATUS_ALLOC_FAILED when calling `cublasCreate(handle)`
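CUDA errors like this are raised asynchronously, so the reported call site is often not where the real problem is. Two common first steps, sketched below (this assumes a PyTorch setup; the CPU re-run is described in a comment because it depends on your model object):

```python
import os

# 1. Force synchronous CUDA kernel launches so the next traceback points at
#    the operation that actually failed. Must be set before CUDA initializes.
os.environ["CUDA_LAUNCH_BLOCKING"] = "1"

# 2. Alternatively, re-run the failing batch on CPU, where PyTorch produces
#    much clearer error messages, e.g. (hypothetical variable names):
#        model.to("cpu")
#        model(**{k: v.to("cpu") for k, v in batch.items()})
print(os.environ["CUDA_LAUNCH_BLOCKING"])  # 1
```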
In [ ]:
In [ ]:
In [ ]:
In [ ]:
~/.pyenv/versions/3.7.9/envs/base/lib/python3.7/site-packages/torch/nn/functional.py in nll_loss(input, target, weight, size_average, ignore_index, reduce, reduction)
2386 )
2387 if dim == 2:
-> 2388 ret = torch._C._nn.nll_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index)
2389 elif dim == 4:
2390 ret = torch._C._nn.nll_loss2d(input, target, weight, _Reduction.get_enum(reduction), ignore_index)
IndexError: Target 2 is out of bounds.
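`Target 2 is out of bounds` means the loss saw a label id the model has no logit for: the classification head was built with 2 output labels, but MNLI labels go up to 2 (three classes: entailment, neutral, contradiction). A cheap pre-flight check, sketched in pure Python (the helper name is ours, not from any library):

```python
def check_num_labels(labels, num_model_labels):
    """Fail fast if the dataset contains a label id the head cannot emit."""
    max_label = max(labels)
    if max_label >= num_model_labels:
        raise ValueError(
            f"model has {num_model_labels} output labels but the dataset "
            f"contains label id {max_label}; pass num_labels={max_label + 1} "
            "when loading the model"
        )

labels = [0, 1, 2, 1, 0]  # MNLI: entailment / neutral / contradiction
try:
    check_num_labels(labels, num_model_labels=2)  # mimics the failing setup
except ValueError as err:
    print(err)

check_num_labels(labels, num_model_labels=3)  # correct setup passes silently
```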
In [ ]:
2
In [ ]:
In [ ]:
In [ ]:
In [ ]:
In [ ]:
In [ ]:
TypeError: only size-1 arrays can be converted to Python scalars
In [ ]:
TypeError: only size-1 arrays can be converted to Python scalars
In [ ]:
In [ ]:
TypeError: only size-1 arrays can be converted to Python scalars
In [ ]:
((8, 3), (8,))
In [ ]:
{'accuracy': 0.625}
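The shapes `((8, 3), (8,))` show why the earlier `TypeError` occurred: the metric received raw logits of shape `(batch, num_classes)` where it expected one integer prediction per example. Taking the argmax over the class axis fixes it. A pure-Python sketch with made-up logits (not the notebook's actual values):

```python
def argmax(row):
    """Index of the largest value in a list."""
    return max(range(len(row)), key=row.__getitem__)

logits = [
    [0.1, 0.9, 0.0],
    [0.8, 0.1, 0.1],
    [0.2, 0.3, 0.5],
    [0.6, 0.3, 0.1],
]
labels = [1, 0, 1, 0]

# Convert (batch, num_classes) logits to one class id per example,
# then compare against the integer labels.
preds = [argmax(row) for row in logits]
accuracy = sum(p == y for p, y in zip(preds, labels)) / len(labels)
print(preds)     # [1, 0, 2, 0]
print(accuracy)  # 0.75
```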
In [ ]:
In [ ]:
In [ ]:
{'accuracy': 1.0}
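Reaching `accuracy: 1.0` here is the point of the "overfit one batch" sanity check: if a model cannot reach ~100% accuracy when trained repeatedly on a single batch, something in the pipeline is broken. A toy stand-in for the idea, using a perceptron on a linearly separable two-example "batch" (not the notebook's model):

```python
def train_on_one_batch(batch, epochs=10):
    """Repeatedly apply the perceptron update rule to one fixed batch."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in batch:
            pred = 1 if w * x + b > 0 else 0
            w += (y - pred) * x
            b += (y - pred)
    return w, b

batch = [(1.0, 1), (-1.0, 0)]
w, b = train_on_one_batch(batch)

# The model should now classify its own training batch perfectly.
preds = [1 if w * x + b > 0 else 0 for x, _ in batch]
accuracy = sum(p == y for p, (_, y) in zip(preds, batch)) / len(batch)
print(accuracy)  # 1.0
```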