pyabsa.networks.losses.LDAMLoss
Module Contents
Classes
- class pyabsa.networks.losses.LDAMLoss.LDAMLoss(cls_num_list, max_m=0.5, weight=None, s=30)[source]
Bases:
torch.nn.Module
References: Cao et al., Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss. NeurIPS 2019.
- Parameters:
cls_num_list (list) – number of training samples per class, used to compute the per-class margins.
max_m (float) – maximum margin applied on the loss. See Equations (12) and (13) in the original paper.
weight (Tensor, optional) – per-class rescaling weights passed to the underlying cross-entropy.
s (float) – the scale of the logits, following the official code.
- Notes: The official code exposes two hyper-parameters (max_m and s), but the authors only provide settings for long-tailed CIFAR; settings for other datasets are not available (https://github.com/kaidic/LDAM-DRW/issues/5).
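As a rough illustration of what this loss computes (a NumPy sketch based on the paper, not pyabsa's actual implementation), the margins scale as n_j^(-1/4) and are subtracted from the true-class logit before a scaled cross-entropy; the function names and the (logits, targets) forward signature below are assumptions for the example:

```python
import numpy as np

def ldam_margins(cls_num_list, max_m=0.5):
    # Per-class margins m_j proportional to n_j^(-1/4) (Eq. 12/13 of the
    # paper), rescaled so the largest margin (the rarest class) equals max_m.
    m = 1.0 / np.power(np.asarray(cls_num_list, dtype=float), 0.25)
    return m * (max_m / m.max())

def ldam_loss(logits, targets, cls_num_list, max_m=0.5, s=30.0):
    # Subtract each sample's class margin from its true-class logit,
    # scale all logits by s, then apply standard cross-entropy.
    m = ldam_margins(cls_num_list, max_m)
    z = np.array(logits, dtype=float)
    rows = np.arange(len(targets))
    z[rows, targets] -= m[targets]
    z *= s
    z -= z.max(axis=1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[rows, targets].mean()
```

With cls_num_list = [1000, 100, 10], the class with only 10 samples receives the full max_m margin, pushing its decision boundary further from its examples, which is the core idea of label-distribution-aware margins.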