SELU

SELU (Scaled Exponential Linear Unit), noted June 16, 2023 16:54
When using kaiming_normal or kaiming_normal_ for initialization, pass nonlinearity='linear' instead of nonlinearity='selu' in order to obtain Self-Normalizing Neural Networks. See torch.nn.init.calculate_gain() for more information.
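To make the activation itself concrete, here is a minimal pure-Python sketch of the SELU function. The two constants are the fixed values from the original Self-Normalizing Neural Networks paper (Klambauer et al., 2017), which PyTorch's torch.nn.SELU also uses; this is an illustrative re-implementation, not PyTorch's code.

```python
import math

# Fixed constants from Klambauer et al. (2017); chosen so that
# activations converge toward zero mean and unit variance.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x: float) -> float:
    """SELU(x) = scale * x              if x > 0
                 scale * alpha*(e^x - 1) if x <= 0"""
    if x > 0:
        return SCALE * x
    return SCALE * ALPHA * (math.exp(x) - 1.0)

# Positive inputs are simply scaled linearly:
print(selu(2.0))   # 2 * SCALE
# Negative inputs saturate toward -SCALE * ALPHA:
print(selu(-10.0))
print(selu(0.0))   # exactly 0
```

Because the positive branch is linear with a fixed scale, a Kaiming initializer with the 'linear' gain (1.0) preserves the variance assumptions the self-normalizing property relies on, which is why the docs recommend nonlinearity='linear' over 'selu'.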