LayerNorm.__init__() got an unexpected keyword argument 'bias'
#65 opened by clabluo
```
ert.py", line 201, in __init__
    self.norm = nn.LayerNorm(config.hidden_size, eps=config.norm_eps, bias=config.norm_bias)
TypeError: LayerNorm.__init__() got an unexpected keyword argument 'bias'
```
I have the same problem with the newest release of transformers (4.49.0). Any updates on this?
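The `bias` keyword argument was only added to `torch.nn.LayerNorm` in PyTorch 2.1, so this `TypeError` comes from an older PyTorch installation rather than from the transformers version; upgrading PyTorch to >= 2.1 should resolve it. For code that must keep supporting older PyTorch, one possible workaround is a small compatibility wrapper that checks the constructor's signature (the helper name `make_layer_norm` is mine, not part of any library):

```python
import inspect

import torch.nn as nn


def make_layer_norm(hidden_size, eps=1e-12, bias=True):
    """Build an nn.LayerNorm that works on PyTorch both before and after 2.1.

    The `bias` keyword exists only in torch >= 2.1; on older versions we
    emulate bias=False by clearing the bias parameter after construction.
    """
    if "bias" in inspect.signature(nn.LayerNorm.__init__).parameters:
        # torch >= 2.1: pass the kwarg through directly.
        return nn.LayerNorm(hidden_size, eps=eps, bias=bias)
    # Older torch: construct without the kwarg.
    norm = nn.LayerNorm(hidden_size, eps=eps)
    if not bias:
        # Re-register the bias parameter as None; F.layer_norm accepts bias=None.
        norm.bias = None
    return norm
```

A model's `__init__` could then call `make_layer_norm(config.hidden_size, eps=config.norm_eps, bias=config.norm_bias)` instead of `nn.LayerNorm(...)` directly. This is a sketch under the assumption that the model only needs the standard LayerNorm behavior.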