mamba-130m-niah / config.json
{"d_model": 768, "n_layer": 24, "vocab_size": 50277, "ssm_cfg": {}, "rms_norm": true, "residual_in_fp32": true, "fused_add_norm": true, "pad_vocab_size_multiple": 8}