This is a Llama3 model uploaded using the KerasHub library. It can be used with the JAX, TensorFlow, and PyTorch backends and supports the CausalLM task.
Model config:
- name: llama3_backbone
- trainable: True
- vocabulary_size: 128256
- num_layers: 32
- num_query_heads: 32
- hidden_dim: 4096
- intermediate_dim: 14336
- rope_max_wavelength: 10000
- rope_scaling_factor: 1.0
- num_key_value_heads: 8
- layer_norm_epsilon: 1e-06
- dropout: 0
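The head counts above imply a grouped-query attention layout, standard for the Llama 3 architecture: with 32 query heads but only 8 key/value heads, each key/value head is shared by a group of query heads. A small arithmetic sketch, derived only from the config values listed above:

```python
# Values taken from the model config above.
num_query_heads = 32
num_key_value_heads = 8
hidden_dim = 4096

# Grouped-query attention: each key/value head serves a
# group of query heads.
queries_per_kv_head = num_query_heads // num_key_value_heads

# Per-head dimension follows from the hidden size.
head_dim = hidden_dim // num_query_heads

print(queries_per_kv_head)  # 4
print(head_dim)             # 128
```

Sharing key/value heads this way shrinks the KV cache by a factor of 4 relative to full multi-head attention, at little cost in quality.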
This model card was generated automatically and should be completed by the model author. See the Model Cards documentation for more information.