Model Card
This is a text generation model based on the unsloth/Meta-Llama-3.1-8B architecture.
3. Creating `config.json`:

If the `config.json` file is missing, create it to match the model's architecture. Here is an example for a Llama-architecture causal language model; confirm every value against the base checkpoint's published configuration before use:
```json
{
  "architectures": [
    "LlamaForCausalLM"
  ],
  "model_type": "llama",
  "hidden_size": 4096,
  "intermediate_size": 14336,
  "num_attention_heads": 32,
  "num_key_value_heads": 8,
  "num_hidden_layers": 32,
  "vocab_size": 128256,
  "max_position_embeddings": 131072,
  "rope_theta": 500000.0,
  "initializer_range": 0.02,
  "rms_norm_eps": 1e-05,
  "bos_token_id": 128000,
  "eos_token_id": 128001,
  "torch_dtype": "bfloat16"
}
```
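If you prefer to generate the file programmatically rather than hand-editing JSON, a minimal sketch (the field values here are assumptions mirroring the example above, not authoritative for this checkpoint):

```python
import json

# Sketch: write a config.json for a Llama-style model and verify it
# round-trips. These values are assumptions taken from the example
# above -- confirm them against the base checkpoint before relying
# on them.
config = {
    "architectures": ["LlamaForCausalLM"],
    "model_type": "llama",
    "hidden_size": 4096,
    "num_attention_heads": 32,
    "num_hidden_layers": 32,
    "vocab_size": 128256,
    "max_position_embeddings": 131072,
    "rms_norm_eps": 1e-05,
    "bos_token_id": 128000,
    "eos_token_id": 128001,
}

with open("config.json", "w", encoding="utf-8") as f:
    json.dump(config, f, indent=2)

# Reload to confirm the file parses and matches what was written.
with open("config.json", encoding="utf-8") as f:
    assert json.load(f) == config
```

When the base model is publicly available, it is usually safer to fetch its real configuration instead, e.g. `AutoConfig.from_pretrained("unsloth/Meta-Llama-3.1-8B").save_pretrained(".")` with the `transformers` library, rather than reconstructing the values by hand.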