meta-llama/Llama-3.1-8B - 4b_2n4m_128bs Compression

This is a compressed model produced with deltazip, which stores a fine-tuned model as a compressed delta from its base model. The checkpoint is published as deltazip/meta-llama.Llama-3.1-8B-Instruct.4b_2n4m_128bs.

Links: Paper, Compression Tool, Inference Engine (coming soon).

Compression Configuration

  • Base Model: meta-llama/Llama-3.1-8B
  • Compression Scheme: 4b_2n4m_128bs
  • Dataset: HuggingFaceH4/ultrachat_200k
  • Dataset Split: train_sft
  • Max Sequence Length: 2048
  • Number of Samples: 256
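
Reading the scheme tag as 4-bit quantization (4b), 2:4 structured sparsity (2n4m), and a quantization group size of 128 (128bs) — an interpretation of the tag, not an official expansion — the sketch below shows what such compression does to a weight delta. It uses plain round-to-nearest quantization and magnitude pruning; deltazip itself compresses the delta with an error-compensating solver driven by the calibration samples listed above.

```python
import torch

def prune_2of4(delta: torch.Tensor) -> torch.Tensor:
    # 2:4 structured sparsity: in every group of 4 consecutive weights,
    # keep the 2 largest-magnitude entries and zero out the other 2.
    groups = delta.reshape(-1, 4)
    drop = groups.abs().topk(2, dim=1, largest=False).indices
    return groups.scatter(1, drop, 0.0).reshape(delta.shape)

def quant_dequant_4bit(delta: torch.Tensor, group_size: int = 128) -> torch.Tensor:
    # Symmetric 4-bit round-to-nearest quantization with one fp scale per
    # group of 128 weights, dequantized again to measure reconstruction error.
    groups = delta.reshape(-1, group_size)
    scale = groups.abs().amax(dim=1, keepdim=True).clamp(min=1e-8) / 7.0
    q = (groups / scale).round().clamp(-8, 7)
    return (q * scale).reshape(delta.shape)

torch.manual_seed(0)
w_base = torch.randn(256, 512)                # stand-in for a base-model weight
w_ft = w_base + 0.02 * torch.randn(256, 512)  # fine-tuning moves weights only slightly

delta = w_ft - w_base
delta_hat = quant_dequant_4bit(prune_2of4(delta))
w_rec = w_base + delta_hat                    # what inference reconstructs

print(f"relative delta error: {(delta_hat - delta).norm() / delta.norm():.4f}")
```

Because the fine-tuning delta is small and redundant compared to the full weights, compressing it this aggressively costs far less accuracy than compressing the fine-tuned model itself, which is the premise behind deltazip.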

Sample Output

Prompt:

[{'role': 'user', 'content': 'Who is Alan Turing?'}]

Output:

<|begin_of_text|><|start_header_id|>system<|end_header_id|>

Cutting Knowledge Date: December 2023
Today Date: 26 Jul 2024

<|eot_id|><|start_header_id|>user<|end_header_id|>

Who is Alan Turing?<|eot_id|><|start_header_id|>assistant<|end_header_id|>

Alan Turing (1912-1954) was a British mathematician, computer scientist, logician, and philosopher. He is widely considered one of the most influential figures in the development of modern computer science and artificial intelligence.

Turing was born in London, England, and showed exceptional mathematical abilities from an early age. He studied mathematics at King's College, Cambridge, where he graduated with a degree in mathematics in 1934. He
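
The sample above was generated with the Llama 3.1 chat template and cut off at the generation limit, which is why it ends mid-sentence. A minimal sketch of reproducing it with transformers, assuming the checkpoint has first been decompressed into a standard Hugging Face layout (the path below is a placeholder; loading the compressed delta directly requires the deltazip tooling):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: `local_path` points at the decompressed model (base weights with
# the deltazip delta already merged); the repo itself stores only the delta.
local_path = "path/to/decompressed/model"

tokenizer = AutoTokenizer.from_pretrained(local_path)
model = AutoModelForCausalLM.from_pretrained(local_path)

messages = [{"role": "user", "content": "Who is Alan Turing?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# max_new_tokens bounds the reply; the sample output was truncated the same way.
output = model.generate(input_ids, max_new_tokens=100)
print(tokenizer.decode(output[0]))
```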
