---
base_model: pints-ai/1.5-Pints-2K-v0.1
datasets:
- pints-ai/Expository-Prose-V1
- HuggingFaceH4/ultrachat_200k
- Open-Orca/SlimOrca-Dedup
- meta-math/MetaMathQA
- HuggingFaceH4/deita-10k-v0-sft
- WizardLM/WizardLM_evol_instruct_V2_196k
- togethercomputer/llama-instruct
- LDJnr/Capybara
- HuggingFaceH4/ultrafeedback_binarized
language:
- en
license: mit
pipeline_tag: text-generation
tags:
- mlx
extra_gated_prompt: Though best efforts have been made to ensure, as much as possible,
  that all texts in the training corpora are royalty free, this does not constitute
  a legal guarantee that such is the case. **By using any of the models, corpora or
  part thereof, the user agrees to bear full responsibility to do the necessary due
  diligence to ensure that he / she is in compliance with their local copyright laws.
  Additionally, the user agrees to bear any damages arising as a direct cause (or
  otherwise) of using any artifacts released by the pints research team, as well as
  full responsibility for the consequences of his / her usage (or implementation)
  of any such released artifacts. The user also indemnifies Pints Research Team (and
  any of its members or agents) of any damage, related or unrelated, to the release
  or subsequent usage of any findings, artifacts or code by the team. For the avoidance
  of doubt, any artifacts released by the Pints Research team are done so in accordance
  with the 'fair use' clause of Copyright Law, in hopes that this will aid the research
  community in bringing LLMs to the next frontier.
extra_gated_fields:
  Company: text
  Country: country
  Specific date: date_picker
  I want to use this model for:
    type: select
    options:
    - Research
    - Education
    - label: Other
      value: other
  I agree to use this model in accordance with the afore-mentioned Terms of Use: checkbox
model-index:
- name: 1.5-Pints
  results:
  - task:
      type: text-generation
    dataset:
      name: MTBench
      type: ai2_arc
    metrics:
    - type: LLM-as-a-Judge
      value: 3.73
      name: MTBench
    source:
      url: https://huggingface.co/spaces/lmsys/mt-bench
      name: MTBench
---

# mlx-community/1.5-Pints-2K-v0.1

The model [mlx-community/1.5-Pints-2K-v0.1](https://huggingface.co/mlx-community/1.5-Pints-2K-v0.1) was converted to MLX format from [pints-ai/1.5-Pints-2K-v0.1](https://huggingface.co/pints-ai/1.5-Pints-2K-v0.1) using mlx-lm version **0.19.2**.
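
For reference, a conversion along these lines can be reproduced with the `mlx_lm.convert` command that ships with mlx-lm. The flags below are an assumed sketch, not a record of the exact invocation used for this upload:

```bash
# Convert the original Hugging Face checkpoint to MLX format (unquantized).
# --mlx-path is illustrative; point it at whatever output directory you prefer.
mlx_lm.convert \
    --hf-path pints-ai/1.5-Pints-2K-v0.1 \
    --mlx-path 1.5-Pints-2K-v0.1-mlx
```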

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

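# Download (if needed) and load the model weights and tokenizer from the Hub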
model, tokenizer = load("mlx-community/1.5-Pints-2K-v0.1")

prompt="hello"

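# Wrap the raw prompt in the tokenizer's chat template, if one is defined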
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

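# Run generation; verbose=True streams the output and prints generation stats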
response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
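
The bundled `mlx_lm.generate` CLI offers a quick way to try the model without writing any Python (a minimal sketch; the prompt is just a placeholder):

```bash
mlx_lm.generate --model mlx-community/1.5-Pints-2K-v0.1 --prompt "hello"
```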