# This model generates solutions to [LeetCode](https://leetcode.com) problems

## The training data: [khaimaitien/leetcode_problem_solution](https://huggingface.co/datasets/khaimaitien/leetcode_problem_solution)

## The base model: [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf)

You can find more information at: https://github.com/khaimt/coding_challenge_solver

The prompt template is:
```python
# `input` holds the LeetCode problem statement (see the example link below).
# Note: the adjacent string literals are concatenated, so the trailing spaces matter.
prompt_str = (
    "[INST] Write code to solve the following coding problem that obeys "
    "the constraints and passes the example test cases. "
    f"Please wrap your code answer using ```:\n{input}\n[/INST]```python\n"
)
```

Here, `input` is the LeetCode problem statement; see an example problem at: https://github.com/khaimt/coding_challenge_solver/blob/main/test_cases/problem1.txt
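
For illustration, the prompt can be built from a saved problem statement. This is a minimal sketch; the local file name `problem1.txt` is only an assumption, mirroring the example file linked above:

```python
# Load a LeetCode problem statement from a local text file
# (e.g. a saved copy of the problem1.txt linked above).
with open("problem1.txt", "r", encoding="utf-8") as f:
    problem = f.read()

# Fill the problem statement into the prompt template.
prompt_str = (
    "[INST] Write code to solve the following coding problem that obeys "
    "the constraints and passes the example test cases. "
    f"Please wrap your code answer using ```:\n{problem}\n[/INST]```python\n"
)
```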

**Example for inference:**

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "..."  # set to this model's Hub ID or a local checkpoint path

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto", torch_dtype=torch.bfloat16)

# `input` holds the LeetCode problem statement, as above.
prompt_str = (
    "[INST] Write code to solve the following coding problem that obeys "
    "the constraints and passes the example test cases. "
    f"Please wrap your code answer using ```:\n{input}\n[/INST]```python\n"
)

# Tokenize the prompt and move it to the model's device.
token_ids = tokenizer([prompt_str], return_tensors="pt")["input_ids"]
token_ids = token_ids.to(model.device)

# A temperature this close to 0 makes sampling effectively greedy.
outputs = model.generate(input_ids=token_ids, max_new_tokens=1024, do_sample=True, temperature=0.0001)

# Keep only the newly generated tokens, dropping the prompt.
all_token_ids = outputs[0].tolist()
output_token_ids = all_token_ids[token_ids.shape[-1]:]
output = tokenizer.decode(output_token_ids)
print("-------------Solution generated from Model---------")
print(output)
```
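
Because the prompt already opens a Python code fence, the completion is raw code that typically ends with a closing fence. A minimal sketch for extracting just the code, assuming that output format (the stopping behavior is not guaranteed):

```python
# Cut the completion at the first closing fence, if the model emitted one.
solution_code = output.split("```")[0].strip()
print(solution_code)
```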