---
license: other
language:
- en
tags:
- causal-lm
- code
metrics:
- code_eval
library_name: transformers
model-index:
- name: stabilityai/stable-code-instruct-3b
  results:
  - task:
      type: text-generation
    dataset:
      type: nuprl/MultiPL-E
      name: MultiPL-HumanEval (Python)
    metrics:
    - name: pass@1
      type: pass@1
      value: 32.4
      verified: false
  - task:
      type: text-generation
    dataset:
      type: nuprl/MultiPL-E
      name: MultiPL-HumanEval (C++)
    metrics:
    - name: pass@1
      type: pass@1
      value: 30.9
      verified: false
  - task:
      type: text-generation
    dataset:
      type: nuprl/MultiPL-E
      name: MultiPL-HumanEval (Java)
    metrics:
    - name: pass@1
      type: pass@1
      value: 32.1
      verified: false
  - task:
      type: text-generation
    dataset:
      type: nuprl/MultiPL-E
      name: MultiPL-HumanEval (JavaScript)
    metrics:
    - name: pass@1
      type: pass@1
      value: 32.1
      verified: false
  - task:
      type: text-generation
    dataset:
      type: nuprl/MultiPL-E
      name: MultiPL-HumanEval (PHP)
    metrics:
    - name: pass@1
      type: pass@1
      value: 24.2
      verified: false
  - task:
      type: text-generation
    dataset:
      type: nuprl/MultiPL-E
      name: MultiPL-HumanEval (Rust)
    metrics:
    - name: pass@1
      type: pass@1
      value: 23.0
      verified: false
quantized_by: bartowski
pipeline_tag: text-generation
lm_studio:
  param_count: 3b
  use_case: coding
  release_date: 19-03-2024
  model_creator: stabilityai
  prompt_template: ChatML
  system_prompt: A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.
  base_model: stablelm
  original_repo: stabilityai/stable-code-instruct-3b
---
## 💫 Community Model> Stable Code Instruct 3B by Stability AI
*👾 [LM Studio](https://lmstudio.ai) Community models highlights program. Highlighting new & noteworthy models by the community. Join the conversation on [Discord](https://discord.gg/aPQfnNkxGC)*.
**Model creator:** [Stability AI](https://huggingface.co/stabilityai)<br>
**Original model**: [stable-code-instruct-3b](https://huggingface.co/stabilityai/stable-code-instruct-3b)<br>
**GGUF quantization:** provided by [bartowski](https://huggingface.co/bartowski) based on `llama.cpp` release [b2440](https://github.com/ggerganov/llama.cpp/releases/tag/b2440)<br>
## Model Summary:
Stable Code Instruct 3B is part of the Stable Code family, instruction tuned using [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290) from the base Stable Code 3B model.
This model is well suited to general-purpose conversational coding, as well as SQL generation.<br>
## Prompt Template:
Choose the `ChatML` preset in LM Studio.
Under the hood, the model will see a prompt that's formatted like so:
```
<|im_start|>system
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
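If you are scripting against the model outside of LM Studio, the same ChatML layout can be reproduced in a few lines of Python. The helper below is a minimal sketch; the function name is hypothetical, and the default system prompt is simply the one used on this card.
```
def build_chatml_prompt(user_prompt: str,
                        system_prompt: str = ("A chat between a curious user and an artificial "
                                              "intelligence assistant. The assistant gives helpful, "
                                              "detailed, and polite answers to the user's questions.")) -> str:
    """Format a single-turn request using the ChatML layout shown above."""
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example: the resulting string is what the model actually sees.
print(build_chatml_prompt("Write a Python function that reverses a string."))
```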
## Use case and examples
Stable Code Instruct 3B is trained primarily on code, with some conversational tuning to provide assistant-style behavior, and it also performs well at SQL generation. This makes it well suited to lightweight coding tasks, and its small size keeps it fast enough for code completion.
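If you want to drive the GGUF build of this model from your own scripts rather than through LM Studio, one option is the `llama-cpp-python` bindings. The snippet below is a minimal sketch under that assumption; the GGUF file name is illustrative and should be replaced with whichever quantization you downloaded.
```
from llama_cpp import Llama

# Illustrative file name: point this at the GGUF quantization you downloaded.
llm = Llama(
    model_path="stable-code-instruct-3b-Q4_K_M.gguf",
    n_ctx=4096,           # context window
    chat_format="chatml", # matches the prompt template described above
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions."},
        {"role": "user", "content": "Write a SQL query that returns the top 5 customers by total order value."},
    ],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```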
### SQL generation with explanation:
```
<|im_start|>system
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.<|im_end|>
<|im_start|>user
Using python, how can I create an SQL table where the primary key is a combination of two columns?<|im_end|>
<|im_start|>assistant
To create an SQL table with a composite primary key in Python, you can use the `sqlite3` module. Here's an example code snippet:
import sqlite3
# Connect to the database (or create it if it doesn't exist)
conn = sqlite3.connect('example.db')
# Create a cursor object to execute SQL commands
c = conn.cursor()
# Define the table schema with a composite primary key
schema = '''
CREATE TABLE example_table (
column1 TEXT,
column2 INTEGER,
PRIMARY KEY (column1, column2)
);
'''
# Execute the SQL command to create the table
c.execute(schema)
# Commit the changes to the database
conn.commit()
# Close the database connection
conn.close()
In this example, we first import the `sqlite3` module and create a connection to the database (or create it if it doesn't exist). We then create a cursor object to execute SQL commands.
Next, we define the table schema with a composite primary key using the `CREATE TABLE` command. In this example, the table is named `example_table` and has two columns: `column1` (text data type) and `column2` (integer data type). The `PRIMARY KEY` constraint is used to specify that the combination of `column1` and `column2` should be unique for each row in the table.
After defining the table schema, we execute the SQL command using the cursor object. We then commit the changes to the database and close the database connection.
Note that the `sqlite3` module is part of Python's standard library, so no separate installation is required.<|im_end|>
```
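To sanity-check that the composite primary key behaves as described, a short follow-up sketch (written against an in-memory SQLite database, so it leaves no file behind) can attempt a duplicate `(column1, column2)` pair and catch the resulting `IntegrityError`:
```
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("""
    CREATE TABLE example_table (
        column1 TEXT,
        column2 INTEGER,
        PRIMARY KEY (column1, column2)
    );
""")

c.execute("INSERT INTO example_table VALUES (?, ?)", ("a", 1))
c.execute("INSERT INTO example_table VALUES (?, ?)", ("a", 2))  # same column1, different column2: allowed

try:
    c.execute("INSERT INTO example_table VALUES (?, ?)", ("a", 1))  # exact duplicate pair: rejected
except sqlite3.IntegrityError as e:
    print("Duplicate composite key rejected:", e)

conn.close()
```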
### General python coding questions:
```
<|im_start|>system
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.<|im_end|>
<|im_start|>user
Using python, how can I sort a list of integer tuples based on the second item, then the first?<|im_end|>
<|im_start|>assistant
You can achieve this by using Python's built-in `sort()` function or `sorted()` function. Both these functions allow you to specify multiple sorting conditions.
Here is an example code snippet that demonstrates how to sort a list of integer tuples based on the second item, then the first:
# Sample list of integer tuples
my_list = [(1, 2), (3, 1), (2, 3)]
# Sort the list by the second item, then the first
my_list.sort(key=lambda x: (x[1], x[0]))
# Print the sorted list
print(my_list)
In this example, we use a lambda function as the `key` argument to the `sort()` function. The lambda function takes each tuple in the list and returns a tuple of the second item and the first item of the original tuple. This way, the tuples are sorted based on the second item first, then by the first item if there is a tie.
The output of this code will be:
[(3, 1), (1, 2), (2, 3)]
This means that the list has been sorted in ascending order based on the second item of each tuple, and if there is a tie, it is resolved by sorting based on the first item of each tuple.<|im_end|>
```
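For the same sorting task, `operator.itemgetter` is a common alternative to a lambda key, and `sorted()` can be used when the original list should be left untouched. A quick sketch:
```
from operator import itemgetter

my_list = [(1, 2), (3, 1), (2, 3)]

# sorted() returns a new list and leaves my_list unchanged
result = sorted(my_list, key=itemgetter(1, 0))

print(result)   # [(3, 1), (1, 2), (2, 3)]
print(my_list)  # original order preserved: [(1, 2), (3, 1), (2, 3)]
```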
### General C coding questions:
```
<|im_start|>system
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.<|im_end|>
<|im_start|>user
Can you write a C program that takes two numbers and prints their sum to the console?<|im_end|>
<|im_start|>assistant
Sure! Here is a simple C program that does exactly what you asked for:
#include <stdio.h>
int main() {
    int num1, num2, sum;
    printf("Enter first number: ");
    scanf("%d", &num1);
    printf("Enter second number: ");
    scanf("%d", &num2);
    sum = num1 + num2;
    printf("The sum of %d and %d is %d\n", num1, num2, sum);
    return 0;
}
This program first prompts the user to enter two numbers. It then calculates the sum of those two numbers and prints the result to the console.<|im_end|>
```
## Technical Details
The base model was trained on the following datasets:
- [tiiuae/falcon-refinedweb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb)
- [bigcode/the-stack-github-issues](https://huggingface.co/datasets/bigcode/the-stack-github-issues)
- [bigcode/commitpackft](https://huggingface.co/datasets/bigcode/commitpackft)
- [bigcode/starcoderdata](https://huggingface.co/datasets/bigcode/starcoderdata)
- [EleutherAI/proof-pile-2](https://huggingface.co/datasets/EleutherAI/proof-pile-2)
- [meta-math/MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA)
The model specializes in the following languages:
- C
- C++
- Java
- JavaScript
- CSS
- Go
- HTML
- Ruby
- Rust
- Markdown
- Shell
- PHP
- SQL
- R
- TypeScript
- Python
- Jupyter-Clean
- reStructuredText
## Special thanks
🙏 Special thanks to [Georgi Gerganov](https://github.com/ggerganov) and the whole team working on [llama.cpp](https://github.com/ggerganov/llama.cpp/) for making all of this possible.
## Disclaimers
LM Studio is not the creator, originator, or owner of any Model featured in the Community Model Program. Each Community Model is created and provided by third parties. LM Studio does not endorse, support, represent or guarantee the completeness, truthfulness, accuracy, or reliability of any Community Model. You understand that Community Models can produce content that might be offensive, harmful, inaccurate or otherwise inappropriate, or deceptive. Each Community Model is the sole responsibility of the person or entity who originated such Model. LM Studio may not monitor or control the Community Models and cannot, and does not, take responsibility for any such Model. LM Studio disclaims all warranties or guarantees about the accuracy, reliability or benefits of the Community Models. LM Studio further disclaims any warranty that the Community Model will meet your requirements, be secure, uninterrupted or available at any time or location, or error-free, viruses-free, or that any errors will be corrected, or otherwise. You will be solely responsible for any damage resulting from your use of or access to the Community Models, your downloading of any Community Model, or use of any other Community Model provided by or through LM Studio.