deepnight-research committed
Commit 0f9d1ff
1 parent: 491a604

Create README.md

---
license: other
license_name: deepnight-responsible-ai
license_link: LICENSE
---

# SaiLy 100B (deepnight-research/saily_100B)

<img src="https://i.ibb.co/TvZQjZM/Leonardo-Diffusion-XL-Furious-and-strong-Elephant-and-anchor-l-1.jpg" alt="SaiLy: Experimental AI Models by DEEPNIGHT">

---
### SaiLy is a series of highly experimental, uncensored AI models by DEEPNIGHT-RESEARCH. Please use responsibly.
---
<br>

Prompt template: Alpaca

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
{prompt}
### Response:
```
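As an illustration, the template above can be filled in with a small helper. This is just a sketch: the `build_prompt` function and `ALPACA_TEMPLATE` constant are not part of the model's code, only a convenience for assembling the prompt string.

```python
# Hypothetical helper (not shipped with the model): fills the Alpaca-style
# template shown above with a user instruction.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "### Instruction:\n"
    "{prompt}\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Return the full prompt string for a single instruction."""
    return ALPACA_TEMPLATE.format(prompt=instruction)

print(build_prompt("Summarize the plot of Hamlet in one sentence."))
```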
### Description:
This is the first *stable* model of the series. The model is based on Llama2-chat.

---

### Did someone say CODE?
Here you go!

```python
import transformers

model = transformers.AutoModelForCausalLM.from_pretrained(
    'deepnight-research/saily_100B'
)
```

To use the optimized Triton implementation of FlashAttention, you can load the model on GPU (`cuda:0`) with `attn_impl='triton'` and in `bfloat16` precision:

```python
import torch
import transformers

name = 'deepnight-research/saily_100B'

config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton'
config.init_device = 'cuda:0'  # For fast initialization directly on GPU!

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    torch_dtype=torch.bfloat16,  # Load model weights in bfloat16
    trust_remote_code=True
)
```
---

If you would like to support us, please consider donating to [#aiforcause](https://github.com/deepnight-ai/aiforcause).

Cheers✌️
- Team [DEEPNIGHT](https://deepnight.tech)