arnocandel committed on
Commit a577186
1 Parent(s): 29a00ba

Update README.md

Files changed (1)
  1. README.md +33 -1
README.md CHANGED
@@ -20,4 +20,36 @@ This model can be fine-tuned with [H2O.ai](https://h2o.ai/) open-source software
  - h2oGPT https://github.com/h2oai/h2ogpt/
  - H2O LLM Studio https://h2o.ai/platform/ai-cloud/make/llm-studio/
 
- Try our live [h2oGPT demo](https://gpt.h2o.ai) with side-by-side LLM comparisons and private document chat!
+ Try our live [h2oGPT demo](https://gpt.h2o.ai) with side-by-side LLM comparisons and private document chat!
+
+
+ ## Model Architecture
+
+ ```
+ LlamaForCausalLM(
+   (model): LlamaModel(
+     (embed_tokens): Embedding(32000, 5120, padding_idx=0)
+     (layers): ModuleList(
+       (0-39): 40 x LlamaDecoderLayer(
+         (self_attn): LlamaAttention(
+           (q_proj): Linear(in_features=5120, out_features=5120, bias=False)
+           (k_proj): Linear(in_features=5120, out_features=5120, bias=False)
+           (v_proj): Linear(in_features=5120, out_features=5120, bias=False)
+           (o_proj): Linear(in_features=5120, out_features=5120, bias=False)
+           (rotary_emb): LlamaRotaryEmbedding()
+         )
+         (mlp): LlamaMLP(
+           (gate_proj): Linear(in_features=5120, out_features=13824, bias=False)
+           (up_proj): Linear(in_features=5120, out_features=13824, bias=False)
+           (down_proj): Linear(in_features=13824, out_features=5120, bias=False)
+           (act_fn): SiLUActivation()
+         )
+         (input_layernorm): LlamaRMSNorm()
+         (post_attention_layernorm): LlamaRMSNorm()
+       )
+     )
+     (norm): LlamaRMSNorm()
+   )
+   (lm_head): Linear(in_features=5120, out_features=32000, bias=False)
+ )
+ ```
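
The module tree in the added section is the standard PyTorch `print(model)` output. The sketch below shows one way to reproduce it from the shapes shown, without downloading any weights. It is a minimal sketch, assuming `num_attention_heads=40` (the head count is not visible in the printout) and a `transformers` version whose Llama classes print the same submodule names (e.g. `SiLUActivation`).

```python
# Minimal sketch: rebuild the printed module tree from the shapes it shows.
# Assumptions: num_attention_heads=40 (not visible in the printout) and a
# transformers version whose Llama classes use the same printed names.
from accelerate import init_empty_weights
from transformers import LlamaConfig, LlamaForCausalLM

config = LlamaConfig(
    vocab_size=32000,         # Embedding(32000, 5120, padding_idx=0)
    hidden_size=5120,
    intermediate_size=13824,  # gate_proj / up_proj / down_proj width
    num_hidden_layers=40,     # (0-39): 40 x LlamaDecoderLayer
    num_attention_heads=40,   # assumption; head count is not shown above
    pad_token_id=0,
)

# init_empty_weights() places parameters on the "meta" device, so no real
# parameter memory is allocated just to inspect the structure.
with init_empty_weights():
    model = LlamaForCausalLM(config)

print(model)
```

To inspect the actual checkpoint instead, loading it with `AutoModelForCausalLM.from_pretrained(...)` (using this repository's model id) and calling `print(model)` produces the same tree.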