Text Generation
Transformers
Safetensors
llama
text-generation-inference
Inference Endpoints
jonabur committed on
Commit 21dbecf
1 Parent(s): 17b86f2

update for final checkpoint

Files changed (1)
  1. README.md +13 -5
README.md CHANGED
@@ -16,11 +16,9 @@ language:
 
 # Viking 7B
 
-**NOTE: We are aware of an incompatibility with HF transformers that impacts finetuning and are working to correct it.**
-
-_**NOTE:** This is a **research checkpoint** of a model for which **training has not been completed.** It is being provided in its current state for research and testing purposes. **Care should be taken when using the outputs of the model.** Once pretraining has completed we intend to release additional instruction-tuned and chat-tuned varieties._
-
-Viking 7B is a 7B parameter decoder-only transformer pretrained on Finnish, English, Swedish, Danish, Norwegian, Icelandic and code. It is being trained on 2 trillion tokens (1 trillion as of this release). Viking 7B is a fully open source model and is made available under the Apache 2.0 License.
+Viking 7B is a 7B parameter decoder-only transformer pretrained on Finnish,
+English, Swedish, Danish, Norwegian, Icelandic and code. It has been trained
+on 2 trillion tokens. Viking 7B is a fully open source model and is made available under the Apache 2.0 License.
 
 Viking was created in a collaboration between the [TurkuNLP group](https://turkunlp.org/) of the University of Turku, [SiloGen](https://www.silo.ai/silogen) from [Silo AI](https://www.silo.ai/), and [High Performance Language Technologies](https://hplt-project.org/) (HPLT). Training was conducted on the [LUMI supercomputer](https://www.lumi-supercomputer.eu/), using compute resources generously provided by [CSC](https://csc.fi/) - IT Center for Science, Finland.
 
@@ -97,6 +95,16 @@ Training Checkpoints are available as branches in the repository. Checkpoints w
 * [800B](https://huggingface.co/LumiOpen/Viking-7B/tree/800B)
 * [900B](https://huggingface.co/LumiOpen/Viking-7B/tree/900B)
 * [1000B](https://huggingface.co/LumiOpen/Viking-7B/tree/1000B)
+* [1100B](https://huggingface.co/LumiOpen/Viking-7B/tree/1100B)
+* [1200B](https://huggingface.co/LumiOpen/Viking-7B/tree/1200B)
+* [1300B](https://huggingface.co/LumiOpen/Viking-7B/tree/1300B)
+* [1400B](https://huggingface.co/LumiOpen/Viking-7B/tree/1400B)
+* [1500B](https://huggingface.co/LumiOpen/Viking-7B/tree/1500B)
+* [1600B](https://huggingface.co/LumiOpen/Viking-7B/tree/1600B)
+* [1700B](https://huggingface.co/LumiOpen/Viking-7B/tree/1700B)
+* [1800B](https://huggingface.co/LumiOpen/Viking-7B/tree/1800B)
+* [1900B](https://huggingface.co/LumiOpen/Viking-7B/tree/1900B)
+* [2000B](https://huggingface.co/LumiOpen/Viking-7B/tree/2000B)
 
 The transformers library allows you to load a checkpoint from a branch as follows:
 
 
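The snippet the README points to falls outside the hunk shown above, so it does not appear in this diff. As a minimal sketch, assuming only the standard `revision` argument of `from_pretrained` (which selects a git branch, tag, or commit on the Hub), loading one of the checkpoint branches looks like this:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Pick any checkpoint branch listed above, e.g. "2000B" for the final one.
branch = "2000B"

# `revision` selects the git branch on the Hub, so this loads the weights
# committed to that checkpoint branch rather than to `main`.
model = AutoModelForCausalLM.from_pretrained(
    "LumiOpen/Viking-7B",
    revision=branch,
)
tokenizer = AutoTokenizer.from_pretrained(
    "LumiOpen/Viking-7B",
    revision=branch,
)
```

Because the branch names double as git revisions, the same `revision=` value also works with `huggingface_hub.snapshot_download` if you only want to fetch a checkpoint's files without instantiating the model.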