---
license: apache-2.0
pipeline_tag: text-generation
language:
- da
tags:
- pretrained
inference:
  parameters:
    temperature: 0.7
datasets:
- DDSC/partial-danish-gigaword-no-twitter
base_model: mistralai/Mistral-7B-v0.1
---
# Model Card for Munin 7B Alpha
The Munin 7B Alpha Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters, based on Mistral-7B-v0.1.
It has been trained on Danish Gigaword using continual pretraining.
For full details of this model, please read our release blog post. The codebase can be found in our Git repo.
**Note:** This is an alpha model, and we do not recommend using it in production. If you do use the model, please let us know.
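As a quick-start, the sketch below shows one way to load the model for generation with the 🤗 Transformers library. The Hub model id is an assumption based on this card's title, and the sampling temperature mirrors the inference parameters in the metadata above; treat it as a starting point rather than a definitive recipe.

```python
# A minimal sketch: load Munin 7B Alpha for text generation with transformers.
# The model id below is an assumption based on this card; verify it on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "danish-foundation-models/munin-7b-alpha"  # assumed Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 7B parameters; needs a GPU with sufficient memory
    device_map="auto",
)

# This is a pretrained base model: no chat template and no moderation,
# so prompt it with plain text rather than a conversation format.
prompt = "Danmark er et land i"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.7,  # matches the inference parameters in this card's metadata
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```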
## Notice
Munin 7B Alpha is, like Mistral 7B, a pretrained base model and therefore does not have any moderation mechanisms.
## The Danish Foundation Models Team
- From the Center for Humanities Computing at Aarhus University:
  - Kenneth Enevoldsen ([email protected])
  - Lasse Hansen ([email protected])
  - Kristoffer Laigaard Nielbo ([email protected])
- From the Alexandra Institute:
  - Peter Bjørn Jørgensen ([email protected])
  - Rasmus Larsen ([email protected])
  - Dan Saattrup Nielsen ([email protected])
## With Support From
- Danish e-infrastructure Consortium
- Acquisition and Logistics Organisation at the Danish Ministry of Defence
- Danish Ministry of Higher Education and Science under the Digital Security, Trust and Data Ethics performance contract