---

- **Developed by:** kevinkawchak
- **License:** llama3
- **Finetuned from model:** unsloth/llama-3-8b-Instruct-bnb-4bit
- **Finetuned using dataset:** zjunlp/Mol-Instructions, cc-by-4.0
- **Dataset identification:** Molecule-oriented Instructions
- **Dataset function:** Description guided molecule design

The following are modifications or improvements to original notebooks. Please refer to the authors' models for the published primary work.

[Cover Image](https://drive.google.com/file/d/1J-spZMzLlPxkqfMrPxvtMZiD2_hfcGyr/view?usp=sharing). [META LLAMA 3 COMMUNITY LICENSE AGREEMENT](https://llama.meta.com/llama3/license/). Built with Meta Llama 3. <br>
A 4-bit quantization of Meta-Llama-3-8B-Instruct was used to reduce training memory requirements when fine-tuning on the zjunlp/Mol-Instructions dataset. (1-2) In addition, the minimum LoRA rank value was used to reduce the overall size of the resulting models. Specifically, the "description guided molecule design" task from the molecule-oriented instructions was used to answer general questions and general biochemistry questions. General questions were answered with high accuracy, while biochemistry-related questions returned SELFIES structures with limited accuracy.
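As a rough illustration, the snippet below is a minimal Python sketch of how such a run is typically set up with Unsloth: the 4-bit base model named above is loaded and a small-rank LoRA adapter is attached. The sequence length, rank (r = 8), and alpha values are assumptions for illustration; this card does not state the exact hyperparameters used.

```python
# Minimal sketch (not the exact training notebook): load the 4-bit Unsloth build of
# Llama-3-8B-Instruct and attach a low-rank LoRA adapter before fine-tuning on
# zjunlp/Mol-Instructions. Hyperparameters below are assumptions, not confirmed values.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-Instruct-bnb-4bit",  # base model named in this card
    max_seq_length=2048,   # assumed context length
    dtype=None,            # let Unsloth choose bf16/fp16 for the GPU
    load_in_4bit=True,     # 4-bit quantization to cut training memory
)

# A small LoRA rank keeps the saved adapter small.
model = FastLanguageModel.get_peft_model(
    model,
    r=8,                   # assumed "minimum" rank; the card does not state the exact value
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_alpha=16,
    lora_dropout=0,
    bias="none",
    use_gradient_checkpointing=True,
    random_state=3407,
)
```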
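For readers unfamiliar with SELFIES output, a hypothetical post-processing snippet using the open-source `selfies` package is shown below; the example string is illustrative only, not an actual model response.

```python
# Hypothetical check of a SELFIES string returned by the model: decode it to SMILES
# for inspection. Requires `pip install selfies`; the string here is a placeholder.
import selfies as sf

generated = "[C][C][O]"          # placeholder SELFIES output, not a real model answer
smiles = sf.decoder(generated)   # decodes to "CCO" (ethanol)
print(smiles)
```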