Update README.md
# ICKG Model Card
## Model Details
ICKG (Integrated Contextual Knowledge Graph Generator) 2.0 is a task-specific instruction-following language model for knowledge graph construction (KGC), fine-tuned from LMSYS's Vicuna-7B, which is itself derived from Meta's LLaMA 2.0 LLM.
- **Developed by**: [Xiaohui Li](https://xiaohui-victor-li.github.io/)
- **Model type**: Auto-regressive language model based on the transformer architecture.
The primary use of the ICKG LLM is generating knowledge graphs (KGs) from text documents via its instruction-following capability with specialized prompts. It is intended for researchers, data scientists, and developers interested in natural language processing and knowledge graph construction.
## How to Get Started with the Model
- **Python Code**: [https://github.com/xiaohui-victor-li/FinDKG](https://github.com/xiaohui-victor-li/FinDKG)
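
A minimal usage sketch is shown below, assuming the fine-tuned weights are published in Hugging Face `transformers` format. The model ID and the prompt wording are placeholders for illustration, not the released checkpoint name or the exact prompt template; refer to the GitHub repository above for the official instructions.

```python
# Minimal sketch: load ICKG with Hugging Face transformers and prompt it to
# extract KG triplets from a document.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "path/to/ICKG-v2"  # placeholder; use the checkpoint released in the FinDKG repo

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

document = (
    "Apple Inc. announced that Tim Cook will visit the new chip plant "
    "operated by TSMC in Arizona."
)

# Illustrative KG-construction prompt; the exact template used for fine-tuning
# is documented in the repository and the accompanying paper.
prompt = (
    "Extract knowledge graph triplets in the form (subject, relation, object) "
    f"from the following document:\n\n{document}\n\nTriplets:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```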
## Training Details
ICKG 2.0 is fine-tuned from the latest Vicuna-7B using ~3K instruction-following demonstrations, each pairing an input document for KG construction with the extracted KG triplets as the response output. ICKG thus learns to extract a list of KG triplets from a given text document via prompt engineering. For more in-depth training details, refer to the "Generative Knowledge Graph Construction with Fine-tuned LLM" section of [the accompanying paper](https://arxiv.org/abs/your-paper-id).
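
For illustration only, one such demonstration could be organized as in the sketch below. The field names and example content are assumptions, not the actual dataset schema, which is documented in the FinDKG repository and the paper.

```python
# Hypothetical example of one instruction-following demonstration: an input
# document paired with the extracted KG triplets as the target response.
# Field names are illustrative, not the actual dataset schema.
demonstration = {
    "instruction": "Extract knowledge graph triplets (subject, relation, object) from the document.",
    "input": "The Federal Reserve raised interest rates, pressuring regional banks.",
    "output": [
        ("Federal Reserve", "raised", "interest rates"),
        ("interest rates", "pressured", "regional banks"),
    ],
}
```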