victorlxh committed
Commit 4f34a54
1 Parent(s): 83ea3ce

Update README.md

Files changed (1): README.md (+5 -5)
README.md CHANGED
@@ -5,12 +5,12 @@ license: cc-by-nc-4.0
 # ICKG Model Card
 
 ## Model Details
-ICKG (Integrated Contextual Knowledge Graph Generator) is a knowledge graph construction (KGC) task-specific, instruction-following language model fine-tuned from LMSYS's Vicuna-7B, which is itself derived from Meta's LLaMA LLM.
+ICKG (Integrated Contextual Knowledge Graph Generator) 2.0 is a knowledge graph construction (KGC) task-specific, instruction-following language model fine-tuned from LMSYS's Vicuna-7B, which is itself derived from Meta's LLaMA LLM.
 
 - **Developed by**: [Xiaohui Li](https://xiaohui-victor-li.github.io/)
 - **Model type**: Auto-regressive language model based on the transformer architecture.
 - **License**: Non-commercial
-- **Finetuned from model**: [Vicuna-7B](https://huggingface.co/lmsys/vicuna-7b-v1.3) (originally from [LLaMA](https://arxiv.org/abs/2302.13971)).
+- **Finetuned from model**: [Vicuna-7B](https://huggingface.co/lmsys/vicuna-7b-v1.5) (originally from [LLaMA 2.0](https://ai.meta.com/llama/)).
 
 ## Model Sources
 - **Repository**: [https://github.com/your-github-repo](https://github.com/your-github-repo)
@@ -18,14 +18,14 @@ ICKG (Integrated Contextual Knowledge Graph Generator) is a knowledge graph cons
 - **Paper**: [https://arxiv.org/abs/your-paper-id](https://arxiv.org/abs/your-paper-id)
 
 ## Uses
-The primary use of iKG LLM is generating knowledge graphs (KGs) from text through its instruction-following capability with specialized prompts. It is intended for researchers, data scientists, and developers interested in natural language processing and knowledge graph construction.
+The primary use of ICKG LLM is generating knowledge graphs (KGs) from text through its instruction-following capability with specialized prompts. It is intended for researchers, data scientists, and developers interested in natural language processing and knowledge graph construction.
 
 ## How to Get Started with the Model
 - **Python Code**: [https://github.com/your-github-repo/tree/main#api](https://github.com/your-github-repo/tree/main#api)
 - **Command line interface of FastChat**: [https://github.com/your-github-repo#ikg-weights](https://github.com/your-github-repo#ikg-weights)
 
 ## Training Details
-iKG is fine-tuned from Vicuna-7B on ~3K instruction-following demonstrations, each pairing a KG-construction input document with the extracted KG triplets as the response output. iKG thus learns to extract a list of KG triplets from a given text document via prompt engineering. For more in-depth training details, refer to the "Generative Knowledge Graph Construction with Fine-tuned LLM" section of [the accompanying paper](https://arxiv.org/abs/your-paper-id).
+ICKG 2.0 is fine-tuned from the latest Vicuna-7B on ~3K instruction-following demonstrations, each pairing a KG-construction input document with the extracted KG triplets as the response output. ICKG thus learns to extract a list of KG triplets from a given text document via prompt engineering. For more in-depth training details, refer to the "Generative Knowledge Graph Construction with Fine-tuned LLM" section of [the accompanying paper](https://arxiv.org/abs/your-paper-id).
 
 - **Prompt Template**: The entities and relationships can be customized for specific tasks. `<input_text>` is the placeholder for the document text.
 
@@ -65,7 +65,7 @@ iKG is fine-tuned from Vicuna-7B using ~3K instruction-following demonstrations
 ```
 
 ## Evaluation
-iKG has undergone preliminary evaluation comparing its performance to GPT-3.5, GPT-4, and the original Vicuna-7B model. On the KG construction task, it outperforms GPT-3.5 and Vicuna-7B while exhibiting capability comparable to GPT-4. iKG excels at generating instruction-based knowledge graphs, with a particular emphasis on quality and adherence to format.
+ICKG has undergone preliminary evaluation comparing its performance to GPT-3.5, GPT-4, and the original Vicuna-7B model. On the KG construction task, it outperforms GPT-3.5 and Vicuna-7B while exhibiting capability comparable to GPT-4. ICKG excels at generating instruction-based knowledge graphs, with a particular emphasis on quality and adherence to format.
 
 For a more detailed introduction, refer to [the accompanying paper](https://arxiv.org/abs/your-paper-id).
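The prompt-driven triplet extraction that the Training Details section describes can be sketched as follows. This is a minimal illustration only: the template wording and the `(subject, relation, object)` line format are assumptions, not ICKG's actual prompt template (which appears in the full README), and `build_prompt`/`parse_triplets` are hypothetical helper names.

```python
# Sketch of ICKG-style KG construction via prompting: fill the
# <input_text> placeholder with a document, then parse triplet lines
# from the model's response. Template text and output format are
# illustrative assumptions, not the model's real template.
import re

PROMPT_TEMPLATE = (
    "Extract knowledge graph triplets from the document below.\n"
    "Return one (subject, relation, object) triplet per line.\n\n"
    "Document:\n<input_text>"
)


def build_prompt(document: str) -> str:
    """Substitute the document text for the <input_text> placeholder."""
    return PROMPT_TEMPLATE.replace("<input_text>", document)


def parse_triplets(response: str) -> list[tuple[str, str, str]]:
    """Parse '(subject, relation, object)' lines from a model response."""
    triplets = []
    for match in re.finditer(r"\(([^,()]+),([^,()]+),([^,()]+)\)", response):
        triplets.append(tuple(part.strip() for part in match.groups()))
    return triplets


prompt = build_prompt("Paris is the capital of France.")
# A well-formed model response would then be parsed like this:
triplets = parse_triplets("(Paris, capital_of, France)")
```

The parsing step is deliberately strict about the parenthesized three-part format, mirroring the model card's emphasis on adherence to the output format.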