sammysun0711 committed
Commit 5780d92 · 1 Parent(s): 3a97f6b

Update README.md

Files changed (1): README.md (+6, -1)
README.md CHANGED
````diff
@@ -5,14 +5,19 @@ pipeline_tag: text-generation
 ---
 FP32 Model converted from Pytorch: https://github.com/FlagAI-Open/FlagAI/tree/master/examples/Aquila
 
-Supports Inference with AutoModelForCausalLM, ORTModelForCausalLM and OVModelForCausalLM
+Support Inference with AutoModelForCausalLM, ORTModelForCausalLM and OVModelForCausalLM
 ```python
+#!pip install transformers>=4.30.2
+#!pip install optimum>=1.8.7 optimum-intel>=1.9.0
 import torch
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
 tokenizer = AutoTokenizer.from_pretrained('sammysun0711/aquilachat-7b-hf')
 model = AutoModelForCausalLM.from_pretrained('sammysun0711/aquilachat-7b-hf', trust_remote_code=True)
 model = model.eval()
+# from optimum import ORTModelForCausalLM, OVModelForCausalLM
+# model = ORTModelForCausalLM.from_pretrained('sammysun0711/aquilachat-7b-hf', export=True, use_cache=True, trust_remote_code=True)
+# model = OVModelForCausalLM.from_pretrained('sammysun0711/aquilachat-7b-hf', export=True, use_cache=True, trust_remote_code=True)
 
 question = '北京为什么是中国的首都?'
 prompt = (
````
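The ONNX Runtime and OpenVINO paths are only sketched as comments in the added lines, and in current Optimum releases the classes are imported from `optimum.onnxruntime` and `optimum.intel` rather than the top-level `optimum` package shown there. A minimal, hypothetical dispatch helper (the function name and lazy-import structure are my own, not part of this repository) illustrating how the three backends differ only in the class used to load the model:

```python
def load_causal_lm(model_id: str, backend: str = "pytorch"):
    """Load a causal LM with one of the three backends named in the README.

    Imports are deferred so only the selected backend's packages are needed.
    Hypothetical helper: not part of the aquilachat-7b-hf repository.
    """
    if backend == "pytorch":
        from transformers import AutoModelForCausalLM
        return AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
    if backend == "onnx":
        # Actual import path for ONNX Runtime models in Optimum
        from optimum.onnxruntime import ORTModelForCausalLM
        return ORTModelForCausalLM.from_pretrained(
            model_id, export=True, use_cache=True, trust_remote_code=True)
    if backend == "openvino":
        # Actual import path for OpenVINO models in optimum-intel
        from optimum.intel import OVModelForCausalLM
        return OVModelForCausalLM.from_pretrained(
            model_id, export=True, use_cache=True, trust_remote_code=True)
    raise ValueError(f"unknown backend: {backend!r}")
```

Whichever backend is selected, the returned model exposes the same `generate()` interface, so the prompt-building and generation code that follows in the README works unchanged.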