cuierfei committed
Commit 53b2857
1 Parent(s): cdf9142

Upload folder using huggingface_hub

Files changed (2):
  1. README.md +3 -3
  2. config.json +1 -0
README.md CHANGED
@@ -65,7 +65,7 @@ For more information about the pipeline parameters, please refer to [here](https
 LMDeploy's `api_server` enables models to be easily packed into services with a single command. The provided RESTful APIs are compatible with OpenAI's interfaces. Below is an example of service startup:
 
 ```shell
- lmdeploy serve api_server OpenGVLab/InternVL-Chat-V1-5-AWQ --model-name InternVL-Chat-V1-5-AWQ --backend turbomind --server-port 23333 --model-format awq
+ lmdeploy serve api_server OpenGVLab/InternVL-Chat-V1-5-AWQ --backend turbomind --server-port 23333 --model-format awq
 ```
 
 To use the OpenAI-style interface, you need to install OpenAI:
@@ -82,7 +82,7 @@ from openai import OpenAI
 client = OpenAI(api_key='YOUR_API_KEY', base_url='http://0.0.0.0:23333/v1')
 model_name = client.models.list().data[0].id
 response = client.chat.completions.create(
-     model="InternVL-Chat-V1-5-AWQ",
+     model=model_name,
     messages=[{
         'role':
         'user',
@@ -104,7 +104,7 @@ print(response)
 
 ## License
 
- This project is released under the MIT license, while InternLM is licensed under the Apache-2.0 license.
+ This project is released under the MIT license, while InternLM2 is licensed under the Apache-2.0 license.
 
 ## Citation
 
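The second hunk only shows fragments of the client snippet it touches. For context, a minimal sketch of the assembled call might look like the following, assuming the `api_server` started by the shell command above is listening on port 23333; the prompt text and image URL are placeholders, not taken from the README.

```python
from openai import OpenAI

# Point the client at the locally running lmdeploy api_server.
client = OpenAI(api_key='YOUR_API_KEY', base_url='http://0.0.0.0:23333/v1')

# Ask the server which model it serves instead of hard-coding the name;
# this is what the `model=model_name` change above relies on.
model_name = client.models.list().data[0].id

response = client.chat.completions.create(
    model=model_name,
    messages=[{
        'role': 'user',
        'content': [
            # Placeholder prompt and image URL for illustration only.
            {'type': 'text', 'text': 'Describe this image.'},
            {'type': 'image_url', 'image_url': {'url': 'https://example.com/cat.jpg'}},
        ],
    }],
    # Illustrative sampling settings, not part of the diff shown above.
    temperature=0.8,
    top_p=0.8)
print(response)
```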
config.json CHANGED
@@ -8,6 +8,7 @@
     "AutoModel": "modeling_internvl_chat.InternVLChatModel",
     "AutoModelForCausalLM": "modeling_internvl_chat.InternVLChatModel"
   },
+  "system_message": "You are an AI assistant whose name is InternLM (书生·浦语).",
   "downsample_ratio": 0.5,
   "dynamic_image_size": true,
   "force_image_size": 448,