fix readme
README.md CHANGED

@@ -40,7 +40,7 @@ Trying the following codes, you can perform the batched offline inference with t
 ```python
 from lmdeploy import pipeline, TurbomindEngineConfig
 engine_config = TurbomindEngineConfig(model_format='awq')
-pipe = pipeline("internlm/internlm2-chat-20b-4bits", engine_config)
+pipe = pipeline("internlm/internlm2-chat-20b-4bits", backend_config=engine_config)
 response = pipe(["Hi, pls intro yourself", "Shanghai is"])
 print(response)
 ```
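For reference, the full README snippet as it reads after this fix. This is a minimal sketch assembled from the new side of the diff; the only assumption beyond the diff itself is the reason for the keyword argument, namely that lmdeploy's `pipeline()` binds its second positional parameter to something other than the backend config, so the engine config must be passed as `backend_config=`.

```python
from lmdeploy import pipeline, TurbomindEngineConfig

# The 4-bit AWQ checkpoint requires telling the TurboMind backend its weight format.
engine_config = TurbomindEngineConfig(model_format='awq')

# Pass the engine config by keyword; a bare positional second argument would be
# interpreted as a different pipeline() parameter (assumption, inferred from this fix).
pipe = pipeline("internlm/internlm2-chat-20b-4bits", backend_config=engine_config)

# Batched offline inference over two prompts.
response = pipe(["Hi, pls intro yourself", "Shanghai is"])
print(response)
```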