bokesyo committed on
Commit
9c28b51
•
1 Parent(s): d7301f2

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -29,6 +29,8 @@ Our model is capable of:
 
 # News
 
+- 2024-08-17: 👊 We open-sourced a [cleaned version of the training framework](https://github.com/RhapsodyAILab/MiniCPM-V-Embedding-v0-Train) for MiniCPM-Visual-Embedding, which supports `deepspeed zero stage 1,2` and large batch sizes such as `4096` for full-parameter training to turn VLMs into dense retrievers. We also developed methods to filter training datasets and to generate queries from unlabelled datasets, and the framework supports extensible multi-node, multi-GPU, high-efficiency evaluation on large retrieval datasets.
+
 - 2024-07-14: 🤗 We released **online huggingface demo**! Try our [online demo](https://huggingface.co/spaces/bokesyo/minicpm-visual-embeeding-v0-demo)!
 
 - 2024-07-14: 😋 We released a **locally deployable Gradio demo** of `Memex`, take a look at [pipeline_gradio.py](https://huggingface.co/RhapsodyAI/minicpm-visual-embedding-v0/blob/main/pipeline_gradio.py). You can build a demo on your PC now!
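For context on the 2024-08-17 entry, here is a minimal sketch of the kind of DeepSpeed ZeRO stage 2 setup with a large global batch size that such a training framework might use. This is an illustrative assumption, not the repository's actual configuration or training code; the model, learning rate, and batch-size split are placeholders.

```python
# Sketch only (assumption): a DeepSpeed ZeRO stage 2 config with a global
# batch size of 4096, of the kind the 2024-08-17 news item describes.
import torch
import deepspeed

model = torch.nn.Linear(768, 768)  # placeholder for the VLM encoder being trained

ds_config = {
    "train_micro_batch_size_per_gpu": 16,    # hypothetical per-GPU micro batch
    "gradient_accumulation_steps": 32,       # 16 * 32 * 8 GPUs -> global batch 4096
    "zero_optimization": {"stage": 2},       # ZeRO stage 2: shard optimizer state + gradients
    "bf16": {"enabled": True},
    "optimizer": {"type": "AdamW", "params": {"lr": 1e-5}},
}

# DeepSpeed builds the optimizer from the config and wraps the model;
# this script would normally be started with the `deepspeed` launcher.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```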