---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# POINTS-1-5-Qwen-2-5-7B-Chat

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.

### Models Merged

The following models were included in the merge:
* /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241130e2-sft-pointsv15-hf/
* /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241127e3-sft-pointsv15-hf/
* /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241129e2-sft-pointsv15-hf/
* /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241130e3-sft-pointsv15-hf/

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  # - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241123e2-sft-hf/
  #   parameters:
  #     weight: 1.0
  # - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241124e2-sft-hf/
  #   parameters:
  #     weight: 1.0
  # - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241125e1-sft-hf/
  #   parameters:
  #     weight: 1.0
  # - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241125e3-sft-hf/
  #   parameters:
  #     weight: 1.0
  # - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241127e1-sft-hf/
  #   parameters:
  #     weight: 1.0
  # - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241127e2-sft-hf/
  #   parameters:
  #     weight: 1.0
  - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241127e3-sft-pointsv15-hf/
    parameters:
      weight: 1.0
  - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241129e2-sft-pointsv15-hf/
    parameters:
      weight: 1.0
  - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241130e2-sft-pointsv15-hf/
    parameters:
      weight: 1.0
  - model: /mnt/cephfs/bensenliu/wfs/weights/mm/mmq-llava-20241130e3-sft-pointsv15-hf/
    parameters:
      weight: 1.0
merge_method: linear
parameters:
  normalize: true
dtype: bfloat16
```
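For reference, `merge_method: linear` with `normalize: true` averages the checkpoints' parameters with weights rescaled to sum to 1; since all four active models use `weight: 1.0`, the result is effectively a uniform average of their weights, cast to `bfloat16`. The snippet below is a minimal sketch of that operation under those assumptions, not mergekit's actual implementation; the `linear_merge` helper and its arguments are hypothetical names used only for illustration.

```python
import torch

def linear_merge(state_dicts, weights, normalize=True):
    """Illustrative sketch: weighted average of matching tensors across checkpoints."""
    if normalize:
        # Rescale weights so they sum to 1 (mirrors `normalize: true`).
        total = sum(weights)
        weights = [w / total for w in weights]
    merged = {}
    for key in state_dicts[0]:
        # Accumulate each tensor in float32, then cast to bfloat16 (mirrors `dtype: bfloat16`).
        acc = sum(sd[key].to(torch.float32) * w for sd, w in zip(state_dicts, weights))
        merged[key] = acc.to(torch.bfloat16)
    return merged
```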