This repository contains all fine-tuned models for experiments with ComBack.
| | Stmt. Comp. | Stmt. Comp. | Next. Sugg. | Next. Sugg. | Code. Gen. | Code. Gen. |
|-------------|:-----------------:|:-----------------:|:----------------:|:----------------:|:----------:|:----------:|
| **Model** | EM | ED | EM | ED | BLEU4 | ED |
| CodeBert-c | 0.00 | 0.97 | 0.00 | 1.31 | 0.00 | 0.44 |
| GraphCodeBert-c | 0.00 | 0.35 | 0.00 | 0.54 | 0.00 | 2.41 |
| UnixCoder-base-nine | 0.07 | 27.56 | 15.93 | 29.11 | 0.00 | 31.81 |
| CodeT5-base | 0.65 | 21.45 | xxx | xxx | xxx | xxx |
| NatGen | 0.00 | 13.52 | 0.02 | 15.95 | 0.01 | 28.76 |
| CodeT5+-220m | 0.02 | 7.24 | 0.12 | 9.87 | 0.00 | 12.33 |
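EM is the exact-match rate and BLEU4 is 4-gram BLEU; ED is read here as a normalized edit similarity scaled to 0-100 (higher is better), an assumption consistent with the score ranges and the bolded best values below. A minimal sketch of the two string metrics under that assumption:

```python
# Sketch of the EM and ED columns, assuming EM = exact-match rate and
# ED = normalized edit similarity (1 - Levenshtein / max length), both 0-100.

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def exact_match(preds, refs):
    return 100.0 * sum(p == r for p, r in zip(preds, refs)) / len(refs)

def edit_similarity(preds, refs):
    sims = (1 - levenshtein(p, r) / max(len(p), len(r), 1)
            for p, r in zip(preds, refs))
    return 100.0 * sum(sims) / len(refs)
```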
- Fine-Tuned

| | Stmt. Comp. | Stmt. Comp. | Next. Sugg. | Next. Sugg. | Code. Gen. | Code. Gen. |
|-------------|:-----------------:|:-----------------:|:----------------:|:----------------:|:----------:|:----------:|
| **Model** | EM | ED | EM | ED | BLEU4 | ED |
| CodeBert-c | 53.84 | 77.44 | 52.67 | 70.82 | xxx | xxx |
| GraphCodeBert-c | 43.00 | 71.89 | 47.10 | 61.31 | xxx | xxx |
| UnixCoder-base-nine | **67.84** | **85.06** | 58.51 | 75.31 | 56.24 | 73.45 |
| CodeT5-base | 66.38 | 84.34 | xxx | xxx | xxx | xxx |
| NatGen | 67.47 | 84.83 | **60.30** | **76.84** | 71.73 | 81.39 |
| CodeT5+-220m | 66.93 | 84.45 | 59.57 | 76.41 | **75.28** | **82.95** |
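The sequence-to-sequence checkpoints above can be loaded with the standard `transformers` API. The snippet below is a sketch only: the checkpoint path is hypothetical (substitute the folder of the fine-tuned model you want to evaluate), and it assumes a CodeT5-style encoder-decoder checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical path -- point this at a fine-tuned checkpoint folder
# from this repository (e.g., the CodeT5+-220m one).
ckpt = "./CodeT5p-220m-finetuned"

tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSeq2SeqLM.from_pretrained(ckpt)

# Statement completion: give the model a truncated backend function
# and decode its proposed completion.
prompt = "static bool isLegalICmpImmediate(int64_t Imm) {"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```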
- `New_Targets/All_Types/*`: **Data for RISC-V, ARC, and NVPTX in both GCC and LLVM form the test set; the remaining 171 (178 - 2*3 - 1) targets are split 85%:15% into train/valid sets. RI5CY is excluded because it is customized from RISC-V (split logic sketched below).**
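A minimal sketch of that split (record layout and helper names are hypothetical; only the ratios and target sets come from the description above):

```python
import random

TEST_TARGETS = {"RISC-V", "ARC", "NVPTX"}  # held out in both GCC and LLVM
EXCLUDED = {"RI5CY"}  # customized from RISC-V, so dropped to avoid test leakage

def split_by_target(records, seed=0):
    """records: iterable of (target_name, sample) pairs over all 178 targets."""
    test = [r for r in records if r[0] in TEST_TARGETS]
    rest = [r for r in records if r[0] not in TEST_TARGETS | EXCLUDED]
    random.Random(seed).shuffle(rest)
    cut = int(0.85 * len(rest))  # 85% train / 15% valid
    return rest[:cut], rest[cut:], test
```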

- GCC

| | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Code. Gen. | Code. Gen. | Code. Gen. | Code. Gen. | Code. Gen. | Code. Gen. |
|----------|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|
| | RISC-V | RISC-V | ARC | ARC | NVPTX | NVPTX | RISC-V | RISC-V | ARC | ARC | NVPTX | NVPTX | RISC-V | RISC-V | ARC | ARC | NVPTX | NVPTX |
| Model | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | BLEU4 | ED | BLEU4 | ED | BLEU4 | ED |
| ChatGPT-3.5-Turbo | 10.34 | 38.41 | 15.35 | 42.94 | 12.01 | 41.47 | 6.44 | 12.90 | 9.75 | 20.79 | 7.97 | 17.79 | 7.33 | 30.83 | 7.35 | 32.34 | 8.12 | 32.71 |
| Code-LLaMA-34B | 0.41 | 19.07 | 0.85 | 16.77 | 0.56 | 18.22 | 1.58 | 13.54 | 2.66 | 17.95 | 2.47 | 16.59 | 9.38 | 35.53 | 11.06 | 37.15 | 8.24 | 33.00 |
| CodeT5+-220m | **51.16** | **75.32** | **52.45** | **74.57** | **50.56** | **75.52** | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx |
- LLVM

| | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Stmt. Comp. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Next. Sugg. | Code. Gen. | Code. Gen. | Code. Gen. | Code. Gen. | Code. Gen. | Code. Gen. |
|----------|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|
| | RISC-V | RISC-V | ARC | ARC | NVPTX | NVPTX | RISC-V | RISC-V | ARC | ARC | NVPTX | NVPTX | RISC-V | RISC-V | ARC | ARC | NVPTX | NVPTX |
| Model | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | EM | ED | BLEU4 | ED | BLEU4 | ED | BLEU4 | ED |
| ChatGPT-3.5-Turbo | 12.08 | 41.39 | 16.77 | 42.02 | 14.73 | 43.72 | 9.80 | 21.86 | 10.81 | 20.66 | 11.39 | 22.82 | 9.24 | 32.13 | 11.96 | 35.33 | 10.07 | 32.90 |
| Code-LLaMA-34B | 0.45 | 17.61 | 0.61 | 17.21 | 0.99 | 17.23 | 1.75 | 15.04 | 0.42 | 11.27 | 2.42 | 16.25 | 6.92 | 32.54 | 8.95 | 38.22 | 8.20 | 34.16 |
| CodeT5+-220m | **62.68** | **82.02** | **71.34** | **85.98** | **64.45** | **81.53** | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx |