AhmedSSoliman committed on
Commit 4768852 · 1 Parent(s): c498e18

Update README.md

Files changed (1): README.md (+2 -8)
# MarianCG: A TRANSFORMER MODEL FOR AUTOMATIC CODE GENERATION

In this work, we address the code generation problem and implement a transformer model that produces highly accurate results. MarianCG is a transformer-based code generation model that generates code from natural language descriptions. This work demonstrates the impact of using the Marian machine translation model for code generation: in our implementation, we show that a machine translation model can operate as a code generation model. Finally, we set a new state of the art on the CoNaLa code generation benchmark, reaching a BLEU score of 30.92.

The model is available on the Hugging Face Hub:
https://huggingface.co/AhmedSSoliman/MarianCG_NL-to-Code
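The BLEU score cited above measures clipped n-gram overlap between generated code and reference code, with a brevity penalty for short outputs. The following is a rough, illustrative sentence-level sketch only; the reported 30.92 comes from the standard corpus-level scorer used by the CoNaLa benchmark, not from this toy version.

```python
# Illustrative sentence-level BLEU (toy version, not the benchmark scorer)
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        total = max(len(cand) - n + 1, 0)
        if total == 0:
            return 0.0  # candidate too short to form any n-grams of this order
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clipped matches: a candidate n-gram counts at most as many
        # times as it appears in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        precisions.append(overlap / total if overlap else 1e-9)
    # Brevity penalty discourages overly short candidates
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

print(round(bleu("x = np.maximum([2, 3, 4], [1, 5, 2])",
                 "x = np.maximum([2, 3, 4], [1, 5, 2])"), 2))  # → 1.0
```

An identical candidate and reference score 1.0; any mismatched n-gram or length difference lowers the score.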
8
```python
# Load the MarianCG model and its tokenizer from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "AhmedSSoliman/MarianCG_NL-to-Code"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Input (natural language) and output (Python code)
NL_input = "create array containing the maximum value of respective elements of array `[2, 3, 4]` and array `[1, 5, 2]`"
output = model.generate(**tokenizer(NL_input, padding="max_length", truncation=True, max_length=512, return_tensors="pt"))
output_code = tokenizer.decode(output[0], skip_special_tokens=True)
print(output_code)
```
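Under the hood, `model.generate` performs autoregressive decoding: starting from a begin-of-sequence token, it repeatedly scores candidate next tokens and appends the best one until an end-of-sequence token is produced. A minimal sketch of that greedy loop, using an invented stand-in scoring function instead of the real MarianCG network (the token strings and `next_token_scores` here are purely illustrative):

```python
# Toy greedy-decoding loop, sketching what model.generate does internally.
# In the real model, next_token_scores would be a transformer forward pass
# conditioned on the encoded natural-language input.

BOS, EOS = "<s>", "</s>"

def next_token_scores(prefix):
    # Hypothetical stand-in for the decoder: deterministically walks through
    # a fixed target sequence, then emits the end-of-sequence token.
    target = ["np.maximum(", "[2, 3, 4],", "[1, 5, 2])", EOS]
    step = len(prefix) - 1  # number of tokens generated so far (excluding BOS)
    vocab = target + ["noise"]
    return {tok: (1.0 if tok == target[min(step, len(target) - 1)] else 0.0)
            for tok in vocab}

def greedy_decode(max_len=10):
    tokens = [BOS]
    for _ in range(max_len):
        scores = next_token_scores(tokens)
        best = max(scores, key=scores.get)  # greedy: take the top-scoring token
        if best == EOS:
            break
        tokens.append(best)
    return " ".join(tokens[1:])

print(greedy_decode())  # → np.maximum( [2, 3, 4], [1, 5, 2])
```

Beam search, which `generate` also supports via `num_beams`, extends this loop by keeping several candidate prefixes alive instead of just the single best one.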
 