matlok's Collections
Papers - Coding
CodeBERT: A Pre-Trained Model for Programming and Natural Languages • arXiv:2002.08155
OpenCodeInterpreter: Integrating Code Generation with Execution and Refinement • arXiv:2402.14658
CodeFusion: A Pre-trained Diffusion Model for Code Generation • arXiv:2310.17680
CodePlan: Repository-level Coding using LLMs and Planning • arXiv:2309.12499
ReGAL: Refactoring Programs to Discover Generalizable Abstractions • arXiv:2401.16467
Enabling Memory Safety of C Programs using LLMs • arXiv:2404.01096
Advancing LLM Reasoning Generalists with Preference Trees • arXiv:2404.02078
CodeEditorBench: Evaluating Code Editing Capability of Large Language Models • arXiv:2404.03543
Training LLMs over Neurally Compressed Text • arXiv:2404.03626
Program of Thoughts Prompting: Disentangling Computation from Reasoning for Numerical Reasoning Tasks • arXiv:2211.12588
How Far Can We Go with Practical Function-Level Program Repair? • arXiv:2404.12833
FlowMind: Automatic Workflow Generation with LLMs • arXiv:2404.13050
Plot2Code: A Comprehensive Benchmark for Evaluating Multi-modal Large Language Models in Code Generation from Scientific Plots • arXiv:2405.07990
McEval: Massively Multilingual Code Evaluation • arXiv:2406.07436
Is Programming by Example solved by LLMs? • arXiv:2406.08316
SelfCodeAlign: Self-Alignment for Code Generation • arXiv:2410.24198
OpenCoder: The Open Cookbook for Top-Tier Code Large Language Models • arXiv:2411.04905
Classical Sorting Algorithms as a Model of Morphogenesis: Self-Sorting Arrays Reveal Unexpected Competencies in a Minimal Model of Basal Intelligence • arXiv:2401.05375
CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation • arXiv:2102.04664