C-BERT MLM

Exploring Software Naturalness through Neural Language Models

Overview

This is an unofficial HuggingFace port of C-BERT that includes only the masked language modeling head used for pretraining. The weights come from "An Empirical Comparison of Pre-Trained Models of Source Code". Please cite the authors if you use this model in academic work.
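Because only the masked language modeling head is included, the natural way to exercise the checkpoint is to mask a token in a C snippet and inspect the model's prediction. A minimal sketch is below; the repository id `"cbert-mlm"` is a placeholder (this card does not state the actual Hub id), so substitute the id of this model page before running.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Placeholder id -- replace with the actual Hub repository id of this model.
model_id = "cbert-mlm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
model.eval()

# Mask one token in a small C snippet.
code = f"int main() {{ {tokenizer.mask_token} 0; }}"
inputs = tokenizer(code, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring vocabulary entry.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

A model pretrained on C source should assign high probability to a plausible keyword (such as `return`) at the masked position, though the exact prediction depends on the checkpoint.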

Model details

Format: Safetensors
Parameters: 45.1M
Tensor type: F32