CodeBERT for Defect Detection

A sequence-classification model fine-tuned from microsoft/codebert-base for code defect (vulnerability) detection: given a code snippet, it predicts whether the code is vulnerable or secure.

Usage

from transformers import RobertaTokenizer, RobertaForSequenceClassification
import torch

tokenizer = RobertaTokenizer.from_pretrained("phamtungthuy/codebert-defect-detection")
model = RobertaForSequenceClassification.from_pretrained("phamtungthuy/codebert-defect-detection")
model.eval()

# Tokenize a single C function, truncating long inputs to 400 tokens
code = "int main() { char buf[10]; strcpy(buf, argv[1]); }"
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=400)

# The classifier emits one logit; sigmoid turns it into a defect probability
with torch.no_grad():
    prob = torch.sigmoid(model(**inputs).logits).item()

print("Vulnerable" if prob > 0.5 else "Secure")
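To score several functions in one forward pass, the snippet below is a minimal batching sketch (not from the original card). It assumes the same single-logit sigmoid convention and 400-token truncation as the example above; the input snippets are hypothetical.

# Batch scoring sketch: assumes one logit per snippet, as in the example above
snippets = [
    "void copy(char *dst, const char *src) { strcpy(dst, src); }",  # hypothetical input
    "int add(int a, int b) { return a + b; }",                      # hypothetical input
]

batch = tokenizer(snippets, return_tensors="pt", truncation=True,
                  max_length=400, padding=True)
with torch.no_grad():
    probs = torch.sigmoid(model(**batch).logits).squeeze(-1)

for snippet, p in zip(snippets, probs):
    print(f"{'Vulnerable' if p.item() > 0.5 else 'Secure'} ({p.item():.2f}): {snippet}")

padding=True pads the shorter snippet so the batch forms a single tensor; the attention mask keeps padding tokens from influencing the prediction.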